http: Implement robots.txt
To keep the fake HTTP server from being reported and blocklisted, implement and serve a /robots.txt endpoint over GET that tells well-behaved search engine crawlers not to request any resources on the server.
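A minimal sketch of what this could look like, assuming the server is built on Go's net/http; the handler name, the blanket Disallow policy, and the :8080 listen address are illustrative, not taken from the project.

```go
package main

import (
	"log"
	"net/http"
)

// Disallow everything so well-behaved crawlers skip the fake server entirely.
const robotsTxt = "User-agent: *\nDisallow: /\n"

// robotsHandler serves robots.txt on GET and rejects other methods.
func robotsHandler(w http.ResponseWriter, r *http.Request) {
	if r.Method != http.MethodGet {
		http.Error(w, "method not allowed", http.StatusMethodNotAllowed)
		return
	}
	w.Header().Set("Content-Type", "text/plain")
	w.Write([]byte(robotsTxt))
}

func main() {
	http.HandleFunc("/robots.txt", robotsHandler)
	// Remaining fake-server handlers would be registered here.
	log.Fatal(http.ListenAndServe(":8080", nil))
}
```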