Hi Leon. Thanks for making all these tools. I'm very close to having the exact WordPress setup that I want. I'm just stuck on one tiny thing at the moment.
I've got a multi-site WordPress setup where each site is edited at a URL like site1.mywordpress.com, deployed to S3, and served from site1.com. I would like site1.mywordpress.com to have a robots.txt that blocks search bots, but I need the robots.txt in the S3 bucket to be the proper one. Unfortunately, the deploy uploads the blocking robots.txt, which would cut my real site off from search traffic.
The easiest solution would be a way to just ignore robots.txt so I could manually put a different one in the bucket. I could also overwrite the crawled robots.txt using a hook, if you can point me in the right direction. Or maybe there's another option I'm not thinking of.
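For reference, here's roughly what I had in mind for the hook approach. It's just a sketch using WordPress core's robots_txt filter, and the user-agent check for spotting the deploy crawl is purely a guess on my part; the real detection signal might be something else entirely:

```php
<?php
/**
 * mu-plugin sketch: serve blocking robots.txt rules to normal visitors of the
 * editing site, but permissive rules to the deploy crawl so the file that
 * lands in the S3 bucket is the "real" one for site1.com.
 *
 * Assumption: the deploy crawl can be recognised somehow (user agent, query
 * arg, constant, etc.). The "WP2Static" user-agent string below is a guess.
 */
add_filter( 'robots_txt', function ( $output, $public ) {
	$ua              = isset( $_SERVER['HTTP_USER_AGENT'] ) ? $_SERVER['HTTP_USER_AGENT'] : '';
	$is_deploy_crawl = false !== stripos( $ua, 'WP2Static' ); // hypothetical detection

	if ( $is_deploy_crawl ) {
		// Rules that should end up in the bucket and be served from site1.com.
		return "User-agent: *\nDisallow:\n";
	}

	// Rules for the editing domain (site1.mywordpress.com): block everything.
	return "User-agent: *\nDisallow: /\n";
}, 10, 2 );
```

If the crawler doesn't send a distinctive user agent, I guess a secret query arg or a constant defined only during the deploy run could stand in for that check.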