AWS: Modifying robots.txt within a Bitnami EC2 instance
Bitnami offers WordPress for AWS Cloud, which is great for developers who don't want to focus on DevOps tasks such as installing PHP, Apache, and WordPress.
The Bitnami package handles all of that, but there are still a few things you'll need to modify yourself, including the robots.txt file.
In this blog entry, I'll show you how to modify the robots.txt file for a WordPress staging server. The goal is to create a file that prevents search engines from including your staging server in their search results.
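As a preview of the end result, the standard rules for telling all well-behaved crawlers to stay out of an entire site are just two lines. This snippet is generic robots.txt syntax, not anything Bitnami-specific:

# Block every crawler from every path on this staging site
User-agent: *
Disallow: /

The asterisk matches every user agent, and the bare slash disallows the whole site, so crawlers that honor robots.txt will skip the staging server entirely.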
When you create a Bitnami WordPress site, it comes with a /robots.txt file by default, but the file isn't easy to find on disk. If you're interested in tracking down its path, run a find command with grep:
sudo find / | grep robots.txt
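It's also worth checking what the site currently answers at the /robots.txt URL, since WordPress can generate a robots.txt response dynamically even when no physical file is in play. A quick check from the instance itself (assuming Apache is listening on port 80 locally, which is the Bitnami default):

curl http://localhost/robots.txt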
Although we were able to find a working copy of robots.txt above, that is not the file we should modify. Instead, we should create a new robots.txt the "Bitnami Way".
Depending on how you install WordPress, there are two possible places to add a robots.txt file.
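To make the next steps concrete, here is a minimal sketch of creating the file. The path /opt/bitnami/wordpress is an assumption for illustration; substitute whatever document root the find command above reported for your stack:

# Assumption: the document root is /opt/bitnami/wordpress.
# Substitute the path the find command above reported for your stack.
sudo tee /opt/bitnami/wordpress/robots.txt > /dev/null <<'EOF'
# Block every crawler from the staging site
User-agent: *
Disallow: /
EOF

Once a physical file is in place, rerun the curl check above to confirm the new directives are being served instead of the WordPress-generated ones.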