I've got a script that is about 99% done that clones our production site to a dev site. The dev site will be public, but I don't want Google to crawl it, so I want to change the robots.txt configuration for the dev site. I can create a robots.txt file in the appropriate location, but it appears to be overridden by Magento since we have custom instructions configured in production.
It doesn't look like there is a command-line option to change that setting, but I assume I can do it through the database. Does anyone know where that value is stored?
Hello @charles_gibson1
Run the query below to find the stored robots.txt data:
SELECT * FROM `core_config_data` WHERE `path` LIKE '%design/search_engine_robots/custom_instructions%'
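Once you've confirmed the row exists, you can overwrite it directly in the dev database. A minimal sketch, assuming the setting was saved at the default scope (scope_id = 0) and you want to block all crawlers — adjust the scope/scope_id if it was saved per website or per store view, and the value to whatever robots rules you actually want:

```sql
-- Overwrite the custom robots.txt instructions on the dev site.
-- MySQL interprets \n inside a quoted string as a newline.
UPDATE core_config_data
SET value = 'User-agent: *\nDisallow: /'
WHERE path = 'design/search_engine_robots/custom_instructions'
  AND scope = 'default'
  AND scope_id = 0;
```

Flush the cache afterwards (e.g. bin/magento cache:flush) so Magento picks up the change. Note that on Magento 2.2+ there is also a supported CLI route that avoids touching the database directly: bin/magento config:set design/search_engine_robots/custom_instructions "..." — handy if your clone script already shells out to bin/magento.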
Hope it helps.