Global dynamic spoofing of robots.txt file

I am setting up a test server for the needs of developers. I would like nginx to override the robots.txt file for all hosts of all users: instead of the actual robots.txt file placed in the host's document root, a default file should be served. Can you tell me how and where to configure this logic?

Create a new template — see Web domains and SSL Certificates in the Hestia Control Panel documentation.

And add:

location = /robots.txt {
    default_type text/plain;
    return 200 "User-agent: *\nDisallow: /\n";
}
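For context, that location block belongs inside the server block that the template generates for each domain; a minimal sketch (the server_name and root values are placeholders, not Hestia defaults):

```
server {
    listen 80;
    server_name example.com;                        # placeholder
    root /home/user/web/example.com/public_html;    # placeholder

    # Always answer /robots.txt from the config,
    # ignoring any robots.txt file in the document root.
    location = /robots.txt {
        default_type text/plain;
        return 200 "User-agent: *\nDisallow: /\n";
    }
}
```

The exact-match `location = /robots.txt` takes priority over prefix and regex locations, so the file on disk is never consulted for that URI.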

Thank you for your reply. I would like to apply this rule to all templates. I tried adding your rule to /etc/nginx/nginx.conf, but the validator says the location directive is not allowed outside a server block. How can I use this rule without adding it to every server block?

There is no way to declare a location at the http level in nginx; it must live inside a server block. So depending on the template you are using, you modify the corresponding template files.

The best thing is then to create a copy of this template…
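One way to avoid pasting the rule into every template is to keep it in a single shared snippet and include it from each template's server block. A sketch, assuming a default Hestia layout; the snippet path is my own choice, and the template directory is typically /usr/local/hestia/data/templates/web/nginx/ — verify both on your install:

```
# /etc/nginx/snippets/robots-override.conf  (hypothetical path)
# Shared rule: serve a deny-all robots.txt for every domain.
location = /robots.txt {
    default_type text/plain;
    return 200 "User-agent: *\nDisallow: /\n";
}
```

Then add a single line, `include /etc/nginx/snippets/robots-override.conf;`, inside the server block of each (copied) template you use, and rebuild the web domains so Hestia regenerates the per-domain configs. Changing the rule later then means editing one snippet instead of every template.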

This topic was automatically closed 30 days after the last reply. New replies are no longer allowed.