The robots.txt file is a plain-text file that tells search engine crawlers, such as those of Google, Bing, and Yahoo, which parts of your site they may visit. It is important to write this file correctly: mistakes can block important pages from being crawled or leave sections exposed that you meant to hide.
In this file, you need to specify:
- The "Host" directive (the protocol and domain of your site) is a legacy, Yandex-specific directive: Google and Bing have never supported it, and Yandex itself deprecated it in 2018 in favor of 301 redirects, so modern files can omit it.
- The sections that should be blocked from crawling, listed with the "Disallow" directive. You should typically block the login page, user account area, admin panel, and similar service pages.
- A link to the XML sitemap, given with the "Sitemap" directive as an absolute URL.
- The file itself must be encoded in UTF-8 (of which plain ASCII is a subset), as required by the Robots Exclusion Protocol (RFC 9309).
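Putting these points together, a minimal robots.txt might look like the sketch below; the domain `example.com` and the specific paths are placeholders, and your own blocked sections will differ:

```
# Applies to all crawlers
User-agent: *

# Block service pages from crawling (example paths)
Disallow: /login/
Disallow: /account/
Disallow: /admin/

# Absolute URL to the XML sitemap
Sitemap: https://example.com/sitemap.xml
```

Note that the file only takes effect when served from the root of the site (e.g. `https://example.com/robots.txt`), saved in UTF-8.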