
The sitemap.

For each site, it is useful to generate a sitemap. It helps search robots crawl your site more deeply and put more pages into the index. And the more pages there are in the index, the more pages can get into the search results.

The sitemap must be specified in the robots.txt file with the "Sitemap" directive:

Sitemap: https://example.com/sitemap.xml
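
For context, here is what a minimal robots.txt containing this directive might look like; the User-agent and Disallow lines are illustrative placeholders, not something the sitemap protocol requires:

User-agent: *
Disallow: /admin/
Sitemap: https://example.com/sitemap.xml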

In general, the sitemap file should look like this:

<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
<url>
    <loc>https://example.com/</loc>
    <changefreq>weekly</changefreq>
    <priority>1.00</priority>
</url>
...
</urlset>

Here are some recommendations for compiling a sitemap:

  1. A <priority> element is recommended for every page. It tells search robots how important a page is relative to the other pages of your site, which helps them decide how often to revisit it. For example, a page listing recent news changes often and stays important, while the page of an individual old news item remains unchanged for years. Which priority should be specified for which pages:
    0.8-1.0 - Homepage, subdomains, product info, major features, major category pages.
    0.4-0.7 - Articles and blog entries, minor category pages, sub-category pages, FAQs.
    0.0-0.3 - Outdated news and info that has become irrelevant.
  2. The sitemap shouldn't contain any pages that are disallowed for indexing in robots.txt. It makes no sense to list a page in the sitemap that you have explicitly prohibited from being indexed.
  3. The protocol of the links in the sitemap must match the protocol of the site itself. If the site works over https, then the links in the file must also be https.
  4. A single sitemap file shouldn't exceed 50 MB uncompressed or contain more than 50,000 URLs. If you have more pages, you need to split the sitemap into several files and combine them with a sitemap index (see the sketch after this list).

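For point 4, the sitemaps.org protocol provides a sitemap index file that lists the individual sitemap files. Here is a minimal sketch, assuming the parts are named sitemap-1.xml and sitemap-2.xml (the file names are illustrative):

<?xml version="1.0" encoding="UTF-8"?>
<sitemapindex xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
<sitemap>
    <loc>https://example.com/sitemap-1.xml</loc>
</sitemap>
<sitemap>
    <loc>https://example.com/sitemap-2.xml</loc>
</sitemap>
</sitemapindex>

In this case, the "Sitemap" directive in robots.txt should point to the index file rather than to a single sitemap.
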
If your website consists of a large number of pages, a sitemap will increase the crawl traffic from search robots and help more of your pages get into the index.