Setting up robots.txt and XML Sitemap


The robots.txt and sitemap.xml files help search engines index your site correctly.

robots.txt

What is it

The robots.txt file tells search robots which pages on a site can be indexed and which cannot.

Standard settings

In ecom.md, robots.txt is configured automatically and usually includes:

  • Allow directives for the main pages (products, categories)
  • Disallow directives for service pages (shopping cart, personal account, search)
  • A link to sitemap.xml
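A typical auto-generated robots.txt covering the points above might look like this (the paths shown are illustrative, not the exact ecom.md defaults):

```
User-agent: *
Allow: /
Disallow: /cart
Disallow: /account
Disallow: /search

Sitemap: https://your-site/sitemap.xml
```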

When to edit

  • To exclude certain sections from indexing
  • To add extra directives
  • To limit crawl rate with Crawl-delay (note that some crawlers, including Google's, ignore this directive)
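As an example of such an edit, the fragment below closes a hypothetical /wholesale section from indexing and asks crawlers that support Crawl-delay to wait 5 seconds between requests:

```
User-agent: *
Disallow: /wholesale
Crawl-delay: 5
```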

XML Sitemap

What is it

An XML sitemap is a file listing all the pages on a site; it helps search engines find and index your content faster.

Automatic generation

In ecom.md, sitemap.xml is generated automatically and includes:

  • Home page
  • Product categories
  • Product cards
  • Content pages (About us, Delivery, etc.)
  • News and promotions
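Internally, sitemap.xml is a plain XML file in the standard sitemaps.org format. A minimal example with a single URL entry (the URL and date are placeholders):

```
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>https://your-site/</loc>
    <lastmod>2024-01-15</lastmod>
  </url>
</urlset>
```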

Updates

The sitemap is updated automatically when:

  • Adding new products
  • Creating new categories
  • Editing content pages

Submitting to search engines

  1. Copy the URL of your sitemap: https://your-site/sitemap.xml
  2. Add it to Google Search Console
  3. Add it to Yandex Webmaster (if relevant)
  4. Wait for new pages to be indexed
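Before submitting the sitemap, you can sanity-check that it is well-formed and see which URLs it lists. A minimal Python sketch that parses sitemap XML and extracts the URLs (applied here to a sample string; in practice you would fetch https://your-site/sitemap.xml):

```python
import xml.etree.ElementTree as ET

# Standard sitemap namespace from sitemaps.org
NS = {"sm": "http://www.sitemaps.org/schemas/sitemap/0.9"}

def sitemap_urls(xml_text):
    """Return the list of <loc> URLs from a sitemap.xml document."""
    root = ET.fromstring(xml_text)
    return [loc.text for loc in root.findall("sm:url/sm:loc", NS)]

# Sample sitemap with placeholder URLs
sample = """<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url><loc>https://your-site/</loc></url>
  <url><loc>https://your-site/category/shoes</loc></url>
</urlset>"""

print(sitemap_urls(sample))
```

If parsing fails or the list is empty, fix the sitemap before adding it to the search consoles.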
