Example of setting up a robot exclusion standard. Tools for checking: visual inspection, the "robots.txt + meta" bookmarklet, and the page-indexing ban check in webmaster tools (link 1 and link 2). With the "robots.txt + meta" bookmarklet, a single click on the bookmarks bar checks both the ban in robots.txt and the presence of the robots meta tag in the page code.
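The robots.txt part of this check can also be scripted. Below is a minimal sketch of what the bookmarklet does by hand, using Python's standard `urllib.robotparser`; the robots.txt rules and URLs are hypothetical examples, not taken from any real site.

```python
# Sketch: check whether robots.txt bans a URL from crawling,
# mirroring the manual "robots.txt + meta" bookmarklet check.
from urllib import robotparser

# Hypothetical robots.txt content for illustration.
ROBOTS_TXT = """\
User-agent: *
Disallow: /admin/
Disallow: /search
"""

parser = robotparser.RobotFileParser()
parser.parse(ROBOTS_TXT.splitlines())

def is_crawlable(url: str) -> bool:
    """True if robots.txt allows any crawler ('*') to fetch the URL."""
    return parser.can_fetch("*", url)

print(is_crawlable("https://example.com/blog/post"))    # allowed
print(is_crawlable("https://example.com/admin/login"))  # disallowed
```

Note that this covers only the robots.txt half of the check; the presence of a robots meta tag still has to be verified in the page's HTML.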
2. Correct sitemap. The link to the xml version of the sitemap is indicated in robots.txt. A correct sitemap:
- contains only pages with a 200 OK server response code;
- shows no errors in webmaster analyzers;
- has correctly set page priorities (the priority element) and last-update dates (the lastmod element);
- does not contain pages prohibited from indexing;
- uses address protocols that match the real ones (after a move to https, URLs with the http protocol sometimes remain in the sitemap).
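Several of these checks can be automated by parsing the sitemap directly. The sketch below validates a hypothetical inline sitemap fragment for three of the criteria above (https protocol, a plausible priority value, a parseable lastmod date) using only the standard library; the URLs and dates are made up for illustration.

```python
# Sketch: lint a sitemap.xml fragment against the checklist above.
import xml.etree.ElementTree as ET
from datetime import date

NS = {"sm": "http://www.sitemaps.org/schemas/sitemap/0.9"}

# Hypothetical sitemap content; the second URL deliberately keeps http.
SITEMAP = """<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>https://example.com/</loc>
    <lastmod>2024-05-01</lastmod>
    <priority>1.0</priority>
  </url>
  <url>
    <loc>http://example.com/old-page</loc>
    <lastmod>2024-05-01</lastmod>
    <priority>0.5</priority>
  </url>
</urlset>"""

def sitemap_problems(xml_text: str) -> list[str]:
    problems = []
    root = ET.fromstring(xml_text)
    for url in root.findall("sm:url", NS):
        loc = url.findtext("sm:loc", default="", namespaces=NS)
        if not loc.startswith("https://"):
            problems.append(f"non-https loc: {loc}")
        prio = url.findtext("sm:priority", default="", namespaces=NS)
        if not (prio and 0.0 <= float(prio) <= 1.0):
            problems.append(f"bad priority on {loc}")
        lastmod = url.findtext("sm:lastmod", default="", namespaces=NS)
        try:
            date.fromisoformat(lastmod)
        except ValueError:
            problems.append(f"bad lastmod on {loc}")
    return problems

print(sitemap_problems(SITEMAP))
# → ['non-https loc: http://example.com/old-page']
```

Checking the 200 OK criterion would additionally require fetching each loc URL, which webmaster tools already do for you.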
" Analysis of sitemap files " in "Yandex.Webmaster" and " Sitemap files " in Google Search Console. 3. No technical duplicates of pages One page opening at two different addresses means two different pages for search engines. This should not happen. Popular reasons for such duplicates: 301 redirect from www (or vice versa) is not configured so that the site is accessible only via one protocol.
Tools for checking: visual inspection, sections
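The duplicate causes listed above (http vs https, www vs non-www, trailing slash) can be spotted by normalizing every crawled URL to one canonical form and grouping the results. This is a minimal sketch with hypothetical URLs, assuming https and the non-www host are the canonical variants.

```python
# Sketch: group technical duplicates by normalizing URL variants.
from urllib.parse import urlsplit

def canonical(url: str) -> str:
    """Normalize protocol, www prefix, and trailing slash."""
    parts = urlsplit(url)
    host = parts.netloc.lower().removeprefix("www.")
    path = parts.path.rstrip("/") or "/"
    return f"https://{host}{path}"

# Hypothetical crawl results: the first two are the same page.
urls = [
    "http://www.example.com/page/",
    "https://example.com/page",
    "https://example.com/other",
]

groups: dict[str, list[str]] = {}
for u in urls:
    groups.setdefault(canonical(u), []).append(u)

duplicates = {k: v for k, v in groups.items() if len(v) > 1}
print(duplicates)
# → {'https://example.com/page': ['http://www.example.com/page/', 'https://example.com/page']}
```

In practice each duplicate group should resolve to a single address via a 301 redirect, which is exactly what the section above recommends configuring.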