
What happens if robots.txt blocks a page?


Can anyone explain to me what it means when the robots.txt tester reports that a website or webpage is blocked?

Robots.txt is a text file, placed at the root of a website, that contains instructions for search engine robots. It lists which pages or directories crawlers are allowed or disallowed to crawl.
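As a sketch of what such a file looks like (the paths here are made-up examples, not from any real site):

```
User-agent: *
Disallow: /private/
Allow: /
```

The `User-agent` line names which crawler the rules apply to (`*` means all), and each `Disallow` or `Allow` line gives a URL path prefix.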

It is a file that guides Google's crawlers on which pages they may crawl and which they may not. Note that robots.txt controls crawling, not indexing directly.

Even if a URL is blocked by robots.txt, it can still appear in search engine results (for example, when other sites link to it); Google just cannot crawl its content.
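You can check programmatically whether a given URL is blocked, using Python's standard-library robots.txt parser. This is a minimal sketch; the rules and URLs below are hypothetical examples, not from a real site:

```python
from urllib.robotparser import RobotFileParser

# Hypothetical robots.txt content for illustration
robots_txt = """User-agent: *
Disallow: /private/
Allow: /
"""

parser = RobotFileParser()
parser.parse(robots_txt.splitlines())

# A path under /private/ is blocked for all user agents
print(parser.can_fetch("*", "https://example.com/private/page.html"))  # False

# Other paths remain crawlable
print(parser.can_fetch("*", "https://example.com/public/page.html"))   # True
```

In practice you would point the parser at a live file with `set_url(...)` and `read()` instead of parsing an inline string.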

Any new updates to a blocked page will not make it into Google's index. Robots.txt prevents Googlebot from crawling the page, so changes to it will not be reflected in Google search results.

