Why would I block specific Web pages and/or search engines?
You may want to block search engines or Web pages from crawling specific areas or pages of your website. Crawling is the process by which an application scans the Internet and indexes information to return in search query results. While you want many crawlers to access your site, here are a few reasons to consider blocking Web pages or search engines using Search Engine Visibility:
- To avoid crawlers, such as spambots, trying to collect your private information for spamming purposes.
- To avoid traffic overload from spiders that access your site at high speed, or from bots that index your scripts.
- To protect information that you want to remain private, such as addresses or contact information, printer-friendly pages that contain duplicate content, testing or experimental pages, and advertisement links.
There is no textbook answer for when or why you should block Web pages or search engines, but it is important that you watch for problems and correct them as needed. For information on blocking techniques, see Controlling Web Site Crawling Using Search Engine Visibility.
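As a sketch of one common blocking technique, a robots.txt file placed at the root of your site tells well-behaved crawlers which paths to skip. The directory paths and the crawler name below are illustrative assumptions, not paths from your site, and note that robots.txt is advisory only: malicious bots such as spambots may ignore it, so truly private information should be protected by other means.

```
# robots.txt — paths and bot name below are examples only

# Ask all crawlers to skip private, duplicate, and test areas
User-agent: *
Disallow: /contact-info/
Disallow: /print/
Disallow: /test/

# Ask one specific crawler (hypothetical name) to skip the whole site
User-agent: ExampleBot
Disallow: /
```

Each `User-agent` line starts a rule group, and the `Disallow` lines beneath it list path prefixes that group of crawlers should not fetch.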