meta robots = extremely slow indexing?


  • #1

    Overall Zoom is working well for us, but we have a number of pages where we want the spider to follow the links without indexing the page itself. In those cases we used the robots meta tag and enabled Zoom's robots.txt support. The problem is that this seems to make the indexing extremely slow: what took approximately 1 hour before is now taking over 10 hours. Is that normal?

    If so, are there any alternatives for getting the spider to follow the links but not index the page itself? Maybe through content filtering?

  • #2
    My guess would be that you have requested a slow crawl in the robots.txt file itself.

    There is a Crawl-delay: directive that can be used to ask spiders to pause between requests.

    See this page for more details.
    http://www.wrensoft.com/zoom/support/useragent.html
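    If you are unsure what delay your robots.txt is requesting, Python's standard-library urllib.robotparser can report it. This is just a sketch; the 30-second value below is a made-up example, not taken from the poster's actual file:

    ```python
    from urllib.robotparser import RobotFileParser

    # A hypothetical robots.txt with a Crawl-delay directive; a spider
    # that honours it will pause this many seconds between requests.
    robots_txt = """\
    User-agent: *
    Crawl-delay: 30
    """

    parser = RobotFileParser()
    parser.parse(robots_txt.splitlines())

    # crawl_delay() returns the delay (in seconds) for the given user agent.
    # At 30 seconds per page, indexing 1200 pages takes at least 10 hours,
    # which would explain the slowdown described above.
    print(parser.crawl_delay("*"))  # 30
    ```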

    Another option would be to use the noindex robots meta tag, which stops the page from being indexed while its links can still be followed.
    <meta name="robots" content="noindex">



    • #3
      Yes, you're right, it was as simple as that. We had a crawl delay set in the robots.txt file. Thanks.
