[nycphp-talk] Looking for ideas on how to allow spiders to crawl authenticated pages
ophir at prusak.com
Mon Feb 24 15:47:11 EST 2003
On the non-technical side of this, I think it's a bad idea.
Let's say that, technically, you could allow the Google spider to visit certain
pages without registration.
1. A user does a search on Google, and one of the results is a "restricted"
page Google indexed.
2. The user clicks on the link
3. The user gets redirected, or sees a note saying that in order to view the
page, they need to register.
(4. I'd then go back and look for the Cached link to view the page, but let's
say you used the noarchive meta tag so Google won't cache the page.)
At this point I'd be very frustrated, feel like you had tricked me, and never
visit your site again.
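For reference, the noarchive directive mentioned in step 4 is normally written as a robots meta tag in the page's head:

```html
<meta name="robots" content="noarchive">
```

This tells Google (and other spiders that honor it) to index the page but not offer a Cached copy.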
IMHO you should rethink how to improve your ranking without giving the
Google spider special privileges.
Many sites do this by showing just a teaser of the information to
non-registered users. http://safari.oreilly.com is a good example of this.
You could then still use all the meta tags you want and hopefully get most
of the benefit without deceiving people who reach the site via Google.
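A minimal sketch of that teaser approach in PHP (the function and the /register.php URL are hypothetical, just to show the idea): everyone who isn't logged in, spiders included, gets the first paragraph plus a registration prompt, so what Google indexes matches what a visitor will actually see.

```php
<?php
// Hypothetical teaser renderer: full article for members, first
// paragraph plus a registration link for everyone else (including
// spiders -- no special treatment, so nobody gets tricked).
function render_article(string $body, bool $is_logged_in): string
{
    if ($is_logged_in) {
        return $body;                       // members see everything
    }
    // Non-members and spiders get only the first paragraph.
    $paragraphs = explode("\n\n", $body);
    $teaser = $paragraphs[0];
    return $teaser
        . "\n\n<p><a href=\"/register.php\">Register</a> to read the rest.</p>";
}
```

Since the same markup is served to every non-authenticated request, this avoids the cloaking problem entirely while still getting your keywords in front of the spider.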
----- Original Message -----
From: "DeWitt, Michael" <mjdewitt at alexcommgrp.com>
To: "NYPHP Talk" <talk at nyphp.org>
Sent: Monday, February 24, 2003 1:45 PM
Subject: [nycphp-talk] Looking for ideas on how to allow spiders to crawl
> For some of our sites, many of the pages require registration and login in
> order to view the page. I would like to open up those pages to spiders in
> an effort to improve our web rankings.
> I have some ideas, but I am curious what others have done to allow spiders
> to crawl a page which normally would not be available to them (since they
> would have been redirected to a login page for access).