[nycphp-talk] Looking for ideas on how to allow spiders to crawl authenticated pages
mjdewitt at alexcommgrp.com
Mon Feb 24 21:56:05 EST 2003
We have tried a number of ways to raise the ranking of one site, but nothing
seems to make much of a difference. Our thinking is that the site appears
small since so much of it is hidden behind free registration, and that we are
up against tough competition (the US government), which offers a lot of pages
under several domains with incestuous links between related sites.
I was thinking that if we opened up the free content to spiders, more people
would become aware of us, and that even if they were faced with free
registration, we offer a unique service in the field, so it would ultimately
be worth it.
The noarchive meta header is a great suggestion and one that I will use.
Thanks for your input.
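For reference, the noarchive directive mentioned above is the standard robots meta tag, placed in the page's head; it tells crawlers not to keep a cached copy:

```html
<head>
  <!-- index the page, but do not offer a cached copy in search results -->
  <meta name="robots" content="noarchive">
</head>
```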
> -----Original Message-----
> From: Ophir Prusak [SMTP:ophir at prusak.com]
> Sent: Monday, February 24, 2003 3:48 PM
> To: NYPHP Talk
> Subject: Re: [nycphp-talk] Looking for ideas on how to allow spiders
> to crawl authenticated pages
> Hi Michael,
> On the non-technical side of this, I think it's a bad idea.
> Let's say that, technically, you could allow the google spider to visit
> certain pages without registration.
> 1. A user does a search on google and one of the results is a "restricted"
> page google indexed.
> 2. The user clicks on the link
> 3. The user gets redirected, or sees a note saying that in order to view
> the page they need to register.
> (4. I'd then go back and look for the Cached link to view the page, but
> say you used the noarchive meta header so google won't cache the page.)
> At this point, I'd be very frustrated, feel like you tricked me and never
> visit your site again.
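The user-agent gate being discussed could be sketched in PHP like this. The crawler names and the helper name are assumptions for illustration, and the thread's warning applies: user-agent strings are trivially spoofed, and serving crawlers different content ("cloaking") can get a site penalized or dropped from an index.

```php
<?php
// Sketch of a user-agent gate, assuming we only want to let known
// crawlers past the login wall. The list of names is illustrative,
// not exhaustive, and this technique is easy to defeat by spoofing.
function isKnownSpider(string $userAgent): bool
{
    $spiders = ['Googlebot', 'Slurp', 'msnbot']; // assumed crawler names
    foreach ($spiders as $spider) {
        // case-insensitive substring match against the UA string
        if (stripos($userAgent, $spider) !== false) {
            return true;
        }
    }
    return false;
}
```

A protected page would then skip its login redirect when `isKnownSpider($_SERVER['HTTP_USER_AGENT'] ?? '')` returns true; as the reply above points out, human visitors arriving from the search results would still hit the login wall and feel tricked.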
> IMHO you should re-think how to improve your ranking without giving the
> google spider special privileges.
> Many sites do this by showing just a teaser of the information to
> users. http://safari.oreilly.com is a good example of this. You could
> still use all the meta tags you want and hopefully get most of the benefit
> without deceiving people who get to the site via google.
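The teaser approach suggested above might look like this in PHP: show everyone, crawlers included, the opening of an article and require registration for the rest. The function name and the 50-word cutoff are illustrative choices, not anything from the thread.

```php
<?php
// Sketch of the "teaser" approach: truncate an article to its first
// N words for unregistered visitors, so the same preview is served
// to users and to search-engine spiders alike.
function teaser(string $fullText, int $words = 50): string
{
    // split on any run of whitespace
    $parts = preg_split('/\s+/', trim($fullText));
    if (count($parts) <= $words) {
        return $fullText; // short enough to show in full
    }
    return implode(' ', array_slice($parts, 0, $words)) . ' ...';
}
```

The registered-user path would serve `$fullText` unchanged, so the crawler-visible preview and the human-visible preview are identical and no special privileges are needed.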
> ----- Original Message -----
> From: "DeWitt, Michael" <mjdewitt at alexcommgrp.com>
> To: "NYPHP Talk" <talk at nyphp.org>
> Sent: Monday, February 24, 2003 1:45 PM
> Subject: [nycphp-talk] Looking for ideas on how to allow spiders to crawl
> authenticated pages
> > For some of our sites, many of the pages require registration and login
> > in order to view the page. I would like to open up those pages to spiders
> > in an effort to improve our web rankings.
> > I have some ideas, but I am curious what others have done to allow
> > spiders to crawl a page which normally would not be available to them
> > (since they would have been redirected to a login page for access).
> > TIA
> > Mike
> --- Unsubscribe at http://nyphp.org/list/ ---