NYCPHP Meetup

[nycphp-talk] Looking for ideas on how to allow spiders to crawl authenticated pages

Jaz-Michael King JMKing at ipro.org
Mon Feb 24 14:01:55 EST 2003


This is an awful suggestion, and I do not recommend doing it, but one way would be to set up a Google user/pass pair and submit the URL http://user:pass@www.domain.com/protected to Google. That way Google would get in, cache the page, and index it. You'd then have to refuse access for that user/pass pair to any request whose user-agent isn't a known crawler.

Messy, and bad practice, but there for informational purposes only. Please don't do it.
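To make the refusal step above concrete, here's a rough PHP sketch. Everything in it is made up for illustration (is_crawler, the 'google_crawler' account name, $authenticated_user) -- and note that the User-Agent header is trivially spoofed, which is part of why this is bad practice:

```php
<?php
// Illustrative sketch only -- names here are hypothetical, not from
// any real codebase. Checks whether a User-Agent string contains a
// known crawler token.
function is_crawler($user_agent) {
    $known_crawlers = array('Googlebot', 'Slurp', 'msnbot');
    foreach ($known_crawlers as $bot) {
        // stristr() is a case-insensitive substring search; it
        // returns false when the token is not found.
        if (stristr($user_agent, $bot) !== false) {
            return true;
        }
    }
    return false;
}

// In the protected page, after your normal login code has set
// $authenticated_user: deny the shared crawler account to any
// request that isn't presenting a crawler User-Agent.
if (isset($authenticated_user)
        && $authenticated_user === 'google_crawler'
        && !is_crawler($_SERVER['HTTP_USER_AGENT'])) {
    header('HTTP/1.0 403 Forbidden');
    exit;
}
?>
```

Anyone who sets their browser's User-Agent to "Googlebot" gets past this check, so it only keeps casual visitors out of the crawler account.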

Jaz

******************************
Jaz-Michael King
Online Services Manager
IPRO
http://ipro.org
******************************


>>> mjdewitt at alexcommgrp.com 02/24/03 01:45PM >>>
For some of our sites, many of the pages require registration and login in
order to view them.  I would like to open up those pages to spiders in
an effort to improve our web rankings.

I have some ideas, but I am curious what others have done to allow spiders
to crawl a page which normally would not be available to them (since they
would have been redirected to a login page for access).

TIA

Mike


--- Unsubscribe at http://nyphp.org/list/ ---






