[nycphp-talk] Looking for ideas on how to allow spiders to crawl authenticated pages
steven at sohh.com
Mon Feb 24 22:19:06 EST 2003
Here's what I did to improve my site in the rankings.
I made an invisible link on my opening page (top left, right below the WIRE
image), and that link points to:
This file contains all the links to my site that I want indexed by spiders.
And when I submit to search engines, I submit:
I'm usually in the top 5 when people search for Hip-Hop.
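The links page described above can be sketched as a plain PHP script that emits nothing but anchor tags, so a spider that follows the invisible link finds every URL the site wants indexed in one place. This is a hypothetical sketch: the `$links` array and its URLs are made up for illustration, and a real site might pull them from a database instead.

```php
<?php
// Hypothetical crawler-friendly index page: a bare list of every
// link the site wants spiders to follow. URLs here are examples only.
$links = [
    '/news.php'    => 'Latest Hip-Hop News',
    '/reviews.php' => 'Album Reviews',
    '/forums.php'  => 'Forums',
];

// Build the page as a string so it is easy to inspect or cache.
$html = "<html><head><title>Site Index</title></head><body>\n";
foreach ($links as $url => $title) {
    // htmlspecialchars() keeps odd characters in titles from breaking markup.
    $html .= '<a href="' . htmlspecialchars($url) . '">'
           . htmlspecialchars($title) . "</a><br>\n";
}
$html .= "</body></html>\n";

echo $html;
```

Keeping the page free of images and scripts means there is nothing to distract the spider from the links themselves.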
From: DeWitt, Michael [mailto:mjdewitt at alexcommgrp.com]
Sent: Monday, February 24, 2003 9:47 PM
To: NYPHP Talk
Subject: RE: [nycphp-talk] Looking for ideas on how to allow spiders to crawl authenticated pages
This was along the lines of what I was thinking, possibly using the remote
address in conjunction with the User-Agent to further limit access. I don't
know if this is feasible, since from what I have heard of Google, their bots
can come from many different IP addresses.
> Sure, you can check the User-Agent header to see if it matches a known spider,
> but your authentication is effectively reduced to someone sending this header,
> and if you can find User-Agent strings for known spiders, so can an attacker.
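Since a User-Agent string alone is trivially forged, one commonly suggested way to combine it with the remote address, as discussed above, is a reverse-DNS check with forward confirmation: Google's crawlers resolve to hosts under googlebot.com, and the hostname must resolve back to the same IP. This is a sketch under that assumption, and the function name is hypothetical:

```php
<?php
// Sketch: verify a claimed Googlebot by pairing the User-Agent check
// with reverse DNS on the remote address, then a forward lookup to
// confirm the hostname really maps back to that IP.
function is_verified_googlebot($ip, $user_agent)
{
    // Cheap check first: does the client even claim to be Googlebot?
    if (stripos($user_agent, 'Googlebot') === false) {
        return false;
    }
    // Reverse lookup: genuine Googlebot IPs resolve under googlebot.com.
    // gethostbyaddr() returns the unmodified IP (or false) on failure.
    $host = gethostbyaddr($ip);
    if ($host === false || $host === $ip) {
        return false;
    }
    if (!preg_match('/\.googlebot\.com$/i', $host)) {
        return false;
    }
    // Forward-confirm: without this, anyone who controls their own
    // reverse DNS could point it at googlebot.com and pass the check.
    return gethostbyname($host) === $ip;
}

// Usage sketch inside the authentication gate:
// if (is_verified_googlebot($_SERVER['REMOTE_ADDR'],
//                           $_SERVER['HTTP_USER_AGENT'])) {
//     // serve the page without requiring a login
// }
```

The forward-confirmation step is what makes this stronger than either the User-Agent or the IP check alone, at the cost of a DNS lookup per request (worth caching).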
--- Unsubscribe at http://nyphp.org/list/ ---