[nycphp-talk] Google/Adsense -> PHP

Jeff Siegel jsiegel1 at optonline.net
Fri Dec 19 15:18:35 EST 2003


I believe you can go here
http://tool.motoricerca.info/robots-checker.phtml to check the syntax.
I can't speak to the code...but if I were trying to do this I'd look
inside the web server logs to see what the googlebot looks like in terms
of the user agent.
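
Something along these lines at the top of a page would record what each
crawler sends, so you can see the exact Googlebot / Mediapartners-Google
strings yourself (rough sketch - the log path is only an example):

<?php
// Rough sketch: append each request's user agent to a file so you can
// see the exact crawler strings. /tmp/ua.log is only an example path.
$ua   = isset($_SERVER['HTTP_USER_AGENT']) ? $_SERVER['HTTP_USER_AGENT'] : 'unknown';
$line = date('Y-m-d H:i:s') . ' ' . $_SERVER['REMOTE_ADDR'] . ' ' . $ua . "\n";
error_log($line, 3, '/tmp/ua.log');  // message type 3 appends to the given file
?>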

Jeff

jon baer wrote:

> Does/would this work w/ robots effectively?
> 
> User-Agent: googlebot
> Disallow:
> 
> User-Agent: Mediapartners-Google*
> Disallow:
> 
> @ top of the page:
> 
> if (preg_match("/Google/i", $_SERVER['HTTP_USER_AGENT'])) {
>     header("Location: google.php");
>     exit();
> }
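
For what it's worth, a tighter version of that check (an untested sketch -
the host name is just the placeholder from your earlier note) would catch
only the AdSense crawler and send an absolute URL in the Location header:

<?php
// Untested sketch: match only the AdSense crawler rather than anything
// containing "Google", and redirect to an absolute URL (placeholder host).
if (preg_match('/Mediapartners-Google/i', $_SERVER['HTTP_USER_AGENT'])) {
    header('Location: http://www.website.com/google.php');
    exit();
}
?>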
> 
> - jon
> 
> ----- Original Message -----
> From: "Jeff Siegel" <jsiegel1 at optonline.net>
> To: "NYPHP Talk" <talk at lists.nyphp.org>
> Sent: Friday, December 19, 2003 5:23 AM
> Subject: Re: [nycphp-talk] Google/Adsense -> PHP
> 
> 
> 
>>I've always gone here (http://www.robotstxt.org/wc/robots.html) to deal
>>with robots questions.
>>
>>Jeff Siegel
>>
>>jon baer wrote:
>>
>>
>>>greetings ...
>>>
>>>sounds like i missed a great party :-\
>>>
>>>i have a quick question ... below is something i currently have on 2
>>>websites which now want to integrate Google AdSense and im trying to figure
>>>it out ... first both sites use Session URL rewrite ...
>>>
>>>-snip-
>>>Your website is using session ID's in the URL.
>>>If your web pages use session ID's, you may not receive targeted ads on
>>>those pages. Since this session ID - and therefore the URL - changes every
>>>time a different user views a page, the URL will not be in the index and
>>>will be queued to be crawled. Once the URL is crawled, however, the session
>>>will most likely have expired. This means that pages seen by the users are
>>>never in the index. You will need to remove the session ID's in order to
>>>display targeted ads.
>>>-snip-
>>>
>>>However their crawler lists a User-Agent as:
>>>Mediapartners-Google*
>>>
>>>So does anyone think I could create a single dynamic page/catalog (acting
>>>like static) for that user agent and then redirecting all other requests to
>>>the main site? (http://www.website.com/google.php) ...
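
As for the session IDs, one way to meet that requirement might be to simply
not start a session for that user agent - a sketch, assuming your sessions
are started explicitly in the page code:

<?php
// Sketch only: skip session_start() (and with it the PHPSESSID URL
// rewriting) when the AdSense crawler visits, so it sees stable URLs.
$ua = isset($_SERVER['HTTP_USER_AGENT']) ? $_SERVER['HTTP_USER_AGENT'] : '';

if (!preg_match('/Mediapartners-Google/i', $ua)) {
    session_start();  // normal visitors keep their session / URL rewrite
}
// ...rest of the page; links rendered for the crawler carry no session ID
?>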
>>>
>>>Offhand (I forgot the robots.txt syntax to direct all requests to that
>>>location, anyone know?)
>>>
>>>Will research more but wanted to see how others were handling the same
>>>situation ...
>>>
>>>- jon
>>>
>>>pgp key: http://www.jonbaer.net/jonbaer.asc
>>>fingerprint: F438 A47E C45E 8B27 F68C 1F9B 41DB DB8B 9A0C AF47
>>>
>>
>>--
>>Found on the Simpson's Website:
>>"Ooooooh, they have the internet on computers now!"
>>
> 
> 
> _______________________________________________
> talk mailing list
> talk at lists.nyphp.org
> http://lists.nyphp.org/mailman/listinfo/talk
> 

-- 
Found on the Simpson's Website:
"Ooooooh, they have the internet on computers now!"



