NYCPHP Meetup

NYPHP.org

[nycphp-talk] fsockopen / scrape / html mail

Lynn, Michael (DCS) MLynn at exchange.ml.com
Sat Jan 11 07:09:21 EST 2003


The Newsletter / Web Site I'm scraping is on an intranet, the recipients are outside the firewall on the internet.



-----Original Message-----
From: CHUN-YIU LAM [mailto:chun_lam at hotmail.com] 
Sent: Friday, January 10, 2003 11:53 PM
To: NYPHP Talk
Subject: Re: [nycphp-talk] fsockopen / scrape / html mail


Why not read it into a buffer, and then parse the href or src to add the 
server doc root URL in front?  What do you think?
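Roughly like this -- a sketch only, where $base stands in for the doc root of the server being scraped (it is a placeholder, not a real host):

```php
<?php
// Sketch: prefix root-relative href/src attributes with the server's
// doc root URL. $base is an assumed placeholder value.
function absolutize($html, $base)
{
    // Match href="/... or src='/... and splice the base URL in front.
    return preg_replace(
        '/(href|src)=(["\'])\//i',
        '${1}=${2}' . $base . '/',
        $html
    );
}

$base = 'http://intranet.example.com';
$page = absolutize('<img src="/img/logo.gif">', $base);
?>
```

Paths that are already absolute (or relative without a leading slash) would need another pass, but the idea is the same.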

Matthew

----Original Message Follows----
From: "Lynn, Michael" <MLynn at exchange.ml.com>
Reply-To: talk at nyphp.org
To: NYPHP Talk <talk at nyphp.org>
Subject: [nycphp-talk] fsockopen / scrape / html mail
Date: Fri, 10 Jan 2003 10:45:29 -0500

While the fsockopen / scrape issue is hot...

I have a Newsletter mailing script that uses fsockopen to do just as you 
say:
Scrape a url and mail the contents to a list of recipients.
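The fetch side is the usual fsockopen pattern -- a sketch, with placeholder host and path (not the real intranet server):

```php
<?php
// Sketch: raw HTTP/1.0 GET over fsockopen. Host/path are placeholders.
function fetch_page($host, $path)
{
    $fp = fsockopen($host, 80, $errno, $errstr, 30);
    if (!$fp) {
        return false;
    }
    // HTTP/1.0 with Connection: close so the server ends the stream.
    fputs($fp, "GET $path HTTP/1.0\r\nHost: $host\r\nConnection: close\r\n\r\n");
    $raw = '';
    while (!feof($fp)) {
        $raw .= fgets($fp, 4096);
    }
    fclose($fp);
    return $raw;
}

// The body starts after the first blank line in the response.
function strip_headers($raw)
{
    $parts = explode("\r\n\r\n", $raw, 2);
    return isset($parts[1]) ? $parts[1] : '';
}
?>
```

strip_headers() discards the response headers before the HTML is handed off to the mailer.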

The only problem is that if I mail this to a user who doesn't have access 
to the original URL, the resulting email will contain only text with errant 
references to the images.  This is because the email merely contains hrefs 
and src's back to the original server.

Is anyone aware of how to scrape and grab content and mail it so that the 
email is wholly contained and independent of the original URL/server?
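One approach (not from this thread -- a sketch of the MIME side, with placeholder content) is to fetch the image bytes too, embed them as parts of a multipart/related message, and rewrite each src to a cid: reference matching that part's Content-ID, so nothing points back at the origin server:

```php
<?php
// Sketch: embed one image in a multipart/related message.
// $html and $imageBytes are placeholders for fetched content.
$html       = '<p>Newsletter</p><img src="http://intranet/logo.gif">';
$imageBytes = "GIF89a...";   // placeholder, not a real image

$boundary = 'np-' . md5(uniqid(rand()));
$cid      = 'img1@nyphp.example';

// Point the first src at the embedded part instead of the server.
$html = preg_replace('/src=(["\']).*?\1/i', 'src="cid:' . $cid . '"', $html, 1);

$body  = "--$boundary\r\n";
$body .= "Content-Type: text/html; charset=iso-8859-1\r\n\r\n";
$body .= $html . "\r\n";
$body .= "--$boundary\r\n";
$body .= "Content-Type: image/gif\r\n";
$body .= "Content-Transfer-Encoding: base64\r\n";
$body .= "Content-ID: <$cid>\r\n\r\n";
$body .= chunk_split(base64_encode($imageBytes)) . "\r\n";
$body .= "--$boundary--\r\n";

$headers = "MIME-Version: 1.0\r\n"
         . "Content-Type: multipart/related; boundary=\"$boundary\"";

// mail($to, $subject, $body, $headers);
?>
```

A loop over every src in the scraped page, fetching each image and adding one part per image, would make the mail fully self-contained.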

aTdHvAaNnKcSe,
Mike














