
[nycphp-talk] What's a good way to handle this?

Rolan Yang rolan at omnistep.com
Mon May 10 23:53:23 EDT 2010


On 5/6/2010 11:36 AM, Anthony Papillion wrote:
> ...
> Now, the TweetFree Network Servers maintain a network block list. So 
> every time the Relay Servers send a post, the Network Server checks to 
> make sure the client that the relay is posting for isn't blocked from 
> the network. If it is, it says no and the relay tells the client that 
> it couldn't post its message.  The problem with this, of course, is 
> that you might have thousands of Relay Servers hitting the Network 
> Servers (as happened during the Iranian election) and each of those 
> requests has to be processed. That puts a bit of a load on the server 
> that I'd like to alleviate.
>
> So my thought is to maintain a blacklist of client keys on the Network 
> Servers and have the Relay Servers download this list every few 
> minutes. Then, clients could be blocked at the RELAY level instead of 
> at the Network level and less load would be put on the Network Servers 
> (of which there are only about 10).
>
> My problem is that I'm not sure how to protect this list. The list is 
> a simple text file that contains client keys. No identifying 
> information, but client keys nonetheless. If it's a .txt file then the 
> contents are viewable publicly, which *could* pose a security risk in 
> highly volatile environments. If I name it with the .php extension, 
> it's handled like a PHP file and, thus, the text in it can't be read.
> ...
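
First, an aside on the .php idea: renaming the file only helps if the
script refuses to print its contents. Anything outside <?php ?> tags
is echoed to the browser verbatim, so a bare list of keys saved as
.php is every bit as readable as the .txt version. If you do want to
publish the list over HTTP, a small wrapper that checks a pre-shared
secret and streams a file kept outside the document root would do it.
A minimal sketch (the "token" parameter, the secret, and the path are
placeholders, not anything TweetFree defines):

<?php
// blocked_keys.php -- serve the key list only to relays that present
// a pre-shared secret; everyone else gets a 403.
define('RELAY_SHARED_SECRET', 'change-me');

if (!isset($_GET['token']) || $_GET['token'] !== RELAY_SHARED_SECRET) {
    header('HTTP/1.0 403 Forbidden');
    exit;
}

header('Content-Type: text/plain');
readfile('/var/tweetfree/blocked_keys.txt'); // kept outside docroot
?>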


That said, I would suggest mirroring the infrastructure of the 
DNS-based Realtime Blackhole Lists (RBLs) that spam filters use. The 
technology is already proven, and you would just be repurposing it for 
your own application. Rather than have the relay servers download an 
entire list, have them query a pool of private DNS blacklist servers 
on demand, and perhaps cache the answers for a period of time.
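
For the relay side, PHP's stock DNS functions cover the lookup. Below
is a rough sketch, not TweetFree's actual protocol: the zone name
(bl.tweetfree.example) is made up, keys are sha1-hashed so they never
appear verbatim in DNS traffic, an A record means "blocked" and
NXDOMAIN means "clean", and the optional cache assumes the APC
extension is loaded:

<?php
// Hash the client key so raw keys never show up in DNS query logs;
// sha1 hex is 40 chars, safely under the 63-char DNS label limit.
function client_is_blocked($clientKey)
{
    $host = sha1($clientKey) . '.bl.tweetfree.example';

    // An A record means "listed"; NXDOMAIN means "not blocked".
    return checkdnsrr($host, 'A');
}

// Optional: cache answers so repeated posts from the same client
// within $ttl seconds skip the DNS round trip entirely.
function client_is_blocked_cached($clientKey, $ttl = 300)
{
    $cacheKey = 'bl_' . sha1($clientKey);
    $blocked  = apc_fetch($cacheKey, $hit);
    if (!$hit) {
        $blocked = client_is_blocked($clientKey);
        apc_store($cacheKey, $blocked, $ttl);
    }
    return $blocked;
}
?>

The zone's own TTLs already buy you resolver-level caching for free;
the APC layer just saves the relay the socket round trip on hot keys.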

~Rolan


