eventlet Design Patterns

1. Client Pattern

A canonical client-pattern application is a web crawler. The example below lists a few site URLs and fetches the contents of each page for later processing:

import eventlet
# eventlet.green.urllib2 is the cooperative (non-blocking) version of the
# Python 2 urllib2 module; on Python 3, use eventlet.green.urllib.request.
from eventlet.green import urllib2

urls = ["http://www.google.com/intl/en_ALL/images/logo.gif",
       "https://wiki.secondlife.com/w/images/secondlife.jpg",
       "http://us.i1.yimg.com/us.yimg.com/i/ww/beta/y3.gif"]

def fetch(url):
    return urllib2.urlopen(url).read()

pool = eventlet.GreenPool()
# imap runs fetch concurrently but yields results in the same order as urls
for body in pool.imap(fetch, urls):
    print("got body", len(body))

A slightly more complex crawler example: web crawler example

2. Server Pattern

Below is a simple echo server:

import eventlet

def handle(client):
    # Echo every byte back until the client disconnects
    while True:
        c = client.recv(1)
        if not c:
            break
        client.sendall(c)

server = eventlet.listen(('0.0.0.0', 6000))
# Cap concurrency at 10000 simultaneous client connections
pool = eventlet.GreenPool(10000)
while True:
    new_sock, address = server.accept()
    pool.spawn_n(handle, new_sock)

A more complete example: echo server example

3. Dispatch Pattern

The dispatch pattern describes a program that acts as a server and a client at the same time; proxies, aggregators, and job workers can all be built this way.

import eventlet
# import_patched imports feedparser with its blocking I/O replaced
# by eventlet's cooperative equivalents
feedparser = eventlet.import_patched('feedparser')

pool = eventlet.GreenPool()

def fetch_title(url):
    d = feedparser.parse(url)
    return d.feed.get('title', '')

def app(environ, start_response):
    pile = eventlet.GreenPile(pool)
    for url in environ['wsgi.input'].readlines():
        pile.spawn(fetch_title, url)
    titles = '\n'.join(pile)
    start_response('200 OK', [('Content-type', 'text/plain')])
    return [titles]

A more complete example: Feed Scraper

References

Original article: https://www.cnblogs.com/forilen/p/5168049.html