paco¶
Small utility library for generic, coroutine-driven, asynchronous programming in Python 3.4+.
Built on top of asyncio, paco provides the pieces missing from the Python stdlib for writing asynchronous cooperative multitasking in a pleasant way, plus a set of convenient functional helpers.
Note: paco is still in beta.
Features¶
- Simple and idiomatic API that extends the Python stdlib with async coroutine capabilities.
- Built-in configurable control-flow concurrency support.
- Useful iterables, decorators and functors.
- Provides coroutine-ready compose, throttle, partial, until, race and other functional helpers.
- Asynchronous coroutine ports of Python built-in functions: filter, map, dropwhile, filterfalse, reduce... (see the short sketch after this list).
- Control-flow and higher-order function helpers for coroutines.
- Better asyncio.gather() and asyncio.wait() implementations with optional concurrency control and ordered results.
- Good interoperability with asyncio and Python stdlib functions.
- Partial port of Python stdlib higher-order functions and iterables to the async coroutine world.
- Works with both async/await and yield from coroutine syntax.
- Small and dependency-free.
- Compatible with Python 3.4+.
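For instance, the filter/reduce ports mentioned above compose naturally. A minimal sketch, assuming the paco.filter, paco.reduce and paco.run helpers listed in the API section below (keyword arguments such as initializer may differ slightly between versions):

import paco

async def is_even(n):
    # A trivial predicate; imagine it awaiting a remote check instead
    return n % 2 == 0

async def add(acc, n):
    return acc + n

async def sum_even(numbers):
    evens = await paco.filter(is_even, numbers)
    return await paco.reduce(add, evens, initializer=0)

# paco.run is a shortcut around loop.run_until_complete()
print(paco.run(sum_even([1, 2, 3, 4, 5, 6])))  # -> 12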
Installation¶
Using the pip package manager:
pip install paco
Or install the latest sources from GitHub:
pip install -e git+git://github.com/h2non/paco.git#egg=paco
API¶
- paco.run
- paco.partial
- paco.apply
- paco.constant
- paco.throttle
- paco.compose
- paco.wraps
- paco.once
- paco.times
- paco.defer
- paco.timeout
- paco.wait
- paco.gather
- paco.each
- paco.series
- paco.map
- paco.filter
- paco.reduce
- paco.some
- paco.every
- paco.filterfalse
- paco.dropwhile
- paco.repeat
- paco.until
- paco.whilst
- paco.race
- paco.ConcurrentExecutor
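As a quick taste of a couple of the helpers listed above, here is a hedged sketch combining coroutine-ready partial application with gather-style concurrency control (the limit keyword argument is an assumption based on the feature list and may differ per version):

import paco

async def mul(x, y):
    return x * y

# Coroutine-ready partial application: pre-bind the first argument
double = paco.partial(mul, 2)

async def main():
    # gather with an optional concurrency limit; results are expected
    # in order, per the feature list above
    return await paco.gather(double(1), double(2), double(3), limit=2)

print(paco.run(main()))  # -> [2, 4, 6]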
Examples¶
Execute multiple HTTP requests concurrently.
import paco
import aiohttp
import asyncio

async def fetch(url):
    async with aiohttp.ClientSession() as session:
        async with session.get(url) as resp:
            return resp

async def fetch_urls():
    urls = [
        'https://www.google.com',
        'https://www.yahoo.com',
        'https://www.bing.com',
        'https://www.baidu.com',
        'https://duckduckgo.com',
    ]

    # Map the fetch coroutine over the URLs with a concurrency limit of 3
    responses = await paco.map(fetch, urls, limit=3)

    for res in responses:
        print('Status:', res.status)

# Run the coroutine in the event loop
loop = asyncio.get_event_loop()
loop.run_until_complete(fetch_urls())
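The same flow can also be expressed with paco.gather, passing coroutine objects directly instead of a function plus an iterable, and driven by paco.run instead of the explicit event-loop boilerplate. A small sketch reusing the fetch coroutine from the example above (the limit keyword is assumed to behave as in paco.map):

async def fetch_two():
    responses = await paco.gather(
        fetch('https://www.google.com'),
        fetch('https://duckduckgo.com'),
        limit=2)
    for res in responses:
        print('Status:', res.status)

# Shortcut for loop.run_until_complete()
paco.run(fetch_two())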
License¶
MIT - Tomas Aparicio