Asynchronous task queue for consuming asynchronous IO tasks, green IO tasks, blocking IO tasks and long-running CPU-bound tasks.
| Badges: | |
|---|---|
| Documentation: | https://github.com/quantmind/pulsar-queue/blob/master/docs/index.md |
| Downloads: | http://pypi.python.org/pypi/pulsar-queue |
| Source: | https://github.com/quantmind/pulsar-queue |
| Mailing list: | Google user group |
| Design by: | Quantmind and Luca Sbardella |
| Platforms: | Linux, OSX, Windows. Python 3.5 and above |
| Keywords: | server, asynchronous, concurrency, actor, process, queue, tasks, redis |
1 - Create a script which runs your application:
```
vim manage.py
```

```python
from pq.api import TaskApp

task_paths = ['sampletasks.*']


def app():
    return TaskApp(config=__file__)


if __name__ == '__main__':
    app().start()
```

2 - Create the modules where Jobs are implemented
It can be a directory containing several submodules.
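For example, a package layout matching the `task_paths = ['sampletasks.*']` pattern configured in `manage.py` could look like this (the `__init__.py` file makes the directory importable as a package):

```
sampletasks/
    __init__.py
    mytasks.py
```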
```
mkdir sampletasks
cd sampletasks
vim mytasks.py
```
```python
import asyncio
import time

from pq import api


class Addition(api.Job):

    def __call__(self, a=0, b=0):
        return a + b


class Asynchronous(api.Job):
    concurrency = api.ASYNC_IO

    async def __call__(self, lag=1):
        start = time.time()
        await asyncio.sleep(lag)
        return time.time() - start
```

3 - Run the server
Run the server with two task consumers (pulsar actors).
NOTE: Make sure you have a Redis server up and running before you start the queue.
```
python manage.py -w 2
```
4 - Queue tasks
Launch a Python shell and play with the API:
```python
>>> from manage import app
>>> api = app().backend
>>> task = api.queue_task('addition', a=4, b=6)
>>> task
<TaskFuture pending ID=i26ad5c14c5bb422e87b0f7ccbce5ba06>
>>> task = task.wait()
>>> task
task.addition<i24ab99ddf2744902a375e039790dcbc4><SUCCESS>
>>> task.result
10
>>> task.status_string
'SUCCESS'
```

This software is licensed under the BSD 3-clause License. See the LICENSE file in the top distribution directory for the full license text.

Logo designed by Ralf Holzemer, Creative Commons license.