python - Stream results in Celery


I'm trying to use Celery to schedule and run tasks on a fleet of servers. Each task is long-running (a few hours), and involves using subprocess to call a program with given inputs. The program produces a lot of output on both stdout and stderr.

Is there a way to show the output produced by the program to the client in near real time? Can I stream the output, so the client can watch the output spewed by the task running on the server without logging into the server?

You did not specify many requirements and constraints, so I'm going to assume you have a Redis instance somewhere.

What you can do is read the output of the other process line by line and publish it through Redis.

Here's an example where you can echo data into the file /tmp/foo for testing:

import shlex
import subprocess

import redis

redis_instance = redis.Redis()
p = subprocess.Popen(shlex.split("tail -f /tmp/foo"), stdout=subprocess.PIPE)

while True:
    line = p.stdout.readline()
    if line:
        # publish each line as it appears
        redis_instance.publish('process log', line)
    else:
        break
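Since the question mentions output on both stdout and stderr, here is a minimal variation of the loop above that merges stderr into stdout, so one reading loop sees both streams (merging them is an assumption; keep them separate if you need to tell them apart):

import shlex
import subprocess

import redis

redis_instance = redis.Redis()
# stderr=subprocess.STDOUT folds stderr into the stdout pipe (an assumption,
# not part of the original answer)
p = subprocess.Popen(
    shlex.split("tail -f /tmp/foo"),
    stdout=subprocess.PIPE,
    stderr=subprocess.STDOUT,
)

# iterate until readline() returns an empty bytes object (process exited)
for line in iter(p.stdout.readline, b''):
    redis_instance.publish('process log', line)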

In a separate process:

import redis

redis_instance = redis.Redis()
pubsub = redis_instance.pubsub()
pubsub.subscribe('process log')

# listen() blocks and yields messages as they are published
for message in pubsub.listen():
    print(message)

# or use websockets to communicate with the browser

If you want the reading process to end, you can e.g. publish a "quit" message after the Celery task is done.
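A minimal sketch of that convention (the "quit" sentinel is just an agreed-upon string, nothing special to Redis; with default redis-py settings the subscriber receives it as bytes):

# publisher side, after the task finishes
redis_instance.publish('process log', 'quit')

# subscriber side
for message in pubsub.listen():
    # skip subscribe confirmations; stop on the sentinel
    if message['type'] == 'message':
        if message['data'] == b'quit':
            break
        print(message['data'])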

You can use different channels (the string given to subscribe) to separate the output of different processes.
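For example, you could derive a per-task channel name from the Celery task id (task_id below is hypothetical; inside a bound Celery task it would be self.request.id):

# publisher: one channel per task
channel = 'process log:%s' % task_id  # task_id is an assumption
redis_instance.publish(channel, line)

# subscriber: listen only to that task's output
pubsub.subscribe('process log:%s' % task_id)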

You can also store the log output in Redis, if you want to:

redis_instance.rpush('process log', message) 

and later retrieve it in full.
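A minimal sketch of retrieving the stored log afterwards (LRANGE with 0 and -1 returns the whole list):

# fetch every line pushed with rpush, in order
log_lines = redis_instance.lrange('process log', 0, -1)
for line in log_lines:
    print(line)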

