python - Stream results in Celery


I'm trying to use Celery to schedule and run tasks on a fleet of servers. Each task is long-running (a few hours) and involves using subprocess to call a program with given inputs. The program produces a lot of output on both stdout and stderr.

Is there a way to show the output produced by the program to the client in near real time? Can I stream the output so the client can watch the output spewed by the task running on the server, without logging into the server?

You did not specify many requirements or constraints, so I'm going to assume you have a Redis instance somewhere.

What you can do is read the output of the other process line by line and publish it through Redis.

Here's an example; you can echo data into the file /tmp/foo for testing:

import redis
import shlex
import subprocess

redis_instance = redis.Redis()
p = subprocess.Popen(shlex.split("tail -f /tmp/foo"), stdout=subprocess.PIPE)

while True:
    line = p.stdout.readline()
    if line:
        redis_instance.publish('process log', line)
    else:
        break
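Since the question mentions that the program writes to both stdout and stderr, note that the loop above only sees stdout. A minimal sketch of merging the two streams with stderr=subprocess.STDOUT, so one readline() loop catches everything (the sh command here is just a stand-in for the real program):

```python
import shlex
import subprocess

# Merge stderr into stdout so a single pipe carries both streams.
p = subprocess.Popen(
    shlex.split("sh -c 'echo out; echo err 1>&2'"),
    stdout=subprocess.PIPE,
    stderr=subprocess.STDOUT,
)
output = p.stdout.read()  # contains both the stdout and the stderr line
p.wait()
```

If you need to keep the streams separate instead, you would have to read them on separate threads (or with select) to avoid deadlocks.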

In a separate process:

import redis

redis_instance = redis.Redis()
pubsub = redis_instance.pubsub()
pubsub.subscribe('process log')

for message in pubsub.listen():
    print(message)

# or use websockets to communicate with a browser

If you want the listening process to end, you can e.g. send a "quit" message after the Celery task is done.
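A minimal sketch of that shutdown convention, assuming a b"quit" sentinel (the name and value are arbitrary; any marker that cannot appear in real output works). redis-py delivers pub/sub messages as dicts with "type" and "data" keys, so the consumer can check each one:

```python
QUIT_SENTINEL = b"quit"  # hypothetical marker agreed between publisher and consumer

def should_stop(message):
    # Only 'message'-type entries carry published data; subscribe
    # confirmations and other control entries are ignored.
    return message.get("type") == "message" and message.get("data") == QUIT_SENTINEL

# Consumer side:
# for message in pubsub.listen():
#     if should_stop(message):
#         break
#     print(message["data"])
#
# Publisher side, after the subprocess exits:
# redis_instance.publish('process log', QUIT_SENTINEL)
```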

You can use different channels (the string passed to subscribe) to separate the output of different processes.
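One way to do that is to derive the channel name from the task id (the naming scheme here is an assumption, not anything Celery or Redis mandates). A client can then follow a single task, or watch all of them with a pattern subscription:

```python
def log_channel(task_id):
    # One pub/sub channel per task, e.g. 'process log:3f2a...'.
    return "process log:%s" % task_id

# Publisher side: redis_instance.publish(log_channel(task_id), line)
# Subscriber side, one task:  pubsub.subscribe(log_channel(task_id))
# Subscriber side, all tasks: pubsub.psubscribe('process log:*')
```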

You can also store the log output in Redis, if you want to:

redis_instance.rpush('process log', message) 

and later retrieve it in full.
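Retrieval is an LRANGE over the whole list; since RPUSH appends in order, joining the entries reconstructs the log. A minimal sketch, using a tiny in-memory stand-in for the Redis client so it runs without a server (in real code you would pass redis.Redis() instead):

```python
def full_log(redis_client, key="process log"):
    # LRANGE key 0 -1 returns every stored entry, in RPUSH order.
    return b"".join(redis_client.lrange(key, 0, -1))

class _FakeRedis:
    # In-memory stand-in for demonstration only; mirrors rpush/lrange.
    def __init__(self):
        self.lists = {}

    def rpush(self, key, value):
        self.lists.setdefault(key, []).append(value)

    def lrange(self, key, start, end):
        items = self.lists.get(key, [])
        stop = None if end == -1 else end + 1
        return items[start:stop]

fake = _FakeRedis()
fake.rpush("process log", b"line 1\n")
fake.rpush("process log", b"line 2\n")
```

Keep in mind the list grows without bound for long-running tasks; LTRIM or an expiry on the key can cap it.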

