
Capturing stdout output of jobs #886

Open
anandsaha opened this issue Oct 4, 2017 · 9 comments · May be fixed by #1515

anandsaha commented Oct 4, 2017

Hi,

I would like to somehow capture the data dumped by my job onto stdout. Is there a way to access it from the Job object?

I am spawning new processes using Popen in my job function.

Thanks,
Anand

theodesp (Contributor) commented Oct 5, 2017

You can. Take a look at the custom worker classes here:
http://python-rq.org/docs/workers/#custom-worker-classes

You can implement your own output-capturing worker, for example by overriding the execute_job method here:
/rq/worker.py@master#L585-L594

This is trickier, though, because you have to capture the output of a forked process.
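The core capture mechanism a custom worker could build on can be sketched with the standard library alone. This is only a minimal, single-process sketch (the helper name run_with_captured_stdout is made up for illustration); it does not handle RQ's forked work horse, which is the tricky part mentioned above:

```python
import contextlib
import io

def run_with_captured_stdout(func, *args, **kwargs):
    # Redirect sys.stdout into an in-memory buffer while the callable runs.
    buf = io.StringIO()
    with contextlib.redirect_stdout(buf):
        result = func(*args, **kwargs)
    return result, buf.getvalue()

def my_job():
    print("processing...")
    return 42

result, output = run_with_captured_stdout(my_job)
# result is the normal return value; output holds what the job printed.
```

A custom worker class could wrap the job call like this and then store the captured string somewhere the Job object can reach, e.g. job.meta.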

weatherfrog commented

@anandsaha Have you found a solution for this?
I think it would be great if this were standard behavior: the output of a job is automatically attached to Job objects, similar to the return value in job.result.
Would this be hard to implement?

jetkov commented Oct 2, 2018

I agree this would be a great feature. It would streamline the process of integrating RQ into backend applications using existing shell tools, for example.

robsalasco commented

+1 it would be awesome to have this!

selwin (Collaborator) commented Sep 8, 2019

Can't you use logging to redirect your output to stdout?
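A stdlib-only sketch of that suggestion: have the job write through the logging module and attach an in-memory handler, so the same lines can later be stored or returned (the function name job_with_logged_output is hypothetical, not part of RQ):

```python
import io
import logging

def job_with_logged_output():
    # Collect this job's log records in an in-memory buffer.
    buf = io.StringIO()
    logger = logging.getLogger("my_job")
    logger.setLevel(logging.INFO)
    handler = logging.StreamHandler(buf)
    handler.setFormatter(logging.Formatter("%(levelname)s %(message)s"))
    logger.addHandler(handler)
    try:
        logger.info("step 1 done")
        logger.info("step 2 done")
    finally:
        logger.removeHandler(handler)
    # Return the captured log text; a real job could stash it in job.meta.
    return buf.getvalue()
```

The same logger could just as well keep a StreamHandler on stdout, so the worker's console output is unchanged.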

pampanelson commented

+1

selwin (Collaborator) commented Jan 22, 2020

Yes, I think capturing job outputs in “job.output” is a good idea. I’d welcome a PR for this.

pampanelson commented

I solved this with a workaround:

1. Manually set the job ID and initialize the output in job.meta when starting the job:

job = q.enqueue(render, job_id=jobID)
job.meta['output'] = 'init'
job.save_meta()

2. Fetch the job by ID and update job.meta['output'] while the subprocess runs:

proc = subprocess.Popen(
    [cmd],
    stdout=subprocess.PIPE,
    stderr=subprocess.STDOUT,
    universal_newlines=True)

while proc.poll() is None:
    line = proc.stdout.readline()
    if line:
        # Process output here
        job = Job.fetch(jobID, connection=redis)
        job.meta['output'] = line
        job.save_meta()

3. Define a function that reads job.meta['output'] and call it.
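The streaming part of step 2 can be demonstrated without Redis. In this sketch a plain dict (fake_meta, a stand-in I introduced) plays the role of job.meta; in real RQ code you would fetch the Job and call job.save_meta() after each update, as in the snippet above:

```python
import subprocess
import sys

# Stand-in for job.meta; in RQ this would be the fetched Job's meta dict.
fake_meta = {"output": "init"}

# A small child process that prints two lines to stdout.
proc = subprocess.Popen(
    [sys.executable, "-c", "print('line 1'); print('line 2')"],
    stdout=subprocess.PIPE,
    stderr=subprocess.STDOUT,
    universal_newlines=True,
)

lines = []
for line in proc.stdout:
    lines.append(line.rstrip("\n"))
    # Keep the cumulative output, not just the latest line,
    # so readers of job.meta['output'] see the full log so far.
    fake_meta["output"] = "\n".join(lines)
proc.wait()
```

Storing the cumulative output rather than only the last line (as the original snippet does) avoids losing earlier lines between polls.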

brettnolan commented

> I solved this by a dummy way: […]

Using this method, are you still able to watch the output of the currently executing job by issuing 'rq worker'?

@rpkak rpkak linked a pull request Jul 14, 2021 that will close this issue
@ccrvlh ccrvlh added the feature label Jan 29, 2023