Replies: 1 comment
-
Also getting this problem. Not sure what the ramifications are of running celery like this. It seems to work OK, but I am worried about the long-term viability of this solution.
-
Hello, the worker command works fine on its own.
But with the -D (detach) option it creates a process that consumes 100% CPU and does no work at all. I don't even see the worker in flower (without -D the worker is visible in flower).
What I use:
python 3.10
celery 5.2.7
django-celery 3.1.17
redis 4.3.3
Another strange thing: I use Docker, and it works fine on the production server but not on my local machine.
UPD: In case anyone wonders why I need the -D option at all: in our project we start 5 workers via the "celery multi start" command, and under the hood that command executes 5 commands of the form "celery -A ... worker ... -D". So it relies on -D and therefore doesn't work locally.
My current workaround is to run these workers from a bash script like this:
celery -A app worker -n worker1@app &
celery -A app worker -n worker2@app &
celery -A app worker -n worker3@app &
celery -A app worker -n worker4@app &
celery -A app worker -n notification_worker@app &
Note the & at the end of each command, which sends each worker to the background.
So, can someone help me? Please let me know and I will upload any additional information that would help resolve the problem.
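
A slightly more robust sketch of this workaround (the app module and worker names are taken from the commands above; the script itself and everything else in it are my own assumptions, not anything celery ships) keeps the worker PIDs, forwards Ctrl-C / SIGTERM to them, and waits for all of them, so the script behaves more like a proper foreground supervisor:

```shell
#!/usr/bin/env bash
# start_workers.sh -- hypothetical wrapper around the workaround above.
# Starts each worker in the background, forwards INT/TERM to them,
# and blocks until every worker has exited.
set -euo pipefail

pids=()

for name in worker1 worker2 worker3 worker4 notification_worker; do
    celery -A app worker -n "${name}@app" &
    pids+=("$!")   # remember the PID of the backgrounded worker
done

# On Ctrl-C or `kill`, pass the signal on to every worker we started
# so they can shut down gracefully instead of being orphaned.
trap 'kill -TERM "${pids[@]}" 2>/dev/null || true' INT TERM

# wait returns once all listed PIDs have exited.
wait "${pids[@]}"
```

This keeps the workers tied to one foreground process, which is often easier to manage locally (and under Docker, where PID 1 should stay in the foreground) than five detached daemons.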