Celery: decrease the number of processes

Is there any way to limit the number of worker processes in Celery? I have a small server and Celery always creates 10 processes on a 1-core processor. I want to limit this to 3 processes.


I tried setting concurrency to 1 and max_tasks_per_child to 1 in my settings.py file and then ran 3 tasks at the same time. It spawns 1 process as my user and the other 2 as celery. It should run just 1 process and wait for it to finish before starting the next one.

I am using django-celery.
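For reference, the settings I tried look like this in settings.py (a sketch using django-celery's old-style setting names):

```python
# settings.py -- old-style django-celery settings (a sketch)
CELERYD_CONCURRENCY = 1          # pool processes per worker
CELERYD_MAX_TASKS_PER_CHILD = 1  # recycle each child after one task
```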


I was setting the concurrency by writing CELERYD_CONCURRENCY = 1 in my settings.py file. But when I watched the celery log with "tail -f /var/log/celery/w1.log", I saw concurrency reported as 8, which told me that settings.py was not changing it. To fix this I added the following lines to the "/etc/default/celeryd" file:

# Extra arguments to celeryd
CELERYD_OPTS="--concurrency=1"

Now the second task in the queue waits until the first is finished.


The celery worker --concurrency option lets you specify the number of child processes that consume the queue.
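As a rough analogy (this is multiprocessing.Pool, not Celery's actual prefork pool): --concurrency=3 behaves like a pool of 3 worker processes, so at most 3 tasks execute at once and the rest wait in the queue:

```python
from multiprocessing import Pool

def handle_task(n):
    # stand-in for a Celery task body
    return n * n

def run_worker_pool(concurrency=3):
    # at most `concurrency` tasks run at once; the remaining
    # items of the iterable wait, just like messages in the queue
    with Pool(processes=concurrency) as pool:
        return pool.map(handle_task, range(6))

if __name__ == "__main__":
    print(run_worker_pool())  # [0, 1, 4, 9, 16, 25]
```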

I have this in my celeryd config file:

CELERYD_NODES=2
CELERYD_OPTS="--concurrency=1"

which results in:

$ ps -ef | grep "celery" | grep -v "grep"
www-data  1783     1  0 17:50 ?        00:00:46 /usr/bin/python /opt/webapps/repo/manage.py celeryd --loglevel=INFO -n celery1.xxx-31-39-06-74-75 --logfile=/var/log/celery/1.log --pidfile=/var/run/celery/1.pid 
www-data  1791  1783  0 17:50 ?        00:00:01 /usr/bin/python /opt/webapps/repo/manage.py celeryd --loglevel=INFO -n celery1.xxx-31-39-06-74-75 --logfile=/var/log/celery/1.log --pidfile=/var/run/celery/1.pid
www-data  1802     1  0 17:50 ?        00:00:52 /usr/bin/python /opt/webapps/repo/manage.py celeryd --loglevel=INFO -n celery2.xxx-31-39-06-74-75 --logfile=/var/log/celery/2.log --pidfile=/var/run/celery/2.pid 
www-data  1858  1802  0 17:50 ?        00:00:01 /usr/bin/python /opt/webapps/repo/manage.py celeryd --loglevel=INFO -n celery2.xxx-31-39-06-74-75 --logfile=/var/log/celery/2.log --pidfile=/var/run/celery/2.pid

There are FOUR processes, not two, but only two worker nodes. Each node consists of a parent (supervisor) process plus one pool child process. So presumably if you set CELERYD_NODES to 3, you will get 3 workers but 6 processes.
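The parent-plus-children layout can be sketched with plain multiprocessing (an analogy, not Celery internals): a pool of N processes is the parent plus N children, which is why each node with concurrency 1 shows up as 2 processes in ps:

```python
from multiprocessing import Pool, active_children

def child_count(pool_size):
    # open a pool and count the live child processes
    # as seen from the parent process
    with Pool(processes=pool_size) as pool:
        return len(active_children())

if __name__ == "__main__":
    # 1 pool child per node, plus the parent itself -> 2 processes in ps
    print(child_count(1))  # 1
```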

You could also try the --autoscale option in the worker params, e.g. --autoscale=3,1, which scales the pool between 1 and 3 processes depending on load.
