Recently I faced the same problem and solved it by adding the -O fair flag to the Celery worker command.

My whole command is as follows:

# "-O fair" is the key component for simultaneous task execution by prefork workers
# celery_app is the module in my program that contains the Celery instance
# cel_app_worker is the name given to this Celery worker
# -P prefork is not necessary since it is the default pool, but I decided to keep it explicit
celery -A celery_app worker --loglevel=INFO --concurrency=8 -O fair -P prefork -n cel_app_worker

Please try it and let me know if it worked for you.

I run the Celery app in Docker; here is my Dockerfile:

FROM python:3.7-alpine

WORKDIR /usr/src/app

RUN apk add --no-cache tzdata

ENV TZ=Europe/Moscow

RUN ln -snf /usr/share/zoneinfo/$TZ /etc/localtime && echo $TZ > /etc/timezone

COPY requirements.txt ./
RUN pip install --no-cache-dir -r requirements.txt

COPY . .

# Create a group and user
RUN addgroup -S appgroup && adduser -S celery_user -G appgroup

# Tell Docker that all subsequent commands should run as celery_user
USER celery_user

# !! "-O fair" is the key component for simultaneous task execution on the worker !!
CMD celery -A celery_app worker --loglevel=INFO --concurrency=8 -O fair -P prefork -n cel_app_worker
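For completeness, a sketch of how this image could be wired to a broker with docker-compose. The redis service, image tags, and the CELERY_BROKER_URL environment variable (which Celery reads to override the broker setting) are my assumptions; adjust them to your own broker.

```yaml
# docker-compose.yml -- assumed layout, not from the original setup
version: "3.8"
services:
  redis:
    image: redis:6-alpine          # assumed broker; swap for your own
  worker:
    build: .                       # builds the Dockerfile above
    depends_on:
      - redis
    environment:
      # Celery picks this env var up as the broker URL
      - CELERY_BROKER_URL=redis://redis:6379/0
```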
