If you look here you will find the following:
django-celery uses MySQL to keep track of all tasks/results; RabbitMQ is essentially used as a communication bus.
What is really happening is that you are trying to fetch the AsyncResult of the task while it is still running (the task invoked an HTTP request to your server and, since it hasn't returned yet, the worker's db session is still active and the result row is still locked). When Django tries to read the task result (its state and the actual return value of the run method), it finds the row locked and issues you a warning.
There are a few ways to go about resolving this:
Set up another celery task to reap the result and chain it to your processing task. That way the original task will finish and release the lock on the db, the new one will acquire it, and you can read the result in Django and do whatever you need with it. Look up the celery docs on chains.
Don’t bother with the result backend at all, and simply do a POST to Django with the full processing result attached as a payload, rather than trying to fetch it via the db.
Override on_success in your task class and POST your notification request to Django from there, at which point the lock on the db table should have been released.
Notice that you need to store the whole processing result (no matter how big it is) in the return value of the run method (possibly pickled). You didn’t mention how big the result can be, so it might make sense to just go with scenario #2 above (which is what I would do); alternatively I would go with #3. Also don’t forget to handle the on_failure method in your task as well.