[Fixed] Get Celery to Use Django Test DB

5👍

One way to guarantee that the Celery worker is configured to use the same test database as the tests is to spawn the Celery worker inside the test itself. This can be done by using start_worker in the setUpClass method of the TestCase:

from celery.contrib.testing.worker import start_worker
from myproject.celery import app

@classmethod
def setUpClass(cls):
    super().setUpClass()
    cls.celery_worker = start_worker(app)
    cls.celery_worker.__enter__()

Because start_worker returns a context manager, the worker should also be shut down again in tearDownClass (see the sketch below).

You also have to use a SimpleTestCase from Django or an APISimpleTestCase from Django REST Framework rather than a plain TestCase, so that the Celery thread and the test thread can see the changes each makes to the test database. The changes are still destroyed at the end of testing, but they are not destroyed between tests unless you destroy them manually in the tearDown method.
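Putting both points together, a rough sketch of the full TestCase might look like this (the databases attribute and the perform_ping_check argument are additions that are often needed in practice, not part of the original answer; the app is assumed to live in myproject.celery):

from celery.contrib.testing.worker import start_worker
from django.test import SimpleTestCase

from myproject.celery import app

class CeleryTaskTests(SimpleTestCase):
    databases = '__all__'  # SimpleTestCase blocks DB queries by default (Django >= 2.2)

    @classmethod
    def setUpClass(cls):
        super().setUpClass()
        # Start an in-process worker thread; it inherits the test database settings.
        cls.celery_worker = start_worker(app, perform_ping_check=False)
        cls.celery_worker.__enter__()

    @classmethod
    def tearDownClass(cls):
        super().tearDownClass()
        # Shut the worker thread down again so later test classes start clean.
        cls.celery_worker.__exit__(None, None, None)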

4👍

I battled with a similar problem. The following solution is not clean but it works.

  1. Create a separate Django settings file that inherits from your main
    one. Let’s call it integration_testing.py.
  2. Your file should look like this:
    from .settings import *

    DATABASES = {
        'default': {
            'ENGINE': '<your engine>',
            'NAME': 'test_<your database name>',
            'USER': '<your db user>',
            'PASSWORD': '<your db password>',
            'HOST': '<your hostname>',
            'PORT': '<your port number>',
        }
    }

  3. Create a shell script which will set your environment and start up
    the celery worker:

    #!/usr/bin/env bash

    export DJANGO_SETTINGS_MODULE="YOURPROJECTNAME.settings.integration_testing"

    celery -A YOURPROJECTNAME purge -f && celery -A YOURPROJECTNAME worker -l debug

  4. The above works if you configured Celery in this manner:

    app = Celery('YOURPROJECTNAME')

    app.config_from_object('django.conf:settings', namespace='CELERY')

  5. Run the script in the background.

  6. Make all tests that involve Celery inherit from TransactionTestCase (or APITransactionTestCase in django-rest-framework), as in the sketch after this list.

  7. Run your unit tests that use Celery. Any Celery tasks will now use your test db. And hope for the best.
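For step 6, a test against the externally running worker might look roughly like this (a sketch only: myapp.tasks.add is a hypothetical task used for illustration, a result backend is assumed so .get() can return, and the worker from step 3 must already be running with the integration_testing settings):

from django.test import TransactionTestCase

from myapp.tasks import add  # hypothetical task used for illustration

class AddTaskTest(TransactionTestCase):
    def test_add_runs_on_the_external_worker(self):
        # .delay() sends the task to the worker started by the shell script;
        # .get() blocks until that worker has stored a result.
        result = add.delay(2, 3)
        self.assertEqual(result.get(timeout=10), 5)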

2👍

There’s no obvious problem with your code. You don’t need to run a Celery worker: with these settings, Celery runs the task synchronously and won’t actually send anything to your message queue.
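For reference, the eager behaviour described above usually comes from settings like the following (shown with the CELERY_ prefix used when Celery is configured from Django settings with namespace='CELERY'; these names are an illustration, not taken from the question):

# With these settings, .delay() and .apply_async() execute the task
# immediately in the calling process, so no broker or worker is involved
# and task exceptions propagate straight into the test.
CELERY_TASK_ALWAYS_EAGER = True
CELERY_TASK_EAGER_PROPAGATES = True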

You can’t easily run tests against live Celery workers anyway, because each test is wrapped in a transaction: even if the worker were connecting to the same database (which it isn’t), the transaction would always be rolled back by the test and never be visible to the worker.

If you really need to do this, look at this Stack Overflow answer.

👤joshua

0👍

I have found adding the following to conftest.py works:

import pytest
from django.conf import settings

...

@pytest.fixture(scope="session")
def celery_worker_parameters(django_db_setup):
    # Fail fast if the session is not pointed at the test database.
    assert settings.DATABASES["default"]["NAME"].startswith("test_")
    return {}

The trick is to request the django_db_setup fixture here, so that the database setup is also enabled for the worker.

This was tested with tests marked with:

@pytest.mark.django_db(transaction=True)
@pytest.mark.celery()
def test_something(celery_worker):
    ...
👤Udi

0👍

"Question is, how can I get celery to use the same temporary db as the
rest of my tests?"

I solved it by running my tests with Docker Compose, making the database name configurable via an environment variable, and setting the database name to test_db (the normal db name is ‘db’).

But I don’t use sqlite…
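As a rough sketch of that idea (the engine and the environment variable names here are illustrative, not taken from the answer):

import os

# Both the test run and the Celery worker container read DB_NAME from the
# environment, so pointing DB_NAME at test_db makes them share one database.
DATABASES = {
    'default': {
        'ENGINE': 'django.db.backends.postgresql',  # assumed engine (not sqlite)
        'NAME': os.environ.get('DB_NAME', 'db'),    # docker compose sets DB_NAME=test_db for tests
        'USER': os.environ.get('DB_USER', 'postgres'),
        'PASSWORD': os.environ.get('DB_PASSWORD', ''),
        'HOST': os.environ.get('DB_HOST', 'db'),
        'PORT': os.environ.get('DB_PORT', '5432'),
    }
}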

If you need a solution with sqlite: Make Django test case database visible to Celery
