[Solved]-Django collectstatic from Heroku pushes to S3 every time


Try setting DISABLE_COLLECTSTATIC=1 as an environment setting for your app – that should disable it from running on every push.

See this article for details – https://devcenter.heroku.com/articles/django-assets :

> Sometimes, you may not want Heroku to run collectstatic on your behalf.
> You can disable collectstatic by enabling user-env-compile as well:

$ heroku labs:enable user-env-compile
$ heroku config:set DISABLE_COLLECTSTATIC=1

I’ve found that simply setting the config var is enough – there’s no need to also enable user-env-compile; it may be that this feature has passed from labs into production.

NB the deployment is managed by the Heroku python buildpack, which you can see here – https://github.com/heroku/heroku-buildpack-python/


I’ve just done a bunch of tests on this, and can confirm that DISABLE_COLLECTSTATIC does indeed disable collectstatic, irrespective of the user-env-compile setting – I think that’s now in the main trunk (but that’s speculation). It doesn’t seem to matter what the value is – if DISABLE_COLLECTSTATIC exists as a config var, it is used.


I strongly recommend using the collectfast package for any Django static deployment to S3, whether from your local machine or from your Heroku server. It ignores modified dates and uses MD5 hashes, which the S3 API provides very quickly, plus (optional) caching to make your static deployments zoom. It took my static deployments from ~10–15 minutes to under 2 minutes, and it only uploads files that have actually changed.
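As a minimal sketch of what enabling it looks like (setting names per collectfast’s README – check the version you install, since the options have changed over releases):

```python
# settings.py -- minimal collectfast setup (a sketch, not a drop-in config)

INSTALLED_APPS = [
    "collectfast",  # must be listed before django.contrib.staticfiles
    "django.contrib.staticfiles",
    # ... your other apps ...
]

# Compare ETags/MD5 hashes via boto3 instead of modified dates.
STATICFILES_STORAGE = "storages.backends.s3boto3.S3Boto3Storage"
COLLECTFAST_STRATEGY = "collectfast.strategies.boto3.Boto3Strategy"

# Optional: parallelise uploads.
COLLECTFAST_THREADS = 20
```

With this in place, `python manage.py collectstatic` transparently uses the faster comparison.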


I’ve just had that exact same issue and contacted Heroku’s support to find out what is going on. My question to them was

> I’ve run into a funky issue doing some deployments. It appears that on each push, the modified date on all files is updated to the time of the new deploy/git push. Is this intended behaviour?

Since Django’s collectstatic command only checks the modified date on files when evaluating whether a file should be copied across to the final storage backend for static assets, this means that on each new push, all files are first removed from the remote storage (in this case S3) and then re-uploaded. This is a slow and wasteful process, both in bandwidth consumed and in requests made.
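The freshness check being described can be sketched like this (a simplified illustration of the comparison, not Django’s actual source):

```python
from datetime import datetime, timezone
from typing import Optional

def should_copy(source_mtime: datetime, target_mtime: Optional[datetime]) -> bool:
    """Sketch of collectstatic's freshness check: copy when the
    target is missing or older than the source."""
    if target_mtime is None:  # file not yet on the storage backend
        return True
    return source_mtime > target_mtime

# A git push resets every source mtime to the deploy time, so every
# file looks newer than its S3 copy and gets re-uploaded.
deployed = datetime(2024, 1, 2, tzinfo=timezone.utc)
on_s3 = datetime(2024, 1, 1, tzinfo=timezone.utc)
print(should_copy(deployed, on_s3))  # True: the whole tree is re-copied
```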

The answer I received today from “Caio”, one of Heroku’s support staff, was

> Hi, that’s how it currently works, yes. I’m routing your feedback to our runtime team to see if we can package files with their original dates.


As confirmed by Alen, Heroku changes the modified date of the files when it deploys. However, Amazon S3 also exposes an attribute called the ETag, which is an MD5 hash of the file content. It’s possible to use this to check whether the files have changed, instead of the modified date, as implemented in this Django snippet.
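The ETag comparison boils down to this (a sketch – `file_unchanged` is a hypothetical helper, and note that S3’s ETag equals the content MD5 only for plain, non-multipart uploads):

```python
import hashlib

def file_unchanged(local_content: bytes, s3_etag: str) -> bool:
    """Compare a local file's MD5 digest against the S3 ETag.

    Caveat: S3's ETag is the MD5 of the content only for simple
    (non-multipart, unencrypted) uploads.
    """
    local_md5 = hashlib.md5(local_content).hexdigest()
    return local_md5 == s3_etag.strip('"')  # ETags come back quoted

# md5(b"hello") is a well-known digest:
print(file_unchanged(b"hello", '"5d41402abc4b2a76b9719d911017c592"'))    # True
print(file_unchanged(b"changed", '"5d41402abc4b2a76b9719d911017c592"'))  # False
```

A file is skipped when its digest matches the ETag, regardless of what the deploy did to its timestamp.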

I took that code, fixed some errors I found, packaged it, and put it on GitHub as django-s3-collectstatic. It includes a new management command, fasts3collectstatic, that only uploads new files. Check the GitHub page for installation instructions.


Why not run collectstatic from your local machine?

python manage.py collectstatic --noinput --settings=settings.[prod]


I agree this is annoying – there are a couple of things you can do. I override the collectstatic command and wire it up in my production settings. Below is the command I use:


from django.core.management.base import BaseCommand


class Command(BaseCommand):
    help = "Disables the collectstatic command from contrib"

    def handle(self, *args, **options):
        self.stdout.write('collectstatic disabled')


I keep this in mysite/disablecollectstatic/management/commands
Then in production settings:

INSTALLED_APPS += ('mysite.disablecollectstatic',)
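For Django to discover the command – and for it to shadow the built-in one, the file must be named collectstatic.py – the standard management-command package layout with `__init__.py` files is needed:

```text
mysite/
    disablecollectstatic/
        __init__.py
        management/
            __init__.py
            commands/
                __init__.py
                collectstatic.py   # the Command above; the filename
                                   # determines the command name
```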

Alternatively, you could exploit the fact that Heroku does a dry run before actually invoking the command: if the dry run fails, collectstatic won’t run, which means you could contrive an error (for example, by deleting the static root in your settings) – but this approach makes me nervous.


Leave a comment