Unlikely. A Django application is almost always I/O-bound, usually because of the database connection. PyPy wouldn't help with that at all, even if it were fully compatible (which I'm not sure it is).
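A quick way to check the I/O-bound claim for your own views is to time the database round-trips separately from the Python-level work. This is a minimal sketch using only the stdlib; `run_query` and `render_page` are hypothetical stand-ins (with simulated latency) for your real ORM calls and view logic.

```python
import time

def run_query():
    # Stand-in for a database round-trip (network + disk latency).
    time.sleep(0.05)
    return [{"id": 1, "title": "hello"}]

def render_page(rows):
    # Stand-in for the CPU-side work the view does with the rows.
    return "".join(f"<li>{row['title']}</li>" for row in rows) * 100

start = time.perf_counter()
rows = run_query()
io_time = time.perf_counter() - start

start = time.perf_counter()
html = render_page(rows)
cpu_time = time.perf_counter() - start

# If io_time dominates, a faster interpreter (PyPy) cannot help much.
print(f"I/O: {io_time:.3f}s  CPU: {cpu_time:.3f}s")
```

If the I/O share dominates in a real view, a JIT buys you little; only the CPU share shrinks.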
PyPy does improve performance on all benchmarks in PyPy's benchmark suite. For Django that is only template rendering so far, but no one has submitted anything else. It's safe to assume, however, that performance-critical code will be faster (especially after some tuning).
Compatibility-wise, databases are a bit of an issue: only SQLite works, and it's slow (there is a branch to fix that, though). People have also reported pg8000 working with SQLAlchemy, for example, but I have no first-hand experience with it.
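For reference, the pg8000 setup mentioned above is plausible because pg8000 is a pure-Python PostgreSQL driver, so it imports on PyPy where C extensions like psycopg2 may not build. A hedged sketch of wiring it up through SQLAlchemy (the host, user, password, and database name are placeholders, not values from the original thread):

```python
from sqlalchemy import create_engine, text

# "postgresql+pg8000://" selects SQLAlchemy's pg8000 dialect.
# create_engine() is lazy: no connection is made until first use.
engine = create_engine("postgresql+pg8000://user:password@localhost/mydb")

with engine.connect() as conn:
    result = conn.execute(text("SELECT 1"))
    print(result.scalar())
```

Django's ORM would need a pg8000-aware database backend instead; the SQLAlchemy route is simply the combination people reported working.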
I have done some experimentation with PyPy + Django. There are two main issues:
Most database adapters and other third-party modules cannot be compiled under PyPy (even when the wiki says they can).
One server that I thought might benefit from JIT compilation (it did a fancy calculation in some requests) instead showed a steadily increasing memory footprint. My guess is that the JIT was storing traces that turned out to be unique to each request, so they were never reused.
Theoretically, PyPy might be a win if your server does interesting calculations, uses pure-Python modules, and keeps large numbers of objects in memory (PyPy can reduce the memory used per object in some circumstances). Otherwise the JIT's higher memory requirements are an impediment: they reduce opportunities for in-memory caching and may force you to run extra servers to host enough worker processes.
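To put a rough number on the "many objects in memory" case, you can estimate per-object cost with the stdlib `tracemalloc` module. This sketch is interpreter-agnostic; the `Point` class is a made-up example, and the figure it prints on CPython is the baseline that PyPy's compact instance storage can sometimes beat.

```python
import tracemalloc

class Point:
    """Toy object standing in for whatever your server keeps in memory."""
    def __init__(self, x, y):
        self.x = x
        self.y = y

N = 100_000
tracemalloc.start()
points = [Point(i, i + 1) for i in range(N)]
current, peak = tracemalloc.get_traced_memory()
tracemalloc.stop()

# Total traced allocations divided by object count gives a rough
# per-object figure (it includes the list and shared overhead).
print(f"~{current / N:.0f} bytes per object")
```

Multiply that per-object figure by your live object count and compare it against the extra memory the JIT itself needs before concluding PyPy will save RAM.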