Live (long-polling) connection to Django, via Nginx (or Apache) – reducing the number of queries


OK, look: it was a little hard to follow your response (so if I'm missing something, you may want to clarify your wording), but I think I get the gist of what you're doing.

If you're always dealing with the last 10 messages (really, the last N messages, where N = 10 in this case), you might find Redis to be a good fit. I set up long-polling to provide the status of a queue worker to an interface widget, via WebSockets — my app was a Django app (running on Gunicorn, fronted by nginx), but I added a small Tornado class for this worker thread and its WebSocket status interface with no problem.

The status function had to compare the last N values; since I was using Redis data structures for the queue, it was easier to do the comparison in the client-side WebSocket consumer. I had that luxury because the behavior that hinged on the comparison was only a UI state, not an update to data or app logic, which I assume is closer to what you're worried about.

What I'm getting at is that the model structure you provided will work for the task, but it isn't particularly well-suited. If Redis and Tornado sound too exotic to you, consider adding fields to group your records into per-session sets… maybe with indices derived from your session identifiers or some such.
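To sketch the per-session grouping without assuming anything about your models, here's a framework-agnostic version of the idea: each record carries a session key, and a per-session bucket keeps only the last N. In Django terms this would roughly be a `session_key = models.CharField(db_index=True)` on your message model plus an `ORDER BY`/`LIMIT` query; the names `Message` and `SessionIndex` below are hypothetical.

```python
from collections import defaultdict
from dataclasses import dataclass

@dataclass
class Message:
    session_key: str   # hypothetical grouping field, per-session
    body: str

class SessionIndex:
    """Groups messages into per-session sets, capped at the last N each."""
    def __init__(self, n=10):
        self.n = n
        self._by_session = defaultdict(list)

    def add(self, msg):
        bucket = self._by_session[msg.session_key]
        bucket.append(msg)
        del bucket[:-self.n]   # keep only the newest N for this session

    def last_n(self, session_key):
        return list(self._by_session[session_key])
```

The point is only that the "last 10" lookup becomes a cheap keyed read instead of a scan over all records, whichever store actually backs it.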

Let me know if that is helpful.
