Django Models: Keep track of activity through related models?

I have something of a master table of Persons. Everything in my Django app relates to one or more People, either directly or through long FK chains. All my models also have the standard bookkeeping fields 'created_at' and 'updated_at'. I want to add a field to my Person table called 'last_active_at', mostly for raw SQL ordering purposes.

Creating or editing certain related models produces new timestamps for those objects. I need to somehow update Person.last_active_at with those values. Functionally, this isn't too hard to accomplish, but I'm concerned about putting undue stress on the app.

My two greatest causes for concern are that I'm restricted to a real db field--I can't assign a function to the Person table as a @property--and that one of these 'activity' models receives and processes new instances from a foreign data source I have no control over, sporadically delivering a lot of data at once.

My first thought was to add a post_save hook to the 'activity' models. Still seems like my best option, but I know nothing about them, how hard they hit the db, etc.

My second thought was to write some sort of script that goes through the day's activity and updates those models overnight. My employers want a 'live'-er stream, though.

My third thought was to modify the post_save logic to check whether the activity's 'updated_at' is within half an hour of the Person's 'last_active_at', and skip updating the Person if so.

Are my thoughts tending in a scalable direction? Are there other approaches I should pursue?


It is often said that premature optimization is the root of all evil. You should start with the dumbest implementation (update it every time), then measure and, if needed, replace it with something more efficient.

First of all, let's put a method on Person that updates the last_active_at field. That way, all the updating logic is concentrated in one place, and we can easily modify it later.

Signals are quite easy to use: you declare a function and register it as a receiver, and it will be run each time the signal is emitted. See the documentation for the full explanation, but here is what it might look like:

from django.db.models.signals import post_save
from django.dispatch import receiver

@receiver(post_save, sender=RelatedModel)
def my_handler(sender, instance, **kwargs):
    # sender is the model class; instance is the object that was saved
    person = instance.person  # however you navigate from this model to its Person
    person.update_activity()
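If the mechanics feel opaque: under the hood, post_save is essentially an observer pattern. Here is a stripped-down, framework-free sketch of the idea (illustrative only, not Django's actual dispatcher):

```python
class Signal:
    """Toy version of a signal: a list of callbacks invoked on send()."""

    def __init__(self):
        self._receivers = []

    def connect(self, func):
        # Register a callback; returning it lets this double as a decorator.
        self._receivers.append(func)
        return func

    def send(self, sender, **kwargs):
        # Call every registered receiver with the same arguments.
        for func in self._receivers:
            func(sender, **kwargs)


post_save = Signal()

@post_save.connect
def my_handler(sender, instance=None, **kwargs):
    print(f"{sender.__name__} saved")

# Django calls send() for you whenever a model instance is saved;
# here we trigger it by hand to show the flow.
post_save.send(sender=str, instance="example")
```

Django's real dispatcher adds sender filtering, weak references, and more, but the control flow your receiver sees is the same.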

As for the updating itself, start with the dumbest way to do it.

from django.utils.timezone import now

def update_activity(self):
    self.last_active_at = now()
    self.save(update_fields=['last_active_at'])  # without save(), nothing reaches the db

Then measure and decide whether it's a problem. If it is, some of the things you can do are:

  • Check whether the previous update is recent before updating again. This might be useless if a read from your database is not faster than a write; it's not a problem if you use a cache.
  • Write it down somewhere for a deferred process to apply later. It doesn't need to be daily: if the problem is that you have 100 updates per second, you can have a script update the database every 10 seconds, or every minute. You can probably find a good performance/freshness trade-off using this technique.

These are just some thoughts based on what you proposed; the right choice depends on the kind of figures you have. Determine what load you'll have and what reaction time is acceptable for that field, then experiment.
