Strategies for speeding up batch ORM operations in Django

One of my API calls can result in updates to a large number of objects (Django models). I'm running into performance issues with this since I'm updating each item individually, saving, and moving on to the next:

for item in Something.objects.filter(x='y'):
    item.a = 'something'
    item.save()

Sometimes my filter criterion looks like "where x in ('a','b','c',...)".

It seems the official answer to this is "won't fix". I'm wondering what strategies people are using to improve performance in these scenarios.


The ticket you linked to is about bulk creation. If you're not relying on an overridden save() method or pre/post-save signals to do work on save, QuerySet has an update() method which you can use to perform an UPDATE on all the filtered rows in a single query:

Something.objects.filter(x__in=['a', 'b', 'c']).update(a='something')
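At the SQL level the difference is one UPDATE statement versus one query per row. A minimal sqlite3 sketch of what that single statement looks like (the table and column names here are made up to mirror the example above, not taken from any real schema):

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE something (id INTEGER PRIMARY KEY, x TEXT, a TEXT)")
conn.executemany(
    "INSERT INTO something (x, a) VALUES (?, ?)",
    [("a", ""), ("b", ""), ("z", "")],
)

# Rough equivalent of Something.objects.filter(x__in=['a', 'b', 'c']).update(a='something'):
# a single UPDATE touching all matching rows, instead of one round-trip per row.
cur = conn.execute("UPDATE something SET a = 'something' WHERE x IN ('a', 'b', 'c')")
print(cur.rowcount)  # prints 2 (the row with x='z' is untouched)
```

Note that, as the answer says, this bypasses save() and signals entirely, which is exactly why it can be issued as one statement.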

You need to use transactions or build the SQL statement by hand. You could also try SQLAlchemy, which supports some great ORM features such as the Unit of Work pattern (an application-level transaction).

Django transactions:
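If you do need a per-row save() (for example, because custom save logic must run), wrapping the whole loop in one transaction avoids a commit per row; in Django that would be transaction.atomic(). A rough sketch of the same idea at the sqlite3 level (table and column names are invented for illustration; the Django form in the comment is an assumed shape, not tested here):

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE something (id INTEGER PRIMARY KEY, x TEXT, a TEXT)")
conn.executemany(
    "INSERT INTO something (x, a) VALUES (?, ?)",
    [("y", ""), ("y", ""), ("n", "")],
)

# Assumed Django equivalent:
#     from django.db import transaction
#     with transaction.atomic():
#         for item in Something.objects.filter(x='y'):
#             item.a = 'something'
#             item.save()

# One commit for the whole batch instead of one per row: the sqlite3
# connection used as a context manager commits once on successful exit.
rows = conn.execute("SELECT id FROM something WHERE x = 'y'").fetchall()
with conn:
    for (row_id,) in rows:
        conn.execute("UPDATE something SET a = 'something' WHERE id = ?", (row_id,))

print(conn.execute(
    "SELECT COUNT(*) FROM something WHERE a = 'something'").fetchone()[0])  # prints 2
```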

