MemoryError while loading huge initial data
I have initial data from my old database, around 6 GB in size. I could "dumpdata" my old database without any problem, but when I attempted to restore it into the new database, I got a MemoryError:
python manage.py loaddata fixtures/initial_data.json

MemoryError: Problem installing fixture 'fixtures/initial_data.json':
Is there any way to make loaddata work in chunks, or is it otherwise possible to load a file that big?
For a database this large, use your database's native backup tools (e.g. pg_dump for PostgreSQL, mysqldump for MySQL) for dumping data instead of "django dumpdata". Likewise, to load the data, use the matching native restore tools (pg_restore / psql, mysql) instead of "django loaddata".
I wrote this script, a fork of Django's dumpdata, which dumps data in chunks to avoid the MemoryError. You can then load these chunks one by one.
Script is available at https://github.com/fastinetserver/django-dumpdata-chunks
1) Dump data into many files:
mkdir some-folder
./manage.py dumpdata_chunks your-app-name --output-folder=./some-folder --max-records-per-chunk=100000
2) Load data from the folder:
find ./some-folder | egrep -o "([0-9]+_[0-9]+)" | xargs ./manage.py loaddata
PS. I used it to move data from PostgreSQL to MySQL.
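If you already have one huge fixture file produced by plain dumpdata, the same chunking idea can be applied after the fact. Here is a minimal sketch (the `split_fixture` helper and the chunk file names are my own invention, not part of Django or the script above); it assumes the fixture is a single top-level JSON array, which is what `dumpdata --format=json` produces. It stream-parses the array with `json.JSONDecoder.raw_decode` so the whole 6 GB file is never held in memory, and writes out smaller fixtures that loaddata can handle one at a time:

```python
import json
import os

def split_fixture(path, out_dir, records_per_chunk=100000, read_size=1 << 16):
    """Split a big Django JSON fixture (a single top-level array) into
    smaller fixture files, without holding the whole array in memory."""
    os.makedirs(out_dir, exist_ok=True)
    decoder = json.JSONDecoder()
    buf = ""        # unparsed tail of the input read so far
    chunk = []      # records collected for the current output file
    chunk_no = 0

    def flush():
        nonlocal chunk, chunk_no
        if chunk:
            chunk_no += 1
            out = os.path.join(out_dir, "chunk_%05d.json" % chunk_no)
            with open(out, "w") as fh:
                json.dump(chunk, fh)
            chunk = []

    with open(path) as f:
        # Skip ahead to the '[' that opens the top-level array.
        while "[" not in buf:
            data = f.read(read_size)
            if not data:
                raise ValueError("no JSON array found in %s" % path)
            buf += data
        buf = buf[buf.index("[") + 1 :]

        while True:
            buf = buf.lstrip(" \t\r\n,")
            if buf.startswith("]"):
                break  # end of the top-level array
            try:
                # Parse one record off the front of the buffer.
                obj, end = decoder.raw_decode(buf)
            except ValueError:
                data = f.read(read_size)  # partial record: need more input
                if not data:
                    raise
                buf += data
                continue
            buf = buf[end:]
            chunk.append(obj)
            if len(chunk) >= records_per_chunk:
                flush()
    flush()
    return chunk_no
```

Each chunk_NNNNN.json is itself a valid fixture, so the zero-padded names sort correctly and the chunks can be loaded in order with ./manage.py loaddata.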