MemoryError while loading huge initial data

I have the initial data from my old database, which takes around 6 GB. I could "dumpdata" my old database without any problem, but when I attempted to restore it to the new database, I got a MemoryError:

    python manage.py loaddata fixtures/initial_data.json
    MemoryError: Problem installing fixture 'fixtures/initial_data.json': 

Is there any way to make loaddata work with chunks, or is it otherwise possible to load a file that big?

Answers


For a large database, use your database's native backup tools to dump the data instead of "django dumpdata", and use its native restore tools instead of "django loaddata".
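
For example, with PostgreSQL this could look like the following (a minimal sketch; the database names mydb and mydb_new are placeholders for your own):

    # Dump in PostgreSQL's custom format (compressed, restorable with pg_restore)
    pg_dump -Fc mydb > mydb.dump

    # Restore into the new database, skipping ownership statements
    pg_restore --no-owner -d mydb_new mydb.dump

The custom format streams rows rather than materializing everything in memory, which is exactly what loaddata cannot do with one giant JSON fixture.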


I wrote this script, a fork of Django's dumpdata, which dumps the data in chunks to avoid the MemoryError. You can then load these chunks one by one.

The script is available at https://github.com/fastinetserver/django-dumpdata-chunks

Example usage:

1) Dump data into many files:

    mkdir some-folder

    ./manage.py dumpdata_chunks your-app-name --output-folder=./some-folder --max-records-per-chunk=100000

2) Load data from the folder:

    find ./some-folder | egrep -o "([0-9]+_[0-9]+)" | xargs ./manage.py loaddata
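
If the load order matters (for example, because of foreign-key dependencies), you may want to sort the chunk names first. This is a sketch that assumes the numeric filename prefixes are zero-padded, so lexicographic order matches dump order:

    # Sort chunk names before handing them to loaddata
    find ./some-folder | egrep -o "([0-9]+_[0-9]+)" | sort | xargs ./manage.py loaddata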

P.S. I used it to move data from PostgreSQL to MySQL.

