This defines the dump2py management command.
Write a dump of your database to a set of Python modules. Such a dump is useful as a daily backup, or as a snapshot to take before an upgrade that involves a data migration.
Usage: cd to your project directory and say:
$ python manage.py dump2py TARGET
This will create a Python dump of your database in the directory TARGET.
Do not prompt for user input of any kind.
Tolerate database errors. This can help when making a partial snapshot of a database that is not (fully) in sync with the application code.
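One way such tolerance might work — an assumption for illustration, not the actual dump2py implementation — is to catch database errors per table and continue with the remaining tables:

```python
import logging

def dump_all(conn, tables, target, dump_one):
    """Dump each table via dump_one(); log and skip tables that raise.

    Hypothetical sketch: `dump_one` stands in for whatever writes a
    single table, and any exception (e.g. an OperationalError caused
    by a column missing from the out-of-sync schema) is tolerated.
    """
    dumped, skipped = [], []
    for table in tables:
        try:
            dump_one(conn, table, target)
            dumped.append(table)
        except Exception as exc:
            logging.warning("Skipping table %s: %s", table, exc)
            skipped.append(table)
    return dumped, skipped
```

The result is a partial snapshot: every table that could be read is dumped, and the problematic ones are merely logged.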
Don't complain if the TARGET directory already exists. This will potentially overwrite existing files.
Change the maximum number of rows per source file from its default value (50000) to NUM.
When a table contains many rows, the resulting .py file can become so large that it doesn't fit into memory, causing the Python process to get killed when it tries to restore the data. To avoid this limitation, dump2py distributes the content over several files when a table contains more than NUM rows.
The default value has been "clinically tested" and should be small enough for most machines.
Hint: when your process gets killed, consider restarting the web services on your server and trying again before using this option. On a long-running production site, the web services can occupy considerable amounts of memory. A simple reload_services.sh can fix your issue.