Migration from PostgreSQL to SQLite

SQLite and PostgreSQL are both popular database systems that can be used in Django projects, although they are intended for different use cases.

SQLite is a serverless, self-contained SQL database that is mainly suited to smaller projects, prototyping, and applications used by a single person or a small team. It is the default database for Django projects, and support for it ships with Python's standard library, which makes it very easy to set up. SQLite stores its data in a single file and does not require a separate database server or service, making it ideal for development environments or small applications. However, simultaneous access by many users can degrade performance, and some advanced features available in other database systems are missing.

PostgreSQL, on the other hand, is a powerful, open-source, object-relational database system. It is well suited to production use and can handle large amounts of data and many simultaneous users. PostgreSQL offers a variety of advanced features such as complex queries, transaction control, Multi-Version Concurrency Control (MVCC), and JSON support, which makes it very flexible and powerful for more complex applications. Django has excellent support for PostgreSQL, and some Django-specific fields such as ArrayField and HStoreField require it (JSONField did too before Django 3.1). However, setting up PostgreSQL is somewhat more involved than SQLite, especially when it comes to installing and configuring the server.

Development workflow in Django

Our development workflow is basically as follows:

  1. Local development environment, with SQLite as the database. You can build and program here and, if necessary, even store entire databases in Git so that other developers can work with the same data.
  2. Test environment, with a PostgreSQL server. We want to be sure that everything really works, because functions and data types do not behave identically in all cases.
  3. Production environment. This is where the real data is created and the actual work is done.
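This split can be sketched in a Django settings.py. The engine paths are Django's real database backends; the database name, user, and the DJANGO_ENV environment variable are placeholder assumptions, not taken from the article:

```python
import os

# Placeholder convention: DJANGO_ENV selects the environment; unset means development.
if os.environ.get('DJANGO_ENV', 'development') == 'development':
    # Local development: SQLite, a single file next to the project
    DATABASES = {
        'default': {
            'ENGINE': 'django.db.backends.sqlite3',
            'NAME': 'db.sqlite3',
        }
    }
else:
    # Test/production: PostgreSQL server
    DATABASES = {
        'default': {
            'ENGINE': 'django.db.backends.postgresql',
            'NAME': 'travelmanager',  # placeholder database name
            'USER': 'django',         # placeholder user
            'PASSWORD': os.environ.get('DB_PASSWORD', ''),
            'HOST': 'localhost',
            'PORT': '5432',
        }
    }
```

Because both environments share the same Django models, switching the backend is just a matter of configuration.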

Transfer production data to the development environment

There are always curious situations in which the software does not do what it should. In most cases the cause is clearly visible; in some cases it is not, and finding it takes considerable debugging effort. Debugging in the production environment is not pleasant, so it helps to reproduce the problem locally with the production data.

The following steps will help you migrate data from PostgreSQL to SQLite:

  1. Create a backup of the production database with pgAdmin. Use the following options:
    • Format: Plain
    • Options: “Use Column Inserts”
  2. Use the following Python script to clean up the resulting SQL script:
import re

input_file = 'data.sql'
output_file = 'fixed-data.sql'

# Regular expressions for the statements that should be found and removed
regex_expressions = [
    r'SET [a-z_ =0-9\']*;',
    r'SELECT pg_catalog.set_config.*;',
    r'CREATE SCHEMA public;',
    r'ALTER SCHEMA public OWNER TO postgres;',
    r'COMMENT ON SCHEMA public IS \'standard public schema\';',
    r'CREATE INDEX[ a-z_0-9()]*;',
    r'SET client_encoding = \'UTF8\';',
    r'ALTER TABLE [a-z_]* OWNER TO postgres;',
    r'ALTER TABLE[\n\r \w()\.,]*;',
    r'CREATE INDEX[ \w()]*;',
    r"SELECT[ \w\.',()]*\);",
    r'--[\w ():;\.-]*',  # remove comments
]

# Load the contents of the input file
with open(input_file, 'r') as input_f:
    input_content = input_f.read()

# Search the contents for the regular expressions and remove all matches
for pattern in regex_expressions:
    input_content = re.sub(pattern, '', input_content, flags=re.MULTILINE)

# Write the cleaned contents to the output file
with open(output_file, 'w') as output_f:
    output_f.write(input_content)

# Notify that the process has finished
print('Searching and removing the lines is complete.')

Save the script (the file name does not matter, e.g. fix-data.py) and run it:

python fix-data.py

This removes all statements that SQLite does not understand. The cleaned file can now be conveniently imported with a tool or via the command line:
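To see what the cleanup does, here is a small self-contained demonstration on a fabricated dump snippet (the table and values are invented for illustration), using a subset of the patterns from the script above:

```python
import re

# A subset of the cleanup patterns from the script above
patterns = [
    r'SET [a-z_ =0-9\']*;',
    r'SELECT pg_catalog.set_config.*;',
    r'ALTER TABLE[\n\r \w()\.,]*;',
    r'--[\w ():;\.-]*',  # remove comments
]

# Fabricated excerpt of a plain-format dump with column inserts
sample = """--
-- PostgreSQL database dump
--
SET statement_timeout = 0;
INSERT INTO public.travel (id, name) VALUES (1, 'Rome');
ALTER TABLE ONLY public.travel
    ADD CONSTRAINT travel_pkey PRIMARY KEY (id);
"""

cleaned = sample
for pattern in patterns:
    cleaned = re.sub(pattern, '', cleaned, flags=re.MULTILINE)

# Only the INSERT statement, which SQLite understands, survives
print(cleaned.strip())  # INSERT INTO public.travel (id, name) VALUES (1, 'Rome');
```

The comments, the SET statement, and the multi-line ALTER TABLE constraint are stripped, while the actual data (the INSERT) is kept.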

"C:\Program Files (x86)\sqlite-tools\sqlite3.exe" travelmanager.db < fixed-data.sql