Many businesses rely on relational databases to manage transactional data. While SQLite offers a serverless, high-speed option, it lacks the robust capabilities that real-world workloads such as web applications need. PostgreSQL adds concurrent access, richer data types, and other advanced features.

This article will guide you through two simple methods to migrate your SQLite data into a PostgreSQL database.

By moving from SQLite to PostgreSQL, you can optimize performance and unlock more powerful data management and querying for your applications. Let’s get started connecting these databases!

Methods for Connecting SQLite to PostgreSQL

  • Method 1: SQLite to PostgreSQL using Django
  • Method 2: SQLite to PostgreSQL using pgloader

Method 1: SQLite to PostgreSQL using Django

The following steps will help you connect SQLite to Postgres using Django's fixture method:

Step 1: Create an SQLite Dumpdata Backup

Run Django's dumpdata management command from your project directory to export the SQLite data as a fixture:

python manage.py dumpdata > whole.json

You can also use a modified version of the above command that includes natural primary and foreign keys:

python manage.py dumpdata --natural-foreign --natural-primary > whole.json

This generates the SQLite data as a JSON fixture. Use the alternate command with natural keys only if you run into errors when restoring (loaddata) the data into PostgreSQL.
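If you want to sanity-check the dump before moving on, a short optional Python snippet can confirm the fixture's structure (it assumes whole.json was written to the project root):

# Optional: inspect the fixture produced by dumpdata.
import json

with open("whole.json") as f:
    objects = json.load(f)

# Each entry is a serialized object with "model", "pk", and "fields" keys.
print(len(objects), "objects dumped")
print({obj["model"] for obj in objects})  # distinct app_label.model names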

Step 2: Generate a Postgres DB and User

Next, install PostgreSQL on your system. Once it is in place, switch to the postgres user and open a psql session:

sudo su - postgres
psql

Next, create a PostgreSQL database and a new user. Give the user a password and grant it the required privileges on the database:

create user john;
create database new_db;
alter role john with password 'new_db@123';
grant all privileges on database new_db to john;
alter database new_db owner to john;

Now, to connect SQLite to PostgreSQL, you need to configure your project's settings.py file.


Step 3: Configure Settings.py

First, install the PostgreSQL adapter for Python:

pip install psycopg2

Then update the DATABASES setting in settings.py so that Django points at the new database (5432 is PostgreSQL's default port):

DATABASES = {
    'default': {
        'ENGINE': 'django.db.backends.postgresql_psycopg2',
        'NAME': 'new_db',
        'USER': 'john',
        'PASSWORD': 'new_db@123',
        'HOST': 'localhost',
        'PORT': '5432',
    }
}

Next, delete all existing migration files and create new migrations for each application. The following commands delete the old migration files:

find . -path "*/migrations/*.py" -not -name "__init__.py" -delete
find . -path "*/migrations/*.pyc" -delete

Then create and apply fresh migrations for each app:

python manage.py makemigrations
python manage.py migrate

Also, delete the automatically created content types (a mandatory step) to avoid errors during the import:

python manage.py shell
from django.contrib.contenttypes.models import ContentType
ContentType.objects.all().delete()

At this point you have an empty PostgreSQL database with the correct schema. Now you can load the fixture data into it.

Step 4: Import Required Fixture via Loaddata from SQLite to PostgreSQL

The final step in connecting SQLite to PostgreSQL is to temporarily disable any signals (for example, post_save handlers) registered in your project. This keeps signal handlers from firing during the import and tripping unique-constraint errors on rows that the fixture itself is about to create.
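Here is a minimal sketch of disconnecting a signal handler and running the import in the same process; the app, model, and handler names (myapp, Profile, create_profile) are hypothetical, so substitute the signals your project actually registers:

# Run inside `python manage.py shell` so the disconnect applies to the same
# process that performs the import. Names below are hypothetical.
from django.core.management import call_command
from django.db.models.signals import post_save
from myapp.models import Profile          # hypothetical model
from myapp.signals import create_profile  # hypothetical handler

post_save.disconnect(create_profile, sender=Profile)
call_command("loaddata", "fixture/whole.json")
post_save.connect(create_profile, sender=Profile)

If your project has no interfering signals, you can skip the shell session and load the fixture directly (adjust the path to wherever whole.json was saved):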

python manage.py loaddata fixture/whole.json

With this step, you have moved data successfully from SQLite to PostgreSQL via fixtures.

Our comprehensive guide to PostgreSQL analytics can be a great resource if you’re keen on working with PostgreSQL.

Method 2: SQLite to PostgreSQL using pgloader

You can also use pgloader commands to move data from an SQLite file. pgloader supports automatic schema discovery, including building the indexes.

Using default settings

Here is the simplest command line example, which might be all you need:

$ pgloader sqlite:///path/to/file.db pgsql://pguser@pghost/dbname

Using advanced options and a load command file

The command then would be:

$ pgloader db.load

Here is an example of the db.load file's contents:

load database
    from sqlite:///Users/dim/Downloads/lastfm_tags.db
    into postgresql:///tags

with include drop, create tables, create indexes, reset sequences

set work_mem to '16MB', maintenance_work_mem to '512 MB';

Explore more about MySQL to PostgreSQL migration.

Best Practices to Migrate from SQLite to Postgres

Now that you know the steps required to connect SQLite to PostgreSQL, keep the following practices in mind while implementing the methods above:

  • If you use groups and permissions, create the data dump without excluding content types and permissions. Groups rely heavily on permissions, so excluding them leads to errors when you run the loaddata command.
  • Use SQLite for development only and perform the SQLite to PostgreSQL transfer early in a project. The fixture method loads the entire dump into RAM, so if your SQLite database holds more than about 100 MB of data, the method may falter.
  • You may need to increase the max_length of CharFields in your models. The dumpdata command sometimes adds stray whitespace to CharField values, which triggers errors like "value too long for type character varying(30)" when you run loaddata; widening the field avoids this (see the sketch after this list).
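
As a rough illustration of the last point, widening the field looks like this; the model and field names are hypothetical:

# models.py (sketch): widen max_length so values padded with stray whitespace
# by dumpdata still fit when loaddata inserts them into PostgreSQL.
from django.db import models

class Customer(models.Model):                # hypothetical model
    name = models.CharField(max_length=35)   # was max_length=30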

Conclusion

  1. This article introduced you to SQLite and PostgreSQL and explained the four easy steps you can follow to connect SQLite to PostgreSQL via Django fixtures, along with pgloader as an alternative.
  2. It also covered the best practices to keep in mind while implementing these methods on your own.
  3. PostgreSQL is a great tool for storing your business data. However, at times you need to transfer data from multiple sources into your PostgreSQL instance for further analysis.
  4. For instance, besides migrating data from SQLite, businesses also migrate their data from MySQL to PostgreSQL. Building an in-house solution for this process can be an expensive and time-consuming task.

FAQ on SQLite to PostgreSQL

How to transfer data from SQLite to Postgres?

When transferring data from SQLite to PostgreSQL, the first step is to export the database to a SQL dump file. Then, use PostgreSQL's psql command-line tool or pgAdmin to import the SQL dump file into PostgreSQL. Finally, adjust the SQL syntax to handle differences in data types and schema definitions between the two databases.
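As a rough sketch (file and database names are placeholders), the export and import steps look like this; in practice the dumped SQL usually needs manual edits before PostgreSQL will accept it:

sqlite3 app.db .dump > dump.sql        # export the SQLite database to a SQL script
psql -U john -d new_db -f dump.sql     # import the (edited) script into PostgreSQL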

Is SQLite better than PostgreSQL?

SQLite is best suited for small databases on a single machine, but its single-file design and one-writer-at-a-time model limit its scalability and concurrency. In contrast, PostgreSQL is designed for large-scale, concurrent applications.

How to change SQLite to PostgreSQL in Airflow?

To change from SQLite to PostgreSQL in Apache Airflow, update the sql_alchemy_conn setting in the Airflow configuration file with your PostgreSQL connection details and make sure the PostgreSQL Python driver (psycopg2) is installed. Finally, initialize the Airflow metadata database by running airflow initdb (airflow db init in Airflow 2.x).
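For reference, the relevant line in airflow.cfg looks roughly like this; the credentials and database name are placeholders, and in older Airflow versions the setting lives under [core] rather than [database]:

[database]
sql_alchemy_conn = postgresql+psycopg2://airflow_user:airflow_pass@localhost:5432/airflow_db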

What is the best way to transfer the data in PostgreSQL?

For moving data between PostgreSQL databases, the standard tools are pg_dump and pg_restore: dump the source database to an archive, then restore it into the target.
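A minimal sketch, assuming both databases are reachable from the local machine and the names are placeholders:

pg_dump -Fc -d source_db -f source_db.dump    # dump the source database in custom format
pg_restore -d target_db source_db.dump        # restore it into the target database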

How to extract data from SQLite database?

To extract data from an SQLite database, you can use the SQLite command-line shell, an SQLite GUI tool, or any programming language with an SQLite driver.
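For example, here is a minimal Python sketch using the standard-library sqlite3 module; the file name and table name are placeholders:

import sqlite3

# Connect to the SQLite file, list its tables, then read a few rows from one of them.
conn = sqlite3.connect("app.db")
tables = [row[0] for row in conn.execute(
    "SELECT name FROM sqlite_master WHERE type='table'")]
print(tables)

for row in conn.execute("SELECT * FROM some_table LIMIT 5"):  # hypothetical table
    print(row)

conn.close()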

Want to take Hevo for a spin? Sign up for a 14-day free trial and experience the feature-rich Hevo suite firsthand. Also check out our unbeatable pricing to choose the best plan for your organization.

Share your views on connecting SQLite to PostgreSQL in the comments section!

Abhinav Chola
Research Analyst, Hevo Data

Abhinav Chola, a data science enthusiast, is dedicated to empowering data practitioners. After completing his Master’s degree in Computer Science from NITJ, he joined Hevo as a Research Analyst and works towards solving real-world challenges in data integration and infrastructure. His research skills and ability to explain complex technical concepts allow him to analyze complex data sets, identify trends, and translate his insights into clear and engaging articles.
