One of the primary jobs of data practitioners and database administrators is importing CSV data into PostgreSQL tables. PostgreSQL is an open-source relational database and an ideal platform for developers to build applications on.

This article delves into five distinct methods for importing a CSV file into PostgreSQL, a pivotal process for improving data readability and enhancing data analysis.

Four of these methods use PostgreSQL's own tooling to import CSV files: the COPY command, pgAdmin, the \copy meta-command in psql, and Cloud SQL for PostgreSQL. The fifth uses Hevo Data’s automated data pipeline tool to replicate your data into your PostgreSQL destination seamlessly.

What is PostgreSQL?

PostgreSQL is an Open-source Relational Database Management System (RDBMS) that fully supports both SQL (Relational) and JSON (Non-relational) querying.

PostgreSQL is an integral part of the modern LAPP Stack that consists of Linux, Apache, PostgreSQL, and PHP (or Python and Perl). It acts as a robust back-end database that powers many dynamic websites and web applications. It also supports a wide range of popular programming languages such as Python, Java, C#, C, C++, Ruby, JavaScript (Node.js), Perl, Go, and Tcl.

Key Features of PostgreSQL

PostgreSQL has become one of the most sought-after Database Management Systems due to the following eye-catching features:

  • Enhanced Data Integrity: PostgreSQL assures data integrity via primary keys, foreign keys, explicit locks, advisory locks, and exclusion constraints.
  • Multiple Data Types: PostgreSQL allows you to work with a broad range of datasets. It is compatible with several data types such as INTEGER, NUMERIC, BOOLEAN, CHAR, VARCHAR, DATE, INTERVAL, and TIMESTAMP.
  • Data Security: PostgreSQL offers several layers of authentication and data protection. It supports authentication methods such as Lightweight Directory Access Protocol (LDAP), Generic Security Service Application Program Interface (GSSAPI), SCRAM-SHA-256, and Security Support Provider Interface (SSPI), and provides a robust access control system along with column- and row-level security.
  • Reliability: PostgreSQL provides a reliable environment with special features such as multi-version concurrency control (MVCC), point-in-time recovery, tablespaces, asynchronous replication, online/hot backups, and write-ahead logging.

Why Import a CSV File into PostgreSQL?

Importing CSV files into a PostgreSQL database allows for seamless data integration from various sources, making the data more readily available and accessible for analysis, reporting, and other data-driven applications. PostgreSQL’s advanced features, such as data integrity constraints, transactions, and referential integrity, help ensure the consistency and reliability of the imported data.

Additionally, the database’s scalability, performance, backup and recovery mechanisms, and robust security features make it a superior choice over managing large CSV files directly.

You can also explore how to migrate Postgres to MySQL.

How to Import CSV to PostgreSQL?

Before you move forward with performing the PostgreSQL import CSV job, you need to have the following two things in place:

  • A CSV file containing data that needs to be imported into PostgreSQL.
  • A table in PostgreSQL with a well-defined structure to store the CSV file data.

In this article, the following CSV file containing the data given below is used as the working example:

Employee ID,First Name,Last Name,Date of Birth,City
1,Max,Smith,2002-02-03,Sydney
2,Karl,Summers,2004-04-10,Brisbane
3,Sam,Wilde,2005-02-06,Perth

You can create a table “employees” in PostgreSQL by executing the following command:

CREATE TABLE employees(
  emp_id SERIAL,
  first_name VARCHAR(50),
  last_name VARCHAR(50),
  dob DATE,
  city VARCHAR(40),
  PRIMARY KEY(emp_id)
);
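
If you are working in psql, you can optionally confirm the table structure before loading any data. The \d meta-command is standard psql and lists the columns, types, and the primary key you just defined:

\d employees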

After creating the sample CSV file and table, you can now easily import CSV to PostgreSQL via any of the following methods:

Method 1: Using the COPY Command

To successfully use the COPY command for executing the PostgreSQL import CSV task, ensure that you have PostgreSQL Superuser Access. 

  • Step 1: Run the following command to perform the PostgreSQL import CSV job:
COPY employees(emp_id,first_name,last_name,dob,city)
FROM 'C:\newdb\employees.csv'
DELIMITER ','
CSV HEADER;

Output:

COPY 3

On successful execution of the PostgreSQL import CSV job, the output “COPY 3” is displayed, meaning that 3 records have been added to your PostgreSQL table.

The above COPY command has the following essential aspects:

  • employees(emp_id, first_name, last_name, dob, city): “employees” is the name of the table where you want to import the data. Specifying the column names is optional if the order of the columns is maintained.
  • 'C:\newdb\employees.csv': This is the location of the CSV file stored on your system. You can change it to match the path on your machine.
  • DELIMITER: This is the character that separates the values in each row of the CSV file. In the above example, the delimiter is a comma, i.e. ','. At times, values are separated by other characters such as '|' or tabs (\t). For a tab delimiter, you can use DELIMITER E'\t', where the E prefix allows the tab escape sequence to be recognized (a tab-delimited variant is sketched after the output below).
  • CSV: This option specifies that data is imported from a CSV file.
  • HEADER: This option lets PostgreSQL know that the first line of the CSV file contains the column headers (Employee ID, First Name, Last Name, Date of Birth, City), so the data is imported from the second row onwards.
  • Step 2: You can print out the contents of the table to check if the data is entered correctly.
SELECT * FROM employees;

Output:

 emp_id | first_name | last_name |    dob     |   city
--------+------------+-----------+------------+----------
      1 | Max        | Smith     | 2002-02-03 | Sydney
      2 | Karl       | Summers   | 2004-04-10 | Brisbane
      3 | Sam        | Wilde     | 2005-02-06 | Perth
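
For reference, a tab-delimited variant of the same import might look like the sketch below. The file path employees.tsv is hypothetical; E'\t' is the escaped tab delimiter described above.

COPY employees(emp_id,first_name,last_name,dob,city)
FROM 'C:\newdb\employees.tsv'
DELIMITER E'\t'
CSV HEADER;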

Also Read: How to Export a PostgreSQL Table to a CSV File

Advantages and Disadvantages of the COPY command Method

Advantages

  • It is one of the quickest and most efficient methods for importing large amounts of data from CSV files into PostgreSQL tables.
  • The COPY command is easy to script, which makes it straightforward to automate the entire import process.

Disadvantages

  • Importing CSV files might require some knowledge of SQL and the PostgreSQL command line.
  • Data transformation and validation options are limited in this method.

Method 2: Using pgAdmin

pgAdmin is an open-source tool for effortlessly managing your PostgreSQL Database. You can easily download it from the official pgAdmin website. You can also perform the PostgreSQL import CSV task via pgAdmin by following the simple steps given below:

  • Step 1: You can create a table directly from the pgAdmin GUI (Graphical User Interface). Open pgAdmin and right-click on the Tables option in the Schemas section of the left-side menu.
  • Step 2: Hover over the Create option and click on the “Table…” option to open a wizard for creating a new table.
Create New Table
  • Step 3: You can now enter the table-specific details such as Table Name, Column Names, etc. Once done, you can click on the Save button to create a new table.
  • Step 4: Now, to perform the Postgres import CSV job, go to the “Schemas” section in the left-side menu and click on the Tables option.
  • Step 5: Navigate to the “employees” table and right-click on it. Click on the Import/Export option to open the Wizard.
Import/Export Data
  • Step 6: Set the Import/Export switch to Import. Specify the filename as “employees.csv” and the file format as CSV. Toggle on the Header option and set “,” as the delimiter for your CSV file. Then switch from the Options tab to the Columns tab to select the columns and the order in which they should be imported.
Choosing Filename
  • Step 7: Click on the OK button. A window will pop up on your screen showing the successful execution of the PostgreSQL import CSV job using pgAdmin.   
Successful Execution

Advantages and Disadvantages of using pgAdmin for PostgreSQL CSV Import

Advantages

  • The graphical user interface in pgAdmin is intuitive and user-friendly.
  • It allows more flexibility for data transformation and validation.

Disadvantages

  • It is practical only for smaller files, as the pgAdmin interface offers limited options for bulk loading.
  • It is slower than the COPY command when importing large files.

Method 3: Using \copy Command (psql Meta-Command)

The \copy command can be used to import a CSV file into PostgreSQL when you are working interactively in ‘psql’ and want to load data directly from your local machine or a specified file path.

  • Step 1: Use ‘psql’ to Connect to your Database
psql postgres://<username>:<password>@<host>:<port>/<database>

# for example
psql postgres://postgres:postgres@localhost:5432/postgres
  • Step 2: Import the Data from the CSV file

Once connected to psql, you can use the \copy command to import the data. Make sure to specify the correct path to your CSV file.

\copy employees FROM 'employees.csv' WITH CSV HEADER;
  • Step 3: Verify Data Import

After running the \copy command, you can verify that the data has been imported by querying the table:

SELECT * FROM employees;

You can exit ‘psql’ using the ‘\q’ command.
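
If the column order in your file differs from the table, or the file uses a different delimiter, \copy accepts the same options as COPY. Here is a sketch, assuming the employees table and employees.csv used earlier in this article:

\copy employees(emp_id,first_name,last_name,dob,city) FROM 'employees.csv' WITH (FORMAT csv, HEADER true, DELIMITER ',')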

Advantages and Disadvantages of using the \copy command

Advantages

  • It is an easy-to-use and straightforward method without the need for any additional scripting or tooling.
  • The \copy command quickly transfers large amounts of data from CSV files into your PostgreSQL tables.

Disadvantages

  • The \copy command has a relatively limited set of options for data transformation and validation.
  •  It requires an understanding of SQL and the PostgreSQL command line interface.

Method 4: Using Cloud SQL for PostgreSQL

Google Cloud Platform offers a fully-managed database service called Cloud SQL for PostgreSQL, which allows users to easily create, manage, and scale PostgreSQL databases in the cloud. One of the key capabilities of Cloud SQL for PostgreSQL is the ability to import CSV files directly into PostgreSQL tables.

The process to import a CSV file into a PostgreSQL table using Cloud SQL for PostgreSQL involves the following steps:

Step 1: Create a new Cloud SQL for PostgreSQL instance and database within the Google Cloud Platform.

Step 2: Upload the CSV file that needs to be imported into the PostgreSQL table to Google Cloud Storage.

Step 3: Import the CSV file from its Cloud Storage location into the target PostgreSQL table. Note that the PostgreSQL COPY command cannot read a gs:// path directly; Cloud SQL imports from Cloud Storage are triggered from outside the database instead, either through the instance’s Import dialog in the Google Cloud Console or with the gcloud command-line tool, which then loads the file into the table you specify.
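
A minimal gcloud sketch, assuming the instance is named my-instance and the target database is mydb (both placeholders), with the CSV already uploaded to the bucket shown earlier:

gcloud sql import csv my-instance gs://bucket-name/path/to/csv/file.csv \
  --database=mydb \
  --table=table_name

The --database and --table flags tell Cloud SQL where to load the rows; adjust the names to match your setup.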

Advantages and Disadvantages of using Cloud SQL for PostgreSQL

Advantages

  • Fully-managed PostgreSQL service by Google, handling infrastructure and maintenance
  • Automatic scalability to accommodate varying CSV import workloads

Disadvantages

  • Potentially higher ongoing costs compared to self-hosting PostgreSQL
  • Limited customization options as a managed service

Method 5: Using Hevo

Hevo Data is a no-code data pipeline solution that can help you move data from 150+ data sources like FTP/SFTP and Google Sheets to your desired destination, such as PostgreSQL, data warehouses, or BI tools, in a completely hassle-free and automated manner. Using Hevo, you can easily upload files in formats such as CSV, JSON, and XML to your PostgreSQL database.


To effortlessly perform the PostgreSQL import CSV job using Hevo, follow the simple steps given below:

  • Step 1: Connect Hevo Data with your system by setting FTP/SFTP as a source. You can provide a unique name to your Pipeline along with information such as Port Number, Username, Password, etc. Hevo currently supports CSV, JSON, and XML formats. You can specify the file format as CSV and the Delimiter as “,”.
Configure FTP/SFTP as your Source
  • Step 2: Complete the Postgres import CSV job by providing your PostgreSQL database credentials such as your authorized Username and Password, along with information about your Host IP Address and Port Number value. You will also need to provide a name for your database and a unique name for this destination.
Configure PostgreSQL as your Destination


Conclusion

In this article, you have learned how to effectively perform the PostgreSQL import CSV task via five different methods.

  • If you require analysis-ready, high-quality data in real time from several sources, then a cloud-based ETL tool like Hevo Data is the right choice for you!
  • However, if you rarely import data into PostgreSQL and don’t need to perform data transformations and cleaning operations, then the four manual data integration methods will work.
  • Hevo also supports PostgreSQL as a source for loading data to a destination of your choice. Hevo’s fault-tolerant architecture ensures that the data is handled in a secure, consistent manner with zero data loss.

FAQ on Import CSV File Into PostgreSQL Table

How to import CSV file to PostgreSQL?

Use the COPY command (or the \copy meta-command in psql) to import a CSV file into PostgreSQL.

How do I import a CSV file into a table in pgAdmin?

1. Open pgAdmin
2. Navigate to the target table
3. Configure import settings
4. Map Columns
5. Execute Import

Can you add Excel CSV data in PostgreSQL?

1. Save the Excel file as a CSV
2. Follow the CSV import steps above and use the COPY command

Sanchit Agarwal
Research Analyst, Hevo Data

Sanchit Agarwal is an Engineer turned Data Analyst with a passion for data, software architecture and AI. He leverages his diverse technical background and 2+ years of experience to write content. He has penned over 200 articles on data integration and infrastructures, driven by a desire to empower data practitioners with practical solutions for their everyday challenges.
