Azure SQL PostgreSQL Integration: 3 Easy Methods

on Data Integration, Database Management Systems, Microsoft, Microsoft Azure, PostgreSQL • April 12th, 2022

Microsoft Azure, often referred to simply as Azure, is a cloud computing service operated by Microsoft for application management via Microsoft-managed data centres. It provides infrastructure as a service (IaaS), platform as a service (PaaS) and software as a service (SaaS) capabilities.

Under the Azure hood, Microsoft provides many tools, frameworks and services, such as Virtual Machines, Apps, Active Directory, Storage and Communication services, a Data Warehouse (Synapse Analytics), Data Lakes, Data Explorer, Blockchain Workbench and more.

This article introduces you to Azure SQL and PostgreSQL and discusses 3 easy methods to set up the Azure SQL PostgreSQL integration. Read along to learn their steps and decide which method is best for your business!


Prerequisites

At the outset, you must understand that although both Azure SQL and PostgreSQL are ANSI SQL compliant, there are subtle differences in SQL syntax, the data types offered, and how things are implemented internally. To set up the Azure SQL PostgreSQL integration, you must:

  • Map the data types between the two databases and find appropriate data types in the destination database that can house the data types coming from the source database. 
  • Add tables or fields in the source database that record the time of the latest data transfer, such as timestamps or updated_at/created_at columns. These help you identify the records that need to be updated and decide the frequency of data refreshes (see the sketch after this list). 
  • Identify product-specific features that are not present in the other database, and plan for alternatives. 
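
For instance, here is a minimal sketch of such a tracking column on a hypothetical orders table. The table and column names are illustrative; PostgreSQL syntax is shown, and on Azure SQL the equivalent column type would be DATETIME2 with a SYSUTCDATETIME() default:

-- Hypothetical example: add a change-tracking column to a source table
-- so each sync run can pick up only rows modified since the last run.
ALTER TABLE orders
    ADD COLUMN updated_at TIMESTAMP WITH TIME ZONE NOT NULL DEFAULT now();

-- During a sync, select only the records changed since the previous transfer.
SELECT *
FROM orders
WHERE updated_at > '2022-04-01 00:00:00';  -- timestamp of the last successful transfer

Note that in PostgreSQL the default above only applies to newly inserted rows; keeping updated_at current on updates requires an extra trigger or application-side logic.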

What is Azure SQL?


Azure SQL is a database engine provided by Microsoft as a fully managed platform as a service (PaaS). In the background, Azure SQL runs the latest stable version of Microsoft SQL Server, and the latest improvements and new features are served first via Azure SQL and later via SQL Server.

All infrastructure is managed by Azure, which handles the upgrades, patching, backups and monitoring for you. Some of the more prominent features of Azure SQL are:

  • High-performance in-memory technologies and intelligent query processing. 
  • Support for relational data as well as non-relational structures (graphs, JSON, XML etc.). 
  • Dynamic scalability, high performance and enhanced security with data encryption. 
  • Built-in performance monitoring, tuning and alerting capabilities. 
  • Two purchasing models, three service tiers and multiple deployment models to suit different needs. 

You can learn more about Azure SQL here.

What is PostgreSQL?


PostgreSQL is an open-source database management system that has equally good support for NoSQL (non-relational, multi-valued, hierarchical) data types as well as relational (single-valued) data types. 

It has gained prominence and is in widespread use as NoSQL data becomes more and more important. It is free, open-source and has been under community development for more than two decades. 
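
As a quick illustration of that mixed support, the sketch below (with an illustrative table name) stores a JSONB document next to ordinary relational columns and queries a key inside it:

-- Hypothetical example: relational columns and a JSONB document side by side.
CREATE TABLE user_events (
    id        SERIAL PRIMARY KEY,
    user_name VARCHAR(100),
    payload   JSONB              -- free-form, hierarchical data
);

INSERT INTO user_events (user_name, payload)
VALUES ('alice', '{"action": "login", "device": {"os": "linux"}}');

-- Query a nested key inside the JSON document.
SELECT user_name, payload -> 'device' ->> 'os' AS os
FROM user_events
WHERE payload ->> 'action' = 'login';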

You can learn more about PostgreSQL here.

What is the Need to Set Up Azure SQL PostgreSQL Integration?

Azure SQL is a paid solution whereas PostgreSQL is free, so in cases where the budget is limited, PostgreSQL is a good choice. 

Azure SQL is proprietary software whereas PostgreSQL is developed and actively maintained by a global community of open-source developers, so PostgreSQL tends to be more versatile and more open to integration with diverse technologies. 

PostgreSQL is not a managed cloud service by default, whereas Azure SQL is maintained and operated by Microsoft as a PaaS, so in cases where technical resources should go into innovation and business logic rather than infrastructure maintenance, Azure SQL is a good choice. Therefore, it is good to set up the Azure SQL PostgreSQL integration.

Methods to Set Up Azure SQL PostgreSQL Integration

Method 1: Azure SQL PostgreSQL Integration Using Hevo Data

Hevo Data, an Automated Data Pipeline, provides you with a hassle-free solution to connect Azure SQL to PostgreSQL within minutes with an easy-to-use no-code interface. Hevo is fully managed and completely automates the process of not only loading data from Azure SQL but also enriching the data and transforming it into an analysis-ready form without having to write a single line of code.

Hevo’s fault-tolerant Data Pipeline offers a faster way to move data from databases or SaaS applications into your PostgreSQL database. Hevo’s pre-built integrations with Azure SQL (among 100+ data sources, including 40+ free data sources) and PostgreSQL take full charge of the data transfer process, allowing you to focus on key business activities.

Get Started with Hevo for Free

Method 2: Azure SQL PostgreSQL Integration Using Fine Grained Manual Connections

This method requires you to connect the two databases manually. Though this method is lengthy and complex, it gives you the utmost control and flexibility in your migration process. You can fine-tune your data types and add data-specific optimizations to suit your purpose. 

Method 3: Azure SQL PostgreSQL Integration Using Free Automation Tools

A tool can let you set a replication frequency and give up-to-the-minute stats on synced and replicated data. You could use tools like pgloader or sqlserver2pgsql to connect Azure SQL to PostgreSQL. 

The three methods are explained in detail below.

Methods to Set Up Azure SQL PostgreSQL Integration

This section discusses methods to connect Azure SQL to PostgreSQL and migrate data. With a little adaptation, these methods can also work the other way round, i.e. connect PostgreSQL to Azure SQL.  

Method 1: Azure SQL PostgreSQL Integration Using Hevo Data


Hevo Data, a No-code Data Pipeline, helps you directly transfer data from Azure SQL and 100+ other data sources (including 40+ free data sources) to Data Warehouses or a destination of your choice, such as PostgreSQL, in a completely hassle-free & automated manner. Hevo allows you to move data to the desired data destination.

Hevo is fully managed and completely automates the process of not only loading data from your desired source but also enriching the data and transforming it into an analysis-ready form without having to write a single line of code. It provides a consistent & reliable solution to manage data in real-time and always have analysis-ready data in your desired destination.

You can establish the Azure SQL PostgreSQL connection via Hevo Data using the following 2 steps:

  • Configure Source: Establish your Azure SQL account as the source for your Hevo Pipeline. You can learn more about this configuration here.
  • Authenticate Destination: Set up PostgreSQL as your destination with Hevo.

More Reasons to Choose Hevo Data

  • Fully Managed: It requires no maintenance as Hevo is a fully automated platform.
  • Data Transformation: It provides a simple interface to perfect, modify, and enrich the data you want to transfer. 
  • Fault-Tolerant: Hevo is capable of detecting anomalies in the incoming data and informs you instantly. All the affected rows are kept aside for correction so that it doesn’t hamper your workflow.
  • Real-Time: Hevo offers real-time data migration. So, your data is always ready for analysis.
  • Schema Management: Hevo automatically detects the schema of the incoming data and maps it to the destination schema.
  • Live Monitoring: Advanced monitoring gives you a one-stop view to watch all the activities that occur within pipelines.
  • Live Support: The Hevo team is available round the clock to extend exceptional support to its customers through chat, email, and support calls.
Sign up here for a 14-Day Free Trial!

Method 2: Azure SQL PostgreSQL Integration Using Fine Grained Manual Connections

Your data in PostgreSQL or Azure SQL may not necessarily be single-valued or relational; it could be multi-valued as well as hierarchically structured, so choose your destination data types accordingly. 

Step 1: Map Data Types and Prepare the Target PostgreSQL Schema

PostgreSQL gives you Array/JSON/XML/Enumeration/Geometric/Range types as composite/multi-valued types. 

Azure SQL gives you NVARCHAR/VARBINARY/XML etc. So a JSON object in PostgreSQL should be mapped to NVARCHAR in Azure SQL, and binary (bytea) data to VARBINARY in Azure SQL. 
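
As an illustration of such a mapping, here is a hypothetical column-by-column pairing; the table and column names are illustrative only and not taken from any schema in this guide:

-- Azure SQL (T-SQL) side: a hypothetical table.
CREATE TABLE dbo.customer_profile (
    id           INT IDENTITY(1,1) PRIMARY KEY,
    display_name NVARCHAR(100),
    profile_json NVARCHAR(MAX),   -- JSON stored as text in Azure SQL
    avatar       VARBINARY(MAX),  -- binary data
    created_at   DATETIME2
);

-- PostgreSQL side: the corresponding table with native types.
CREATE TABLE customer_profile (
    id           SERIAL PRIMARY KEY,
    display_name VARCHAR(100),
    profile_json JSON,                     -- pairs with NVARCHAR(MAX) in Azure SQL
    avatar       BYTEA,                    -- pairs with VARBINARY(MAX) in Azure SQL
    created_at   TIMESTAMP WITH TIME ZONE  -- pairs with DATETIME2 in Azure SQL
);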

So, the first prerequisite is a working Azure SQL database with tables that contain data. If you don’t already have a database in Azure, create it using SQL Server Management Studio (SSMS). 


The first step is to create a schema in Postgres that can ingest the data from Azure SQL. This schema should have a set of tables with appropriate data types (integer, float, JSON, string etc.) to house each of the Azure SQL columns. 

To create a database in Postgres, enter:

CREATE DATABASE my_Azure_DB WITH ENCODING='UTF8' LC_CTYPE='en_US.UTF-8' LC_COLLATE='en_US.UTF-8' OWNER=postgres CONNECTION LIMIT=-1; 

Use UTF8 as it supports all languages and is one of the most encompassing encodings. The connection limit is set to -1, which means there is no limit on concurrent connections. The owner is postgres to allow the widest possible set of permissions for a smooth transfer. You can learn more about it here.

Next, create a table in Postgres:- 

CREATE TABLE my_pg_table (
  id SERIAL,                        -- auto-incrementing integer ID
  success BOOLEAN,
  text_field TEXT,
  my_chars VARCHAR(100),
  additional_data JSON,
  created_at TIMESTAMP WITH TIME ZONE
);

You can create more corresponding tables in PostgreSQL this way. 

Step 2: Create a PostgreSQL Insert Function and Load the Data

Then formulate a function that can insert rows into a PostgreSQL table, something like:

CREATE OR REPLACE FUNCTION insertdata(bool_from_azure BOOLEAN, text_from_azure TEXT, chars_from_azure VARCHAR(100), data_from_azure JSON)
  RETURNS VOID AS
$$
  -- The id column is omitted so PostgreSQL generates the SERIAL value itself.
  INSERT INTO my_pg_table (success, text_field, my_chars, additional_data, created_at)
  VALUES (bool_from_azure, text_from_azure, chars_from_azure, data_from_azure, now());
$$
LANGUAGE sql STRICT;

The id column is left out of the insert so that PostgreSQL generates the auto-increment (SERIAL) integers, which act as unique IDs for each record. Similarly, define insert functions for the other tables. Then call these functions one by one, supply them with the necessary data and populate the tables you just created. 
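
A call to the function defined above would then look like this (the sample values are placeholders):

-- Insert one row of exported Azure SQL data via the helper function.
SELECT insertdata(
    TRUE,                            -- success flag
    'row exported from Azure SQL',   -- text_field
    'short char data',               -- my_chars
    '{"source": "azure_sql"}'::json  -- additional_data
);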

SQL Server gives you a command-line utility called bcp that is useful for bulk imports as well as bulk exports, and it works against Azure SQL too. So, to bulk export your data from Azure SQL, you can run the bcp utility as:

bcp my_Azure_DB.dbo.My_Azure_Table out "C:\MyFolder\BCPFiles\My_Azure_TableFile.csv" -c -t, -S <your-server>.database.windows.net -U <username> -P <password>


This command makes a CSV backup of the whole table and places it in the specified folder. Next, Postgres will readily accept this CSV and do a bulk import via the COPY command. 

COPY my_pg_table FROM '/tmp/My_Azure_TableFile.csv' WITH (FORMAT csv, DELIMITER ',');

Step 3: Reverse Migration

Repeat Step 1 and Step 2 for every table in Azure, and import the data into Postgres. If you’re implementing the migration the other way round, from PostgreSQL to Azure SQL, the commands reverse: COPY ... TO replaces COPY ... FROM, and bcp ... in replaces bcp ... out. They will look something like this:

COPY my_pg_table TO '/tmp/My_PG_TableFile.csv' WITH (FORMAT csv, DELIMITER ',');
and then
bcp my_Azure_DB.dbo.My_Azure_Table in "C:\MyFolder\BCPFiles\My_PG_TableFile.csv" -c -t, -S <your-server>.database.windows.net -U <username> -P <password>

Method 3: Azure SQL PostgreSQL Integration Using Free Automation Tools

This method uses free tools that handle all the boilerplate work for you and automate the Azure SQL to PostgreSQL database migration.  

Two popular tools for the purpose are: 

  • sqlserver2pgsql – This tool migrates a Microsoft SQL Server database into a PostgreSQL database, as automatically as possible. 

This tool will first convert your Azure SQL schema to a PostgreSQL schema; all the data is then migrated in the next step. 

To facilitate this process, the tool will generate 3 types of files. 

  1. tables-before.sql: holds the converted schema; you can manually check whether everything looks good to you. 
  2. tables-after.sql: holds all the constraints on the Azure SQL data, like foreign keys, triggers etc., which are executed after the data migration. 
  3. tables-unsure.sql: in case the tool hits an impediment or is unsure of what mapping/conversion to execute, it stores those problematic instructions here. 

You will need to review tables-unsure.sql and either instruct the tool what needs to be done or identify the problematic conversions and perform them manually. To keep your data fresh and updated, this tool also creates an incremental version of the migration job, which you can run periodically to pick up what has changed since the last migration. 
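
For reference, a schema-conversion run looks roughly like the sketch below. The flag names (-f for the SQL Server schema dump, -b/-a/-u for the three generated files) are assumptions based on the project's README, so verify them against the tool's documentation before running it:

perl sqlserver2pgsql.pl -f azure_sql_schema.sql -b tables-before.sql -a tables-after.sql -u tables-unsure.sql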

  • pgloader: pgloader is a free, open-source tool for PostgreSQL that enables loading data from many other databases into PostgreSQL. 

pgloader intelligently uses the COPY command internally and maintains a separate file for problematic data, while continuing to load the good data as you mend the erroneous source records. 

pgloader will automatically detect the source schema and its associated referential integrity constraints, indexes, primary keys etc. It then finds appropriate matching data types in the PostgreSQL destination for the source data types, plans its execution and runs it. 

pgloader can also maintain a log, which can be helpful in case there are errors or you want to improve future migrations. 
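
In practice, a pgloader migration run can be as short as a single command naming the source and target connection strings. The sketch below uses placeholder credentials and host names; the mssql:// source prefix is the one pgloader documents for Microsoft SQL Server sources, and it is assumed here that it accepts an Azure SQL endpoint as well:

pgloader mssql://azure_user:password@my-server.database.windows.net/my_Azure_DB postgresql://postgres:password@localhost/my_Azure_DB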

Benefits of Azure SQL PostgreSQL Integration Using Method 2 & 3

  • You get fine-grained control over your collaboration/migration process. 
  • You can fine-tune the migration to suit your application-specific needs. 
  • Because you plan your connections in detail, you can catch bugs and improve the way you use data types in your database. 

Limitations of Azure SQL PostgreSQL Integration Using Method 2 & 3

  • To ensure proper migration, you will have to keep an eye on the migration process as it may ask for your help/instructions. 
  • All the above methods and tools are a bit old, from the era before modern cloud ETL and Big Data tooling gained prominence. 
  • With the advent of cloud-based data warehouses, data pipelining solutions and advancements in network and storage technologies, some powerful and versatile tools have gained prominence. These modern approaches to data ingestion deliver faster implementation than traditional ETL. 
  • They also let expensive technical resources focus on innovation, business intelligence and core functions rather than ETL.

Conclusion

This article introduced you to Azure SQL and PostgreSQL. It then provided a step-by-step guide to 3 easy methods to set up your Azure SQL PostgreSQL integration. The second method requires you to manually connect the two databases. The third method relies on external tools like pgloader to connect Azure SQL to PostgreSQL. A downside of using these tools is that they might not be versatile enough to support many source and destination types; they may work with only a small set of databases.

Visit our Website to Explore Hevo

Hevo Data, a No-code Data Pipeline provides you with a consistent and reliable solution to manage data transfer between a variety of sources like Azure SQL and a wide variety of Desired Destinations, with a few clicks. Hevo Data with its strong integration with 100+ sources (including 40+ free sources) allows you to not only export data from your desired data sources & load it to the destination of your choice, but also transform & enrich your data to make it analysis-ready so that you can focus on your key business needs and perform insightful analysis.

Want to take Hevo for a spin? Sign Up for a 14-day free trial and experience the feature-rich Hevo suite first hand. You can also have a look at the unbeatable pricing that will help you choose the right plan for your business needs.

Share your experience of learning about the Azure SQL PostgreSQL integration! Let us know in the comments section below!
