Data parsing is the process of transforming data from one format to another to make it more readable and ready for analysis. Since version 9.2, PostgreSQL has introduced a number of JSON operators and functions to parse JSON data. The PostgreSQL parse JSON operation helps you and your workgroup break JSON structures such as objects and arrays into smaller pieces of data, resulting in improved data accessibility for everyone who interacts with your PostgreSQL database.

In this Parse JSON PostgreSQL guide, we discuss how parsing helps you make sense of the information encoded in your JSON files. You’ll learn about the different PostgreSQL parse JSON operators and functions that let you parse your JSON files and make them more accessible to your team.

Why is PostgreSQL Parse JSON Required?

To understand the importance of the PostgreSQL parse JSON operation, consider a common scenario. Let’s say you asked your sales team for data on recent customers who purchased a product from your website.

They agreed to give you the details, but you received something unexpected like this.

{
  "customers": [{
    "id": 1,
    "first_name": "Dayna",
    "last_name": "Wile",
    "gender": "Female",
    "email": "",
    "phone": "989-311-5262"
  }, {
    "id": 2,
    "first_name": "Bliss",
    "last_name": "Pogosian",
    "gender": "Female",
    "email": "",
    "phone": "466-731-8534"
  }, {
    "id": 3,
    "first_name": "Lalo",
    "last_name": "Dener",
    "gender": "Male",
    "email": "",
    "phone": "656-432-6184"
  }, {
    "id": 4,
    "first_name": "Gayleen",
    "last_name": "Mateescu",
    "gender": "Female",
    "email": "",
    "phone": "281-359-3811"
  }, {
    "id": 5,
    "first_name": "Tadd",
    "last_name": "Lotherington",
    "gender": "Male",
    "email": "",
    "phone": "541-923-6204"
  }]
}

Clearly, this format isn’t suitable for storage as-is. To get these records into a relational database like PostgreSQL, you need the PostgreSQL parse JSON operation.

PostgreSQL Parse JSON operation will take your JSON key-value pairs and convert them into a format that is compatible with PostgreSQL’s table structure. You can parse the JSON objects using PostgreSQL’s JSON operators and functions, and then insert the values into your desired PostgreSQL tables for future use.

What Makes Hevo’s ETL Process Best-In-Class

Are you looking for an easy way to replicate your PostgreSQL data? Hevo’s Automated, No-Code data pipeline Platform helps you with everything you need to replicate data from 150+ Data Sources like PostgreSQL to a destination.

Try our 14-day free trial to understand how seamless replication can be.

Sign up here for a 14-Day Free Trial!

Next up, we discuss how to perform the parse JSON PostgreSQL operation.

How Does PostgreSQL JSON Parse Work?

Since the release of PostgreSQL 9.2, there have been several significant enhancements, including support for JSON data types. PostgreSQL now provides various functions and operators for users to work with JSON data. 

PostgreSQL Parse JSON Operators

Some commonly used JSON operators available for use in PostgreSQL are listed below:

json -> integer → json
Extracts the nth element of a JSON array (array elements are indexed from zero, and negative integers count from the end).
'[{"id": 1, "first_name": "Walther", "last_name": "Laybourn"}, {"id": 2, "first_name": "Calley", "last_name": "Bamblett"}, {"id": 3, "first_name": "Cecelia", "last_name": "Asquez"}]'::json -> 1 → {"id": 2, "first_name": "Calley", "last_name": "Bamblett"}

json -> text → json
Extracts a JSON object field with the given key.
'{"Name": {"Calley":"Bamblett"}}'::json -> 'Name' → {"Calley":"Bamblett"}

json ->> integer → text
Extracts the nth element of a JSON array, as text.
'[10,20,30]'::json ->> 2 → 30

json ->> text → text
Extracts a JSON object field with the given key, as text.
'{"Walther":1,"Laybourn":2}'::json ->> 'Laybourn' → 2

json #> text[] → json
Extracts the JSON sub-object at the specified path, where path elements can be either field keys or array indexes.
'{"Customer": {"FullName": ["Walther","Laybourn"]}}'::json #> '{Customer,FullName,1}' → "Laybourn"

json #>> text[] → text
Extracts the JSON sub-object at the specified path, as text.
'{"Customer": {"FullName": ["Walther","Laybourn"]}}'::json #>> '{Customer,FullName,1}' → Laybourn
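As a quick sketch of how these operators combine in a query (using the hypothetical customers table and customer_json_info column from this guide):

```sql
-- Extract individual fields from a JSON column.
SELECT
  customer_json_info -> 'id'          AS id_json,    -- result stays JSON
  customer_json_info ->> 'first_name' AS first_name  -- result is text
FROM customers;

-- Path extraction works the same way on a literal:
SELECT '{"Customer": {"FullName": ["Walther","Laybourn"]}}'::json
       #>> '{Customer,FullName,0}' AS first_name;    -- Walther
```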

Key PostgreSQL Parse JSON Functions

The table below shows JSON functions that are available for use in PostgreSQL.

to_json ( anyelement ) → json
Converts an SQL value to JSON. Any SQL arrays or composites provided are converted to JSON arrays and objects.
to_json('Walther Laybourn says "Hi."'::text) → "Walther Laybourn says \"Hi.\""

array_to_json ( anyarray [, boolean ] ) → json
Transforms an SQL array to a JSON array.
array_to_json('{{22,24},{87,89}}'::int[]) → [[22,24],[87,89]]

row_to_json ( record [, boolean ] ) → json
Transforms an SQL composite value to a JSON object.
row_to_json(row(1,'Walther')) → {"f1":1,"f2":"Walther"}

json_object ( text[] ) → json
Creates a JSON object out of a text array of alternating keys and values.
json_object('{"FirstName", "Walther", "SecondName", "Laybourn"}') → {"FirstName" : "Walther", "SecondName" : "Laybourn"}

json_object ( keys text[], values text[] ) → json
This JSON function takes keys and values pairwise from two separate text arrays.
json_object('{FirstName,SecondName}', '{Walther,Laybourn}') → {"FirstName" : "Walther", "SecondName" : "Laybourn"}

json_array_elements ( json ) → setof json
Expands the top-level JSON array into a set of JSON values.
select * from json_array_elements('["customers",1,["FirstName","Walther"]]') →

          value
-------------------------
 "customers"
 1
 ["FirstName","Walther"]

json_array_length ( json ) → integer
Returns the number of elements in a JSON array as an integer.
json_array_length('[1,2,3,{"s1":1,"s2":[5,6]},{"s3":[6,7,8]},4]') → 6

json_each ( json ) → setof record ( key text, value json )
Expands the top-level JSON object into a set of key/value pairs.
select * from json_each('{"FirstName":"Walther", "LastName":"Laybourn"}') →

    key    |    value
-----------+-------------
 FirstName | "Walther"
 LastName  | "Laybourn"

json_object_keys ( json ) → setof text
Returns the set of keys in the top-level JSON object.
select * from json_object_keys('{"p1":"oranges","p2":{"p3":"apples"}}') →

 json_object_keys
------------------
 p1
 p2
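For instance, row_to_json and json_each can round-trip a row; here is a small sketch that needs no existing table:

```sql
-- Build a JSON object from a row...
SELECT row_to_json(t) AS as_json
FROM (SELECT 1 AS id, 'Dayna' AS first_name) AS t;
-- → {"id":1,"first_name":"Dayna"}

-- ...then take it apart again into key/value pairs.
SELECT * FROM json_each('{"id": 1, "first_name": "Dayna"}');
```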

PostgreSQL supports many more JSON functions and operators than the ones listed here. You can find more information in the official PostgreSQL documentation: PostgreSQL JSON Functions and Operators.

PostgreSQL Parse JSON Operation In Action

Now that you know the frequently used PostgreSQL parse JSON functions and operators, let’s return to our earlier problem and see how to parse the customer JSON we received from the sales team.

1) Inserting PostgreSQL JSON Data

After receiving the customer JSON, you can use an INSERT statement to insert the JSON data into your PostgreSQL tables.

Here’s the command to do so:

-- assuming a table such as: CREATE TABLE customers (customer_json_info json);
INSERT INTO customers (customer_json_info)
VALUES
    ('{ "id": 1, "first_name": "Dayna", "last_name": "Wile", "gender": "Female", "email": "", "phone": "989-311-5262"}'),
    ('{ "id": 2, "first_name": "Bliss", "last_name": "Pogosian", "gender": "Female", "email": "", "phone": "466-731-8534"}'),
    ('{ "id": 3, "first_name": "Lalo", "last_name": "Dener", "gender": "Male", "email": "", "phone": "656-432-6184"}'),
    ('{ "id": 4, "first_name": "Gayleen", "last_name": "Mateescu", "gender": "Female", "email": "", "phone": "281-359-3811"}'),
    ('{ "id": 5, "first_name": "Tadd", "last_name": "Lotherington", "gender": "Male", "email": "", "phone": "541-923-6204"}');

Once it’s in the database, your PostgreSQL customer_json_info column would look like this:

{ "id": 1, "first_name": "Dayna", "last_name": "Wile", "gender": "Female", "email": "", "phone": "989-311-5262"}
{ "id": 2, "first_name": "Bliss", "last_name": "Pogosian", "gender": "Female", "email": "", "phone": "466-731-8534"}
{ "id": 3, "first_name": "Lalo", "last_name": "Dener", "gender": "Male", "email": "", "phone": "656-432-6184"}
{ "id": 4, "first_name": "Gayleen", "last_name": "Mateescu", "gender": "Female", "email": "", "phone": "281-359-3811"}
{ "id": 5, "first_name": "Tadd", "last_name": "Lotherington", "gender": "Male", "email": "", "phone": "541-923-6204"}

2) Querying PostgreSQL JSON Data

To query data, you can use a SELECT statement like in any other SQL query. You can use the native PostgreSQL operators to query the data in PostgreSQL.

  • The operator -> returns a JSON object field by key.
  • The operator ->> returns a JSON object field as text.

As an example, if you run a SELECT query,

SELECT CONCAT(customer_json_info ->> 'first_name', ' ',
              customer_json_info ->> 'last_name') AS full_name
FROM customers;

You will receive the following output.

Dayna Wile
Bliss Pogosian
Lalo Dener
Gayleen Mateescu
Tadd Lotherington

You can also run another SELECT query to expand the top-level JSON object into a set of key/value pairs. 

Let’s consider a simple table of our first two customers: Dayna Wile, who purchased a T-shirt from our online store, and Bliss Pogosian, who purchased a cap.

| id | first_name | last_name | product_description                        |
|----|------------|-----------|--------------------------------------------|
| 1  | Dayna      | Wile      | {"category":"apparel","product":"tshirt"}  |
| 2  | Bliss      | Pogosian  | {"category":"accessories","product":"cap"} |

You can return a set of rows with key and value column names using the json_each_text PostgreSQL parse JSON function. 

SELECT * FROM customers,
json_each_text(customers.product_description) AS json_data
WHERE value = 'apparel';


| id | first_name | last_name | product_description                       | key      | value   |
|----|------------|-----------|-------------------------------------------|----------|---------|
| 1  | Dayna      | Wile      | {"category":"apparel","product":"tshirt"} | category | apparel |

3) Postgres JSON Parse Operation

Here’s another simple example to parse PostgreSQL JSON fields. You can write an SQL query to output an array of all customer_note keys:

CREATE TABLE customers (
    id serial PRIMARY KEY,
    customers_json_2 json
);

INSERT INTO customers (customers_json_2) VALUES (
  '{"customer_note": {"note_199": {"message_number": 1453197190, "status": 1}, "note_215": {"message_number": 1459325611, "status": 1}, "note_219": {"message_number": 1454244074, "status": 1}, "note_225": {"message_number": 1453981312, "status": 1}, "note_229": {"date": 1459327685, "status": 1}}}'::json
);

When you run the SELECT command with json_object_keys PostgreSQL parse JSON operation,

SELECT json_object_keys(customers_json_2->'customer_note') FROM customers;

You get the following output.

 json_object_keys
------------------
 note_199
 note_215
 note_219
 note_225
 note_229
(5 rows)
Here’s another query to return the type of the top-level JSON value.

SELECT json_typeof('{ "id": 1, "first_name": "Dayna", "last_name": "Wile", "gender": "Female", "email": "", "phone": "989-311-5262"}');

 json_typeof
-------------
 object
(1 row)

There are many more interesting functions that operate on JSON, such as jsonb_path_query, json_to_record, and jsonb_insert, that can be used to perform parse JSON PostgreSQL operations. In this guide, we focused on some basic examples, but the same fundamental concepts allow you to parse complex JSON files in your PostgreSQL tables.
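As a small sketch of one of these, json_to_record maps a JSON object onto a regular row; the column list you declare in the AS clause defines the output shape:

```sql
-- Turn one of our customer JSON documents into typed columns.
SELECT *
FROM json_to_record('{ "id": 1, "first_name": "Dayna", "last_name": "Wile"}')
     AS x(id int, first_name text, last_name text);
-- one row: 1 | Dayna | Wile
```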

4) How to Convert JSON to PostgreSQL Table 

First, create a table with a JSON data type column. You can assign the JSON or JSONB data type to a column just as you would an INT, VARCHAR, or DOUBLE PRECISION data type.

Here’s an example of creating a table journal and giving the column “diary_information” the data type JSON.

CREATE TABLE journal (
  id serial PRIMARY KEY,  -- id and day are included to match the INSERT below
  day VARCHAR(16),
  diary_information JSON
);
Next, to insert JSON data into the table, use this command:

INSERT INTO journal (id, day, diary_information)
VALUES (1, 'Tuesday', '{"title": "My first day at work", "Feeling": "Mixed feeling"}');

5) How to Extract Values from a JSON Array

Consider a table named ’employees’, which includes a column ‘skills’. This column contains an array of JSON objects, each representing the skills of an individual employee. If you need to identify the employees who possess “Python” as a skill, you can employ the ->> operator to extract the “name” attribute from each skill object. Additionally, use the @> operator to verify if “Python” is present within the array of skills.

SELECT *
FROM employees
WHERE skills @> '[{"name": "Python"}]'::jsonb;  -- the containment operator @> requires jsonb
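To actually list each employee’s skills with the ->> extraction mentioned above, here is a hedged sketch assuming the employees table stores skills as a jsonb array:

```sql
-- Expand the skills array into one row per skill, then pull out the name.
SELECT e.id,
       skill ->> 'name' AS skill_name
FROM employees AS e,
     jsonb_array_elements(e.skills) AS skill;
```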

Fixing issues in querying JSON columns

Troubleshooting queries against JSON columns in PostgreSQL means identifying and resolving issues around data integrity, query efficiency, and syntax errors. It usually takes a combination of SQL expertise, an understanding of the JSON data format, and careful query tuning.

By addressing these typical challenges and following the recommended practices, you can become more proficient at managing JSON data in PostgreSQL. Below is a compilation of standard troubleshooting steps and potential pitfalls to watch for when querying JSON columns.

Nested JSON structures

Querying JSON columns, as demonstrated above, is relatively straightforward. It becomes more challenging with nested JSON structures, where you must employ suitable JSON operators and functions to traverse and query nested JSON objects and arrays.

Operators such as ->, ->>, #>, and #>> prove invaluable for accessing nested elements. The -> operator yields a JSON object, while ->> provides the value in text format. By chaining these operators in a sequence, you can navigate nested JSON structures to extract the desired information.
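For example, chaining -> and ->> steps down through the nesting one level at a time (a sketch on a JSON literal):

```sql
-- -> keeps the result as JSON so the next operator can keep drilling down;
-- the final ->> returns plain text.
SELECT '{"customer": {"address": {"city": "Pune"}}}'::json
       -> 'customer' -> 'address' ->> 'city' AS city;
-- → Pune
```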

Inaccurate JSON Path Specification

Although it may appear obvious, incorrect query outcomes or query failures caused by specifying the wrong JSON path are surprisingly common. Error messages such as “cannot extract elements from a scalar” or “JSON path not found” are typical examples. Carefully verify the JSON path used in your queries, particularly with nested structures; tools like JSON viewers that visualize the JSON structure can be highly beneficial here.

Error Handling

Data quality remains a prevalent issue across industries. Low-quality data can result in sporadically missing keys, and without error handling, even a single missing key in a large query can disrupt execution and trigger an error. To keep your queries from failing entirely because of a few missing keys, incorporate error-handling mechanisms.

To handle such situations gracefully, you can use the COALESCE function or conditional logic to provide a default value when a JSON key is absent. Instead of failing outright, the query then yields “Uncategorized” for the missing keys while still retrieving values for the keys that exist.
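A sketch of that COALESCE fallback, reusing the product_description column from the earlier customers example:

```sql
-- Fall back to 'Uncategorized' when the 'category' key is missing or null.
SELECT first_name,
       COALESCE(product_description ->> 'category', 'Uncategorized') AS category
FROM customers;
```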

Before we wrap up, let’s go through some basics as well.

What is the JSONB Data Type? How Does it Vary from JSON?

JSONB (JSON Binary) is a data type in PostgreSQL that enables you to store and manage JSON data more effectively and efficiently than the standard JSON data type.

JSONB saves JSON data in binary representation, allowing for better indexing and query performance than conventional JSON. This is because the binary format is more efficient for storing and retrieving JSON data, especially when working with big or complicated JSON objects.

In addition, JSONB provides other indexing options, such as the ability to index certain keys within a JSON object, allowing for speedier searches.

PostgreSQL’s normal JSON data type stores JSON data in plain text with no binary encoding or special indexing capabilities. This makes it easier to use, but it may result in worse query performance when working with big or sophisticated JSON documents.

The main differences between JSONB and JSON are:

  • JSONB is faster to query and manipulate than JSON because it does not need to be parsed each time.
  • JSONB supports indexing, which can improve the performance of some queries.
  • JSONB does not preserve the order of object keys, the white space, or the duplicate keys in the input, while JSON does.
  • JSONB usually takes more disk space to store than JSON and more time to build from the input.

One of the advantages of using JSONB over JSON is that JSONB lets you parse JSON in PostgreSQL more efficiently, as the data does not need to be re-parsed on every element access.

You can use JSONB if you need to perform many operations on the JSON data in PostgreSQL or use indexing on some JSON fields. You can use JSON if you only need to store and retrieve the JSON data as it is and do not care about the order, white space, or duplicate keys.
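The difference is easy to see on a literal with duplicate keys: casting to jsonb keeps only the last duplicate and discards insignificant whitespace, while json stores the text verbatim:

```sql
SELECT '{"b": 1, "a": 2, "a": 3}'::json;   -- stored exactly as written
SELECT '{"b": 1, "a": 2, "a": 3}'::jsonb;  -- → {"a": 3, "b": 1}
```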

Querying a JSONB Column in PostgreSQL

You can parse JSONB columns using PostgreSQL’s JSON functions and operators. Some often-used functions and operators are:

  • ->: Extracts a JSON element by key or array index.
  • ->>: Extracts a JSON element as text.
  • #>: Extracts a JSON sub-object at a specified path.
  • #>>: Extracts a JSON sub-object as text.
  • @>: Checks if a JSON document contains another JSON document.
  • <@: Checks if a JSON document is contained within another JSON document.
  • jsonb_array_elements(): Expands a JSON array into a set of rows.

  • jsonb_each(): Expands a JSON object into key-value pairs.
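A sketch of querying a hypothetical jsonb column (orders.details is an assumed name), including the GIN index that makes @> containment checks fast:

```sql
-- Containment check on a jsonb column.
SELECT *
FROM orders
WHERE details @> '{"status": "shipped"}';

-- A GIN index lets @>, ?, ?| and ?& use the index instead of a full scan.
CREATE INDEX idx_orders_details ON orders USING GIN (details);
```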



PostgreSQL offers a number of ways to perform a parse JSON PostgreSQL operation using its various JSON operators and functions. In this guide, we discussed a select few to help you understand how parsing, or breaking apart, JSON objects can give you a better understanding of the data in your PostgreSQL tables.

If you are a PostgreSQL user, replicating data into a warehouse using ETL for data analysis can get demanding. For starters, this problem is exacerbated by the fact that you need substantial money and resources to hire data engineers and analysts to make sense of this data. Luckily, you can set up and start data replication from PostgreSQL to your favorite warehouse in a matter of minutes using Hevo.

Hevo is the only real-time ELT no-code data pipeline platform that cost-effectively automates data pipelines that are flexible to your needs. With integrations with 150+ data sources (40+ free sources), we help you not only export data from sources and load it into destinations but also transform and enrich your data to make it analysis-ready.

Visit our Website to Explore Hevo

Give Hevo a try. Sign up here for a 14-day full-feature-access trial and experience the feature-rich Hevo suite firsthand. You can also check our pricing and decide on your best-suited plan.

Share your thoughts on learning about PostgreSQL Parse JSON operation in the comments section below. If you have any questions, do let us know. We’d be happy to help.

Divyansh Sharma
Marketing Research Analyst, Hevo Data

Divyansh is a Marketing Research Analyst at Hevo who specializes in data analysis. He is a BITS Pilani Alumnus and has collaborated with thought leaders in the data industry to write articles on diverse data-related topics, such as data integration and infrastructure. The contributions he makes through his content are instrumental in advancing the data industry.
