One noteworthy success story I recently came across was The Guardian newspaper’s migration of their mission-critical content publication application from MongoDB to PostgreSQL (“Bye-bye Mongo, Hello Postgres”). The migration was prompted by PostgreSQL’s advanced JSON functionality (which required minimal changes to their existing JSON data model), improved performance and availability, the potential to leverage SQL, and the ease of operating cloud-hosted PostgreSQL.

However, one challenge with migrating from one document database to another is that the interfaces of document databases are almost always custom, making it impossible to simply unplug one database and plug in another.

In this article, you’ll learn how Postgres JSON extraction of data is done, and get an overview of PostgreSQL and JSON and their respective features.

What is PostgreSQL?

PostgreSQL, also known as Postgres, is a free and open-source Relational Database Management System (RDBMS) that focuses on extensibility and SQL compliance. It was created at the University of California, Berkeley, and debuted on July 8, 1996, as the successor to the Ingres database. PostgreSQL stores data as structured objects rather than documents, and it adheres to standard SQL format and syntax.

It is written in C and has a monolithic architecture, meaning its components are tightly integrated and work together in a logical order. It offers community support, as well as additional commercial assistance for some of its paid customers. Thanks to its robust backup mechanisms, it is widely used in the healthcare, banking, and manufacturing industries.

To support both relational and non-relational queries on the data it stores, PostgreSQL handles both SQL and JSON. In other words, PostgreSQL lets you write SQL commands to process data in tables on the respective database servers. PostgreSQL is reportedly ranked fourth among the most popular databases in the world thanks to its extensive features and functionality.

Key Features of PostgreSQL

PostgreSQL has some distinguishing characteristics. Some of these are as follows:

  • PostgreSQL supports a wide range of data and document types, as well as customizations.
  • It has a monolithic architecture in which all of the components interact in an automated manner.
  • It is ACID (Atomicity, Consistency, Isolation, and Durability) compliant and runs on major operating systems.
  • It is ideal for transactional workflows, such as those found in banking systems, risk assessment, BI (Business Intelligence), and powering a variety of business applications.
  • It has numerous fail-safes and redundancies that ensure storage reliability.

What is JSON?

JSON is a popular text notation/format for structured data. This schema-less format stores data as key-value pairs and ordered lists (arrays). JSON was derived from JavaScript, but it is now supported by the majority of programming languages, most of which provide libraries for reading and writing it. JSON’s primary application is data exchange between web clients and web servers.
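
For illustration, here is a small JSON document (the field names and values are arbitrary) showing both key-value pairs and an ordered list:

{
  "name": "Alice",
  "meaning_of_life": 42,
  "hobbies": ["reading", "sailing"]
}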

JSON’s popularity has grown steadily since its introduction in the early 2000s, and the vast majority of publicly available web services now use it for data exchange. JSON travels as a string, which makes it convenient for sending data across a network; to work with it in JavaScript, however, you must first parse it into a JavaScript object, after which you can access the transferred data. JavaScript includes a global JSON object that speeds up these conversions and makes the format easier to use.

Key Features of JSON

Some key features are as follows:

  • JSON can be used in the development of JavaScript-based applications such as websites and browser extensions.
  • It serializes and transmits structured data over a network connection.
  • Its primary purpose is to transfer data from a server to web applications and vice versa.
  • It serves as a source of public data for Web services and APIs.
  • It’s used with a variety of modern programming languages.
  • It’s simple to read and write.
  • It employs a lightweight text-based interchange format.
  • It is language-independent and works well with a variety of languages.

Simplify PostgreSQL ETL & Analysis with Hevo’s No-code Data Pipeline

With Hevo’s no-code data pipeline, you can effortlessly load data from multiple sources—including PostgreSQL, Google Search Console, SaaS apps, Cloud Storage, SDKs, and streaming services—into your destination of choice.

Why Hevo for PostgreSQL ETL & Analysis?

  • No-Code Data Pipeline: Simplify the entire ETL process without writing a single line of code.
  • Supports 150+ Data Sources: Seamlessly integrate with over 150 sources, including 60+ free data sources.
  • Data Enrichment & Transformation: Hevo doesn’t just load data—it transforms and enriches it to make it analysis-ready.

Experience the power of a fully automated, code-free data pipeline with Hevo and transform your PostgreSQL data into actionable insights effortlessly.

Get Started with Hevo for Free

Data Types in Postgres JSON Extract

PostgreSQL supports several JSON-related data types:

  • JSON data can be stored in two formats: json and jsonb.
  • A third type, jsonpath, supports the SQL/JSON path language for efficient querying of JSON data (see the sketch after this list).
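
As a minimal sketch of the three types (the readings table, its columns, and the values are hypothetical; the "@?" jsonpath operator requires PostgreSQL 12 or later):

CREATE TABLE readings (
  raw json,   -- stored as an exact text copy of the input
  doc jsonb   -- stored in a decomposed binary format, indexable
);

INSERT INTO readings VALUES
  ('{"sensor": "a1", "value": 42}', '{"sensor": "a1", "value": 42}');

-- jsonpath: is there a value greater than 40?
SELECT doc @? '$.value ? (@ > 40)' AS above_threshold FROM readings;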

What is the distinction between JSON and JSONb?

The json data type stores an exact copy of the JSON input text and must be re-parsed every time it is used. As a result, it is faster to store but slower to process. It also preserves the order of object keys and keeps duplicate object keys, whereas jsonb does not.

The jsonb data type stores JSON in a “decomposed” binary format, which is slower to write initially (due to conversion overhead) but faster to process later (because no re-parsing is required), and it may take up slightly more disk space. The main advantage of jsonb is that it supports indexing, which allows for more efficient querying.

The general consensus appears to be that, while jsonb originally meant “JSON binary,” it now really means “JSON better,” and it should be the default JSON data type you use. The difference between the two types is easy to see by casting the same literal to each, as shown below.
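
A quick, purely illustrative example: casting to json preserves the input text verbatim, while casting to jsonb normalizes it (keys reordered, the duplicate key collapsed):

SELECT '{"b": 1, "a": 2, "a": 3}'::json;
-- {"b": 1, "a": 2, "a": 3}

SELECT '{"b": 1, "a": 2, "a": 3}'::jsonb;
-- {"a": 3, "b": 1}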

How does Postgres JSON Extraction work?

You know that data sitting in a PostgreSQL database that you’ve always wanted to export to a JSON file?
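
For concreteness, the examples below assume a small, hypothetical people table along these lines (the table and values are only illustrative):

CREATE TABLE people (
  id              serial PRIMARY KEY,
  name            text,
  meaning_of_life integer
);

INSERT INTO people (name, meaning_of_life)
VALUES ('Alice', 42), ('Bob', 42);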

SELECT *
FROM people
WHERE meaning_of_life = 42;

--  id | name  | meaning_of_life
-- ----+-------+-----------------
--   1 | Alice |              42
--   2 | Bob   |              42
-- (2 rows)

Because PostgreSQL is awesome, it supports JSON and has tons of cool JSON-related functions. To begin, use the row_to_json function to convert those rows to JSON objects:

SELECT row_to_json(people)
FROM people
WHERE meaning_of_life = 42;

--                  row_to_json
-- ----------------------------------------------
--  {"id":1,"name":"Alice","meaning_of_life":42}
--  {"id":2,"name":"Bob","meaning_of_life":42}
-- (2 rows)

With the json_agg aggregation function, you can now aggregate these rows into a single JSON array:
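
-- A sketch of the aggregation step, continuing the same query;
-- the result is shown as a comment and wrapped for readability.
SELECT json_agg(row_to_json(people))
FROM people
WHERE meaning_of_life = 42;

--                    json_agg
-- -----------------------------------------------
--  [{"id":1,"name":"Alice","meaning_of_life":42},
--   {"id":2,"name":"Bob","meaning_of_life":42}]
-- (1 row)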

Finally, use the COPY command to convert this data to text and dump it to a file:

COPY (
  SELECT json_agg(row_to_json(people))::text
  FROM people
  WHERE meaning_of_life = 42
) TO '/path/to/some/file.json';

And your JSON file is complete:

# Note: the contents of the file are wrapped across two lines
# here for readability; the file contains a single JSON array.
$> cat /path/to/some/file.json
[{"id":1,"name":"Alice","meaning_of_life":42},
 {"id":2,"name":"Bob","meaning_of_life":42}]

Postgres JSON Extract Operators

Here are a few operators:

1. Postgres JSON Extract: Get/Extract

There are four get/extract operators (accessors and serializers) that work with both the json and jsonb data types: "->", "->>", "#>", and "#>>". Two of them ("->", "#>") return json (or jsonb), while the other two ("->>", "#>>") return text; "->" and "->>" accept either a text object key or an integer array index, which is why the documentation lists six variants. It is worth noting that these operators return NULL if the JSON does not have the structure the request expects.

In a SELECT statement, these operators are used as follows. The examples in this section assume a hypothetical tides table with a jsonb column named reading that holds sensor readings:

SELECT reading -> 'metadata' AS metadata FROM tides;
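
The path operators take a text-array path and extract several levels in a single step. A small sketch, again using the hypothetical tides table:

SELECT reading #> '{metadata,name}'  AS location_json FROM tides;
SELECT reading #>> '{metadata,name}' AS location_text FROM tides;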

2. Postgres JSON Extract: Chaining

Operators can be chained together. Because "->" returns json (or jsonb), you can simply apply another operator to its result, and so on. This example demonstrates chaining and returns all of the metadata name fields as text:

SELECT reading -> 'metadata' ->> 'name' AS location FROM tides;

A refinement that is likely to be more useful returns the set of distinct locations (duplicates are removed), as sketched below.
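
-- A minimal sketch: the same query with DISTINCT added
-- (still the hypothetical tides table).
SELECT DISTINCT reading -> 'metadata' ->> 'name' AS location FROM tides;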

3. Postgres JSON Extract: Comparison

jsonb supports all of the standard comparison operators (json does not). These can be used in the WHERE clause to filter results.

SELECT reading -> 'data' AS data FROM tides
WHERE reading -> 'metadata' ->> 'name' = 'Key West';
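
Because "->>" always returns text, comparing against a JSON number usually requires a cast. A sketch, assuming each reading has a numeric height field under data (a hypothetical field name):

SELECT reading -> 'metadata' ->> 'name' AS location
FROM tides
WHERE (reading -> 'data' ->> 'height')::numeric > 1.5;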

There are also 12 additional jsonb-only operators (counting the three variants of "-"): "@>", "<@", "?", "?|", "?&", "||", "-", "#-", "@?", and "@@". These include (a few are sketched just after this list):

  • Containment operators ("@>" and "<@")
  • Existence operators ("?", "?|", and "?&")
  • The concatenation operator ("||")
  • Deletion operators ("-" and "#-")
  • Path-checking operators ("@?" and "@@")
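
A couple of quick sketches of the containment and existence operators, again assuming the hypothetical tides table with a jsonb reading column:

-- Containment: readings whose metadata contains this name
SELECT reading -> 'data' AS data FROM tides
WHERE reading @> '{"metadata": {"name": "Key West"}}';

-- Existence: how many readings have a top-level 'metadata' key?
SELECT count(*) FROM tides WHERE reading ? 'metadata';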

Conclusion

PostgreSQL is an established database technology with strong native support for JSON column data types and operations. It has more than enough JSON functionality to get started on new JSON database use cases, and users of document databases are increasingly taking advantage of Postgres JSON extraction and migrating to PostgreSQL.

With Postgres JSON extraction, you can have relational and document data at the same time and in the same place. As this blog showed, JSON is a full-fledged data type in PostgreSQL, so you can use it for SQL function return types, columns, views, or anything else.

In this article, you learned about Postgres JSON extraction of data, as well as PostgreSQL and JSON and their respective features.

Hevo Data with its strong integration with 150+ data sources (including 60+ Free Sources) allows you to not only export data from your desired data sources & load it to the destination of your choice but also transform & enrich your data to make it analysis-ready. Hevo also allows integrating data from non-native sources using Hevo’s in-built Webhooks Connector. You can then focus on your key business needs and perform insightful analysis using BI tools. 

Want to give Hevo a try?

Try Hevo’s 14-day free trial and experience the feature-rich Hevo suite firsthand. You can also take a look at the pricing to select the plan that best fits your requirements.

Share your experience of understanding Postgres JSON Extraction of Data in the comment section below! We would love to hear your thoughts.

FAQs

1. How do I extract JSON data from PostgreSQL?

Use the -> and ->> operators to access JSON fields in PostgreSQL. -> returns JSON, while ->> returns text. Example: SELECT data->'field_name' FROM table_name;.

2. How to open a JSON file in PostgreSQL?

You can use the COPY command to load JSON data, but note that COPY has no JSON format option. If the file contains one JSON document per line, first ensure your table has a JSON or JSONB column, then run: COPY table_name(json_column) FROM '/path/to/file.json';. For multi-line or more complex files, load them with a client-side tool (or psql's \copy) and cast the text to json/jsonb.

3. How do I extract data from JSON?

To extract data, use PostgreSQL's JSON functions and operators, such as json_extract_path_text(json, path_elements…) for deep extraction or json_column->>'key' for direct access to specific fields.

Davor DSouza
Research Analyst, Hevo Data

Davor DSouza is a data analyst with a passion for using data to solve real-world problems. His experience with data integration and infrastructure, combined with his Master's in Machine Learning, equips him to bridge the gap between theory and practical application. He enjoys diving deep into data and emerging with clear and actionable insights.