Many businesses use cloud-based applications like Salesforce, HubSpot, Mailchimp, and Zendesk for daily operations. To measure key metrics and drive growth, you need to combine data from these sources.

These applications, run by third-party vendors, provide APIs for data extraction into data warehouses like Google BigQuery. In this blog, we’ll walk you through the process of moving data from an API to BigQuery, discuss potential challenges, and share workarounds. Let’s dive in!

Note: When you connect an API to BigQuery, consider factors like data format, update frequency, and API rate limits to design a stable integration.

Overview of BigQuery

Google BigQuery is a Cloud Data Warehouse service and a part of the Google Cloud Platform. It helps companies store and analyze their business data in a secure Data Warehouse. Google also allows users to apply other Google Cloud Platform features, such as compute engines and APIs, to their data directly from the BigQuery Data Warehouse.

Google BigQuery can manage terabytes of data and enables companies to analyze the data stored in the Data Warehouse using SQL queries. Its Columnar Storage structure helps deliver faster query processing and better file compression.
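
For example, once data is loaded, you can analyze it with standard SQL through the BigQuery client library for Python. Below is a minimal sketch; my_dataset.currency_details is a placeholder table (the same one used in the custom-code method later in this post).

from google.cloud import bigquery

client = bigquery.Client()

# Run a standard SQL query against a table in your warehouse
query = """
    SELECT target_currency, rate
    FROM `my_dataset.currency_details`
    ORDER BY rate DESC
    LIMIT 5
"""
for row in client.query(query).result():
    print(row.target_currency, row.rate)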

Supercharge Your API to BigQuery Integration with Hevo!

Unleash the full potential of your API data with Hevo’s no-code platform. Skip the coding and dive straight into real-time BigQuery insights, as Hevo effortlessly handles data transfer, schema mapping, and error handling—all while you focus on what matters most: your analysis.

Check out what makes Hevo amazing:

  • Schema Management: Hevo can automatically detect the schema of the incoming data and map it to the destination schema.
  • Scalable Infrastructure: Hevo has in-built integrations for 150+ data sources (with 60+ free sources) that can help you scale your data infrastructure as required.
  • Live Support: The Hevo team is available round the clock to extend exceptional support to its customers through chat, email, and support calls.

Try Hevo to see why customers like Voiceflow and ScratchPay have upgraded to a powerful data and analytics stack!

Get Started with Hevo for Free!

API to BigQuery: Use Cases

  • Advanced Analytics: BigQuery’s powerful data processing capabilities enable you to run complex queries and analysis on your API data, extracting insights that would not be possible within the API alone.
  • Data Consolidation: If you’re using multiple sources along with your API, syncing them to BigQuery helps you centralize your data. This provides a holistic view of your operations, and you can set up a change data capture process to avoid discrepancies in your data.
  • Historical Data Analysis: Most APIs limit how much historical data you can retrieve. Syncing your data to BigQuery allows you to retain and analyze historical trends.
  • Scalability: BigQuery handles large volumes of data without affecting performance, making it an ideal solution for growing businesses with expanding API data.
  • Machine Learning: With API data in BigQuery, you can apply machine learning models for predictive analytics, customer segmentation, and more (see the sketch after this list).
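
For instance, BigQuery ML lets you train models with plain SQL on data you have already synced. The snippet below is only an illustrative sketch; the my_dataset.currency_details table and its columns are hypothetical, and a real model would use features appropriate to your own data.

from google.cloud import bigquery

client = bigquery.Client()

# Train a simple linear regression model directly in BigQuery (hypothetical table and columns)
query = """
    CREATE OR REPLACE MODEL `my_dataset.rate_model`
    OPTIONS (model_type = 'linear_reg', input_label_cols = ['rate']) AS
    SELECT target_currency, rate
    FROM `my_dataset.currency_details`
"""
client.query(query).result()  # Waits for the model training job to finish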

Method 1: Loading Data from Rest API to BigQuery using Hevo Data

Here are the steps to move data from Rest API to BigQuery using Hevo:

Step 1.1: Configure REST API as your source


Step 1.2: Configure BigQuery as your Destination


Yes, that is all. Hevo will do all the heavy lifting to ensure that your analysis-ready data is moved to BigQuery in a secure, efficient, and reliable manner.

To know in detail about configuring REST API as your source, refer to Hevo Documentation.

Method 2: API to BigQuery ETL Using Custom Code

The BigQuery Data Transfer Service provides a way to schedule and manage transfers from a REST API data source to BigQuery for supported applications.

One advantage of using the REST API with Google BigQuery is the ability to perform actions (like inserting data or creating tables) that might not be directly supported by the web-based BigQuery interface. The steps involved in migrating data from an API to BigQuery are as follows:

  1. Getting your data out of your application using the API
  2. Preparing the data that was extracted from the application
  3. Loading data into Google BigQuery

Step 2.1: Getting data out of your application using API

Below are the steps to extract data from the application using its API.

First, get the API URL from which you need to extract the data. In this article, you will learn how to use Python to extract data from Exchangerate-api.com, a free service for current and historical foreign exchange rates published by the European Central Bank. The same method should broadly work for any API that you want to use.

API URL = https://v6.exchangerate-api.com/v6/[Access-Key]/latest/USD 

Note: Replace [Access-Key] with your actual access key.

If you open the above URL, you will get a result in the following format:

{
  "result": "success",
  "documentation": "https://www.exchangerate-api.com/docs",
  "terms_of_use": "https://www.exchangerate-api.com/terms",
  "time_last_update_unix": 1722556802,
  "time_last_update_utc": "Fri, 02 Aug 2024 00:00:02 +0000",
  "time_next_update_unix": 1722643202,
  "time_next_update_utc": "Sat, 03 Aug 2024 00:00:02 +0000",
  "base_code": "USD",
  "conversion_rates": {
    "USD": 1,
    "AUD": 1.4817,
    "BGN": 1.7741,
    "CAD": 1.3168,
    "CHF": 0.9774,
    "CNY": 6.9454,
    "EGP": 15.7361,
    "EUR": 0.9013,
    "GBP": 0.7679,
    ...
  }
}

Reading and Parsing API response in Python:

a. To handle the API response, you will need two important libraries

import requests
import json 

b. Connect to the URL and get the response

url = 'https://v6.exchangerate-api.com/v6/YOUR-API-KEY/latest/USD' 
response = requests.get(url)

c. Convert the response text to a Python dictionary

data = json.loads(response.text)

d. Extract data and print

eur_rate = data["conversion_rates"]["EUR"]
usd_rate = data["conversion_rates"]["USD"]

Here is the complete code:

import requests
import json

url = "https://v6.exchangerate-api.com/v6/YOUR-API-KEY/latest/USD"

response = requests.get(url)
data = json.loads(response.text)

eur_rate = data["conversion_rates"]["EUR"]
usd_rate = data["conversion_rates"]["USD"]

print(str(usd_rate) + " USD equals " + str(eur_rate) + " EUR")
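
In practice, the API enforces rate limits and network calls can fail, so you may want to wrap the request with basic error handling and a retry. The sketch below is one possible approach; the retry count and wait time are arbitrary values, not requirements of the API.

import time
import requests

def fetch_rates(url, retries=3, wait_seconds=5):
    # Fetch the latest rates, retrying when the API rate limit is hit
    for attempt in range(retries):
        response = requests.get(url, timeout=30)
        if response.status_code == 429:
            time.sleep(wait_seconds)  # Back off before trying again
            continue
        response.raise_for_status()  # Fail loudly on other HTTP errors
        return response.json()
    raise RuntimeError("Could not fetch rates after {} attempts".format(retries))

url = "https://v6.exchangerate-api.com/v6/YOUR-API-KEY/latest/USD"
data = fetch_rates(url)
print(data["conversion_rates"]["EUR"])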

Step 2.2: Preparing data received from API

There are two ways to load data to BigQuery. 

  1. You can save the received JSON-formatted data to a JSON file and then load that file into BigQuery.
  2. You can parse the JSON object into a Python dictionary and load it into BigQuery (see the sketch after this list).
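
For the second option, one simple way to prepare the response is to flatten the conversion_rates object into one row per currency. This is a minimal sketch; the column names base_currency, target_currency, and rate are illustrative and should match your actual BigQuery table schema.

import requests

url = "https://v6.exchangerate-api.com/v6/YOUR-API-KEY/latest/USD"
parsed = requests.get(url).json()

# Flatten the nested conversion_rates object into one row (dictionary) per currency
rows = [
    {
        "base_currency": parsed["base_code"],
        "target_currency": currency,
        "rate": rate,
    }
    for currency, rate in parsed["conversion_rates"].items()
]

print(rows[:3])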

Step 2.3: Loading data into Google BigQuery

We can load data into BigQuery directly using an API call or create a CSV file and then load it into a BigQuery table.

  • Create a Python script to extract data from the API URL and load it (in UPSERT mode) into the BigQuery table. Here, UPSERT simply means Update and Insert: if the target table has a matching key, the existing row is updated; otherwise, a new record is inserted.
import requests
import json
from google.cloud import bigquery

url = "https://v6.exchangerate-api.com/v6/YOUR-API-KEY/latest/USD"
response = requests.get(url)
parsed = json.loads(response.text)

base = parsed["base_code"]
last_update = parsed["time_last_update_utc"]  # Metadata from the response

client = bigquery.Client()
dataset_id = 'my_dataset'
table_id = 'currency_details'
table_ref = client.dataset(dataset_id).table(table_id)
table = client.get_table(table_ref)

for currency, rate in parsed["conversion_rates"].items():
    # Check whether a row for this currency already exists
    select_query = (
        "SELECT target_currency FROM my_dataset.currency_details "
        "WHERE target_currency = @currency"
    )
    select_config = bigquery.QueryJobConfig(
        query_parameters=[bigquery.ScalarQueryParameter("currency", "STRING", currency)]
    )
    existing_rows = list(client.query(select_query, job_config=select_config).result())
    if existing_rows:
        # Matching key found: update the stored rate
        update_query = (
            "UPDATE my_dataset.currency_details SET rate = @rate "
            "WHERE target_currency = @currency"
        )
        update_config = bigquery.QueryJobConfig(
            query_parameters=[
                bigquery.ScalarQueryParameter("rate", "FLOAT64", rate),
                bigquery.ScalarQueryParameter("currency", "STRING", currency),
            ]
        )
        client.query(update_query, job_config=update_config).result()
    else:
        # No match: insert a new row (base currency, target currency, units, rate)
        rows_to_insert = [(base, currency, 1, rate)]
        errors = client.insert_rows(table, rows_to_insert)
        assert errors == []
  • Load a JSON file to BigQuery. You need to save the received data to a JSON file and load that JSON file into the BigQuery table.
import requests
import json
from google.cloud import bigquery

url = "https://v6.exchangerate-api.com/v6/YOUR-API-KEY/latest/USD"
response = requests.get(url)
parsed = json.loads(response.text)

# Write the nested conversion_rates object to a local JSON file
filename = r'F:\Python\data.json'
for key, value in parsed.items():
    if type(value) is dict:
        with open(filename, 'w') as f:
            json.dump(value, f)

# Load the JSON file into the BigQuery table
client = bigquery.Client(project="analytics-and-presentation")
dataset_id = 'my_dataset'
table_id = 'currency_rate_details'
dataset_ref = client.dataset(dataset_id)
table_ref = dataset_ref.table(table_id)
job_config = bigquery.LoadJobConfig()
job_config.source_format = bigquery.SourceFormat.NEWLINE_DELIMITED_JSON
job_config.autodetect = True

with open(filename, "rb") as source_file:
    job = client.load_table_from_file(source_file, table_ref, job_config=job_config)

job.result()  # Waits for the table load to complete
print("Loaded {} rows into {}:{}.".format(job.output_rows, dataset_id, table_id))

Limitations of writing custom scripts and developing ETL to load data from API to BigQuery

  1. The above code is written for the current source and target destination schemas. If the schema of the incoming data or the schema of the BigQuery table changes, the ETL process will break.
  2. In case you need to clean your data from the API (say, transform time zones or hide personally identifiable information), the current method does not support it. You will need to build another set of processes to accommodate that, which requires extra effort and money.
  3. You are at serious risk of data loss if your system breaks at any point. This could be anything from the source or destination being unreachable to a script failure. You would need to invest upfront in building systems and processes that capture all the failure points and consistently move your data to the destination.
  4. Since Python is an interpreted language, it might cause performance issues when extracting data from the API and loading it into BigQuery.
  5. Many APIs require credentials for access, and passing credentials as plain text in a Python script is very poor practice. You will need to take additional steps to ensure your pipeline is secure (see the sketch below).
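
One common way to address point 5 above is to read the API key from an environment variable instead of hard-coding it in the script. This is a minimal sketch; the variable name EXCHANGERATE_API_KEY is just an example.

import os
import requests

# Read the API key from the environment rather than embedding it in the code
api_key = os.environ["EXCHANGERATE_API_KEY"]
url = "https://v6.exchangerate-api.com/v6/{}/latest/USD".format(api_key)

data = requests.get(url).json()
print(data["conversion_rates"]["EUR"])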

Additional Resources on API to BigQuery

  • Load Data from REST API to BigQuery
  • Load Data from REST API to Redshift
  • Load Data from Webhooks to BigQuery

Conclusion

This blog walked you through the process of loading data from an API to BigQuery, covered two methods, and highlighted their shortcomings. You can move data from an API to BigQuery using either method; however, using Hevo can save you a lot of time!

Move data effortlessly with Hevo’s zero-maintenance data pipelines. Get a demo that’s customized to your unique data integration challenges.

You can also have a look at the unbeatable Hevo Pricing that will help you choose the right plan for your business needs!

FAQ on API to BigQuery

1. How to connect API to BigQuery?

1. Extract data out of your application using its API.
2. Transform and prepare the data to load it into BigQuery.
3. Load the data into BigQuery using a Python script.
4. Alternatively, you can use automated data pipeline tools to connect your API to BigQuery.

2. Is BigQuery an API?

BigQuery is a fully managed, serverless data warehouse that allows you to perform SQL queries. It provides an API for programmatic interaction with the BigQuery service.

3. What is the BigQuery data transfer API?

The BigQuery Data Transfer API offers a wide range of support, allowing you to schedule and manage the automated data transfer to BigQuery from many sources. Whether your data comes from YouTube, Google Analytics, Google Ads, or external cloud storage, the BigQuery Data Transfer API has you covered.

4. How to input data into BigQuery?

Data can be loaded into BigQuery via the following methods:
1. Using the Google Cloud Console to manually upload CSV, JSON, Avro, Parquet, or ORC files.
2. Using the BigQuery command-line tool (bq).
3. Using client libraries in languages like Python, Java, Node.js, etc., to programmatically load data.
4. Using data pipeline tools like Hevo

5. What is the fastest way to load data into BigQuery?

The fastest way to load data into BigQuery is to use automated Data Pipeline tools, which connect your source to the destination through simple steps. Hevo is one such tool.

Freelance Technical Content Writer, Hevo Data

Lahudas focuses on solving data practitioners' problems through content tailored to the data industry by using his problem-solving ability and passion for learning about data science.