Nowadays, streaming data is a crucial data source for any business that wants to perform real-time analytics. The first step in analyzing data in real time is to load the streaming data, often delivered through webhooks, into the warehouse as it arrives. A common use case for a BigQuery webhook is automatically sending a notification to a service like Slack or email whenever a dataset is updated. In this article, you will learn two methods to load real-time streaming data from Webhook to BigQuery.

Note: When architecting a Webhooks Google BigQuery integration, it's essential to address security concerns so that your data remains protected. It is also essential to define your webhook endpoint – the address or URL that will receive the incoming data – when connecting a webhook to BigQuery.
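For example, many webhook providers sign each request so the receiver can verify it came from the expected source. Below is a minimal sketch of verifying an HMAC-SHA256 signature in Python; the header name, the shared secret, and the signing scheme are assumptions that vary by provider, so check your source's documentation.

import hashlib
import hmac

def is_valid_signature(raw_body: bytes, signature_header: str, shared_secret: str) -> bool:
    """Return True if the webhook payload matches the provider's HMAC-SHA256 signature.

    Assumes the provider sends a hex-encoded HMAC-SHA256 of the raw request body
    in a header such as X-Signature; adjust to your provider's actual scheme.
    """
    expected = hmac.new(shared_secret.encode("utf-8"), raw_body, hashlib.sha256).hexdigest()
    # Constant-time comparison avoids timing attacks on the signature check.
    return hmac.compare_digest(expected, signature_header)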

Connect Webhook to BigQuery efficiently

Method 1: Webhook to BigQuery using Hevo Data

Easily automate and streamline the process of exporting data from webhooks to BigQuery using Hevo. Our platform seamlessly connects and transfers your data, enabling you to set up your integration in just two minutes.

Trusted by 2000+ Customers
Join industry leaders like Thoughtspot who rely on Hevo for seamless data integration and real-time data management.

Method 2: Webhook to BigQuery ETL Using Custom Code

Develop a custom application to receive and process webhook payloads. Write code to transform the data and use BigQuery’s API or client libraries to load it into the appropriate tables.


Method 1: Webhook to BigQuery using Hevo Data

Hevo Data lets you load real-time streaming data from Webhook to BigQuery in two simple steps:

Step 1: Configure your source

Connect Hevo Data with your source, in this case, Webhooks. You also need to specify some details, such as the Event Name Path and Fields Path.


Step 2: Select your Destination

Load data from Webhooks to BigQuery by selecting your destination. You can also choose the options for auto-mapping and JSON fields replication here.


Now you have successfully established the connection between Webhooks and BigQuery for streaming real-time data.

  • Click here to learn more on how to Set Up Webhook as a Source.
  • Click here to learn more on how to Set Up BigQuery as a Destination.

Method 2: Webhook to BigQuery ETL Using Custom Code

The steps involved in migrating data from Webhook to BigQuery are as follows:

  1. Getting data out of your application using Webhook
  2. Preparing Data received from Webhook
  3. Loading data into Google BigQuery

Step 1: Getting data out of your application using Webhook 

Set up a webhook for your application and define the endpoint URL to which the data will be delivered. This is the URL at which your target application will receive the incoming data.
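In practice, the endpoint is simply a small HTTP service that accepts POST requests at a fixed URL. The sketch below uses Flask purely as an illustration; the /webhook route and port are placeholders, and any framework that can accept a JSON POST will work.

from flask import Flask, jsonify, request

app = Flask(__name__)

@app.route("/webhook", methods=["POST"])
def receive_webhook():
    payload = request.get_json(force=True)  # webhook providers POST JSON to this URL
    # Hand the payload off to your processing/loading logic here.
    print(payload)
    return jsonify({"status": "received"}), 200

if __name__ == "__main__":
    app.run(host="0.0.0.0", port=8080)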

Step 2: Preparing Data received from Webhook 

Webhooks post data to your specified endpoints in JSON format. It is up to you to parse the JSON objects and determine how to load them into your BigQuery data warehouse.

You need to ensure that the target BigQuery table is aligned with the source data layout, specifically the column order and the data type of each column.
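How you flatten the JSON depends entirely on your payload, but a simple mapping function keeps the parsing logic in one place. The field names below (event, user.id, and so on) are made up for illustration; replace them with the keys your webhook actually sends, and make sure the output matches your table's columns and types.

def payload_to_row(payload: dict) -> dict:
    """Map one webhook JSON payload to a flat row matching the BigQuery table schema."""
    return {
        "event_name": payload.get("event"),                # STRING column
        "user_id": payload.get("user", {}).get("id"),      # INTEGER column
        "event_timestamp": payload.get("timestamp"),       # TIMESTAMP column
        "raw_payload": str(payload),                       # keep the original for debugging
    }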

Step 3: Loading data into Google BigQuery

We can load data into BigQuery directly using an API call, or we can create a CSV file and then load it into a BigQuery table.

Create a Python script to take the data received at your webhook endpoint and load it into the BigQuery table:


from google.cloud import bigquery

client = bigquery.Client()

dataset_id = 'dataset_name'  # replace with your dataset ID
table_id = 'table_name'      # replace with your table ID

table_ref = client.dataset(dataset_id).table(table_id)
table = client.get_table(table_ref)  # API request

# Receive data from the webhook and convert it into rows that match the table schema,
# for example: rows_to_insert = [payload_to_row(p) for p in received_payloads]
rows_to_insert = []

errors = client.insert_rows(table, rows_to_insert)  # API request (streaming insert)
assert errors == []

You can also store the streaming data in files at a specific interval and use the bq command-line tool to upload the files to your datasets, adding schema and data type information. You can find the syntax of the bq command-line tool in the Google Cloud documentation. Iterate through this process as many times as it takes to load all of your tables into BigQuery.
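A minimal sketch of this interval-based batching is shown below; the batch window, file naming, and the assumption that each row is a dict are illustrative choices you would tune to your own pipeline.

import csv
import time

BATCH_WINDOW_SECONDS = 300  # flush a new CSV file every 5 minutes (adjust as needed)

def flush_to_csv(rows, prefix="webhook_batch"):
    """Write buffered webhook rows (a non-empty list of dicts) to a timestamped CSV file."""
    filename = f"{prefix}_{int(time.time())}.csv"
    with open(filename, "w", newline="") as f:
        writer = csv.DictWriter(f, fieldnames=rows[0].keys())
        writer.writeheader()
        writer.writerows(rows)
    return filename

buffer = []
# In your webhook handler: buffer.append(payload_to_row(payload))
# On a timer (or when the buffer grows large), call flush_to_csv(buffer), clear the buffer,
# then load the file with `bq load` or upload it to GCS first.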

Once the data has been extracted from your application using the webhook, the next step is to upload it to Google Cloud Storage (GCS). There are multiple techniques to upload data to GCS.

Upload file to GCS bucket

Using gsutil: The gsutil utility lets you upload a local file to a GCS (Google Cloud Storage) bucket. To copy a file to GCS:

gsutil cp local_folder/file_name.csv gs://gcs_bucket_name/path/to/folder/
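If you would rather do the upload from the same Python process, the google-cloud-storage client library can copy the file as well; a minimal sketch (bucket name and paths are placeholders):

from google.cloud import storage

storage_client = storage.Client()
bucket = storage_client.bucket("gcs_bucket_name")         # replace with your bucket
blob = bucket.blob("path/to/folder/file_name.csv")        # destination object path
blob.upload_from_filename("local_folder/file_name.csv")   # local file to upload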

Using the Web console: An alternative way to upload the data from your local machine to GCS is to use the web console. To use the web console, follow the steps below.

  • First, log in to your GCP account; you must have a working Google account with access to GCP. In the menu, click on Storage and navigate to Browser in the left tab.
  • If needed, create a bucket to upload your data. Make sure the bucket name you choose is globally unique.
  • Click on the name of the bucket you created in the previous step. You will be prompted to browse for the file on your local machine.
  • Choose the file and click on the upload button. A progress bar will appear; wait for the upload to complete. You will then see the file loaded in the bucket.

Create Table in BigQuery

  • Go to BigQuery from the menu.
  • In the Google Cloud console, click on the create dataset option. Next, provide a dataset name and location.
  • Next, click on the name of the created dataset. In the Google Cloud console, click on the create table option and provide the dataset name, table name, project name, and table type.
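The same dataset and table can also be created programmatically with the BigQuery Python client. The sketch below is illustrative; the project, dataset, table, and schema fields are placeholders chosen to match the examples later in this article.

from google.cloud import bigquery

client = bigquery.Client()

dataset = client.create_dataset("dataset_name", exists_ok=True)  # creates the dataset if missing

schema = [
    bigquery.SchemaField("id", "INTEGER"),
    bigquery.SchemaField("value", "STRING"),
]
table = bigquery.Table("your_project.dataset_name.table_name", schema=schema)
table = client.create_table(table, exists_ok=True)  # API request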

Load the data into BigQuery Table

Once the table is created successfully, you will get a notification confirming that the table is ready, and it will appear under your new dataset.


Alternatively, the same can be done using the command line.

To start the command-line tool, click on the Cloud Shell icon in the Google Cloud console.

The syntax of the bq command to load a file into a BigQuery table is shown below.

Note: The --autodetect flag lets BigQuery infer the table schema automatically.

bq --location=[LOCATION] load --source_format=[FORMAT] [DATASET].[TABLE] [PATH_TO_SOURCE] [SCHEMA]

  • [LOCATION] is an optional parameter for the dataset location, for example "US".
  • [FORMAT] set this to CSV to load a CSV file.
  • [DATASET] the dataset name.
  • [TABLE] the name of the table to load the data into.
  • [PATH_TO_SOURCE] the path to the source file in the GCS bucket.
  • [SCHEMA] the schema, specified inline or as a JSON schema file.

For example:

bq --location=US load --source_format=CSV your_dataset.your_table gs://your_bucket/your_data.csv ./your_schema.json

Here the schema is specified as a JSON file on the bq command line.

Loading Schema Using the Web Console


BigQuery will display all the distinct columns that were found under the Schema tab. 

Alternatively, to do the same in the command line, use the below command: 

bq --location=US load --source_format=CSV your_dataset.your_table gs://your_bucket/your_data.csv ./your_schema.json

Your target table schema can also be autodetected: 

bq --location=US load --autodetect --source_format=CSV your_dataset.your_table  gs://mybucket/data.csv
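The same GCS-to-BigQuery load can be run from Python using load_table_from_uri; a minimal sketch with autodetect enabled, where the bucket, dataset, and table names are placeholders matching the commands above:

from google.cloud import bigquery

client = bigquery.Client()

job_config = bigquery.LoadJobConfig(
    source_format=bigquery.SourceFormat.CSV,
    skip_leading_rows=1,   # skip the CSV header row
    autodetect=True,       # let BigQuery infer the schema
)
load_job = client.load_table_from_uri(
    "gs://mybucket/data.csv",
    "your_dataset.your_table",
    job_config=job_config,
)
load_job.result()  # wait for the load job to finish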

The BigQuery command-line tool provides three options for writing to an existing table.

The Web Console also provides the Query Editor, which can be used to interact with existing tables using SQL commands.


Overwrite the table

bq --location=US load --autodetect --replace --source_format=CSV your_target_dataset_name.your_target_table_name gs://source_bucket_name/path/to/file/source_file_name.csv

Append data to the table

bq --location=US load --autodetect --noreplace --source_format=CSV your_target_dataset_name.your_target_table_name gs://source_bucket_name/path/to/file/source_file_name.csv ./schema_file.json

Adding new fields in the target table

bq --location=US load --noreplace --schema_update_option=ALLOW_FIELD_ADDITION --source_format=CSV your_target_dataset.your_target_table gs://bucket_name/source_data.csv ./target_schema.json
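The equivalent of these write options in the Python client is the write_disposition (and schema_update_options) setting on the load job configuration. The sketch below appends while allowing new fields; the URIs and table names are placeholders.

from google.cloud import bigquery

client = bigquery.Client()

job_config = bigquery.LoadJobConfig(
    source_format=bigquery.SourceFormat.CSV,
    autodetect=True,
    write_disposition=bigquery.WriteDisposition.WRITE_APPEND,  # use WRITE_TRUNCATE to overwrite
    schema_update_options=[bigquery.SchemaUpdateOption.ALLOW_FIELD_ADDITION],
)
client.load_table_from_uri(
    "gs://source_bucket_name/path/to/file/source_file_name.csv",
    "your_target_dataset.your_target_table",
    job_config=job_config,
).result()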

Update data into BigQuery Table

The load steps above do not, by themselves, complete the data updates on the target table. The data first lands in an intermediate table, because GCS acts as a staging area for BigQuery uploads. There are two ways of updating the target table, as described here.

Update the matching rows in the target table, then insert the new rows from the intermediate table:

UPDATE target_table t
SET t.value = s.value
FROM intermediate_table s
WHERE t.id = s.id;

INSERT INTO target_table (id, value)
SELECT id, value
FROM intermediate_table
WHERE id NOT IN (SELECT id FROM target_table);

Delete all the rows from the target table that are also present in the intermediate table, then insert all the rows newly loaded into the intermediate table. In this approach, the intermediate table is loaded in truncate-and-load mode.

DELETE FROM data_set_name.target_table
WHERE id IN (SELECT id FROM data_set_name.intermediate_table);

INSERT INTO data_set_name.target_table (id, value)
SELECT id, value
FROM data_set_name.intermediate_table;
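If you prefer to drive this step from the same Python script, these statements can be run through the BigQuery client, or collapsed into a single MERGE, which is BigQuery's built-in upsert. The dataset and column names below are placeholders matching the examples above.

from google.cloud import bigquery

client = bigquery.Client()

merge_sql = """
MERGE data_set_name.target_table t
USING data_set_name.intermediate_table s
ON t.id = s.id
WHEN MATCHED THEN
  UPDATE SET t.value = s.value
WHEN NOT MATCHED THEN
  INSERT (id, value) VALUES (s.id, s.value)
"""
client.query(merge_sql).result()  # runs the upsert and waits for it to finish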

Limitations of writing custom Scripts to stream data from Webhook to BigQuery

  • The above code is built for a specific schema defined by the Webhook source. The scripts may break if the source schema is modified. 
  • If, in the future, you identify data transformations that need to be applied to your incoming webhook events, you will have to invest additional time and resources in them. 
  • If the volume of incoming data spikes, you may have to throttle the data moving to BigQuery. 
  • Given that you are dealing with real-time streaming data, you will need to build strong alerting and notification systems to avoid data loss caused by an anomaly at the source or destination end. Since webhooks are triggered by specific events, such data loss can be very costly for your business. 

Webhook to BigQuery: Use Cases

  • Inventory Management in E-commerce: E-commerce platforms can benefit from real-time inventory updates by streaming data from inventory management webhooks into BigQuery. This enables businesses to monitor stock levels, optimize supply chains, and prevent stockouts or overstocking, ensuring a seamless customer experience.
  • Patient Monitoring in Healthcare: Healthcare providers can leverage real-time data streaming for patient monitoring. By connecting medical device webhooks to BigQuery, clinicians can track patient health in real time, receive alerts for abnormal readings, and provide timely interventions, ultimately leading to better patient outcomes.
  • Fraud Detection in Finance: Financial institutions can use webhooks to stream transaction data into BigQuery for fraud detection. Analyzing transaction patterns in real time helps to identify and prevent fraudulent activities, protect customer accounts, and ensure regulatory compliance.
  • Event-driven marketing: Businesses across various industries can stream event data, such as user sign-ups or product launches, into BigQuery. This allows for real-time analysis of marketing campaigns, enabling quick adjustments and targeted follow-ups to boost conversion rates.


Conclusion

In this blog, you learned two methods for streaming real-time data from Webhook to BigQuery: using an automated pipeline or writing custom ETL code. When it comes to moving data in real time, a no-code data pipeline tool such as Hevo Data can be the right choice for you.

Using Hevo Data, you can connect to a source of your choice and load your data to a destination of your choice cost-effectively. Hevo ensures your data is reliably and securely moved from any source to BigQuery in real time.

Want to take Hevo for a spin?

Sign Up for a 14-day free trial and experience the feature-rich Hevo suite firsthand.

Check out our Hevo Pricing to choose the best plan for you. Do let us know in the comments below if these methods were helpful and whether you would recommend any others.

FAQ Webhook to BigQuery

How to create a webhook in GCP?

To create a webhook in GCP, follow these steps:
1. Create a Google Cloud Function
2. Set Up the Webhook Endpoint
3. Handle the Webhook Request
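A minimal sketch of such a function using the Python runtime is shown below; the handler name and the acknowledgement response are placeholders, and in practice you would validate the payload and then forward or store it (for example, in BigQuery or Pub/Sub).

import functions_framework

@functions_framework.http
def webhook_handler(request):
    """HTTP Cloud Function that acts as a webhook endpoint."""
    payload = request.get_json(silent=True) or {}
    # Validate the payload here, then write it to BigQuery, Pub/Sub, or GCS.
    return ("received", 200)

Once deployed, the function's trigger URL becomes the endpoint you register with your webhook provider.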

Is the Webhook Technology Outdated?

Webhooks are not outdated; they are widely used in modern applications to provide real-time updates and notifications. They are a lightweight and efficient way to send real-time data between systems.

Can we call API from BigQuery?

Yes, you can call APIs from BigQuery, although BigQuery itself does not have built-in support for making HTTP requests. However, you can use external tools and workflows to achieve this. Here are a few approaches:
1. Using Google Cloud Function
2. Using Google Cloud Dataflow
3. Using Apps Script
4. External Applications

Freelance Technical Content Writer, Hevo Data

Lahudas focuses on solving data practitioners' problems through content tailored to the data industry by using his problem-solving ability and passion for learning about data science.