Key Takeaways

There are 3 main ways to move your Facebook Ads data into Google BigQuery:

1. Hevo Data: This is the simplest method. You just set it up once, and it keeps sending your data on its own. It’s great for anyone who doesn’t know how to code.

2. Manual CSV Export: You download your data from Facebook and then upload it to BigQuery yourself. This works well if you only need the data sometimes and don’t mind doing it manually.

3. Facebook Marketing API: This method is for people who know how to code. It takes more work but gives you more control over your data.

Getting insights from Facebook Ads can be tricky. Export limits, scattered reports, and lots of manual work often get in the way. If you want cleaner, faster reporting, you need a better setup.

Google BigQuery is a great solution. It helps you store, manage, and analyze all your ad data in one place. But sending your Facebook Ads data to BigQuery isn’t always simple.

In this blog, we’ll walk you through three easy ways to connect the two. You can use Hevo Data, write custom scripts, or move the data manually.

Seamlessly Load Your Facebook Ads Data to BigQuery with Hevo!

Ditch the manual process of writing long commands to connect Facebook Ads to BigQuery, and choose Hevo’s no-code platform to streamline your data migration. 

With Hevo:

  1. Easily migrate different data types like CSV, JSON, etc. 
  2. Access 150+ connectors, including Facebook Ads and Google BigQuery (with 60+ free sources).
  3. Eliminate the need for manual schema mapping with the auto-mapping feature.

Experience Hevo and see why 2000+ data professionals, including customers such as Thoughtspot, Postman, and many more, have rated us 4.4/5 on G2.

Get Started with Hevo for Free

Method 1: Using Hevo to Move Data from Facebook Ads to BigQuery

Hevo is a data pipeline tool. It connects your source (Facebook Ads) to your destination (Google BigQuery) without any code.

Step 1: Sign Up and Log In to Hevo

  • Once you’re logged in, you’ll land on the Hevo Dashboard.

Step 2: Add Facebook Ads as a Source

  • On the Dashboard, click the + Add Source button.
  • Choose Facebook Ads from the list of 150+ sources.
  • Authorize Hevo to connect to your Facebook account.
  • Select the Facebook Ad account and the data you want to pull, such as campaigns, ad sets, or performance metrics.

Step 3: Set Google BigQuery as the Destination

  • Back on the Dashboard, click + Add Destination.
  • Select Google BigQuery from the list.
  • Sign in with your Google account and grant access.
  • Select the BigQuery Project, Dataset, and Table where you want the data to land.
  • Make sure permissions are granted to allow Hevo to write to BigQuery.

Step 4: Activate Pipeline

  • Name your pipeline (e.g., FB Ads to BigQuery).
  • Click Start Pipeline.

That’s it. Your data now flows automatically, and you can start running SQL queries in BigQuery or building dashboards in tools like Looker Studio.
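Once the pipeline is live, you can query the synced data straight away. Here is a minimal sketch using the BigQuery Python client; the project, dataset, table, and column names are placeholders, so adjust them to match whatever Hevo creates in your destination.

# A minimal sketch, assuming the google-cloud-bigquery library is installed
# and that the pipeline has landed data in a table (names below are hypothetical).
from google.cloud import bigquery

client = bigquery.Client()

query = """
    SELECT campaign_name,
           SUM(spend)  AS total_spend,
           SUM(clicks) AS total_clicks
    FROM `your_project.facebook_ads.insights`
    GROUP BY campaign_name
    ORDER BY total_spend DESC
"""

for row in client.query(query).result():
    print(row.campaign_name, row.total_spend, row.total_clicks)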

Method 2: Writing Custom Scripts to Move Data

If you’re comfortable writing code, you can build your own pipeline using Facebook APIs and Google Cloud services.

Step 1: Extract Data from Facebook Ads

There are two ways to get the data:

Option A: Use Facebook Marketing API

1. Create a Facebook Developer Account

Start by registering as a developer.

  • Go to Facebook for Developers website.
  • Log in with your Facebook credentials.
  • Click “Get Started” and follow the steps to set up your developer profile.

2. Create a Facebook App

This app will be the interface that connects your code to Facebook’s API.

  • On the Developer Dashboard, click “My Apps” → “Create App.”
  • Select the app type: Business is a good choice for advertising use cases.
  • Enter a name for the app and your contact email.
  • Click “Create App” and complete the security check.

3. Add the Marketing API Product to Your App

This enables your app to access ad-related data.

  • Inside your app dashboard, scroll down to “Add a Product” on the left menu.
  • Scroll down to Marketing API and click “Set Up”.

4. Generate an Access Token

This token lets you authenticate your API calls.

  • In the Tools section of your app dashboard, open the Access Token Tool.
  • Click “Get Token” → Select “Get User Access Token”.
  • Select the permissions you’ll need, such as:
    • ads_read
    • ads_management
    • business_management
  • Approve the permission request when prompted by Facebook.
  • You’ll now see your access token in the tool. Copy it. You’ll need this for every API call.

5. Use the Graph API to Fetch Ad Insights

Use this basic Graph API request to pull ad performance data:

curl -G \
  -d 'access_token=<ACCESS_TOKEN>' \
  -d 'fields=campaign_name,spend,impressions,clicks' \
  "https://graph.facebook.com/v18.0/act_<AD_ACCOUNT_ID>/insights"
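If you would rather make the same call from Python, here is a rough equivalent using the requests library. The access token and ad account ID are placeholders, and the optional date_preset parameter is included only as an example.

# A minimal sketch mirroring the curl call above; fill in your own token and account ID.
import requests

ACCESS_TOKEN = "<ACCESS_TOKEN>"
AD_ACCOUNT_ID = "<AD_ACCOUNT_ID>"

response = requests.get(
    f"https://graph.facebook.com/v18.0/act_{AD_ACCOUNT_ID}/insights",
    params={
        "access_token": ACCESS_TOKEN,
        "fields": "campaign_name,spend,impressions,clicks",
        "date_preset": "last_30d",  # optional: widen the reporting window
    },
)
response.raise_for_status()          # surface HTTP errors early
insights = response.json().get("data", [])
print(insights)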

Option B: Use Real-time Updates (Webhooks)

If you want real-time updates from Facebook Ads, such as when there’s a new campaign, ad, or change in performance metrics, you can use Facebook Webhooks.

This method pushes data to your system automatically, rather than you pulling it with API calls.

Here’s how to set it up:

1. Create a Facebook App (if not already done)

If you haven’t already created a Facebook App in the Developer portal, follow the same steps as in Option A.

2. Set Up a Webhook Endpoint on Your Server

  • Build an HTTPS endpoint that can receive POST requests from Facebook.
  • The endpoint must handle:
    • A GET request for verification (Facebook sends a challenge code).
    • A POST request that contains the actual updates when changes happen.

Example verification response:

# Pseudocode example: respond to Facebook's GET verification request
def webhook_verify(request):
    # Facebook sends hub.verify_token and hub.challenge as query parameters
    if request.GET.get('hub.verify_token') == 'your_custom_token':
        return request.GET['hub.challenge']  # echo the challenge back
    return 403  # reject requests with a mismatched token

3. Subscribe to the Webhooks

  • Go to your app dashboard on developers.facebook.com.
  • In the left sidebar, click Webhooks → Add Callback URL.
  • Enter your server’s HTTPS URL and a verify token (must match your code).
  • Choose the object type you want to track, for example: ad_account.
  • Subscribe to fields such as:
    • ads
    • campaigns
    • adsets
    • insights (for performance metrics)

4. Grant Permissions and Subscribe to a Page or Ad Account

  • You’ll need to use an access token with the required permissions (ads_management, pages_manage_ads, etc.) to subscribe your webhook to actual objects like a Page or Ad Account.

Here’s an example Graph API call to subscribe a webhook to an ad account:

curl -X POST \
  "https://graph.facebook.com/v18.0/<APP_ID>/subscriptions" \
  -F "object=ad_account" \
  -F "callback_url=https://yourserver.com/webhook" \
  -F "fields=ads,insights" \
  -F "verify_token=your_custom_token" \
  -F "access_token=YOUR_APP_ACCESS_TOKEN"

5. Process Incoming Data

Once set up, Facebook will send updates to your webhook endpoint anytime a tracked event occurs. This is useful for:

  • Monitoring performance in near real-time
  • Triggering alerts or workflows
  • Syncing data into a warehouse like BigQuery through an ETL pipeline

Note: Webhook payloads are usually minimal, just IDs and timestamps. You’ll still need to query the Graph API with those IDs to get full details.
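For example, a notification that only carries an ad ID can be enriched with a follow-up Graph API call. Here is a minimal sketch; the requested fields are illustrative, and the access token is a placeholder.

# Sketch: fetch full details for an ad referenced in a webhook payload.
import requests

ACCESS_TOKEN = "<ACCESS_TOKEN>"

def enrich_ad_update(ad_id):
    response = requests.get(
        f"https://graph.facebook.com/v18.0/{ad_id}",
        params={
            "access_token": ACCESS_TOKEN,
            "fields": "name,status,effective_status,campaign_id",  # illustrative fields
        },
    )
    response.raise_for_status()
    return response.json()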

Step 2: Load Data to BigQuery

Once you’ve extracted your Facebook Ads data, you need to get it into BigQuery for analysis. There are two main ways to do this.

Option A: Use Google Cloud Storage (GCS)

This is one of the most commonly used methods when working with batch files like CSV or JSON.

1. Save the Facebook Ads Data as a File

  • Once you’ve fetched your ad data using the API (or received it through Webhooks and stored it), format it as either:
    • .csv – good for simple tabular data
    • .json – useful for nested structures like ad sets or campaign details

Make sure the formatting aligns with BigQuery’s supported data types.
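As a quick illustration, here is a minimal sketch that writes a list of insight rows (assumed to come from the API call earlier) to both formats. One caveat: when loading JSON, BigQuery expects newline-delimited JSON (one object per line), not a single JSON array.

# Sketch: persist extracted rows as CSV or newline-delimited JSON for BigQuery.
import csv
import json

insights = [  # assumed shape of the rows returned by the insights call
    {"campaign_name": "Summer Sale", "impressions": 12000, "clicks": 950, "spend": 340.75},
]

# Newline-delimited JSON, suitable for nested or semi-structured data
with open("facebook_data.json", "w") as f:
    for row in insights:
        f.write(json.dumps(row) + "\n")

# Flat CSV, suitable for simple tabular data
with open("facebook_data.csv", "w", newline="") as f:
    writer = csv.DictWriter(f, fieldnames=["campaign_name", "impressions", "clicks", "spend"])
    writer.writeheader()
    writer.writerows(insights)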

2. Upload the File to a GCS Bucket

You have multiple ways to get your file into Google Cloud Storage:

  • Option 1: GCS Console
  • Option 2: Using gsutil Command
gsutil cp ./facebook_data.json gs://your-bucket-name/
  • Option 3: REST API

    You can automate uploads with a POST request:
POST /upload/storage/v1/b/my-bucket/o?uploadType=media&name=data.json
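If you are already scripting this step in Python, the google-cloud-storage client library is another option. A minimal sketch, with the bucket and file names as placeholders:

# Sketch: upload the exported file to GCS with the Python client library.
from google.cloud import storage

client = storage.Client()
bucket = client.bucket("your-bucket-name")
blob = bucket.blob("facebook_data.json")
blob.upload_from_filename("./facebook_data.json")
print(f"Uploaded to gs://{bucket.name}/{blob.name}")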

3. Create a Load Job in BigQuery

Once your data is in the bucket:

  • Go to BigQuery in the Cloud Console.
  • Click Create Table.
  • Set the source as Google Cloud Storage and select the file.
  • Choose the correct file format (CSV or JSON).
  • Define your table schema manually or let BigQuery auto-detect it.
  • Click Create Table to import the data.
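The same load job can be scripted with the BigQuery Python client. A minimal sketch, assuming the newline-delimited JSON file produced earlier; every name here is a placeholder.

# Sketch: load a file from GCS into a BigQuery table.
from google.cloud import bigquery

client = bigquery.Client()

job_config = bigquery.LoadJobConfig(
    source_format=bigquery.SourceFormat.NEWLINE_DELIMITED_JSON,
    autodetect=True,  # or define the schema explicitly
)

load_job = client.load_table_from_uri(
    "gs://your-bucket-name/facebook_data.json",
    "your_project.facebook_ads.insights",
    job_config=job_config,
)
load_job.result()  # block until the job completes
print(client.get_table("your_project.facebook_ads.insights").num_rows, "rows loaded")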

Option B: Direct API Upload

If you want to skip the Google Cloud Storage step and load data into BigQuery programmatically, you can use the BigQuery API to insert data directly. This is useful for real-time or automated pipelines.

  • Use the BigQuery API to send data directly using HTTP POST requests.

1. Format Your Facebook Ads Data as JSON

Before sending the data, make sure it’s in the proper JSON format that BigQuery accepts for insertion.

Example:

[  {    "campaign_name": "Summer Sale",    "impressions": 12000,    "clicks": 950,    "spend": 340.75,    "date": "2025-07-15"  },  {    "campaign_name": "Back to School",    "impressions": 18000,    "clicks": 1200,    "spend": 475.00,    "date": "2025-07-16"  }]

2. Set Up Authentication

You’ll need to authenticate using either:

  • A service account key file (for server-to-server apps)
  • Or an OAuth token (for user-based apps)

Make sure your credentials have the following permissions:

  • bigquery.tables.insertData
  • bigquery.jobs.create
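For the service account option, here is a minimal sketch of authenticating in Python. The key file path is a placeholder; in many environments, simply setting the GOOGLE_APPLICATION_CREDENTIALS environment variable is enough.

# Sketch: build a BigQuery client from a service account key file.
from google.cloud import bigquery
from google.oauth2 import service_account

credentials = service_account.Credentials.from_service_account_file(
    "/path/to/service-account-key.json",  # placeholder path
    scopes=["https://www.googleapis.com/auth/bigquery"],
)
client = bigquery.Client(credentials=credentials, project=credentials.project_id)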

3. Send Data Using BigQuery’s tabledata.insertAll Endpoint

Here’s a sample POST request using curl:

curl -X POST \
  -H "Authorization: Bearer YOUR_ACCESS_TOKEN" \
  -H "Content-Type: application/json" \
  -d '{
        "rows": [
          {
            "json": {
              "campaign_name": "Summer Sale",
              "impressions": 12000,
              "clicks": 950,
              "spend": 340.75,
              "date": "2025-07-15"
            }
          }
        ]
      }' \
  "https://bigquery.googleapis.com/bigquery/v2/projects/YOUR_PROJECT_ID/datasets/YOUR_DATASET_ID/tables/YOUR_TABLE_ID/insertAll"

Replace:

  • YOUR_ACCESS_TOKEN with a valid OAuth or service token
  • YOUR_PROJECT_ID, YOUR_DATASET_ID, and YOUR_TABLE_ID with actual values
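Alternatively, the BigQuery Python client wraps the same tabledata.insertAll endpoint. A minimal sketch; the table name and rows are placeholders, and the returned list of errors covers the monitoring step described next.

# Sketch: stream rows into BigQuery with insert_rows_json.
from google.cloud import bigquery

client = bigquery.Client()

rows = [
    {"campaign_name": "Summer Sale", "impressions": 12000,
     "clicks": 950, "spend": 340.75, "date": "2025-07-15"},
]

errors = client.insert_rows_json("your_project.facebook_ads.insights", rows)
if errors:
    print("Rows that failed to insert:", errors)
else:
    print("All rows inserted.")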

4. Monitor the Response for Errors

The API will return a response showing whether the insert was successful or if any rows failed.

Example success response:

{
  "kind": "bigquery#tableDataInsertAllResponse"
}

If there are errors, they’ll be listed with the problematic row index and error type.

This method is ideal if you want to continuously stream cleaned Facebook Ads data into BigQuery without storing files first.

Limitations of Using Custom Scripts

Building custom code to transfer data from Facebook Ads to Google BigQuery may seem practical at first. However, this approach comes with some limitations.

  • Code Maintenance: Since you built the code yourself, you also have to monitor and maintain it. If Facebook changes its API or returns a field with a datatype your code doesn’t expect, you need engineering resources on hand to handle these ad-hoc fixes.
  • Data Consistency: You will also need to set up data validation to make sure nothing is lost or corrupted along the way.
  • Real-time Data: The approach above moves data from Facebook Ads into BigQuery once. If you want to analyze data in real time, you will need to build and run additional code on top of it.
  • Data Transformation Capabilities: You will often need to transform the data from Facebook before analyzing it. For example, when running ads across different geographies, you will want to convert timezones and currencies in the raw data to a standard format. This requires extra effort.

Method 3: Manual Upload (No Coding, But Repetitive)

This is a DIY option if you don’t need automation or real-time updates.

Step-by-Step:

Step 1: Create a Google Cloud Project

Step 2: Open BigQuery

  • Go to the BigQuery section in the left-hand menu.
  • You’ll see your project listed.

Step 3: Download Facebook Ads Data

  • Open Facebook Ads Manager.
  • Create or view a report.
  • Click Export Table Data → Choose .CSV format.
  • Save the file to your computer.

Step 4: Upload Data to BigQuery

  1. Go back to BigQuery and select your project.
  2. Click + Create Dataset → Give it a name → Click Create.
  3. Click on your dataset → Click Create Table.
  4. In the Source section:
    • Select “Upload”
    • Choose your downloaded CSV file.
  5. In the Destination section:
    • Select your dataset.
    • Name your new table.
  6. In the Schema section:
    • Choose Auto-detect or enter manually.
  7. Optional: Set Partitioning and Field Delimiter (use comma).
  8. Click Create Table.

That’s it! Your Facebook data is now inside BigQuery.

Limitations of Manual Upload of Data

  • Data Extraction: Manually downloading large volumes of data from Facebook Ads is daunting and time-consuming.
  • Data Uploads: Manual uploads require continuous attention and hands-on involvement.
  • Human Error: Manual processes invite mistakes such as data-entry errors, missed uploads, and duplicate records.
  • Data Integrity: There is no automated mechanism to ensure the integrity and consistency of the data.
  • Delays: Manual uploads delay the availability of data and its integration for analysis.

Benefits of Sending Data From Facebook Ads to Google BigQuery

  • Identify patterns with SQL queries: You can use advanced SQL queries to gain deeper insights into your ad performance. This helps you analyze data from multiple angles, spot patterns and understand metric correlations.
  • Conduct multi-channel ad analysis: You can integrate your Facebook Ads data with metrics from other sources like Google Ads, Google Analytics 4, CRM, or email marketing apps. Doing this lets you analyze your overall marketing performance and understand how different channels work together.
  • Analyze ad performance in-depth: You can conduct a time series analysis to identify changes in ad performance over time and understand how factors like seasonality impact ad performance.
  • Leverage ML algorithms: You can also build and train ML models to forecast future performance, identify which factors drive ad success, and optimize your campaigns accordingly.
  • Data Visualization: You can build powerful interactive dashboards by connecting BigQuery to Power BI, Looker Studio (formerly Google Data Studio), or another data visualization tool. This lets you create custom dashboards that showcase your key metrics, highlight trends, and provide actionable insights to drive better marketing decisions.

Use Cases of Facebook Ads BigQuery Integration

  • Marketing Campaigns: Analyzing Facebook Ads audience data in BigQuery can help you enhance the performance of your marketing campaigns. Advertisement data from Facebook combined with business data in BigQuery can give better insights for decision-making. 
  • Personalized Audience Targeting: Once your Facebook Ads conversion data is in BigQuery, you can use its powerful querying capabilities to segment audiences based on the detailed demographics, interests, and behaviors captured in that data.
  • Competitive Analysis: By combining your Facebook attribution data in BigQuery with publicly available data sources, you can benchmark your ad performance against industry competitors.

How Can You Get Real-time Streams of Your Facebook Ad Statistics?

Facebook Ads is a real-time bidding system in which advertisers compete to show their creative, so ad statistics change constantly. You can build a real-time data infrastructure for extracting and loading Facebook Ads data into a data warehouse by subscribing to real-time updates via Webhooks. With the proper infrastructure in place, you can maintain a near real-time feed into your repository and ensure it always holds the latest data.

The Facebook Marketing API is very rich and lets you pull extremely granular data about your ad account activity for reporting and analytics. That richness comes at a cost, though: there are many complex resources to work with, each accessed through an equally intricate protocol.

You can also look at how to connect Facebook Ads to Snowflake to load your Facebook Ads data to different destinations.

    Conclusion

    This blog covers three methods you can use to move data from Facebook Ads to BigQuery seamlessly. It also outlines the limitations of the manual methods and the use cases of integrating Facebook Ads data with BigQuery.

    Data migration from Facebook Ads is a time-consuming and tedious operation, but with the help of a data integration solution like Hevo, it can be done with little work and in no time.

    Sign up for a 14-day free trial and simplify your data integration process. Check out the pricing details to understand which plan fulfills all your business needs.

      FAQ about Facebook Ads to Google BigQuery

      1. How do I get Facebook data into BigQuery?

      To get Facebook data into BigQuery, you can use one of the following methods:
      1. Use ETL Tools
      2. Google Cloud Data Transfer Service
      3. Run Custom Scripts
      4. Manual CSV Upload

      2. How do I integrate Google Ads to BigQuery?

      Google Ads has a built-in connector in BigQuery. To use it, go to your BigQuery console, find the data transfer service, and set up a new transfer from Google Ads.

      3. How to extract data from Facebook ads?

      To extract data from Facebook ads, you can use the Facebook Ads API or third-party ETL tools like Hevo Data.

      4. How to prepare Facebook Ads data for BigQuery?

      Before loading Facebook Ads data into BigQuery, make sure it’s in CSV or JSON format. BigQuery supports specific data types such as STRING, INTEGER, FLOAT, TIMESTAMP, and RECORD, so if your data is in XML or another format, convert it first. Proper formatting ensures your data integrates smoothly with BigQuery.

      Arannyk Monon
      Freelance Technical Content Writer, Hevo Data

      Arannyk specializes in writing for the data industry, offering valuable insights and informative content on the complexities of data integration and analysis. He keeps up with the latest trends and uses that knowledge to help businesses.