KEY TAKEAWAYS
  • Salesforce data can be loaded into BigQuery using managed tools, native connectors, or manual workflows.
  • Managed ETL platforms like Hevo simplify syncing with automated schema handling and minimal maintenance.
  • BigQuery’s native transfer service supports scheduled batch loads with limited flexibility.
  • Manual methods offer control but require ongoing engineering effort.
  • Salesforce to BigQuery integration enables scalable analytics, BI reporting, and AI use cases.

Salesforce delivers strong value as a CRM, but as data volumes grow, analytics limitations become clear. Dashboards slow down, reports struggle to scale, and combining CRM data with other business systems becomes difficult.

This is where teams turn to Google BigQuery for faster queries, larger datasets, and scalable analytics without heavy infrastructure management. Yet moving data from Salesforce to BigQuery is rarely simple. APIs, custom objects, schema changes, and incremental updates introduce complexity that’s easy to underestimate.

Without careful planning around schema design, data freshness, and maintenance, pipelines become fragile. Reporting trust drops, and analytics teams spend more time fixing data than using it.

When done right, the results are immediate. BigQuery’s serverless architecture and high-performance querying make it an ideal destination for Salesforce data and cross-system analytics.

This guide explores three reliable ways to connect Salesforce to BigQuery, helping teams choose the right approach based on data scale, freshness needs, and engineering effort.

What is Salesforce?

Salesforce is one of the world's most widely used customer relationship management (CRM) platforms, with features for managing key accounts and sales pipelines. While Salesforce provides analytics within the platform, many businesses want to extract this data and combine it with data from other sources, such as marketing and product systems, to gain deeper insight into their customers. This can be achieved by bringing the CRM data into a modern data warehouse like BigQuery. You can also connect your Salesforce account to Power BI to visualize your customer data directly.

Key features of Salesforce

  • Ease of use: Salesforce is intuitive, so teams spend more time putting it to use and less time learning how it works.
  • Customizable: Salesforce can be tailored to meet each business's specific requirements.
  • Account planning: Salesforce provides enough data about each lead for your sales team to tailor their approach to every prospect.
  • ETL capabilities: Salesforce connects with other systems through its APIs and a range of Salesforce ETL tools.
  • Accessibility: Salesforce is cloud-based and accessible from anywhere with an internet connection, and its mobile application adds further convenience.


What is Google BigQuery?

Google BigQuery is Google Cloud's fully managed, serverless data warehouse. It follows a pay-as-you-go pricing model, so you pay only for the queries you run and the data you store, which keeps costs in line with usage. Its column-based storage delivers outstanding query performance. Together, these qualities make BigQuery a sought-after data warehouse platform.

Key features of Google BigQuery:

  • Serverless architecture: BigQuery provisions and manages compute and storage behind the scenes, so users never have to manage infrastructure.
  • High scalability: It scales seamlessly to handle petabytes of data.
  • SQL compatibility: It supports ANSI SQL, so anyone who already knows SQL can write and run queries, and it connects easily to a wide range of BI tools for data visualization.
  • Machine learning: BigQuery ML enables users to train and run machine learning models in BigQuery using only SQL syntax.

You can read more about the key features of BigQuery to get a better understanding of the platform.

Why Connect Salesforce to Google BigQuery?

Moving Salesforce data to BigQuery enables organizations to combine CRM data with information from other sources. This creates a unified analytics environment that drives better business decisions.

| Industry | How Salesforce Data is Used in BigQuery | Business Impact |
| --- | --- | --- |
| Retail | Combines CRM data with real-time online activity and social media sentiment to analyze the full customer journey; supports AI models for behavior forecasting. | Personalized product recommendations and targeted engagement across email, mobile apps, and social channels. |
| Healthcare | Merges appointment history and patient feedback with demographics and medical records for deeper analysis. | Early identification of readmission risks and creation of personalized care plans for better patient outcomes. |
| Financial Institutions | Integrates CRM data such as transaction history, credit scores, and financial goals with market and economic data. | Accurate forecasting of spending and investment behavior, enabling personalized banking services and tailored offers. |

Learn more about integrating Salesforce Marketing Cloud to BigQuery for marketing analytics use cases.

Prerequisites to Connect Salesforce to BigQuery

Before integrating Salesforce with BigQuery, make sure the following are in place:

  • Salesforce access: An edition with API access enabled and permission to create a Connected App for OAuth authentication.
  • Google Cloud setup: An active GCP project with BigQuery enabled and a dataset created.
  • Service account: A Google Cloud service account with BigQuery Data Editor and BigQuery Job User roles, along with a JSON key (a quick verification sketch follows this list).
  • Data planning: Identify the Salesforce objects to sync, choose incremental fields, and account for data type and schema changes.
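
If you want to confirm these prerequisites before building anything, a quick check is to authenticate with the service account key and run a trivial query. This is a minimal sketch, assuming a key file named sf-bq-key.json and a dataset named salesforce_raw (both placeholders):

```python
from google.cloud import bigquery
from google.oauth2 import service_account

# Placeholder key file and dataset; substitute your own values.
creds = service_account.Credentials.from_service_account_file("sf-bq-key.json")
client = bigquery.Client(credentials=creds, project=creds.project_id)

# Confirms the service account can run jobs (BigQuery Job User)...
print(list(client.query("SELECT 1 AS ok").result()))

# ...and can see the target dataset (BigQuery Data Editor).
print(client.get_dataset("salesforce_raw").full_dataset_id)
```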

Methods to Connect Salesforce to BigQuery

There are three primary approaches to moving Salesforce data to BigQuery, each suited to different requirements, technical capabilities, and business needs.

| Aspect | Hevo Data | BigQuery Native Transfer | Manual ETL Process |
| --- | --- | --- | --- |
| Best for | Ongoing, production-grade analytics with minimal maintenance | Simple, scheduled batch loads for standard Salesforce objects | One-time migrations or highly customized workflows |
| Setup effort | Minimal, no-code setup | Low to moderate, GCP UI-based | High, engineering-driven |
| Maintenance | Fully managed and monitored | Partial (job monitoring required) | High ongoing maintenance |
| Data sync | Automated incremental syncs, near real-time | Batch-based (daily or weekly) | Custom-built, usually batch-based |
| Schema changes | Automatically handled | Limited handling, manual review needed | Manual fixes required |
| Scalability | Auto-scales with data volume | Scales within BigQuery limits | Limited by scripts and infrastructure |
| Error handling | Automated retries, alerts, and fault tolerance | Basic logs and retry support | Custom error handling required |
| Pros | No-code, reliable, low operational risk | Native Google integration | Full control over logic |
| Cons | Tool cost, less low-level customization | Limited flexibility and object support | Time-consuming, hard to scale |

Method 1: Simplest Way to Move Salesforce Data to BigQuery Using Hevo

Hevo is a no-code data pipeline platform that provides a fully managed way to move Salesforce data to BigQuery with automated schema handling, incremental syncs, and built-in reliability.

Step 1: Sign in to Hevo and create a new pipeline

Sign in to your Hevo account and navigate to the dashboard. Click Create Pipeline and select Salesforce as the source.

You will be guided through the initial pipeline setup:

  • Choose Salesforce from the list of supported sources.
  • Assign a pipeline name for easy identification.
  • Select your Salesforce environment (Production or Sandbox).

Hevo will then redirect you to authenticate Salesforce using OAuth.
Once authenticated, the pipeline is ready for source configuration in the next step.

Learn more about configuring Salesforce from Hevo’s documentation.

Step 2: Connect Salesforce as the source

Sign in to your Salesforce account when prompted and grant Hevo the required API permissions. Once authenticated, select the Salesforce objects you want to replicate:

  • Standard objects (Accounts, Contacts, Leads, Opportunities).
  • Custom objects specific to your business.

Choose the replication mode:

  • Incremental sync (recommended for ongoing analytics).
  • Full refresh (for one-time backfills).

Hevo automatically detects primary keys and incremental fields where available, simplifying the setup process.

Refer to the Salesforce source guide for detailed configuration options.

Step 3: Configure Google BigQuery as the destination

Click Add Destination and select Google BigQuery. Connect using a GCP Service Account and provide:

  • Project ID
  • Dataset name (existing or auto-created)
  • Dataset region

Test the connection and continue to the next step.

Refer to the BigQuery destination setup documentation for complete details.

Step 4: Configure data loading and schema behavior

Choose whether Hevo should automatically create tables or load into existing BigQuery tables. Enable:

  • Auto schema mapping (recommended)
  • Schema evolution handling for new or changed Salesforce fields

This ensures your pipelines don’t break when Salesforce schemas change, a common challenge with manual approaches.

Step 5: Activate the pipeline

Review your pipeline configuration to ensure all settings are correct. Click Activate Pipeline to start syncing data.

Move Salesforce Data to BigQuery with Hevo

Once set up, Hevo automatically starts replicating your Salesforce data into BigQuery at your chosen sync frequency. You can track pipeline health, data freshness, and performance in real-time from the Hevo dashboard without manual intervention.


Method 2: Using BigQuery’s Native Salesforce Data Transfer Service

Google BigQuery provides a native Salesforce connector through the BigQuery Data Transfer Service. It allows you to load Salesforce data into BigQuery without building custom pipelines.

Prerequisites

Before setting up the transfer, ensure you have:

  • An active Google Cloud project with BigQuery enabled
  • A Salesforce account with API access enabled
  • A Google Cloud service account with BigQuery Data Editor permissions
  • OAuth credentials or connected app access in Salesforce

Step 1: Open BigQuery Data Transfer Service

  1. Go to the Google Cloud Console
  2. Navigate to BigQuery → Data Transfers
  3. Click Create Transfer

Step 2: Select Salesforce as the source

  1. Choose Salesforce from the list of available data sources
  2. Provide a transfer name and select your target BigQuery dataset

Step 3: Authenticate Salesforce access

  1. Sign in using Salesforce OAuth
  2. Grant BigQuery permission to access Salesforce objects
  3. Select the Salesforce environment (Production or Sandbox)

Step 4: Configure transfer settings

  1. Choose the Salesforce objects to sync (e.g., Accounts, Contacts, Opportunities)
  2. Set the transfer schedule (daily or weekly)
  3. Define whether the transfer should overwrite or append data

Step 5: Run and monitor the transfer

  1. Save and start the transfer
  2. Monitor job status and failures directly from the BigQuery UI
  3. Review logs for schema changes or API-related issues
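
The console flow above can also be scripted with the BigQuery Data Transfer Service client library. The sketch below is illustrative only: the project, dataset, and schedule are placeholders, and the data_source_id value and connector-specific parameters are assumptions to verify against the transfer service documentation before use.

```python
from google.cloud import bigquery_datatransfer

client = bigquery_datatransfer.DataTransferServiceClient()
parent = client.common_project_path("my-gcp-project")  # placeholder project ID

transfer_config = bigquery_datatransfer.TransferConfig(
    destination_dataset_id="salesforce_raw",  # placeholder dataset
    display_name="Salesforce daily transfer",
    data_source_id="salesforce",   # assumption: confirm the connector ID in your project
    schedule="every 24 hours",
    params={},  # Salesforce connector parameters (connected app credentials, objects)
                # go here; key names are omitted because they are connector-specific.
)

created = client.create_transfer_config(parent=parent, transfer_config=transfer_config)
print(f"Created transfer config: {created.name}")
```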

Method 3: Manual Export and Custom Code Workflows

Manual export and custom code–based workflows give you full control over how Salesforce data is extracted, transformed, and loaded into BigQuery. This approach is typically used by teams with strong engineering bandwidth or specific customization requirements that managed tools cannot easily support.

Step 1: Export data from Salesforce

  1. Use Salesforce Data Export or Data Loader to export required objects as CSV
  2. Or use Salesforce REST/Bulk API to extract data programmatically
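
As a concrete example of the programmatic route, the sketch below pulls Accounts with the widely used simple-salesforce client and writes them to CSV. Credentials, object, and field names are placeholders for illustration.

```python
import csv
from simple_salesforce import Salesforce  # pip install simple-salesforce

# Placeholder credentials; OAuth via a connected app is also supported.
sf = Salesforce(username="user@example.com", password="...", security_token="...")

fields = ["Id", "Name", "Industry", "LastModifiedDate"]
records = sf.query_all(f"SELECT {', '.join(fields)} FROM Account")["records"]

with open("accounts.csv", "w", newline="") as f:
    writer = csv.DictWriter(f, fieldnames=fields)
    writer.writeheader()
    for rec in records:
        # Keep only the selected fields, dropping the metadata simple-salesforce adds.
        writer.writerow({k: rec.get(k) for k in fields})
```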

Step 2: Stage the data

  1. Upload exported CSV files to Google Cloud Storage (GCS)
  2. Organize files by object and date to simplify downstream loads
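
Staging can be scripted with the Cloud Storage client. The bucket name and the object/date prefix below are placeholders following the layout suggested above.

```python
from google.cloud import storage

client = storage.Client()  # uses Application Default Credentials
bucket = client.bucket("my-salesforce-staging")        # placeholder bucket
blob = bucket.blob("account/2024-06-01/accounts.csv")  # object/date layout
blob.upload_from_filename("accounts.csv")
print(f"Uploaded to gs://{bucket.name}/{blob.name}")
```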

Step 3: Prepare the schema

  1. Review Salesforce field types and map them to BigQuery data types
  2. Handle DateTime, Boolean, and picklist fields explicitly
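
For the Account fields used in the earlier sketches, an explicit mapping might look like the following. Treating picklists as STRING and Salesforce DateTime as TIMESTAMP is a common convention, not the only valid one.

```python
from google.cloud import bigquery

# Assumed mapping for the example Account export; adjust per object.
ACCOUNT_SCHEMA = [
    bigquery.SchemaField("Id", "STRING", mode="REQUIRED"),
    bigquery.SchemaField("Name", "STRING"),
    bigquery.SchemaField("Industry", "STRING"),             # picklist -> STRING
    bigquery.SchemaField("LastModifiedDate", "TIMESTAMP"),  # Salesforce DateTime -> TIMESTAMP
]
```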

Step 4: Load data into BigQuery

  1. Use BigQuery UI or bq load to ingest data from GCS into tables
  2. Enable schema auto-detection or provide a predefined schema
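
The same load can be scripted with the BigQuery client, reusing the staged file from the previous steps. Project, bucket, dataset, and table names are placeholders.

```python
from google.cloud import bigquery

client = bigquery.Client()
job_config = bigquery.LoadJobConfig(
    source_format=bigquery.SourceFormat.CSV,
    skip_leading_rows=1,
    autodetect=True,  # or pass schema=ACCOUNT_SCHEMA from the Step 3 sketch
    write_disposition=bigquery.WriteDisposition.WRITE_APPEND,
)

load_job = client.load_table_from_uri(
    "gs://my-salesforce-staging/account/2024-06-01/accounts.csv",
    "my-gcp-project.salesforce_raw.account",  # placeholder table reference
    job_config=job_config,
)
load_job.result()  # waits for the load to finish and raises on failure
```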

Step 5: Handle incremental updates (optional)

  1. Track LastModifiedDate or similar watermark fields
  2. Export and load only new or updated records in subsequent runs
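
In practice this usually means storing the highest LastModifiedDate you have loaded and filtering the next extraction against it. A minimal sketch, with placeholder credentials and a hard-coded watermark for illustration:

```python
from simple_salesforce import Salesforce

sf = Salesforce(username="user@example.com", password="...", security_token="...")

# Watermark from the previous run (SOQL datetime literals are unquoted, UTC).
last_watermark = "2024-06-01T00:00:00Z"

soql = (
    "SELECT Id, Name, Industry, LastModifiedDate "
    "FROM Account "
    f"WHERE LastModifiedDate > {last_watermark} "
    "ORDER BY LastModifiedDate"
)
changed = sf.query_all(soql)["records"]

# Persist the new watermark for the next run.
new_watermark = changed[-1]["LastModifiedDate"] if changed else last_watermark
```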

Step 6: Monitor and maintain

  1. Update scripts when Salesforce schemas or APIs change
  2. Add logging, retries, and validation checks
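
A simple validation check is to compare row counts between the source object and the destination table after each run; the names below are the same placeholders used in the earlier sketches.

```python
from google.cloud import bigquery
from simple_salesforce import Salesforce

sf = Salesforce(username="user@example.com", password="...", security_token="...")
bq = bigquery.Client()

sf_count = sf.query("SELECT COUNT() FROM Account")["totalSize"]
bq_count = bq.get_table("my-gcp-project.salesforce_raw.account").num_rows

if sf_count != bq_count:
    print(f"Row count mismatch: Salesforce={sf_count}, BigQuery={bq_count}")
```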

Move Salesforce Data to BigQuery with Hevo

Hevo provides a no-code way to keep Salesforce data reliably synced with BigQuery. Pipelines are fully managed, schema changes are handled automatically, and near real-time updates are supported, with no manual scripting or ongoing maintenance.

Why teams choose Hevo

  • Fast setup: Connect Salesforce and BigQuery in minutes using prebuilt connectors.
  • Reliable pipelines: Automated retries, monitoring, and schema handling keep data consistent.
  • No-code workflows: Build and manage pipelines without custom scripts.

Trusted by data teams at companies like ThoughtSpot and Postman, Hevo is rated 4.4/5 on G2 for ease of use and reliability. Sign up for a 14-day free trial and explore Hevo’s capabilities for yourself.


What are the Use Cases for Migrating Salesforce Data to BigQuery?

Organizations use Salesforce's Data Cloud along with Google's BigQuery and Vertex AI to enhance customer experiences and tailor their interactions. Salesforce BigQuery integration enables organizations to combine and analyze data from their Salesforce CRM with the powerful data processing capabilities of BigQuery. Let's look at some real-world use cases:

  • Retail: Retail businesses can integrate CRM data with non-CRM data, such as real-time online activity and social media sentiment, in BigQuery to understand the complete customer journey and then apply custom AI models to forecast customer behavior. The outcome is highly personalized product recommendations delivered through the best-fit channels, such as email, mobile apps, or social media.
  • Healthcare organizations: CRM data, including appointment history and patient feedback, can be integrated with non-CRM data, such as patient demographics and medical history, in BigQuery. The outcome is the ability to predict which patients are at risk of readmission, allowing care teams to create personalized care plans. This proactive approach improves medical outcomes through preemptive care.
  • Financial institutions: Financial institutions can integrate CRM data, such as a customer's transaction history, credit score, and financial goals, with non-CRM data such as market analysis and economic trends. Using BigQuery, they can forecast customers' spending patterns, investment preferences, and financial goals, informing personalized banking services and offers tailored to individual customer needs.

Read More About: Salesforce Marketing Cloud to BigQuery

Conclusion

Connecting Salesforce to BigQuery can be approached in several ways, from manual exports and custom scripts to fully managed integrations. While custom coding offers maximum control, it also brings significant overhead in development effort, monitoring, and ongoing maintenance.

For teams that want a simpler and more dependable path, a managed platform like Hevo removes this operational burden. With built-in reliability, automated schema handling, and minimal setup, Hevo enables secure and consistent Salesforce to BigQuery data movement without the complexity of maintaining custom pipelines.

Sign up for a 14-day free trial and experience the feature-rich Hevo suite firsthand. You can also take a look at Hevo's pricing to choose the right plan for your business needs.

Frequently Asked Questions

1. How to transfer data from Salesforce to Oracle?

Export Data: Use Salesforce Data Export or Data Loader to export data as CSV files.
Transform Data: Prepare data for Oracle if necessary.
Import Data: Use Oracle SQL Developer or an ETL tool like Hevo Data to load the CSV files into Oracle.

2. How do I transfer data from Salesforce to SQL Server?

Export Data: Export data from Salesforce using Data Export or Data Loader.
Prepare Data: Format data to match SQL Server schema.
Import Data: Use SQL Server Integration Services (SSIS) or Hevo Data to import the data into SQL Server.

3. How do I transfer data to BigQuery?

You can use automated platforms like Hevo to migrate your data to BigQuery.

Shiny is a Senior Content Specialist with expertise in B2B SaaS product marketing. A tech marketer with a passion for product-led storytelling, Shiny focuses on creating customer-centric narratives, clear product positioning, and strategic content that drives business growth.