Pardot to BigQuery Integration: 2 Easy Ways to Connect

on CRMs, Data Integration, Data Migration, Data Warehouse, Google BigQuery, Pardot, Salesforce • July 27th, 2022


Pardot is a top-rated marketing automation tool that helps businesses automate tasks such as email marketing, targeted marketing campaigns, lead management, and more. It tracks customer behavior and thereby helps businesses improve their digital marketing campaigns.

Companies can store all of this Pardot marketing campaign data in a centralized repository like Google BigQuery. The data can then be analyzed with powerful BI tools such as Google Data Studio, Power BI, Tableau, and more to gain meaningful insights. You can connect Pardot data with Google BigQuery using third-party ETL (Extract, Transform, Load) tools or standard APIs.

This article will guide you through connecting Pardot to BigQuery using two different methods.


What is Pardot?


Launched in 2007, Pardot is a Software-as-a-Service (SaaS) marketing automation platform, now part of Salesforce. It helps marketing and sales teams create, manage, and run online marketing campaigns that improve sales. Pardot allows marketers to identify potential customers by engaging prospects at the right time and in the right manner.

Pardot assists businesses with tasks such as lead management, email automation, ROI tracking, targeted email campaigns, and more. Salesforce’s CRM (Customer Relationship Management) software synchronizes with Pardot to improve business performance: changes made in Pardot are typically reflected in Salesforce within 10 minutes. You need to configure the Pardot Lightning app to give users access to Pardot. With the Lightning app, businesses can bring their sales and marketing teams together on a single platform.

Key Features of Pardot

  • Dynamic Content: The dynamic content feature in Pardot allows businesses to tailor content for each market segment. Businesses can design various iterations of forms, landing pages, emails, and more depending on user engagement.
  • Salesforce User Sync: With the Salesforce user sync administration feature, businesses can manage their Pardot and Salesforce accounts simultaneously.
  • B2B Marketing Analytics: Businesses use ROI metrics to calculate the effectiveness of marketing campaigns. The B2B Marketing analytics feature in Pardot combines sales and marketing data to provide information about the performance of businesses.

What is Google BigQuery?


Developed in 2010, Google BigQuery is a popular and highly scalable data warehouse that does not require any administration. It can store and analyze petabytes of data quickly, and it uses standard SQL to query and obtain answers from colossal amounts of data. BigQuery’s columnar storage gives businesses high performance and strong data compression.

BigQuery enables developers and data scientists to work with several programming languages, such as C#, C++, Java, Python, Go, JavaScript, and more. They can also leverage the BigQuery API to transform and manage data effectively.

Key Features of Google BigQuery

  • BigQuery BI Engine: BigQuery includes a BI Engine that helps enterprises process large volumes of data with sub-second response times and high concurrency.
  • Machine Learning: Google BigQuery lets businesses create machine learning models using standard SQL queries. Using SQL alone, enterprises can build models such as Linear Regression, Binary and Multiclass Logistic Regression, K-means Clustering, and more.
  • User-Friendly: Storing and analyzing data in BigQuery is a straightforward process. BigQuery has an intuitive user interface that lets organizations set up a cloud data warehouse without installing clusters or manually choosing storage sizes and encryption settings.

Why Integrate Pardot to BigQuery?

If you use Pardot to centrally store all of your customer data and interactions, you’ll likely want to analyze this information along with user behavior and product demand data to better understand your clientele and revenue sources. Get analytics-ready data without any manual work by integrating your Pardot data with BigQuery. Once your Pardot data has been loaded into BigQuery, you can combine it with your customer, marketing, and service data to derive rich, practical insights for better business opportunities.

Reliably integrate data with Hevo’s Fully Automated No Code Data Pipeline

If yours is anything like the 1000+ data-driven companies that use Hevo, more than 70% of the business apps you use are SaaS applications. Integrating the data from these sources in a timely way is crucial to fuel analytics and the decisions based on it. But given how fast API endpoints and schemas can change, creating and managing these pipelines can be a soul-sucking exercise.

Hevo’s no-code data pipeline platform lets you connect 150+ sources in a matter of minutes and deliver data in near real-time to your warehouse. What’s more, the built-in transformation capabilities and intuitive UI mean even non-engineers can set up pipelines and get analytics-ready data in minutes.

All of this combined with transparent pricing and 24×7 support makes us the most loved data pipeline software in terms of user reviews.

Take our 14-day free trial to experience a better way to manage data pipelines.

Get started for Free with Hevo!

Connecting Pardot to BigQuery

Method 1: Using Hevo to Set Up Pardot to BigQuery


Hevo provides an Automated No-code Data Pipeline that helps you move your Pardot data swiftly to BigQuery. Hevo is fully managed and completely automates the process of not only loading data from your 150+ sources (including 40+ free sources) but also enriching and transforming it into an analysis-ready form, without your having to write a single line of code. Its fault-tolerant architecture ensures that data is handled in a secure, consistent manner with zero data loss.

Using Hevo, Pardot to BigQuery migration can be done in the following two steps:

  • Step 1: Configure Pardot as the Source in your Pipeline by following the steps below:
    • Step 1.1: In the Asset Palette, select PIPELINES.
    • Step 1.2: In the Pipelines List View, click + CREATE.
    • Step 1.3: Select Pardot on the Select Source Type page.
    • Step 1.4: Click + ADD SALESFORCE ACCOUNT LINKED TO PARDOT on the Configure your Salesforce Account linked to Pardot page.
  • Step 1.5: Click CONTINUE after selecting the environment linked to your Salesforce login.
  • Step 1.6: Enter your Salesforce login information.
  • Step 1.7: To grant Hevo access to your Pardot environment, click Allow. The Configure your Pardot Source page then opens for you.
  • Step 1.8: Enter the following information on the Configure your Pardot Source page:
    • Pipeline Name: A name for the Pipeline that is unique and does not exceed 255 characters. 
    • Pardot Domain Name: Choose the domain name that hosts your Pardot data.
    • Pardot Business Unit ID: A distinguishing number for the Pardot business unit whose information you want to replicate. It starts with the value 0Uv and has 18 characters.
    • Historical Sync Duration: The period for which historical data is to be ingested. The default is All Available Data.
  • Step 1.9: Click TEST & CONTINUE.
  • Step 1.10: Set up the Destination and configure the data ingestion.
  • Step 2: To set up Google BigQuery as a destination in Hevo, follow these steps:
    • Step 2.1: In the Asset Palette, select DESTINATIONS.
    • Step 2.2: In the Destinations List View, click + CREATE.
    • Step 2.3: Select Google BigQuery from the Add Destination page.
    • Step 2.4: Choose the BigQuery connection authentication method on the Configure your Google BigQuery Account page.
  • Step 2.5: Choose one of these:
    • To connect using a Service Account:
      • Attach the Service Account Key file.
      • Note that Hevo only accepts key files in JSON format.
      • Click CONFIGURE GOOGLE BIGQUERY ACCOUNT.
    • To connect using a User Account:
      • Click + to add a Google BigQuery account.
      • Log in as a user with BigQuery Admin and Storage Admin permissions.
      • Click Allow to grant Hevo access to your data.
  • Step 2.6: Set the following parameters on the Configure your Google BigQuery page:
    • Destination Name: A unique name for your Destination.
    • Project ID: The ID of the BigQuery project where the data is to be loaded, for which you granted permissions in the earlier steps.
    • Dataset ID: The name of the dataset to which you want to sync your data.
    • GCS Bucket: The Cloud Storage bucket in which files must be staged before being uploaded to BigQuery.
    • Enable Streaming Inserts: Enable this option to stream data to your BigQuery Destination as it arrives from the Source, instead of loading it via jobs on the defined Pipeline schedule. To learn more, see Near Real-time Data Loading Using Streaming. This setting cannot be changed later.
    • Sanitize Table/Column Names: Enable this option to replace spaces and non-alphanumeric characters in table and column names with underscores (_).
  • Step 2.7: Click Test Connection to test connectivity with the BigQuery warehouse.
  • Step 2.8: Once the test is successful, click SAVE DESTINATION.
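As a quick illustration of the field rules mentioned in the steps above (the Pipeline name limit, the Business Unit ID format, and the name sanitization behavior), here is a minimal sketch in Python. These helpers only mirror the rules as stated in this article; they are not Hevo's actual implementation:

```python
import re

def validate_pipeline_name(name: str) -> bool:
    # A Pipeline name must be non-empty and not exceed 255 characters.
    return 0 < len(name) <= 255

def validate_business_unit_id(unit_id: str) -> bool:
    # A Pardot Business Unit ID starts with "0Uv" and has 18 characters.
    return unit_id.startswith("0Uv") and len(unit_id) == 18

def sanitize_name(name: str) -> str:
    # Replace spaces and non-alphanumeric characters in table/column
    # names with underscores, as the Sanitize Table/Column Names option does.
    return re.sub(r"[^A-Za-z0-9]", "_", name)
```

For example, `sanitize_name("Prospect Email")` yields `"Prospect_Email"`.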

Deliver smarter, faster insights with your unified data

Using manual scripts and custom code to move data into the warehouse is cumbersome. Changing API endpoints and limits, ad-hoc data preparation, and inconsistent schemas make maintaining such a system a nightmare. Hevo’s reliable no-code data pipeline platform enables you to set up zero-maintenance data pipelines that just work.

  • Wide Range of Connectors: Instantly connect and read data from 150+ sources including SaaS apps and databases, and precisely control pipeline schedules down to the minute.
  • In-built Transformations: Format your data on the fly with Hevo’s preload transformations using either the drag-and-drop interface or our nifty Python interface. Generate analysis-ready data in your warehouse using Hevo’s post-load transformations.
  • Near Real-Time Replication: Get access to near real-time replication for all database sources with log-based replication. For SaaS applications, near real-time replication is subject to API limits.   
  • Auto-Schema Management: Correcting improper schema after the data is loaded into your warehouse is challenging. Hevo automatically maps source schema with destination warehouse so you don’t face the pain of schema errors.
  • Transparent Pricing: Say goodbye to complex and hidden pricing models. Hevo’s Transparent Pricing brings complete visibility to your ELT spend. Choose a plan based on your business needs. Stay in control with spend alerts and configurable credit limits for unforeseen spikes in the data flow.
  • 24×7 Customer Support: With Hevo you get more than just a platform, you get a partner for your pipelines. Discover peace with round-the-clock “Live Chat” within the platform. What’s more, you get 24×7 support even during the 14-day free trial.
  • Security: Discover peace with end-to-end encryption and compliance with all major security certifications including HIPAA, GDPR, and SOC-2.

Try Hevo Today!

SIGN UP HERE FOR A 14-DAY FREE TRIAL

Method 2: Using Custom Code to Move Data from Pardot to BigQuery

This method of connecting Pardot to BigQuery takes a more technical approach. Pardot helps businesses improve their digital marketing campaigns by automating tasks such as email marketing, lead management, and customer behavior tracking. Companies can store all this Pardot marketing campaign data in a data warehouse like BigQuery for in-depth analysis, where powerful BI tools help businesses make better data-driven decisions.

Exporting Pardot Data

It is assumed that you are signed in to your Pardot account. Follow the steps below to export Pardot data.

  • Navigate to:

Lightning: Prospects > Pardot Prospects

Classic: Prospects > Prospects List

  • Click Tools and then select the CSV Export option.
  • Give the export a name.
  • Select the Export type as Express or Full. Express is recommended if you have a large number of prospects.
  • Click on Export.
  • Once your export is ready, you will receive an email.
  • To export your marketing assets and sync errors, navigate to their respective pages and follow the same steps as above.

For example, if you want to export prospect data from sent emails, follow the below steps:

  • Navigate to Pardot Email > Sent Emails > Tools.
  • To export sync errors navigate to Pardot Settings > Connectors > Click on the CRM cog > Sync Errors.
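If you would rather pull prospect data programmatically than export CSVs by hand, the Pardot (Account Engagement) API exposes a prospect query endpoint. The sketch below only builds the authenticated request; the endpoint version and URL are based on Pardot API v4 and should be checked against your account's API version, and you would supply a real Salesforce OAuth access token and Business Unit ID:

```python
import urllib.request

def build_prospect_query_request(access_token: str, business_unit_id: str):
    # Pardot API v4 prospect query endpoint (verify against your API version).
    url = "https://pi.pardot.com/api/prospect/version/4/do/query?format=json"
    headers = {
        # Salesforce OAuth 2.0 access token.
        "Authorization": f"Bearer {access_token}",
        # Identifies the Pardot business unit (18 chars, starts with 0Uv).
        "Pardot-Business-Unit-Id": business_unit_id,
    }
    return urllib.request.Request(url, headers=headers)

# Usage (requires valid credentials):
# req = build_prospect_query_request(token, "0Uv...")
# with urllib.request.urlopen(req) as resp:
#     data = resp.read()
```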

Importing Data into BigQuery

With BigQuery, you can load your CSV file into a new table, or append it to or overwrite an existing table or partition.

The file is loaded into the BigQuery table and converted into BigQuery’s columnar storage format (Capacitor).

Make sure you have the required IAM permissions below before loading the CSV file into BigQuery.

  • Permissions to load data into BigQuery

bigquery.tables.create
bigquery.tables.updateData
bigquery.tables.update
bigquery.jobs.create
  • Each of the following predefined IAM roles includes the permissions needed to load data into a BigQuery table:
roles/bigquery.dataEditor
roles/bigquery.dataOwner
roles/bigquery.admin (includes the bigquery.jobs.create permission)
roles/bigquery.user (includes the bigquery.jobs.create permission)
roles/bigquery.jobUser (includes the bigquery.jobs.create permission)
  • Permissions for loading data from Cloud Storage
storage.objects.get
storage.objects.list
  • You can load the CSV file into a BigQuery table using any of the following:
    • The Cloud Console.
    • The bq command-line tool’s bq load command.
    • Calling the jobs.insert API method.
    • The client libraries.
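As a quick sanity check before loading, you can express the permission requirements above as sets and verify that a given grant covers them. This is a simple sketch; the permission names come directly from the lists above:

```python
# Permissions needed to load data into BigQuery.
LOAD_PERMISSIONS = {
    "bigquery.tables.create",
    "bigquery.tables.updateData",
    "bigquery.tables.update",
    "bigquery.jobs.create",
}

# Permissions needed to read the source file from Cloud Storage.
GCS_PERMISSIONS = {
    "storage.objects.get",
    "storage.objects.list",
}

def missing_permissions(granted):
    # Return any permissions still needed to load a CSV from Cloud Storage.
    return (LOAD_PERMISSIONS | GCS_PERMISSIONS) - granted
```

An empty result from `missing_permissions` means the grant is sufficient for a Cloud Storage CSV load.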

Follow the steps below to load the CSV file into a BigQuery table using the Cloud Console.

  • Navigate to the BigQuery page in the Cloud Console.
  • In the Explorer pane, expand your project and select a dataset.
  • Click the Create Table option in the Dataset info section.
  • In the Create table panel, enter the details. In the Source section, choose Google Cloud Storage in the Create table from list, and then:
  • Select the file from the Google Cloud Storage bucket, or enter the Cloud Storage URI.
  • Select CSV as the file format.

Enter the following details in the Destination section.

  • Select the dataset in which to create the table.
  • Enter the table’s name in the Table field.
  • Check that the Table type is set to Native table.
  • Enter the schema definition in the Schema section, or select the Auto detect option to enable schema auto-detection. You can enter the schema definition in either of the following ways:
  • Click Edit as text and paste the schema as a JSON array. When using a JSON array, you generate the schema using the same process as creating a JSON schema file.

You can view the schema of an existing table in JSON format using the below command.

bq show --format=prettyjson dataset.table
  • Alternatively, click Add field and enter each field’s name, type, and mode to build the table schema.

Finally, configure any settings you need under Advanced options and click Create table to create the table in BigQuery.
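The JSON schema pasted into Edit as text is simply an array of field objects with a name, type, and mode. The short sketch below writes such a schema file; the field names are illustrative placeholders, not Pardot's actual export columns, so adjust them to match your CSV:

```python
import json

# Illustrative schema for a prospects table; adjust fields to match your CSV.
schema = [
    {"name": "prospect_id", "type": "INTEGER", "mode": "REQUIRED"},
    {"name": "email", "type": "STRING", "mode": "NULLABLE"},
    {"name": "created_at", "type": "TIMESTAMP", "mode": "NULLABLE"},
]

# Write the schema file; a file like this can also be passed to
# `bq load --schema=schema.json` instead of using the Console.
with open("schema.json", "w") as f:
    json.dump(schema, f, indent=2)
```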

Limitations of Manually Connecting Pardot to BigQuery

Businesses can connect Pardot to BigQuery using standard APIs or manual processes. Although the manual process is manageable, it does not let you work with real-time data. And in the case of standard APIs, you need a strong technical team to connect Pardot to BigQuery. To eliminate such issues, businesses can use third-party ETL tools like Hevo to automate pipelines between Pardot and Google BigQuery.

Conclusion

In this article, you learned about connecting Pardot to BigQuery. Pardot helps businesses route leads to sales, create automated marketing campaigns, analyze prospect activity, and more. Companies can integrate this Pardot data with BigQuery, where powerful BI tools can be used to gain meaningful insights and help businesses make better decisions.

Visit our Website to Explore Hevo

Hevo offers a No-code Data Pipeline that can automate your data transfer process, hence allowing you to focus on other aspects of your business like Analytics, Marketing, Customer Management, etc.

This platform allows you to transfer data from 150+ sources (including 40+ Free Sources) such as Pardot and Cloud-based Data Warehouses like Snowflake, Google BigQuery, etc. It will provide you with a hassle-free experience and make your work life much easier.

Want to take Hevo for a spin? 

Sign Up for a 14-day free trial and experience the feature-rich Hevo suite first hand. You can also have a look at the unbeatable pricing that will help you choose the right plan for your business needs.
