The journey a customer takes through an eCommerce platform is complicated. Shoppers use various devices, such as smartphones and computers, to research items before making a purchase, visiting numerous websites, reading review platforms, and comparing products along the way. If you want to reach a specific group of people, marketing solutions like Outbrain leverage emails, SMS, search engine ads, and remarketing to target potential buyers efficiently. You can also learn about product demand trends by looking at ad impressions, click-through rates, conversion rates, and search histories on marketing platforms.

In this article, you will learn how to integrate Outbrain to BigQuery. You will also learn about Outbrain and BigQuery and their key features. 

What is Google BigQuery?

Google BigQuery is a Data Warehouse hosted on the Google Cloud Platform that helps enterprises with their analytics activities. This Software as a Service (SaaS) platform is serverless and has outstanding data management, access control, and Machine Learning features (Google BigQuery ML). Google BigQuery excels in analyzing enormous amounts of data and quickly meets your Big Data processing needs with capabilities like exabyte-scale storage and petabyte-scale SQL queries.

Google BigQuery’s columnar storage makes data searching more manageable and effective. Data is stored in the Colossus file system, while queries are processed by the Dremel query engine and exposed via a REST API. The storage and compute layers rely on Google’s Jupiter network to move data quickly between them.

Key Features of Google BigQuery

  • Fully Managed: An in-house setup is not required since Google BigQuery is a fully managed Data Warehouse. To use Google BigQuery, you only need a web browser to log in to the Google Cloud project. By offering serverless execution, Google BigQuery takes care of complicated setup and maintenance processes, including Server/VM Administration, Server/VM Sizing, Memory Management, etc.
  • Exceptional Performance: Thanks to its column-based design, Google BigQuery offers several advantages over traditional row-based storage, such as higher storage efficiency and faster data scans. It also supports nested tables for practical data storage and retrieval, which reduces slot consumption, query time, and the amount of data processed.
  • Security: Google BigQuery offers column-level security, verifies identity and access status, and lets you define security policies; all data is encrypted at rest and in transit by default. As part of the Google Cloud ecosystem, it complies with security standards like HIPAA, FedRAMP, PCI DSS, ISO/IEC, SOC 1, 2, and 3.
  • Partitioning: Google BigQuery’s decoupled storage and compute architecture uses column-based segmentation to reduce the amount of data that slot workers read from disk. Once the slot workers have finished reading their data, Google BigQuery automatically determines the optimal data sharding and instantly repartitions data using its in-memory shuffle (see the sketch below).
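
To make partitioning concrete, here is a minimal sketch that creates a date-partitioned table with the google-cloud-bigquery Python client. The project, dataset, table, and column names are placeholders, not values from this tutorial; queries that filter on the partitioning column scan only the matching partitions.

from google.cloud import bigquery

client = bigquery.Client(project="my-project")          # placeholder project

schema = [
    bigquery.SchemaField("campaign_id", "STRING"),
    bigquery.SchemaField("impressions", "INTEGER"),
    bigquery.SchemaField("event_date", "DATE"),
]

# Assumes the dataset "my_dataset" already exists.
table = bigquery.Table("my-project.my_dataset.ad_performance", schema=schema)
table.time_partitioning = bigquery.TimePartitioning(
    type_=bigquery.TimePartitioningType.DAY,
    field="event_date",                                  # partition by this column
)
client.create_table(table)

# Filtering on event_date lets BigQuery prune partitions instead of scanning
# the whole table, for example:
#   SELECT SUM(impressions) FROM `my-project.my_dataset.ad_performance`
#   WHERE event_date BETWEEN "2023-01-01" AND "2023-01-31"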

Why Integrate Outbrain to BigQuery?

For marketers, the most challenging aspect of running Outbrain campaigns is the money wasted on redundant ads. Consider advertisements for products that are temporarily unavailable, which represent a significant financial loss. Outbrain can solve this problem when it receives the right data from other platforms. One reason your Outbrain marketing campaigns are not producing more income is that you do not have enough specific data, and that information cannot be sent to Outbrain in its original format.

Before you use this information to launch marketing campaigns with Outbrain, you should first collect the relevant data and analyze it properly in a data warehouse. Google BigQuery can help you glean useful information from massive amounts of data. Using its straightforward interface, you can start querying in minutes without having to worry about managing the underlying infrastructure, which makes it a natural destination for an Outbrain to BigQuery Integration.

Reliably Integrate data with Hevo’s Fully Automated No Code Data Pipeline

If yours is anything like the 1000+ data-driven companies that use Hevo, more than 70% of the business apps you use are SaaS applications. Integrating the data from these sources in a timely way is crucial to fuel analytics and the decisions that are taken from it. But given how fast API endpoints and the like can change, creating and managing these pipelines can be a soul-sucking exercise.

Hevo’s no-code data pipeline platform lets you connect 150+ sources in a matter of minutes to deliver data in near real-time to your warehouse. What’s more, the in-built transformation capabilities and the intuitive UI mean even non-engineers can set up pipelines and achieve analytics-ready data in minutes.

All of this, combined with transparent pricing and 24×7 support, makes us the most loved data pipeline software in terms of user reviews.

Take our 14-day free trial to experience a better way to manage data pipelines.

Get started for Free with Hevo!

Methods to Integrate Outbrain to BigQuery

In this section, you will learn how to integrate Outbrain to BigQuery. There are two ways to go about integrating Outbrain to BigQuery. 

Method 1: Outbrain to BigQuery Integration Using Hevo


Hevo provides Google BigQuery as a Destination for loading or transferring data from any Source system, including Outbrain. You can refer to Hevo’s documentation for the permissions, user authentication, and prerequisites for Google BigQuery as a Destination here.

Configure Outbrain as a Source

Configure Outbrain as the Source in your Pipeline by following the instructions below to connect Outbrain to BigQuery:

  • Step 1: In the Asset Palette, choose PIPELINES.
  • Step 2: In the Pipelines List View, click + CREATE.
  • Step 3: Select Outbrain on the Select Source Type page to connect Outbrain to BigQuery.
  • Step 4: Set the following on the Configure your Outbrain Source page:
  • Pipeline Name: A name for the pipeline that is unique and does not exceed 255 characters.
  • Username: The username included in the Outbrain access credentials.
  • Password: The password found in the Outbrain access credentials.
  • Historical Sync Duration: The duration for which historical data must be ingested.
  • Step 5: Click TEST & CONTINUE to proceed with the Outbrain to BigQuery Migration.
  • Step 6: Proceed to setting up the Destination and configuring the data ingestion.

Configure BigQuery as a Destination

To configure BigQuery as a Destination in Outbrain to BigQuery Connector, follow these steps:

  • In the Asset Palette, choose DESTINATIONS.
  • In the Destinations List View, click + CREATE for Outbrain to BigQuery Integration.
  • Select Google BigQuery as the Destination type on the Add Destination page.
  • Select the authentication method for connecting to BigQuery on the Configure your Google BigQuery Account page to migrate Outbrain to BigQuery.
  • Perform one of the following:
    • To connect with a Service Account, follow these steps:
      • Attach the Service Account Key file.
      • Click on CONFIGURE GOOGLE BIGQUERY ACCOUNT.
    • To connect using a User Account, follow these steps:
      • Click on + ADD A GOOGLE BIGQUERY ACCOUNT.
      • Sign in as a user with BigQuery Admin and Storage Admin permissions.
      • Provide Hevo access to your data by clicking Allow.
  • On the Configure your Google BigQuery Warehouse page, provide the following information:
    • Destination Name: Give your Destination a distinctive name.
    • Project ID: The BigQuery instance’s Project ID.
    • Dataset ID: The dataset’s name.
    • GCS Bucket: A cloud storage bucket where files must be staged before being transferred to BigQuery.
    • Sanitize Table/Column Names: Select this option to replace any non-alphanumeric characters and spaces in table and column names with an underscore (_).
    • Populate Loaded Timestamp: Enabling this option adds the __hevo_loaded_at_ column to the Destination Database, indicating the time when the Event was loaded to the Destination.
  • Click TEST CONNECTION to test the connection, and then click SAVE DESTINATION to complete the Outbrain to BigQuery Integration setup.

Deliver Smarter, Faster Insights with your Unified Data

Using manual scripts and custom code to move data into the warehouse is cumbersome. Changing API endpoints and limits, ad-hoc data preparation, and inconsistent schemas make maintaining such a system a nightmare. Hevo’s reliable no-code data pipeline platform enables you to set up zero-maintenance data pipelines that just work.

  • Wide Range of Connectors: Instantly connect and read data from 150+ sources including SaaS apps and databases, and precisely control pipeline schedules down to the minute.
  • In-built Transformations: Format your data on the fly with Hevo’s preload transformations using either the drag-and-drop interface or our nifty python interface. Generate analysis-ready data in your warehouse using Hevo’s Postload Transformation.
  • Near Real-Time Replication: Get access to near real-time replication for all database sources with log-based replication. For SaaS applications, near real-time replication is subject to API limits.   
  • Auto-Schema Management: Correcting improper schema after the data is loaded into your warehouse is challenging. Hevo automatically maps the source schema with the destination warehouse so that you don’t face the pain of schema errors.
  • Transparent Pricing: Say goodbye to complex and hidden pricing models. Hevo’s Transparent Pricing brings complete visibility to your ELT spend. Choose a plan based on your business needs. Stay in control with spend alerts and configurable credit limits for unforeseen spikes in the data flow.
  • 24×7 Customer Support: With Hevo you get more than just a platform, you get a partner for your pipelines. Discover peace with round-the-clock “Live Chat” within the platform. What’s more, you get 24×7 support even during the 14-day free trial.
  • Security: Discover peace with end-to-end encryption and compliance with all major security certifications including HIPAA, GDPR, and SOC-2.

Get Started for Free with Hevo’s 14-day Free Trial.

Method 2: Outbrain to BigQuery Integration Manually

Follow these steps to integrate Outbrain to BigQuery manually:

Step 1: Get Data out of Outbrain

In Outbrain to BigQuery Integration, Outbrain’s RESTful Amplify API allows you to extract data about marketers, campaigns, performance, and more. With a call like GET /reports/marketers/[id]/content, you can request performance metrics such as impressions, clicks, click-through rate, and cost. To limit, filter, and sort the results, you can use any of a dozen alternative options.
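
As a rough illustration, the sketch below calls that endpoint with Python’s requests library. The login flow, the OB-TOKEN-V1 header, and the date-range and paging parameters are assumptions based on the public Amplify API documentation, and the credentials and marketer ID are placeholders; verify the exact parameter names against Outbrain’s API reference.

import requests

BASE = "https://api.outbrain.com/amplify/v0.1"   # Amplify API base URL (assumed)

# Exchange Outbrain credentials for a token; the Amplify API is assumed to
# return it under the "OB-TOKEN-V1" key and expect it back as a header.
login = requests.get(f"{BASE}/login", auth=("user@example.com", "password"))
login.raise_for_status()
headers = {"OB-TOKEN-V1": login.json()["OB-TOKEN-V1"]}

marketer_id = "YOUR_MARKETER_ID"                 # placeholder

# Pull content-level performance. The date-range and paging parameters below
# are illustrative; consult the endpoint documentation for the full list.
params = {"from": "2023-01-01", "to": "2023-01-31", "limit": 100, "offset": 0}
resp = requests.get(
    f"{BASE}/reports/marketers/{marketer_id}/content",
    headers=headers,
    params=params,
)
resp.raise_for_status()
report = resp.json()
print(report["totalResults"], "content items in range")

Because each call returns one page of results, you would loop over the offset parameter until you have pulled all totalResults rows.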

Step 2: Sample the Data

The second step in Outbrain to BigQuery Connection is sampling the data. The following is an example of a JSON response from an API query for performance data:

{
    "results": [
        {
            "metadata": 
            {
                "id": "00f4b02153ee75f3c9dc4fc128ab041962",
                "text": "Yet another promoted link",
                "creationTime": "2017-11-26",
                "lastModified": "2017-11-26",
                "url": "http://money.outbrain.com/2017/11/26/news/economy/crash-disaster/",
                "status": "APPROVED",
                "enabled": true,
                "cachedImageUrl": "http://images.outbrain.com/imageserver/v2/s/gtE/n/plcyz/abc/iGYzT/plcyz-f8A-158x114.jpg",
                "campaignId": "abf4b02153ee75f3cadc4fc128ab0419ab",
                "campaignName": "Boost 'ABC' Brand",
                "archived": false,
                "documentLanguage": "EN",
                "sectionName": "Economics",
            },
            "metrics":
            {
                "impressions": 18479333,
                "clicks": 58659,
                "conversions": 12,
                "spend": 9187.16,
                "ecpc": 0.16,
                "ctr": 0.32,
                "conversionRate": 0.02,
                "cpa": 765.6
            }
        }
    ],
    "totalResults": 27830,
    "summary": {
        "impressions": 1177363701,
        "clicks": 2615150,
        "conversions": 2155,
        "spend": 455013.97,
        "ecpc": 0.17,
        "ctr": 0.22,
        "conversionRate": 0.08,
        "cpa": 211.14
    },
    "totalFilteredResults": 1,
    "summaryFiltered": {
        "impressions": 18479333,
        "clicks": 58659,
        "conversions": 12,
        "spend": 9187.16,
        "ecpc": 0.16,
        "ctr": 0.32,
        "conversionRate": 0.02,
        "cpa": 765.6
    }
}

Step 3: Prepare the Data

You’ll need to construct a schema for your data tables if you don’t already have a data structure in which to store the data you obtain in Outbrain to BigQuery Integration. Then, for each value returned in the response, you will need to map it to a predefined data type (such as INTEGER, DATETIME, etc.) and create a table that can store it. The documentation for each endpoint lists the available fields and their corresponding data types.

An additional complication is that the records retrieved from the source might not always be “flat” – some of the objects might be lists. Because of this, you will almost certainly need to create additional tables to accommodate the varying cardinality of each record.
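
As a rough sketch, the snippet below flattens the nested metadata and metrics objects from the sample response into one row per content item and defines a matching BigQuery schema with the google-cloud-bigquery client. The column names and chosen types are illustrative assumptions, not an official mapping.

from google.cloud import bigquery

def flatten(item):
    # Merge the nested "metadata" and "metrics" objects into one flat row.
    meta, metrics = item["metadata"], item["metrics"]
    return {
        "content_id": meta["id"],
        "campaign_id": meta["campaignId"],
        "campaign_name": meta["campaignName"],
        "creation_time": meta["creationTime"],
        "impressions": metrics["impressions"],
        "clicks": metrics["clicks"],
        "spend": metrics["spend"],
        "ctr": metrics["ctr"],
    }

# Illustrative schema; adjust types after checking the endpoint documentation.
schema = [
    bigquery.SchemaField("content_id", "STRING"),
    bigquery.SchemaField("campaign_id", "STRING"),
    bigquery.SchemaField("campaign_name", "STRING"),
    bigquery.SchemaField("creation_time", "DATE"),
    bigquery.SchemaField("impressions", "INTEGER"),
    bigquery.SchemaField("clicks", "INTEGER"),
    bigquery.SchemaField("spend", "FLOAT"),
    bigquery.SchemaField("ctr", "FLOAT"),
]

rows = [flatten(item) for item in report["results"]]   # "report" from Step 1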

Step 4: Load Data into BigQuery

The next step in Outbrain to BigQuery Integration is to load the data into Google BigQuery. You can load the extracted records directly, or generate data with SQL and store the results in Google BigQuery. The following are some of the options for loading data; a short sketch follows the list:

  • You can perform bulk inserts into an existing table using data manipulation language (DML) statements, or you can store the results of a query in a new table.
  • When you want to create a new table based on the results of a query, use the CREATE TABLE … AS SELECT statement.
  • Execute a query and save its results to a table. You can write the results to a new table or append them to an existing table.
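
The snippet below sketches two of these options with the google-cloud-bigquery Python client: a bulk load of the flattened rows from Step 3 into an existing table, and a CREATE TABLE … AS SELECT statement that materializes a campaign summary. Project, dataset, and table names are placeholders.

from google.cloud import bigquery

client = bigquery.Client(project="my-project")           # placeholder project

# Option 1: bulk-load the flattened rows into an existing table.
table_id = "my-project.outbrain.content_performance"     # placeholder table
load_job = client.load_table_from_json(rows, table_id)   # "rows" from Step 3
load_job.result()                                         # wait for completion

# Option 2: create a new table from the results of a query.
ddl = """
CREATE OR REPLACE TABLE `my-project.outbrain.campaign_summary` AS
SELECT campaign_id,
       SUM(impressions) AS impressions,
       SUM(clicks)      AS clicks,
       SUM(spend)       AS spend
FROM `my-project.outbrain.content_performance`
GROUP BY campaign_id
"""
client.query(ddl).result()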

Step 5: Keeping Outbrain Data Up to Date

At this point, you have successfully written a script or program to gather the data you want and move it into your data warehouse. But how do you plan to add new data or update the existing data in your Outbrain to BigQuery Connection? Creating duplicates of all of your data each time you update your records is not a good idea; it would be a laborious, time-consuming process that requires a lot of resources.

Instead, pick out the most significant fields your script can use to bookmark its position in the data, so that it can return to them later when looking for updated information. Auto-incrementing fields such as updated_at or created_at are well suited to this purpose. Once you’ve added this functionality, you can run your script as a cron job or a continuous loop to acquire new data as it appears in Outbrain, as sketched below.
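
The sketch below shows one way to keep such a bookmark: a small local state file that stores the last synced date and drives the next incremental pull. The file name and the fetch_report and load_rows helpers (wrapping the Step 1 API call and the Step 4 BigQuery load) are hypothetical.

import json
from datetime import date, timedelta

STATE_FILE = "outbrain_cursor.json"   # hypothetical local bookmark file

def load_cursor(default_days_back=7):
    # Resume from the last synced date, or start a week back on the first run.
    try:
        with open(STATE_FILE) as f:
            return json.load(f)["last_synced_date"]
    except FileNotFoundError:
        return (date.today() - timedelta(days=default_days_back)).isoformat()

def save_cursor(synced_until):
    with open(STATE_FILE, "w") as f:
        json.dump({"last_synced_date": synced_until}, f)

def sync_once(fetch_report, load_rows):
    # fetch_report(from_date, to_date) wraps the Step 1 API call;
    # load_rows(rows) wraps the Step 4 BigQuery load. Both are hypothetical.
    start, end = load_cursor(), date.today().isoformat()
    rows = fetch_report(start, end)
    load_rows(rows)
    save_cursor(end)
    return len(rows)

# Schedule sync_once with cron (for example, hourly) to pick up new data.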

Once you’ve written the code, you are responsible for its maintenance, just as you would be with any other kind of code. If Outbrain updates its API, or the API returns a field with a data type your code cannot handle, you may need to modify the script. And if your users require slightly different information, you will have no choice but to comply.

Conclusion

In this article, you learned two methods of Integrating Outbrain to BigQuery. The first method was using Hevo and the second was manually transferring data. You also learned why you need to integrate Outbrain to BigQuery. 

However, as a Developer, extracting complex data from a diverse set of data sources like Databases, CRMs, Project Management Tools, Streaming Services, and Marketing Platforms to your Database can seem quite challenging. If you are from a non-technical background or are new to the game of data warehousing and analytics, Hevo can help!

Visit our Website to Explore Hevo

Hevo will automate your data transfer process, allowing you to focus on other aspects of your business like Analytics, Customer Management, etc. Hevo provides a wide range of sources – 150+ Data Sources (including 40+ Free Sources) – that connect with 15+ Destinations. It will provide you with a seamless experience and make your work life much easier.

Want to take Hevo for a spin? Sign Up for a 14-day free trial and experience the feature-rich Hevo suite firsthand.

You can also have a look at our unbeatable pricing that will help you choose the right plan for your business needs!

Sharon Rithika
Content Writer, Hevo Data

Sharon is a data science enthusiast with a hands-on approach to data integration and infrastructure. She leverages her technical background in computer science and her experience as a Marketing Content Analyst at Hevo Data to create informative content that bridges the gap between technical concepts and practical applications. Sharon's passion lies in using data to solve real-world problems and empower others with data literacy.
