Facebook Page Insights to Snowflake: 2 Easy Methods

on Data Integration, Data Warehouses, Facebook Ads, Snowflake • July 7th, 2022


Facebook Pages fetches Insights data through the Graph API, both for a Page as a whole (such as followers and impressions) and for individual Page posts (such as likes, shares, comments, and video plays).

Snowflake’s Data Cloud is built on a cutting-edge data platform that is delivered as Software-as-a-Service (SaaS). Snowflake provides data storage, processing, and analytic solutions that are faster, easier to use, and more flexible than traditional options.

This article talks about the ways to load data instantly from Facebook Page Insights to Snowflake. It also gives a brief introduction to Facebook Page Insights and Snowflake.


What are Facebook Page Insights?


You can monitor user behavior and the effectiveness of your posts on your Facebook business page using Facebook Insights, an analytics dashboard. The platform also recommends competitor pages to watch and track in addition to providing important metrics for paid and organic posts, such as page views and post reach.

Facebook Insights makes incoming data from your company page visually appealing so you can understand how visitors interact with it, what content they find interesting, and how it compares to rival pages.

You can retrieve ad data to monitor the effectiveness of your ads using the Facebook Insights API, also referred to as the Ads Insights API. The API is especially helpful because it enables marketers to extract the precise data they need to track, regardless of how detailed.

To ensure that only your team has access to the data you pull from the API, you must select the data you want to use and create an access token for that ad.
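As an illustration, a request to the Ads Insights API is just a Graph API URL plus your access token. The sketch below only builds the URL; the ad account ID and token are placeholders you must supply, and the helper function is ours, not part of any Facebook SDK:

```python
# Graph API root used throughout this article.
GRAPH_ROOT = "https://graph.facebook.com/v12.0"

def insights_url(ad_account_id, fields, access_token):
    # Ad account IDs are prefixed with "act_" in the Marketing API.
    return (f"{GRAPH_ROOT}/act_{ad_account_id}/insights"
            f"?fields={','.join(fields)}&access_token={access_token}")

# Placeholders: substitute your own ad account ID and access token.
url = insights_url("123456789", ["impressions", "clicks", "spend"], "<ACCESS_TOKEN>")
print(url)
```

Fetching the URL with an HTTP client then returns the requested metrics as JSON, scoped to whatever the token is permitted to read.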

What is Snowflake?


Snowflake is a fully managed SaaS (Software as a Service) that combines Data Warehousing, Data Lakes, Data Engineering, Data Science, Data Application Development, and Secure Sharing and Consumption of Real-time / Shared Data into a single platform.

Snowflake isn’t based on any existing database technology or “Big Data” software platforms like Hadoop. Instead, it combines a brand-new SQL query engine with cutting-edge cloud architecture. Snowflake also gives users all of the features and capabilities of an enterprise analytic database, plus a lot more.

Snowflake is a hybrid of shared-disk and shared-nothing Database architectures, which allows it to deliver results quickly. It uses a central repository for persisted data, similar to a shared-disk database, which is accessible from all compute nodes. It uses MPP (Massively Parallel Processing) compute clusters, in which each node stores a portion of the entire data set locally, similar to shared-nothing architectures.

Key Features of Snowflake

Here are some of the features of Snowflake as a Software as a Service (SaaS) solution:

  • Snowflake enables you to enhance your Analytics Pipeline by transitioning from nightly Batch Loads to Real-time Data Streams, allowing you to improve the quality and speed of your analytics. By enabling Secure, Concurrent, and Monitoring Access to your Data Warehouse across your organization, you can improve the quality of analytics at your company.
  • Snowflake uses caching to deliver results swiftly: query results are persisted, so a report is not regenerated when nothing has changed.
  • Snowflake allows you to better analyze Customer Behaviour and Product Usage. You can also use the whole scope of data to ensure Customer Satisfaction, drastically improve product offers, and foster Data Science innovation.
  • Snowflake allows you to create your own Data Exchange, which allows you to securely communicate live, controlled data.

Explore These Methods to Connect Facebook Page Insights to Snowflake

Facebook Page Insights aids media companies, brands, and public figures in understanding their audience, the effectiveness of their posts, and the long-term health of their Pages. Snowflake provides a data warehouse-as-a-service: a cloud-based data storage and analytics service that businesses can use to store and analyze data using cloud-based hardware and software.

When integrated, moving data from Facebook Page Insights to Snowflake could solve some of the biggest data problems for businesses. In this article, we have described two methods to achieve this:

Method 1: Using Hevo to Connect Facebook Page Insights to Snowflake 

Hevo Data, an Automated Data Pipeline, provides you a hassle-free solution to connect Facebook Page Insights to Snowflake within minutes with an easy-to-use no-code interface. Hevo is fully managed and completely automates the process of not only loading data from Facebook Page Insights to Snowflake but also enriching the data and transforming it into an analysis-ready form without having to write a single line of code.

Get Started with Hevo for Free

Method 2: Using Custom Code to Move Data from Facebook Page Insights to Snowflake 

This method of connecting Facebook Page Insights to Snowflake would be time-consuming and somewhat tedious to implement. Users will have to write custom codes to enable Facebook Page Insights to Snowflake migration. This method is suitable for users with a technical background.

Both the methods are explained below.

Facebook Page Insights to Snowflake Integration

Method 1: Using Hevo to Connect Facebook Page Insights to Snowflake


Hevo provides an Automated No-code Data Pipeline that helps you move your Facebook Page Insights data to Snowflake. Hevo is fully managed and completely automates loading data from 150+ Sources (including 40+ Free Sources) into 15+ Destinations. It also enriches and transforms the data into an analysis-ready form without writing any code. Its fault-tolerant architecture ensures that the data is handled in a secure, consistent manner with zero data loss. This is an automated method for Facebook Page Insights to Snowflake migration.

Using Hevo Data, Facebook Page Insights to Snowflake migration can be done in the following 2 steps:

  • Step 1: Configure Facebook Page Insights as the Source in your Pipeline by following the steps below:
    • Step 1.1: In the Asset Palette, select PIPELINES.
    • Step 1.2: In the Pipelines List View, click + CREATE.
    • Step 1.3: Select Facebook Page Insights on the Select Source Type page.
    • Step 1.4: Click + ADD FACEBOOK PAGES ACCOUNT on the Configure your Facebook Pages Account page.
  • Step 1.5: To continue as <Company Name>, log into your Facebook Pages account and click Continue.
  • Step 1.6: Click Next after choosing one or more pages whose data you want to copy.
  • Step 1.7: Enable all of the permissions and click Done to approve Hevo.
  • Step 1.8: In the Confirmation dialogue, click OK.
  • Step 1.9: Enter the following information on the Configure your Facebook Pages Source page:
    • Pipeline Name: A unique name for the Pipeline.
    • Pages: One or more Pages whose data you want to copy.
    • Historical Sync Duration: The duration for which historical data must be ingested.
  • Step 1.10: Click TEST & CONTINUE.
  • Step 1.11: Set up the Destination and configure the data ingestion.
  • Step 2: To set up Snowflake as a destination in Hevo, follow these steps:
    • Step 2.1: In the Asset Palette, select DESTINATIONS.
    • Step 2.2: In the Destinations List View, click + CREATE.
    • Step 2.3: Select Snowflake from the Add Destination page.
    • Step 2.4: Set the following parameters on the Configure your Snowflake Destination page:
      • Destination Name: A unique name for your Destination.
      • Snowflake Account URL: This is the account URL that you retrieved.
      • Database User: The Hevo user that you created in the database. In the Snowflake database, this user has a non-administrative role.
      • Database Password: The password of the user.
      • Database Name: The name of the Destination database where data will be loaded.
      • Database Schema: The name of the Destination database schema. Default value: public.
      • Warehouse: SQL queries and DML operations are performed in the Snowflake warehouse associated with your database.
  • Step 2.5: Click Test Connection to test connectivity with the Snowflake warehouse.
  • Step 2.6: Once the test is successful, click SAVE DESTINATION.

Here are more reasons to try Hevo:

  • Smooth Schema Management: Hevo takes away the tedious task of schema management & automatically detects the schema of incoming data and maps it to your schema in the desired Data Warehouse.
  • Exceptional Data Transformations: Best-in-class native support for complex data transformations at your fingertips, with both code and no-code flexibility designed for everyone.
  • Quick Setup: Hevo with its automated features, can be set up in minimal time. Moreover, with its simple and interactive UI, it is extremely easy for new customers to work on and perform operations.
  • Built To Scale: As the number of sources and the volume of your data grows, Hevo scales horizontally, handling millions of records per minute with very little latency.
  • Live Support: The Hevo team is available round the clock to extend exceptional support to its customers through chat, email, and support calls.

Try Hevo Today!

Get Started with Hevo for Free

Method 2: Using Custom Code to Move Data from Facebook Page Insights to Snowflake

In this method of data migration from Facebook Page Insights to Snowflake, you take an indirect route: first you connect Facebook Page Insights to Google Sheets, and then Google Sheets to Snowflake. Google Sheets acts as a Facebook Page Insights to Snowflake connector.

Facebook Page Insights to Google Sheets

As the first step for the Facebook Page Insights to Snowflake connection, connect Facebook Page Insights to Google Sheets by following the steps below.

Connect to the Facebook Pages API

If you haven’t already established a connection between API Connector and Facebook Pages, you must first do so as follows:

  • Open Google Sheets and click Extensions > API Connector > Manage Connections.
  • Locate Facebook Pages in the list of connections and click Connect.
  • You will be asked which pages you want to connect. Click through to connect your account.
  • After that, you’ll be taken back to your Google Sheet where you can check the Connections screen to see if your Facebook Pages connection is live.
Get your Facebook Page ID

Now let’s get your Facebook Page ID, since you’ll need it for future requests.

  • Enter the following URL into the API URL path field in API Connector by clicking Add New or Create to access the Create screen:
https://graph.facebook.com/v12.0/mixedanalytics

Replace “mixedanalytics” with the name of your page. When you open your Facebook page, you should see the name of your page in the URL bar.

  • Select Facebook Pages from the dropdown menu for OAuth.
  • Pick a destination sheet and enter the name you want.
  • To view your Facebook page ID, click Run.
  • Alternatively, you can obtain your Facebook Page ID from your Facebook Business Manager account.
Create Your API Request URL

Let’s access your Page’s most recent posts and metrics now that you have your Facebook Page ID.

  • API Root: https://graph.facebook.com/v12.0
  • Endpoint: /{page_id}/posts
  • Query Strings: ?fields=id,message,created_time,insights.metric(post_impressions,post_reactions_like_total,post_clicks)&limit=100

Although optional, the “limit” parameter can be used to return more records (the default is 25). Combining these parts, you obtain the complete API request URL:

https://graph.facebook.com/v12.0/507738226062106/posts?fields=id,message,created_time,insights.metric(post_impressions,post_reactions_like_total,post_clicks)&limit=100

Replace the Page ID in the example URL with your own.
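If you ever need to assemble this request outside of API Connector, the URL can be built programmatically. This is a minimal sketch (the helper function is ours, not part of any Facebook SDK); note that a direct call to the Graph API would also require a valid Page access token, which API Connector otherwise handles for you via OAuth:

```python
# Graph API root and endpoint described above.
GRAPH_ROOT = "https://graph.facebook.com/v12.0"

def build_posts_url(page_id, fields, limit=100):
    # Joins the endpoint and query strings exactly as in the example URL.
    return f"{GRAPH_ROOT}/{page_id}/posts?fields={','.join(fields)}&limit={limit}"

FIELDS = [
    "id", "message", "created_time",
    "insights.metric(post_impressions,post_reactions_like_total,post_clicks)",
]
url = build_posts_url("507738226062106", FIELDS)
print(url)  # reproduces the complete API request URL shown above
```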

Pull Facebook Page Data into Google Sheets

Let’s now import the data into Sheets.

  • Enter the Request URL you just created in the Create tab once more.
  • From the OAuth dropdown menu, select Facebook Pages. A “Connected” badge should appear. If you have access to more than one page, make sure the page you’ve chosen matches the page ID in your URL. If you can’t find the page you’re looking for, completely disconnect the API Connector for Sheets connection and then reconnect, making sure to choose the page(s) you want to access.
  • Leave the Headers section blank; you don’t need any additional headers.
  • Create a new tab, then select it as your data destination by clicking Set Current.
  • (Optional) Facebook responds with a massive amount of data. Consider one of the following to clean it up:
    • You can find the fields with metrics and rename them to something like “post impressions,” “post reactions like total,” and “post clicks” by clicking Edit Fields and selecting/renaming only the fields you need.
    • Alternatively, add the following JMESPath expression to the JMESPath field:
data[].{id:id,message:message,created_time:created_time, post_impressions:insights.data[?name=='post_impressions'].values, post_reactions_like_total:insights.data[?name=='post_reactions_like_total'].values, post_clicks:insights.data[?name=='post_clicks'].values}
  • Press Run. Your most recent posts and post statistics should then appear in your Google Sheet shortly after that.
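For reference, the reshaping that the JMESPath expression performs can also be written in plain Python. This sketch assumes the response shape described above; the sample record is purely illustrative:

```python
def flatten_post(post):
    # Index each named metric's values from the nested insights data,
    # mirroring the filters in the JMESPath expression above.
    metrics = {m["name"]: m["values"]
               for m in post.get("insights", {}).get("data", [])}
    return {
        "id": post["id"],
        "message": post.get("message"),
        "created_time": post.get("created_time"),
        "post_impressions": metrics.get("post_impressions"),
        "post_reactions_like_total": metrics.get("post_reactions_like_total"),
        "post_clicks": metrics.get("post_clicks"),
    }

# Illustrative record in the shape the Graph API returns:
sample = {
    "id": "507738226062106_1",
    "message": "hello",
    "created_time": "2022-07-01T00:00:00+0000",
    "insights": {"data": [
        {"name": "post_impressions", "values": [{"value": 120}]},
        {"name": "post_clicks", "values": [{"value": 7}]},
    ]},
}
row = flatten_post(sample)
```

A metric that Facebook didn’t return (here, the like total) simply comes back as None rather than raising an error.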

Google Sheets to Snowflake

To further connect Facebook Page Insights to Snowflake, you connect Google Sheets to Snowflake. To migrate your data from Google Sheets to Snowflake, you may opt for a custom-built data migration script to get the job done. Google Sheets is a connector for Facebook Page Insights to Snowflake integration.

Step 1: Setting Up Google Sheets API Access for Google Sheets

As a first step, you need to set up Google Sheets API access for the Google Sheets you want to migrate. Start by doing the following:

  • Log in to the Google account that owns the Google Sheets.
  • Point your browser to the Google Developer Console (copy and paste the following in your browser: console.developers.google.com).
  • After the console loads, create a project by clicking the “Projects” dropdown and then clicking “New Project”.
  • Give your project a name and click “Create”.
  • After that, click “Enable APIs and Services”.
  • Search for “Google Sheets API” in the search bar that appears and select it.
  • Click “Enable” to enable the Google Sheets API.
  • Click on the “Credentials” option on the left navbar in the view that appears, then click “Create Credentials”, and finally select “Service Account”.
  • Provide a name for your service account. You will notice it generates an email-format Service Account ID, for example “gsheets-migration@migrate-268012.iam.gserviceaccount.com”. Take note of this value: the token “migrate-268012” is the name of the project you created, while “gsheets-migration” is the name of the service account. In your case, these will be your own supplied values.
  • Click “Create” and fill out the remaining optional fields, then click “Continue”.
  • In the view that appears, click “Create Key”, select the “JSON” option, and click “Create” to download your key file (credentials). Store it in a safe place; you will use it later when setting up your migration environment.
  • Finally, click “Done”.

At this point, all that remains for the Google Sheets setup is to share all the Google Sheets you wish to migrate with the email-format Service Account ID mentioned above.

Note: You can copy your Service Account ID from the “client-email” field of the credential file you downloaded.

  • For this demonstration, you will migrate a sheet called “data-uplink-logs” and share it with the Service Account ID.
  • Click “Share” on the Google Sheet, paste in your Service Account ID, and click “Send”. Repeat this process for all sheets you want to migrate. Ignore any “mail delivery subsystem failure” notifications you receive while sharing the sheets, as the Service Account ID is not designed to operate as a normal email address.
Step 2: Configuring Target Database in Snowflake

You’re now ready to get started on the Snowflake side of the configuration process, which is simpler.

To begin, create a Snowflake account. Creating an account furnishes you with all the credentials you will need to access Snowflake from your migration script. 

Specifically:

  • After creating your account, you will be redirected to your Cloud Console, which opens in your browser.
  • During account creation, you specified your chosen username and password and selected your preferred AWS region, which becomes part of your account identifier.
  • Your Snowflake account is of the form <Your Account ID>.<AWS Region> and your Snowflake cloud console URL will be of the form https://<Your Account ID>.<AWS Region>.snowflakecomputing.com/
  • Prepare and store a JSON file with these credentials. It will have the following layout:
{

    "user": "<Your Username>",

    "account": "<Your Account ID>.<AWS Region>",

    "password": "<Your Password>"

}
  • After storing the JSON file, take some time to create your target environment on Snowflake using the intuitive User Interface.
  • You are initially assigned a Data Warehouse called COMPUTE_WH, so you can go ahead and create a database and tables.
  • After providing a valid name for your database and clicking “Finish”, click the “Grant Privileges” button.
  • Select the “Modify” privilege and assign it to your schema name (“PUBLIC” by default). Click “Grant”, then click “Cancel” if necessary to return to the main view.
  • The next step is to add a table to your newly created database. Do this by clicking the database name in the left display and then clicking the “Create Table” button, which pops up a form for you to design your table.
  • After designing your table, click “Finish”, then click on your table name to verify that it was created as desired.
  • Finally, open up a Worksheet pane, which will allow you to run queries on your table. Do this by clicking on the “Worksheets” icon, and then clicking on the “+” tab.
  • You can now select your database from the left pane to start running queries.
  • You will run queries from this view to verify that the data migration process is correctly writing data from the Google Sheet to this table.
Step 3: Preparing a Migration Environment on Linux Server

In this step, you will configure a migration environment on your Linux server.

  • SSH into your Linux instance. This walkthrough uses a remote AWS EC2 instance running Ubuntu, so the SSH command is of the form:
ssh -i <keyfile>.pem ubuntu@<server_public_IP>
  • Once in your instance, run sudo apt-get update to update the environment.
  • Next, create a folder for the migration project and enter it:
sudo mkdir migration-test; cd migration-test
  • It’s now time to clone the migration script used in this post:
sudo git clone https://github.com/cmdimkpa/Google-Sheets-to-Snowflake-Data-Migration.git
  • Enter the project directory and view the contents with the command:
cd Google-Sheets-to-Snowflake-Data-Migration; ls
  • This reveals the following files:
  • googlesheets.json: Copy your saved Google Sheets API credentials into this file.
  • snowflake.json: Likewise, copy your saved Snowflake credentials into this file.
  • migrate.py: This is the migration script.
Using the Migration Script
  • Before using the migration script (a Python script), you must ensure the required libraries for both Google Sheets and Snowflake are available in the migration environment. Python itself should already be installed – this is usually the case for Linux servers, but check and ensure it is installed before proceeding.
  • To install the required packages, run the following commands:
sudo apt-get install -y libssl-dev libffi-dev

pip install --upgrade snowflake-connector-python

pip install gspread oauth2client PyOpenSSL
  • At this point, you are ready to run the migration script.
  • The required command is of the form:
sudo python migrate.py <Source Google Sheet Name> 
<Comma-separated list of columns in the Google Sheet to Copy> 
<Number of rows to copy each run> <Snowflake target Data Warehouse> 
<Snowflake target Database> <Snowflake target Table> <Snowflake target table Schema> 
<Comma-separated list of Snowflake target table fields> <Snowflake account role>
  • For our example process, the command becomes:
sudo python migrate.py data-uplink-logs A,B,C,D 24 
COMPUTE_WH TEST_GSHEETS_MIGRATION GSHEETS_MIGRATION PUBLIC CLIENT_ID,NETWORK_TYPE,BYTES,UNIX_TIMESTAMP SYSADMIN
  • To migrate 24 rows of incremental data (each run) from the test Google Sheet data-uplink-logs to the target Snowflake environment, simply run the command above.
  • The reason for migrating only 24 rows at a time is to stay within the rate limit of the Google Sheets API free tier. Depending on your plan, you may not have this restriction.
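The full implementation of migrate.py lives in the repository above; conceptually, each run reads the next batch of rows from the sheet and appends them to the Snowflake table. The incremental batching idea can be sketched like this (a hypothetical helper for illustration, not the actual script):

```python
def next_batch(rows, offset, batch_size=24):
    # Return the next slice of up to `batch_size` unmigrated rows,
    # plus the new offset to persist for the following run.
    batch = rows[offset:offset + batch_size]
    return batch, offset + len(batch)

# Example: 50 sheet rows migrated 24 at a time takes three runs.
rows = list(range(50))
batch1, off = next_batch(rows, 0)    # first 24 rows
batch2, off = next_batch(rows, off)  # next 24 rows
batch3, off = next_batch(rows, off)  # remaining 2 rows
```

Persisting the offset between runs is what makes each invocation pick up where the last one stopped, instead of re-copying the whole sheet.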
Step 4: Testing the Migration Process

To test that the migration ran successfully, go to the Snowflake Worksheet you opened earlier and run the following SQL query:

SELECT * FROM TEST_GSHEETS_MIGRATION.PUBLIC.GSHEETS_MIGRATION

If the data appears in the results, the migration was successful.
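The same check can also be scripted with the snowflake-connector-python package installed earlier. This is a sketch, assuming the snowflake.json credentials file prepared in Step 2 sits alongside the script:

```python
import json

# The verification query from the Worksheet above.
VERIFY_SQL = "SELECT * FROM TEST_GSHEETS_MIGRATION.PUBLIC.GSHEETS_MIGRATION"

def run_verification(creds_path="snowflake.json"):
    # Imported lazily so the module loads even where the connector
    # isn't installed (pip install --upgrade snowflake-connector-python).
    import snowflake.connector
    with open(creds_path) as f:
        creds = json.load(f)
    conn = snowflake.connector.connect(
        user=creds["user"],
        password=creds["password"],
        account=creds["account"],
        warehouse="COMPUTE_WH",
    )
    try:
        with conn.cursor() as cur:
            cur.execute(VERIFY_SQL)
            return cur.fetchall()  # non-empty if the migration wrote rows
    finally:
        conn.close()
```

Calling run_verification() returns the migrated rows, which you can compare against the source Google Sheet.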

Conclusion

In a few steps, this blog has explained two ways to load data from Facebook Page Insights to Snowflake. It also gave a brief overview of Facebook Page Insights and Snowflake.

Visit our Website to Explore Hevo

Hevo Data offers a No-code Data Pipeline that can automate your data transfer process, hence allowing you to focus on other aspects of your business like Analytics, Marketing, Customer Management, etc.

This platform allows you to transfer data from 150+ Sources (including 40+ Free Sources) such as Facebook Page Insights to 15+ Destinations, including cloud-based Data Warehouses like Snowflake and Google BigQuery. It will provide you with a hassle-free experience and make your work life much easier.

Want to take Hevo for a spin? 

Sign Up for a 14-day free trial and experience the feature-rich Hevo suite firsthand. You can also have a look at the unbeatable pricing that will help you choose the right plan for your business needs.

No-code Data Pipeline For Snowflake