Do you use Localytics and are you looking for a way to move your data from Localytics to Redshift for further analysis? This blog is aimed at helping you achieve that data load. It discusses two methods of loading data from Localytics to Redshift so you can weigh the benefits and drawbacks of each approach. Read on.

Introduction to Localytics

Localytics is an engagement platform that offers tools and insights to marketers. It helps users acquire, engage, grow, and retain customers. Its mission is to deliver meaning and personalization in every experience by putting people at the center of brands’ digital engagements. Localytics aims to be the core behind all digital engagement, helping brands build deeper, more successful, and more meaningful relationships.

Deeper, more successful, and more meaningful relationships mean marketing and messaging that enhance users’ lives: personalized engagements that recognize that no two customers are the same. Such relationships are built by understanding who each customer is and what they need.

Introduction to Redshift


AWS Redshift is a data warehouse built on a massively parallel processing (MPP) architecture and managed by Amazon Web Services (AWS). It can store large volumes of data for advanced analytics. Due to its ability to run complex analytical workloads cost-efficiently, it has become a popular data warehouse for modern data teams.

Methods to Migrate Data from Localytics to Redshift

Following are the 2 popular methods to perform Localytics to Redshift data migration:


Method 1: Loading Data by Building Custom Scripts

This approach requires a significant investment in your engineering team. Your engineers would need to understand both Localytics and AWS Redshift and program a custom solution: data is extracted from Localytics, prepared for transfer, and loaded into Redshift via hand-written scripts.

Method 2: Loading Data using a No-code Data Pipeline, Hevo

Hevo Data provides a hassle-free solution and helps you directly transfer data from Localytics to Redshift and numerous other Databases/Data Warehouses or destinations of your choice instantly without having to write any code. Hevo allows you to configure your Localytics source and load data in real-time.


Method 1: Loading Data by Building Custom Scripts

Broadly, the process of loading data manually from Localytics to Redshift involves the following steps:

Prerequisites

  1. A Localytics account with your preferred plan.
  2. A decent understanding of RESTful APIs. Explore the Localytics developer documentation here.
  3. A fully set-up Redshift data warehouse.

Step 1: Extracting and Transforming Data from Localytics

The Localytics API and the backend services that power it are organized around dimensions, metrics, and conditions.

Metrics are sums or counts that form the basis of your query. Some examples of metrics include sessions, users, occurrences, etc. View more examples of metrics here.

Dimensions are categories used to break down data into various groups. Some examples of Dimensions include event_name, attribute_name, custom_dimension_name, etc. View more examples of dimensions here.

Conditions are requirements applied to one or more of the dimensions in a query. You specify conditions as a hash keyed by dimension name. Some examples of condition operators are in, not_in, between, is_null, etc. View more examples of conditions here.
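Putting the three concepts together, a query combines metrics, dimensions, and conditions. The sketch below shows one plausible shape for such a query payload; the parameter names (app_id, metrics, dimensions, conditions) appear in the API's templated query URL, but the exact condition value format shown here (operator first, then its arguments) is an assumption for illustration, so check the Localytics query documentation for the precise shape.

```python
# Hypothetical query payload combining metrics, dimensions, and conditions.
# "YOUR_APP_ID" is a placeholder; the condition format is an assumption.
query = {
    "app_id": "YOUR_APP_ID",           # which app to query
    "metrics": ["sessions", "users"],  # counts/sums to aggregate
    "dimensions": ["day"],             # break results down by day
    "conditions": {
        # hash keyed by dimension name; operator first, then its arguments
        "day": ["between", "2013-08-10", "2013-09-10"],
    },
}
```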

The Localytics API is RESTful, with a root URL of https://api.localytics.com/v1/. For security reasons, unencrypted HTTP is not supported by the Localytics API. Users are authenticated with an API key and API secret combination, which can be passed to the API in one of two ways:

  1. HTTP Basic authentication, which is supported by virtually all HTTP clients. The API key and API secret are passed as the HTTP Basic username and password, respectively.
  2. The API key and API secret may also be passed as api_key and api_secret request parameters.
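The two authentication options above can be sketched with the Python standard library. The key and secret values here are placeholders; in practice a client library such as requests builds the Basic header for you when you pass auth=(key, secret).

```python
import base64
from urllib.parse import urlencode

# Placeholder credentials -- substitute your own.
API_KEY, API_SECRET = "my-api-key", "my-api-secret"
BASE = "https://api.localytics.com/v1"

# Option 1: HTTP Basic -- the header value is base64("key:secret").
token = base64.b64encode(f"{API_KEY}:{API_SECRET}".encode()).decode()
auth_header = {"Authorization": f"Basic {token}"}

# Option 2: pass api_key / api_secret as request parameters instead.
url_with_params = f"{BASE}/apps?" + urlencode(
    {"api_key": API_KEY, "api_secret": API_SECRET}
)
```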

Use the GET command to pull necessary data from a Localytics endpoint.

GET /v1/apps/:app_id 

By default, responses from the Localytics API are in JSON format. Below is an example response for the app endpoint: it returns an object representing the app, including its name, ID, creation time, and summary statistics.

{
  "name": "App Name",
  "app_id": "...",
  "stats": {
    "sessions": 52291,
    "closes": 46357,
    "users": 7008,
    "events": 865290,
    "data_points": 963938,
    "platforms": ["HTML5"],
    "client_libraries": ["html5_2.6", "html5_2.5", "html5_2.4"],
    "begin_date": "2013-08-10",
    "end_date": "2013-09-10"
  },
  "icon_url": "https://example.com/app-icon.png",
  "custom_dimensions": {
    "custom_0_dimension": "Blood Type",
    "custom_1_dimension": "Moon Phase"
  },
  "created_at": "2012-04-10T04:07:13Z",
  "_links": {
    "self": { "href": "/v1/apps/..." },
    "apps": { "href": "/v1/apps" },
    "query": {
      "templated": true,
      "href": "/v1/apps/.../query{?app_id,metrics,dimensions,conditions,limit,order,days,comment,translate}"
    },
    "root": { "href": "/v1" }
  }
}

If you wish to get the results in CSV format instead, specify text/csv in the Accept request header.
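As a small sketch of the extraction step, the snippet below parses a trimmed copy of the sample JSON response above with the standard library and shows the Accept header you would send to get CSV instead. The fields come straight from the sample response; no network call is made here.

```python
import json

# A trimmed copy of the sample app-object response shown above.
raw = """
{
  "name": "App Name",
  "stats": {"sessions": 52291, "users": 7008, "events": 865290},
  "created_at": "2012-04-10T04:07:13Z"
}
"""

app = json.loads(raw)
summary = f'{app["name"]}: {app["stats"]["sessions"]} sessions'

# Header to request CSV output instead of JSON, per the note above.
csv_headers = {"Accept": "text/csv"}
```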

Step 2: Loading the Transformed Data into Redshift

Identify all the columns you need to insert and create a matching table in Redshift using the CREATE TABLE command.

Using Redshift's INSERT command is not the right choice here because it loads data row by row, which is slow: Redshift is optimized for bulk loads, not row-by-row writes. Instead, stage the data in Amazon S3 and use the COPY command to load it into Redshift.
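The Redshift side of the load can be sketched as two SQL statements: a table matching the columns you extracted, and a COPY that bulk-loads a CSV staged in S3. The table name, columns, bucket path, and IAM role ARN below are all hypothetical placeholders; substitute your own, then run the statements over any Redshift connection (for example with a psycopg2 cursor).

```python
# Hypothetical target table for the extracted Localytics stats.
CREATE_TABLE = """
CREATE TABLE IF NOT EXISTS localytics_app_stats (
    app_id   VARCHAR(64),
    day      DATE,
    sessions BIGINT,
    users    BIGINT,
    events   BIGINT
);
"""

# Bulk-load the staged CSV from S3; bucket path and role ARN are placeholders.
COPY_SQL = """
COPY localytics_app_stats
FROM 's3://my-bucket/localytics/app_stats.csv'
IAM_ROLE 'arn:aws:iam::123456789012:role/my-redshift-role'
FORMAT AS CSV
IGNOREHEADER 1;
"""
```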

Limitations of Building Custom Scripts

  • Accessing Localytics Data in Real-time: Suppose you have successfully created a program that loads data from Localytics to Redshift. You still face the challenge of loading new and updated data. You could replicate each new record as it is created, but this process is slow and resource-intensive.
  • Infrastructure Maintenance: Any written code needs to be constantly monitored because Localytics modifies or updates its API regularly. Therefore, you will need to invest in a team of engineers for this job.

Method 2: Loading Data using a No-code Data Pipeline, Hevo

A simpler, elegant, and hassle-free alternative is to use Hevo, a No-code Data Pipeline, and an official AWS Technology Partner that can easily load data from Localytics to Redshift in real-time. Hevo provides an easy-to-use visual interface that allows you to load data in two easy steps:

Step 1: Authenticate and Connect Localytics Data Source

Hevo can bring raw data from Localytics into your Destinations for deeper analytics. If you are a Localytics enterprise customer, you can get in touch with them to enable raw JSON exports into an Amazon S3 bucket.

Once you have done that, you can use Hevo’s S3 Source to replicate the raw data into your destinations. Read more about setting up a Pipeline with S3 Source.

Step 2: Configure the Redshift Data Warehouse where you need to Move the Data

Hevo can load data from any of your Pipelines into an Amazon Redshift data warehouse. You can set up the Redshift Destination on the fly, as part of the Pipeline creation process, or independently. The ingested data is first staged in Hevo’s S3 bucket before it is batched and loaded to the Amazon Redshift Destination.


That is all! Now, relax and watch your data move from Localytics to Redshift.

Conclusion

This article walked you through two methods of connecting Localytics to Redshift, step by step. It also discussed the limitations of writing custom scripts to set up your ETL process. If those limitations are acceptable for your use case, you can try the manual ETL setup.

Hevo Data on the other hand can simplify your task by eliminating the need to write any code. It will automate the process of data transfer from Localytics to Redshift and provide you with a hassle-free experience. Hevo provides granular logs that allow you to monitor the health and flow of your data. 

In addition to Localytics, Hevo can load data from a multitude of 150+ other data sources including Databases, Cloud Applications, SDKs, and more. This allows you to scale up your data infrastructure on demand and start moving data from all the applications important for your business.

Sign Up here for a 14-day free trial and experience the feature-rich Hevo suite first hand. You can also have a look at our unbeatable pricing that will help you choose the right plan for your business needs!

FAQ on Localytics to Redshift

How much does Localytics cost?

Localytics is priced according to data volume, with the Community edition being free, the Premium edition priced at $95 per month per app, the Enterprise edition starting at $895 per month for all apps, and the Enterprise Analytics and Marketing edition — which includes a marketing feature set — starting at $1,790 per month for all apps.

What is Localytics in Android?

Localytics is a mobile app analytics and marketing platform that provides insights into user behavior and engagement and offers tools for personalized marketing and push notifications on Android.

We would be thrilled to know what you think of the methods to transfer data from Localytics to Redshift detailed in this article. Let us know your thoughts and your preferred way of moving data in the comments section below.

Eva Brooke
Technical Content Writer, Hevo Data

Eva is passionate about data science and has a keen interest in writing on topics related to data, software architecture, and more. She takes pride in creating impactful content tailored for data teams and aims to solve complex business problems through her work.