Do you use Localytics and want to move your data from Localytics to Redshift for further analysis? This blog is aimed at helping you achieve that data load. It discusses two methods of loading data from Localytics to Redshift so you can weigh the benefits and drawbacks of each approach. Read on.

Introduction to Localytics


Localytics is an engagement platform that offers tools and insights to marketers. It helps users acquire, engage, grow, and retain customers. Its mission is to deliver meaningfulness and personalization in every experience by putting people at the center of brands’ digital engagements. Localytics aims to be the core behind all digital engagement, helping brands build deeper, more successful, and more meaningful relationships.

Deeper, more successful, and more meaningful relationships mean marketing and messaging that enhance users’ lives: personalized engagements that recognize that every customer is different. This builds relationships by understanding who each customer is and what they need.

Introduction to Redshift


AWS Redshift is a data warehouse built on an MPP (massively parallel processing) architecture and managed by Amazon Web Services (AWS). It can store large volumes of data for advanced analytics. Due to its ability to run complex analytical workloads in a cost-efficient way, it has become a popular data warehouse for modern data teams.

Click here for more information about Amazon Redshift.

Methods to Migrate Data from Localytics to Redshift

Method 1: Loading Data by Building Custom Scripts

This approach requires a large investment in your engineering team. Your engineers would need to understand both Localytics and AWS Redshift and program a custom solution: data is extracted from Localytics, prepared for transfer, and loaded into Redshift through manually written custom scripts.

Method 2: Loading Data using a No-code Data Pipeline, Hevo

Hevo Data provides a hassle-free solution and helps you directly transfer data from Localytics to Redshift and numerous other Databases/Data Warehouses or destinations of your choice instantly, without having to write any code. Hevo comes with a graphical interface that allows you to configure your Localytics source and load data in real-time. Hevo is fully managed and completely automates the process of not only loading data from your desired source but also enriching the data and transforming it into an analysis-ready form without having to write a single line of code. Hevo’s pre-built integrations with 100+ data sources (including 40+ free data sources) will take full charge of the data transfer process, allowing you to focus on key business activities.

Get Started with Hevo for Free

Methods to Migrate Data from Localytics to Redshift

Following are the two popular methods to perform Localytics to Redshift data migration:

Method 1: Loading Data by Building Custom Scripts

Before loading data manually from Localytics to Redshift, ensure you have the following prerequisites in place:


  1. A Localytics account with your preferred plan.
  2. A decent understanding of RESTful APIs. Explore the Localytics developer documentation here.
  3. A fully set-up Redshift data warehouse.

Step 1: Extracting and Transforming Data from Localytics

The Localytics API and the backend services that power it are organized around dimensions, metrics, and conditions.

Metrics are sums or counts that form the basis of your query. Some examples of metrics include sessions, users, occurrences, etc. View more examples of metrics here.

Dimensions are categories used to break down data into various groups. Some examples of Dimensions include event_name, attribute_name, custom_dimension_name, etc. View more examples of dimensions here.

Conditions are requirements applied to one or more of the dimensions in a query. They are specified as a hash keyed by dimension name. Some examples of conditions are in, not_in, between, is_null, etc. View more examples of conditions here.
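Putting these three concepts together, a query is essentially a set of metrics, grouped by dimensions, filtered by conditions. The sketch below shows what such a query payload might look like; the app_id value and the specific metric, dimension, and condition choices are illustrative placeholders, not values from the Localytics docs.

```python
import json

# A hypothetical query payload combining metrics, dimensions, and conditions.
# The app_id and the date range below are illustrative placeholders.
query = {
    "app_id": "your-app-id",           # placeholder -- your real app ID
    "metrics": ["sessions", "users"],  # the sums/counts to compute
    "dimensions": ["day"],             # how to break the results down
    "conditions": {
        # conditions are a hash keyed by dimension name
        "day": ["between", "2023-01-01", "2023-01-31"],
    },
}

print(json.dumps(query, indent=2))
```

Serializing the payload with json.dumps is a convenient way to inspect exactly what would be sent to the API.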

The Localytics API is RESTful. For security purposes, unencrypted HTTP is not supported by the Localytics API. Users are authenticated with an API key and API secret combination. These parameters can be passed to the API in one of two ways:

  1. HTTP Basic authentication, which is generally supported by HTTP clients. The API key and API secret are passed as the HTTP Basic username and password, respectively. 
  2. The API key and API secret may also be passed as api_key and api_secret request parameters. 
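The two options above can be sketched with the Python standard library alone. The credentials and the API root below are placeholders, and the second option's parameter names (api_key, api_secret) are taken from the description above:

```python
import base64
from urllib.parse import urlencode

# Placeholder credentials -- substitute your own API key/secret.
API_KEY = "my_api_key"
API_SECRET = "my_api_secret"

# Option 1: HTTP Basic authentication.
# The key is the Basic username, the secret is the Basic password.
token = base64.b64encode(f"{API_KEY}:{API_SECRET}".encode()).decode()
headers = {"Authorization": f"Basic {token}"}

# Option 2: pass the credentials as api_key / api_secret parameters.
# The API root here is a placeholder -- use the root URL from the
# Localytics developer documentation.
params = urlencode({"api_key": API_KEY, "api_secret": API_SECRET})
url = f"https://<localytics-api-root>/v1/apps?{params}"
```

Either the headers dict or the parameterized URL can then be used with any HTTP client that supports HTTPS.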

Use a GET request to pull the necessary data from a Localytics endpoint:

GET /v1/apps/:app_id 

By default, responses from the Localytics API are in JSON format. Below is an example: the response is a JSON object representing an app, including its name, id, creation time, etc.

  "name": "App Name",
  "app_id": "...",
  "stats": {
    "sessions": 52291,
    "closes": 46357,
    "users": 7008,
    "events": 865290,
    "data_points": 963938,
    "platforms": ["HTML5"],
    "client_libraries": ["html5_2.6", "html5_2.5", "html5_2.4"],
    "begin_date": "2013-08-10",
    "end_date": "2013-09-10"
  "icon_url": "",
  "custom_dimensions": {
    "custom_0_dimension": "Blood Type",
    "custom_1_dimension": "Moon Phase"
"created_at": "2012-04-10T04:07:13Z",
  "_links": {
    "self": { "href": "/v1/apps/..." },
    "apps": { "href": "/v1/apps" },
    "query": {
      "templated": true,
      "href": "/v1/apps/.../query{?app_id,metrics,dimensions,conditions,limit,order,days,comment,translate}"
    "root": { "href": "/v1" }


If you wish to get the results in CSV format, specify text/csv in the Accept request header.
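If you stay with the default JSON, you still need to flatten the nested response into rows before it can be loaded into Redshift. The sketch below works on an abridged copy of the sample response above; the app_id value is an illustrative placeholder (the real one is elided in the sample), and the chosen column set is an assumption:

```python
import csv
import io
import json

# Abridged copy of the sample JSON response shown above.
# "abc123" is a placeholder app_id for illustration.
response = json.loads("""
{
  "name": "App Name",
  "app_id": "abc123",
  "stats": {"sessions": 52291, "users": 7008, "events": 865290}
}
""")

# Flatten the nested "stats" object into a single analysis-ready row.
row = {
    "app_id": response["app_id"],
    "name": response["name"],
    "sessions": response["stats"]["sessions"],
    "users": response["stats"]["users"],
    "events": response["stats"]["events"],
}

# Write the row as CSV -- the shape you would stage in S3 for loading.
buf = io.StringIO()
writer = csv.DictWriter(buf, fieldnames=row.keys())
writer.writeheader()
writer.writerow(row)
print(buf.getvalue())
```

The same flattening applies to whichever metrics and dimensions your query returns.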

Step 2: Loading the Transformed Data into Redshift

Identify all the columns you need to insert and use the Redshift CREATE TABLE command to create a target table.

Using the Redshift INSERT command is not the right choice because it inserts data row by row. This slows the process, as Redshift is not designed to load data that way. Instead, load the data into Amazon S3 and use the COPY command to load it into Redshift.
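As a sketch, the two statements could look like the following. The table name, column set, S3 path, and IAM role ARN are all hypothetical placeholders you would replace with your own; the columns mirror the sample response fields discussed earlier.

```python
# Hypothetical target table -- column names/types are assumptions
# based on the sample Localytics response fields.
create_table_sql = """
CREATE TABLE localytics_app_stats (
    app_id    VARCHAR(64),
    name      VARCHAR(256),
    sessions  BIGINT,
    users     BIGINT,
    events    BIGINT
);
"""

# COPY bulk-loads the staged file from S3 instead of row-by-row INSERTs.
# The bucket path and IAM role ARN below are placeholders.
copy_sql = """
COPY localytics_app_stats
FROM 's3://my-bucket/localytics/app_stats.csv'
IAM_ROLE 'arn:aws:iam::123456789012:role/MyRedshiftRole'
FORMAT AS CSV
IGNOREHEADER 1;
"""
```

You would run these statements against your cluster with any Redshift-compatible SQL client; IGNOREHEADER 1 skips the CSV header row written during the extraction step.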

Limitations of Building Custom Scripts

  • Accessing Localytics Data in Real-time: Once you have successfully created a program that loads data from Localytics to Redshift, you still face the challenge of loading new and updated data. You could replicate the data every time a new record is created, but this process is slow and resource-intensive. 
  • Infrastructure Maintenance: Any written code needs to be constantly monitored because Localytics modifies or updates its API regularly. Therefore, you will need to invest in a team of engineers for this job.

Method 2: Loading Data using a No-code Data Pipeline, Hevo

A simpler, elegant, and hassle-free alternative is to use Hevo, a No-code Data Pipeline, and an official AWS Technology Partner that can easily load data from Localytics to Redshift in real-time. Hevo provides an easy-to-use visual interface that allows you to load data in two easy steps:

Step 1: Authenticate and Connect Localytics Data Source

Hevo can bring raw data from Localytics into your Destinations for deeper analytics. If you are a Localytics enterprise customer, you can get in touch with them to enable raw JSON exports into an Amazon S3 bucket.

Once you have done that, you can use Hevo’s S3 Source to replicate the raw data into your destinations. Read more about setting up a Pipeline with S3 Source.

Step 2: Configure the Redshift Data Warehouse where you need to Move the Data

Hevo can load data from any of your Pipelines into an Amazon Redshift data warehouse. You can set up the Redshift Destination on the fly, as part of the Pipeline creation process, or independently. The ingested data is first staged in Hevo’s S3 bucket before it is batched and loaded to the Amazon Redshift Destination.


That is all! Now, relax and watch your data move from Localytics to Redshift.

Advantages of Using Hevo

  • Minimal Setup: The process involved in setting up Hevo is simple to follow and easy to execute.
  • Completely Managed Platform: Hevo is fully managed. You need not invest any time and effort to maintain or monitor the infrastructure.
  • Real-time Data Capture: Hevo will do all the grunt work and ensure that your Localytics data is always up-to-date in Redshift.
  • More Data Sources: Hevo can bring data from not just Localytics but from 100+ data sources. This ensures that you can scale your data pipeline at any time, at will.
  • Ability to Transform Data: Hevo allows you to transform the data before and after transferring it to Redshift. Data is ever ready for analysis with Hevo on your side.
  • 24×7 Customer Support: Hevo provides you with impeccable support around the clock over call, email, and chat.
Sign up here for a 14-Day Free Trial!


Conclusion

The article explained to you the two methods of connecting Localytics to Redshift in a step-by-step manner. It also discussed the limitations of writing custom scripts to set up your ETL process. If you have no qualms about facing those limitations then you can try the manual ETL setup.

Hevo Data on the other hand can simplify your task by eliminating the need to write any code. It will automate the process of data transfer from Localytics to Redshift and provide you with a hassle-free experience. Hevo provides granular logs that allow you to monitor the health and flow of your data. 

Visit our Website to Explore Hevo

In addition to Localytics, Hevo can load data from a multitude of 100+ other data sources including Databases, Cloud Applications, SDKs, and more. This allows you to scale up your data infrastructure on demand and start moving data from all the applications important for your business.

Want to take Hevo for a spin? Sign Up here for a 14-day free trial and experience the feature-rich Hevo suite first hand. You can also have a look at our unbeatable pricing that will help you choose the right plan for your business needs!

We would be thrilled to know what you think of the methods to transfer data from Localytics to Redshift detailed in this article. Let us know your thoughts and your preferred way of moving data in the comments section below.

Freelance Technical Content Writer, Hevo Data

Eva loves learning about data science, with an intense passion for writing on data, software architecture, and related topics. She enjoys creating an impact through content tailored for data teams, aimed at resolving intricate business problems.
