Load data from Google Sheets to AWS RDS using Fivetran


Hey Everyone !! Hope you are doing well :)

So finally, after a long break, I wanted to document some work I did recently, in the hope that it helps people doing something similar.

Cutting to the chase: this was my first time with Google Sheets & AWS, so even this simple pipeline was new territory for me. If it's your first time too, I'm gonna make sure it's easier for you :)

Some definitions —

Google Sheets — Similar to MS Excel, which we have all used since primary classes. It is widely used for storing data.

AWS RDS (Relational Database Service) — A managed relational database service provided by AWS.

Fivetran — an ELT tool that is becoming more popular every day.

These are all the services which Fivetran provides

The one I am using from the above image is: Extract + Load

So I am using the inbuilt connectors which Fivetran provides to load data from one place to another without any hassle. Let’s start !!

Step 1: Sign up on Fivetran through your work account (assuming this is not a personal project)

Step 2: Select the destination where you want to create your connector. Below is the list of all the destinations which Fivetran supports.

p.s. one user account can have multiple destinations

Fivetran destinations

My destination was PostgreSQL.
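Before Fivetran can write anywhere, your destination database needs a user for it to log in as. Here is a minimal sketch for PostgreSQL (db_name and the password are placeholders you replace with your own; Fivetran's own setup guide is the source of truth for what it requires):

-- create a dedicated database user for Fivetran (placeholder password)
create user fivetran with password 'choose-a-strong-password';
-- let it connect to the destination database
grant connect on database db_name to fivetran;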

Step 3: Once you select the destination, you will be able to see your dashboard like the image below.

p.s. your destination can also have 0 connectors if it's newly created. A destination is nothing but the validated credentials for your final data location.

Fivetran Dashboard

Step 4: Go to the Connectors option on the left tab and click on the Add Connector button (refer to the image above)

Step 5: You will see a list of available data sources for which you can create a connector. Choose your data source type (for me it was Google Sheets) and click on Continue Setup

Step 6: Here you need to make sure you have all these details handy for the final step —

  • Destination schema name
  • Destination table name
  • Google Sheet URL
  • A named range defined in your Google Sheet (p.s. — this defines exactly which cells get loaded into the destination; e.g. a range covering A1:F500 loads only those rows and columns)

The connection page looks like this —

Once you enter the URL of your Google Sheet, it will ask you to select from the named ranges available in the sheet (if you have defined any)

Step 7: For the authentication method, choose whichever suits you. I didn't want to give access to all my Google Sheets, so I selected the first one.

Step 8: Grant Fivetran read-only access to your Google Sheet — copy the email shown and share your sheet with it (go to your sheet -> click on the Share option -> add the email -> done !)

Step 9: Once all the fields on the setup page are populated, click on Save & Test. Your setup will begin & Fivetran will start validating the sheet and named range. It may take up to a minute.

Step 10: Once the validation is done, you need to start the initial sync. Select the option & wait for your data to be synced to your destination. Done, finally !!
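Once the sync finishes, a quick sanity check in the destination never hurts. Something like this in PostgreSQL (schema_name and dest_table are the placeholder names from the setup; Fivetran also adds a _fivetran_synced timestamp column you can eyeball):

-- count the rows that landed (placeholder names)
select count(*) from schema_name.dest_table;
-- peek at a few records, including the _fivetran_synced column
select * from schema_name.dest_table limit 5;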

Note 1 — You may face issues with permissions, so make sure Fivetran has access to the destination table where you want to load the data. Without permissions the sync will fail 100%.

I gave access to all these for my connector to work —

-- full access on the target table, its schema, and the database
grant all on table schema_name.dest_table to fivetran;
grant all on schema schema_name to fivetran;
grant all on database db_name to fivetran;
-- make the fivetran user the owner of the table
alter table schema_name.dest_table owner to fivetran;
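grant all is the blunt-instrument fix. If you'd rather keep privileges narrower, a sketch like this should cover the basics, though I haven't verified the exact minimum set Fivetran needs:

-- a narrower alternative (untested minimum, adjust to taste)
grant usage, create on schema schema_name to fivetran;
grant select, insert, update, delete, truncate on table schema_name.dest_table to fivetran;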

Note 2 — If you don't have a table created in your destination, Fivetran will create one for you with its own schema, and it will be the owner of that table.
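If you want to confirm who owns the table after a sync, PostgreSQL's pg_tables catalog will tell you (same placeholder names as above):

-- check who owns the destination table (placeholder names)
select schemaname, tablename, tableowner
from pg_tables
where schemaname = 'schema_name' and tablename = 'dest_table';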

For creating a named range —

  • Go to your sheet, click on Data & select Named ranges
  • A side panel will open with options like this —
  • Click on Add a range, select the rectangular selector box on the right, & then drag over all the columns you want to load into your destination (p.s. you can also load just a selected number of records or attributes)
  • Save the selected range. Done here !!

That’s all folks !! Happy learning :)
