Apache Sqoop Introduction

Updated On February 12, 2021 | By Mahesh Mogal

One reason the Hadoop ecosystem became popular is its ability to process many different forms of data. But not all data lives in HDFS, i.e., the Hadoop Distributed File System. We have been using relational databases to store and process structured data for a long time, so a lot of data still resides in RDBMS, and we need a tool to bring that data into HDFS. Sqoop solves this problem.

What is Sqoop

Apache Sqoop is an open-source tool that helps users transfer data between structured data sources and Hadoop. Using it, we can extract data from an RDBMS, load it into Hadoop, process it there, and store the results back in the RDBMS.

Sqoop uses MapReduce to import and export data, which gives it parallelism. When we submit a Sqoop command, Sqoop converts it into a MapReduce job and submits that job to the Hadoop cluster, which then executes it.
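As an illustration, a typical import command looks like the sketch below. The database, table, target directory, and credentials here are placeholders for illustration; the --num-mappers argument controls how many parallel map tasks Sqoop launches (by default it splits the table on its primary key).

    $ sqoop import \
        --connect jdbc:mysql://localhost:3306/employees \
        --username <user> \
        --password <password> \
        --table employees \
        --target-dir /user/hadoop/employees \
        --num-mappers 4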

Sqoop commands generate Java classes that describe the structure of the data being transferred. The generated Java source files are written to the directory from which we invoke the Sqoop command, and we can reuse them in our own code.
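We can also generate these classes without running an import, using the 'codegen' tool. A minimal sketch, reusing the "employees" database that appears later in this article (credentials are placeholders):

    $ sqoop codegen \
        --connect jdbc:mysql://localhost:3306/employees \
        --username <user> \
        --password <password> \
        --table employees

This writes a Java source file, named after the table, that models one row of the table.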

Sqoop Tools

We can use the command '$sqoop help' to display the list of available Sqoop tools.

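The output will look roughly like this; the exact list of tools depends on the Sqoop version installed:

    $ sqoop help
    usage: sqoop COMMAND [ARGS]

    Available commands:
      codegen            Generate code to interact with database records
      create-hive-table  Import a table definition into Hive
      eval               Evaluate a SQL statement and display the results
      export             Export an HDFS directory to a database table
      help               List available commands
      import             Import a table from a database to HDFS
      import-all-tables  Import tables from a database to HDFS
      job                Work with saved jobs
      list-databases     List available databases on a server
      list-tables        List available tables in a database
      merge              Merge results of incremental imports
      metastore          Run a standalone Sqoop metastore
      version            Display version information

    See 'sqoop help COMMAND' for information on a specific command.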

We can see all the options and configuration parameters of a specific tool by using '$sqoop help <tool-name>' or '$sqoop <tool-name> --help'.

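For example, to see the arguments accepted by the 'eval' tool, which we will use in the next step:

    $ sqoop help eval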

We can check connectivity with a relational database or run an ad-hoc query using the 'eval' tool.

For this, we need to give the JDBC connection URL of the database, along with the username and password that will be used to connect to it. As an example, we can use 'eval' to list all the tables in the "employees" database, which is already present in MySQL.

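A minimal sketch of that command, assuming MySQL is listening locally on its default port 3306; substitute your own credentials for the placeholders:

    $ sqoop eval \
        --connect jdbc:mysql://localhost:3306/employees \
        --username <user> \
        --password <password> \
        --query "SHOW TABLES"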

Here we have used localhost in the connection string because MySQL is on the same machine as Sqoop. If our MySQL database resides on another machine, we will need to give its address instead. We can pass the password directly on the command line using the --password argument, but this is risky in a production environment.

There is a better way to pass the connection string, username, and password: we can store all of this information in one options file and pass that file as an argument to the Sqoop command. Consider the following options file.
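A sketch of such a file, with placeholder credentials; Sqoop's options-file format puts each option and its value on separate lines, and lines starting with # are comments:

    # sqoop_connect.txt - connection details for the employees database
    --connect
    jdbc:mysql://localhost:3306/employees
    --username
    <user>
    --password
    <password>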

We can rewrite the eval command like this, using the options file.
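Assuming the options file above is saved as sqoop_connect.txt in the current directory:

    $ sqoop eval \
        --options-file sqoop_connect.txt \
        --query "SHOW TABLES"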

Simple, isn't it?

We have seen what Sqoop is and its basic commands. We will explore it in detail in the next few articles in this section.

