Course Summary:
Taught by a four-person team including two Stanford-educated ex-Googlers and two ex-Flipkart Lead Analysts. This team has decades of combined practical experience working with Java and with billions of rows of data.
Use Flume and Sqoop to import data to HDFS, HBase and Hive from a variety of sources, including Twitter and MySQL
Let’s parse that.
Import data: Flume and Sqoop play a special role in the Hadoop ecosystem. They transport data from sources that hold or produce it, such as local file systems, HTTP endpoints, MySQL and Twitter, to data stores like HDFS, HBase and Hive. Both tools come with built-in functionality and abstract away the complexity of moving data between these systems.
Flume: Flume Agents can transport data produced by a streaming application to data stores like HDFS and HBase.
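As a taste of what configuring a Flume Agent looks like, here is a minimal sketch of an agent that watches a spooling directory and writes to HDFS (similar in spirit to the course's Spool-to-HDFS example). The agent name, paths and capacities below are illustrative assumptions, not values from the course:

```properties
# Sketch of a Flume agent config (a Java properties file); names and paths are illustrative.
agent1.sources = spool-src
agent1.channels = mem-ch
agent1.sinks = hdfs-sink

# Source: watch a local spooling directory for newly arrived files
agent1.sources.spool-src.type = spooldir
agent1.sources.spool-src.spoolDir = /var/flume/spool
agent1.sources.spool-src.channels = mem-ch

# Channel: buffer events in memory between source and sink
agent1.channels.mem-ch.type = memory
agent1.channels.mem-ch.capacity = 10000

# Sink: write events out to HDFS as plain text
agent1.sinks.hdfs-sink.type = hdfs
agent1.sinks.hdfs-sink.hdfs.path = hdfs://localhost:8020/flume/events
agent1.sinks.hdfs-sink.hdfs.fileType = DataStream
agent1.sinks.hdfs-sink.channel = mem-ch
```

Such a config is typically launched with something like `flume-ng agent --conf conf --conf-file agent1.conf --name agent1`, assuming a working Flume and HDFS installation.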
Sqoop: Use Sqoop to bulk-import data from traditional RDBMSs into Hadoop storage architectures like HDFS or Hive.
Practical implementations for a variety of sources and data stores ..
- Sources: Twitter, MySQL, Spooling Directory, HTTP
- Sinks: HDFS, HBase, Hive
.. Flume features:
Flume Agents, Flume Events, Event bucketing, Channel selectors, Interceptors
.. Sqoop features:
Sqoop import from MySQL, Incremental imports using Sqoop Jobs
What am I going to get from this course?
- Use Flume to ingest data to HDFS and HBase
- Use Sqoop to import data from MySQL to HDFS and Hive
- Ingest data from a variety of sources including HTTP, Twitter and MySQL
Prerequisites:
- Knowledge of HDFS is a prerequisite for the course
- HBase and Hive examples assume a basic understanding of the HBase and Hive shells
- Most examples require a working HDFS installation
Target Audience:
- Yep! Engineers building an application with HDFS/HBase/Hive as the data store
- Yep! Engineers who want to port data from legacy data stores to HDFS
Section 1 - You, This Course and Us
You, This Course and Us (01:46)
DOWNLOAD SECTION 1 Flume-Sqoop
Section 2 - Why do we need Flume and Sqoop?
Why do we need Flume and Sqoop?
Section 3 - Flume
Flume Agent - the basic unit of Flume
DOWNLOAD Flume Examples
Example 1 : Spool to Logger
Flume Events are how data is transported
Example 2 : Spool to HDFS
Example 3: HTTP to HDFS
Example 4: HTTP to HDFS with Event Bucketing
Example 5: Spool to HBase
Example 6: Using multiple sinks and Channel selectors
Example 7: Twitter Source with Interceptors
Section 4 - Sqoop
Example 8: Sqoop Import from MySQL to HDFS
Example 9: Sqoop Import from MySQL to Hive
Example 10: Incremental Imports using Sqoop Jobs
Loonycorn: a four-person team; ex-Google.
Loonycorn is us: Janani Ravi, Vitthal Srinivasan, Swetha Kolalapudi and Navdeep Singh. Between the four of us, we have studied at Stanford, IIM Ahmedabad and the IITs, and have spent years (decades, actually) working in tech in the Bay Area, New York, Singapore and Bangalore.
- Janani: 7 years at Google (New York, Singapore); studied at Stanford; also worked at Flipkart and Microsoft
- Vitthal: also Google (Singapore) and studied at Stanford; Flipkart, Credit Suisse and INSEAD too
- Swetha: early Flipkart employee; IIM Ahmedabad and IIT Madras alum
- Navdeep: longtime Flipkart employee too; IIT Guwahati alum
We think we might have hit upon a neat way of teaching complicated tech courses in a funny, practical, engaging way, which is why we are so excited to be here on Unanth! We hope you will try our offerings, and think you'll like them :-)