Flume and Sqoop for Ingesting Big Data


Lectures: 19
Language: English
Students: 0
Category: Development
Sub-Category: Database

15-day money-back guarantee

Unlimited Access for 1 year

Android, iPhone and iPad Access

Certificate of Completion

Course Summary:

Taught by a 4-person team including 2 Stanford-educated ex-Googlers and 2 ex-Flipkart Lead Analysts. This team has decades of practical experience working with Java and with billions of rows of data.

Use Flume and Sqoop to import data to HDFS, HBase and Hive from a variety of sources, including Twitter and MySQL

Let’s parse that.

Import data: Flume and Sqoop play a special role in the Hadoop ecosystem. They transport data from sources that hold or produce it, such as local file systems, HTTP endpoints, MySQL and Twitter, to data stores like HDFS, HBase and Hive. Both tools come with built-in functionality and shield users from the complexity of moving data between these systems.

Flume: Flume Agents can transport data produced by a streaming application to data stores like HDFS and HBase. 
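
As a hedged illustration of what a Flume Agent looks like in practice, the properties file below wires a Spooling Directory source to an HDFS sink through a memory channel. The agent name (a1), the spool directory and the HDFS URL are made-up placeholders, not values from the course material:

    # spool-to-hdfs.conf -- illustrative agent definition (names and paths are placeholders)
    a1.sources  = src1
    a1.channels = ch1
    a1.sinks    = snk1

    # Spooling Directory source: picks up files dropped into a local directory
    a1.sources.src1.type     = spooldir
    a1.sources.src1.spoolDir = /var/log/incoming
    a1.sources.src1.channels = ch1

    # Memory channel buffers events between source and sink
    a1.channels.ch1.type     = memory
    a1.channels.ch1.capacity = 10000

    # HDFS sink writes the events out as plain files
    a1.sinks.snk1.type          = hdfs
    a1.sinks.snk1.hdfs.path     = hdfs://localhost:9000/flume/spool
    a1.sinks.snk1.hdfs.fileType = DataStream
    a1.sinks.snk1.channel       = ch1

The agent is then started with flume-ng agent --conf conf --conf-file spool-to-hdfs.conf --name a1.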

Sqoop: Use Sqoop to bulk import data from traditional RDBMS to Hadoop storage architectures like HDFS or Hive. 
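
For instance, a single Sqoop command can pull an entire MySQL table into HDFS in parallel. The connection string, credentials and table name below are placeholders, assumed for illustration only:

    # Bulk-import one MySQL table into an HDFS directory using 4 parallel map tasks
    sqoop import \
      --connect jdbc:mysql://localhost/retail_db \
      --username sqoop_user -P \
      --table customers \
      --target-dir /user/hadoop/customers \
      --num-mappers 4

Adding --hive-import (and optionally --hive-table) makes Sqoop load the same data into a Hive table instead of a bare HDFS directory.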

What's Covered:

Practical implementations for a variety of sources and data stores:

  • Sources: Twitter, MySQL, Spooling Directory, HTTP
  • Sinks: HDFS, HBase, Hive

Flume features:

Flume Agents, Flume Events, Event bucketing, Channel selectors, Interceptors
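
To give a flavour of two of these features, the snippet below sketches event bucketing via time escape sequences in an HDFS sink path, together with a timestamp interceptor on the source that stamps each event so the sink can bucket it. The agent, source and sink names are the same illustrative placeholders as above:

    # Timestamp interceptor adds a timestamp header to every event
    a1.sources.src1.interceptors         = ts
    a1.sources.src1.interceptors.ts.type = timestamp

    # Event bucketing: the sink expands %Y/%m/%d/%H from the event's timestamp header,
    # so events land in one HDFS directory per hour
    a1.sinks.snk1.hdfs.path = hdfs://localhost:9000/flume/events/%Y/%m/%d/%H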

Sqoop features:

Sqoop import from MySQL, Incremental imports using Sqoop Jobs
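
A hedged sketch of an incremental import: a saved Sqoop Job remembers the last value it imported, so re-running it fetches only new rows. The job name, table and check column are assumptions for illustration:

    # Create a saved job that appends only rows whose id exceeds the stored last-value
    sqoop job --create customers_incremental -- import \
      --connect jdbc:mysql://localhost/retail_db \
      --username sqoop_user -P \
      --table customers \
      --target-dir /user/hadoop/customers \
      --incremental append \
      --check-column id \
      --last-value 0

    # Each execution imports only the new rows and updates the stored last-value
    sqoop job --exec customers_incremental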

What am I going to get from this course?

  • Use Flume to ingest data to HDFS and HBase
  • Use Sqoop to import data from MySQL to HDFS and Hive
  • Ingest data from a variety of sources including HTTP, Twitter and MySQL

Prerequisites:

  • Knowledge of HDFS is a prerequisite for the course
  • HBase and Hive examples assume basic understanding of HBase and Hive shells
  • A working HDFS installation is required to run most of the examples

Target Audience:

  • Yep! Engineers building an application with HDFS/HBase/Hive as the data store
  • Yep! Engineers who want to port data from legacy data stores to HDFS
Curriculum

Section 1 - You, This Course and Us
  1 : You, this course and Us (01:46)
  2 : DOWNLOAD SECTION 1 Flume-Sqoop

Section 2 - Why do we need Flume and Sqoop?
  3 : Why do we need Flume and Sqoop?

Section 3 - Flume
  4 : Installing Flume
  5 : DOWNLOAD FlumeInstall
  6 : Flume Agent - the basic unit of Flume
  7 : DOWNLOAD Flume Examples
  8 : Example 1 : Spool to Logger
  9 : Flume Events are how data is transported
  10 : Example 2 : Spool to HDFS
  11 : Example 3 : HTTP to HDFS
  12 : Example 4 : HTTP to HDFS with Event Bucketing
  13 : Example 5 : Spool to HBase
  14 : Example 6 : Using multiple sinks and Channel selectors
  15 : Example 7 : Twitter Source with Interceptors

Section 4 - Sqoop
  16 : Installing Sqoop
  17 : Example 8 : Sqoop Import from MySQL to HDFS
  18 : Example 9 : Sqoop Import from MySQL to Hive
  19 : Example 10 : Incremental Imports using Sqoop Jobs

Instructor:

Loonycorn - a 4-person team; ex-Google.

Biography

Loonycorn is us: Janani Ravi, Vitthal Srinivasan, Swetha Kolalapudi and Navdeep Singh. Between the four of us, we have studied at Stanford, IIM Ahmedabad and the IITs, and have spent years (decades, actually) working in tech in the Bay Area, New York, Singapore and Bangalore.

  • Janani: 7 years at Google (New York, Singapore); studied at Stanford; also worked at Flipkart and Microsoft
  • Vitthal: also Google (Singapore) and studied at Stanford; Flipkart, Credit Suisse and INSEAD too
  • Swetha: early Flipkart employee, IIM Ahmedabad and IIT Madras alum
  • Navdeep: longtime Flipkart employee too, and IIT Guwahati alum

We think we might have hit upon a neat way of teaching complicated tech courses in a funny, practical, engaging way, which is why we are so excited to be here on Unanth! We hope you will try our offerings, and think you'll like them :-)
