Flume and Sqoop for Ingesting Big Data

This Flume and Sqoop online course for ingesting Big Data covers the basics of data transportation in Hadoop. Flume and Sqoop are two tools used to draw data from different sources and load it into Hadoop data stores.
  • Price: 2999
  • Course type: Online Course
  • Language: English
  • Provider: Loonycorn

About course

  • Use Flume and Sqoop to import data to HDFS, HBase and Hive from a variety of sources, including Twitter and MySQL

    Import data: Flume and Sqoop play a special role in the Hadoop ecosystem. They transport data from sources that hold or produce it, such as local file systems, HTTP, MySQL and Twitter, to data stores like HDFS, HBase and Hive. Both tools come with built-in functionality and abstract away the complexity of moving data between these systems.

    Flume: Flume Agents can transport data produced by a streaming application to data stores like HDFS and HBase. 
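    As a concrete illustration of what a Flume Agent looks like, here is a minimal sketch of an agent configuration: a spooling-directory source feeding an HDFS sink through a memory channel. The agent name, directory and HDFS path are hypothetical placeholders, not taken from the course; the property names are standard Flume settings.

        # Hypothetical agent 'a1': spooldir source -> memory channel -> HDFS sink
        a1.sources = src1
        a1.channels = ch1
        a1.sinks = snk1

        # Source: watch a local directory for newly arriving files (path is an example)
        a1.sources.src1.type = spooldir
        a1.sources.src1.spoolDir = /var/log/incoming
        a1.sources.src1.channels = ch1

        # Channel: buffer events in memory between source and sink
        a1.channels.ch1.type = memory

        # Sink: write events into HDFS, bucketed by date via path escape sequences
        a1.sinks.snk1.type = hdfs
        a1.sinks.snk1.hdfs.path = /flume/events/%Y-%m-%d
        a1.sinks.snk1.hdfs.fileType = DataStream
        a1.sinks.snk1.hdfs.useLocalTimeStamp = true
        a1.sinks.snk1.channel = ch1

    Such an agent would be started with Flume's standard launcher, e.g. flume-ng agent --conf conf --conf-file flume.conf --name a1.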

    Sqoop: Use Sqoop to bulk-import data from a traditional RDBMS into Hadoop storage like HDFS or Hive.
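    To sketch what such a bulk import looks like on the command line (host, database, credentials and table below are placeholders, not course material):

        # Pull the 'orders' table from MySQL into an HDFS directory; -P prompts for the password
        sqoop import \
          --connect jdbc:mysql://dbhost/shop \
          --username dbuser -P \
          --table orders \
          --target-dir /user/hadoop/orders

    Adding --hive-import to the same command would land the table in Hive instead of a bare HDFS directory.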

    What is Covered:

    Practical implementations for a variety of sources and data stores:

    • Sources: Twitter, MySQL, Spooling Directory, HTTP
    • Sinks: HDFS, HBase, Hive

    Flume features:

    Flume Agents, Flume Events, Event bucketing, Channel selectors, Interceptors
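    As a flavour of two of these features, the sketch below adds a timestamp interceptor (whose header drives event bucketing in date-based sink paths) and a multiplexing channel selector that routes events by a header value. Agent, channel and header names are hypothetical.

        # Timestamp interceptor: stamps each event with a 'timestamp' header
        a1.sources.src1.interceptors = ts
        a1.sources.src1.interceptors.ts.type = timestamp

        # Multiplexing channel selector: route on the value of a custom 'logtype' header
        # (assumes the source declares both channels: a1.sources.src1.channels = ch1 ch2)
        a1.sources.src1.selector.type = multiplexing
        a1.sources.src1.selector.header = logtype
        a1.sources.src1.selector.mapping.error = ch1
        a1.sources.src1.selector.default = ch2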

    Sqoop features:

    Sqoop import from MySQL, Incremental imports using Sqoop Jobs
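    An incremental import via a saved Sqoop Job might look like the following sketch (all identifiers are placeholders). Sqoop stores the last imported value in its metastore and resumes from it on every execution:

        # Create a saved job that appends only rows whose 'id' exceeds the stored last-value
        sqoop job --create orders_incr -- import \
          --connect jdbc:mysql://dbhost/shop \
          --username dbuser -P \
          --table orders \
          --target-dir /user/hadoop/orders \
          --incremental append \
          --check-column id \
          --last-value 0

        # Each run imports only new rows and updates the stored last-value
        sqoop job --exec orders_incr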

     

Topics covered in this course

What will you get from this course?

    • Use Flume to ingest data to HDFS and HBase
    • Use Sqoop to import data from MySQL to HDFS and Hive
    • Ingest data from a variety of sources including HTTP, Twitter and MySQL

Who should buy this course?

    • Knowledge of HDFS is a prerequisite for the course
    • The HBase and Hive examples assume a basic understanding of the HBase and Hive shells
    • You'll need a working HDFS installation to run most of the examples
    • Engineers building an application with HDFS/HBase/Hive as the data store
    • Engineers who want to port data from legacy data stores to HDFS
 
 
Provided by

Loonycorn is us, Janani Ravi, Vitthal Srinivasan, Swetha Kolalapudi and Navdeep Singh. Between the four of us, we have studied at Stanford, IIM Ahmedabad, the IITs and have spent years (decades, actually) working in tech, in the Bay Area, New York, Singapore and Bangalore.
