Hadoop Data Ingestion Techniques Training & Certification Courses

Guaranteed to Run Classes
Get Trained by Industry Experts
#1 Offshore IT Training Company

Overview

This course is intended for participants who want to learn data ingestion from different sources such as log files, streaming data, real-time data from social media, and more.

Need more info? Email info@koenig-solutions.com or enquire now!

Schedule & Prices

Delivery Mode | Location | Course Duration | Fees | Schedule
Instructor-Led Online Training (1-on-1) | Client's Home/Office | 5 Days | $ 1,700 | As per mutual convenience (4-hour evenings & weekends possible)
Classroom Training * | Dubai | 5 Days | $ 2,600 | On Request
Classroom Training * | Delhi, Bangalore, Dehradun (Rishikesh), Goa, Shimla, Chennai | 5 Days | $ 1,700 | 3-7 Dec 2018 (1 Seat Left)
Fly-Me-a-Trainer | Client's Location | 5 Days | On Request | As per mutual convenience
Need more clarity on schedule and prices? Email info@koenig-solutions.com or enquire now!

Course Content / Exam(s)

Schedule for Hadoop Data Ingestion Techniques

Course Name | Duration (days)
Hadoop Data Ingestion Techniques | 5

Course Prerequisites

A basic understanding of Linux commands.


Need more info? Email info@koenig-solutions.com or chat with the experts now.

Hadoop Data Ingestion Techniques Benefits

Upon completion of this course, you will have covered the following:

DAY 1

Course Overview

  •  What is Hadoop and HDFS?
  •  What is Data Ingestion?
  •  Use cases for Sqoop, Flume, and Kafka

Data Ingestion from an RDBMS using Sqoop

  •  What is Sqoop?
  •  How Sqoop works – a map-only job
  •  Sqoop tools (a minimal import sketch follows this list):
      •  list-databases
      •  list-tables
      •  eval
      •  import
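
These Sqoop tools are normally run from the shell (for example, sqoop import --connect ... --table ...). Purely as an illustrative sketch, the snippet below drives the same import tool from Java through Sqoop's runTool entry point; it assumes Sqoop 1.x and its Hadoop dependencies are on the classpath, and the JDBC URL, credentials, table, and target directory are placeholders.

```java
import org.apache.sqoop.Sqoop;

public class SqoopImportSketch {
    public static void main(String[] args) {
        // The same arguments you would pass to the "sqoop" command-line client;
        // connection details and paths below are placeholders.
        String[] importArgs = {
            "import",
            "--connect", "jdbc:mysql://dbhost:3306/retail_db",
            "--username", "retail_user",
            "--password", "secret",
            "--table", "customers",
            "--target-dir", "/user/training/customers",
            "--num-mappers", "4"
        };
        // runTool parses the arguments and launches the map-only MapReduce import job.
        int exitCode = Sqoop.runTool(importArgs);
        System.out.println("Sqoop import exited with code " + exitCode);
    }
}
```

Because the import runs as a map-only job, --num-mappers controls how many parallel tasks read slices of the source table.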

DAY 2

Sqoop Tools (continued)

  •  import-all-tables
  •  export
  •  job
  •  merge

Sqoop Use Cases

Capturing Data with Apache Flume

  •  What is Apache Flume?
  •  Basic Flume Architecture
  •  Flume Sources
  •  Flume Sinks
  •  Flume Channels
  •  Flume Configuration (see the sketch after this list)
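
In practice a Flume agent is described in a properties file that names its sources, channels, and sinks. Purely to illustrate the same source → channel → sink wiring in Java, here is a sketch based on Flume's embedded-agent API (assuming the flume-ng-embedded-agent module is available); the collector hostnames and ports are placeholders for downstream agents listening with Avro sources.

```java
import org.apache.flume.Event;
import org.apache.flume.agent.embedded.EmbeddedAgent;
import org.apache.flume.event.EventBuilder;

import java.nio.charset.StandardCharsets;
import java.util.HashMap;
import java.util.Map;

public class EmbeddedFlumeSketch {
    public static void main(String[] args) throws Exception {
        // A memory channel feeding two Avro sinks behind a load-balancing sink processor.
        // Hostnames and ports are placeholders for downstream Flume agents.
        Map<String, String> properties = new HashMap<>();
        properties.put("channel.type", "memory");
        properties.put("channel.capacity", "1000");
        properties.put("sinks", "sink1 sink2");
        properties.put("sink1.type", "avro");
        properties.put("sink1.hostname", "collector1.example.com");
        properties.put("sink1.port", "4545");
        properties.put("sink2.type", "avro");
        properties.put("sink2.hostname", "collector2.example.com");
        properties.put("sink2.port", "4545");
        properties.put("processor.type", "load_balance");

        EmbeddedAgent agent = new EmbeddedAgent("demo-agent");
        agent.configure(properties);
        agent.start();

        // Push one event through the channel towards the sinks.
        Event event = EventBuilder.withBody("hello flume", StandardCharsets.UTF_8);
        agent.put(event);

        agent.stop();
    }
}
```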

Getting Started with Flume

  •  Various Flume source configurations
  •  Various Flume sink configurations

Flume Configuration to Get Data from Twitter

DAY 3

Message Processing with Apache Kafka

  •  What is Apache Kafka?
  •  Apache Kafka Overview
  •  Scaling Apache Kafka
  •  Apache Kafka Cluster Architecture
  •  Apache Kafka Command Line Tools

Kafka Cluster Basics

  •  Working with the Kafka command-line tools
  •  Creating and managing topics (a programmatic sketch follows this list)
  •  Starting a producer
  •  Starting a consumer
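
Topics are created in class with the Kafka command-line tools. As a complementary sketch only, the snippet below performs the same create-and-list operations programmatically with the Kafka AdminClient; the bootstrap server, topic name, partition count, and replication factor are placeholders suited to a single-broker lab machine.

```java
import org.apache.kafka.clients.admin.AdminClient;
import org.apache.kafka.clients.admin.AdminClientConfig;
import org.apache.kafka.clients.admin.NewTopic;

import java.util.Collections;
import java.util.Properties;

public class CreateTopicSketch {
    public static void main(String[] args) throws Exception {
        Properties props = new Properties();
        props.put(AdminClientConfig.BOOTSTRAP_SERVERS_CONFIG, "localhost:9092");

        try (AdminClient admin = AdminClient.create(props)) {
            // Create a topic with 3 partitions and replication factor 1.
            NewTopic topic = new NewTopic("demo-topic", 3, (short) 1);
            admin.createTopics(Collections.singletonList(topic)).all().get();

            // List topic names to confirm the new topic exists.
            System.out.println(admin.listTopics().names().get());
        }
    }
}
```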

DAY 4

Integrating Apache Flume and Apache Kafka

  •  Overview
  •  Use Cases
  •  Configuration – Kafka as Source, Sink and Channel

SPARK Streaming with Kafka

  • Spark Streaming basics
  • Connecting Apache Kafka with Spark Streaming
  • Processing Apache Kafka messages with Spark Streaming
  • Using Kafka as a direct data source (see the sketch after this list)
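
As a minimal sketch of the direct approach, the snippet below builds a Spark Streaming job that reads from Kafka and counts the messages in each 5-second batch. It assumes the spark-streaming-kafka-0-10 integration and the kafka-clients library are on the classpath; the bootstrap server, topic name, and group id are placeholders.

```java
import org.apache.kafka.clients.consumer.ConsumerRecord;
import org.apache.kafka.common.serialization.StringDeserializer;
import org.apache.spark.SparkConf;
import org.apache.spark.streaming.Durations;
import org.apache.spark.streaming.api.java.JavaInputDStream;
import org.apache.spark.streaming.api.java.JavaStreamingContext;
import org.apache.spark.streaming.kafka010.ConsumerStrategies;
import org.apache.spark.streaming.kafka010.KafkaUtils;
import org.apache.spark.streaming.kafka010.LocationStrategies;

import java.util.Arrays;
import java.util.HashMap;
import java.util.Map;

public class KafkaStreamSketch {
    public static void main(String[] args) throws InterruptedException {
        SparkConf conf = new SparkConf().setAppName("KafkaStreamSketch").setMaster("local[2]");
        JavaStreamingContext jssc = new JavaStreamingContext(conf, Durations.seconds(5));

        // Consumer settings passed straight through to the underlying Kafka consumer.
        Map<String, Object> kafkaParams = new HashMap<>();
        kafkaParams.put("bootstrap.servers", "localhost:9092");
        kafkaParams.put("key.deserializer", StringDeserializer.class);
        kafkaParams.put("value.deserializer", StringDeserializer.class);
        kafkaParams.put("group.id", "spark-demo");
        kafkaParams.put("auto.offset.reset", "latest");

        // Direct stream: Spark reads Kafka partitions itself, with no receiver.
        JavaInputDStream<ConsumerRecord<String, String>> stream =
                KafkaUtils.createDirectStream(
                        jssc,
                        LocationStrategies.PreferConsistent(),
                        ConsumerStrategies.<String, String>Subscribe(
                                Arrays.asList("demo-topic"), kafkaParams));

        // Count the messages arriving in each batch and print the result.
        stream.map(ConsumerRecord::value)
              .count()
              .print();

        jssc.start();
        jssc.awaitTermination();
    }
}
```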

DAY 5

Twitter Data Analysis with Kafka

  • Getting data from Twitter into Kafka

Consumer and Producer Java API

  • Creating a custom consumer
  • Creating a custom producer
  • Running the custom producer and consumer (see the sketch after this list)
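
As a minimal sketch of the Java client API, the snippet below writes a few messages with a custom producer and reads them back with a custom consumer. It assumes the kafka-clients library (2.x or later, for the Duration-based poll); the bootstrap server, topic name, and group id are placeholders.

```java
import org.apache.kafka.clients.consumer.ConsumerRecord;
import org.apache.kafka.clients.consumer.ConsumerRecords;
import org.apache.kafka.clients.consumer.KafkaConsumer;
import org.apache.kafka.clients.producer.KafkaProducer;
import org.apache.kafka.clients.producer.ProducerRecord;
import org.apache.kafka.common.serialization.StringDeserializer;
import org.apache.kafka.common.serialization.StringSerializer;

import java.time.Duration;
import java.util.Collections;
import java.util.Properties;

public class ProducerConsumerSketch {
    public static void main(String[] args) {
        // Custom producer: send five messages to the placeholder topic "demo-topic".
        Properties producerProps = new Properties();
        producerProps.put("bootstrap.servers", "localhost:9092");
        producerProps.put("key.serializer", StringSerializer.class.getName());
        producerProps.put("value.serializer", StringSerializer.class.getName());
        try (KafkaProducer<String, String> producer = new KafkaProducer<>(producerProps)) {
            for (int i = 0; i < 5; i++) {
                producer.send(new ProducerRecord<>("demo-topic", "key-" + i, "message-" + i));
            }
        } // close() flushes any buffered records

        // Custom consumer: read the messages back from the beginning and print them.
        Properties consumerProps = new Properties();
        consumerProps.put("bootstrap.servers", "localhost:9092");
        consumerProps.put("group.id", "demo-group");
        consumerProps.put("auto.offset.reset", "earliest");
        consumerProps.put("key.deserializer", StringDeserializer.class.getName());
        consumerProps.put("value.deserializer", StringDeserializer.class.getName());
        try (KafkaConsumer<String, String> consumer = new KafkaConsumer<>(consumerProps)) {
            consumer.subscribe(Collections.singletonList("demo-topic"));
            ConsumerRecords<String, String> records = consumer.poll(Duration.ofSeconds(5));
            for (ConsumerRecord<String, String> record : records) {
                System.out.printf("offset=%d key=%s value=%s%n",
                        record.offset(), record.key(), record.value());
            }
        }
    }
}
```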

 

Need more info? Email info@koenig-solutions.com or enquire now!