Big Data Hadoop Industrial Training and Online Classes by Deepak Smart Programming
Introduction
Course Syllabus
In today's era of technology we are surrounded by data in amounts too large for traditional systems to handle, and Hadoop is the technology built to handle it. Hadoop enables a company to meet its data storage and processing needs by distributing both across a cluster of machines. It is designed to scale up from a single server to thousands of machines, each offering local computation and storage. Mastering the tools for handling such data, and the operations over it, is only possible with the right guidance from experts. Hadoop career opportunities are rising fast and it is becoming a must-know technology nowadays. If you want to settle your career in Hadoop, this is the best platform to enroll.
CAREER OPPORTUNITIES:
Software Developers and Architects.
Analytics Professionals.
Data Management Professionals.
Business Intelligence Professionals.
Project Managers.
Aspiring Data Scientists.
and many more!

    • 1. Introduction to Big Data
    • ⇒ What is Big Data and where is it produced?

    • ⇒ Rise of Big Data

    • ⇒ Hadoop vs traditional systems

    • ⇒ Limitations and Solutions of existing Data Analytics Architecture

    • ⇒ Attributes of Big Data

    • ⇒ Types of data, other technologies vs Big Data

    • 2. Hadoop Architecture and HDFS
    • ⇒ What is Hadoop?

    • ⇒ Hadoop History

    • ⇒ Distributed Processing Systems

    • ⇒ Core Components of Hadoop

    • ⇒ HDFS Architecture

    • ⇒ Hadoop Master – Slave Architecture

    • ⇒ Daemon types - learn about NameNode, DataNode and the Secondary NameNode (see the HDFS API sketch after this module)
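
To make the HDFS topics above concrete, here is a minimal sketch of writing and reading a file through the Hadoop FileSystem Java API. The NameNode URI and file path are placeholders chosen for illustration; a real cluster's configuration will differ.

    import java.io.BufferedReader;
    import java.io.InputStreamReader;
    import org.apache.hadoop.conf.Configuration;
    import org.apache.hadoop.fs.FSDataOutputStream;
    import org.apache.hadoop.fs.FileSystem;
    import org.apache.hadoop.fs.Path;

    public class HdfsHelloWorld {
        public static void main(String[] args) throws Exception {
            // Point the client at the NameNode; this URI is a placeholder.
            Configuration conf = new Configuration();
            conf.set("fs.defaultFS", "hdfs://localhost:9000");
            FileSystem fs = FileSystem.get(conf);

            // Write a small file into HDFS; the DataNodes store the actual blocks.
            Path file = new Path("/user/demo/hello.txt");
            try (FSDataOutputStream out = fs.create(file, true)) {
                out.writeBytes("Hello HDFS\n");
            }

            // Read the file back: block locations come from the NameNode, data from the DataNodes.
            try (BufferedReader in = new BufferedReader(new InputStreamReader(fs.open(file)))) {
                System.out.println(in.readLine());
            }
            fs.close();
        }
    }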

    • 3. Hadoop Clusters and the Hadoop Ecosystem
    • ⇒ What is a Hadoop Cluster?

    • ⇒ Pseudo-distributed mode

    • ⇒ Types of clusters

    • ⇒ Hadoop Ecosystem, Pig, Hive, Oozie, Flume, SQOOP

    • 4. Hadoop MapReduce Framework
    • ⇒ Overview of MapReduce Framework

    • ⇒ MapReduce Architecture

    • ⇒ Learn about JobTracker and TaskTracker

    • ⇒ Use cases of MapReduce

    • ⇒ Anatomy of MapReduce Program

    • 5. MapReduce programs in Java
    • ⇒ Basic MapReduce API Concepts

    • ⇒ Writing MapReduce Driver

    • ⇒ Mappers and Reducers in Java

    • ⇒ Speeding up Hadoop Development by Using Eclipse

    • ⇒ Unit Testing MapReduce Programs

    • ⇒ Demo on word count example (see the sketch after this module)
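
The word count demo referenced above typically follows the standard Hadoop tutorial example. Below is a minimal sketch of it: a Mapper, a Reducer and a Driver wired together. Class names and input/output paths are illustrative, not the course's exact code.

    import java.io.IOException;
    import java.util.StringTokenizer;
    import org.apache.hadoop.conf.Configuration;
    import org.apache.hadoop.fs.Path;
    import org.apache.hadoop.io.IntWritable;
    import org.apache.hadoop.io.Text;
    import org.apache.hadoop.mapreduce.Job;
    import org.apache.hadoop.mapreduce.Mapper;
    import org.apache.hadoop.mapreduce.Reducer;
    import org.apache.hadoop.mapreduce.lib.input.FileInputFormat;
    import org.apache.hadoop.mapreduce.lib.output.FileOutputFormat;

    public class WordCount {

        // Mapper: emit (word, 1) for every word in the input split.
        public static class TokenizerMapper extends Mapper<Object, Text, Text, IntWritable> {
            private static final IntWritable ONE = new IntWritable(1);
            private final Text word = new Text();

            public void map(Object key, Text value, Context context)
                    throws IOException, InterruptedException {
                StringTokenizer itr = new StringTokenizer(value.toString());
                while (itr.hasMoreTokens()) {
                    word.set(itr.nextToken());
                    context.write(word, ONE);
                }
            }
        }

        // Reducer: sum the counts collected for each word.
        public static class IntSumReducer extends Reducer<Text, IntWritable, Text, IntWritable> {
            private final IntWritable result = new IntWritable();

            public void reduce(Text key, Iterable<IntWritable> values, Context context)
                    throws IOException, InterruptedException {
                int sum = 0;
                for (IntWritable val : values) {
                    sum += val.get();
                }
                result.set(sum);
                context.write(key, result);
            }
        }

        // Driver: configures the job and the input/output paths.
        public static void main(String[] args) throws Exception {
            Job job = Job.getInstance(new Configuration(), "word count");
            job.setJarByClass(WordCount.class);
            job.setMapperClass(TokenizerMapper.class);
            job.setCombinerClass(IntSumReducer.class);
            job.setReducerClass(IntSumReducer.class);
            job.setOutputKeyClass(Text.class);
            job.setOutputValueClass(IntWritable.class);
            FileInputFormat.addInputPath(job, new Path(args[0]));
            FileOutputFormat.setOutputPath(job, new Path(args[1]));
            System.exit(job.waitForCompletion(true) ? 0 : 1);
        }
    }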

    • 6. Hive and HiveQL
    • ⇒ What is Hive?

    • ⇒ Hive vs MapReduce

    • ⇒ Hive DDL – Create/Show/Drop Tables (see the sketch after this module)

    • ⇒ Internal and External Tables

    • ⇒ Hive DML – Load Files & Insert Data

    • ⇒ Hive Architecture & Components

    • ⇒ Difference between Hive and RDBMS

    • ⇒ Partitions in Hive
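
As a taste of the Hive topics above, here is a minimal sketch that runs HiveQL from Java through the HiveServer2 JDBC driver. The connection URL, credentials, table name and file path are placeholders; the course environment may use the Hive shell or Beeline instead.

    import java.sql.Connection;
    import java.sql.DriverManager;
    import java.sql.ResultSet;
    import java.sql.Statement;

    public class HiveQlDemo {
        public static void main(String[] args) throws Exception {
            // HiveServer2 JDBC URL; host, port and database are placeholders.
            Class.forName("org.apache.hive.jdbc.HiveDriver");
            String url = "jdbc:hive2://localhost:10000/default";

            try (Connection con = DriverManager.getConnection(url, "hive", "");
                 Statement stmt = con.createStatement()) {

                // DDL: create an internal (managed) table.
                stmt.execute("CREATE TABLE IF NOT EXISTS employees (id INT, name STRING, dept STRING) "
                        + "ROW FORMAT DELIMITED FIELDS TERMINATED BY ','");

                // DML: load a delimited file into the table (path is a placeholder).
                stmt.execute("LOAD DATA LOCAL INPATH '/tmp/employees.csv' INTO TABLE employees");

                // Query: Hive compiles this into MapReduce (or Tez/Spark) jobs behind the scenes.
                try (ResultSet rs = stmt.executeQuery(
                        "SELECT dept, COUNT(*) FROM employees GROUP BY dept")) {
                    while (rs.next()) {
                        System.out.println(rs.getString(1) + "\t" + rs.getLong(2));
                    }
                }
            }
        }
    }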

    • 7. PIG
    • ⇒ PIG vs MapReduce

    • ⇒ PIG Architecture & Data types

    • ⇒ Shell and Utility components

    • ⇒ PIG Latin Relational Operators (see the sketch after this module)

    • ⇒ PIG Latin: File Loaders and UDF

    • ⇒ Programming structure in UDF

    • ⇒ Importing PIG JARs, limitations of PIG
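
To connect the PIG Latin topics above to code, the sketch below embeds a short Pig Latin script in Java using PigServer in local mode. The input file and field names are placeholders; on a cluster you would run in MapReduce mode instead.

    import java.util.Iterator;
    import org.apache.pig.ExecType;
    import org.apache.pig.PigServer;
    import org.apache.pig.data.Tuple;

    public class PigLatinDemo {
        public static void main(String[] args) throws Exception {
            // Local mode runs against the local filesystem; use ExecType.MAPREDUCE on a cluster.
            PigServer pig = new PigServer(ExecType.LOCAL);

            // LOAD, FILTER, GROUP and FOREACH are PIG Latin relational operators.
            pig.registerQuery("logs = LOAD 'access_log.txt' USING PigStorage(' ') "
                    + "AS (ip:chararray, url:chararray, status:int);");
            pig.registerQuery("errors = FILTER logs BY status >= 500;");
            pig.registerQuery("by_url = GROUP errors BY url;");
            pig.registerQuery("counts = FOREACH by_url GENERATE group, COUNT(errors);");

            // Iterate over the tuples of the final relation.
            Iterator<Tuple> it = pig.openIterator("counts");
            while (it.hasNext()) {
                System.out.println(it.next());
            }
            pig.shutdown();
        }
    }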

    • 8. Apache SQOOP, Flume
    • ⇒ What is SQOOP and why use it?

    • ⇒ SQOOP Architecture

    • ⇒ Benefits of SQOOP

    • ⇒ Importing Data Using SQOOP

    • ⇒ Apache Flume Introduction

    • ⇒ Flume Model and Goals

    • ⇒ Features of Flume

    • ⇒ Flume Use Case

    • 9. NoSQL Databases
    • ⇒ What is HBase? (see the Java client sketch after this module)

    • ⇒ HBase Architecture

    • ⇒ HBase Components

    • ⇒ Storage Model of HBase

    • ⇒ HBase vs RDBMS

    • ⇒ Introduction to MongoDB, CRUD operations

    • ⇒ Advantages of MongoDB over RDBMS

    • ⇒ Use case
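
For the HBase part of this module, here is a minimal sketch of a put and a get through the HBase Java client API. The table name "users" and column family "info" are placeholders and are assumed to exist already (for example, created from the HBase shell).

    import org.apache.hadoop.conf.Configuration;
    import org.apache.hadoop.hbase.HBaseConfiguration;
    import org.apache.hadoop.hbase.TableName;
    import org.apache.hadoop.hbase.client.Connection;
    import org.apache.hadoop.hbase.client.ConnectionFactory;
    import org.apache.hadoop.hbase.client.Get;
    import org.apache.hadoop.hbase.client.Put;
    import org.apache.hadoop.hbase.client.Result;
    import org.apache.hadoop.hbase.client.Table;
    import org.apache.hadoop.hbase.util.Bytes;

    public class HBaseCrudDemo {
        public static void main(String[] args) throws Exception {
            // Picks up hbase-site.xml (ZooKeeper quorum etc.) from the classpath.
            Configuration conf = HBaseConfiguration.create();
            try (Connection conn = ConnectionFactory.createConnection(conf);
                 Table table = conn.getTable(TableName.valueOf("users"))) {

                // Put: one row keyed by "user1", one cell in column family "info".
                Put put = new Put(Bytes.toBytes("user1"));
                put.addColumn(Bytes.toBytes("info"), Bytes.toBytes("name"), Bytes.toBytes("Deepak"));
                table.put(put);

                // Get: read the same cell back by row key.
                Result result = table.get(new Get(Bytes.toBytes("user1")));
                byte[] name = result.getValue(Bytes.toBytes("info"), Bytes.toBytes("name"));
                System.out.println("name = " + Bytes.toString(name));
            }
        }
    }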

    • 10. Oozie and Zookeeper
    • ⇒ Oozie – Simple/Complex Flow

    • ⇒ Oozie Workflow

    • ⇒ Oozie Components

    • ⇒ Demo on Oozie Workflow in XML

    • ⇒ What is Zookeeper?

    • ⇒ Features of Zookeeper

    • ⇒ Zookeeper Data Model (see the znode sketch after this module)
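
To illustrate the Zookeeper data model, here is a minimal sketch that creates a znode and reads it back with the ZooKeeper Java client. The connection string and znode path are placeholders.

    import java.util.concurrent.CountDownLatch;
    import org.apache.zookeeper.CreateMode;
    import org.apache.zookeeper.Watcher;
    import org.apache.zookeeper.ZooDefs;
    import org.apache.zookeeper.ZooKeeper;

    public class ZkDemo {
        public static void main(String[] args) throws Exception {
            // Connect to a ZooKeeper ensemble and wait for the session to be established.
            CountDownLatch connected = new CountDownLatch(1);
            ZooKeeper zk = new ZooKeeper("localhost:2181", 5000, event -> {
                if (event.getState() == Watcher.Event.KeeperState.SyncConnected) {
                    connected.countDown();
                }
            });
            connected.await();

            // Znodes form a hierarchical namespace, similar to a small filesystem.
            String path = "/demo-config";
            if (zk.exists(path, false) == null) {
                zk.create(path, "hello".getBytes(), ZooDefs.Ids.OPEN_ACL_UNSAFE, CreateMode.PERSISTENT);
            }

            // Read the data stored at the znode.
            byte[] data = zk.getData(path, false, null);
            System.out.println(path + " -> " + new String(data));
            zk.close();
        }
    }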

Batch Details

  • Duration

    2-3 Months
  • Available Seats

    15
  • Online Training Schedule

    8.00 pm to 10.00 pm
  • Industrial Training Schedule in Chandigarh

    8.00 am to 7.00 pm
    (2 hours per batch)


Register for the Big Data Hadoop Course

If you want to take online/offline classes with us, please email us directly.