Hadoop Administration Certification Training

Learning Objectives:

     Understand Big Data and analyse the limitations of traditional solutions. You will learn about Hadoop and its core components, and understand the differences between Hadoop 1.0 and Hadoop 2.x.
Topics:
  • Introduction to big data
  • Common big data domain scenarios
  • Limitations of traditional solutions
  • What is Hadoop?
  • Hadoop 1.0 ecosystem and its Core Components
  • Hadoop 2.x ecosystem and its Core Components
  • Application submission in YARN
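
For a concrete feel of the "application submission in YARN" topic, here is a minimal, illustrative Java sketch (not part of the course material) that connects to the cluster with the YarnClient API and lists the applications the ResourceManager knows about. It assumes a reachable cluster whose address is picked up from yarn-site.xml on the classpath.

    import org.apache.hadoop.yarn.api.records.ApplicationReport;
    import org.apache.hadoop.yarn.client.api.YarnClient;
    import org.apache.hadoop.yarn.conf.YarnConfiguration;

    public class ListYarnApps {
        public static void main(String[] args) throws Exception {
            YarnConfiguration conf = new YarnConfiguration();   // reads yarn-site.xml from the classpath
            YarnClient client = YarnClient.createYarnClient();
            client.init(conf);
            client.start();
            // Every submitted application (MapReduce, Spark, ...) shows up as an ApplicationReport
            for (ApplicationReport app : client.getApplications()) {
                System.out.println(app.getApplicationId() + "  " + app.getYarnApplicationState());
            }
            client.stop();
        }
    }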

Learning Objectives:

     In this module, you will learn about the Hadoop Distributed File System (HDFS), Hadoop Configuration Files and Hadoop Cluster Architecture. You will also get to know the roles and responsibilities of a Hadoop administrator.
Topics:
  • Distributed File System
  • Hadoop Cluster Architecture
  • Replication rules
  • Hadoop Cluster Modes
  • Rack awareness theory
  • Hadoop cluster administrator responsibilities
  • Working of HDFS
  • NTP server
  • Initial configuration required before installing Hadoop
  • Deploying Hadoop in a pseudo-distributed mode
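
Once Hadoop is running in pseudo-distributed mode, a short smoke test against HDFS helps confirm the deployment. The Java sketch below is illustrative only; the NameNode address hdfs://localhost:9000 and the path /tmp/smoke-test.txt are assumptions, not values prescribed by the course.

    import org.apache.hadoop.conf.Configuration;
    import org.apache.hadoop.fs.FSDataOutputStream;
    import org.apache.hadoop.fs.FileStatus;
    import org.apache.hadoop.fs.FileSystem;
    import org.apache.hadoop.fs.Path;

    public class HdfsSmokeTest {
        public static void main(String[] args) throws Exception {
            Configuration conf = new Configuration();            // picks up core-site.xml / hdfs-site.xml
            conf.set("fs.defaultFS", "hdfs://localhost:9000");    // assumed pseudo-distributed NameNode address
            FileSystem fs = FileSystem.get(conf);

            Path p = new Path("/tmp/smoke-test.txt");             // illustrative path
            try (FSDataOutputStream out = fs.create(p, true)) {   // overwrite if the file already exists
                out.writeUTF("hello hdfs");
            }

            FileStatus st = fs.getFileStatus(p);
            // Replication shows how many DataNodes hold a copy of each block of this file
            System.out.println("replication=" + st.getReplication() + " size=" + st.getLen());
        }
    }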

Learning Objectives:

     Learn how to build a multi-node Hadoop cluster and understand the various properties of the NameNode, DataNode and Secondary NameNode.
Topics:
  • OS Tuning for Hadoop Performance
  • Pre-requisite for installing Hadoop
  • Hadoop Configuration Files
  • Stale Configuration
  • RPC and HTTP Server Properties
  • Properties of NameNode, DataNode and Secondary NameNode
  • Log Files in Hadoop
  • Deploying a multi-node Hadoop cluster
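
Many of the properties covered here end up in core-site.xml and hdfs-site.xml. As a small illustrative sketch, the snippet below prints the effective values Hadoop resolves for a few well-known NameNode, DataNode and Secondary NameNode keys; which keys you inspect on your own cluster is up to you.

    import org.apache.hadoop.conf.Configuration;
    import org.apache.hadoop.hdfs.HdfsConfiguration;

    public class ShowEffectiveConf {
        public static void main(String[] args) {
            // HdfsConfiguration loads hdfs-default.xml plus any hdfs-site.xml found on the classpath
            Configuration conf = new HdfsConfiguration();
            String[] keys = {
                "fs.defaultFS",                       // NameNode URI
                "dfs.replication",                    // default block replication
                "dfs.namenode.name.dir",              // where the NameNode keeps fsimage and edits
                "dfs.datanode.data.dir",              // where DataNodes store block data
                "dfs.namenode.checkpoint.period"      // how often the Secondary NameNode checkpoints
            };
            for (String k : keys) {
                System.out.println(k + " = " + conf.get(k));
            }
        }
    }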

Learning Objectives:

     In this module, you will learn how to add nodes to or remove nodes from your cluster, both in an ad hoc manner and in the recommended way. You will also understand day-to-day cluster administration tasks such as balancing data across the cluster, protecting data by enabling trash, attempting a manual failover, and creating backups within or across clusters.
Topics:
  • Commissioning and Decommissioning of Nodes
  • HDFS Balancer
  • NameNode Federation in Hadoop
  • High Availability in Hadoop
  • .Trash Functionality
  • Checkpointing in Hadoop
  • Distcp
  • Disk balancer
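
The .Trash functionality mentioned above can also be exercised programmatically. The sketch below is only an illustration (the file path is hypothetical, and trash must actually be enabled via fs.trash.interval in the cluster's core-site.xml): it moves a file into the user's .Trash directory instead of deleting it outright, mirroring what hdfs dfs -rm does when trash is enabled.

    import org.apache.hadoop.conf.Configuration;
    import org.apache.hadoop.fs.FileSystem;
    import org.apache.hadoop.fs.Path;
    import org.apache.hadoop.fs.Trash;

    public class SafeDelete {
        public static void main(String[] args) throws Exception {
            Configuration conf = new Configuration();
            conf.set("fs.trash.interval", "1440");     // keep trashed files for 24 hours (value is in minutes)
            FileSystem fs = FileSystem.get(conf);

            Path victim = new Path("/tmp/old-report.csv");   // hypothetical file
            // Move to .Trash instead of deleting immediately; returns false if trash is disabled
            boolean moved = Trash.moveToAppropriateTrash(fs, victim, conf);
            System.out.println(moved ? "Moved to trash" : "Trash disabled or move failed; nothing deleted");
        }
    }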

Learning Objectives:

     Get to know the various processing frameworks in Hadoop and understand the YARN job execution flow. You will also learn about the MapReduce programming model and the YARN schedulers from a Hadoop administrator's perspective.
Topics:
  • Different Processing Frameworks
  • Different phases in MapReduce
  • Spark and its Features
  • Application Workflow in YARN
  • YARN Metrics
  • YARN Capacity Scheduler and Fair Scheduler
  • Service Level Authorization (SLA)
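
To connect the "different phases in MapReduce" topic to something concrete, here is the classic word-count job as a hedged sketch rather than course material: the map phase emits (word, 1) pairs, the framework shuffles and sorts them by key, and the reduce phase sums the counts. An administrator rarely writes such jobs, but knowing where these phases sit makes the scheduler and YARN metrics discussions easier to follow.

    import java.io.IOException;
    import java.util.StringTokenizer;

    import org.apache.hadoop.conf.Configuration;
    import org.apache.hadoop.fs.Path;
    import org.apache.hadoop.io.IntWritable;
    import org.apache.hadoop.io.Text;
    import org.apache.hadoop.mapreduce.Job;
    import org.apache.hadoop.mapreduce.Mapper;
    import org.apache.hadoop.mapreduce.Reducer;
    import org.apache.hadoop.mapreduce.lib.input.FileInputFormat;
    import org.apache.hadoop.mapreduce.lib.output.FileOutputFormat;

    public class WordCount {

        // Map phase: split each input line into words and emit (word, 1)
        public static class TokenizerMapper extends Mapper<Object, Text, Text, IntWritable> {
            private static final IntWritable ONE = new IntWritable(1);
            private final Text word = new Text();

            @Override
            public void map(Object key, Text value, Context ctx) throws IOException, InterruptedException {
                StringTokenizer it = new StringTokenizer(value.toString());
                while (it.hasMoreTokens()) {
                    word.set(it.nextToken());
                    ctx.write(word, ONE);
                }
            }
        }

        // Reduce phase: after the shuffle/sort groups values by key, sum the counts per word
        public static class IntSumReducer extends Reducer<Text, IntWritable, Text, IntWritable> {
            @Override
            public void reduce(Text key, Iterable<IntWritable> values, Context ctx)
                    throws IOException, InterruptedException {
                int sum = 0;
                for (IntWritable v : values) {
                    sum += v.get();
                }
                ctx.write(key, new IntWritable(sum));
            }
        }

        public static void main(String[] args) throws Exception {
            Job job = Job.getInstance(new Configuration(), "word count");
            job.setJarByClass(WordCount.class);
            job.setMapperClass(TokenizerMapper.class);
            job.setCombinerClass(IntSumReducer.class);    // optional combine step between map and reduce
            job.setReducerClass(IntSumReducer.class);
            job.setOutputKeyClass(Text.class);
            job.setOutputValueClass(IntWritable.class);
            FileInputFormat.addInputPath(job, new Path(args[0]));    // input directory in HDFS
            FileOutputFormat.setOutputPath(job, new Path(args[1]));  // output directory (must not exist yet)
            System.exit(job.waitForCompletion(true) ? 0 : 1);
        }
    }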

Learning Objectives:

     In this module, you will gain insights into cluster planning and management, and learn which aspects to consider when planning the setup of a new cluster.
Topics:
  • Planning a Hadoop 2.x cluster
  • Cluster sizing
  • Hardware, Network and Software considerations
  • Popular Hadoop distributions
  • Workload and usage patterns
  • Industry recommendations
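
Cluster sizing usually starts from simple arithmetic: raw capacity is roughly data volume x replication factor x (1 + temporary-space overhead). The numbers in the sketch below are illustrative assumptions, not recommendations.

    public class ClusterSizing {
        public static void main(String[] args) {
            double dailyIngestTb = 1.0;          // assumption: 1 TB of new data per day
            int retentionDays = 365;             // assumption: keep one year of data
            int replicationFactor = 3;           // HDFS default replication
            double tempOverhead = 0.25;          // assumption: 25% extra for shuffle/temporary files
            double usableTbPerNode = 36.0;       // assumption: 12 x 4 TB disks, ~75% usable for HDFS

            double rawTb = dailyIngestTb * retentionDays * replicationFactor * (1 + tempOverhead);
            int dataNodes = (int) Math.ceil(rawTb / usableTbPerNode);
            System.out.printf("Raw storage needed: %.0f TB, DataNodes required: %d%n", rawTb, dataNodes);
        }
    }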

Learning Objectives:

     Get to know Hadoop cluster monitoring and security concepts. You will also learn how to secure a Hadoop cluster with Kerberos.
Topics:
  • Monitoring Hadoop Clusters
  • Hadoop Security System Concepts
  • Securing a Hadoop Cluster With Kerberos
  • Common Misconfigurations
  • Overview on Kerberos
  • Checking log files to understand Hadoop clusters for troubleshooting
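
Once a cluster is kerberized, clients must authenticate before touching HDFS. The sketch below is illustrative only: the principal hdfs-admin@EXAMPLE.COM and the keytab path are made-up assumptions, and the cluster's core-site.xml must already be configured for Kerberos.

    import org.apache.hadoop.conf.Configuration;
    import org.apache.hadoop.fs.FileSystem;
    import org.apache.hadoop.fs.Path;
    import org.apache.hadoop.security.UserGroupInformation;

    public class KerberizedHdfsAccess {
        public static void main(String[] args) throws Exception {
            Configuration conf = new Configuration();
            conf.set("hadoop.security.authentication", "kerberos"); // matches core-site.xml on a secured cluster
            UserGroupInformation.setConfiguration(conf);

            // Principal and keytab path are assumptions for illustration
            UserGroupInformation.loginUserFromKeytab(
                    "hdfs-admin@EXAMPLE.COM", "/etc/security/keytabs/hdfs-admin.keytab");

            FileSystem fs = FileSystem.get(conf);
            System.out.println("Authenticated as: " + UserGroupInformation.getLoginUser());
            System.out.println("/ exists: " + fs.exists(new Path("/")));
        }
    }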

Learning Objectives: 

     In this module, you will learn about the Cloudera distribution of Hadoop 2.x (CDH) and its various features.
Topics:
  • Visualize Cloudera Manager
  • Features of Cloudera Manager
  • Build Cloudera Hadoop cluster using CDH
  • Installation choices in Cloudera
  • Cloudera Manager Vocabulary
  • Cloudera terminology
  • Different tabs in Cloudera Manager
  • What is HUE?
  • Hue Architecture
  • Hue Interface
  • Hue Features
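
Beyond the web UI, Cloudera Manager also exposes a REST API (served on port 7180 by default). The sketch below is an assumption-laden illustration: the host name, the admin/admin credentials and the API version v19 are placeholders you would replace with your own.

    import java.net.URI;
    import java.net.http.HttpClient;
    import java.net.http.HttpRequest;
    import java.net.http.HttpResponse;
    import java.util.Base64;

    public class ClouderaManagerPing {
        public static void main(String[] args) throws Exception {
            // Host, port, credentials and API version are assumptions for illustration
            String cmBase = "http://cm-host.example.com:7180/api/v19";
            String auth = Base64.getEncoder().encodeToString("admin:admin".getBytes());

            HttpClient client = HttpClient.newHttpClient();
            HttpRequest req = HttpRequest.newBuilder()
                    .uri(URI.create(cmBase + "/clusters"))       // list clusters managed by this CM instance
                    .header("Authorization", "Basic " + auth)
                    .GET()
                    .build();

            HttpResponse<String> resp = client.send(req, HttpResponse.BodyHandlers.ofString());
            System.out.println("HTTP " + resp.statusCode());
            System.out.println(resp.body());                     // JSON listing of clusters
        }
    }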

Learning Objectives:

     Get to know the working and installation of Hadoop ecosystem components such as Pig and Hive.
Topics:
  • Hive overview
  • Hive Setup
  • Hive Configuration
  • Working with Hive
  • Setting Hive in local and remote metastore mode
  • Pig setup
  • Working with Pig
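
After setting up Hive with a local or remote metastore, it is common to verify HiveServer2 over JDBC. The sketch below assumes HiveServer2 on localhost:10000 and the Hive JDBC driver on the classpath; the table name and credentials are placeholders.

    import java.sql.Connection;
    import java.sql.DriverManager;
    import java.sql.ResultSet;
    import java.sql.Statement;

    public class HiveJdbcSmokeTest {
        public static void main(String[] args) throws Exception {
            Class.forName("org.apache.hive.jdbc.HiveDriver");          // Hive JDBC driver must be on the classpath
            String url = "jdbc:hive2://localhost:10000/default";        // assumed HiveServer2 address
            try (Connection conn = DriverManager.getConnection(url, "hive", "");
                 Statement stmt = conn.createStatement()) {
                stmt.execute("CREATE TABLE IF NOT EXISTS smoke_test (id INT, name STRING)"); // placeholder table
                try (ResultSet rs = stmt.executeQuery("SHOW TABLES")) {
                    while (rs.next()) {
                        System.out.println(rs.getString(1));
                    }
                }
            }
        }
    }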

Learning Objectives:

     In this module, you will learn about the working and installation of HBase and Zookeeper.
Topics:
  • What is a NoSQL database?
  • HBase data model
  • HBase Architecture
  • MemStore, WAL, BlockCache
  • HBase HFile
  • Compactions
  • HBase Read and Write
  • HBase balancer and hbck
  • HBase setup
  • Working with HBase
  • Installing Zookeeper
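
As a small illustration of working with HBase (assumptions: a local ZooKeeper quorum and a pre-created table smoke_test with column family cf), the sketch below writes one cell and reads it back; the write passes through the WAL and MemStore described above.

    import org.apache.hadoop.conf.Configuration;
    import org.apache.hadoop.hbase.HBaseConfiguration;
    import org.apache.hadoop.hbase.TableName;
    import org.apache.hadoop.hbase.client.Connection;
    import org.apache.hadoop.hbase.client.ConnectionFactory;
    import org.apache.hadoop.hbase.client.Get;
    import org.apache.hadoop.hbase.client.Put;
    import org.apache.hadoop.hbase.client.Result;
    import org.apache.hadoop.hbase.client.Table;
    import org.apache.hadoop.hbase.util.Bytes;

    public class HBasePutGet {
        public static void main(String[] args) throws Exception {
            Configuration conf = HBaseConfiguration.create();
            conf.set("hbase.zookeeper.quorum", "localhost");  // assumption: local ZooKeeper ensemble
            try (Connection conn = ConnectionFactory.createConnection(conf);
                 Table table = conn.getTable(TableName.valueOf("smoke_test"))) { // table assumed to exist

                Put put = new Put(Bytes.toBytes("row1"));
                put.addColumn(Bytes.toBytes("cf"), Bytes.toBytes("greeting"), Bytes.toBytes("hello hbase"));
                table.put(put);                               // write goes to the WAL, then the MemStore

                Result result = table.get(new Get(Bytes.toBytes("row1")));
                byte[] value = result.getValue(Bytes.toBytes("cf"), Bytes.toBytes("greeting"));
                System.out.println(Bytes.toString(value));
            }
        }
    }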

Learning Objectives:

     In this module, you will get to know Apache Oozie, a server-based workflow scheduling system for managing Hadoop jobs.
Topics:
  • Oozie overview
  • Oozie Features
  • Oozie workflow, coordinator and bundle
  • Start, End and Error Node
  • Action Node
  • Join and Fork
  • Decision Node
  • Oozie CLI
  • Install Oozie

Learning Objectives:

     Learn about the different data ingestion tools such as Sqoop and Flume.
Topics:
  • Types of Data Ingestion
  • HDFS data loading commands
  • Purpose and features of Sqoop
  • Perform operations such as Sqoop Import, Export and Hive Import
  • Sqoop 2
  • Install Sqoop
  • Import data from RDBMS into HDFS
  • Flume features and architecture
  • Types of flow
  • Install Flume
  • Ingest Data From External Sources With Flume
  • Best Practices for Importing Data
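
A Sqoop import is normally run from the command line, but the same tool can be invoked from Java, which makes the arguments easy to see in one place. Everything below (JDBC URL, credentials, table, target directory) is a placeholder; passing a password directly is shown only for brevity and is not a recommended practice.

    import org.apache.sqoop.Sqoop;

    public class SqoopImportExample {
        public static void main(String[] args) {
            // Mirrors running `sqoop import ...`; all connection details are illustrative placeholders
            String[] importArgs = {
                "import",
                "--connect", "jdbc:mysql://db-host.example.com:3306/sales",
                "--username", "etl_user",
                "--password", "secret",                 // placeholder; prefer --password-file in practice
                "--table", "orders",
                "--target-dir", "/user/admin/orders",   // destination directory in HDFS
                "--num-mappers", "4"
            };
            int exitCode = Sqoop.runTool(importArgs);
            System.out.println("Sqoop import finished with exit code " + exitCode);
        }
    }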