Hadoop Development - Essentials

The Hadoop Development - Essentials training course introduces the Hadoop framework, the de facto platform for Big Data computation. Apache Hadoop is an open-source software framework for data-intensive distributed applications, licensed under the Apache v2 license. It supports running applications on large clusters of commodity hardware, and it transparently provides those applications with both reliability and data motion. Hadoop implements a computational paradigm named MapReduce, in which an application is divided into many small fragments of work, each of which may be executed or re-executed on any node in the cluster. In addition, it provides a distributed file system (HDFS) that stores data on the compute nodes, providing very high aggregate bandwidth across the cluster.
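As a concrete illustration of that paradigm, below is a minimal sketch of the classic word-count job written against Hadoop's standard org.apache.hadoop.mapreduce API. The class names are illustrative, and a real job would also need a driver class to configure and submit it:

    import java.io.IOException;
    import java.util.StringTokenizer;

    import org.apache.hadoop.io.IntWritable;
    import org.apache.hadoop.io.LongWritable;
    import org.apache.hadoop.io.Text;
    import org.apache.hadoop.mapreduce.Mapper;
    import org.apache.hadoop.mapreduce.Reducer;

    // Mapper: emits (word, 1) for every word in each input line.
    class WordCountMapper extends Mapper<LongWritable, Text, Text, IntWritable> {
        private static final IntWritable ONE = new IntWritable(1);
        private final Text word = new Text();

        @Override
        protected void map(LongWritable key, Text value, Context context)
                throws IOException, InterruptedException {
            StringTokenizer tokens = new StringTokenizer(value.toString());
            while (tokens.hasMoreTokens()) {
                word.set(tokens.nextToken());
                context.write(word, ONE); // one small fragment of work per input split
            }
        }
    }

    // Reducer: sums the counts collected for each word across the cluster.
    class WordCountReducer extends Reducer<Text, IntWritable, Text, IntWritable> {
        @Override
        protected void reduce(Text key, Iterable<IntWritable> values, Context context)
                throws IOException, InterruptedException {
            int sum = 0;
            for (IntWritable v : values) {
                sum += v.get();
            }
            context.write(key, new IntWritable(sum));
        }
    }

The framework shuffles the mapper's (word, 1) pairs so that all counts for a given word arrive at a single reducer, which is what allows each fragment to run, or re-run after a failure, on any node.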

PREREQUISITES

  • Some programming experience (preferably Java)
  • Knowledge of Hadoop is not required

AUDIENCE

  • Project / Program / Technical managers
  • Technical / Team leads
  • Software analysts / engineers
  • Pre-sales consultants
  • Business development managers

In the Hadoop Development - Essentials workshop, delegates will learn to:

  • Use the Hadoop & HDFS platform
  • Load data into HDFS (see the sketch after this list)
  • Understand MapReduce fundamentals
  • Write and debug MapReduce jobs
  • Implement common algorithms on Hadoop
  • Use Mahout for advanced data mining
  • Benchmark and optimize performance
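
To give a flavour of the "load data into HDFS" objective above, here is a minimal sketch that copies a local file into HDFS through Hadoop's FileSystem API. The paths and class name are illustrative, and the same result is available from the shell via hdfs dfs -put:

    import org.apache.hadoop.conf.Configuration;
    import org.apache.hadoop.fs.FileSystem;
    import org.apache.hadoop.fs.Path;

    public class HdfsLoad {
        public static void main(String[] args) throws Exception {
            Configuration conf = new Configuration();  // reads core-site.xml from the classpath
            FileSystem fs = FileSystem.get(conf);      // connects to the configured NameNode
            // Copy a local file into HDFS (both paths are illustrative).
            fs.copyFromLocalFile(new Path("/tmp/input.txt"),
                                 new Path("/user/demo/input.txt"));
            fs.close();
        }
    }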

COURSE AGENDA

  • Big Data and the Questions
  • Hadoop and the Answers
  • Hadoop Cluster Configuration
  • Hadoop Framework Internals
  • MapReduce Internals
  • MapReduce Design Patterns and Use-Cases
  • Hive
  • Pig
  • HBase
  • Impala
  • Best Practices for Hadoop Clusters
  • Best Practices for MapReduce
  • Hadoop in the cloud
  • Big Data and Social Media