Pentaho Big Data - Essentials

By attending the Pentaho Big Data - Essentials workshop, delegates will:

  • Identify the purpose and value of various big data technologies: Hadoop, HDFS, Hive, MapReduce, NoSQL databases, and so on
  • Read and write data using HDFS
  • Orchestrate big data jobs in Pentaho Data Integration
  • Use Pentaho Data Integration (and Pentaho MapReduce) to manipulate big data
  • Read and write data using a NoSQL data source
  • Visualize big data using Pentaho InstaView

The Pentaho Big Data - Essentials training course provides an overview of big data technologies and of the Pentaho tools for working with and visualizing big data.

Delegates should have attended the Pentaho Data Integration course or have equivalent field experience with Pentaho Data Integration. Prior big data knowledge is helpful but not required. Basic knowledge of the Linux operating system (CentOS) is required.

This course is suitable for:

  • Business Analysts
  • Solution Developers