Take your knowledge to the next level and solve real-world problems with training for Hadoop and the Enterprise Data Hub
Cloudera University’s four-day course for designing and building Big Data applications prepares you to analyze and solve real-world problems using Apache Hadoop and associated tools in the enterprise data hub (EDH).
You will work through the entire process of designing and building solutions, including ingesting data, determining the appropriate file format for storage, processing the stored data, and presenting the results to the end-user in an easy-to-digest form. Go beyond MapReduce to use additional elements of the EDH and develop converged applications that are highly relevant to the business.
Through instructor-led discussion and interactive, hands-on exercises, participants will navigate the Hadoop ecosystem, learning topics such as:
Creating a data set with Kite SDK
Developing custom Flume components for data ingestion
Managing a multi-stage workflow with Oozie
Analyzing data with Crunch
Writing user-defined functions for Hive and Impala
Transforming data with Morphlines
Indexing data with Cloudera Search
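As a taste of one of the topics above, here is a minimal sketch of what a Hive user-defined function looks like. In a real project the class would extend `org.apache.hadoop.hive.ql.exec.UDF` (the classic "simple UDF" API) and be registered in Hive with `CREATE TEMPORARY FUNCTION`; the class name, method logic, and sample input below are illustrative assumptions, and the `evaluate()` method is shown standalone so it compiles without Hive on the classpath.

```java
// Sketch of the evaluate() logic of a simple Hive UDF.
// In Hive, this class would extend org.apache.hadoop.hive.ql.exec.UDF;
// Hive invokes evaluate() once per input row.
public class StripDomain {
    // Returns the user part of an email address
    // (e.g. "alice@example.com" -> "alice"),
    // or null for null/malformed input, mirroring Hive's NULL semantics.
    public String evaluate(String email) {
        if (email == null) {
            return null;
        }
        int at = email.indexOf('@');
        return (at < 0) ? null : email.substring(0, at);
    }

    public static void main(String[] args) {
        StripDomain udf = new StripDomain();
        System.out.println(udf.evaluate("alice@example.com"));
    }
}
```

Once packaged in a JAR and added to the Hive session, such a function can be called directly in HiveQL, just like a built-in.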
Audience and Prerequisites
This course is best suited to developers, engineers, and architects who want to use Hadoop and related tools to solve real-world problems. Participants should have already attended Cloudera Developer Training for Apache Hadoop or have equivalent practical experience. Good knowledge of Java and basic familiarity with Linux are required. Experience with SQL is helpful.
Upon completion of the course, attendees receive a Cloudera Certified Developer for Apache Hadoop (CCDH) practice test. Certification is a great differentiator; it helps establish you as a leader in the field, providing employers and customers with tangible evidence of your skills and expertise.