Certified Training Partner
This four-day Apache Hadoop 2.x training course is designed for administrators who deploy and manage Apache Hadoop 2.x clusters. Through a combination of lectures and hands-on exercises, you will learn how to install, configure, maintain and scale your Hadoop 2.x environment. At the end of this course you will have a solid understanding of how Hadoop works with Big Data and, through the hands-on exercises, will have completed the Hadoop deployment lifecycle for a multi-node cluster.
This course utilises a Linux environment. Attendees should know how to navigate and modify files within a Linux environment. Existing knowledge of Hadoop is not required.
This course is designed for IT administrators and operators responsible for installing, configuring and supporting an Apache Hadoop 2.x deployment in a Linux environment.
This 4-day hands-on training course teaches students how to develop applications and analyse Big Data stored in Apache Hadoop 2.x using Pig and Hive. Students will learn the details of Hadoop 2.x, YARN and the Hadoop Distributed File System (HDFS), receive an overview of MapReduce, and take a deep dive into using Pig and Hive to perform data analytics on Big Data. Other topics covered include data ingestion using Sqoop and Flume, and defining workflows using Oozie.
Students should be familiar with programming principles and have experience in software development. SQL knowledge is also helpful. No prior Hadoop knowledge is required.
Data Analysts, BI Analysts, BI Developers, SAS Developers and other types of analysts who need to answer questions and analyse Big Data stored in a Hadoop cluster.
This four-day course provides Java programmers with a deep dive into Hadoop 2.x application development. Students will learn how to design and develop efficient and effective MapReduce applications for Hadoop 2.x using the Hortonworks Data Platform. Students who attend this course will learn how to harness the power of Hadoop 2.x to manipulate, analyse and perform computations on their Big Data.
This course assumes students have experience developing Java applications and using a Java IDE. Labs are completed using the Eclipse IDE and Maven. No prior Hadoop knowledge is required.
Experienced Java software engineers who need to understand and develop Java MapReduce applications for Hadoop 2.x.
This course is designed for big data analysts who want to use the HBase NoSQL database, which runs on top of HDFS to provide real-time read/write access to sparse datasets. Topics include HBase architecture, services, installation and schema design.
Attendees must have basic familiarity with data management systems. Familiarity with Hadoop or databases is helpful but not required. Attendees new to Hadoop are encouraged to attend the Apache Hadoop Essentials course.
Architects, software developers, and analysts responsible for implementing non-SQL databases in order to handle sparse data sets commonly found in big data use cases.
This 2-day hands-on training course introduces students to the development of custom YARN applications for Apache Hadoop. The course covers details of the YARN architecture, steps involved in writing a YARN application, writing a YARN client and ApplicationMaster, and how to launch Containers.
Students should be experienced Java developers who possess Hadoop development knowledge plus an understanding of HDFS and the MapReduce framework.
Hadoop Developers who need to develop YARN applications on Hadoop 2.x by writing custom YARN clients and ApplicationMasters in Java.
This 3-day, hands-on training course introduces the fundamentals of Data Science and how to apply these concepts in Hadoop using Mahout, Pig, Python and machine learning libraries such as SciPy and Scikit-Learn.
Students must have:
• Experience with at least one programming or scripting language
• Knowledge of statistics and/or mathematics
• A basic understanding of big data and Hadoop principles
Architects, software developers, analysts and data scientists who need to understand how to apply data science and machine learning on Hadoop will get the most from this course.
This one-day course provides a technical overview of Apache Hadoop for decision makers and business users. Students will obtain a deeper understanding of what Big Data is, Hadoop 2.x, its architecture and the various technologies in the Hadoop ecosystem, and the business value that Hadoop provides.
No prior Hadoop knowledge is required. No programming experience is required. Students have the option of following along with hands-on demonstrations using the Hortonworks Sandbox.
Data architects, data integration architects, managers, C-level executives, decision makers, technical infrastructure teams, and Hadoop administrators and developers who want to understand the fundamentals of big data and Hadoop 2.x architecture.
At course completion, all students will have the necessary skills and information required to take the Hortonworks Apache Hadoop Certified Administrator, Certified Java Developer or Certified Developer exams which are administered via Kryterion.
Thank you very much for a very interesting and very well-conducted training. After some other vendors’ training, it was a nice change, as regards both content and quality of the teacher from Big Data Partnership.