You, This Course and Us
Learn By Example : Apache Storm
Quick Facts
Particulars | Details
---|---
Medium of instruction | English
Mode of learning | Self study
Mode of delivery | Video and text based
Course overview
Apache Storm is a free and open-source distributed real-time computation system. It does for real-time processing what Hadoop did for batch processing, making it simple to reliably process unbounded streams of data. Storm is easy to use and works with any programming language. The Learn By Example: Apache Storm online certification was designed by Loony Corn, a team of ex-Google, Stanford, and Flipkart employees, and is offered through Udemy.
Learn By Example: Apache Storm online training is designed for individuals looking for a practical training program that helps them master the principles and techniques of Apache Storm through real examples. The Learn By Example: Apache Storm online classes provide 4 hours of digital sessions supported by 15 downloadable study materials, covering topics such as application development, data processing, stream processing, distributed processing, grouping, parallelism, transformations, machine learning algorithms, and more.
The highlights
- Certificate of completion
- Self-paced course
- 4 hours of pre-recorded video content
- 15 downloadable resources
Program offerings
- Online course
- Learning resources
- 30-day money-back guarantee
- Unlimited access
- Accessible on mobile devices and TV
Course and certificate fees
Fees information
Certificate availability
Yes
Certificate providing authority
Udemy
Who it is for
What you will learn
After completing the Learn By Example: Apache Storm certification course, individuals will understand the strategies and concepts involved in using Apache Storm to develop applications, build a word count topology, and create a Storm topology using Python. They will explore the fundamentals of data streaming, data processing, distributed processing, and grouping with Apache Storm, and will learn about reliability, fault tolerance, the Twitter spout, parallelism, transformations, HDFS, and DRPC calls. In addition, individuals will gain knowledge of how to use machine learning algorithms in Storm applications.
The syllabus
Start Here
Stream Processing with Storm
- How does Twitter compute Trends?
- Improving Performance using Distributed Processing
- Building blocks of Storm Topologies
- Adding Parallelism in a Storm Topology
- Components of a Storm Cluster
Implementing a Hello World Topology
- A Simple Hello World Topology
- Ex 1: Implementing a Spout
- Ex 1: Implementing a Bolt
- Ex 1: Submitting the Topology
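The spout/bolt contract behind a Hello World topology can be sketched without the Storm API itself: a spout emits tuples one at a time, and a bolt consumes and transforms them. The following is a toy, in-process imitation of that data flow; the class and method names are illustrative, not Storm's actual Java interfaces (real spouts and bolts implement `IRichSpout`/`IRichBolt` and run distributed across a cluster).

```python
# Toy, in-process imitation of Storm's spout/bolt pattern.
# This only mirrors the data flow, not Storm's real API or distribution.

class HelloSpout:
    """Emits a fixed stream of greetings, one tuple at a time."""
    def __init__(self):
        self.items = ["hello", "world", "storm"]

    def next_tuple(self):
        # Storm calls next_tuple() repeatedly; None signals exhaustion here.
        return self.items.pop(0) if self.items else None

class PrintBolt:
    """Receives tuples and collects the processed output."""
    def __init__(self):
        self.out = []

    def execute(self, tup):
        # A real bolt would emit downstream; here we just transform and store.
        self.out.append(tup.upper())

def run_topology(spout, bolt):
    """Drive the spout until it is exhausted, feeding each tuple to the bolt."""
    while (tup := spout.next_tuple()) is not None:
        bolt.execute(tup)
    return bolt.out

print(run_topology(HelloSpout(), PrintBolt()))  # ['HELLO', 'WORLD', 'STORM']
```

In real Storm, the equivalent wiring is done with a `TopologyBuilder` that connects the spout and bolt before the topology is submitted.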
Processing Data using Files
- Ex 2: Reading Data from a File
- Representing Data using Tuples
- Ex 3: Accessing data from Tuples
- Ex 4: Writing Data to a File
Running a Topology in the Remote Mode
- Setting up a Storm Cluster
- Ex 5: Submitting a topology to the Storm Cluster
Adding Parallelism to a Storm Topology
- Ex 6: Shuffle Grouping
- Ex 7: Fields Grouping
- Ex 8: All Grouping
- Ex 9: Custom Grouping
- Ex 10: Direct Grouping
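A stream grouping decides which parallel bolt task receives each tuple. Two of the groupings above can be sketched as routing functions (the function names are illustrative, not Storm's API): shuffle grouping spreads tuples randomly, while fields grouping hash-partitions on a field so equal values always reach the same task.

```python
import random

# Toy illustration of how two Storm stream groupings route a tuple
# to one of num_tasks parallel bolt tasks.

def shuffle_grouping(num_tasks):
    """Shuffle grouping: tuples are distributed (pseudo-)randomly,
    balancing load across tasks."""
    return random.randrange(num_tasks)

def fields_grouping(field_value, num_tasks):
    """Fields grouping: tuples with the same field value always land
    on the same task (hash partitioning)."""
    return hash(field_value) % num_tasks

# The same word always routes to the same task within a run,
# which is what makes per-word counting in a downstream bolt possible.
assert fields_grouping("storm", 4) == fields_grouping("storm", 4)
```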
Building a Word Count Topology
- Ex 11: Building a Word Count Topology
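The classic word-count topology splits sentences into words in one bolt and tallies them in another. The per-bolt logic can be sketched independently of Storm (function names are illustrative); in a real topology the two bolts would run as separate parallel components connected by a fields grouping on the word.

```python
from collections import Counter

# Per-bolt logic of the word-count topology, outside Storm.

def split_bolt(sentence):
    """Split bolt: one sentence tuple in, one tuple per word out."""
    return sentence.lower().split()

def count_bolt(words, counts):
    """Count bolt: maintains a running per-word count."""
    counts.update(words)
    return counts

counts = Counter()
for sentence in ["the storm the storm", "apache storm"]:
    counts = count_bolt(split_bolt(sentence), counts)
print(counts["storm"])  # 3
```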
Remote Procedure Calls Using Storm
- Ex 12: A Storm Topology for DRPC calls
Managing Reliability of Topologies
- Ex 13: Managing Failures in Spouts
Integrating Storm with Different Sources/Sinks
- Ex 14: Implementing a Twitter Spout
- Ex 15: Using an HDFS Bolt
Using the Storm Multilang Protocol
- Ex 16: Building a Storm Topology using Python
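The Multilang protocol is what lets a Python component talk to Storm: messages are JSON objects exchanged over the shell component's stdin/stdout, each followed by a line containing only `end`. A minimal sketch of just that framing (the helper functions are illustrative; a real Python bolt would use a library such as `streamparse` or Storm's bundled `storm.py` module):

```python
import json

# Sketch of Storm's Multilang framing: one JSON object per message,
# terminated by a line containing only "end". This shows only the
# wire format, not a full shell bolt.

def encode_message(obj):
    """Serialize one multilang message for stdout."""
    return json.dumps(obj) + "\nend\n"

def decode_messages(stream):
    """Parse a stream of 'JSON\\nend\\n' frames back into objects."""
    for frame in stream.split("\nend\n"):
        if frame.strip():
            yield json.loads(frame)

wire = encode_message({"command": "emit", "tuple": ["hello"]})
print(next(decode_messages(wire))["command"])  # emit
```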
Complex Transformations using Trident
- Ex 17: Building a basic Trident Topology
- Ex 18: Implementing a Map Function
- Ex 19: Implementing a Filter Function
- Ex 20: Aggregating data
- Ex 21: Understanding States
- Ex 22: Windowing operations
- Ex 23: Joining data streams
- Ex 24: Building a Twitter Hashtag Extractor
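Trident's map, filter, and aggregate operations transform micro-batches of tuples. What each one does to a single batch can be sketched as plain batch functions (the function names are illustrative; Trident itself runs these as distributed operations with exactly-once semantics):

```python
# Conceptual sketch of three Trident operations on one micro-batch.

def map_fn(batch, fn):
    """Trident map: transform every tuple in the batch."""
    return [fn(t) for t in batch]

def filter_fn(batch, keep):
    """Trident filter: drop tuples for which keep() is False."""
    return [t for t in batch if keep(t)]

def aggregate(batch, init, combine):
    """Trident aggregate: fold the batch down to a single value."""
    acc = init
    for t in batch:
        acc = combine(acc, t)
    return acc

batch = [1, 2, 3, 4]
doubled = map_fn(batch, lambda x: x * 2)         # [2, 4, 6, 8]
evens = filter_fn(doubled, lambda x: x > 4)      # [6, 8]
total = aggregate(evens, 0, lambda a, b: a + b)  # 14
print(total)
```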