How do we build a scalable data ingestion process? Modern companies need platforms that can ingest and route gigabytes of data per second while handling interruptions gracefully and preserving the integrity of the data. We introduce event ingestors and stream processors as tools to accomplish this, and show you how to use Azure Event Hubs and Azure Stream Analytics to build a scalable pipeline. We conclude with a discussion of stream query languages, extensions of traditional SQL designed for the particular problems of high-velocity, high-volume data.
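
As an illustration of the ingestion side, below is a minimal sketch of publishing credit-card transaction events to an event hub with the azure-eventhub Python SDK; the connection string, hub name, and transaction fields are placeholders, not values used in the course.

    import json
    from azure.eventhub import EventHubProducerClient, EventData

    # Placeholders: supply your own Event Hubs namespace connection string and hub name.
    EVENT_HUB_CONN_STR = "<event-hubs-namespace-connection-string>"
    EVENT_HUB_NAME = "<event-hub-name>"

    def send_transactions(transactions):
        """Publish a list of transaction dicts to the event hub as a single batch."""
        producer = EventHubProducerClient.from_connection_string(
            conn_str=EVENT_HUB_CONN_STR, eventhub_name=EVENT_HUB_NAME
        )
        with producer:
            batch = producer.create_batch()
            for txn in transactions:
                # Each event carries one JSON-serialized transaction record.
                batch.add(EventData(json.dumps(txn)))
            producer.send_batch(batch)

    if __name__ == "__main__":
        send_transactions([{"card_id": "1234", "amount": 42.50, "merchant": "coffee-shop"}])

On the processing side, a Stream Analytics query written in its SQL-like stream query language might count transactions per card over one-minute tumbling windows; the input and output aliases below (CreditCardStream, CardActivityDashboard) and the EventTime field are hypothetical names standing in for the job's configured input, output, and timestamp column.

    SELECT
        CardId,
        COUNT(*) AS TransactionCount
    INTO CardActivityDashboard        -- output alias, e.g. a Power BI output
    FROM CreditCardStream             -- input alias, e.g. the event hub input
    TIMESTAMP BY EventTime
    GROUP BY CardId, TumblingWindow(minute, 1)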

Course Curriculum

Welcome
  • Learning Objectives 00:00:00
Data Ingestion
  • Credit Card Streamer 00:06:00
  • Setting up an Event Hub 00:18:00
  • Typical Event Processing 00:07:00
Stream Processing
  • Creating a Stream Analytics job 00:00:00
  • Setting up Inputs and Outputs 00:08:00
  • Credit Card Stream Dashboard 00:06:00
  • Using Power BI 00:16:00
