Getting Started with Streaming Analytics
- 1. Building an End-to-End Stream Application
- 2. Prepare Your Environment
- 3. Creating a Dataflow Application
- 4. Creating a Stream Analytics Application
- Two Options for Creating a Streaming Analytics Application
- Creating a Service Pool and Environment
- Creating Your First Application
- Creating and Configuring the Kafka Source Stream
- Connecting Components
- Joining Multiple Streams
- Filtering Events in a Stream using Rules
- Using Aggregate Functions over Windows
- Implementing Business Rules on the Stream
- Transforming Data using a Projection Processor
- Streaming Alerts to an Analytics Engine for Dashboarding
- Streaming Violation Events to an Analytics Engine for Descriptive Analytics
- Streaming Violation Events into a Data Lake and Operational Data Store
- 5. Deploy an Application
- 6. Advanced: Performing Predictive Analytics on the Stream
- Logistic Regression Model
- Export the Model into SAM's Model Registry
- Enrichment and Normalization of Model Features
- Upload Custom Processors and UDFs for Enrichment and Normalization
- Scoring the Model in the Stream using a Streaming Split Join Pattern
- Streaming Split Join Pattern
- Score the Model Using the PMML Processor and Alert
- 7. Creating Visualizations Using Superset
- 8. SAM Test Mode
- Four Test Cases using SAM’s Test Mode
- Test Case 1: Testing Normal Event with No Violation Prediction
- Analyzing Test Case 1 Results
- Test Case 2: Testing Normal Event with Yes Violation Prediction
- Analyzing Test Case 2 Results
- Test Case 3: Testing Violation Event
- Analyzing Test Case 3 Results
- Test Case 4: Testing Multiple Speeding Events
- Analyzing Test Case 4 Results
- Running SAM Test Cases as JUnit Tests in CI Pipelines
- 9. Creating Custom Sources and Sinks
- 10. Stream Operations