
Tutorial: Implement Complex Event Processing In Macrometa Within Minutes


When you think about data and event-driven analytics, it’s natural to think of stream processing as the most effective way to analyze streams of data in real time. Many organizations have used legacy event stream processing tools like Spark and Flink for the past decade. But what if your organization handles very complex events, like an insurance company processing thousands of claims each month?

Would stream processing still be a good fit if you were trying to track insurance scams across many different geographies? Or would Complex Event Processing (CEP) provide more accurate, real-time insights? With Macrometa, you can support CEP use cases like supply chain management as well as event stream processing use cases like orders and payments. Either way, Macrometa provides a CEP engine “out of the box” with an easy-to-learn SQL-like language, plus many extensions for different sources and sinks (Kafka, MQTT, etc.) and functions (math, PII, statistics, etc.).

In this blog post, we’ll take a look at Macrometa’s CEP engine, how to implement it, and its benefits. Read on to learn more!

How does complex event processing work?

CEP helps you automate certain processes, and even predict future outcomes, based on your historical and real-time data. By analyzing the events that trigger those outcomes - like understanding customer behavior patterns - you can make the most of your data.

In a nutshell, CEP helps you analyze streams of events from multiple data sources with very low latency to uncover patterns and derive insights. This way, you can act on events as soon as they happen. With Macrometa’s CEP engine, you can set up CEP faster than with traditional event stream processing tools like Spark or Flink. Macrometa Stream Workers offer a simple way of performing CEP: you can easily integrate streaming data, then analyze and correlate it in different ways - and take the appropriate actions - to meet your business goals.
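To make this concrete, here is a minimal sketch of what such a stream worker could look like, written in the same SQL-like language as the samples shown later in this post. The stream names, fields, and the 10,000 threshold are all hypothetical, and the exact filter syntax should be checked against the Stream Workers documentation:

```sql
@App:name('large-claims-detector')
@App:description("Illustrative sketch: flags unusually large claims in real time. Stream names and threshold are hypothetical.")
@App:qlVersion('2')

-- Incoming claim events (hypothetical source stream).
CREATE SOURCE STREAM ClaimStream (claim_id string, region string, amount double);

-- Alert events for downstream consumers (hypothetical sink stream).
CREATE SINK STREAM LargeClaimAlertStream (claim_id string, region string, amount double);

-- Filter the stream: only claims above the threshold become alert events.
INSERT INTO LargeClaimAlertStream
SELECT claim_id, region, amount
FROM ClaimStream
WHERE amount > 10000;
```

A downstream consumer - another stream worker, or an application subscribed to the sink stream - could then react to each alert the moment it is emitted.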

Exploring Macrometa Streams and the CEP architecture

The Macrometa CEP architecture is designed to work with real-time data. This includes data flows from IoT devices like cameras, weather stations, and other sensors. The most important and distinguishing features of Macrometa’s Global Data Network (GDN) streams are as follows:

1. Geo-replication: If you create a global stream, the published data is replicated in all regions, and you can simultaneously publish and consume data to and from any of them. This allows you to deliver low latency across all your locations and anywhere your customers interact with your business. The GDN also provides different ways to publish and consume stream events: a REST API, drivers for Python, Node.js, and Java, Stream Workers, and Pulsar-native clients.

2. Collection streams: You can create and use a stream on a collection, so every insert, update, or delete is published to the stream. One possible use case is change data capture (CDC).

3. Easy setup from and to different data sources: A stream worker’s data can come from different sources (GDN streams, the GDN database, HTTP endpoints, Kafka, MQTT, etc.) and can be published to different sinks - all “out of the box.” Stream Workers also allow you to transform data from one type to another, aggregate streams, clean and filter data, and more.

4. Objects created automatically: Stream Workers automatically create the objects they need, such as streams, collections, and even tables and remote databases. The CEP engine does this transparently for you behind the scenes.
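As a sketch of point 3 above, a stream worker could clean and aggregate sensor readings like this. The stream and field names are hypothetical, modeled on the samples shown later in this post, and the exact clause ordering should be verified against the Stream Workers documentation:

```sql
@App:name('temperature-aggregator')
@App:description("Illustrative sketch: filters out invalid readings and emits a sliding average. Stream names are hypothetical.")
@App:qlVersion('2')

-- Raw sensor readings (hypothetical source stream).
CREATE SOURCE STREAM TemperatureStream (sensor_id string, temperature double);

-- Cleaned, aggregated output (hypothetical sink stream).
CREATE SINK STREAM AvgTemperatureStream (sensor_id string, avg_temperature double);

-- Drop out-of-range readings and average the last 10 events per sensor.
INSERT INTO AvgTemperatureStream
SELECT sensor_id, AVG(temperature) AS avg_temperature
FROM TemperatureStream WINDOW SLIDING_LENGTH(10)
WHERE temperature > -50.0 AND temperature < 60.0
GROUP BY sensor_id;
```

The filter, window, and aggregation all run in the engine itself, so no separate cleaning job is needed before the data reaches its sink.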

Now, let’s look at how you can implement CEP in Macrometa within minutes.

Setting up Macrometa CEP For Simple Producer-Consumer

The best way to get started is to use the samples listed under Stream Workers in the GDN console. Every sample has a good description of the use case it covers and how to get it up and running.

To begin, log into the Macrometa platform and find the Stream Workers menu item on the left. Stream Workers only appears if the feature is already enabled for your account; you can enable it by contacting Macrometa support.

The producer events-generator below is based on the Sample-Cron-Trigger sample, which is a great one to start with since it is the simplest stream worker. It gives you a basic understanding of the stream worker structure and language. This stream worker generates a random value every second and publishes it as an event on SampleStream.

@App:name("events-generator")
@App:description("This app produces a random value as an event every second")
@App:qlVersion('2')
CREATE TRIGGER MyTrigger WITH ( interval = 1 sec );
CREATE SINK STREAM SampleStream (random_value long);
INSERT INTO SampleStream
SELECT math:round(math:rand()*1000) AS random_value FROM MyTrigger;

Once published, the events-generator starts creating and publishing events to SampleStream.


The second stream worker is a consumer of the events published by events-generator. It reads events from SampleStream, collects 10 sequential events in a sliding window, and processes them by retrieving the largest value.

@App:name('events-consumer')
@App:description("This stream worker consumes the trigger's events.")
@App:qlVersion('2')
-- Defines the `SampleStream` source.
CREATE SOURCE STREAM SampleStream (random_value long);
-- Defines the `SampleDestStream` sink.
CREATE SINK STREAM SampleDestStream (random_value_max long);
-- Data Processing
@info(name='Query')
INSERT INTO SampleDestStream
SELECT MAX(random_value) AS random_value_max
FROM SampleStream WINDOW SLIDING_LENGTH(10);

Finally, the result of the processed data is published to the SampleDestStream stream.


The other sample stream workers cover different use cases for reading (source), processing, analyzing, and finally writing, storing, or publishing (sink) the processed data. Go ahead and apply the ones that best fit your use cases from your GDN account or the Stream Workers documentation.

Getting started with Macrometa’s CEP

Now you know the steps to implement Macrometa’s CEP engine in your organization. With Stream Workers, you can process events in real time, trigger automated actions, and more. Overall, CEP can be an extremely powerful tool for event-driven analytics.

To get started with Macrometa and see Stream Workers in action, request a trial.

And to learn more about how complex use cases require CEP that goes a step beyond event streaming, join us for the webinar Complex Event Processing with Macrometa: What’s Next After Streaming Analytics with Apache Spark and Apache Flink.
