Build A Real-Time Log Analytics Dashboard With Fastly And Macrometa

POW! The challenges of log analytics

Real-time log analysis is complex and can require a large number of resources and tools. Maintaining visibility across all of those tools is an added challenge. At Macrometa, we give developers superpowers because we offer data storage and processing in one friendly interface. Tired of Kinesis or Splunk, or just want to explore new solutions in real-time log analytics? Macrometa and Fastly to the rescue!

Let’s talk about the technical architecture used to build a dashboard for real-time log analytics. This log analytics dashboard is part of a demo application called Edgely+ that Macrometa recently developed with our partner Fastly.

The log analytics dashboard is composed of our Stream Workers and Query Workers, a log capture service and a React JavaScript web app all built on Macrometa’s Global Data Network (GDN) and Fastly’s Compute@Edge. It leverages Macrometa’s pub/sub, event processing, and database capabilities to ingest web logs and process them to produce real-time charts and dashboards.

The volume of logs produced by a streaming service like Netflix or Disney+ would challenge anyone tasked with creating and supporting an analytics dashboard. They may end up just hoping a superhero will swoop in and save the day. Not all heroes wear capes and Macrometa is here to save the day!

Macrometa’s platform combines a performant database with stream processing capabilities for Complex Event Processing along with a globally-distributed architecture. This allows us to process and aggregate large volumes of log data in a single bound from across the internet with very low latency and a consistent database.

This blog will describe the architecture of the log analytics application and explain the various components of the solution. We'll also give you access to the GitHub repo so you can try it out for yourself!


Log analytics in action

Macrometa Fastly Log Demo Application

We have published the example log processing application code on GitHub and the repo’s README includes step-by-step instructions on how to install, configure, and run the demo both locally and on Fastly’s Compute@Edge infrastructure. The application consists of Stream Workers, Query Workers, Macrometa collections, a log capture service, and a web dashboard written in React.

Stream Workers

Stream Workers is Macrometa’s stream processing framework and can be used for data processing tasks including transformation, enrichment, and aggregation.

In our Fastly log processing application, we use three stream workers. One sends HTTP requests to the Fastly service every three seconds to generate logs to analyze and the other two are responsible for reading and aggregating those Fastly web logs.

Stream Worker fastly-log-generator

This Stream Worker uses triggers to make three HTTP requests to the Fastly service every three seconds.
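The trigger behaves roughly like the Python sketch below. The real worker runs inside Macrometa's stream processing engine, not as a standalone script, and the Fastly service URL is a placeholder; the `send` callable is injected so the loop can be exercised without network access:

```python
import time

def generate_requests(send, rounds, batch_size=3, interval=3.0, sleep=time.sleep):
    """Fire `batch_size` HTTP requests every `interval` seconds, `rounds` times.

    `send` is any callable that issues one request, e.g.
    lambda url: requests.get(url) -- injected here so the loop is testable.
    """
    url = "https://example-fastly-service.edgecompute.app/"  # placeholder URL
    for _ in range(rounds):
        for _ in range(batch_size):
            send(url)
        sleep(interval)
```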

Stream Worker fastly-http-request-worker

This Stream Worker reads Fastly logs from the `logs` collection and reorders any out-of-order entries. Based on each log's timestamp, it determines the 10-second window the log belongs to and adds that window timestamp as an additional field.
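Assigning a log to its 10-second window comes down to flooring the timestamp to the window boundary. A minimal Python sketch of the idea (field names here are illustrative, not the worker's exact schema):

```python
WINDOW_SECONDS = 10

def add_window_timestamp(log, window=WINDOW_SECONDS):
    """Return a copy of the log with the start of its time window attached.

    `log["timestamp"]` is assumed to be a Unix timestamp in seconds.
    """
    enriched = dict(log)
    enriched["window_ts"] = (int(log["timestamp"]) // window) * window
    return enriched
```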

Stream Worker fastly-http-request-stats-1m-worker

This Stream Worker reads logs from the `fastly-intermediate-stream` collection and performs windowed aggregation over response_status, url, response_body_size, and latency.
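Conceptually, the per-window aggregation looks like the following Python sketch. The real worker expresses this in Macrometa's stream processing language; the grouping keys and field names here are assumptions for illustration:

```python
from collections import defaultdict

def aggregate_window(logs):
    """Group logs by (window_ts, response_status) and compute the
    request count, total response body size, and average latency."""
    groups = defaultdict(list)
    for log in logs:
        groups[(log["window_ts"], log["response_status"])].append(log)

    stats = []
    for (window_ts, status), items in sorted(groups.items()):
        stats.append({
            "window_ts": window_ts,
            "response_status": status,
            "count": len(items),
            "total_body_size": sum(i["response_body_size"] for i in items),
            "avg_latency": sum(i["latency"] for i in items) / len(items),
        })
    return stats
```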

Query Workers

Query Workers allow developers to store parameterized Macrometa queries within the service and run them via a REST endpoint. Developers consuming data from Macrometa can thus retrieve results through this endpoint without providing the actual query, separating business logic from presentation code.
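Invoking a saved query then reduces to a POST against its endpoint. A hedged Python sketch of the client side (the endpoint path layout and query name below are assumptions for illustration, not the exact Macrometa API):

```python
def restql_endpoint(federation_url, fabric, query_name):
    """Build the REST URL for a saved query (path layout is an assumption)."""
    return f"{federation_url}/_fabric/{fabric}/_api/restql/execute/{query_name}"

def run_query(session, endpoint, bind_vars=None):
    """Execute the saved query with optional bind variables.

    `session` is e.g. a requests.Session with an Authorization header set.
    """
    return session.post(endpoint, json={"bindVars": bind_vars or {}})
```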

Our log monitoring application uses a set of Query Workers that return the aggregated statistics produced by the Stream Workers. The dashboard in the React application is populated with the results of these queries.

Log Capture Service

Fastly pushes web logs to a Python application that inserts them into a Macrometa collection using the Macrometa Python driver. Once in the Macrometa stream collection, Stream Workers process and aggregate the logs.
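Before handing logs to the Python driver, the capture service essentially parses Fastly's payload into individual documents. A sketch of that parsing step, assuming newline-delimited JSON log lines (the driver insert itself is elided, since the exact call depends on the pyC8 version in use):

```python
import json

def parse_fastly_payload(payload):
    """Turn a newline-delimited JSON payload from Fastly into a list of
    documents ready to insert into a Macrometa collection.

    Blank or unparsable lines are skipped rather than failing the batch.
    """
    docs = []
    for line in payload.splitlines():
        line = line.strip()
        if not line:
            continue
        try:
            docs.append(json.loads(line))
        except json.JSONDecodeError:
            continue  # drop malformed lines; a real service might log them
    return docs
```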

React App

The dashboard application is written in the React JavaScript framework. The react-app/src/app/services directory contains service methods that communicate with the Macrometa Query Workers to provide data for the log analytics dashboard. These methods use the jsC8 JavaScript driver for Macrometa.

Stay tuned for the next adventure…

We'll go into more detail on Stream Workers and Query Workers in future blogs. Request a trial and see everything you can do in one platform.
