What Macrometa Customers Are Building On The GDN

It's day four of Developer Week, and today we're focusing on customer stories: specifically, what people are building on the GDN today to solve real business problems that need a real-time edge.

The era of big data is now behind us, and the era of fast, global, real-time data has begun. Data has also become a polarizing phenomenon, raising questions of privacy and exploitation, and an area that nations and states are seeking to regulate to protect themselves from big tech.

There are three business drivers pushing enterprises to look beyond tooling for historical data and toward tooling for real-time data. In this post, we discuss how Macrometa's customers are taking advantage of the Global Data Network (GDN) to harness these business drivers:

Business Driver #1 - Data Monetization

Enterprises have realized that the hundreds of dashboards and thousands of reports their tools produce every day are ineffective and will never enable better decisions, grow revenue, or improve cost efficiency. For enterprises to take data to the bank, they need data and tools that enable better decisions by dealing with “what’s happening in the now,” not by analyzing what happened last month, quarter, season, or year (except for compliance purposes).

But there are structural barriers to the adoption of real-time tools. The tools an enterprise puts to work on data today, like data warehouses, data lakes, and databases, are backward-looking infrastructure: great at telling you what happened at noon last Tuesday, but with no ability to tell you what's happening now. Snowflake can't tell you what to show a visitor who just hit your website and will bounce 1.8 seconds later because they didn't see anything interesting. Nor can Databricks analyze the energy data coming off the PG&E grid and correlate it with electric vehicle data to help Tesla or Rivian figure out the least risky time to charge vehicles during a California heat wave without oversubscribing and bringing down the entire power grid.

Data monetization requires a toolchain that is “purpose built” for ingesting vast quantities of data in the “now” and turning it into business decisions (analytics and AI) and actions (actuations that trigger automated workflows).

Resulticks - a SaaS disruptor in real-time Customer Data & omni-channel marketing

Our customer Resulticks (short for Real-Time Results from Clicks) is doing exactly this by building a massively concurrent omni-channel customer behavior analytics and attribution layer on Macrometa. Resulticks is one of the fastest-growing SaaS companies in the omni-channel marketing and Customer Data Platform category. They came to Macrometa a year ago to build a small real-time web service for complex event processing (CEP): a verticalized marketing service that makes real-time offers to financial services customers. Having had enormous success with Macrometa, they are now looking to build substantial new real-time capabilities into their Customer Data Platform and data stack on Macrometa, with real time as the lead value proposition and differentiator. More importantly, forward-thinking SaaS disruptors like Resulticks recognize that data monetization needs new approaches that connect the dots of data into monetizable insight in 100 milliseconds or less, and Macrometa's real-time CEP framework is probably the only such platform on the market to date.

The team at Resulticks also has the hefty task of complying with a broad range of data handling and privacy regulations across different industries and geographies. Resulticks found that the Aerospike solution they deployed didn’t provide the speed and performance demanded by the marketers that use their real-time conversation platform worldwide. That’s when they turned to Macrometa.

“Our customer’s profitability and growth depend on our platform’s ability to go from data ingestion to action in milliseconds,” says MS Kumar, technical leader and architect at Resulticks. “This requires a data platform that’s not just ultra-fast but also versatile enough to handle complex data sets that combine streaming and structured data, enable real-time and batch queues, indexing, and data pipeline support, as well as messaging, in one integrated architecture.”
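
To make the ingest, analyze, act pattern Kumar describes more concrete, here is a minimal, hypothetical TypeScript sketch of a complex event processing rule: clickstream events arrive, a short per-user rolling window is evaluated, and an offer is triggered when the pattern matches. The event shape, the thresholds, and the `act` callback are illustrative assumptions, not Resulticks' product or Macrometa's actual CEP API.

```typescript
// Hypothetical sketch of an ingest -> analyze -> act CEP rule.
// Event shapes, thresholds, and the act() hook are illustrative
// assumptions, not Macrometa's or Resulticks' actual APIs.

interface ClickEvent {
  userId: string;
  page: string;        // e.g. "pricing"
  timestampMs: number;
}

interface Offer {
  userId: string;
  offerId: string;
}

const WINDOW_MS = 60_000;    // look at the last 60 seconds of behavior
const MIN_PRICING_VIEWS = 3; // rule: 3+ pricing views in the window

// Rolling per-user window of recent events, kept in memory for this sketch.
const recentEvents = new Map<string, ClickEvent[]>();

function ingest(event: ClickEvent, act: (offer: Offer) => void): void {
  const cutoff = event.timestampMs - WINDOW_MS;

  // Keep only events inside the window, then add the new one.
  const windowed = (recentEvents.get(event.userId) ?? []).filter(
    (e) => e.timestampMs >= cutoff
  );
  windowed.push(event);
  recentEvents.set(event.userId, windowed);

  // Analyze: has this user shown strong intent in the last minute?
  const pricingViews = windowed.filter((e) => e.page === "pricing").length;
  if (pricingViews >= MIN_PRICING_VIEWS) {
    // Act: trigger a real-time offer (in production this would be an
    // actuation into a downstream workflow, not a console log).
    act({ userId: event.userId, offerId: "premium-card-upgrade" });
    recentEvents.delete(event.userId); // avoid re-firing for the same burst
  }
}

// Usage: feed events as they arrive from the stream.
ingest(
  { userId: "u42", page: "pricing", timestampMs: Date.now() },
  (offer) => console.log("offer", offer)
);
```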

Business Driver #2 - Operational Analytics

Data lives fossilized as transactional data of record across hundreds, if not thousands, of silos in an enterprise. But data changes all the time in the real world: a website visitor bounces or becomes a prospect; a prospect becomes a customer with the click of a button that adds a payment to a subscription. Inside these silos, only the last action of a transaction and the state change between payment and order are captured, and all the metadata of what led to the transaction is lost forever. Transaction data fossilizes, frozen permanently in the amber of static transactional metadata, losing all relevance to the present and future because the operational metadata is gone.

For real-time operational analytics to be a reality, developers need tools and stacks for connecting operational data and metadata across boundaries and silos. They need a technology platform that provides a streaming causal graph for tracking data and metadata changes, with a built-in complex event processor and a real-time pipeline that recalculates the secondary and tertiary implications of this changing state as it happens. And finally, the platform must be able to put APIs on dynamic data to propagate it globally, across all the boundaries we discussed above: it should make data flow between disparate silos and systems, across data centers and clouds, and across supply chains of information between teams, departments, divisions, and partners.
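
As a rough illustration of what "recalculating secondary implications as state changes" might look like, here is a hypothetical TypeScript sketch in which derived metrics are updated incrementally the moment a state-change event arrives, rather than being rebuilt from a warehouse later. The event kinds and metric names are assumptions made for illustration, not part of any specific product.

```typescript
// Hypothetical sketch: derived metrics are recalculated incrementally as
// state-change events arrive, instead of being rebuilt from a warehouse later.
// The event kinds and metric names are illustrative assumptions.

type StateChange =
  | { kind: "visitor_to_prospect"; id: string }
  | { kind: "prospect_to_customer"; id: string; dealValue: number }
  | { kind: "prospect_lost"; id: string };

interface DerivedState {
  openProspects: number;
  customers: number;
  lostProspects: number;
  bookedRevenue: number;
  conversionRate: number; // customers / (customers + lost)
}

const state: DerivedState = {
  openProspects: 0,
  customers: 0,
  lostProspects: 0,
  bookedRevenue: 0,
  conversionRate: 0,
};

// Apply one event and immediately recompute the downstream numbers,
// while the operational metadata is still in hand.
function apply(event: StateChange): DerivedState {
  switch (event.kind) {
    case "visitor_to_prospect":
      state.openProspects += 1;
      break;
    case "prospect_to_customer":
      state.openProspects -= 1;
      state.customers += 1;
      state.bookedRevenue += event.dealValue;
      break;
    case "prospect_lost":
      state.openProspects -= 1;
      state.lostProspects += 1;
      break;
  }
  const closed = state.customers + state.lostProspects;
  state.conversionRate = closed === 0 ? 0 : state.customers / closed;
  return state;
}

// Usage: each event updates the derived view the moment it happens.
apply({ kind: "visitor_to_prospect", id: "p1" });
console.log(apply({ kind: "prospect_to_customer", id: "p1", dealValue: 1200 }));
```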

AppSecEngineer - Delivering AppSec threat simulation and remediation training in real time

Singapore-based AppSecEngineer provides labs, cyber ranges, and lessons on security-focused areas like Application Security, Cloud Security, Kubernetes Security, and more. Thousands of application security professionals use the AppSecEngineer platform every day to train on threat scenarios in the constant battle to beat bad actors who may take advantage of bugs, exploits, and vulnerabilities in these complex cloud environments.

To provide a real-time and highly realistic threat simulation and remediation lab, AppSecEngineer needs to ingest all kinds of data and enrich it for possible security threat events. Their platform captures high-velocity logs and events and transforms them into operational data that can be queried and investigated in milliseconds, from the time an event is generated to the time it needs to be acted on. AppSecEngineer initially used a full AWS stack composed of AWS Lambda, DynamoDB, S3, Kinesis, and CloudWatch for log streaming, alert processing, and so on.

However, they quickly realized that this was neither developer-friendly nor scalable. CloudWatch, while a great logging service, had some significant downsides for what they were trying to achieve. In the words of Abhay Bhargav, world-renowned security expert and CEO of AppSecEngineer, “The native query interface of CloudWatch was difficult to work with, especially for several thousand functions with multiple API Gateway instances that we need to query. We found this frustrating, to say the least.” Because CloudWatch imposes log limits and the AppSecEngineer team wanted to retain logs indefinitely, they also faced major cost challenges. They found dashboarding and alerting awkward to use with CloudWatch as well. All of this became a major bottleneck and a constantly growing management overhead as interest in and use of AppSecEngineer's platform exploded.

So they decided to replace CloudWatch with Macrometa. They store the streamed logs as structured JSON documents, use query workers to enrich those logs with additional context such as service information, and use our streaming graph engine to build graph relationships on the dataset in real time.
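
To illustrate the enrichment flow just described, here is a hypothetical TypeScript sketch in which a raw log event becomes a structured JSON document plus graph edges that connect IPs, functions, and gateways. The document shape, the service catalog lookup, and the edge naming are illustrative assumptions, not Macrometa's actual query worker API.

```typescript
// Hypothetical sketch of the enrichment step AppSecEngineer describes:
// a raw log event becomes a structured JSON document plus graph edges.
// Shapes and names are illustrative assumptions, not Macrometa's API.

interface RawLogEvent {
  functionName: string;
  apiGateway: string;
  sourceIp: string;
  statusCode: number;
  timestamp: string;
}

interface EnrichedLogDoc extends RawLogEvent {
  service: string;     // added context: which lab/service owns the function
  suspicious: boolean; // simple flag that graph queries can pivot on
}

interface GraphEdge {
  from: string; // e.g. "ip/203.0.113.7"
  to: string;   // e.g. "function/lab-auth-handler"
  label: string;
}

// Stand-in for a service catalog lookup that a query worker might perform.
const serviceCatalog: Record<string, string> = {
  "lab-auth-handler": "kubernetes-security-lab",
};

function enrich(event: RawLogEvent): { doc: EnrichedLogDoc; edges: GraphEdge[] } {
  const doc: EnrichedLogDoc = {
    ...event,
    service: serviceCatalog[event.functionName] ?? "unknown",
    suspicious: event.statusCode === 403,
  };

  // Edges let graph queries ask "which IPs touched which functions?"
  const edges: GraphEdge[] = [
    { from: `ip/${event.sourceIp}`, to: `function/${event.functionName}`, label: "invoked" },
    { from: `function/${event.functionName}`, to: `gateway/${event.apiGateway}`, label: "served_by" },
  ];
  return { doc, edges };
}

// Usage: each streamed log line is enriched on ingest; the document and edges
// would then be written to a collection and a graph respectively.
console.log(
  enrich({
    functionName: "lab-auth-handler",
    apiGateway: "api-gw-1",
    sourceIp: "203.0.113.7",
    statusCode: 403,
    timestamp: new Date().toISOString(),
  })
);
```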

As Abhay puts it, this capability was not just revolutionary but radically simple: “All of this is radically easy with Macrometa because all we have to do is create a query worker with specific parameters that we pass. This is an API that we invoke with the right parameters, and the data is ingested into Macrometa and automatically becomes a graph. As a result, we were able to get AppSecEngineer up and running very quickly on the Global Data Network,” he says. “Firstly, we are able to capture events quickly with zero human intervention. Secondly, with the graphing capability we are able to model complex relationships and identify security anomalies and events very quickly on our stack,” he continues.

The result of using Macrometa instead of AWS CloudWatch? “The much-needed observability into our stack allows us to out-think attackers without ever having to ‘manage’ things,” concludes Abhay.

Read Abhay's Developer Week Blog

Business Driver #3 - Context-Sensitive Analytics

Enterprises collect data as if it might go out of fashion at any instant. While analytics on historical data points can be useful for many use cases, the real power of data analysis and actuation is unlocked by enriching data with synthesized metadata about the relationships between historical data and events arriving in real time. This brings context to the data and enables contextual understanding and actuation based on similarities (is the data arriving now seasonal or anomalous?), constraints (does the data arriving now differ from what previous data sets led us to expect?), paths (does the data show us different paths to the same point for a customer?), and clustering around certain attributes (is the person in the chat window likely to buy a premium product or service, based on the question they are asking?).

The power of streaming graphs for this type of contextual enrichment cannot be overemphasized: by processing events as they arrive and building a context graph, enterprises can combine historical and real-time data into powerful cognitive systems for real-time decision support and actuation.
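
As a small, hypothetical TypeScript sketch of the "seasonal or anomalous" question above: an arriving reading is classified against a historical baseline for the same hour of day. The baseline table, metric name, and the 3-sigma threshold are illustrative assumptions, not a specific product API.

```typescript
// Hypothetical sketch of contextual enrichment: tag an arriving value as
// "seasonal" or "anomalous" by comparing it against a historical baseline
// for the same hour of day. Baseline shape and thresholds are assumptions.

interface Reading {
  metric: string;
  value: number;
  hourOfDay: number; // 0-23
}

interface Baseline {
  mean: number;
  stdDev: number;
}

// Historical context, e.g. precomputed from last quarter's data.
type BaselineTable = Record<string, Baseline[]>; // metric -> per-hour baselines

function classify(reading: Reading, baselines: BaselineTable): "seasonal" | "anomalous" {
  const baseline = baselines[reading.metric]?.[reading.hourOfDay];
  if (!baseline || baseline.stdDev === 0) return "anomalous"; // no context yet

  // 3-sigma rule as a simple "constraint" check against expectations.
  const zScore = Math.abs(reading.value - baseline.mean) / baseline.stdDev;
  return zScore <= 3 ? "seasonal" : "anomalous";
}

// Usage: a latency spike well outside the historical band gets flagged.
const baselines: BaselineTable = {
  "checkout-latency-ms": Array.from({ length: 24 }, () => ({ mean: 220, stdDev: 30 })),
};
console.log(classify({ metric: "checkout-latency-ms", value: 410, hourOfDay: 14 }, baselines));
```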

DevCycle - Re-imagining Feature Flags as a Real-Time Service

Toronto-based DevCycle has reimagined feature flag management for the real-time era. Its customers range from large international enterprises like Royal Bank of Canada (RBC) to fast-growing SaaS companies like Humi and Heliolytics, a drone technology provider for the renewable energy sector. DevCycle wanted to differentiate itself by bringing ultra-low-latency capabilities deep into its core architecture. So they invented EdgeDB: a lightning-fast, globally replicated edge storage engine that simplifies feature flagging in serverless environments and enables a new era of feature management with Edge Flags, Super Segments, and Data Residency. It is all built on Macrometa's Global Data Network, with higher-level SDKs and APIs for feature flags that use the power of the Global Data Mesh, Edge Compute, and Data Protection.

After initially trying to implement the EdgeDB vision with Cloudflare Durable Objects and being underwhelmed by the slow performance and high latencies to update and fetch data from that edge storage solution, DevCycle turned to Macrometa and the Global Data Network, and the results were, in their own words, “revolutionary.” Per Aaron Glazer, DevCycle's highly energetic and visionary CEO: “Data can now be saved as soon as it's available and accessed by lightning-fast APIs to make near-instantaneous flagging decisions. All engineers need to leverage that data is a simple User ID. This level of simplicity is unheard of. Feature flags have never been real time or this fast and seamless until now.”
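
To show the pattern Aaron describes, here is a minimal, hypothetical TypeScript sketch of a flag decision made from nothing more than a user ID, with segment and flag data assumed to be replicated close to the request. This is not DevCycle's SDK or Macrometa's API; the names and data shapes are assumptions for illustration.

```typescript
// Hypothetical sketch: a feature flag decision from a user ID against data
// replicated to the edge. Not DevCycle's SDK or Macrometa's API; names and
// shapes are illustrative assumptions.

interface FlagConfig {
  key: string;
  enabledForSegments: string[]; // e.g. ["beta-testers", "canada"]
  defaultOn: boolean;
}

// Stand-ins for an edge-replicated store: user -> segments, flag -> config.
const userSegments: Record<string, string[]> = {
  "user-123": ["beta-testers", "canada"],
};
const flags: Record<string, FlagConfig> = {
  "new-checkout": { key: "new-checkout", enabledForSegments: ["beta-testers"], defaultOn: false },
};

// All the caller needs is a user ID and a flag key; the segment data is
// already near the request because it is replicated to the edge.
function isEnabled(userId: string, flagKey: string): boolean {
  const flag = flags[flagKey];
  if (!flag) return false;
  const segments = userSegments[userId] ?? [];
  return flag.defaultOn || flag.enabledForSegments.some((s) => segments.includes(s));
}

// Usage
console.log(isEnabled("user-123", "new-checkout")); // true for a beta tester
```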

Read the DevCycle Developer Week Blog

Conclusion

Reading these fascinating scenarios of how our customers are solving problems that are impossible to address with centralized public clouds, I hope we have made a simple and powerful case that the edge can solve unique problems in a way the cloud cannot. Macrometa's differentiated architecture, a hyper-distributed Global Data Network and edge cloud for developers, enables a powerful and unique value proposition: take advantage of business drivers like data monetization, operational analytics, and contextual analytics to make your apps and APIs global and real time, so you can profit from the edge.

The future of the cloud is at the edge, and the edge is already available on Macrometa. Come build on the cutting edge with Macrometa, as Resulticks, AppSecEngineer, and DevCycle did.
