
When the web was slow and broke easily

In 1998, the web’s largest websites were straining to meet the demand of a new class of power users who had upgraded from dial-up and were now connecting to the internet over DSL and ISDN. No longer content to merely surf the internet (what a dumb term - Jean Polly, I will never forgive you for that), these users were downloading audio, video, and all kinds of big objects in a frenzied feeding on multimedia.

They were using web-based email like Hotmail (and my personal favorite, Rocketmail) and chat services to instant message, or IM, each other. Powering this explosive growth was a unique new infrastructure invented by Tom Leighton and Danny Lewin at a startup in Boston called Akamai. The brilliant algorithms and technology they invented became one of the most important pieces of critical infrastructure that enabled the web to become part of our daily lives.

Even though the data these websites served sat thousands of miles away in primitive, precambrian clouds run by companies like Exodus Communications, the Content Delivery Network, or CDN, enabled the web to scale to billions of people and made all these websites feel fast and instant. Had it not been for Tom and Danny, perhaps the web as we know it today would not exist. That’s how important the technology they invented, and Akamai, the company that commercialized it, are.

Today’s cloud is also slow and breaks easily

Fast forward a quarter century to today (November 9th, 2022), and the cloud is facing a similar struggle. The challenge of scaling and delivering data from cloud-native apps and APIs to billions of people - using apps on smartphones and tablets, interacting with content on smart TVs, smart refrigerators, and smart home appliances - still prevails. And not just that: folks are commuting in connected cars on rideshare services, getting food and groceries delivered by robots and drones, and getting around on electric scooters and bikes.

The cloud desperately needs its Akamai-like breakthrough moment, when a new technical infrastructure and paradigm enable the cloud to scale to serve data and apps to billions of humans, on tens of billions of devices, nearly instantly, anywhere in the world. That infrastructure and paradigm are the Global Data Network and edge computing.

The Global Data Network, or GDN, is a new kind of distributed computing infrastructure that solves the hard problems of distributing data and compute geographically across thousands of locations around the world. A Global Data Mesh can connect data from anywhere, on any system, in any format, and turn it into APIs. This can in turn feed edge compute and in-region processing and serving of apps, from the CDN to wireless networks, enabling seamless, private, and secure computing in compliance with sovereign data regulatory frameworks via a suite of Global Data Protection services.

Macrometa’s GDN is the edge computing platform designed to help developers build innovative new digital services that take advantage of time-sensitive events spread across global geographies, yet with hyper-proximity to where data is produced or needs to be consumed - all with the power and simplicity of a serverless API or SDK.

How Macrometa and Akamai are expanding the edge with a supercloud

Today, we are incredibly excited and pleased to announce that the OG of edge, Akamai, and the new upstart disruptor of edge, Macrometa, are partnering to create an Edge Cloud Super Platform. It amalgamates the deep capillary network of Akamai’s edge (4,200 locations) and its industry-leading network backbone with Akamai’s Linode cloud and Macrometa’s Global Data Network, giving developers a platform to build globally distributed apps and APIs.

By combining the power of Akamai's edge, Linode’s cloud, and Macrometa’s GDN via a set of product and technology integrations, enterprise developers can now seamlessly build apps across cloud and edge with a new level of simplicity and speed. Where the old cloud was slow and broke easily, the supercloud can power real-time apps anywhere. And a supercloud deserves a super platform for building those apps.

Macrometa, Linode and Akamai Compute

Macrometa has been a customer and partner of Linode for close to four years now. We adopted Linode early because we were impressed with the performance of Linode’s infrastructure, the very reasonable transfer costs of its dedicated network, and its impossibly amazing developer support.

Our use of Linode has grown over the years to the point where we started collaborating on bespoke customizations that, given our unique needs as a Global Data Network, perhaps only we require among all of Linode’s customers. With Akamai’s acquisition of Linode earlier this year, we were delighted to hear about Akamai’s plans to expand Linode from its current fewer than a dozen regions to potentially fifty regions worldwide and, more importantly, its plans for deep integration with Akamai’s backbone network and edge platforms.

This now means that our customers can leverage Macrometa to build apps that run at better-than-cloud economics in many more regions around the world on our scale tier. It also means that developers can leverage Macrometa on Linode for deep compute power to run CPU- and memory-intensive workloads that may not fit a JavaScript service worker model (like Akamai’s EdgeWorkers or Cloudflare Workers). We can build what we like to call poly-cloud apps - apps that straddle not just different cloud providers like AWS and Google but also span from cloud to edge across Linode and Akamai.

To enable customers to adopt the edge for enterprise apps, Akamai has announced the Linode RISE program, which provides $120K in annual usage credits for year one and usage discounts of 75%, 50%, and 25% in succeeding years. Customers can take advantage of Macrometa via the RISE program to build and deploy apps on the edge. Macrometa itself went through the RISE program a few years back and has firsthand experience of the power of the program.

Please watch the following video, in which Macrometa CEO and co-founder Chetan Venkatesh sits down with Akamai’s director of business development, Prenil Kottayankandy, to discuss how Akamai is scaling Linode to meet enterprise needs and the reasons for creating and then expanding the RISE program.

Integrating Akamai EdgeWorkers and the Macrometa Global Data Network

We are making a powerful and seamless direct integration between the Macrometa Global Data Network and Akamai EdgeWorkers available as a preview via the Macrometa console, with a general release planned for January 2023 that will also add support for a common API and SDK.

Macrometa enables developers to quickly and easily build APIs on data at rest via Query Workers and on data in motion via Stream Workers. These Query Workers and Stream Workers run as services inside our poly-model database, with data locality and low latency, and, more importantly, they are exposed externally as RESTful API endpoints.
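To make the idea concrete, here is a minimal sketch of what invoking a saved Query Worker as a REST endpoint might look like from a client. The host name, URL path, worker name, bind-parameter payload, and API-key header below are illustrative assumptions for this sketch, not Macrometa's documented URL layout; consult the Macrometa API reference for the real endpoint shapes.

```python
import json
import urllib.request

# Hypothetical values for illustration only - replace with your own.
GDN_HOST = "https://api.example-gdn.io"   # assumed GDN API host
WORKER = "getWatchHistory"                # an assumed saved Query Worker name
API_KEY = "your-api-key"

def build_invoke_request(user_id: str) -> urllib.request.Request:
    """Build (but do not send) an HTTP request that would invoke the
    Query Worker, passing bind parameters in the JSON body."""
    payload = json.dumps({"bindVars": {"userId": user_id}}).encode()
    return urllib.request.Request(
        url=f"{GDN_HOST}/api/restql/execute/{WORKER}",
        data=payload,
        method="POST",
        headers={
            "Authorization": f"apikey {API_KEY}",
            "Content-Type": "application/json",
        },
    )

req = build_invoke_request("user-42")
print(req.get_method(), req.full_url)
# To actually execute: urllib.request.urlopen(req)
```

The point of the pattern is that the query logic lives next to the data inside the GDN, while the client sees nothing but an ordinary authenticated HTTPS endpoint.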

With our new Akamai EdgeWorkers integration, we can automatically compile a Query Worker or Stream Worker into a native Akamai EdgeWorker and deploy it onto Akamai’s edge. All the developer needs to do is provide their Akamai credentials; with the click of a button, the API is generated as an Akamai EdgeWorker and automatically deployed to thousands of Akamai edge PoPs, where it queries and serves data from adjacent Macrometa regions on Linode. You can even provision, de-provision, debug, and log your EdgeWorkers directly from Macrometa (debugging and logging will arrive in the general release and are not available in the preview).

Demonstrating the power of the edge to enterprise developers

To showcase the power of our integration, Macrometa and Akamai have jointly developed a full video streaming app (think Netflix or Hulu) that runs on the Akamai edge and Linode, leveraging Macrometa, Linode, and Akamai EdgeWorkers. We chose this example to show how a complex, cloud-based backend composed of many microservices can be moved to the edge to power the range of APIs and web services behind the front-end experience of a video streaming app or Over The Top (OTT) video service. Netflix engineers have written extensively about their complex and arcane microservices architecture on AWS.

Our sample application suggests that much of that complexity and cost can be reduced, and performance improved by 25x to 50x per web request from client to backend and back, by leveraging Macrometa, Linode, and Akamai.

Macrometaflix OTT/Streaming Video Application

Northwinds EdgeWorkers Demo

Closing thoughts: The Edge is your oyster

Edge computing for the enterprise is finally here and now - with Macrometa and Akamai. The Global Data Network, or GDN, does for the cloud what CDNs did to scale the web. We haven't just built a new cloud - we've built a supercloud for next-gen apps.

Enterprise developers can now write applications powered by real-time data that run unfettered by the constraints of a centralized cloud deployment model, across a worldwide network of data centers and edge PoPs, delivering powerful new digital experiences - instantly, securely, and cost-effectively - at 1000x the scale of the cloud.

The future of the cloud is the edge, and the edge is already here with Macrometa and Akamai. Come build on the cutting edge with us. Excited to try it out? Sign up for a free playground account with Macrometa.

Read the press release.

Posted Nov 9, 2022 in Edge category

