
This blog post is written by DevCycle, who partnered with Macrometa to launch the world's first and fastest Feature-Flag-as-a-Service (all from the Edge). Let's hear from them on why they chose Macrometa and how they built an ultra-low-latency global feature flags solution using Macrometa.

A Familiar Problem Reappears

The idea of saving user data to the server is not a new concept. It’s a tried-and-true archetype that is used widely. Our old architecture was built on this premise: a user is introduced into the system through our SDKs, the server receives the request, saves that user to our database, and returns the user their configuration with all of the experiments and feature flags associated with that user.

At DevCycle, we reimagined our systems to separate that logic by creating a service that buckets users into experiments and feature flags. This service lives on edge servers hosted by Cloudflare, meaning any request sent to these workers is directed to the closest node in the Cloudflare network with a lightning-fast response (think ~50ms). This is a step up from our old response times, which averaged around the 150ms mark.

The caveat of this fast response time was that we could not store user data in traditional databases, as we considered response time paramount. Our old system fetched user data from the database to use in bucketing decisions, but doing so would add significant time to the request.

Imagine having a feature flag gating a feature on your landing page; a fast response time could mean the difference between your user seeing the new feature or seeing a blank page while the request waits to finish.

Instead of saving user data, we initially opted for an approach where the developer passes the user data in every request, eliminating the need to store it since it is available in the request itself. This was a satisfactory solution, but it made using the tool somewhat restrictive: you’d have to pass all the user data in every request to get features, rather than piecemeal as the user triggers changes to their attributes. Not having to pass in all of their details on every request is the explicit goal, but how do we get there?
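To make the restriction concrete, here is a minimal sketch (the function and attribute names are hypothetical, not DevCycle's API) of what the original approach required: with no server-side user store, the complete user object has to travel with every single request.

```javascript
// With no stored user data, every attribute used for bucketing must be
// present on every call -- anything omitted is invisible to the bucketing
// service for that request.
function buildVariablesRequest(user) {
  return {
    user_id: user.user_id,
    email: user.email,
    country: user.country,
    customData: user.customData ?? {},
  }
}

// The full user object must be rebuilt for each request, even if nothing
// about the user has changed since the last one.
const payload = buildVariablesRequest({
  user_id: 'user-123',
  email: 'jane@example.com',
  country: 'CA',
  customData: { stars: 42 },
})
```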

Exploring New Frontiers

We explored several different options when considering a solution to storing user data. We had specific criteria when researching the market for solutions, but the most important to us was having low latency read times.

We boiled it down to two options:

  • Macrometa
  • Cloudflare’s Durable Objects

Durable Objects seemed like a good option, touting a low-latency, strongly consistent key-value storage solution. The tradeoff, however, is that the Object being updated lives in the data center where it was first created; after creation, the data currently cannot be moved to a new region. This is fine if the user creating the data remains in the same geographical location and never moves, but it breaks down as soon as they attempt to access their data from anywhere other than where they initially created it. Over time, average performance would decline as users move around the world.

Macrometa boasts low-latency round trips, with average response times of less than 50ms, at the cost of replication lag between regions around the world, typically less than 300ms. We decided this was worth the tradeoff to maintain faster average read performance.

Implementation was straightforward. We created an endpoint for our SDKs to use when a developer has EdgeDB enabled as a feature, taking the user data as the request body, with simple authorization handled by reading a client key associated with a DevCycle project from the request header. Macrometa’s library was easy to integrate into our Cloudflare Workers, and their docs were concise and easy to follow.
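A rough sketch of that endpoint's shape, under the assumptions stated in the paragraph above (this is illustrative, not DevCycle's actual implementation; an in-memory Map stands in for the Macrometa document collection, and all names are hypothetical):

```javascript
// Stand-in for the geographically-replicated Macrometa collection.
const edgeDbStore = new Map()

function handleSaveUser(request) {
  // simple authorization: a client key read from the request header
  const clientKey = request.headers['Authorization']
  if (!clientKey) {
    return { status: 401, body: 'missing client key' }
  }

  // the user data to persist arrives as the request body
  const user = request.body
  if (!user || !user.user_id) {
    return { status: 400, body: 'user_id is required' }
  }

  // upsert the user document keyed by user_id
  edgeDbStore.set(user.user_id, user)
  return { status: 200, body: user }
}
```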

We’ve also added an option in the SDKs to turn EdgeDB on, which tells the API to use the EdgeDB user data when fetching user variables and to combine it with any user data sent up in the request. You can find the public API in our docs.
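The combining step can be pictured as a simple merge. This sketch assumes (our assumption, not documented behavior) that attributes sent in the request take precedence over stored EdgeDB attributes, with customData merged key by key:

```javascript
// Merge the stored EdgeDB user document with the user data sent in the
// request; request attributes win on conflict, customData is merged.
function combineUserData(edgeDbUser, requestUser) {
  return {
    ...edgeDbUser,
    ...requestUser,
    customData: {
      ...(edgeDbUser.customData ?? {}),
      ...(requestUser.customData ?? {}),
    },
  }
}

// The request only sends a new attribute, yet the combined user keeps the
// country and star count previously saved to EdgeDB.
const combined = combineUserData(
  { user_id: 'user-123', country: 'CA', customData: { stars: 42 } },
  { user_id: 'user-123', customData: { plan: 'free' } },
)
```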

Finally, we have a low-latency solution for reading and writing user data to a geographically-replicated data store, and a publicly accessible endpoint to save user data!

Okay, but how is a low latency solution useful?

A feature is only as outstanding as its use case, so how would you use EdgeDB?

Let’s imagine you’re creating a product that is an online storefront for items regularly sold at a brick-and-mortar store. Maybe it could be artisanal coffee drinks. Let’s call it “Planet Dollars”.

Now “Planet Dollars” will have an accompanying app that you can download on your phone and a website to order ahead and pick up your drink at a store.

You want to give members an incentive to keep spending money by giving them stars on every purchase. Once they accrue a certain number of stars, they’ll achieve a “Platinum” status on their account, unlocking new functionality in the app. The user orders on the website, and the request is sent to your backend:

router.post('/purchase', async (event) => {
  ...

  const purchasedItems = await purchaseItems(userId, items)

  const user = {
    user_id: userId,
    customData: {
      stars: purchasedItems.length
    }
  }

  // update the user's star count through our EdgeDB API
  await axiosClient.request({
    method: 'PATCH',
    url: `https://sdk-api.devcycle.com/v1/edgedb/${encodeURIComponent(user.user_id)}`,
    data: user,
    headers: { Authorization: '<YOUR-CLIENT-KEY>' }
  })

  ...
})

Then, create a feature on DevCycle that segments users into the “Platinum Status” feature, where only users with a certain number of stars earned get to see the feature:
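The targeting rule itself is configured in the DevCycle dashboard, but its effect can be sketched as a simple predicate (the 100-star threshold is just the illustrative number used in the login example's comments; the function name is hypothetical):

```javascript
// A user is targeted into the "Platinum Status" feature once their
// stars custom attribute reaches the configured threshold.
function qualifiesForPlatinum(user, starThreshold = 100) {
  const stars = user.customData?.stars ?? 0
  return stars >= starThreshold
}
```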

Finally, a user can log in to the mobile app using the same account as on the website. This means the stars tracked from purchases on the web storefront are associated with the user when they log in to the mobile app with the same user_id, and the app can show a new experience if they are targeted into the feature:

func login(userId: String) {
  // your log in logic
  ...

  // identify the new userId in DevCycle and grab their features
  let user = try DVCUser.builder()
                        .userId(userId)
                        .build()

  try dvcClient.identify(user: user) { error, variables in
    // check if the user gets the platinum status feature by accruing
    // enough stars (i.e. stars >= 100 in the example above)
    let platinumStatus = dvcClient.variable(key: "platinum-status-feature", defaultValue: false)
    if (platinumStatus.value == true) {
      // they qualified, show the user platinum status UI!
      showPlatinumStatusUI()
    }
  }

  ...
}

It’s that simple! With EdgeDB, you’ll have a system that can update users on the backend, frontend, and mobile applications, and have those same users consistently get features across all your platforms using only a user_id as an identifying attribute.

Check out our docs if you want to learn more about EdgeDB and DevCycle. 

Read the press release:

Macrometa and DevCycle Join Forces on the World's First and Fastest Feature-Flag-as-a-Service (all from the Edge)

Posted Sep 15, 2022 in the Announcements category
