
Edge Computing vs Cloud Computing


Cloud computing abstracts the application infrastructure that enterprises traditionally managed by placing server hardware in private data centers, replacing it with an infrastructure as a service (IaaS) implementation, such as a remote virtual machine, or a platform as a service (PaaS) model, such as a managed database service. Edge computing complements cloud computing by bringing cloud services close to end-user devices for data-intensive applications that require fast round-trip response times, which can't be guaranteed by a cloud computing service centralized in a single geographic region.

The following table summarizes how the two technologies compare. This free educational guide offers primers on the technologies covered in this article to help readers who are less familiar with distributed stream processing concepts.

| | Cloud Computing | Edge Computing |
|---|---|---|
| Time to insight | Delayed: Network latency between data centers and devices means it takes time to process the data and return insights. | Near real-time: Eliminates slow response times and provides near-real-time results. |
| Cost | High: Moving large amounts of data to data centers incurs high data transfer costs. | Low: Data is processed locally at the edge node, reducing backhaul costs. |
| Scalability | High: The cloud has virtually unlimited resources and can scale up or scale out as needed. | High (conditional): As long as the edge provider offers many points of presence. |
| Security | High: Cloud providers adhere to high security standards. | High: Edge providers adhere to high security standards. |
| Compliance | Complex: Compliance with new regulations (such as geofencing or depersonalization) requires due diligence and may incur additional expense. | Easy: Edge computing provides geofencing and processes data on local networks. |
| Data quality | High: Data is backhauled to a central location, so it requires less data synchronization. | High (conditional): Only if the edge provider offers conflict-free replicated data types (CRDTs). |
| Storage | Unlimited: The cloud provides virtually unlimited storage. | Unlimited: A qualified edge provider can offer virtually unlimited storage. |

Table 1. Comparison of Cloud and Edge computing

What Is Cloud Computing?

Cloud computing is the on-demand delivery of computing resources while abstracting the complexities of the underlying infrastructure from end users. Cloud computing systems are software-defined environments that offer computing services including servers, storage, networking, databases, software intelligence, analytics, and much more. The cloud is delivered over the internet and built on top of data centers or server farms. Instead of buying and maintaining hardware, organizations can consume services from a cloud provider as needed.

Some of the significant benefits of cloud computing include the following:

  • Cost: Cloud computing is cheaper because you pay for what you use rather than building and maintaining your own data centers.
  • Productivity: Data centers require a lot of maintenance, such as hardware setup and frequent software patches, to keep them up and running. With cloud computing, the team can focus on more important business goals and save the cost of specialized personnel.
  • Speed: Computing services in the cloud are self-service and on-demand, which means you can be up and running in a few seconds; for example, setting up a new server in the cloud requires just a few clicks.
  • Scalability: Cloud computing resources are elastic and easy to scale, whether that means adding more compute power, extra storage, or bandwidth. Furthermore, one can scale out close to customer bases across the globe. These days, major cloud providers even offer to scale out applications without any downtime.
  • Performance: Cloud vendors are typically connected across the globe using proprietary networks and regularly upgrade to the latest hardware, which means they can provide top-notch performance.

There are various “as a service” models in the cloud, such as IaaS, PaaS, and SaaS. Infrastructure as a service (IaaS) refers to renting IT infrastructure such as servers, storage, and virtual machines. IaaS is one of the most commonly used models in cloud computing; Amazon Web Services (AWS), Google Cloud Platform (GCP), and Microsoft Azure are some examples of IaaS providers. Platform as a service (PaaS) adds another abstraction layer, the operating system or runtime, on top of IaaS, offering a software platform as well as the hardware, as shown in Figure 1. Heroku, Windows Azure, Google App Engine, and SAP Cloud are examples of PaaS. Finally, software as a service (SaaS), also known as cloud application services, delivers a complete application from the cloud, as shown in Figure 1. With SaaS, the cloud provider manages the hardware, operating system, and application, with the application usually accessible via a web browser. In addition, the cloud provider handles all software updates. Some well-known examples here are Gmail, web-based Outlook, Dropbox, and Salesforce.
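
To make the IaaS self-service model concrete, here is a minimal sketch that provisions a virtual machine programmatically with the AWS SDK for Python (boto3); the region, AMI ID, and instance type are illustrative placeholders rather than recommendations.

```python
# Minimal IaaS provisioning sketch using the AWS SDK for Python (boto3).
# The region, AMI ID, and instance type are illustrative placeholders.
import boto3

ec2 = boto3.client("ec2", region_name="us-east-1")

response = ec2.run_instances(
    ImageId="ami-0123456789abcdef0",  # hypothetical machine image
    InstanceType="t3.micro",
    MinCount=1,
    MaxCount=1,
)

print("Launched instance:", response["Instances"][0]["InstanceId"])
```

With PaaS or SaaS, even this step disappears: the provider manages the servers, and you interact only with the platform or the application itself.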

Figure 1. IaaS, PaaS, and SaaS compared to a custom (self-managed) stack.

There are various types of cloud: public, private, and hybrid. The public cloud is the most common type, where computing resources are owned by a third party and can be used over the internet. Multiple organizations share all the resources (hardware, storage, and network devices) simultaneously. A private cloud is a set of computing resources owned and used exclusively by a specific organization. It can be hosted on-premises or by a third-party vendor but is accessible only on that private network. Private clouds are often used by financial institutions, government agencies, and other organizations with custom requirements for their cloud environment. Finally, a hybrid cloud is a combination of public and private clouds; the organization moves data between them using middleware or a virtual private network (VPN).

Challenges with Cloud Computing

Cloud computing has been designed with a centralized architecture in mind, where all the data is brought into a centralized data center for processing. As a result, it provides disaster recovery, scalability, and virtually unlimited storage and computation, enabling application development. However, there are use cases where such a centralized architecture doesn't perform well, and the network becomes a bottleneck.

The cloud's centralized approach simplifies the processing architecture, but the Achilles' heel of the cloud is the network. Centralizing data processing is counterbalanced by the need to transfer data over the internet, especially when scaled across geographies, and it can introduce synchronization issues between different data centers. Devices can generate terabytes of data to be moved over the network, which incurs cost and adds network delay.

The other challenge is response time: how long it takes for the cloud to return results based on the input data. Data is first uploaded to a centralized cloud, then processed, and finally a result is sent back to the device. Each step takes time.
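
As a rough back-of-the-envelope sketch (all the figures below are illustrative assumptions, not measurements), the round trip can be split into upload, network, and processing time, and moving the processing onto the local network removes the dominant term:

```python
# Back-of-the-envelope latency budget for a cloud round trip.
# All numbers are illustrative assumptions, not measurements.

payload_mb = 5.0        # data uploaded per request
uplink_mbps = 20.0      # device uplink bandwidth
network_rtt_ms = 80.0   # round trip to a distant cloud region
processing_ms = 40.0    # server-side processing time

upload_ms = payload_mb * 8 / uplink_mbps * 1000
cloud_total_ms = upload_ms + network_rtt_ms + processing_ms

# An edge node on the local network skips the long-haul transfer.
edge_rtt_ms = 5.0
edge_total_ms = edge_rtt_ms + processing_ms

print(f"Cloud round trip: {cloud_total_ms:.0f} ms")  # about 2120 ms
print(f"Edge round trip:  {edge_total_ms:.0f} ms")   # about 45 ms
```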

Imagine a smart car connected to the cloud and making decisions based on data transferred from its sensors. Suppose the vehicle has to make a critical decision: if it relies on the cloud, it must transfer a great deal of data for object recognition and then wait for the computation results. Many real-time applications like this are safety-critical and require answers in a small fraction of a second, which means it makes more sense to process the data locally.

Other use cases where cloud computing isn’t the optimal solution include content delivery networks, real-time safety monitoring, smart cities, and most importantly, the Internet of Things (IoT).

IoT is a set of physical devices or sensors that communicate and transfer data over the network without human-to-human or human-to-computer interaction. IoT growth has enabled data collection from connected devices and allows businesses to derive value from that data. It has enhanced business decision-making, helped businesses proactively mitigate risks, and, as a result, grown exponentially. However, it has the same challenge as the cloud in that a massive amount of data is moved from "things" (devices) to data centers, increasing cost, latency, and response time.

There was a dire need for an architecture that could quickly analyze data and provide better response time cost-effectively. This has led to various ways to tackle the cloud’s challenges, such as edge computing, fog computing, and mist computing.

Edge computing is one architecture that addresses the limitations of the centralized cloud and provides faster computation, more immediate insights, lower risk, more trust, and better security.

What Is Edge Computing?

Edge computing is a distributed framework that brings computation and storage close to the geographical location of the data source. The idea is to offload less compute-intensive processing from the cloud onto an additional layer of computing nodes within the devices’ local network, as shown in Figure 2. Edge computing is often confused with IoT even though edge computing is an architecture while IoT is one of its most significant applications.

Figure 2. Edge computing infrastructure.

Edge solutions provide low latency, high bandwidth, device-level processing, data offload, and trusted computing and storage. In addition, they use less bandwidth because data is processed locally: whereas cloud computing transfers all the raw data to a centralized data center, edge computing uploads only aggregated results to the cloud. Edge computing also provides better data security because only depersonalized data moves out of the local network.
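
As a minimal sketch of this pattern (the field names and the upload function are hypothetical), an edge node might reduce raw sensor readings to an anonymous aggregate and forward only that summary to the cloud:

```python
# Hypothetical edge-node sketch: aggregate raw readings locally and
# forward only a depersonalized summary to the cloud.
from statistics import mean

def summarize(readings):
    """Reduce raw per-device readings to an anonymous aggregate."""
    temps = [r["temperature"] for r in readings]
    return {
        "count": len(readings),
        "avg_temperature": round(mean(temps), 2),
        "max_temperature": max(temps),
        # Device IDs and other identifying fields are deliberately dropped.
    }

def upload_to_cloud(summary):
    # Placeholder for the real cloud API call (e.g., an HTTPS POST).
    print("Uploading summary:", summary)

raw_readings = [
    {"device_id": "sensor-1", "temperature": 21.4},
    {"device_id": "sensor-2", "temperature": 22.1},
    {"device_id": "sensor-3", "temperature": 20.9},
]

upload_to_cloud(summarize(raw_readings))
```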

Figure 3. Edge computing in a nutshell.

Edge computing exists in different forms, including the device edge and the cloud edge. With the device edge, processing happens on a machine with limited processing power located next to the devices. The cloud edge uses a micro data center to process data locally and communicate with the cloud. In some cases, endpoint devices are also capable of processing data natively and communicating directly with the cloud.

Examples

Autonomous cars generate four terabytes of data every few hours. In such a use case, cloud computing alone is not a viable solution: the network becomes a bottleneck, and cars need to act in a split second. Edge computing can come to the rescue here and complement cloud computing, with significant data processing happening at the edge nodes.
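
A quick back-of-the-envelope calculation shows why (the uplink speed below is an illustrative assumption): even over a fast connection, shipping that much raw data to the cloud takes far too long.

```python
# Back-of-the-envelope: how long to upload 4 TB of raw sensor data
# over a 100 Mbps uplink? (Illustrative assumption, decimal units.)

data_tb = 4
uplink_mbps = 100

data_megabits = data_tb * 1_000_000 * 8       # TB -> megabits
transfer_hours = data_megabits / uplink_mbps / 3600
print(f"{transfer_hours:.1f} hours")          # roughly 89 hours
```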

Similarly, edge computing is being used widely in augmented reality (AR) and virtual reality (VR) applications. A good example is Pokémon GO, where the phone does much of the processing while acting as an edge node.

Machine learning can benefit from the edge as well. For example, machine learning models are trained on the cloud using massive amounts of data, but once they are trained, they are deployed at the edge for real-time predictions.
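
As a minimal sketch (the model file and feature values are hypothetical), a model trained in the cloud can be copied to an edge node and used for local, low-latency inference:

```python
# Hypothetical edge-inference sketch: load a model that was trained in
# the cloud and run predictions locally on the edge node.
import joblib

# Assumes the trained model was exported in the cloud and copied to the
# edge node ahead of time.
model = joblib.load("anomaly_detector.joblib")

def predict_locally(sensor_features):
    """Run inference on the edge node, with no round trip to the cloud."""
    return model.predict([sensor_features])[0]

print(predict_locally([21.4, 0.93, 1012.0]))
```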

The Apple iPhone is an excellent example of an edge device taking care of privacy and security. It encrypts and stores the user's biometric information on the device itself, so it isn't uploaded to the cloud or any other central repository. In addition, it handles all authentication on the device, and only depersonalized information is shared with the cloud.

Voice assistants still use cloud computing, and it takes a noticeable amount of time for the end user to get a response after issuing a command. Usually, the voice command is compressed, sent to the server, uncompressed, processed, and the results are sent back. Wouldn't it be amazing if the device itself or a nearby edge node could process those commands and respond to queries in real time? It's possible to achieve such low latency using edge computing.

5G is also being rolled out, offering higher wireless network bandwidth than older technologies. Telcos need to deploy data centers close to their towers to complement their infrastructure with edge computing and avoid bottlenecks while processing the vast amounts of data generated by new 5G phones and tablets.

Finally, edge computing can be implemented inside enterprise networks or in factory buildings, trains, planes, or private homes. In that scenario, all the sensors are connected to a local edge node that processes the data from the connected devices before sending it to the cloud servers. Such a network is more secure and privacy-compliant because it sends only aggregated data with the personal information removed.

Usually, it’s an edge server on a local network that receives data from different devices and processes it in real-time. However, endpoint devices don’t have a great deal of processing power, and they have minimal battery capacity, so conducting any intensive processing on them can deplete their resources.

Challenges

Edge computing moves compute and storage to edge nodes, which offers geographically distributed data storage, state management, and data manipulation across multiple devices. To scale, edge locations must perform stateful computing and reconcile copies of data asynchronously, but synchronizing local data copies with peer edge locations is complex and requires specialized technology. Another challenge in developing applications capable of taking advantage of edge computing is the need to integrate various technologies, such as a NoSQL database, a graph database, application messaging, and event stream processing.
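
To illustrate the kind of reconciliation involved (a minimal sketch, not how any particular edge provider implements it), here is a grow-only counter, one of the simplest conflict-free replicated data types (CRDTs), which lets two edge locations merge their state in any order and still converge on the same value:

```python
# Minimal sketch of a grow-only counter (G-Counter), one of the simplest
# CRDTs. Each edge location increments only its own slot, so replicas can
# merge in any order without coordination and converge to the same total.

class GCounter:
    def __init__(self):
        self.counts = {}  # node_id -> local count

    def increment(self, node_id, amount=1):
        self.counts[node_id] = self.counts.get(node_id, 0) + amount

    def merge(self, other):
        """Element-wise maximum: commutative, associative, idempotent."""
        for node_id, count in other.counts.items():
            self.counts[node_id] = max(self.counts.get(node_id, 0), count)

    def value(self):
        return sum(self.counts.values())

# Two edge locations count events independently, then reconcile.
us_east, eu_west = GCounter(), GCounter()
us_east.increment("us-east", 3)
eu_west.increment("eu-west", 5)

us_east.merge(eu_west)
print(us_east.value())  # 8
```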

Conclusion

The idea of edge computing is to get closer to devices to reduce the amount of data that needs to be transferred, which results in better response time. It is not a replacement for the cloud, but it complements cloud computing by addressing some of its shortcomings for specific use cases. Edge computing systems only transfer relevant data to the cloud, reducing network bandwidth and latency and providing near-real-time results for business-critical applications.

Edge computing is evolving rapidly, and some in the industry believe that the cloud will be used only for massive computations and storage in the future, while all other data will be processed in edge data centers.

Learn more about how Macrometa's ready-to-go industry solutions offer edge caching and analytics to deliver actionable real-time insights.
