Photo: Edge computing is a major player in the Fourth Industrial Revolution, or Industry 4.0, placing computation at or near the data source to reduce latency and deliver real-time data processing and insights to enterprises.
Edge Computing Scenario
Imagine that the driver of a military vehicle needs to travel to a nearby outpost to respond to an unexpected attack.
The driver doesn’t know exactly where the outpost is located. He just knows it’s about 15 miles from the military base where he's currently stationed.
Thankfully, the driver’s vehicle is equipped with an Internet of Things (IoT) virtual assistant that provides him with real-time navigation, geographical information and even weather-related updates.
This nifty assistant transmits the driver's requests for information to a distant server at a centralized cloud data center located thousands of miles away.
In turn, the cloud server uses this device-generated data to compute the information the driver requested.
Photo: A data center with multiple rows of fully operational server racks
Normally, this data is relayed back to the virtual assistant almost instantaneously. The assistant then uses it to guide the driver to the nearby outpost, all the while highlighting any obstacles or conditions that may impede his travel or jeopardize his safety.
But there’s a problem this time.
The assistant is silent. Buffering.
Retrieving the requested information is taking longer than usual.
Why?
Because hundreds of other IoT devices at the base – wearables, security cameras, drones, weapons systems, smart speakers and smart appliances – are also transmitting data to the cloud using the same connection, resulting in a network slowdown of nightmarish proportions.
As a result, the driver is experiencing a delay, or latency, in his device’s response time.
And in turn, the driver cannot receive geographical information or directions to the besieged outpost.
At least, not in a timely manner.
So, what’s the solution?
Enter edge computing.
Infographic: An illustration of an edge computing architecture
Edge computing is a type of network architecture in which device-generated data is processed at or near the data source.
Edge computing is implemented through integrated technology installed on Internet of Things (IoT) devices or through localized edge servers, which may sit inside smaller cloud-based data centers known as cloudlets or micro data centers.
You can think of edge computing as an extension of, or complement to, the cloud computing architecture, in which data is processed and stored at data centers located hundreds of miles, or even continents, away from a given network.
Essentially, edge computing offloads part of a cloud server's data processing workload to an integrated or localized computer close to the data-generating device. This mitigates the latency caused by transferring large volumes of data to the cloud.
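To make that division of labor concrete, here's a minimal sketch in Python. The sensor readings, threshold and function names are hypothetical; the point is simply that the localized computer condenses raw readings near the device, so only a compact summary travels to the distant cloud.

```python
import json
import statistics

# Hypothetical raw readings from an IoT sensor (e.g., temperature samples).
# In a cloud-only design, every reading would be transmitted upstream.
raw_readings = [21.4, 21.5, 21.5, 21.6, 29.8, 21.5]

def process_at_edge(readings):
    """Reduce a raw sensor stream to a compact summary near the data source."""
    return {
        "count": len(readings),
        "mean": round(statistics.mean(readings), 2),
        "max": max(readings),
        "anomalies": [r for r in readings if r > 25.0],  # flag outliers locally
    }

def send_to_cloud(payload):
    """Stand-in for an upstream transfer (e.g., an HTTPS POST to a data center)."""
    print(f"uploading {len(json.dumps(payload))} bytes:", payload)

# Only the small summary crosses the network, not every raw reading.
send_to_cloud(process_at_edge(raw_readings))
```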
One example of these localized computers is an edge server.
Photo: Edge servers crunch data from IoT devices to give enterprises the insights they need to achieve their goals faster and more efficiently.
An edge server is a computer that’s located near data-generating devices in an edge computing architecture.
It processes data close to the source, reducing transmission latency and filtering out unimportant or irrelevant data before the remainder is sent to the cloud for storage.
Edge servers act as intermediaries between a network and a cloud data center, absorbing a portion of an IoT device's data processing workload and delivering results in real time.
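As a rough illustration of that filtering role, the Python sketch below invents a message format and a motion threshold; a real edge server would apply the same pattern to camera frames or sensor telemetry, forwarding only the readings worth keeping.

```python
from dataclasses import dataclass

# Hypothetical message format for cameras reporting to an on-site edge server;
# the field names and threshold are illustrative, not a real protocol.
@dataclass
class Reading:
    device_id: str
    motion_score: float  # 0.0 = static scene, 1.0 = heavy motion

MOTION_THRESHOLD = 0.3  # below this, the frame is considered uninteresting

def filter_at_edge(readings):
    """Keep only readings worth storing; the rest never leave the site."""
    return [r for r in readings if r.motion_score >= MOTION_THRESHOLD]

incoming = [
    Reading("cam-01", 0.02),
    Reading("cam-01", 0.71),  # motion detected: forward to the cloud
    Reading("cam-02", 0.05),
]

to_cloud = filter_at_edge(incoming)
print(f"forwarding {len(to_cloud)} of {len(incoming)} readings upstream")
```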
Examples of edge servers include rugged servers, which are used in the military, manufacturing, and other industries.
The computational offload achieved by the edge computing architecture, in conjunction with the resilience and processing power of a high-performance rugged server, can make for quite a powerful combination at the edge.
Photo: Soldier wearables are just one of many edge computing examples.
Edge computing architectures appear in many military, commercial, and industrial applications today.
Examples include IoT sensors attached to soldier wearables and battlefield systems, systems on offshore oil rigs, modern cars and self-driving vehicles, security and assembly-line cameras, and virtual assistants, as well as the edge servers that crunch the data and measurements from these devices to provide insights to their users.
There are four main benefits of establishing an edge computing architecture: reduced bandwidth use, latency reduction, cost savings, and improved security and reliability, especially when supporting next-gen technology like 5G.
We detail each of these four benefits below:
Consider the scenario from earlier, involving the military vehicle driver and the IoT assistant.
Because there were so many additional IoT devices transmitting data to the cloud, the base's network was temporarily overloaded, causing the driver to experience a delay in response time.
Not to mention, the data required to fulfill all those requests was being computed thousands of miles away at a cloud data center, instead of on an integrated sensor or chip, or on a server at the edge of the network.
These latency issues could have been mitigated if the data had been processed using an edge computing architecture.
Instead of funneling a hefty stream of data to the centralized data center for storage and computation, the IoT devices on the network could have stored and processed a portion of that data locally.
In addition, the base would have cut its operational costs by using less bandwidth and reduced the impact of potential security or maintenance-related issues.
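Those bandwidth savings are easy to ballpark. The short Python calculation below uses assumed device counts and data rates purely for illustration, not measured figures from any real deployment.

```python
# Back-of-envelope arithmetic for the bandwidth benefit described above.
devices = 300               # IoT devices on the base's network (assumed)
raw_rate_kbps = 500         # each device streaming raw data to the cloud (assumed)
summary_rate_kbps = 5       # each device sending edge-processed summaries (assumed)

cloud_only = devices * raw_rate_kbps / 1_000     # total uplink in Mbps, no edge
with_edge = devices * summary_rate_kbps / 1_000  # total uplink in Mbps, with edge

print(f"cloud-only uplink: {cloud_only:.0f} Mbps")
print(f"with edge compute: {with_edge:.1f} Mbps")
print(f"uplink traffic reduced by {1 - with_edge / cloud_only:.0%}")
```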
Edge computing is unlikely to replace cloud computing entirely.
Applications and devices that don’t require real-time data processing or analysis are likely to still use the cloud for storage and processing.
As the number of IoT devices increases, so, too, will the amount of data that needs to be stored and processed.
If businesses and organizations don't switch to an edge computing architecture, their chances of experiencing latency in applications requiring real-time computation will increase as the number of IoT devices on their networks grows. In addition, they'll spend more money on the bandwidth necessary to transfer all that data.
Edge computing is an extension, rather than a replacement, of the cloud. And as more and more devices begin to use cloud data centers as a processing resource, it’s clear that edge computing is the future, at least if you want your program or application to function seamlessly, efficiently and affordably.
For more information about acquiring edge computing solutions for your program or application, reach out to Trenton Systems. Our engineers are on standby.