Trenton Systems Blog

Edge Computing vs. Fog Computing: Is There a Real Difference?

Written by Brett Daniel | Oct 21, 2024 6:11:29 PM

Graphic: Edge computing is here to stay. Are you prepared for the changes it will bring?

Businesses and organizations are generating more raw data than ever before. In fact, they're generating so much that sending it all to the cloud for processing and storage has become a costly, inefficient endeavor.

The more data they send to the cloud for analysis and storage, the more money they spend transferring it. The surge in traffic also increases latency, which means slower response times for a business’s or organization’s human and physical capital - that is, the employees and autonomous machines using the insights derived from the raw data to do real work and make real decisions.


When businesses and organizations need to extract insights from this raw data almost instantaneously to remain competitive in an increasingly technology-driven global market, such an inefficient architecture becomes a liability that wastes both time and money.

And you know the old saying: time is money.

Enter edge computing/fog computing.

In this blog post, we'll provide a brief background on how the Internet of Things (IoT) and cloud computing are driving edge computing/fog computing, discuss the benefits of edge computing/fog computing, and talk about whether there's an actual difference between the two. Stick around.

Graphic: The Internet of Things (IoT) and inefficiencies in cloud computing are driving the adoption of edge/fog computing during the Fourth Industrial Revolution.

Background: the Internet of Things (IoT) & cloud computing are driving the edge

The advent of the Internet of Things (IoT) is responsible for businesses’ and organizations’ newfound influx of raw data.

Basically, devices and machines that we’ve used for decades are becoming increasingly equipped with sensors that collect nearby data and push it to cloud data centers, which then crunch this data to return insights that we, the users, can use for learning, research, and decision-making.

Smartwatches are probably the most relatable example. Smartwatches use sensors to measure and collect data about your body – your temperature and your heart rate, for example – but this is just raw data. It must be computed to give you the insight you desire, and this wouldn’t be possible without sending it to the cloud for analysis.
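
To make the gap between raw data and insight concrete, here's a minimal Python sketch. The heart-rate samples and the 100 bpm threshold below are made-up values for illustration, not anything produced by a real smartwatch or cloud service; the point is simply that the sensor outputs numbers, and some computation - wherever it runs - has to turn them into something a user can act on.

```python
from statistics import mean
from typing import List

# Hypothetical raw heart-rate samples (beats per minute) from a smartwatch sensor.
raw_heart_rate_bpm: List[int] = [62, 64, 61, 63, 118, 65, 62]

ELEVATED_BPM = 100  # illustrative threshold, not a medical guideline


def derive_insight(samples: List[int]) -> str:
    """Turn raw samples into the kind of insight a user actually sees."""
    avg = round(mean(samples))
    spikes = [s for s in samples if s > ELEVATED_BPM]
    if spikes:
        return f"Average heart rate {avg} bpm; {len(spikes)} elevated reading(s) detected."
    return f"Average heart rate {avg} bpm; no elevated readings."


print(derive_insight(raw_heart_rate_bpm))
```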

On a much larger scale than smartwatches, these sensors have also found their place in or on military and industrial facilities and equipment, such as tanks, weapon systems, surveillance systems, and manufacturing apparatuses, where their sole purpose is to collect and transmit data about the environment.

The type of data gathered by an IoT sensor varies by sensor type. These sensor types include:

  • Temperature sensors
  • Humidity sensors
  • Chemical sensors
  • Acceleration sensors
  • Velocity sensors
  • Proximity sensors
  • Water quality sensors
  • Infrared sensors
  • Light sensors
  • Smoke sensors
  • Gas sensors

And that’s not even the end of the list.

That’s a lot of data about a lot of different phenomena. In a particularly large business or organization employing lots of these sensors, how could all the data it collects possibly be transmitted to a cloud data center for analysis without a significant increase in data transfer costs and a delay in response times?

The short answer is that it can’t be, and that’s why edge computing, also known as fog computing, is becoming an increasingly popular choice for businesses and organizations that need a computing solution that is not only cost-effective but also virtually real-time.

Graphic: The edge and the cloud have an important symbiotic relationship. 

The benefits of edge computing/fog computing

IoT sensors collect raw data only. This data needs to be crunched into something usable for the end user, whether that’s an employee or a fully autonomous machine, and it needs to be crunched quickly so that businesses and organizations can remain competitive amid the ongoing Fourth Industrial Revolution.

Cloud data centers can crunch this data, of course, and they're a reliable solution for businesses and organizations with lower-volume networks, but as we’ve previously discussed, the cloud alone has become an inefficient computing paradigm for many complex business and organizational processes that absolutely require near-instantaneous response times.

Enter edge computing, also known as fog computing.

Establishing an edge computing architecture involves locating servers, commonly referred to as edge servers, closer to the data-generating IoT sensors that we discussed earlier.

These local servers run the applications that crunch this data and deliver user-oriented insights. In turn, less data travels to the cloud, so businesses and organizations save money on data transfer and improve response times.

Although some of this processed data can be stored at the edge, much of it is still sent back to the cloud for permanent storage. Remember, though, that this happens only after the raw data has been processed by edge servers.

So you’re effectively sending only important data back to the cloud instead of an endless stream of raw data that would cost more to transfer and drive up latency.

Think of edge computing as cloud computing’s helpful, fat-trimming aide in the field, pre-processing all that data and making it nice and tidy before it delivers it to the cloud.
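
To make that fat-trimming idea concrete, here's a minimal Python sketch of the kind of pre-processing an edge server might do. The sensor ID, alarm threshold, and send_to_cloud stand-in are hypothetical, and a real deployment would ingest readings over a protocol such as MQTT and push summaries to an actual cloud API, but the shape of the work is the same: reduce a batch of raw samples to a compact summary before anything crosses the WAN.

```python
import json
import statistics
from typing import Dict, List

# Raw readings as they might arrive from an IoT temperature sensor.
# In a real deployment these would stream in over MQTT, OPC UA, or similar.
raw_readings: List[float] = [21.4, 21.6, 21.5, 38.9, 21.7, 21.5, 21.6]

TEMP_ALARM_THRESHOLD_C = 35.0  # illustrative threshold, not from the article


def summarize_on_edge(readings: List[float]) -> Dict:
    """Reduce a batch of raw readings to a compact summary on the edge server."""
    return {
        "sensor": "temp-sensor-01",  # hypothetical sensor ID
        "count": len(readings),
        "min_c": min(readings),
        "max_c": max(readings),
        "mean_c": round(statistics.mean(readings), 2),
        "alarms": [r for r in readings if r > TEMP_ALARM_THRESHOLD_C],
    }


def send_to_cloud(payload: Dict) -> None:
    """Stand-in for an upload to cloud storage; here we just print the payload."""
    print("Forwarding to cloud:", json.dumps(payload))


if __name__ == "__main__":
    summary = summarize_on_edge(raw_readings)
    send_to_cloud(summary)  # only the summary crosses the WAN, not every raw sample
```

The edge server forwards one small summary instead of every raw sample, which is exactly where the data-transfer savings and latency improvements come from.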

Graphic: Edge computing and fog computing may seem like different architectures, but the truth isn't so clear.

Edge computing vs. fog computing: Is there an actual difference?

Whether there’s a difference between edge computing and fog computing depends on who you ask. Some say there are legitimate technical differences between the two, while others say the differences are purely semantic.

For example, Cisco, which coined the term "fog computing," has posited that "edge computing" refers merely to the concept of moving computational resources to or closer to data-generating devices, while "fog computing" refers to the actual implementation and management of this architecture at the edge of the cloud, a process known as "fogging."

Cisco has also categorized fog computing as just a type of edge computing technology.

To make matters even more convoluted, some industry blogs have posited that edge computing refers only to computation that occurs directly on smart devices, whereas with fog computing, the computation occurs within fog nodes or IoT gateways located in a business’s or organization’s local area network (LAN).

Here at Trenton Systems, when we use the term edge computing, we mean both. Our definition of edge computing is any data processing that’s done on, in, at, or near the source of data generation.

Scott Shadley, Vice President of Marketing at NGD Systems, a manufacturer of computational storage drives (CSDs), says that there really isn't a difference between edge computing and fog computing.

The fog was introduced by Cisco in 2014. It was the idea of 'cloud,’ then ‘fog,’ then ‘mist,’ as you came from the cloud to an edge to an endpoint. The term is still in use in some circles, but not as much as previously. For now, if you see 'fog,' you can simply say 'edge.'

       Scott Shadley, Vice President of Marketing, NGD Systems

Like Shadley, many maintain that there’s no real difference between edge computing and fog computing: the terms are interchangeable, and they refer to the same type of distributed computing architecture.

What do you think? Let us know in the comments below.

Photo: Trenton Systems' high-performance edge servers have proven highly successful for real-time military, industrial, and commercial computing applications.

Equipping your program or application with a reliable edge/fog server

Whether you refer to the architecture as edge computing or fog computing, the need remains the same: process data as close to its source as possible to lower costs and achieve near-real-time response.

Incorporating trusted, high-performance rugged servers closer to your IoT smart devices can help you do both, no matter the environmental conditions on land, in space, in the air, or at sea.

Trenton Systems' talented engineers are on standby to help you design a rugged computing solution for your unique edge computing application.