Edge Computing: What is it and why does it matter?

As the Internet of Things (IoT) expands with every new smartphone and Siri-enabled device, edge computing becomes a bigger part of the conversation as enterprises begin to deal with the massive data volumes these devices produce. Organizations may soon need to decide not just who will provide their data collection and processing services, but also where.

Edge computing means generating, collecting, and analyzing data at the edge of the network, where the data is produced, rather than in centralized servers and systems, commonly called the cloud.

While the concept of edge computing (sometimes referred to as fog computing) may not yet be top-of-mind for many, it will be soon. Experts predict that within the next two years, 5.6 billion IoT devices owned by enterprises and governments will use edge computing for data collection and processing, according to a Business Insider report. The accumulation of IoT data is predicted to reach about 507.5 zettabytes (1 zettabyte = 1 trillion gigabytes) by next year, according to Cisco, and the need to process that data at local collection points is what is driving fog computing.

Even if we are not aware of it, edge computing is already being used in devices we rely on every day, including smartphones, tablets, robotics, and automated machines used in manufacturing. But, most agree, we have only scratched the surface of its possible applications – and benefits.

“Reducing latency by a few seconds on a smartphone certainly improves the user experience,” states Jon Markman, president of Markman Capital Insight, in a recent Forbes article. “However, in a self-driving car, going 55 mph on a crowded road, those seconds could be the difference between life and death.”

According to TechRepublic.com, IoT brings “great promise” to the operational area, “where machine automation and auto alerts can foretell issues with networks, equipment, and infrastructure before they develop into full-blown disasters. For instance, a tram operator in a large urban area could ascertain when a section of track will begin to fail and dispatch a maintenance crew to replace that section before it becomes problematic. Then the tram operator could notify customers via their mobile devices about the situation and suggest alternate routes.”
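The tram scenario above captures the essential edge-computing pattern: process raw sensor readings locally on the device, and forward only a compact, actionable alert to the central system. A minimal sketch of that pattern might look like the following; the function names, threshold value, and units are hypothetical, purely for illustration.

```python
from statistics import mean

# Hypothetical wear threshold (illustrative units): averaged vibration
# readings above this value flag a track section for maintenance.
WEAR_THRESHOLD = 0.8

def check_track_section(section_id, vibration_samples):
    """Analyze raw sensor samples on the edge device itself.

    Returns a small alert dict only when the averaged reading crosses
    the threshold; otherwise returns None. The raw sample stream never
    has to travel to the cloud."""
    avg = mean(vibration_samples)
    if avg > WEAR_THRESHOLD:
        return {
            "section": section_id,
            "avg_vibration": round(avg, 3),
            "action": "dispatch maintenance crew",
        }
    return None

# Normal readings: nothing is sent upstream.
print(check_track_section("A12", [0.41, 0.39, 0.44]))  # None
# Elevated readings: a compact alert, not the raw stream, goes upstream.
print(check_track_section("B07", [0.92, 0.88, 0.95]))
```

The point of the sketch is the bandwidth and latency trade: the device does the filtering where the data originates, so the network carries kilobytes of alerts instead of a continuous firehose of raw measurements.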

Edge computing is also emerging in healthcare, as more organizations collect data from connected medical, IoT, and medical monitoring devices. According to Healthcare IT News, “Medical IoT devices are constantly collecting data and communicating with the network, and having actionable data in near real-time allows clinicians to make a more accurate diagnosis at the point of care, which can lead to a reduced number of return visits and save entities money.”

Cybersecurity figures on both sides of the edge computing debate, cited as both a pro and a con.

“Proponents of edge computing claim that computing at the edge is safer because data is not traveling over a network; however, opponents claim that edge computing is less secure because the IoT devices are more susceptible to begin with. Vendors that provide products and services at the edge say they are tackling the problem,” cmswire.com reports.

While fog computing spurs debate around cybersecurity, the future of the cloud, and the role of data centers (and endless other topics), few can argue that its adoption, in one form or another, is inevitable. The Internet of Things is growing exponentially, and it will need edge computing’s real-time information, particularly for applications such as self-driving cars or in medicine, when a split second can mean life or death.