Edge Computing: What a New Report Says You Need to Know

If you want to future-proof your technology job, you need to quickly familiarize yourself with Edge computing. Cloud computing is about to be superannuated, and Edge computing, which brings both computation and storage closer to the consumer, is a likely future for businesses that want greater speed (and for massive amounts of data to travel shorter distances).

In a new report on the coming opportunities for Edge computing, Barclays analysts Tim Long and Chang Liu predict the Edge market will be worth around $20 billion in hardware, software and services by 2023. If you want to get in early, here’s what you need to know.

What is Edge computing?

Edge computing means that “a significant portion of worldwide data traffic and workload will be processed and stored at the ‘Edge,'” as Long and Liu state in the report. This means “locations close to end users, as opposed to centralized hyperscale data centers whose service areas are measured in the thousands of square miles.”

There’s some debate as to what constitutes “the Edge.” It might be a cellphone or a cell tower; it might be a micro datacenter attached to a cell tower; it might be a computing device inside an autonomous vehicle; or it might even be a “ruggedized data center deployed at a power plant or manufacturing facility,” according to the report. Whatever form “the Edge” takes, its relative closeness to the consumer of that compute has strong benefits.

What’s Edge computing used for?

Barclays sees some of the biggest opportunities being in utilities and agriculture. The Internet of Things (IoT), autonomous driving, cloud gaming, and virtual reality are also expected to be key users of Edge technologies, as all need massive amounts of data delivered at considerable speed.

How does this kind of computing differ from cloud computing?

Right now, cloud computing relies upon hyperscale, centralized datacenters. With Edge computing, the bulk of this computing power is pushed to the edge of the network. “Edge computing significantly shortens the distance between end user devices and data processing and storage resources that serve them” is how Long and Liu define it. This reduces the need to transmit everything back and forth to the centralized cloud, where most workloads are processed right now. 
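
To put that distance argument in concrete terms, here is a rough, illustrative calculation (not from the Barclays report) of propagation delay alone, assuming a signal speed of roughly 200,000 km/s in optical fiber and using hypothetical distances:

```python
# Back-of-envelope sketch: why shortening the distance to compute matters.
# Assumes ~200,000 km/s signal speed in optical fiber; the distances are
# hypothetical and ignore processing, queuing, and routing overhead.

FIBER_KM_PER_MS = 200  # roughly 200 km per millisecond in fiber


def round_trip_ms(distance_km: float) -> float:
    """Propagation delay for one request/response round trip."""
    return 2 * distance_km / FIBER_KM_PER_MS


print(f"Centralized cloud, 1,000 km away: {round_trip_ms(1000):.1f} ms")  # ~10 ms
print(f"Edge site, 10 km away:            {round_trip_ms(10):.2f} ms")    # ~0.1 ms
```

Real-world savings depend on the whole network path, but this is the basic physics behind pushing compute toward the user.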

Long and Liu suggest Edge computing is an inevitability, and that it constitutes the next stage in the evolution of centralized vs. decentralized computing, with history tending to alternate between the two. In the 1960s to ’80s, they point out, computing went from mainframe to rich client. In the 1980s and 1990s, we went from PC to client-server. Between the 1990s and 2010s, we went from client-server to centralized cloud. And now we’re moving from centralized cloud to hybrid cloud to Edge computing.

What’s an Edge datacenter?

An Edge datacenter is a datacenter near the end user. Long and Liu predict that, in the next three to five years, Edge datacenters will be “increasingly modular, compact, and ruggedized for outdoor or underground deployment.” They foresee the proliferation of “secure cabinet environments that include all the processing (CPU or A.I. computing accelerator like FPGA, GPU, TPU), storage (HDD or SSD), and networking necessary (switch, router, optical transport) to run applications.” In other words, these facilities will be optimized to bring a highly customized compute load as close as possible to the consumer.

What does Edge computing mean for developers? 

Edge computing will create plenty of new jobs for hardware and networking professionals, but it also has implications for software engineers. In particular, Long and Liu predict increased need for security applications, as more “Edges” create more potential points of access for threats (and drive adoption of “Security-as-a-Service”). It could equally generate increased demand for software developers who can work with virtual and augmented reality.

What are the issues with this kind of computing?

Although Edge computing is intended to reduce latency, Long and Liu note that it could potentially increase it. When Edge devices generate a lot of data, for example, an IoT gateway aggregates and streamlines it before sending it back to cloud data centers to process. “This architecture creates a number of issues, such as security, compliance and latency, as well as lacking scalability and automation,” they write. “All of those will exacerbate bottlenecks as edge devices generate exponentially larger amount of data.”
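
The gateway pattern they describe can be sketched in a few lines. The following is a minimal, hypothetical illustration (the sensor values and the send_to_cloud stand-in are invented), showing how raw Edge-device readings might be reduced to a compact summary before anything crosses the network to a centralized data center:

```python
# Minimal sketch of an IoT gateway aggregating readings at the Edge.
# All names and values here are hypothetical, for illustration only.
from statistics import mean


def aggregate(readings: list[float]) -> dict:
    """Reduce a batch of raw sensor readings to a small summary record."""
    return {
        "count": len(readings),
        "min": min(readings),
        "max": max(readings),
        "mean": round(mean(readings), 2),
    }


def send_to_cloud(payload: dict) -> None:
    # Stand-in for an upload to a centralized cloud data center.
    print("Forwarding summary:", payload)


raw_readings = [21.4, 21.6, 22.1, 21.9, 22.3, 21.7]  # e.g., temperature samples
send_to_cloud(aggregate(raw_readings))  # one small record instead of many raw points
```

The bottleneck Long and Liu flag appears when the volume of raw data outpaces what gateways like this can usefully condense.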

That could create opportunities for network specialists who are very good at streamlining systems, as well as developers and technologists who excel at software optimization. The future always needs some tweaking.

A modified version of this article appeared in eFinancialCareers.