Edge computing is the latest buzzword in the world of tech. But what is “the edge” and does it really stand to replace the cloud?
Many people have asked what will come after the cloud, and one answer is edge computing. Edge computing – sometimes discussed alongside edge IoT – is an emerging method of optimising applications and cloud-based systems. It works by taking a portion of an application away from one or more central nodes and bringing it to the “edge” of the Internet, closer to the physical world.
Edge computing is done at or near the source of the data (i.e., the end user), rather than relying on the cloud at one of a dozen data centres around the world. Instead of data going away and coming back, edge computing brings the cloud closer to you.
What is the edge?
In this context, the word “edge” refers to literal geographic distribution: computing happens physically close to where data is produced. In a recent speech describing the impact of edge computing, Peter Levine – a Silicon Valley investor at the venture capital firm Andreessen Horowitz – discussed how modern technologies like drones and autonomous cars require faster processing than the cloud can provide.
Levine explained that rather than cloud computing coming to an end, the cloud will still remain important for a number of everyday applications. However, the role of the cloud will begin to change with the rise of edge computing, becoming a supplement to more immediate data processing needs.
“Everything popular in technology always gets replaced by something else,” says Levine. “Part of the job of an investor is to consider not where the puck is today, but where the puck is going in the future.”
Every new technology has its benefits and drawbacks. So what can we expect from edge computing?
- Lower operating costs: Data is much cheaper (and easier) to manage at a local level, meaning lower expenses than managing it in the cloud.
- Real-time data analysis: Because data is analysed at a local device level, users will be presented with real-time (or near real-time) data, rather than information being held in a remote data centre or cloud. Edge computing essentially cuts out the middleman.
- Reduced network traffic: Data is transmitted directly from local devices, with less information passing through networks and clouds. Therefore, users will experience reduced network traffic and higher processing speeds.
- Improved application performance: Data will be much easier to access when it doesn’t sit in a faraway cloud, so applications that can’t tolerate latency will perform far better. This equates to less downtime and better productivity for businesses using cloud-based software.
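The last three benefits above come down to one idea: analyse data where it is generated and send the cloud only what it needs to see. As a minimal sketch (the class name, threshold, and payloads are illustrative, not any real edge platform’s API), a device might filter sensor readings locally and forward only the anomalies:

```python
# Edge-side filtering sketch: readings are analysed on the local device,
# and only anomalous values are forwarded, cutting network traffic.

class EdgeFilter:
    def __init__(self, threshold: float):
        self.threshold = threshold
        self.forwarded = []  # stands in for messages uploaded to the cloud

    def ingest(self, reading: float) -> bool:
        """Process a sensor reading locally; forward only if anomalous."""
        if abs(reading) > self.threshold:
            self.forwarded.append(reading)  # would be a network upload in practice
            return True                     # acted on in real time, locally
        return False                        # handled entirely at the edge

edge = EdgeFilter(threshold=10.0)
readings = [1.2, 3.4, 15.0, 2.2, -12.5]
flags = [edge.ingest(r) for r in readings]
print(flags)           # [False, False, True, False, True]
print(edge.forwarded)  # only 2 of 5 readings would cross the network
```

Here only two of the five readings would ever leave the device, which is where the traffic and latency savings come from.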
Reported concerns about edge computing are mainly about the potential loss or corruption of data. Edge computing can increase computing power and lower latency, but it also risks expanding the attack surface, experts say. The edge can be phenomenally useful in distributed environments like smart cars or manufacturing facilities. However, those devices are connected to the internet and are often not adequately secured, which means attackers could damage the device or machinery that the edge controls.
What’s more, an attack on a single edge computing device could have a cascading effect: if one part of the system goes offline or is slowed down, other components that depend on it could also be affected.
Edge computing comes with many risks and opportunities, and only time will tell how it will fare in sensitive business environments. Experts are still working on ways to combat security concerns over edge computing. One proposed solution is to encrypt data at the edge, as is already done in the cloud.
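To make the data-protection idea concrete: in practice an edge device would use TLS or a full encryption library, but even Python’s standard library can sketch one piece of the puzzle – attaching an authentication tag so the receiver can detect corrupted or tampered payloads. The key and payload below are purely illustrative:

```python
# Stdlib-only sketch of protecting edge data in transit. This shows
# integrity checking with HMAC, not encryption; real deployments would
# add TLS or an encryption library on top.
import hashlib
import hmac

SHARED_KEY = b"device-42-secret"  # hypothetical key provisioned to device and cloud

def sign(payload: bytes) -> bytes:
    """Edge side: compute an authentication tag before transmission."""
    return hmac.new(SHARED_KEY, payload, hashlib.sha256).digest()

def verify(payload: bytes, tag: bytes) -> bool:
    """Cloud side: reject payloads whose tag does not match."""
    return hmac.compare_digest(sign(payload), tag)

msg = b'{"sensor": "temp", "value": 21.5}'
tag = sign(msg)
print(verify(msg, tag))                 # True: payload arrived intact
print(verify(b'{"value": 99.9}', tag))  # False: tampered payload is rejected
```

The design point is that the check happens on receipt, so a reading corrupted or altered anywhere between the edge device and the cloud is caught rather than silently trusted.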
If you’re interested in adopting edge computing, you should consider your current business data management system and whether the edge makes sense for your business. One way of doing this is to observe what other companies in your industry are doing as edge computing comes to the fore. Another is to consult with an expert to find out how the edge could benefit your business.