The ‘edge’ in edge computing describes a style of data processing that’s done near the source of the data – in other words, close to where an IoT device is. Instead of the data being sent from a device to a central system that does the processing, some – if not all – of the analysis happens at the device itself, or within a few feet of it.
You can imagine the benefits of this. Edge computing allows organisations to analyse important data in near real-time, which is vital for the progress of emerging technology such as autonomous vehicles because data doesn’t have to be sent to a data centre or the cloud – thus reducing latency.
Additionally, some of the critical data is triaged and analysed at the source, decreasing the overall traffic flowing from the IoT to data centres. Edge computing should allow for the mainstream adoption of the IoT without the crippling pressure on existing cloud infrastructure.
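To make the triage idea concrete, here’s a minimal sketch of how an edge node might summarise a window of raw sensor readings locally and forward only an aggregate plus any anomalies, rather than the full stream. The threshold and payload shape are hypothetical, purely for illustration.

```python
ALERT_THRESHOLD = 90.0  # hypothetical temperature limit (°C)

def triage(readings):
    """Summarise a window of readings at the edge.

    Returns a small payload (summary plus anomalies) instead of the
    full raw stream, cutting the traffic sent on to the data centre.
    """
    anomalies = [r for r in readings if r > ALERT_THRESHOLD]
    return {
        "count": len(readings),
        "mean": sum(readings) / len(readings),
        "max": max(readings),
        "anomalies": anomalies,
    }

# Four raw readings in, one compact payload out.
payload = triage([71.2, 70.8, 95.3, 69.9])
```

Only `payload` ever crosses the network; the raw readings stay at the edge.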
Challenges to implementation
While it’s impossible to deny the benefits of edge computing, not all businesses have the capacity. A lack of skills, security, the costs of the cloud and a lack of maintenance support will all hinder the growth of edge computing. For organisations looking to move towards this model, don’t underestimate the initial legwork involved. It’ll be as intensive as moving to the cloud, with the associated training, infrastructure investment and process changes – but it will also be just as beneficial.
Most edge implementations are going to occur in areas with little-to-no on-site IT support. That makes lack of internal skills a challenge. A potential solution is to use low-cost monitoring solutions that will forewarn of issues and allow tech support to be deployed to the scene quickly.
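One low-cost monitoring pattern is a simple heartbeat check: each edge node reports in periodically, and any node that goes quiet is flagged so support can be dispatched. The sketch below assumes hypothetical node names and a hypothetical five-minute timeout.

```python
HEARTBEAT_TIMEOUT = 300  # seconds; hypothetical threshold for a "silent" node

def silent_nodes(last_seen, now):
    """Return the IDs of nodes that have not checked in within the timeout.

    `last_seen` maps node ID -> timestamp (seconds) of its latest heartbeat.
    """
    return sorted(n for n, t in last_seen.items() if now - t > HEARTBEAT_TIMEOUT)

# Example: "store-042" last reported ten minutes ago; the others are fresh.
now = 1_000_000
status = {"store-041": now - 30, "store-042": now - 600, "store-043": now - 45}
overdue = silent_nodes(status, now)  # ['store-042'] -> dispatch tech support
```

In practice the timestamps would come from the devices’ check-in messages, but the flagging logic is this simple.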
Security is another concern, especially given the scrutiny around potential IoT hacking. Due to edge device locations (in stores, for example) there needs to be physical security to stop tampering. Cybersecurity is paramount, as businesses will not want their IoT devices to be misused or their data to be breached.
Many organisations may also underestimate the costs of switching to the cloud and edge computing. The maintenance of remote devices must also be considered – less maintenance will be done in-house because many devices will be geographically dispersed. Outsourcing and predictive maintenance will soon become the norm.
Who’s doing it well?
There’s already a race to win amongst the tech giants Google, IBM, Microsoft and Amazon. That means a lot of investment in the technology, which will accelerate roll-out. It’s definitely gathering steam, with the most recent Forrester Analytics Global Business Technographics Mobility Survey finding that 25 per cent of global telecom decision-makers are already implementing or expanding edge computing in 2019.
As for the benefits of the IoT being realised, some applications such as finance and autonomous vehicles will benefit dramatically from edge computing. But that shouldn’t stop organisations without edge infrastructure from experimenting and benefitting from the IoT now. We’re already seeing many benefits of the IoT in agriculture, MedTech and renewable energy, to name but a few.
The potential partnership between edge computing and Artificial Intelligence (AI)
There’s a symbiotic relationship between AI and edge computing. Predictive maintenance of far-flung IoT devices makes edge computing more viable, as organisations don’t have to invest in local IT teams to constantly maintain devices on-location.
AI at the edge will be lucrative because it can better understand the nuances between different devices and sets of data. Plus, it’ll speed up and automate many processes, such as getting a speeding ticket. Whereas before, a police officer with a speed gun would issue a ticket, with edge computing and AI it could involve a camera detecting your speed, finding your licence plate and details, and automatically posting a ticket to your home address.
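The speeding-ticket flow described above can be sketched as a small decision pipeline running at the camera itself. The plate lookup and speed limit here are hypothetical stand-ins: in a real deployment, plate recognition would be an on-device ML model and the registry an official licensing database.

```python
SPEED_LIMIT_KMH = 50  # hypothetical limit for this camera

# Stand-in for the vehicle licensing database (hypothetical data).
REGISTRY = {"AB12 CDE": "1 Example Street"}

def process_vehicle(speed_kmh, plate):
    """Run at the edge: decide locally whether a ticket should be issued."""
    if speed_kmh <= SPEED_LIMIT_KMH:
        return None  # no offence: nothing needs to leave the camera
    address = REGISTRY.get(plate)
    if address is None:
        return {"plate": plate, "action": "manual review"}
    return {"plate": plate, "speed": speed_kmh, "post_to": address}

ticket = process_vehicle(63, "AB12 CDE")  # ticket posted automatically
```

The key point is that vehicles under the limit generate no network traffic at all; only offences are escalated.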
AI accelerators (the Google Edge Tensor Processing Units, for example) will be needed to speed up computation and allow for machine learning at the edge. Because edge devices are typically built for low-power, low-intensity computation, machine and deep learning at the edge would be impractical without such accelerator chips.
All content produced and published by Paratus People. All rights reserved. ©