Data is key to almost every decision we make, and without accurate data, businesses fail. A key problem in computing today, and in Artificial Intelligence in particular, is that the bulk of data processing is reportedly done in the cloud. In the case of AI, as discussed in our other pieces, this comes down to the inability of the average computer to handle the calculations that AI systems need in order to be built, to run, and to improve.
For a further explanation of those calculations, please see our pieces on machine learning. The name "edge computing" is something of a misnomer for what it hopes to do: it primarily aims to move data processing, and therefore calculation, back to what is often called the starting point.
It All Comes Down to Locality
So, what is that starting point?
If we look at everything through the lens of the Internet of Things (IoT), that starting point is the device. In the IoT, sensors take in data from the environment around a device, or from other devices, and process it. The device or devices then use this data to act and to improve.
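As a rough sketch of that sense-process-act loop, here is a toy Python example. Everything in it, from the fake sensor reading to the temperature threshold, is invented for illustration; the point is simply that the data is processed on the device itself rather than shipped off to the cloud.

```python
import random
import statistics

def read_sensor():
    """Hypothetical sensor read: returns a temperature in degrees Celsius."""
    return 20.0 + random.uniform(-0.5, 0.5)

def process_on_device(window, threshold=21.0):
    """Process a window of readings locally, on the 'edge' device itself."""
    avg = statistics.mean(window)
    return "heat_on" if avg < threshold else "heat_off"

# Collect a short window of readings and act on them without leaving the device.
readings = [read_sensor() for _ in range(10)]
action = process_on_device(readings)
print(action)
```

The device both senses and decides; nothing in this loop depends on a network connection.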
You can’t have one without the other?
Certain sources have suggested that the Internet of Things cannot, in fact, exist without Artificial Intelligence, and vice versa. This argument begins with the connection between the IoT and Machine Learning. If we recall that the essence of Machine Learning is using algorithms in AI systems to learn from input data, then we can posit that IoT systems that take in environmental data through sensors need Machine Learning to improve their performance. While this connection rests almost solely on the current performance of Machine Learning in AI systems, it is logical.
The essence of the IoT is to bring data processing, and the learning that comes with it, into the devices themselves.
Still, how does edge computing come into all of this?
Learning on the Edge
Edge computing appears to connect to both Machine Learning and the IoT. While each can work without the other two, the theory is that when all three are connected, they can operate at highly efficient levels. One of the key suggested benefits of edge computing is a large reduction in latency between communicating devices.
For those of you who may not know, latency is the delay between data being sent from one location and its arrival at another. A popular word used in its place, to an extent, is lag. AI devices of any kind are almost constantly sending, receiving, and processing data because of the algorithms that run them, and if the data stream isn't continuous, the learning algorithms can run into issues. Even so, this assumption rests on the premise that the more data a system takes in, the more it learns; if the system's data intake is more limited, then perhaps the need for edge computing is also minimized.
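To make the latency point concrete, here is a back-of-the-envelope Python sketch. The millisecond figures are illustrative guesses, not measurements, but they show why a shorter round trip to a nearby edge node adds up over many messages.

```python
def total_transfer_time(n_messages, latency_s, processing_s):
    """Total time when each message waits one network delay plus processing."""
    return n_messages * (latency_s + processing_s)

# Illustrative (made-up) numbers: ~50 ms to a distant cloud region versus
# ~2 ms to a nearby edge node, with the same 1 ms of processing per message.
cloud = total_transfer_time(1000, 0.050, 0.001)
edge = total_transfer_time(1000, 0.002, 0.001)
print(f"cloud: {cloud:.1f}s, edge: {edge:.1f}s")
```

Over a thousand messages, the toy cloud path takes about 51 seconds of cumulative delay against roughly 3 for the edge path; the per-message difference is small, but a continuous data stream multiplies it.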
If, however, an AI system runs on a continuous learning algorithm of some sort, then this might signal a way that edge computing could be helpful.
Streamlining Learning in Neural Networks
The key theorized benefit of edge computing is that data can be acted upon in real time. IoT-connected devices can be simply described as devices that use edge computing to deal with data. These devices appear to improve their own performance by connecting to each other, much as blockchain nodes do, and sharing data. A primary example in practice today is the smart thermostat. Judging by the example of Nest thermostats, IoT-connected devices can now adjust each other to perform at the highest possible level based on events in their owners' homes that they analyze with their sensors.
For example, as Nest mentions on its company website, if someone breaks into your home, then as long as all of your devices are connected via the same user account, they can talk to each other and give you different forms of data about the situation. The kicker is that Nest mentions the need for a Wi-Fi connection; in this specific example, then, edge computing doesn't eliminate the reliance on the often shaky features of a Wi-Fi network.
Related to AI, if systems can talk to each other in the same way, then they can learn from each other's triumphs and mistakes. One of the biggest theorized benefits of the Internet of Things is the ability of high-performing models to improve low-performing models, as long as they are all connected on the same network.
While this currently works only with examples like locks, outlets, thermostats, and other small products, these use cases could still be extrapolated to AI systems. If multiple versions of the same AI were built to predict a certain sector of the stock market, and one performed better than the others over a period of a few months, then the data sent from the high performer to the low performers could be fed into their neural networks and, therefore, into their algorithms.
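As a toy sketch of how a high performer might pull the others up, imagine nudging each low performer's weights toward the best model's. The `share_weights` helper, the weight lists, and the blending factor are all invented for illustration; a real system would be far more involved.

```python
def share_weights(best, others, alpha=0.5):
    """Blend each lower performer's weights toward the best performer's.

    alpha controls how far each model moves: 0 keeps its own weights,
    1 copies the best model outright.
    """
    return [
        [alpha * b + (1 - alpha) * w for b, w in zip(best, weights)]
        for weights in others
    ]

# Made-up weight vectors for one strong model and two weaker ones.
best_model = [0.9, 0.8, 0.7]
low_models = [[0.1, 0.2, 0.3], [0.4, 0.4, 0.4]]
updated = share_weights(best_model, low_models)
print(updated)
```

Every model ends up partway between its old weights and the high performer's, so the group converges in the same direction rather than each version drifting on its own.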
In this fashion, every version could improve together, and in the same way. None of this is certain, but here's hoping that edge computing enhances efficiency across the board and brings us closer to the age of reliable, truly human-like AI.
References:
AI and Edge Computing, A Case Study via Capgemini: https://internetofthingsagenda.techtarget.com/blog/IoT-Agenda/Edge-computing-and-AI-From-theory-to-implementation
AI and Predicting the Stock Market: https://towardsdatascience.com/just-another-ai-trying-to-predict-the-stock-market-part-1-d0663673a30e
Edge Computing's Advantages and Disadvantages: https://www.capgemini.com/2017/03/what-is-fog-and-edge-computing/
Edge Computing and AI via Computer Business Review: https://www.cbronline.com/feature/edge-computing-artificial-intelligence-iot
Edge Computing via NetworkWorld: https://www.networkworld.com/article/3224893/internet-of-things/what-is-edge-computing-and-how-it-s-changing-the-network.html
IoT Devices Today: https://beebom.com/examples-of-internet-of-things-technology/
Nest Products Improving Themselves: https://nest.com/support/article/Learn-how-Nest-products-work-together
The Innovation Enterprise - The Future of Machine Learning: https://channels.theinnovationenterprise.com/articles/why-machine-learning-needs-edge-computing
Wired - AI and IoT Symbiosis: https://www.wired.com/insights/2014/11/iot-wont-work-without-artificial-intelligence/