Using AI To Optimize IoT at the Edge
Artificial intelligence has the potential to revolutionize the combined application of IoT and edge computing. Here are some thought-provoking possibilities.
As more companies combine Internet of Things (IoT) devices and edge computing capabilities, people are becoming increasingly curious about how they could use artificial intelligence (AI) to optimize those applications. Here are some thought-provoking possibilities.
Improving IoT Sensor Inference Accuracy With Machine Learning
Technology researchers are still in the early stages of investigating how machine learning can improve the performance of edge-deployed IoT sensors. Early applications include using the sensors for image-classification and natural-language-processing tasks. One example shows how people are making progress.
Researchers at IMDEA Networks recognized that using IoT sensors for deep-learning tasks may mean the sensors cannot guarantee quality-of-service requirements such as latency and inference accuracy. To address this challenge, the team developed a machine learning algorithm called AMR².
AMR² utilizes an edge computing infrastructure to make IoT sensor inferences more accurate while enabling faster responses and real-time analyses. Experiments suggested the algorithm improved inference accuracy by up to 40% compared to the results of basic scheduling tasks that did not use the algorithm.
They found that an efficient scheduling algorithm such as this one is essential for helping IoT sensors work properly when deployed at the edge. A project researcher pointed out how AMR² could affect execution delay in a service similar to Google Photos, which classifies images by the elements they contain: a developer could deploy the algorithm to ensure users never notice such delays while using the app.
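AMR²'s internals aren't published in this article, but the core idea behind this kind of quality-of-service-aware scheduling, routing each inference request to a fast local model or a slower, more accurate edge model depending on the remaining latency budget, can be sketched roughly as follows. The model names, latency figures, and accuracy estimates are illustrative assumptions rather than values from the research:

```python
from dataclasses import dataclass

# Hypothetical sketch of QoS-aware inference scheduling (not the actual AMR² code):
# each request is routed to a fast on-device model or a slower, more accurate
# edge-server model, depending on how much of the latency budget remains.

@dataclass
class ModelOption:
    name: str
    expected_latency_ms: float   # assumed average latency for this option
    expected_accuracy: float     # assumed accuracy estimate for this option

LOCAL = ModelOption("local-lite", expected_latency_ms=15, expected_accuracy=0.78)
EDGE = ModelOption("edge-full", expected_latency_ms=60, expected_accuracy=0.93)

def schedule(remaining_budget_ms: float) -> ModelOption:
    """Pick the most accurate option whose expected latency fits the budget."""
    candidates = [m for m in (LOCAL, EDGE) if m.expected_latency_ms <= remaining_budget_ms]
    if not candidates:
        return LOCAL  # degrade gracefully: fall back to the fastest option
    return max(candidates, key=lambda m: m.expected_accuracy)

# Example: a 100 ms end-to-end budget with 30 ms already spent on capture and I/O
choice = schedule(remaining_budget_ms=100 - 30)
print(f"Routing inference to {choice.name}")
```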
Reducing Energy Usage of Connected Devices With AI at the Edge
A 2023 study of chief financial officers at tech companies found that 80% expect revenue increases in the coming year. However, that outcome is arguably most likely if employees understand customers’ needs and provide products or services accordingly.
The manufacturers of many IoT devices intend for people to wear those products almost constantly. Some wearables detect if lone workers fall or become distressed or if people in physically demanding roles are becoming too tired and need to rest. In such cases, users must feel confident that their IoT devices will work reliably through their workdays and beyond.
That’s one of the reasons researchers explored how AI at the edge could improve the energy efficiency of IoT devices deployed to study how a sedentary lifestyle affects health and how correct posture could improve outcomes. Any IoT device that captures data about how people live must collect that data continuously, with few or no interruptions caused by the device running out of battery.
In this case, subjects wore wireless devices powered by coin-cell batteries. Each gadget had inertial sensors to collect accurate data about how much people moved throughout the day. The main problem was that the batteries lasted only a few hours because of the large volume of data transmitted. For example, research showed a nine-channel motion sensor that reads 50 samples every second produces more than 100 MB of data daily.
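As a rough sanity check of that figure, assuming each channel sample is stored as a 4-byte float (an illustrative assumption, not a detail from the study):

```python
# Back-of-the-envelope check of the data volume quoted above,
# assuming each channel sample is stored as a 4-byte float.
channels = 9
sample_rate_hz = 50
bytes_per_sample = 4
seconds_per_day = 24 * 60 * 60

bytes_per_day = channels * sample_rate_hz * bytes_per_sample * seconds_per_day
print(f"{bytes_per_day / 1e6:.0f} MB per day")  # ~156 MB, comfortably over 100 MB
```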
The researchers recognized that machine learning could enable the devices to transfer only critical data from edge-deployed IoT sensors to the smartphones or other devices people use to analyze the information. They used a pre-trained recurrent neural network and found it achieved real-time performance, improving the IoT devices’ functionality.
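The researchers’ exact network isn’t shown here, but the underlying “transmit only what matters” pattern can be sketched with a tiny recurrent model that scores each one-second window on the device and only sends the windows it flags as relevant. The layer sizes, weights, and threshold below are placeholder assumptions:

```python
import numpy as np

# Illustrative sketch (not the researchers' actual model): a small recurrent
# network runs on the device, and only windows it scores as "critical" are
# transmitted, so the radio stays off most of the time.

rng = np.random.default_rng(0)

# Stand-in weights for a tiny single-layer RNN with 9 inputs and 16 hidden units.
W_in = rng.standard_normal((16, 9)) * 0.1
W_h = rng.standard_normal((16, 16)) * 0.1
w_out = rng.standard_normal(16) * 0.1

def score_window(window: np.ndarray) -> float:
    """Run the RNN over one window of shape (timesteps, 9) and return a relevance score."""
    h = np.zeros(16)
    for x in window:
        h = np.tanh(W_in @ x + W_h @ h)
    return float(1 / (1 + np.exp(-w_out @ h)))  # sigmoid score in [0, 1]

def transmit(window: np.ndarray) -> None:
    print(f"transmitting window of {window.nbytes} bytes")

THRESHOLD = 0.5  # assumed cut-off; in practice tuned against labeled activity data

for _ in range(5):
    window = rng.standard_normal((50, 9)).astype(np.float32)  # one second at 50 Hz
    if score_window(window) > THRESHOLD:
        transmit(window)
    # otherwise the window is dropped (or summarized) and never leaves the device
```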
Creating Opportunities for On-Device AI Training
Edge computing advancements have opened opportunities to use smart devices in more places. For example, people have suggested deploying smart street lights that turn on and off in response to real-time traffic levels. Tech researchers and enthusiasts are also interested in the increased opportunities associated with AI training that happens directly on edge-deployed IoT devices. This approach could increase those products’ capabilities while reducing energy consumption and improving privacy.
An MIT team studied the feasibility of training AI algorithms on intelligent edge devices. They tried several optimization techniques and came up with one that required only 157 KB of memory to train a machine-learning algorithm on a microcontroller. Other lightweight training methods typically require between 300 and 600 MB of memory, making this innovation a significant improvement.
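The MIT team’s actual technique is not reproduced here, but one common way to shrink on-device training memory, freezing a pre-trained backbone and updating only a tiny classifier head, can be sketched as follows. The feature dimension, class count, and learning rate are illustrative assumptions:

```python
import numpy as np

# Conceptual sketch of low-memory on-device training (not MIT's actual method):
# freeze a pre-trained feature extractor and update only a tiny classifier head,
# so gradients and optimizer state scale with the head's few hundred parameters
# rather than the full model.

rng = np.random.default_rng(0)

FEATURE_DIM = 32   # output size of the frozen, pre-trained extractor (assumed)
NUM_CLASSES = 2
LR = 0.05

# Only these parameters are trained on the device.
head_W = np.zeros((NUM_CLASSES, FEATURE_DIM))
head_b = np.zeros(NUM_CLASSES)

def frozen_extractor(x: np.ndarray) -> np.ndarray:
    """Stand-in for the pre-trained backbone; runs forward-only on the device."""
    return np.tanh(x[:FEATURE_DIM])

def train_step(x: np.ndarray, label: int) -> None:
    """One softmax-regression update on the head; memory cost is O(FEATURE_DIM * NUM_CLASSES)."""
    global head_W, head_b
    feat = frozen_extractor(x)
    logits = head_W @ feat + head_b
    probs = np.exp(logits - logits.max())
    probs /= probs.sum()
    grad_logits = probs.copy()
    grad_logits[label] -= 1.0
    head_W -= LR * np.outer(grad_logits, feat)
    head_b -= LR * grad_logits

# Example: a handful of locally collected, labeled samples never leave the device.
for _ in range(100):
    x = rng.standard_normal(64)
    label = int(x[:FEATURE_DIM].sum() > 0)  # synthetic label just for the sketch
    train_step(x, label)
```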
The researchers explained that any data generated for training stays on the device, reducing privacy concerns. They also suggested use cases where the training happens during normal use, such as an algorithm learning from what a person types on a smart keyboard.
This approach produced some undoubtedly impressive results. In one case, the team trained the algorithm for only 10 minutes, which was enough for it to detect people in images. The example also shows optimization can go in both directions: whereas the first two examples focused on improving how IoT devices work, this one enhanced the AI training process itself. And when developers train algorithms on the same IoT devices that will eventually run them, the approach benefits the AI algorithms and the IoT-edge devices alike.
How Will You Use AI to Improve How IoT-Edge Devices Work?
These examples show some of the areas researchers focused on when exploring how artificial intelligence could improve the functionality of IoT devices deployed at the edge. Let them provide insights and inspiration about how you might get similar results. It’s almost always best to start with a clearly defined problem you want to solve, then explore how technology and innovative approaches could help you meet that goal.