IoT Applications Are Headed for the Edge
At the conceptual level, edge computing refers to the idea of bringing computing closer to where it's consumed or closer to the sources of data.
Edge computing continues to gain momentum as an increasing number of companies get on board, even if many are only dipping their toes with small-scale pilot deployments at the edge. The term edge computing has been used broadly to describe everything from actions performed by tiny IoT devices to datacenter-like infrastructure.
For more information on why companies are increasingly looking at edge computing, check out my blog post, We’re headed to edge computing.
This article looks at edge computing from the perspective of IoT application developers. After all, it will be the applications, leveraging emerging technologies like artificial intelligence and machine learning (AI/ML), that will provide insights to uncover opportunities to deliver new services or optimize costs.
Emerging use cases like IoT, AR/VR, robotics, and telco network functions are often cited as key drivers to move computing to the edge. However, traditional enterprises are also looking at edge computing to better support their remote/branch offices, retail locations, manufacturing plants, etc. Service providers can deploy an entirely new class of services at the network edge to take advantage of their proximity to the customers.
Edgy Applications
Although underlying infrastructure plays a key role, the benefits of edge computing will be realized on the backs of applications. If done right, edge applications can enable new experiences across a range of industries:
Healthcare: Advance patient care by integrating live data from a patient's fitness tracker, medical equipment, and environmental conditions.
Smart infrastructure: Enable cities to leverage real-time data from roadside sensors and cameras to improve traffic flow (traffic light synchronization, reducing/adding traffic lanes), improve safety (wrong-way driver alerts, dynamic speed limits), or improve shipping port utilization (loading/unloading of cargo ships).
Autonomous driving: Real-time decisions to safely navigate the vehicle across a diverse range of driving conditions.
Industry 4.0: Enable real-time analytics with AI/ML capabilities on the factory floor to enable predictive maintenance, improving equipment utilization.
Far edge services: Service providers using proximity to customers to offer low-latency (sub-1ms), high-bandwidth, location-based services for use cases like AR/VR or VDI (virtual desktop infrastructure).
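To make the Industry 4.0 case concrete, the predictive-maintenance idea above can be sketched as a small edge-side anomaly detector. This is an illustrative sketch only: the class name, window size, and z-score threshold are assumptions, and a production system would use trained ML models rather than simple statistics.

```python
from collections import deque
from statistics import mean, stdev

class VibrationMonitor:
    """Rolling z-score anomaly detector for a machine sensor (illustrative).

    Runs on the edge device itself, so an out-of-range reading can be
    flagged in near real time without a round trip to the cloud.
    """

    def __init__(self, window=50, threshold=3.0):
        self.readings = deque(maxlen=window)  # recent "normal" readings
        self.threshold = threshold            # z-score cutoff (assumed value)

    def check(self, value):
        """Return True if the reading is anomalous relative to the recent window."""
        anomalous = False
        if len(self.readings) >= 10:  # need enough history to estimate spread
            mu, sigma = mean(self.readings), stdev(self.readings)
            if sigma > 0 and abs(value - mu) / sigma > self.threshold:
                anomalous = True
        if not anomalous:
            # Only fold normal readings back into the baseline window.
            self.readings.append(value)
        return anomalous
```

An edge gateway could run one such monitor per machine and raise a maintenance alert locally, forwarding only flagged events upstream.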
Best Practices
Edge computing gives companies the flexibility and simplicity of cloud computing across a distributed pool of resources spanning a large number of locations. In the context of IoT use cases, the approach to application development is one of the many ways edge computing differs from the embedded systems of yore. To develop embedded applications, a developer needed a deep understanding of hardware and interfaces. Heavily customized operating systems with strong dependencies on the underlying hardware required functional specialization, and the development tools lacked the flexibility and capabilities of those used by IT developers. Edge computing involves the following best practices:
Consistent tooling: Developers need to be able to use the same tools irrespective of where the application is deployed. This means no special skills are required to create edge applications, any more than for non-edge applications. One example of such tooling, Red Hat CodeReady Workspaces, built on Eclipse Che, provides a Kubernetes-native development solution with an in-browser IDE for rapid application development that can be easily deployed at the edge or in the cloud.
Open APIs: Well-defined, open APIs allow real-time data to be accessed programmatically, enabling businesses to offer new classes of services that were not possible before. Developers need APIs to create standards-based solutions that can access data without worrying about the underlying hardware interfaces.
Accelerated application development: Although edge architectures are still evolving, design decisions made today will have a lasting impact on future capabilities. Instead of adopting offerings purpose-built for the edge that reduce developer agility, a better approach involves solutions that can work anywhere: cloud, on-premises, and edge. Technologies like containers, Kubernetes, and lightweight application services accelerate application development from cloud to edge.
Containerization: Most new applications are built as containers because they’re easy to deploy and manage at scale. Edge application requirements include modularity, segregation, and immutability, making containers an especially good fit. Applications will need to be deployed on many different edge tiers, each with its own resource characteristics. Combined with microservices, containers representing instances of functions can be scaled up or down depending on the underlying resources or conditions.
For more information on how resource requirements vary between edge and cloud, read the article No more illusions of infinite capacity.
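The "Open APIs" practice above can be illustrated with a minimal HTTP endpoint that exposes sensor data programmatically, hiding the hardware interface behind a standards-based JSON API. This is a sketch built only on the Python standard library; the endpoint path, sensor name, and in-memory reading are assumptions, and a real deployment would read from the device and typically run inside a container.

```python
import json
from http.server import BaseHTTPRequestHandler, HTTPServer

# Hypothetical latest reading; a real service would poll the device's
# hardware interface and keep this up to date behind the same API.
LATEST_READING = {"sensor": "temp-01", "celsius": 21.4}

class SensorAPI(BaseHTTPRequestHandler):
    """Serves sensor data as JSON so clients never touch hardware interfaces."""

    def do_GET(self):
        if self.path == "/v1/readings/latest":
            body = json.dumps(LATEST_READING).encode()
            self.send_response(200)
            self.send_header("Content-Type", "application/json")
            self.end_headers()
            self.wfile.write(body)
        else:
            self.send_response(404)
            self.end_headers()

def serve(port=8080):
    """Run the API; at the edge this would typically be containerized."""
    HTTPServer(("0.0.0.0", port), SensorAPI).serve_forever()
```

Because the service has no hardware-specific dependencies, the same container image can be deployed on a cloud cluster or a small edge node.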
Other Considerations
It’s important to note that this is not an either/or choice between edge computing and centralized computing. As edge computing gains greater adoption in the marketplace, the overall solution will often encompass a combination of the two. In such a hybrid computing model, centralized computing would be used for compute-intensive workloads, data aggregation and storage, AI/machine learning, coordinating operations across geographies, and traditional back-end processing. Edge computing, on the other hand, could help solve problems at the source, in near real time. Distributed architectures will allow an application to be placed at any tier, from cloud to edge, wherever it makes the most sense.
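The division of labor in that hybrid model can be sketched as an edge-side function that solves the problem at the source and forwards only a compact summary to the central tier for aggregation. The field names and the threshold below are assumptions for illustration.

```python
from statistics import mean

def edge_summarize(readings, limit=30.0):
    """Reduce a window of raw sensor readings to what the central tier needs:
    a compact statistical summary plus only the out-of-range samples.

    Illustrative sketch: the raw stream stays at the edge, trimming the
    bandwidth sent upstream while preserving the aggregate view.
    """
    alerts = [r for r in readings if r > limit]  # handled locally, reported upstream
    return {
        "count": len(readings),
        "mean": round(mean(readings), 2),
        "max": max(readings),
        "alerts": alerts,
    }
```

The cloud tier then aggregates these summaries across many sites for storage, ML training, and cross-geography coordination, while each edge site reacts to its own alerts in near real time.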
Monolithic edge solutions built on custom tooling that doesn’t integrate well with the rest of the IT infrastructure could cause major pain down the road once edge computing achieves mass deployment. Open source is an obvious alternative, providing flexibility of choice and future-proofing investments in edge computing.