When it comes to IoT, going to the edge may be the best choice for businesses deploying the technology across their network infrastructures.
Panduit’s white paper, “Edge Computing: Behind the Scenes of IoT,” explains the difference between cloud and edge computing and describes three ways the edge can help IoT deployments.
It also discusses the following key areas for consideration when deploying edge computing: real-time requirements, environmental conditions, space limitations, and security.
Edge Computing
Edge computing is, in essence, the opposite of cloud computing. With edge computing, the compute, storage, and application resources are located close to the user of the data or to its source.
This is in contrast to a cloud deployment where those resources are in some distant data center owned by the cloud provider.
Although edge computing may appear to be a new concept, it is simply the computing pendulum swinging toward the distributed end of the computing continuum.
Computing started with the advent of mainframes in the late 1950s. Mainframes are an example of centralized computing; they were too large and expensive for one to be on every user’s desk.
In the late 1960s, minicomputers appeared, moving compute power away from centralized control and into research labs to run experiments, onto the factory floor for process control, and into many other settings.
The pendulum moved all the way to the distributed side with the arrival of the PC in the early 1980s. With the PC, individuals had computing power at their fingertips.
The computing pendulum swings back and forth, and today it is swinging toward edge computing, which puts processing and storage resources closer to where the data is generated and used.
Why Edge Computing for IoT?
IoT deployments can benefit from edge computing in three ways:
- Reduced Network Latency
The latency in an IoT deployment is the amount of time between when an IoT sensor starts sending data and when an action is taken on the data.
Several factors contribute to network latency: the propagation delay through the network’s physical media, the time it takes to route data through the networking equipment (switches, routers, servers, etc.), and the time it takes to process the data. By moving compute closer to the sensors, edge computing shortens the path the data must travel, reducing network latency and improving real-time response.
- Reduced Network Jitter
Jitter is the variation of network latency over time. Some real-time IoT applications cannot tolerate jitter if it stretches latency to the point where the system can no longer act within the required time frame; the sketch after this list illustrates how latency and jitter are related.
- Enhanced Security
Edge computing also offers the opportunity to provide a more secure environment, whether the deployment uses a co-location facility or company-owned equipment.
Co-location facilities are physically secure locations. If one owns the edge computing equipment, it can be in the factory where the IoT sensors are located or in another company-owned facility.
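To make the latency and jitter points concrete, here is a minimal Python sketch. It is purely illustrative: the delay values are made-up assumptions, not measurements from any real deployment. It models end-to-end latency as the sum of propagation, routing, and processing delays and expresses jitter as the variation of that latency over time, contrasting a distant cloud path with a nearby edge path.

```python
# Illustrative sketch only: hypothetical delay values, not real measurements.
import statistics

def total_latency_ms(propagation_ms, routing_ms, processing_ms):
    # Per-sample latency = propagation delay + routing/queuing delay + processing delay
    return propagation_ms + routing_ms + processing_ms

# Hypothetical samples: the cloud path crosses more network hops, so its
# routing delay is larger and varies more from sample to sample.
cloud_samples = [total_latency_ms(25.0, r, 5.0) for r in (12.0, 30.0, 18.0, 45.0, 22.0)]
edge_samples = [total_latency_ms(0.5, r, 5.0) for r in (1.0, 1.5, 1.2, 2.0, 1.1)]

for name, samples in (("cloud", cloud_samples), ("edge", edge_samples)):
    avg = statistics.mean(samples)
    # Jitter expressed here as the standard deviation of latency over time;
    # other definitions (e.g., mean inter-packet delay variation) are also common.
    jitter = statistics.stdev(samples)
    print(f"{name}: average latency {avg:.1f} ms, jitter {jitter:.1f} ms")
```

In this hypothetical example, the edge path shows both lower average latency and far less jitter, because removing the long network hops eliminates the delays that vary most from sample to sample.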
To learn more about edge computing and why it is important for IoT, download Panduit’s “Edge Computing: Behind the Scenes of IoT” white paper – or subscribe to our blog to access all the papers in our IoT “101” white paper series.