IoT Platform Sensors: 5 Characteristics To Explore

Sensors and IoT Platforms

Much has been written about the promise of predictive analytics and how IoT data can improve operational efficiency, reduce downtime, and save money for the enterprise. In contrast, little is written about the sensors gathering the data that is fed into the predictive analytics engine. Panduit’s white paper, “E.S.P. for IoT Platforms,” discusses the characteristics to consider when deploying measurement sensors and how to determine the importance of specifications depending on sensor type and deployment location.

Sensor Types

There are three types of sensors: indicators, counters, and measurement.

  1. Indicators are relatively straightforward – they are either on or off. They show when something has occurred, for example, when someone has opened and accessed a panel.
  2. Counters can keep a running tally of a series of events. An example is a tachometer that counts the number of revolutions of a shaft or axle. Both indicators and counters are examples of digital sensors. They monitor and report discrete events. Relatively speaking, they are simple sensors.
  3. Measurement sensors are more sophisticated. They report on the amount of a physical entity, such as weight, or on an environmental attribute, such as temperature. Rather than reporting discrete events, they report where one is on a continuous scale.

Sensor Characteristics

When choosing sensors for your IoT platform, there are five characteristics you should consider (a short sketch after the list shows how these specifications translate into practice).

  1. Accuracy
    Accuracy is the ability of a sensor to report the true value of whatever it is monitoring. Every measurement carries some uncertainty, usually expressed as a percentage of full scale.
  2. Repeatability
    Repeatability is the ability of a sensor to produce the same output for the same input each time a new sample is acquired.
  3. Linearity
    Linearity is a measure of how well the sensor’s response curve approaches a straight line.
  4. Sensitivity
    A sensor’s sensitivity is the smallest change in input that produces a detectable change in the output.
  5. Environmental Impact
    Changes in the environment can impact the performance and accuracy of a sensor. For example, some sensors are particularly sensitive to temperature and humidity.
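
To make these specifications concrete, here is a minimal Python sketch (the sensor range, accuracy, and sensitivity figures below are hypothetical, not from the white paper) showing how a full-scale accuracy spec becomes an absolute uncertainty and how sensitivity limits what a sensor can detect:

```python
# Illustrative only: all spec values below are hypothetical.

def absolute_uncertainty(full_scale_range: float, accuracy_pct_fs: float) -> float:
    """Convert an accuracy spec quoted as a percentage of full scale
    into an absolute uncertainty in the measured unit."""
    return full_scale_range * accuracy_pct_fs / 100.0

def is_detectable(input_change: float, sensitivity: float) -> bool:
    """A change in the measured quantity registers only if it is at
    least as large as the sensor's sensitivity."""
    return abs(input_change) >= sensitivity

# Hypothetical temperature sensor: 0-500 degC range, +/-0.5% of full scale.
u = absolute_uncertainty(full_scale_range=500.0, accuracy_pct_fs=0.5)
print(f"A reading of 250.0 degC really means 250.0 +/- {u:.1f} degC")

# With a sensitivity of 0.1 degC, a 0.05 degC drift goes unnoticed.
print(is_detectable(0.05, sensitivity=0.1))  # False
```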

When selecting a sensor, you should also determine which attributes matter for your application. In a benign environment, the environmental impact on the sensor’s performance may be negligible, whereas it becomes a real consideration if the application is outdoors.

The tradeoff you need to make when selecting a measurement sensor is the level of performance you require for each attribute versus cost. For example, a temperature sensor monitoring a pizza oven does not need to be as accurate as one monitoring a pharmaceutical process. A temperature sensor with an accuracy of ±0.01°C will be considerably more expensive than one with an accuracy of ±1°C.

To learn more about why sensors are important for your IoT platform, download Panduit’s “E.S.P. for IoT Platforms” white paper – or subscribe to our blog to access all the papers in our IoT “101” white paper series.

3 Ways IIoT Technology Benefits From Moore’s Law

IIoT and Moore’s Law and Their Relationship with Bandwidth

There are three ways IIoT technology benefits from Moore’s Law. Bandwidth is a key component of those benefits and the foundation for current technological advances.

Moore’s Law predicted the technological advances that we are experiencing today, and bandwidth is helping to propel those advances forward, specifically for IIoT. Panduit’s white paper, “The Ubiquity of Bandwidth,” explains how Moore’s Law factors into IIoT network capabilities.

Moore’s Law

Gordon Moore is best remembered as a co-founder of Intel. But while he was the director of Research & Development at his previous employer, Fairchild Semiconductor, he authored a 1965 paper titled “Cramming More Components onto Integrated Circuits.” In the article, Moore predicted that the number of transistors on an integrated circuit would double approximately every two years.

Moore’s Law is applicable along three axes (a quick numeric sketch follows the list):

Cost – The cost per transistor drops by almost half with every reduction in the size of the transistors.

Performance – Processor speeds increase because the smaller the transistor, the faster it can operate. Additionally, the transistors sit closer to each other, which reduces the signal delay between them.

Complexity – For a given die size, the number of transistors doubles with each reduction in feature size, allowing more complex implementations and circuitry.
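
As a back-of-the-envelope illustration of that doubling (a toy calculation assuming a flat two-year doubling period and the Intel 4004’s roughly 2,300 transistors in 1971 as a baseline):

```python
# Toy projection of Moore's Law: transistor counts double every two years.
# Baseline assumption: Intel 4004 (1971), roughly 2,300 transistors.

def projected_transistors(year: int, base_year: int = 1971,
                          base_count: int = 2_300,
                          doubling_years: float = 2.0) -> float:
    doublings = (year - base_year) / doubling_years
    return base_count * 2 ** doublings

for year in (1971, 1991, 2011, 2021):
    print(f"{year}: ~{projected_transistors(year):,.0f} transistors")
# 50 years of doubling turns 2,300 transistors into roughly 77 billion,
# which is why a 1971-era smartphone processor would be absurdly large.
```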

Although all three aspects of Moore’s Law are important, it is the ability to implement ever-increasing complexity that might be the most important.

For example, if a smartphone were built using the semiconductor technology available in 1971, the phone’s microprocessor would be the size of a parking space. In fact, the communication theories needed for ubiquitous bandwidth evolved in the late 1940s and 1950s; they could not have been implemented at the time, however, because building them with vacuum tubes or discrete transistors would have been impractical.

IIoT Technology

IoT has captured product developers’ imagination. In the consumer space, it remains to be seen what applications will take hold, but wearables seem a certainty.

It is a similar situation on the factory floor: numerous deployment scenarios exist, but we will need some history to see which ones provide a suitable ROI.

Applications for tracking packages, monitoring, and alerting are one thing. Implementing advanced analytics and complicated algorithms to extract meaning from the gathered IIoT data is something else.

None of this would happen without the ubiquity of bandwidth.

To learn more about bandwidth and why it’s essential for your IIoT network’s infrastructure, download Panduit’s “The Ubiquity of Bandwidth” white paper – or subscribe to our blog to access our IoT “101” white paper series.

J. Franco. (2015, April 20). “50 Years of Moore’s Law: Fun Facts, a Timeline Infographic and Gordon’s Own Thoughts 5 Decades Later.” TechSpot. [Online]. Available: https://www.techspot.com/news/60418-50-years-moore-law-fun-facts-timeline-infographic.html.

G. Moore, “Cramming More Components onto Integrated Circuits,” Electronics, vol. 38, no. 8, 1965.

M. Patel et al. (2017, May 19). “What’s New with the Internet of Things?” McKinsey & Company. [Online]. Available: https://www.mckinsey.com/industries/semiconductors/our-insights/whats-new-with-the-internet-of-things.

3 Ways Edge Computing Stimulates IoT Technology Capabilities

3 Ways Edge Computing Enriches IoT Technology

There are three ways edge computing enhances IoT deployments. These areas are key to increasing data gathering capabilities in a real-time world.

For IoT deployments, going to the edge may be the best choice when it comes to helping businesses deploy IoT technology across their network infrastructures.

Panduit’s white paper, “Edge Computing: Behind the Scenes of IoT,” explains the difference between cloud and edge computing and three ways the edge can help IoT technology deployments.

It also discusses the following key areas for consideration when deploying edge computing: real-time requirements, environmental conditions, space limitations, and security.

Edge Computing

Edge computing is, in many ways, the opposite of cloud computing. With edge computing, the compute, storage, and application resources are located close to the user of the data or to the source of the data.

This is in contrast to a cloud deployment where those resources are in some distant data center owned by the cloud provider.

Although edge computing may appear to be a new concept, it is just the computing pendulum swinging to one side of the computing continuum.

Commercial computing began with the advent of mainframes in the late 1950s. Mainframes are an example of centralized computing; they were too large and expensive for one to sit on every user’s desk.

In the late 1960s, minicomputers appeared, moving compute power away from centralized control and into research labs to run experiments, onto the factory floor for process control, and into many other settings.

The pendulum moved all the way to the distributed side with the arrival of the PC in the mid-1980s. With the PC, individuals had computing power at their fingertips.

The computing pendulum swings back and forth, and today, it is swinging towards edge computing, which puts the processing and storage resources closer to where they are used and needed.

Why Edge Computing for IoT?

IoT deployments can benefit from edge computing in three ways:

  1. Reduced Network Latency

The latency in an IoT deployment is the amount of time between when an IoT sensor starts sending data and when an action is taken on the data.

Several factors impact network latency: the propagation delay through the network’s physical media; the time it takes to route data through the networking equipment (switches, routers, servers, etc.); and the time it takes to process the data. Implementing edge computing shortens that path, reducing network latency and improving real-time response (the sketch after this list gives a rough model).

  2. Reduced Network Jitter

The jitter in a network is the variation of latency over time. Some real-time IoT applications cannot tolerate network jitter if it stretches the latency enough to prevent the system from acting within the required time frame.

  3. Enhanced Security

Edge computing offers the opportunity to provide a more secure environment regardless of how you deploy: in a co-location facility or on equipment you own.

Co-location facilities are physically secure locations. If you own the edge computing equipment, it can sit in the factory where the IoT sensors are located or in another company-owned facility.
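
As a rough model of why proximity matters for latency (a sketch only; the distances, hop counts, and per-hop and processing delays below are assumptions, not measurements):

```python
# Rough one-way latency model: propagation + per-hop routing + processing.
# All figures are illustrative assumptions, not measured values.

FIBER_SPEED_M_PER_S = 2e8  # light in fiber travels at roughly 2/3 of c

def one_way_latency_ms(distance_m: float, hops: int,
                       per_hop_delay_ms: float = 0.05,
                       processing_ms: float = 1.0) -> float:
    propagation_ms = distance_m / FIBER_SPEED_M_PER_S * 1_000
    return propagation_ms + hops * per_hop_delay_ms + processing_ms

# Cloud scenario: a data center 1,500 km away, many routing hops.
print(f"cloud: {one_way_latency_ms(1_500_000, hops=12):.2f} ms")
# Edge scenario: compute 500 m away on the factory floor, few hops.
print(f"edge:  {one_way_latency_ms(500, hops=2):.2f} ms")
```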

To learn more about edge computing and why it is important for IoT, download Panduit’s “Edge Computing: Behind the Scenes of IoT” white paper – or subscribe to our blog to access all the papers in our IoT “101” white paper series.

How Packet Loss Occurs In Network Infrastructure

Causes of Packet Loss

Packet loss impacts a network in two ways: it reduces throughput and adds to latency.

But why does packet loss occur in the first place?

The following excerpt from Panduit’s “What is the Impact of Packet Loss?” white paper focuses on the root causes of packet corruption and its prevention.

A packet becomes corrupted when it encounters a bit error as it moves from one end of the network to the other. Bit errors almost always occur in the lowest layer of a protocol stack, the physical layer, whose job is to move information from one end of the network to the other.

Typically, this information is represented by a stream of 0s and 1s. The physical layer does not assign any meaning to the stream; the upper layers handle that task.
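
To see how even a tiny bit error rate translates into corrupted packets, consider this small calculation (the bit error rate and packet size are assumed for illustration):

```python
# A packet arrives intact only if every one of its bits does, so for a
# given bit error rate (BER) the corruption probability compounds per bit.

def packet_corruption_probability(ber: float, packet_bits: int) -> float:
    return 1.0 - (1.0 - ber) ** packet_bits

# Assumed example: a 1,500-byte Ethernet frame and a BER of 1e-8.
p = packet_corruption_probability(ber=1e-8, packet_bits=1_500 * 8)
print(f"{p:.4%} of packets corrupted")  # roughly 0.012%
```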

Causes of Bit Errors

Copper Cabling/Wireless Connections: Outside interference, such as lightning or other electrical noise, can cause bit errors when the physical layer uses copper cabling or a wireless connection.

Optical Networks: In optical networks, a bit error can occur if the optical module degrades, making it difficult to distinguish the stream of 0s and 1s. Other causes include improperly terminated cabling, dirty fiber optic connectors, or water penetrating the cable.

Preventing Packet Loss

Proper Installation and Maintenance of the Network:
When installing RJ45 jacks, you may untwist the copper pairs more than needed. This could unbalance the pair, allowing electromagnetic interference (EMI) to impact link performance. Cleaning the end-face of fiber optic connectors is always important, but even more so at higher network speeds.

Proper grounding and bonding eliminate differing ground potentials between pieces of networking equipment. Excess untwist, contaminated connectors, and poor grounding are all conditions that impair the receiver’s ability to distinguish the transmitted bit sequence, leading to corrupted packets.

Media Type: Media type, for example copper or fiber, should also be considered. CAT6A unshielded twisted pair copper cabling is ideal for new installations, as it provides the best performance for most applications without the added expense of shielded cable. For harsh environments where EMI is present, you may need to install shielded copper cable, which resists EMI, or fiber cabling, which is immune to it.

To learn more about how you can prevent good packets from going bad, download Panduit’s “What is the Impact of Packet Loss?” white paper – or subscribe to our blog to receive our complete 4-part series of IoT “101” white papers.