In this article, you will learn how Industrial IoT (IIoT) platforms can improve device connectivity and data availability to support your business applications. We will explain the features you should consider when evaluating an IIoT platform, such as connectivity management, buffering, prioritization, edge computing, and more.
When we talk about device connectivity, we refer to the ability of devices to establish and maintain a reliable connection with some central system. For the purposes of this article, we will discuss connectivity between the local network (and devices within) and the IIoT platform that lives in the cloud and is accessed over the Internet.
Connecting your devices to the IIoT platform seems straightforward: find a reliable connectivity provider, integrate your devices with the platform, and that’s it. In theory, yes; in reality, not quite. Connectivity providers face outages occasionally, and while the offering is rich in urban areas, what about rural regions without developed infrastructure, or places without permanent connectivity at all (such as offshore cargo ships)? In such places, the options for connecting your devices are limited, and even where a connection is possible, its characteristics are usually far from optimal. Devices get disconnected, packets are dropped, and up/down bandwidth fluctuates. Communicating with your device fleet and keeping operations uninterrupted becomes highly challenging, which can result in delayed data delivery or, worse, data loss. If your IoT solutions rely on an uninterrupted data flow, the impact on the business can be significant.
The rule of thumb here is to always think of the worst-case scenario, no matter how good the infrastructure is on paper.
For example, what if your site is down for an hour, a day, or a week, or your connection is so slow that it can’t keep up with the incoming data? What will be the impact on the business? It’s important to run these disaster recovery scenarios to be ready for future incidents.
Imagine a fleet of welding robots operating at a factory: 50 robots, each sending 10 messages per minute, with an average message size of 5 kB.
With a simple calculation, the connectivity requirement is: 50 × 10 × 5 kB = 2,500 kB per minute.
Your connection must be able to upload at least 2,500 kB per minute. But what if you’re down for an hour or a whole day? Data will start accumulating, putting stress on your local infrastructure. The math is simple: if your factory’s connectivity is down for 24 hours, you need to buffer 2,500 kB × 60 × 24 = ~3,600 MB within your local network.
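If you want to make this sizing exercise repeatable, a small script is enough. The values below are the illustrative numbers from the welding-robot example, not measurements from a real site.

```python
# Rough sizing of upload bandwidth and local buffer needs.
# Illustrative values from the welding-robot example above.
ROBOTS = 50                # number of devices
MESSAGES_PER_MINUTE = 10   # messages each device sends per minute
MESSAGE_SIZE_KB = 5        # average message size in kB

upload_kb_per_minute = ROBOTS * MESSAGES_PER_MINUTE * MESSAGE_SIZE_KB
print(f"Required upload: {upload_kb_per_minute} kB/minute")  # 2500 kB/minute

# Local buffer needed to survive an outage of a given length.
for outage_hours in (1, 24, 24 * 7):
    buffer_mb = upload_kb_per_minute * 60 * outage_hours / 1000
    print(f"{outage_hours:>4} h outage -> buffer ~{buffer_mb:,.0f} MB")
```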
Is your local infrastructure able to handle that without losing data? And if you are losing data, are you losing the least important data while keeping your business-critical data intact? How do you handle the situation when connectivity comes back up?
There is no magic pill that resolves the situation, but IIoT platforms can help you significantly.
We will examine how IIoT platforms can mitigate connectivity issues and improve data delivery guarantees to ensure you use your IoT solutions’ full potential.
Not all IIoT platforms are built equally, but some are designed to cope with the challenges we described. When selecting the right IIoT platform for your IoT projects, you should also look for the features described below.
We will look into the features and capabilities of SDKs and gateways, covering the following areas in detail: connectivity management, buffering, prioritization, chunking, compression, and edge computing.
The SDK manages the connection between devices and the platform, whether devices are connected directly or via a gateway. No matter how harsh the conditions are, the device must always recover the connection without human intervention; all necessary reconnects, retries, and re-registrations should be handled gracefully.
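To illustrate what “gracefully handled” means in practice, here is a minimal sketch of a reconnect loop with exponential backoff and jitter. The connect(), register(), and is_alive() callables are hypothetical stand-ins for whatever primitives a concrete SDK exposes; a real SDK hides this loop from you entirely.

```python
import random
import time

MAX_BACKOFF_SECONDS = 300  # cap the wait so recovery is not delayed forever

def keep_connected(connect, register, is_alive):
    """Keep a device session alive without human intervention.

    connect, register, and is_alive are hypothetical callables standing in
    for an SDK's connection, (re-)registration, and health-check primitives.
    """
    backoff = 1
    while True:
        try:
            session = connect()
            register(session)          # re-register after every reconnect
            backoff = 1                # reset backoff once we are healthy
            while is_alive(session):
                time.sleep(5)          # periodic health check
        except ConnectionError:
            pass                       # fall through to the retry logic
        # Exponential backoff with jitter to avoid reconnect storms.
        time.sleep(backoff + random.uniform(0, backoff))
        backoff = min(backoff * 2, MAX_BACKOFF_SECONDS)
```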
When the connection becomes unstable or bandwidth is limited, the device or gateway starts buffering data. Messages are sent to the platform once connectivity is back up. It’s handy to buffer data along the whole pipeline: both on the device itself and on the gateway.
It’s desirable to have both enabled to mitigate issues with the local network as well as with the connectivity to the Internet. By doing so, you reduce the risk of losing data. However, keep in mind that your data storage is not limitless. Limited storage capacity especially applies to devices; the gateway usually has more storage. Nevertheless, you might face a situation where your connection is down for a long time and you can’t keep buffering data because your storage is already full, while the components still need to handle the constant flow of incoming messages. To cope with this, the buffering feature should support multiple strategies, for example discarding the oldest data first, rejecting new incoming data once the buffer is full, or discarding data based on its priority.
Selecting the right strategy depends on the requirements of your solution. Configuration of the strategies should be available from the cloud without the need to configure your devices manually.
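As a simple illustration, the sketch below contrasts two possible “buffer full” strategies: dropping the oldest messages versus rejecting new ones. The in-memory deque is just a stand-in for the persistent storage a real device or gateway would use.

```python
from collections import deque

class MessageBuffer:
    """Bounded buffer illustrating two 'buffer full' strategies."""

    def __init__(self, capacity, strategy="drop_oldest"):
        self.queue = deque()
        self.capacity = capacity
        self.strategy = strategy       # "drop_oldest" or "drop_newest"

    def offer(self, message):
        if len(self.queue) < self.capacity:
            self.queue.append(message)
            return True
        if self.strategy == "drop_oldest":
            self.queue.popleft()       # make room by discarding old data
            self.queue.append(message)
            return True
        return False                   # "drop_newest": reject incoming data

    def drain(self, send):
        """Flush buffered messages once connectivity is back up."""
        while self.queue:
            send(self.queue.popleft())
```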
Prioritization comes in handy when bandwidth on site is limited or fluctuates often. You know that from time to time the upload speed is so low that you can send only a fraction of the data over the Internet. However, not all data is equal. Some data feeds business-critical IoT solutions that need low-latency processing; some is just metrics tracking the overall performance of a machine, with low requirements for data availability and durability (some of it can be lost or delayed).
The SDK and gateway should consider message priority and deliver high-priority messages first, discarding low-priority messages if necessary.
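A minimal sketch of what priority-aware delivery could look like, assuming each message carries a numeric priority: the highest-priority message is always sent first, and when the outgoing queue is full, the lowest-priority message is the one that gets dropped.

```python
import heapq
import itertools

class PriorityOutbox:
    """Send high-priority messages first; drop low-priority ones under pressure."""

    def __init__(self, capacity):
        self.heap = []                     # entries: (negated priority, sequence, message)
        self.capacity = capacity
        self.counter = itertools.count()   # keeps ordering stable within a priority

    def enqueue(self, message, priority):
        heapq.heappush(self.heap, (-priority, next(self.counter), message))
        if len(self.heap) > self.capacity:
            # The heap is optimized for popping the highest priority,
            # so finding the lowest-priority entry is a linear scan here.
            lowest = max(self.heap)        # largest negated priority = lowest priority
            self.heap.remove(lowest)
            heapq.heapify(self.heap)

    def next_to_send(self):
        return heapq.heappop(self.heap)[2] if self.heap else None
```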
It’s a good practice to chunk larger data to avoid retrying a big file upload. Imagine uploading a 100 MB video file from your security camera, and your device disconnects when 99 MB has been transferred. Often, you need to start over, unnecessarily consuming the site’s bandwidth. When you retry the file, you end up using 199 MB of your upload capacity, and that’s only if the upload succeeds on the second try; the likelihood of failure for such a big file is high.
For such cases, chunking is handy, and it’s even better if it’s automatic. Let’s say the chunk size is 1 MB and we again want to upload a 100 MB video file. Once you initiate the upload, the SDK automatically splits the file into ~100 chunks and uploads them individually. In the worst-case scenario, you only have to re-upload a single chunk (1 MB), which is much better than re-uploading 100 MB.
The SDK and gateway should support chunking, ideally automatic: when the size of a message reaches a certain threshold, it’s automatically chunked and uploaded. The platform ensures that the data is concatenated upon arrival, so you get your original file rather than a pile of chunks and don’t have to deploy any external workloads.
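A rough sketch of what automatic chunking looks like from the sending side; upload_chunk() is a hypothetical stand-in for a single network transfer, and only the failed 1 MB chunk is retried, not the whole file.

```python
CHUNK_SIZE = 1024 * 1024  # 1 MB

def upload_in_chunks(path, upload_chunk, max_retries=5):
    """Split a file into chunks and retry each chunk independently.

    upload_chunk(index, data) is a hypothetical stand-in for a single
    network transfer provided by the SDK or platform API.
    """
    with open(path, "rb") as f:
        index = 0
        while True:
            data = f.read(CHUNK_SIZE)
            if not data:
                break
            for attempt in range(max_retries):
                try:
                    upload_chunk(index, data)   # only this 1 MB is retried on failure
                    break
                except ConnectionError:
                    if attempt == max_retries - 1:
                        raise
            index += 1
```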
Compressing data is one of the helpful bandwidth optimization techniques you can use. For human-readable data, such as JSON messages, you can easily reduce the size by 90+ percent, which has a significant impact on bandwidth at scale. On the other hand, compressing images and videos doesn’t help much, since they’re already compressed internally. It’s also important to keep an eye on the capabilities of your device: compression increases the CPU load of a machine, and not all machines have enough system and power resources. Likewise, decompression in the cloud can significantly increase costs because storage is typically cheaper than CPU.
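The claim about human-readable data is easy to check yourself; the snippet below gzip-compresses a repetitive, telemetry-style JSON payload and prints the size reduction. Actual ratios depend on how repetitive your real data is.

```python
import gzip
import json

# A repetitive telemetry-style payload; real-world ratios depend on your data.
payload = json.dumps(
    [{"deviceId": f"robot-{i}", "temperature": 21.5, "status": "ok"} for i in range(1000)]
).encode("utf-8")

compressed = gzip.compress(payload)
saving = 100 * (1 - len(compressed) / len(payload))
print(f"original: {len(payload)} B, compressed: {len(compressed)} B, saved ~{saving:.0f}%")
```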
The platform should support compression as a first-class feature. On the SDK side, data can be automatically compressed in the background, sent to the cloud, and automatically decompressed and stored on the platform’s storage. Hence, you get the original file without thinking about external processing.
Additionally, it’s essential that you can further configure how the feature behaves, for example how compressed data is treated in the cloud. Storing data compressed in the cloud can be helpful if you don’t plan to process it often and just need to archive it, since you save a lot of storage space.
The best way to optimize your bandwidth is to not send data over the Internet at all. You should bring data processing as close to the devices as possible. Ideally, your workload runs on the device itself; however, that is often impossible due to limited device hardware and software capabilities, in which case you must offload it to specialized hardware within the local network. In general, processing data within the local network reduces latencies and lowers the need for bandwidth capacity. Keep in mind that it’s also unwise to depend on Internet connectivity for real-time applications. Let’s look at an example.
Imagine a factory producing toys and a processing line moving the final product to the warehouse. By equipping the processing line with a camera and installing the proper infrastructure, we can detect faulty products automatically. The camera streams video footage to an edge component that runs a machine-learning model detecting defective products. Once a certain threshold is reached, we can instruct the machines to halt or change their settings to avoid further deterioration and unnecessary waste. Doing such processing in the cloud would be impractical: streaming bulky video files over the Internet results in high processing latencies and enormous bandwidth usage, and relying on Internet connectivity for business-critical applications like this is highly risky.
The need for edge computing is usually addressed by an edge component that runs as part of the gateway. The edge module can run containers (think, e.g., a Docker image) encapsulating your workload. Handling such workloads shouldn’t be a hassle: the platform should offer a simple cloud interface supporting the whole lifecycle of the workloads, so deploying, updating, stopping, or deleting them is easy even at scale, whether you prefer a web interface or APIs.
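As an illustration of the mechanics underneath such an edge module, the sketch below pulls and runs a containerized workload on the local host using the Docker SDK for Python. In a real platform, these steps would be triggered from the cloud interface rather than run by hand, and the image name is only a placeholder.

```python
import docker
from docker.errors import NotFound

def deploy_workload(image, name):
    """Pull and (re)start a containerized edge workload on the local host.

    A real edge module would receive these instructions from the cloud;
    the image name used here is only a placeholder.
    """
    client = docker.from_env()
    client.images.pull(image)

    # Replace an existing instance of the workload, if any.
    try:
        old = client.containers.get(name)
        old.stop()
        old.remove()
    except NotFound:
        pass

    return client.containers.run(
        image, name=name, detach=True, restart_policy={"Name": "always"}
    )

# Example (placeholder image name):
# deploy_workload("registry.example.com/defect-detector:1.2.0", "defect-detector")
```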
Uninterrupted data flow towards IoT solutions, usually deployed in the cloud, is crucial for the business. Aiming for a permanently stable connection between your machines and a central system in the cloud is a false goal; the IoT infrastructure should instead be designed to cope with connectivity interruptions, always assuming the worst possible scenario.
IIoT platforms offer various out-of-the-box features and capabilities that can significantly improve device connectivity and data availability: connectivity management, buffering, prioritization, chunking, compression, and edge computing.
Those features and capabilities are usually implemented within two crucial IIoT platform components: the device SDK and the gateway.
We at Spotflow have gone through these challenges while helping our customers build reliable IoT infrastructure. We’re building the Spotflow IIoT platform, which includes all the features and capabilities described above to support uninterrupted device operations.