The Internet of Things (IoT) is changing everyday lives. Smart cities, smart infrastructure, driverless cars and real-time guidance are just a few exciting examples stemming from this brave, new and smart-sensing world.
But IoT and the huge amounts of data that flow through it are also bogging down the network. While the cloud has made this type of ecosystem possible, it is also being overwhelmed by the sheer quantity of data, and the latency of shuttling it, between frontline sensors and backroom servers.
Enter edge analytics. Instead of sending data to the cloud and waiting precious milliseconds for billions of decisions across millions of devices, analytics run at or near the sensor or user level, reducing the round-trip time required by traditional cloud approaches while also lowering data transmission costs.
In our view, traditional IoT approaches will fall short, cost more and provide slow service. That said, here’s what forward-thinking IT practitioners can do about it.
Three-Tier Edge Architecture
The hierarchy of edge analytics can be represented as a three-tiered architecture (pictured). The flow of data begins with sourcing of raw data from smart devices or sensors, followed by more sophisticated analysis on gateways at the edge of the network, and finally some “heavy lifting” or big data analysis with cloud computing.
In the IoT age, nearly every connected and instrumented device generates huge amounts of data. The raw data, however, is useless unless it is analyzed for meaning. Much of the collected data does not require complex analytics; it can therefore be analyzed close to its source of origination (i.e., the edge) to deliver near-instant automated results.
This is done through gateways: small, low-cost computing devices with modest memory and processing power that can make simple decisions on their own and reroute more complex tasks to the cloud for further analytics and processing. By implementing these, organizations can reduce complexity, speed results and increase scalability through more distributed computing.
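As a rough illustration, a gateway's triage logic can be sketched as a simple router. The task fields and the "simple"/"complex" split below are illustrative assumptions, not any specific product's API:

```python
def route(task):
    """Handle simple checks locally at the gateway; forward the rest to the cloud.

    `task` is an assumed dict carrying a 'complexity' tag; a real gateway
    would classify work by payload size, required model, or latency budget.
    """
    if task["complexity"] == "simple":
        # A simple decision made at the edge, e.g. a limit check on a reading.
        return {"handled_at": "edge", "result": task["payload"] > task["limit"]}
    # Anything heavier is rerouted to the cloud for further processing.
    return {"handled_at": "cloud", "forwarded": task}
```

The point of the sketch is the split itself: cheap decisions never leave the gateway, and only tasks that genuinely need cloud-scale resources generate network traffic.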
Real-World Techniques
Making sense of high-volume or otherwise "big" data is no easy task. Edge analytics, however, provides a method to ease the burden. For example, monitoring or first filtering data at the edge can deliver meaningful insights in the following ways:
Threshold crossing alerts (TCAs): The majority of data received at the edge may not be of much interest, assuming the system is working normally. To filter out such data, companies can install a tool or software with predefined threshold values for parameters. When the parameter value crosses the threshold, it will trigger an alert to the monitoring tool. Using this approach, companies can save a lot of time and money by averting the evaluation of otherwise meaningless data.
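A minimal TCA filter might look like the following sketch. The parameter names and threshold values are illustrative assumptions, not standard defaults:

```python
# Predefined per-parameter thresholds (illustrative values, not standards).
THRESHOLDS = {"temperature_c": 85.0, "vibration_mm_s": 7.1}

def check_reading(parameter, value, thresholds=THRESHOLDS):
    """Return an alert dict if the value crosses its threshold, else None."""
    limit = thresholds.get(parameter)
    if limit is not None and value > limit:
        return {"parameter": parameter, "value": value, "limit": limit}
    return None  # normal reading: filtered out at the edge, nothing sent upstream

# Only the out-of-range reading produces an alert to forward to monitoring.
readings = [("temperature_c", 72.0), ("temperature_c", 91.5)]
alerts = [a for p, v in readings if (a := check_reading(p, v))]
```

Everything inside the normal operating band is discarded at the edge; only threshold crossings consume bandwidth and downstream attention.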
Summary extraction: With this technique, companies can extract a summary of the analyzed data at the network’s edge. This summary can then be sent to the cloud for more sophisticated analysis. To accomplish this, companies can set a timer to extract the summary on a periodic basis. Doing so can dramatically reduce the amount of data being sent to the cloud.
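A sketch of summary extraction, under the assumption that readings are buffered locally and reduced to a handful of statistics per window:

```python
import statistics

WINDOW = []  # raw readings buffered at the edge between extractions

def record(value):
    """Buffer a raw sensor reading locally; nothing is sent upstream yet."""
    WINDOW.append(value)

def extract_summary():
    """Reduce the buffered window to a compact summary destined for the cloud.

    In practice a periodic timer would invoke this on a schedule; here it
    is called directly for clarity.
    """
    if not WINDOW:
        return None
    summary = {
        "count": len(WINDOW),
        "min": min(WINDOW),
        "max": max(WINDOW),
        "mean": statistics.mean(WINDOW),
    }
    WINDOW.clear()  # raw readings stay at the edge; only the summary moves on
    return summary
```

However many readings arrive during the window, the payload sent to the cloud stays a fixed, small size.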
Parameterized models: This more sophisticated technique is an advanced version of summary extraction. Here, companies periodically fit an appropriate model to the data and extract only the model's parameters rather than a full summary. Only those parameters are then transferred to the cloud.
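As a concrete (assumed) example of the idea, a window of readings could be fitted with a least-squares line at the edge, so only two numbers, slope and intercept, travel to the cloud instead of the whole window:

```python
def fit_line(xs, ys):
    """Least-squares fit of y = a*x + b; only (a, b) need to go to the cloud."""
    n = len(xs)
    mean_x = sum(xs) / n
    mean_y = sum(ys) / n
    # Standard closed-form least-squares estimates for slope and intercept.
    slope = (sum((x - mean_x) * (y - mean_y) for x, y in zip(xs, ys))
             / sum((x - mean_x) ** 2 for x in xs))
    intercept = mean_y - slope * mean_x
    return slope, intercept
```

A linear trend is just the simplest case; the same pattern applies to any model whose parameter vector is far smaller than the data it summarizes.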
In short, edge analytics limits the amount of data sent to the cloud. Not only does this reduce transmission costs and turnaround times, but it also frees the cloud to sift through less chaff in order to find more analytical wheat. Furthermore, in-place analytics helps organizations avoid data duplication and overload.
Cloud-only computing will remain costly and time-consuming. The expected growth in IoT-based applications will encourage vendors to commercialize edge analytics to deliver on the promises of lower latency, near-real-time response rates and better optimized user experiences than existing cloud analytics systems can provide.
The good news is an enormous number of applications are already available for edge analytics. And every business opportunity across any industry that requires low latency and communications accuracy — including automotive, consumer electronics, energy, utilities and healthcare — will find the implementation of edge analytics extremely helpful.
The trick is to incorporate both models to their best advantages: edge analytics where time is of the essence, and cloud analytics where security and data volumes are the deciding factors. Thus, it is imperative that IoT strategies make best use of both cloud computing and edge analytics to realize an ideal IoT ecosystem.