Choosing an Edge Computing Platform
Does 25 ms sound unreasonable for processing terabytes of data? In today's IoT world, that is the expectation: collect gigantic amounts of data, process and analyze it, and generate snappy dashboards, all in just 25 ms.
Leaders across industries emphasize that data has no value if it cannot be processed and delivered to a decision maker quickly enough to act on. Today, even a minute's delay is unacceptable.
Edge computing is fueling those huge expectations. And it ought to be.
Edge computing dramatically amplifies the value of machine data. Compared with other distributed computing approaches (peer-to-peer, cloud, and so on), it is the definitive answer for processing data from heterogeneous networks, sporadically connected devices, low-power machines, and far-flung remote devices.
How Valuable Is The Data At The Edge?
To understand this, let's take the example of an offshore wind turbine-based micro-grid. A typical offshore grid ranges from a couple of installations to about a hundred units. In the simplest scenario, the access points are remote cameras hosted on or near the turbines. These cameras feed visual data back to a remote diagnostics center. On average, 100 cameras send about 100 GB of data every day. Monitoring experts can remotely fix more than half of the issues that occur; for instance, they may need to align the blades to match the wind speed and direction. These decisions keep the units performing optimally.
The round trips that 100 GB of data makes to the remote data center are neither cost-effective nor helpful to diagnostics engineers in maintaining the units at optimal performance. Now consider thousands of interconnected units in heterogeneous network environments spanning Bluetooth, 4G, 3G, private VPNs, WiFi, and so on. Latency, data collection, device discovery, and the inter-relationships among all the devices at the grid edge are just the tip of the iceberg when it comes to identifying challenges.
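A back-of-envelope calculation makes the bandwidth argument concrete. The sketch below uses the 100 cameras and ~100 GB/day figure from the example above; the edge retention ratio is an assumption chosen purely for illustration, not a measured Glassbeam number.

```python
# Back-of-envelope illustration. The camera count and per-camera volume
# come from the wind-turbine example; the retention ratio is assumed.
CAMERAS = 100
GB_PER_CAMERA_PER_DAY = 1.0          # ~100 GB/day total across the grid

daily_raw_gb = CAMERAS * GB_PER_CAMERA_PER_DAY

# Suppose an edge filter forwards only the frames and events that matter
# for diagnostics -- a 5% retention ratio is a hypothetical value.
EDGE_RETENTION = 0.05

daily_shipped_gb = daily_raw_gb * EDGE_RETENTION
savings_pct = 100 * (1 - daily_shipped_gb / daily_raw_gb)

print(f"raw: {daily_raw_gb:.0f} GB/day, shipped: {daily_shipped_gb:.0f} GB/day")
print(f"bandwidth saved: {savings_pct:.0f}%")
```

Even a modest retention ratio at the edge removes the vast majority of the round-trip traffic to the central data center.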
Three vital questions must be considered when assessing the value of data at the edge and making it human-digestible, whether from a smart grid or any other device in the Things space:
- How accurately can we identify the data being generated and its usefulness?
- How well can the platform integrate with the existing infrastructure?
- How much network disruption and latency degradation can the analytics infrastructure overcome?
Cutting the Data Transmission Path to Near Zero
Glassbeam Edge™ Analytics (GB Edge) negates the inherent losses of having data make round trips to a central data center to analyze millions of data points. Over the past year, we have built highly scalable and robust data collection and analytical services at the edge. With GB Edge, near-real-time services are specifically designed to integrate remote systems and devices, collect disparate data, and make sense of it. Whatever the data format, the information it produces, its taxonomy, semantics, or nomenclature, any machine data analytics developer can now quickly model the data and filter out the information residing at the network edge that matters to the business. These are challenges data analysts would otherwise struggle to overcome; in fact, GB Edge cuts the data processing and collection effort by over 80%.
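The idea of collecting and filtering at the edge can be sketched in a few lines. This is a minimal illustration of the general pattern, not Glassbeam's implementation: raw records are summarized on the device side, and only actionable records plus a compact summary are forwarded. The record format and severity levels are assumptions.

```python
from collections import Counter

def summarize_at_edge(records, keep_levels=("ERROR", "WARN")):
    """Hypothetical edge-side filter: forward only actionable records
    plus a compact count summary, instead of shipping every raw point."""
    forwarded = [r for r in records if r["level"] in keep_levels]
    summary = dict(Counter(r["level"] for r in records))
    return forwarded, summary

# Illustrative records from a turbine monitoring camera controller.
records = [
    {"level": "INFO",  "msg": "blade pitch nominal"},
    {"level": "WARN",  "msg": "gearbox temperature rising"},
    {"level": "ERROR", "msg": "yaw motor stalled"},
    {"level": "INFO",  "msg": "wind speed 12 m/s"},
]
forwarded, summary = summarize_at_edge(records)
```

Here only two of the four records would leave the edge, while the summary still tells the diagnostics center what the overall traffic looked like.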
No Edge Agent Software or Protocol Changes Required
With GB Edge, all endpoints at the network edge continue to communicate using their existing data protocols, requiring no changes or special applications to collect data and filter out the information. Because a developer can handle the conversion of edge data into human-readable data models, only the necessary fraction of the data is harnessed right where it resides, making it possible to generate numerous decision parameters that take into account how the devices are integrated on the edge network. Most of the developer's effort now goes into interpreting the incoming data rather than collecting terabytes of data, most of which has no real business value.
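Converting raw device output into a human-readable data model typically amounts to parsing whatever format the endpoint already emits. The sketch below is a hypothetical example: the raw line format, field names, and parser are illustrative assumptions, not a Glassbeam specification.

```python
import re

# Hypothetical raw line from a turbine camera controller; the format
# and field names are illustrative, not a vendor specification.
RAW = "2024-03-01T08:15:00Z cam-17 status=OK temp_c=41 uptime_s=86400"

PATTERN = re.compile(r"(?P<ts>\S+)\s+(?P<device>\S+)\s+(?P<kv>.*)")

def to_model(line):
    """Convert one raw line into a human-readable data model (a dict),
    without requiring any change to the device's own protocol."""
    m = PATTERN.match(line)
    fields = dict(pair.split("=") for pair in m.group("kv").split())
    return {"timestamp": m.group("ts"), "device": m.group("device"), **fields}

model = to_model(RAW)
```

The point is that modeling happens on the analytics side: the endpoint keeps speaking its native format, and the developer's effort shifts to interpretation.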
What’s in Store with GB Edge?
Here are some salient points to consider when evaluating an edge-based analytics platform:
- Extending existing use cases and enabling fresh real-world applications
- Developing best practices and Edge architectures that ease the development of data collection, data quality, and standard requirements
- Driving a global standard for making data usable where the device resides
- Filtering out data that neither generates revenue nor helps improve the product or technical services
- Consuming minimal bandwidth and infrastructure capacity by processing only useful data and discarding the rest
- Reducing data loss to near zero, or within acceptable variance, while streaming data in real time
GB Edge in Action
We are already helping a US-based Smart Grid device manufacturing startup harness the power of GB Edge. Some of the major areas where we've made an impact include:
- Gaining high visibility into operations across a multi-city smart grid network
- Aiding the pre-sales and after-sales departments with self-service dashboards on resource use and capacity
- Helping technical services understand device operations in near-real time through preemptive troubleshooting context