Edge Computing: How to make the switch

For some IoT applications, classic IT services cannot keep up with the data volumes and latency requirements. Edge computing promises a remedy. However, a few points must be considered for a successful implementation. […]

Edge computing and the Internet of Things (IoT) stand for a decentralized IT architecture: instead of being forwarded to a distant data center, for example in the cloud, data is analyzed and processed close to where it originates. The constantly growing number of IoT devices and their sensors generate ever new mountains of data: a single aircraft turbine produces about 333 gigabytes of data per minute, a medium-sized petroleum platform reaches seven to eight terabytes per day, and an autonomous vehicle generates a whole petabyte or more every day.
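To put the per-minute figure into perspective, a quick back-of-the-envelope calculation (assuming the turbine runs continuously, which is a simplification) shows what such a data rate means per day:

```python
# Back-of-the-envelope check: what does 333 GB per minute mean per day?
gb_per_minute = 333
gb_per_day = gb_per_minute * 60 * 24      # minutes per hour * hours per day
tb_per_day = gb_per_day / 1_000           # decimal terabytes
print(f"{gb_per_day:,} GB/day ≈ {tb_per_day:,.0f} TB/day")
# -> 479,520 GB/day ≈ 480 TB/day, far too much to ship to a central data center in real time
```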

The majority of this information is “disposable data” with little or no potential for later reuse. At the same time, the IoT device needs much of that data for immediate, real-time decision-making. This combination of sheer volume and low-latency requirements brings the conventional computing model to its knees.

Conventional, centralized data centers are almost by definition geographically too far away to guarantee the required response times. Edge computing therefore forms an intermediate layer between the core data center and the IoT sensors of the end devices: time-critical data is analyzed on site and then discarded, and only the insights derived from it are forwarded to the server or the cloud.
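A minimal sketch of this division of labor might look as follows; the sensor feed, window size and forwarding function are hypothetical placeholders, not part of any specific product:

```python
from statistics import mean

def summarize_window(readings):
    """Condense a window of raw sensor readings into a compact insight."""
    return {"count": len(readings), "mean": mean(readings), "max": max(readings)}

def edge_node(sensor_stream, window_size=60, forward=print):
    """Process raw data locally; forward only derived summaries upstream."""
    window = []
    for reading in sensor_stream:
        window.append(reading)
        if len(window) == window_size:
            forward(summarize_window(window))  # only the insight leaves the edge
            window.clear()                     # raw data is discarded locally

# Example with simulated temperature readings instead of a real sensor feed
edge_node((20 + (i % 7) * 0.5 for i in range(300)), window_size=60)
```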

In order for companies to benefit from these advantages, good preparation before the actual implementation of an edge platform is essential. Preparation here means that companies must first understand and analyze the system requirements of the individual applications. It also needs to be clarified how heavily the existing IT infrastructure is currently utilized, when peak loads occur, and where there is potential for optimization. Tools for live analysis of the IT infrastructure are helpful in this context: they visualize the IT environment almost in real time and thus help build an understanding of the project requirements.

If companies take two measurements at different points in time, they get a feel for their data growth and can make realistic forecasts. Such a tool, fed with live data, provides the necessary transparency, accelerates decision-making and minimizes business risk, because companies no longer have to rely on pure guesswork when purchasing the platform. Costly oversizing of the IT systems thus becomes a thing of the past.
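A rough sketch of such a forecast, assuming simple linear growth between the two measurement points (the dates, volumes and planning horizon below are made up purely for illustration):

```python
from datetime import date

def forecast_capacity(d1, vol1_tb, d2, vol2_tb, target):
    """Linearly extrapolate storage demand from two utilization measurements."""
    daily_growth = (vol2_tb - vol1_tb) / (d2 - d1).days
    return vol2_tb + daily_growth * (target - d2).days

# Two hypothetical measurements, three months apart
projected = forecast_capacity(
    date(2021, 1, 1), 120.0,   # first measurement: 120 TB
    date(2021, 4, 1), 135.0,   # second measurement: 135 TB
    date(2022, 4, 1),          # planning horizon: one year later
)
print(f"Projected volume: {projected:.0f} TB")  # ~196 TB
```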

As far as the hardware itself is concerned, edge computing does not introduce a fundamentally new architecture. Rather, the name covers small, multifunctional systems in compact housings that are often designed for use outside air-conditioned data centers.

When choosing the edge platform, however, companies should make sure that analytics software and deep learning functions filter the data directly at the point of entry, so that only the relevant information is sent to the cloud for further processing and evaluation. The fundamental prerequisite for this is a streaming data platform that enables powerful real-time analysis by collecting, storing and evaluating data streams at the edge of the network.
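A minimal sketch of such filtering at the point of entry, with a simple threshold rule standing in for the analytics or deep-learning model (the stream source and the relevance rule are assumptions for illustration):

```python
def relevant(record, limit=90.0):
    """Stand-in for an analytics/ML model deciding what is worth forwarding."""
    return record["value"] > limit

def filter_at_edge(stream, send_to_cloud):
    """Consume a continuous data stream and forward only relevant records."""
    for record in stream:
        if relevant(record):
            send_to_cloud(record)  # only the relevant subset goes upstream
        # everything else is handled, aggregated or dropped locally

# Simulated stream of sensor records; in practice this would be a message bus
stream = ({"sensor": "s1", "value": v} for v in (42.0, 95.5, 61.2, 99.9))
filter_at_edge(stream, send_to_cloud=print)
```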

Only then can predictive maintenance, for example, use defined recognition patterns to predict when a machine part is about to fail, automatically trigger the ordering of the spare part and schedule a repair window. Industry-specific reference architectures for edge solutions, developed together with industry specialists, can also be helpful, although they are not absolutely necessary.
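As an illustration only, a very reduced recognition pattern could look like the sketch below; the vibration threshold, the sample count and the ordering/scheduling hooks are all hypothetical placeholders for real ERP and maintenance-planning integrations:

```python
from collections import deque

VIBRATION_LIMIT = 4.5   # hypothetical threshold in mm/s
PATTERN_LENGTH = 5      # consecutive readings that constitute a failure pattern

def order_spare_part(part_id):          # placeholder for an ERP integration
    print(f"ordering spare part for {part_id}")

def schedule_repair_window(part_id):    # placeholder for a maintenance planner
    print(f"scheduling repair window for {part_id}")

def monitor(part_id, vibration_stream):
    """Trigger maintenance once readings stay above the limit for a full pattern."""
    recent = deque(maxlen=PATTERN_LENGTH)
    for value in vibration_stream:
        recent.append(value)
        if len(recent) == PATTERN_LENGTH and all(v > VIBRATION_LIMIT for v in recent):
            order_spare_part(part_id)
            schedule_repair_window(part_id)
            break

monitor("bearing-07", [3.9, 4.2, 4.8, 4.9, 5.1, 5.3, 5.6, 5.8])
```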

Another point that companies must have on their radar when it comes to edge computing is security: the numerous IoT systems at the network edge quickly become a security risk, because as a rule they are far less well protected than central systems and thus offer attackers numerous entry points. In addition, many end devices inherently have security flaws such as weak credentials, zero-day vulnerabilities, missing security updates and the use of outdated protocols that cannot withstand modern attack methods.

The fact that some communication protocols and standards around edge computing are not yet mature exacerbates the problem further. Companies should therefore pay attention to the security architecture and the handling of patches and updates when selecting devices, or retrofit these capabilities. Naturally, all data should be encrypted both at rest and in transit. In addition, tiered access controls must be implemented for the individual devices and systems.
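As a minimal illustration of encrypting payloads before they leave an edge device, here is a sketch using symmetric encryption from the widely used Python `cryptography` package; the in-code key generation and the payload are simplifications, and real deployments would rely on TLS for transport and on proper key management:

```python
from cryptography.fernet import Fernet

# In practice the key comes from a secrets manager, never from the code itself
key = Fernet.generate_key()
cipher = Fernet(key)

payload = b'{"sensor": "s1", "mean_temp": 21.5}'
token = cipher.encrypt(payload)          # encrypted before transmission or storage
print(cipher.decrypt(token) == payload)  # the receiving side restores the data
```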

Given good planning, the migration itself holds no stumbling blocks; a fall-back scenario is nevertheless part of professional preparation. Experience shows, however, that IoT and edge projects fail within the first three months if too many parties are on board.

Whenever different niche solutions have to be merged into a functioning platform, incompatibilities and differing approaches invariably cause problems. Used correctly, edge computing enables companies to save resources, automate processes, improve products and services, and build completely new business models. (hi)

*Uwe Wiest is General Manager & Director Sales OEM & IoT Solutions DACH at Dell Technologies.
