Vectra AI explains rapid development in artificial intelligence

Machine Learning at the Edge, Graph Neural Networks and more

Even in 2020 and 2021, when there was no shortage of spectacular news, artificial intelligence (AI) gained a stronger presence in the mainstream thanks to new advances. In particular, OpenAI’s GPT-3 showed new and surprising ways in which AI could soon enter daily life. Because of this rapid progress, predictions about the future of AI are difficult, but some areas seem ripe for a breakthrough. Allan Ogwang, Ben Wiener and Christopher Thissen, all data scientists at Vectra AI, look ahead to the coming months and give some examples of AI developments that are particularly promising.

Transformers

Two of the biggest AI achievements of 2020 quietly shared the same underlying architecture. Both OpenAI’s GPT-3 and DeepMind’s AlphaFold are based on a sequence-processing model called the Transformer. Although transformer architectures have been around since 2017, GPT-3 and AlphaFold demonstrated the remarkable ability of transformers to learn deeper and faster than the previous generation of sequence models and to perform well on problems outside natural language processing.

In contrast to earlier sequence-modeling architectures such as recurrent neural networks and LSTMs (long short-term memory networks), transformers depart from the paradigm of sequential data processing. They process the entire input sequence at once and use a mechanism called attention to learn which parts of the input are relevant in relation to other parts. This allows transformers to easily relate distant parts of the input sequence to each other – a task that recurrent models are known to struggle with. In addition, large parts of the training can be carried out in parallel, which makes better use of the massively parallel hardware that has become available in recent years and significantly shortens training time. Researchers will no doubt be looking for new uses for this promising architecture – and there is good reason to expect positive results. In fact, already in 2021, OpenAI modified GPT-3 to generate images from text descriptions. The Transformer looks ready to dominate 2022 as well.
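
To make the attention mechanism concrete, here is a minimal sketch of scaled dot-product self-attention in Python with PyTorch; the function name, dimensions and random weights are illustrative assumptions, not code from any of the systems mentioned above.

import torch
import torch.nn.functional as F

def self_attention(x, w_q, w_k, w_v):
    # x holds the whole sequence at once: (sequence_length, d_model)
    q = x @ w_q  # queries
    k = x @ w_k  # keys
    v = x @ w_v  # values
    d_k = k.shape[-1]
    # every position attends to every other position, so distant tokens
    # can be related directly rather than through many recurrent steps
    scores = (q @ k.T) / (d_k ** 0.5)
    weights = F.softmax(scores, dim=-1)
    return weights @ v

# toy example: a sequence of 5 tokens with 16-dimensional embeddings
x = torch.randn(5, 16)
w_q, w_k, w_v = (torch.randn(16, 16) for _ in range(3))
out = self_attention(x, w_q, w_k, w_v)  # shape (5, 16)

Because the score matrix is computed for all positions at once, the whole step reduces to a few large matrix multiplications, which is what makes transformer training so amenable to parallel hardware.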

Graph neural networks

In many areas, data naturally takes the form of a graph: computer networks, social networks, molecules and proteins, and transport routes are just a few examples. Graph neural networks (GNNs) make it possible to apply deep learning to graph-structured data, and GNNs can be expected to become an increasingly important AI method in the future. More specifically, methodological advances in a few key areas can be expected to drive the wider application of GNNs in 2022.

Dynamic graphs are the first area of importance. While most GNN research so far has assumed a static, immutable graph, the scenarios above inevitably involve change over time: in social networks, for example, members join (new nodes) and friendships change (different edges). In 2021, there were already some efforts to model time-varying graphs as a series of snapshots; in 2022, this burgeoning line of research will expand, with an emphasis on approaches that model a dynamic graph as a continuous time series. Such continuous modeling should enable GNNs to discover and learn from temporal structure in graphs in addition to the usual topological structure.
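
To illustrate the difference between the two views, the sketch below contrasts a sequence of snapshots with a continuous stream of timestamped edge events; the data structures are simple assumptions chosen for illustration, not the API of any particular GNN library.

from dataclasses import dataclass

# snapshot view: one static edge set per discrete time step
snapshots = [
    {("alice", "bob")},                    # t = 0
    {("alice", "bob"), ("bob", "carol")},  # t = 1: a new friendship appears
    {("bob", "carol")},                    # t = 2: an old edge disappears
]

# continuous view: every change is a timestamped event, so a model can
# learn from the exact timing of interactions, not just their order
@dataclass
class EdgeEvent:
    src: str
    dst: str
    time: float
    added: bool  # True = edge created, False = edge removed

events = [
    EdgeEvent("alice", "bob", 0.0, True),
    EdgeEvent("bob", "carol", 0.7, True),
    EdgeEvent("alice", "bob", 1.4, False),
]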

Improvements to the message passing paradigm will be another important advance. A common way to implement graph neural networks is message passing, in which information about nodes is aggregated by “passing” it along the edges that connect neighbors. Although intuitive, message passing struggles to capture effects in which information must travel long distances across a graph. Breakthroughs beyond this paradigm are expected next year, for example by iteratively learning which information-propagation paths are most relevant, or even by learning an entirely new causal graph from a relational data set.
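
For readers unfamiliar with the idea, the following is a minimal sketch of a single message-passing step in Python with NumPy, assuming a plain adjacency list and mean aggregation; the function name and the aggregation choice are illustrative, not those of any specific GNN framework.

import numpy as np

def message_passing_step(node_features, adjacency, weight):
    # node_features: (num_nodes, d); adjacency: dict mapping node -> neighbor list
    new_features = np.zeros_like(node_features)
    for node, neighbors in adjacency.items():
        if neighbors:
            # "pass" each neighbor's feature vector along the connecting edge
            aggregated = node_features[neighbors].mean(axis=0)
        else:
            aggregated = np.zeros(node_features.shape[1])
        # combine the node's own state with the aggregated messages
        new_features[node] = np.tanh((node_features[node] + aggregated) @ weight)
    return new_features

# toy path graph 0 - 1 - 2: information from node 0 reaches node 2
# only after two steps, which hints at the long-distance limitation
adjacency = {0: [1], 1: [0, 2], 2: [1]}
features = np.random.randn(3, 4)
weight = np.random.randn(4, 4)
features = message_passing_step(features, adjacency, weight)

Each step only moves information one hop, so capturing long-range dependencies requires stacking many such steps – exactly the limitation that motivates the research directions mentioned above.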

Applications

Many of the top news stories of the past year emphasized emerging progress in practical applications of AI, and 2022 seems to be the time to capitalize on these advances. In particular, applications that rely on natural language understanding are likely to make progress as access to the GPT-3 API becomes available. The API allows users to tap the capabilities of GPT-3 without having to train their own AI – an otherwise expensive undertaking. With Microsoft’s acquisition of the GPT-3 license, the technology could also find its way into Microsoft products.
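
As a minimal sketch of what using the API looks like, assuming the openai Python package and the completions endpoint as offered for GPT-3 at the time (the engine name, prompt and parameters are purely illustrative):

import openai

# assumes an API key issued by OpenAI; never hard-code real keys in source
openai.api_key = "YOUR_API_KEY"

# send a plain-text prompt to a GPT-3 completion engine
response = openai.Completion.create(
    engine="davinci",            # illustrative engine name
    prompt="Summarize the main benefits of edge AI in one sentence:",
    max_tokens=60,
    temperature=0.3,
)

print(response.choices[0].text.strip())

The point of the hosted API is exactly what the paragraph above describes: the heavy lifting of training stays on OpenAI’s side, and the user only sends prompts and receives completions.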

Other areas of application are also likely to benefit significantly from AI technology in the new year. AI and machine learning (ML) have already established themselves in cybersecurity, but in 2022 their adoption could climb even more steeply. As the SolarWinds security breach has shown, companies must contend with looming threats from cybercriminals and state actors as well as ever-evolving configurations of malware and ransomware. In 2022, there will be an aggressive push to extend network defense systems with advanced behavioral-analysis AI. AI and behavioral analytics are critical for identifying new threats, including variants of previous threats.

In 2022, we can also expect an increase in the number of applications that run machine learning models on edge devices as standard. Devices such as Google’s Coral, which comes with an integrated Tensor Processing Unit (TPU), will continue to spread as processing power and quantization techniques advance. Edge AI eliminates the need to send data to the cloud for inference, which saves bandwidth and reduces latency – both crucial in areas such as healthcare. Edge computing can also open up new applications in other areas where data protection, security and low latency are required, as well as in regions of the world without access to high-speed Internet.
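
To illustrate the quantization step that makes such edge deployments practical, here is a minimal sketch using TensorFlow Lite’s converter; the model path is a placeholder, and deploying to a Coral Edge TPU would additionally require full-integer quantization and Coral’s own compiler.

import tensorflow as tf

# load a trained SavedModel and convert it for on-device inference
converter = tf.lite.TFLiteConverter.from_saved_model("path/to/saved_model")

# default optimizations include post-training quantization,
# which shrinks the model and speeds up inference on edge hardware
converter.optimizations = [tf.lite.Optimize.DEFAULT]

tflite_model = converter.convert()

# write the compact model that an edge device can execute locally
with open("model.tflite", "wb") as f:
    f.write(tflite_model)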

Bottom line

AI technology is spreading into more and more practical areas. Advances in transformer architectures and GNNs are likely to bring progress to areas that so far have not been readily amenable to existing AI techniques and algorithms. This post has highlighted some areas that seem ready for progress this year, but there will undoubtedly be surprises as the year goes on. Predictions are difficult, especially about the future, as the saying goes. Whether these forecasts prove right or wrong, one thing is clear: 2022 will be an exciting year for the AI landscape.
