DFKI featuring industrial AI and AI in medicine and healthcare at Hannover Messe 2022

After two years as a purely digital offering, Hannover Messe is back as an in-person event from May 30 to June 2, 2022.

DFKI presents industry topics such as production cells as digital twins, virtual teach-in in human-robot collaboration, assistance systems for humans in production, resilience management for global companies, and more value creation through data products. The second area of focus is AI in medicine. The research areas of the DFKI Lübeck branch will demonstrate how artificial intelligence can help with tumor segmentation, the control of prostheses, or the optimization of imaging procedures. The topics of social simulation of a pandemic, deep learning for individualized fixation implants, and data analysis for predicting critical conditions of intensive care patients reinforce the emphasis on medicine and health.

Visitors can find DFKI’s exhibits and technologies at the main DFKI stand (Hall 2, Booth C39), the Saarland Joint Stand (Hall 2, Booth B28), the FabOS stand (Hall 5, Booth F54), and at the SmartFactory technology initiative (Hall 8, Booth D18).

A more detailed overview of the DFKI exhibits can be found here:
https://www.dfki.de/en/web/news-media/news-events/events/dfki-at-hannover-messe-2022

Exhibit descriptions in detail

Hall 2, C39, DFKI Main Stand

RICAIP – Virtual Reality-based Remote Human-Robot Collaboration in Assembly
Fed with data from additional sensors, the digital twin of a production cell becomes the input interface for the teach-in of a real-world robot: As partners in the RICAIP Research and Innovation Centre on Advanced Industrial Production, DFKI and ZeMA demonstrate a collaborative robotic arm that a person can intuitively control from a distance in real time via a virtual reality headset. The technologically complex scenario enables close interaction between distant locations and can also be used for remote maintenance or remote manufacturing. RICAIP is a distributed international research center of excellence focusing on robotics and artificial intelligence. It is based on a strategic partnership between the research institutions CIIRC CTU in Prague, CEITEC BUT in Brno, and ZeMA and DFKI in Saarbrücken. Research topics include cross-site, distributed, and digitized manufacturing in Industry 4.0.
https://ricaip.eu

SPAICER – Smart resilience services in the manufacturing industry
In a globalized and connected industry, disruptions in production or interruptions in the supply chain represent a business risk and can lead to massive financial losses. A company’s ability to anticipate internal disruptions (e.g., tool wear or variations in raw material quality) and external changes (e.g., supply shortages) and to adapt to them proactively is the “quest for resilience.” Reinforced by the significant increase in production complexity brought about by Industry 4.0, resilience management is becoming an indispensable success factor for industrial production.
The SPAICER team presents results on AI-based anomaly detection in the manufacturing and process industries as well as on digital twins. The smart resilience services analyze sensor data streams together with quality and image data of semi-finished products and finished parts. From this, recommendations for optimizing parameters on production machines can be derived and the quality of semi-finished products and finished parts can be determined. This reduces production errors and saves costs by avoiding production downtime and rejects.
www.spaicer.de
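To illustrate the kind of analysis such smart resilience services build on, the following minimal Python sketch flags anomalies in a sensor data stream using a rolling z-score. It is an illustrative example only, not one of the SPAICER services; the window size, threshold, and the injected "tool-wear spike" are assumptions.

```python
# Illustrative sketch of stream-based anomaly detection (not SPAICER itself):
# a rolling z-score flags sensor readings that deviate strongly from recent history.
import numpy as np

def rolling_zscore_anomalies(readings, window=50, threshold=3.0):
    """Return indices of readings deviating > threshold sigma
    from the mean of the preceding `window` samples."""
    readings = np.asarray(readings, dtype=float)
    anomalies = []
    for i in range(window, len(readings)):
        history = readings[i - window:i]
        mu, sigma = history.mean(), history.std()
        if sigma > 0 and abs(readings[i] - mu) / sigma > threshold:
            anomalies.append(i)
    return anomalies

# Synthetic sensor stream with one injected fault (assumed example data)
rng = np.random.default_rng(0)
stream = rng.normal(100.0, 1.0, 500)
stream[300] = 130.0  # simulated tool-wear spike
print(rolling_zscore_anomalies(stream))  # the spike at index 300 is flagged
```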

Intelligent workwear – Inertial motion detection in the factory or on the assembly line
Hitachi, DFKI, Sci-Track, and the clothing company Xenoma have developed smart workwear for monitoring physical activities and workloads. The team has succeeded in integrating inertial motion sensors into a standard work jacket, which allows workload measurements to be taken without distracting the wearer. Together with Hitachi, DFKI is demonstrating a system that detects the wearer’s movements and workload while compensating for measurement errors caused by the additional movement of the clothing. In the future, the collaboration partners plan to test the technology’s suitability for factory, maintenance, and logistics workplaces and to improve worker safety and motivation in various settings through continuous activity monitoring.

PARTAS – A Personalizable Augmented Reality-based Task Adaption System
PARTAS is an assistance system tailored to people with cognitive impairments, including difficulties with memory, concentration, and the comprehension of quantities. These basic skills are often needed in everyday life and are currently supported in workshops for people with disabilities (WfbM) by manual aids. These aids are to be complemented by intuitive, personalizable guidance using contour-based instructions.

The system is mobile and can be quickly integrated into an existing workplace. A projector displays instructions directly onto the workspace. Individual work steps are recognized by a camera and confirmed by an AI-based recognition algorithm. This enables the automated execution of a task with immediate quality control and gives caregivers more time to address the individual needs of those they care for. An evaluation showed that PARTAS achieves a very high level of acceptance among both care recipients and caregivers. Thanks to the flexibility of the system, further areas of application are conceivable, for example in the healthcare sector or the manufacturing industry.

AI in Medicine
The requirements for such systems are as diverse as their applications. Artificial intelligence (AI) methods are used to learn complex relationships in a data-driven manner and to overcome the limitations of conventional mathematical models. In this context, resilience to disruptions and incomplete data, as well as the interpretability of the algorithms, are of essential importance.

How signals reveal their secrets
Three examples will demonstrate the power and versatility of AI solutions:

Hand gesture recognition
The example of hand gesture recognition illustrates the high adaptability of AI methods. Visitors can train their own gesture recognizer and then play games with it. The exhibit shows how well AI systems can be adapted to new requirements through a short training phase and how powerful they are even in the context of embedded systems.

Acoustic event detection
Audio event detection plays a vital role in many areas, such as smart homes, surveillance (including of production processes), and hearing aids. Thanks to sophisticated AI, visitors can see how well a variety of different events can be classified in real time.

Sleep Staging
AI must function robustly despite many unknown circumstances. Sleep staging involves the analysis of EEG data recorded during sleep. One of the challenges is to provide reliable results for all patients despite different conditions and measurement devices. Countless EEG measurements are used to show how reliable and robust AI can be.

AI in medical imaging – Deep-learning-based brain tumor segmentation
A software demonstration of AI-based tumor segmentation shows how AI methods can efficiently and automatically delineate brain tumors with pixel precision in 3D MRI image sequences. The deep-learning-based image analysis automatically determines essential brain tumor parameters such as volume, position, and intensity values, and provides the basis for a quantitative evaluation and assessment of tumor growth over time.
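As a rough illustration of how such parameters can be derived once a segmentation network has produced a binary tumor mask, the following Python sketch computes volume, position (centroid), and intensity statistics from a 3D MRI volume. The voxel spacing and variable names are assumptions; this is not DFKI’s analysis pipeline.

```python
# Illustrative sketch: derive tumor parameters from a predicted 3D mask.
import numpy as np

def tumor_parameters(mri_volume, tumor_mask, voxel_spacing_mm=(1.0, 1.0, 1.0)):
    """mri_volume, tumor_mask: 3D arrays of equal shape; tumor_mask is boolean."""
    voxel_volume_mm3 = np.prod(voxel_spacing_mm)
    indices = np.argwhere(tumor_mask)            # voxel coordinates of the tumor
    volume_ml = len(indices) * voxel_volume_mm3 / 1000.0
    centroid = indices.mean(axis=0)              # tumor position in voxel space
    intensities = mri_volume[tumor_mask]         # MRI intensities inside the tumor
    return {
        "volume_ml": volume_ml,
        "centroid_voxel": centroid,
        "mean_intensity": float(intensities.mean()),
    }
```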

The DFKI research area “Artificial Intelligence in Medical Image Processing” develops adaptive medical image processing methods to support medical diagnostics and therapy. In hybrid image processing systems, artificial intelligence methods are combined with medical image processing methods and visualization techniques for medical support.

The focus is on machine learning methods and deep learning networks for automatic analysis and recognition of different disease patterns, lesions, biomarkers, organs, tissues, etc., in medical images and image sequences. The researchers are also investigating the possibilities for image-based prediction of individual disease progression and personalized risk assessment to support therapy decisions using machine learning techniques.

How Horses become Zebras – Inter-Modal Image Synthesis Using the Cycle-GAN Architecture
Intelligent systems in healthcare build models by observing their environment and evaluating data in order to compute optimal actions. In doing so, they must also deal with uncertainty. One exciting application under uncertainty is image registration, in which regions of one image are matched to corresponding regions of another image. One use case is matching organs in MRI images to the same organs in CT images. One approach to enhancing registration is to use a cycle-GAN model to synthesize images before registration: images from one domain are transformed into images from the other domain.
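The following PyTorch sketch shows the basic cycle-GAN idea in miniature: one generator maps images from domain A to domain B, a second generator maps back, and a cycle-consistency loss keeps the round trip close to the input. The tiny network, tensor sizes, and names are placeholders, not the model used in the exhibit.

```python
# Minimal cycle-GAN sketch (illustrative only): G_AB translates A -> B,
# a second generator translates back, and the L1 cycle loss ties them together.
import torch
import torch.nn as nn

class TinyGenerator(nn.Module):
    """Toy stand-in for a cycle-GAN generator (real ones use ResNet blocks)."""
    def __init__(self, channels=3):
        super().__init__()
        self.net = nn.Sequential(
            nn.Conv2d(channels, 32, 7, padding=3), nn.ReLU(),
            nn.Conv2d(32, 32, 3, padding=1), nn.ReLU(),
            nn.Conv2d(32, channels, 7, padding=3), nn.Tanh(),
        )
    def forward(self, x):
        return self.net(x)

G_AB = TinyGenerator()                    # e.g. horse -> zebra, or MRI -> CT
G_BA = TinyGenerator()                    # the reverse direction
image_a = torch.rand(1, 3, 256, 256)      # placeholder for a camera frame
image_b = G_AB(image_a)                   # synthesized domain-B image
reconstructed_a = G_BA(image_b)           # round trip for the cycle loss
loss_cycle = torch.nn.functional.l1_loss(reconstructed_a, image_a)
```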

The team from the Stochastic Relational AI research area will demonstrate this synthesis using the example of turning horses into zebras and vice versa. Visitors can place a Schleich© horse or zebra figure in front of a live camera. The live stream on the monitor shows the image synthesis, transforming one animal into the other in real time.

AI for the ICU – Prognosis of risk indicators for cardiopulmonary decompensation
Artificial intelligence technologies should support the analysis of diverse patient data so that nurses and doctors in intensive care units can concentrate more on their patients. This is where the RIDIMP project of the Bremen hospital association Gesundheit Nord and the DFKI research unit “Cyber-Physical Systems” comes in. The data must be evaluated beforehand so that AI systems can learn meaningfully from the wealth of information.

For this purpose, the medical experts at Gesundheit Nord define two numerical scores, composed of many individual parameters such as oxygen saturation, pulse, or medication administration, which assess the condition of the circulation or respiration on a scale from 0 (uncritical) to 9 (highly critical). These scores are, in turn, used to evaluate the available historical patient data. Machine learning techniques applied to this data make it possible to predict future score values and thus the likelihood of a collapse (decompensation) of circulation or respiration. In this way, medical staff can monitor the patient’s health status and be alerted to impending problems at an early stage.
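As a hedged sketch of the underlying idea, not the RIDIMP model itself, the following Python example trains a regressor on synthetic vital-parameter features to predict a 0–9 score some time ahead. The features, prediction horizon, and data are illustrative assumptions.

```python
# Illustrative sketch: predict a future 0-9 decompensation score from vital parameters.
import numpy as np
from sklearn.ensemble import RandomForestRegressor

rng = np.random.default_rng(0)
# Assumed features per sample: oxygen saturation, pulse, medication dose (synthetic)
X = rng.normal(size=(1000, 3))
# Assumed target: the score value some minutes later (0 = uncritical, 9 = highly critical)
y = np.clip(5 + 2 * X[:, 0] - X[:, 1] + rng.normal(scale=0.5, size=1000), 0, 9)

model = RandomForestRegressor(n_estimators=200, random_state=0)
model.fit(X[:800], y[:800])                 # train on "historical" data
predicted_scores = model.predict(X[800:])   # predict future score values
print(predicted_scores[:5])
```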

At Hannover Messe, DFKI researchers from Cyber-Physical Systems will demonstrate their prognosis algorithm on selected historical patient data from the intensive care unit. A prediction is calculated live on this data, and visitors can then see how accurate the prediction is. The exhibit is interactive: visitors can move back and forth in the timeline and speed up, slow down, or pause the progression of time.
https://ki-sigs.de

AScore and AKRIMA – Crisis Management Cockpit and Crisis Resilience
Effective crisis management requires preserving the stability and capacity to act of large parts of the overall social system. This calls for flexible, timely, and appropriate responses to changing (crisis) situations. The coronavirus pandemic, like recent extreme weather events, showed that the constant adaptation this demands poses a considerable challenge for most actors. With the help of AI, the relevant information can be processed so that the actors involved receive significant support in the event of a crisis. This ranges from simple documentation to simulation-based training scenarios and courses.
The crisis management cockpit AScore prepares decision-relevant information by integrating smart-city data and agent-based social simulation. The simulation model is thus able to predict the spread of infections under specific scenarios. The AKRIMA project takes up this approach and aims to strengthen the crisis resilience of critical infrastructures, logistics chains, and authorities and organizations with security tasks through a simulation-based improvement of crisis response mechanisms.
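A very small, purely illustrative sketch of the agent-based simulation idea follows: agents meet at random, infections spread with a scenario-dependent probability, and varying the parameters corresponds to comparing scenarios. It is not the AScore model, and all parameter values are assumptions.

```python
# Illustrative agent-based infection-spread sketch (not the AScore model).
import random

def simulate(num_agents=1000, initially_infected=5, days=60,
             contacts_per_day=8, p_transmit=0.05, recovery_days=10):
    # 'S' = susceptible, 'I' = infected, 'R' = recovered
    state = ["I"] * initially_infected + ["S"] * (num_agents - initially_infected)
    sick_days = [0] * num_agents
    infected_per_day = []
    for _ in range(days):
        infected_today = [a for a in range(num_agents) if state[a] == "I"]
        for a in infected_today:
            for _ in range(contacts_per_day):
                b = random.randrange(num_agents)
                if state[b] == "S" and random.random() < p_transmit:
                    state[b] = "I"        # newly infected, active from the next day
            sick_days[a] += 1
            if sick_days[a] >= recovery_days:
                state[a] = "R"
        infected_per_day.append(state.count("I"))
    return infected_per_day              # infected count per day for this scenario

# Comparing scenarios means varying parameters such as the contact rate
print(simulate(contacts_per_day=4))
```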

Two-armed diving robot “Cuttlefish” – Intervention AUV for autonomous underwater manipulation
From the maintenance of maritime infrastructure to the recovery of munitions remnants and the removal of plastic waste, many underwater operations are not only complex and expensive but also pose significant risks to the divers who perform them. Remotely operated underwater vehicles (ROVs) are already being used to explore deep waters and monitor the condition of maritime assets. However, the trend is toward autonomous underwater vehicles (AUVs), which remain in the water for long periods of time and can perform complex tasks there using artificial intelligence (AI) methods.
In cooperation with a worldwide network of partners from industry and science, the DFKI Robotics Innovation Center develops a new generation of autonomous underwater vehicles (AUVs). Thanks to artificial intelligence and state-of-the-art navigation methods, these robust systems can remain under water for long periods of time and perform complex tasks. A central field of research is the manipulation and handling of maritime infrastructures and operational environments by autonomous robots. The Robotics Innovation Center has already successfully demonstrated semi-autonomous underwater manipulation with the AUV “Cuttlefish” developed in the project Mare-IT. This intervention robot, which can be positioned freely in the water column, has two deep-sea capable gripping systems attached to its ventral side to flexibly manipulate objects underwater.

Open6GHub Germany – 6G for Society and Sustainability
The aim of the Open6GHub is to contribute, in the European context, to a global 6G harmonization process and standard that takes into account Germany’s interests regarding societal priorities (sustainability, climate protection, data protection, resilience) while strengthening the competitiveness of companies, technological sovereignty, and the position of Germany and Europe in the international competition for 6G.
The Open6GHub will contribute to the development of an overall 6G architecture as well as end-to-end solutions in areas including: advanced network topologies with highly agile, so-called organic networking; security and resilience; terahertz and photonic transmission methods; sensor functionalities in the networks and their intelligent use and further processing; and application-specific radio protocols.

Hall 2, Booth B28, Saarland Joint Stand

EVAREST – How AI is making food production smart
Producers in the food industry can generate additional revenue by creating and exploiting data products. Data becomes a reliable commodity in its own right, without companies giving away know-how or trade secrets.

The systemic approach to sovereign data trading and analytics for AI-based decision-making supports a data economy. A broker framework acting as a trusted third party enables the trading, sharing, and processing of data between economic actors. Electronic contracts guarantee ownership and control of corporate data assets by establishing usage rights, AI analysis methods, and permitted uses of the data. The broker framework realizes decentralized data exchange as well as joint AI-based analysis of data from different companies. In this way, new data products are created that support business actors in decision-making.

Individualized Implants for the Treatment of Lower Extremities
Researchers at Saarland University and DFKI have developed a personalized therapy for calf or shin fractures: with their method, they can tailor an implant to each patient’s bone so that it withstands individual stresses and supports healing. To do this, they combine techniques from mechanics and computer science.

From image to 3D model
Routinely performed computed tomography (CT) yields image data sets that can be used for 3D reconstruction. Since a single CT scan requires the pixel-accurate identification of materials (cortical bone, cancellous bone, metal, soft tissue) across several hundred images, manual classification by expert personnel is a lengthy process. By using deep learning, a neural network can classify the materials on CT images, significantly improving the speed and reliability of the image segmentation workflow. The resulting tomograms are used to create a model that can then be processed virtually or exported to standard formats used by CAD software.
The result is implants tailored to the patient, which can be produced by selective laser melting or high-speed milling.
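As an illustration of the downstream step from a per-voxel material labelling to a surface model, the following Python sketch extracts a triangle mesh of the cortical bone with marching cubes (scikit-image). The label id, file name, and voxel spacing are assumptions, not details of the actual workflow.

```python
# Illustrative sketch: turn a per-voxel material labelling (e.g. from a
# segmentation network) into a surface mesh that CAD tools can work with.
import numpy as np
from skimage import measure

CORTICAL_BONE = 1                            # assumed label id from the classifier
label_volume = np.load("ct_labels.npy")      # assumed (z, y, x) array of material labels
bone_mask = (label_volume == CORTICAL_BONE).astype(np.uint8)

# Marching cubes extracts a triangle mesh from the binary mask
verts, faces, normals, _ = measure.marching_cubes(bone_mask, level=0.5,
                                                  spacing=(0.5, 0.5, 0.5))  # assumed mm
print(f"{len(verts)} vertices, {len(faces)} triangles")
# The mesh can then be exported to an STL/CAD format with a mesh library.
```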

Hall 5, Booth F54, FabOS Stand

FabOS – Redeployment of Real-Time Applications
Researchers from FabOS – an open, distributed, real-time, and secure operating system for manufacturing – are demonstrating a simple redeployment of real-time applications at Hannover Messe. If changes are to be made to services in a real-time environment at runtime, strategies must be devised to minimize downtime. The exhibit “Redeployment of real-time applications” presents this scenario. A solution for industrial applications was developed on the basis of container-based virtualization and live migration approaches.

Hall 8, Booth D18, SmartFactory technology initiative

SmartFactory KL – Industrial AI in the Shared Production
SmartFactory-KL is shaping the future of manufacturing with four networked production islands. The largest – production island _KUBA – will be presented to the public for the first time at this year’s Hannover Messe. Visitors can configure a model truck whose production then begins immediately on site. The truck’s parts (driver’s cab, trailer, wheels, etc.) move on a high-speed transport system and are assembled in interaction between autonomous machine modules and manual workstations.
The SmartFactory-KL thus demonstrates that artificial intelligence, humans, and machines form the dream team of the future and that high-tech production does not displace people from the factory. Humans are and remain in charge because, unlike robots, they can recognize errors, optimize systems, and generate new ideas.
The unique system architecture of shared production enables resilience and sustainability, which can be organized in the future via data platforms such as Gaia-X. SmartFactory-KL focuses on cross-manufacturer modularity and has been bringing Industry 4.0 into application since 2014.
www.smartfactory.de

Panel Discussions & Keynotes:

May 30, 2022, 15:00 – 16:05 h, Tech Transfer Conference Stage, Hall 2, Booth A60
“The Future of Industry 4.0”
15:00-15:15 h: Keynote lecture, Prof. Dr. Peter Liggesmeyer, Fraunhofer IESE/Plattform Industrie 4.0
15:15-16:05 h: Panel discussion with:
Prof. Henning Kagermann, acatech
Prof. Wolf-Dieter Lukas, BMBF, State Secretary (ret.)
Prof. Wolfgang Wahlster, DFKI Chief Executive Advisor

Moderation: Dr. Tabea Golgath, LINK – KI und Kultur, Stiftung Niedersachsen

May 31, 2022, 15:00 – 16:00 h, Tech Transfer Conference Stage, Hall 2, Booth A60
“Sustainability of, with and through AI made in Lower Saxony”
Keynote lecture and panel discussion with:
Dr. Sebastian Pütz, Plan-Based Robot Control, DFKI, Osnabrück
Prof. Dr.-Ing. Daniel Sonntag, Interactive Machine Learning, DFKI, Oldenburg
Prof. Dr. Oliver Thomas, Smart Enterprise Engineering, DFKI, Osnabrück
Prof. Dr. Oliver Zielinski, Marine Perception, DFKI, Oldenburg

Moderation:
Reinhard Karger, DFKI Corporate Spokesman
Simone Wiegand, DFKI Corporate Communications Lower Saxony