For every object that needs inspection, an appropriate inspection system configuration must be determined. Today, this inspection planning is carried out by an expert in a lab and requires considerable experience and physical labor.
The Virtual Inspection Planning project develops a modular, semi-automatic framework capable of replicating the complete process in a virtual environment. This approach brings significant benefits: faster requirement evaluation, shorter inspection planning times, lower costs, flexible inspection systems that can be integrated into Industry 4.0 concepts, and a reduction of the physical work an engineer has to perform.
Contact: Petra Gospodnetić
Industry 4.0 is further pushing the automation of traditional manufacturing processes, and the automation of surface inspection planning is therefore required. Two important tasks of surface inspection planning are finding optimal hardware placement for complete coverage of the inspected object and developing image processing algorithms for surface analysis. Both tasks benefit from simulating the inspection environment, which consists of the inspected object, light, and camera, using computer graphics modeling and rendering techniques. The focus of our work is the simulation of the inspected object's material. Our aim is to model physically based materials using microfacet theory in order to achieve the realism required for surface inspection planning. Our approach is based on procedural texturing methods, which we use to build models that generate microstructure over the inspected object's surface. Besides generating texture procedurally, our models are designed to work on arbitrarily complex inspected object geometry.
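As an illustration of the procedural idea (a generic sketch, not our actual material model), the snippet below builds a fractal value-noise heightfield. Because the height is a pure function of surface coordinates, it can be evaluated at any point of an arbitrarily complex geometry without stored textures; all function names are illustrative.

```python
import math

def hash01(ix, iy, seed=0):
    # Deterministic integer hash mapped to [0, 1): a stand-in for a
    # proper noise lattice; any good integer hash works here.
    h = (ix * 374761393 + iy * 668265263 + seed * 2147483647) & 0xFFFFFFFF
    h = ((h ^ (h >> 13)) * 1274126177) & 0xFFFFFFFF
    return (h ^ (h >> 16)) / 2**32

def smoothstep(t):
    # Hermite interpolation weight for smooth lattice blending.
    return t * t * (3.0 - 2.0 * t)

def value_noise(x, y, seed=0):
    # Bilinear interpolation of hashed lattice values -> smooth noise.
    ix, iy = math.floor(x), math.floor(y)
    fx, fy = x - ix, y - iy
    sx, sy = smoothstep(fx), smoothstep(fy)
    n00, n10 = hash01(ix, iy, seed), hash01(ix + 1, iy, seed)
    n01, n11 = hash01(ix, iy + 1, seed), hash01(ix + 1, iy + 1, seed)
    top = n00 + sx * (n10 - n00)
    bot = n01 + sx * (n11 - n01)
    return top + sy * (bot - top)

def microstructure_height(u, v, octaves=4, lacunarity=2.0, gain=0.5):
    # Fractal sum of value noise: finer octaves at decreasing amplitude
    # mimic surface roughness such as machining marks.
    amp, freq, h = 1.0, 1.0, 0.0
    for o in range(octaves):
        h += amp * value_noise(u * freq, v * freq, seed=o)
        amp *= gain
        freq *= lacunarity
    return h
```

In a renderer, the gradient of such a heightfield would perturb the shading normal, feeding the microfacet material model with spatially varying microstructure.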
Contact: Lovro Bosnar
Building floor plans are an important tool for architects to derive new designs or analyze given ones. Here, architects are interested in the analysis of existing designs. Unfortunately, building floor plans need to be analyzed manually by architects, which requires a lot of time. This project builds on a close collaboration between computer scientists and architects. The provided analytical method extracts geometric primitives from building floor plans, assembles them into higher-level structures, and analyzes their location and number of occurrences in a floor plan. The analysis will be used to create various machine learning algorithms that help architects plan and design their projects.
Contact: Katharina Roth
Due to the experimental origin of captured molecular data, there is usually an inherent level of uncertainty present in molecular models. This uncertainty can originate from various sources and can have a massive effect on the decision-making process. Therefore, incorporating uncertainty into state-of-the-art visualization approaches for molecular data is important to ensure that whoever analyzes the data is aware of the inherent uncertainty in its representation. In this work, we present a framework that allows biochemists to explore molecular data in a familiar environment while, at the same time, including uncertainty information within the visualizations. The framework is based on an anisotropic description of molecules that can be propagated along the required computations, providing multiple views that extend prominent visualization approaches to visually encode the uncertainty of atom positions and allow interactive exploration.
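The anisotropic description itself is beyond the scope of this summary, but the core propagation step can be sketched generically: if atom positions undergo a linear transform A, their positional covariance (an uncertainty ellipsoid) transforms as cov' = A cov Aᵀ. The minimal sketch below uses plain nested lists as 3×3 matrices; all names are illustrative, not the framework's API.

```python
def mat_mul(A, B):
    # Plain 3x3 matrix product.
    return [[sum(A[i][k] * B[k][j] for k in range(3)) for j in range(3)]
            for i in range(3)]

def transpose(A):
    return [[A[j][i] for j in range(3)] for i in range(3)]

def propagate_cov(A, cov):
    # Linear error propagation: if x' = A x, then cov' = A cov A^T.
    return mat_mul(mat_mul(A, cov), transpose(A))

# Example: a 90-degree rotation about z swaps the x- and y-variances of
# an anisotropic (ellipsoidal) position uncertainty.
rot_z = [[0.0, -1.0, 0.0],
         [1.0,  0.0, 0.0],
         [0.0,  0.0, 1.0]]
cov = [[4.0, 0.0, 0.0],   # large uncertainty along x
       [0.0, 1.0, 0.0],
       [0.0, 0.0, 0.25]]
cov_rot = propagate_cov(rot_z, cov)
```

The propagated ellipsoids can then drive the visual encodings, e.g. glyphs scaled by the principal variances.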
Contact: Robin Maack
Coating adhesion is one of the most important parameters for evaluating the quality and functional reliability of thin films for tribological purposes. The Rockwell indentation test, standardized in DIN 4856 and ISO 26443, is an established method in industry and research for determining coating adhesion. A hardness indentation according to Rockwell C is performed on the coated component. Currently, any damage to the coating around the indentation imprint is qualitatively assessed by an expert and classified into adhesion classes according to the visual impression. For this purpose, comparative images are used which schematically show typical crack and delamination patterns in various forms. The heuristic nature of this classification procedure and the vague classes it entails no longer meet the current requirements of quality assurance. In our group we are developing a quantifiable and understandable method that enables precise classification and future automation. Our approach is based on semantic segmentation using neural networks, on developing quantitative and intuitive features that describe delamination and cracking per adhesion class, and on testing these features for class prediction using statistical machine learning.
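To give a flavor of such quantitative features (a simplified sketch, not our actual feature set), the snippet below computes two intuitive measures from a per-pixel segmentation mask: the area fraction of a damage class and the number of connected damage regions.

```python
from collections import deque

def area_fraction(mask, label):
    # Fraction of pixels carrying the given class label.
    total = sum(len(row) for row in mask)
    hits = sum(row.count(label) for row in mask)
    return hits / total

def count_components(mask, label):
    # 4-connected component count via BFS flood fill; a proxy for the
    # number of separate cracks or delamination patches.
    h, w = len(mask), len(mask[0])
    seen = [[False] * w for _ in range(h)]
    n = 0
    for y in range(h):
        for x in range(w):
            if mask[y][x] == label and not seen[y][x]:
                n += 1
                queue = deque([(y, x)])
                seen[y][x] = True
                while queue:
                    cy, cx = queue.popleft()
                    for dy, dx in ((1, 0), (-1, 0), (0, 1), (0, -1)):
                        ny, nx = cy + dy, cx + dx
                        if (0 <= ny < h and 0 <= nx < w
                                and mask[ny][nx] == label
                                and not seen[ny][nx]):
                            seen[ny][nx] = True
                            queue.append((ny, nx))
    return n
```

Features like these are interpretable per adhesion class and can serve as inputs to a statistical classifier.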
Contact: Damjan Hatić
Project partner: Volkswagen AG & Dr. Ing. h.c. F. Porsche AG
Simulation and virtualization in early development phases lead to early fault identification and increased product quality. Virtualization has therefore long since found its way into all of automotive engineering, which makes it all the more important to push virtualization further at key points. In this unique cooperation project between the automotive industry and university research, future challenges in quality assurance are examined. The focus is on virtualizing the Meisterbock (master jig) process for assessing the quality of body attachment parts. For years, the physical Meisterbock served as the reference and analysis tool for this assessment. However, with increasing frontloading in the vehicle development process, the Meisterbock must reinvent itself as well.
To this end, virtual techniques for reproducing the physical process based on FEM simulations are investigated and evaluated. The research focus lies on the kinematic effects that occur during elastic deformations, as well as on concepts for analyzing and evaluating specific parameters of the assembly and manufacturing process.
The research cooperation can now look back on a five-year success story and remains one of the central topics of the group's industry-oriented research.
In car manufacturing and prototyping, quality control plays an important role. While it was traditionally performed with tactile measurements using robots, a paradigm shift towards optical measurements via stereoscopic cameras or similar devices is underway. The aim of our project is to efficiently store, evaluate, and visualize this new type of measurement data and to integrate it into Kronion's widely used eMMA software suite. Funded by: Zentrales Innovationsprogramm Mittelstand (ZIM). Cooperation project between Steinbichler Optotechnik GmbH, Hochschule Rhein-Main, Kronion GmbH, and Universität Kaiserslautern.
With the increasing digitalization of factory and manufacturing planning and engineering, the data produced can no longer be summarized in a single snapshot. Having already invested considerable effort in analyzing 1D, 2D, and lately 3D field data, the challenge now is to deal with multiple outcomes at once: not only incorporating unsteady (time-dependent) data, but also correlating results from multiple sources such as simulations and measurements. Bridging the layers from the process to the manufacturing to the factory level thus reaches a new level of complexity.
The aim of this project is to provide insight into the correlation of all levels by investigating the analysis and communication needs of every discipline. In doing so, we build on state-of-the-art technologies and provide a framework that allows users to seamlessly integrate their results in one environment.
Contact: Patrick Ruediger
This project is conducted in collaboration with UC Davis (CA) and is part of the IRTG 2057 - "Physical Modeling for Virtual Manufacturing Systems and Processes". We develop and evaluate human-centered visualization and interaction techniques that scale both with the level of the transaction to be considered and with the devices used. The expected result is a human-centered virtual production environment, achieved by developing a highly scalable visualization/interaction framework that focuses on the cross product of visualization, interaction, and collaboration.
Contact: Dr. Franca Alexandra Rupprecht
Heart vessel diseases such as atherosclerosis are among the leading causes of death in modern society. Much research focuses on the early and fast detection of these diseases. CT scans are a widely used diagnostic method for reviewing the entire heart system. Unfortunately, early detection of heart vessel anomalies on CT scans is often not possible, as their resolution is low and they contain several image errors. In this project we aim to enhance CT scans so that medical researchers can detect diseases earlier, more accurately, and more reliably, by extracting features from CT scans, simulating blood flow, and providing our medical collaborators with visual tools to examine CT scans more efficiently. This project is conducted in collaboration with Wright State University, Dayton (OH).
Contact: Christina Gillman
In recent times, visual analysis has become increasingly important, especially in the area of software measurement, as most software measurement data is multivariate. Standard software analysis tools are limited by their inability to process huge collections of multidimensional data sets; current tools support only well-known metrics, are too complicated to use for generating custom software metrics, or have limited software visualization capabilities. Furthermore, the analyst requires extensive knowledge of the underlying data schemas and the relevant query language. To address these shortcomings, we propose an interactive visual workflow modeling approach that focuses on visual elements, their configurations, and their interconnectivity rather than on a data ontology and query language. Importantly, in terms of software comprehension, we provide a tighter integration between 'software analysis' and 'software visualization' in order to offer an integrated means to specify and visualize software measurements.
Contact: Taimur Khan
GeoDict generates and operates on 3D geometric material models and requires the visualization of these models as well as of computed scalar, vectorial, and tensorial fields, such as temperature distributions, flow fields, displacement fields, etc. The objective of this project is to develop a highly time- and memory-efficient PDE solver that is compatible with GeoDict's voxelized structure and solution representation. This project is realized in cooperation with Fraunhofer ITWM and AG Computergrafik and HCI.
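As a minimal illustration of a matrix-free solver on a voxelized representation (a generic sketch; the actual solver is far more sophisticated), the snippet below runs Jacobi iterations for the Laplace equation on a 2D voxel grid: each free voxel is repeatedly replaced by the average of its neighbors, while Dirichlet voxels keep their prescribed values. No system matrix is ever assembled, which is what makes such schemes memory-efficient on large voxel grids.

```python
def jacobi_laplace(grid, fixed, iters=500):
    # Matrix-free Jacobi iteration for the Laplace equation on a voxel
    # grid. `grid` holds the field values, `fixed` marks Dirichlet
    # (prescribed-value) voxels that are never updated.
    h, w = len(grid), len(grid[0])
    for _ in range(iters):
        new = [row[:] for row in grid]
        for y in range(1, h - 1):
            for x in range(1, w - 1):
                if not fixed[y][x]:
                    new[y][x] = 0.25 * (grid[y - 1][x] + grid[y + 1][x]
                                        + grid[y][x - 1] + grid[y][x + 1])
        grid = new
    return grid
```

The same stencil structure extends to 3D and to vector- or tensor-valued fields by updating each component per voxel.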
In the field of technical textiles (nonwovens, glass wool, ...), fibers and filaments of different materials are produced and processed. Simulations allow trying out modifications to these processes and running experiments at reduced cost, before implementing them in the real world. During the production, the fibers are subject to forces determining their dynamics. In order to simulate such processes, the dynamics of the fibers need to be modeled. One of the challenges for the simulation is to model the interaction of fibers with machine parts. When a collision of a fiber with the machine geometry is detected, its dynamics equation has to be expanded by geometric constraints to prevent penetration. The time step is then recomputed including appropriate contact forces resulting from the constraints. The picture shows the simulation of a filament in air flow for a spunbond line of Oerlikon Neumag, computations performed by Fraunhofer ITWM.
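The contact handling described above can be sketched in strongly simplified form: a fiber discretized into mass points is integrated explicitly, and when a point penetrates the machine geometry (here just a plane), it is projected back onto the surface and its inward normal velocity is removed. This projection stands in for the full constraint-based recomputation of the time step with contact forces; all names and the scheme itself are illustrative.

```python
def step_fiber(pos, vel, dt, gravity=-9.81, floor_y=0.0):
    # One semi-implicit Euler step for 2D fiber mass points, followed by
    # a plane-contact constraint: penetrating points are projected back
    # onto the surface and their inward normal velocity is removed.
    new_pos, new_vel = [], []
    for (x, y), (vx, vy) in zip(pos, vel):
        vy += gravity * dt                 # external force (gravity)
        x, y = x + vx * dt, y + vy * dt    # free motion
        if y < floor_y:                    # collision with machine geometry
            y = floor_y                    # geometric constraint: no penetration
            vy = max(vy, 0.0)              # contact cancels inward velocity
        new_pos.append((x, y))
        new_vel.append((vx, vy))
    return new_pos, new_vel
```

In a full simulation, bending and stretching forces along the fiber and aerodynamic forces from the air flow would be added to the per-point force term before integration.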
Numerical simulation of the magnetic field generation of the Earth (the geodynamo), of other planets, and of the sun presents enormous computational and visualization challenges. Simulations of the physical system that generates the magnetic field are being developed and carried out on some of the world's fastest computers, requiring massively parallel computing for long-duration runs at sufficiently high resolution. The geodynamo represents a major visualization challenge, as the output consists of time-varying vector and scalar fields representing turbulent convection in the Earth's core and the coupled magnetic field generated by that flow. The number of fields to be studied, the resolution required, and the long time series make feature extraction very challenging; moreover, the observation used for comparison against simulations is the magnetic field at and above the Earth's surface, far from the computational domain of the simulations. Effective, specialized analysis and visualization systems that allow a geophysicist to truly comprehend a simulated data set in its entirety and derive the relevant, hidden scientific insight are still missing. Our research effort is motivated by this fact, and the members of the interdisciplinary team constituting this effort jointly specified the design objectives of the magnetic field visualization system presented here. Specifically, the Earth's magnetic field exhibits highly turbulent behavior in the Earth's interior, leading to simulated magnetic fields with highly intricate and hard-to-comprehend behavior and patterns.
Contact: Patrick Rüdiger