Chair of Visual Computing
The Visual Computing Group works on computer vision and machine learning. One focus is the automated design of neural networks: we investigate how such networks can become more efficient while remaining robust to disturbances in images. We are also researching new approaches to better understand how neural networks make decisions, and how to bring those decisions closer to human decision-making.
Our research profile
We develop efficient machine learning approaches for a variety of computer vision tasks.
Our group's research focuses on machine learning and computer vision methods, with emphasis on:
- Efficient search for neural architectures
We advance the automated design and optimization of neural networks, replacing manual trial-and-error with efficient search strategies. Our group develops acceleration methods, often based on predictive surrogate models, that explore large search spaces while minimizing costly training evaluations on the way to high-performing architectures.
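The idea of surrogate-accelerated search can be illustrated with a minimal sketch. Everything here is hypothetical: architectures are reduced to a two-dimensional (depth, width) encoding, `true_score` stands in for an expensive training run, and the surrogate is a plain least-squares fit rather than any specific model used by the group.

```python
import numpy as np

rng = np.random.default_rng(0)

def true_score(arch):
    # Stand-in for an expensive training run (toy objective):
    # reward depth and width, with a small interaction penalty.
    depth, width = arch
    return 0.5 * depth + 0.3 * width - 0.02 * depth * width

def fit_surrogate(archs, scores):
    # Linear least-squares surrogate: score ~ X @ w
    X = np.column_stack([archs, np.ones(len(archs))])
    w, *_ = np.linalg.lstsq(X, scores, rcond=None)
    return w

def predict(w, archs):
    X = np.column_stack([archs, np.ones(len(archs))])
    return X @ w

# Evaluate only a few architectures expensively to seed the surrogate.
seed_archs = rng.integers(1, 10, size=(8, 2))
seed_scores = np.array([true_score(a) for a in seed_archs])
w = fit_surrogate(seed_archs, seed_scores)

# Screen a large candidate pool cheaply; evaluate only the top few.
pool = rng.integers(1, 10, size=(200, 2))
top = pool[np.argsort(predict(w, pool))[-3:]]
best = max(top, key=true_score)
```

The point of the sketch is the evaluation budget: 200 candidates are ranked by the surrogate, but only 8 + 3 of them ever touch the expensive objective.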
- Parameter generation and transfer learning
Our research extends transfer learning beyond traditional pre-trained weights to generating network parameters from learned distributions and to the use of large language models. We aim to improve the efficiency and adaptability of neural networks across domains while carefully considering parameter counts and resource constraints.
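A minimal sketch of the "parameters from a learned distribution" idea, under strong simplifying assumptions: the "networks" are linear models trained by gradient descent, and the learned distribution is just a per-weight Gaussian fitted over a population of trained models. Sampling from it gives an informed initialization instead of a random one.

```python
import numpy as np

rng = np.random.default_rng(1)

# Toy regression task: learn w so that X @ w ~ y.
X = rng.normal(size=(64, 5))
w_true = rng.normal(size=5)
y = X @ w_true

def train(w, steps=300, lr=0.05):
    # Plain gradient descent on the mean squared error.
    for _ in range(steps):
        grad = X.T @ (X @ w - y) / len(y)
        w = w - lr * grad
    return w

# A population of trained models (e.g. from related runs).
trained = np.stack([train(rng.normal(size=5)) for _ in range(10)])

# "Learned distribution": per-weight Gaussian over the population.
mu, sigma = trained.mean(axis=0), trained.std(axis=0) + 1e-8

# Sampling yields parameters that already solve the task family,
# unlike a fresh random initialization.
w_sampled = rng.normal(mu, sigma)
w_random = rng.normal(size=5)

loss = lambda w: float(np.mean((X @ w - y) ** 2))
```

In this toy setting the sampled parameters land near the solution immediately, which is the intuition behind generating weights instead of retraining from scratch.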
- Improving the robustness of neural networks
A central aspect of our research is improving the robustness of neural networks against image noise and other disturbances that can lead to incorrect predictions. We investigate strategies such as emphasizing object detection and controlling frequency biases in network weights in order to reduce vulnerabilities and strengthen model reliability.
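The frequency-bias idea can be sketched in isolation. This is not the group's actual regularizer; it is a hand-rolled illustration that measures a filter's high-frequency energy via the 2D FFT and shrinks all non-DC frequency components, leaving the filter's mean response intact.

```python
import numpy as np

rng = np.random.default_rng(2)

def highfreq_energy(filt):
    # Share of spectral energy outside the DC (mean) component.
    spec = np.abs(np.fft.fft2(filt)) ** 2
    return float(1.0 - spec[0, 0] / (spec.sum() + 1e-12))

def regularize_filter(filt, strength=0.7):
    # Shrink every non-DC frequency toward zero; the DC term is
    # kept, so the filter's overall response level is preserved
    # while its high-frequency content is damped.
    F = np.fft.fft2(filt)
    mask = np.full(F.shape, 1.0 - strength)
    mask[0, 0] = 1.0
    return np.real(np.fft.ifft2(F * mask))

filt = rng.normal(size=(5, 5))      # a random 5x5 conv filter
smoothed = regularize_filter(filt)
```

High-frequency filter components are exactly what noise perturbations couple into, so damping them is one plausible route to the reduced vulnerability described above.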
Main research areas
- Automated design of neural architectures
- Robustness of deep learning models
- Object re-identification
- Efficiency of neural networks
- Joint hardware-software optimization (Learning2Sense)
- Learning and generation of weights
Teaching
We offer the following courses:
- Automated Machine Learning (summer semester)
- Digital Image Processing (summer semester)
- Practical course Deep Learning (winter semester)
- Unsupervised Deep Learning (winter semester)
Master's and Bachelor's theses
If you are interested in a Bachelor's or Master's thesis or a project group with us, you are welcome to contact us. The following list is only a small selection of the topics our group currently offers; please contact us for further topics and additional information. Your own ideas and interests are also welcome. Please note that we do not supervise external theses that require signing a confidentiality agreement.
- Alexander Auras:
  - NAS for inverse problems
- Penelope Natusch:
  - Robustness Predictions
  - Object Re-Identification
Publications
Can we talk models into seeing the world differently?
Transferrable Surrogates in Expressive Neural Architecture Search Spaces
An Evaluation of Zero-Cost Proxies - From Neural Architecture Performance Prediction to Model Robustness
Implicit Representations for Constrained Image Segmentation
Surprisingly Strong Performance Prediction with Neural Graph Features
Neural Architecture Design and Robustness: A Dataset
Improving Native CNN Robustness with Filter Frequency Regularization
Learning Where to Look – Generative NAS is Surprisingly Efficient
Surrogate NAS Benchmarks: Going Beyond the Limited Search Spaces of Tabular NAS Benchmarks
Smooth Variational Graph Embeddings for Efficient Neural Architecture Search
Neural Architecture Performance Prediction Using Graph Neural Networks
Contact the working group
Postal address
University of Siegen
Visual Computing Group
Hölderlinstraße 3
57076 Siegen
Visitor address
University of Siegen
Visual Computing Group
H-A Level 7
Room: H-A 7107
57076 Siegen
Secretariat
Secretary: Sarah Wagener
Phone: +49 (0)271 / 740-3315
Room: H-A 7107
E-mail: sarah-chr.wagener@uni-siegen.de