Tobias Nauen
Deep Learning
Which Transformer to Favor: A Comparative Analysis of Efficiency in Vision Transformers
A comprehensive benchmark and analysis of more than 45 transformer models for image classification, evaluating their efficiency across several performance metrics. We identify the optimal architectures to use and find that scaling up the model is more efficient than scaling up the input images.
Tobias Christian Nauen, Sebastian Palacio, Federico Raue, Andreas Dengel
PDF · Cite · Code · Project · Data Explorer
TaylorShift: Shifting the Complexity of Self-Attention from Squared to Linear (and Back) using Taylor-Softmax
This paper introduces TaylorShift, a novel reformulation of the attention mechanism using the Taylor softmax that enables computing full token-to-token interactions in linear time. We determine, both analytically and empirically, the crossover points at which TaylorShift becomes more efficient than conventional attention. TaylorShift outperforms the standard transformer architecture on 4 out of 5 tasks. A minimal sketch of the underlying trick follows below.
Tobias Christian Nauen, Sebastian Palacio, Andreas Dengel
PDF · Cite · Code · Project · Appendix
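To illustrate the idea (not the paper's actual implementation): replacing exp(x) in the softmax with its second-order Taylor expansion 1 + x + x²/2 makes the attention weights polynomial in q·k, so the sums over keys factor out of the per-query computation and the cost becomes linear in sequence length N (at the price of a cubic dependence on the head dimension d, which is where the crossover analysis comes in). The sketch below assumes a single head with no scaling or normalization tricks, and the function name is hypothetical:

```python
import numpy as np

def taylor_attention(Q, K, V):
    """Attention with the 2nd-order Taylor softmax, exp(x) ~ 1 + x + x^2/2.

    A minimal sketch of the linearization trick, not TaylorShift itself.
    Q, K, V: arrays of shape (N, d) for one attention head.
    """
    N, d = Q.shape
    # order 0: every query attends uniformly to all tokens
    num = np.broadcast_to(V.sum(axis=0), (N, d)).copy()  # (N, d)
    den = np.full(N, float(N))                           # (N,)
    # order 1: sum_j (q_i . k_j) v_j = q_i . (K^T V), computed once
    num += Q @ (K.T @ V)
    den += Q @ K.sum(axis=0)
    # order 2: (q_i . k_j)^2 factors through sum_j (k_j k_j^T) (x) v_j
    kkv = np.einsum('jd,je,jf->def', K, K, V)            # (d, d, d)
    kk = K.T @ K                                         # (d, d)
    num += 0.5 * np.einsum('nd,ne,def->nf', Q, Q, kkv)
    den += 0.5 * np.einsum('nd,ne,de->n', Q, Q, kk)
    return num / den[:, None]
```

Every sum over the N keys is computed once, independently of the queries, so the total cost is O(N·d³) rather than the O(N²·d) of standard attention.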
Albatross
Albatross is a research project in the area of continual learning.
Sustainable Embedded AI
Energy- and data-saving methods for environmental perception in embedded AI systems, with smart-factory and smart-farming applications as case studies; funded by the Carl Zeiss Foundation.
SustainML
SustainML is dedicated to creating a sustainable ML framework for Green AI. By prioritizing energy efficiency, it aims to pave the way for environmentally conscious AI solutions that are both efficient and effective.
Explaining Graph Neural Networks
We extend and test KEdge, an interpretable-by-design approach for graph neural networks, and compare it against gradient-based attribution techniques. A minimal sketch of such a gradient baseline follows below.
Tobias Christian Nauen, Thorben Funke, Avishek Anand
PDF · Cite · DOI
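For context on the comparison baseline: a vanilla gradient attribution for a GNN scores each input feature by the gradient of the target logit with respect to the node features. The sketch below is a generic saliency baseline, not KEdge; it assumes a PyTorch Geometric-style node classifier whose forward signature is `model(x, edge_index)`, and all names are illustrative:

```python
import torch

def gradient_attribution(model, x, edge_index, target_class):
    """Vanilla gradient (saliency) attribution for a GNN node classifier.

    Returns |d logit / d x|: one non-negative score per node feature,
    indicating how sensitive the target-class logit is to that feature.
    """
    x = x.clone().detach().requires_grad_(True)
    logits = model(x, edge_index)          # (num_nodes, num_classes)
    score = logits[:, target_class].sum()  # aggregate target-class logit
    score.backward()                       # populate x.grad
    return x.grad.abs()                    # (num_nodes, num_features)
```

Unlike KEdge, which builds sparsification into the model itself, such post-hoc gradient scores are computed after training and can be applied to any differentiable GNN.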