Tobias Nauen
Deep Learning
TaylorShift: Shifting the Complexity of Self-Attention from Squared to Linear (and Back) using Taylor-Softmax
This paper introduces TaylorShift, a novel reformulation of the attention mechanism using Taylor softmax that enables computing full token-to-token interactions in linear time. We analytically determine the crossover points where employing TaylorShift becomes more efficient than traditional attention, aligning closely with empirical measurements.
Tobias Christian Nauen, Sebastian Palacio, Andreas Dengel
PDF · Cite · Code · Project
Which Transformer to Favor: A Comparative Analysis of Efficiency in Vision Transformers
A comprehensive benchmark of more than 30 vision transformer models, evaluating their efficiency across multiple performance metrics.
Tobias Christian Nauen, Sebastian Palacio, Andreas Dengel
PDF · Cite · Code · Project · Data Explorer
Sustainable Embedded AI
Energy- and data-saving methods for environmental perception in embedded AI systems, with smart factory and smart farming applications as case studies; funded by the Carl Zeiss Foundation.
Explaining Graph Neural Networks
We extend and test KEdge, an interpretable-by-design approach for graph neural networks, and compare it to gradient-based attribution techniques.
Tobias Christian Nauen, Thorben Funke, Avishek Anand
PDF · Cite · DOI