Tobias Nauen
Conference paper
TaylorShift: Shifting the Complexity of Self-Attention from Squared to Linear (and Back) using Taylor-Softmax
This paper introduces TaylorShift, a novel reformulation of the attention mechanism based on the Taylor softmax that enables computing full token-to-token interactions in linear time. We determine, both analytically and empirically, the crossover points at which TaylorShift becomes more efficient than standard attention. TaylorShift outperforms the standard transformer architecture on 4 out of 5 tasks. A minimal sketch of the linear-time reformulation follows below.
Tobias Christian Nauen, Sebastian Palacio, Andreas Dengel
PDF · Code · Project · Slides · DOI · Appendix
Which Transformer to Favor: A Comparative Analysis of Efficiency in Vision Transformers
A comprehensive benchmark and analysis of more than 45 transformer models for image classification, evaluating their efficiency across multiple performance metrics. We identify the best architectures to use and find that scaling the model size is more efficient than scaling the input image resolution.
Tobias Christian Nauen, Sebastian Palacio, Federico Raue, Andreas Dengel
PDF · Code · Project · Data Explorer