Tobias Christian Nauen
PhD Student
My research interests include the efficiency of machine learning models, multimodal learning, and transformer models.
Publications
This paper introduces TaylorShift, a novel reformulation of the attention mechanism using the Taylor softmax that enables computing full token-to-token interactions in linear time. We analytically determine the crossover points at which TaylorShift becomes more efficient than traditional attention, and these predictions align closely with empirical measurements.
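The core idea can be illustrated with a small sketch (not the paper's implementation): replacing `exp(q·k)` with its second-order Taylor expansion `1 + q·k + (q·k)²/2` makes the attention weights a polynomial in the queries and keys, so the sums over keys can be precomputed once and reused for every query, giving cost linear in sequence length. Function and variable names below are illustrative assumptions.

```python
import numpy as np

def taylor_attention(Q, K, V):
    """Linear-time attention with Taylor-softmax weights
    w_ij = 1 + q_i.k_j + (q_i.k_j)^2 / 2  (always positive).
    Illustrative sketch only; cost is O(N * d^2) instead of O(N^2)."""
    N, d = Q.shape
    # Zeroth-order sums over keys/values
    s0_v = V.sum(0)                          # (d_v,)
    s0 = float(N)
    # First-order sums: sum_j k_j v_j^T and sum_j k_j
    s1_v = K.T @ V                           # (d, d_v)
    s1 = K.sum(0)                            # (d,)
    # Second-order sums over outer products k_j (x) k_j
    KK = np.einsum('nd,ne->nde', K, K)       # (N, d, d)
    s2_v = np.einsum('nde,nf->def', KK, V)   # (d, d, d_v)
    s2 = KK.sum(0)                           # (d, d)
    # Per-query numerator and denominator, each linear in N
    num = s0_v + Q @ s1_v + 0.5 * np.einsum('nd,ne,def->nf', Q, Q, s2_v)
    den = s0 + Q @ s1 + 0.5 * np.einsum('nd,ne,de->n', Q, Q, s2)
    return num / den[:, None]
```

For small sequences this matches the direct quadratic computation `W = 1 + Q @ K.T + 0.5 * (Q @ K.T)**2`, normalized row-wise; the payoff appears once the sequence length dominates the head dimension.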
Tobias Christian Nauen, Sebastian Palacio, Andreas Dengel
A comprehensive benchmark of more than 30 transformer models for vision, evaluating their efficiency across various performance metrics.
Tobias Christian Nauen, Sebastian Palacio, Andreas Dengel