Alessio Quercia
Parameter-Efficient Fine-Tuning
1LoRA: Summation Compression for Very Low-Rank Adaptation
We propose 1LoRA (Summation Low-Rank Adaptation), a compute-, parameter-, and memory-efficient fine-tuning method that uses the feature sum as a fixed compression and a single trainable vector as the decompression.
Alessio Quercia, Zhuo Cao, Arya Bangun, Richard D. Paul, Abigail Morrison, Ira Assent, Hanno Scharr
Parameter-efficient Bayesian Neural Networks for Uncertainty-aware Depth Estimation
In this work, we investigate the suitability of PEFT methods for subspace Bayesian inference in large-scale Transformer-based vision models.
Richard D. Paul*, Alessio Quercia*, Vincent Fortuin, Katharina Nöh, Hanno Scharr