
Low-Rank Adaptation - LoRA explained

AI Bites

RELATED LINKS
Paper Title: LoRA: Low-Rank Adaptation of Large Language Models
LoRA Paper: https://arxiv.org/abs/2106.09685
QLoRA Paper: https://arxiv.org/abs/2305.14314
LoRA official code: https://github.com/microsoft/LoRA
Parameter-Efficient Fine-Tuning (PEFT) Adapters paper: https://arxiv.org/abs/1902.00751
Parameter-Efficient Fine-Tuning (PEFT) library: https://github.com/huggingface/peft
HuggingFace LoRA training: https://huggingface.co/docs/diffusers...
HuggingFace LoRA notes: https://huggingface.co/docs/peft/conc...

⌚ ⌚ ⌚ TIMESTAMPS ⌚ ⌚ ⌚
0:00 Intro
0:58 Adapters
1:48 Twitter (@ai_bites)
2:13 What is LoRA
3:17 Rank Decomposition
4:28 Motivation Paper
5:02 LoRA Training
6:53 LoRA Inference
8:24 LoRA in Transformers
9:20 Choosing the rank
9:50 Implementations
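
The "Rank Decomposition" and "LoRA Training" segments above cover the core idea of the paper: instead of updating a full weight matrix, LoRA learns a low-rank update. A minimal NumPy sketch of that idea follows; the dimensions, rank, and scaling value here are illustrative assumptions, not values from the video.

```python
import numpy as np

# Sketch of LoRA's low-rank update (illustrative shapes, not from the video).
# The frozen pretrained weight W (d x k) is augmented with two small trainable
# matrices B (d x r) and A (r x k), where rank r << min(d, k):
#   W' = W + (alpha / r) * B @ A
d, k, r, alpha = 512, 512, 8, 16
rng = np.random.default_rng(0)

W = rng.standard_normal((d, k))          # frozen pretrained weight
A = rng.standard_normal((r, k)) * 0.01   # trainable, small random init
B = np.zeros((d, r))                     # trainable, zero init: no change at start

W_adapted = W + (alpha / r) * (B @ A)    # merged weight used at inference

# Trainable-parameter comparison: full fine-tuning vs. the LoRA update
full_params = d * k          # 262144
lora_params = d * r + r * k  # 8192, about 3% of the full matrix
```

Because B starts at zero, the adapted model is exactly the pretrained model at the beginning of training, and after training the product B @ A can be merged into W, so inference adds no extra latency.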

MY KEY LINKS
YouTube: /@aibites
Twitter: @ai_bites
Patreon: /ai_bites
GitHub: https://github.com/aibites
