
LoRA explained (and a bit about precision and quantization)

Follow
DeepFindr

▬▬ Papers / Resources ▬▬▬
LoRA Paper: https://arxiv.org/abs/2106.09685
QLoRA Paper: https://arxiv.org/abs/2305.14314
Huggingface 8bit intro: https://huggingface.co/blog/hfbitsan...
PEFT / LoRA Tutorial: https://www.philschmid.de/finetunef...
Adapter Layers: https://arxiv.org/pdf/1902.00751.pdf
Prefix Tuning: https://arxiv.org/abs/2101.00190


▬▬ Support me if you like
►Link to this channel: https://bit.ly/3zEqL1W
►Support me on Patreon: https://bit.ly/2Wed242
►Buy me a coffee on KoFi: https://bit.ly/3kJYEdl
►EMail: [email protected]

▬▬ Used Music ▬▬▬▬▬▬▬▬▬▬▬
Music from #Uppbeat (free for Creators!):
https://uppbeat.io/t/dangerlionx/fl...
License code: M4FRIPCTVNOO4S8F

▬▬ Used Icons ▬▬▬▬▬▬▬▬▬▬
All Icons are from flaticon: https://www.flaticon.com/authors/freepik

▬▬ Timestamps ▬▬▬▬▬▬▬▬▬▬▬
00:00 Introduction
00:20 Model scaling vs. fine-tuning
00:58 Precision & Quantization
01:30 Representation of floating point numbers
02:15 Model size
02:57 16-bit networks
03:15 Quantization
04:20 FLOPS
05:23 Parameter-efficient fine-tuning
07:18 LoRA
08:10 Intrinsic Dimension
09:20 Rank decomposition
11:24 LoRA forward pass
11:49 Scaling factor alpha
13:40 Optimal rank
14:16 Benefits of LoRA
15:20 Implementation
16:25 QLoRA
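
The LoRA forward pass and scaling factor covered above (11:24 and 11:49) can be sketched in a few lines of NumPy. This is an illustrative sketch, not the video's code: the dimensions, variable names, and initialization follow the LoRA paper's convention of a frozen weight W plus a low-rank update B·A scaled by alpha/r, with B initialized to zero so training starts from the pretrained model.

```python
import numpy as np

# Illustrative LoRA forward-pass sketch (not the video's code).
# Frozen pretrained weight W (d_out x d_in) plus a trainable
# low-rank update B @ A, scaled by alpha / r as in the LoRA paper.

d_in, d_out, r, alpha = 16, 8, 4, 8
rng = np.random.default_rng(0)

W = rng.normal(size=(d_out, d_in))      # frozen pretrained weights
A = rng.normal(size=(r, d_in)) * 0.01   # trainable, small random init
B = np.zeros((d_out, r))                # trainable, zero init -> update starts at 0

def lora_forward(x):
    # h = W x + (alpha / r) * B A x
    return W @ x + (alpha / r) * (B @ (A @ x))

x = rng.normal(size=d_in)
# Because B starts at zero, the LoRA branch contributes nothing yet,
# so the output equals the frozen model's output:
assert np.allclose(lora_forward(x), W @ x)
```

Note the parameter count: instead of fine-tuning all d_out × d_in weights, only the r × (d_in + d_out) entries of A and B are trained, which is the source of LoRA's savings when r is small.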

▬▬ My equipment
Microphone: https://amzn.to/3DVqB8H
Microphone mount: https://amzn.to/3BWUcOJ
Monitors: https://amzn.to/3G2Jjgr
Monitor mount: https://amzn.to/3AWGIAY
Height-adjustable table: https://amzn.to/3aUysXC
Ergonomic chair: https://amzn.to/3phQg7r
PC case: https://amzn.to/3jdlI2Y
GPU: https://amzn.to/3AWyzwy
Keyboard: https://amzn.to/2XskWHP
Bluelight filter glasses: https://amzn.to/3pj0fK2
