▬▬ Papers / Resources ▬▬▬
LoRA Paper: arxiv.org/abs/2106.09685
QLoRA Paper: arxiv.org/abs/2305.14314
Huggingface 8bit intro: huggingface.co/blog/hf-bitsan...
PEFT / LoRA Tutorial: www.philschmid.de/fine-tune-f...
Adapter Layers: arxiv.org/pdf/1902.00751.pdf
Prefix Tuning: arxiv.org/abs/2101.00190
▬▬ Support me if you like 🌟
►Link to this channel: bit.ly/3zEqL1W
►Support me on Patreon: bit.ly/2Wed242
►Buy me a coffee on Ko-Fi: bit.ly/3kJYEdl
►E-Mail: deepfindr@gmail.com
▬▬ Used Music ▬▬▬▬▬▬▬▬▬▬▬
Music from #Uppbeat (free for Creators!):
uppbeat.io/t/danger-lion-x/fl...
License code: M4FRIPCTVNOO4S8F
▬▬ Used Icons ▬▬▬▬▬▬▬▬▬▬
All Icons are from flaticon: www.flaticon.com/authors/freepik
▬▬ Timestamps ▬▬▬▬▬▬▬▬▬▬▬
00:00 Introduction
00:20 Model scaling vs. fine-tuning
00:58 Precision & Quantization
01:30 Representation of floating point numbers
02:15 Model size
02:57 16-bit networks
03:15 Quantization
04:20 FLOPS
05:23 Parameter-efficient fine-tuning
07:18 LoRA
08:10 Intrinsic Dimension
09:20 Rank decomposition
11:24 LoRA forward pass
11:49 Scaling factor alpha
13:40 Optimal rank
14:16 Benefits of LoRA
15:20 Implementation
16:25 QLoRA
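As a companion to the "LoRA forward pass" and "Scaling factor alpha" sections, here is a minimal NumPy sketch of the LoRA update from the paper above: the frozen weight W0 is augmented with a low-rank product B·A scaled by alpha/r. The dimensions, seed, and variable names are illustrative choices, not from the video:

```python
import numpy as np

rng = np.random.default_rng(0)
d_in, d_out, r, alpha = 8, 8, 2, 16

W0 = rng.normal(size=(d_out, d_in))  # frozen pretrained weight (not trained)
A = rng.normal(size=(r, d_in))       # trainable low-rank factor, random init
B = np.zeros((d_out, r))             # trainable low-rank factor, zero init

def lora_forward(x):
    # h = W0 x + (alpha / r) * B A x  -- the LoRA forward pass
    return W0 @ x + (alpha / r) * (B @ (A @ x))

x = rng.normal(size=d_in)
# Because B starts at zero, the LoRA path contributes nothing at first,
# so the adapted model initially matches the frozen pretrained model.
print(np.allclose(lora_forward(x), W0 @ x))  # True
```

Only A and B (2·r·d parameters) are trained, which is why LoRA is so much cheaper than full fine-tuning when r is small compared to the layer dimensions.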
▬▬ My equipment 💻
- Microphone: amzn.to/3DVqB8H
- Microphone mount: amzn.to/3BWUcOJ
- Monitors: amzn.to/3G2Jjgr
- Monitor mount: amzn.to/3AWGIAY
- Height-adjustable table: amzn.to/3aUysXC
- Ergonomic chair: amzn.to/3phQg7r
- PC case: amzn.to/3jdlI2Y
- GPU: amzn.to/3AWyzwy
- Keyboard: amzn.to/2XskWHP
- Blue-light filter glasses: amzn.to/3pj0fK2