Hugging Face Transformers: the basics. Practical coding guides S1E1. NLP Models (BERT/RoBERTa)

by rupert ai

Practical Python Coding Guide: BERT in PyTorch

In this first episode of the practical coding guide series, I discuss the basics of the Hugging Face Transformers library. What is it? How does it work? What can you do with it? This episode focuses on high-level concepts, navigating their website, and implementing some out-of-the-box functionality.
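
As a taster of that out-of-the-box functionality, here is a minimal sketch of a predefined pipeline (assuming the library is installed with: pip install transformers; the input sentence is just an example, not one from the video):

from transformers import pipeline

# A ready-made sentiment-analysis pipeline; it downloads a default
# fine-tuned checkpoint the first time it runs
classifier = pipeline("sentiment-analysis")

# One call handles tokenisation, the forward pass and post-processing
print(classifier("I really enjoyed this episode!"))
# e.g. [{'label': 'POSITIVE', 'score': 0.99...}]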

Intro: 00:00
What is Hugging Face's Transformer Library: 1:12
Hugging Face models: 2:00
Navigating the Transformers documentation: 8:56
Coding with Transformers (installation): 11:55
Using predefined pipelines: 12:45
Implementing a model through PyTorch: 14:08 (see the sketch after these timestamps)
Tokenisers, Token IDs and Attention Masks: 16:28
Output from the model: 25:26
Outro: 27:26
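
To give a flavour of the tokeniser and PyTorch sections above, here is a minimal sketch (bert-base-uncased is a stand-in checkpoint, not necessarily the one loaded in the video):

import torch
from transformers import AutoModel, AutoTokenizer

tokenizer = AutoTokenizer.from_pretrained("bert-base-uncased")
model = AutoModel.from_pretrained("bert-base-uncased")

# The tokeniser turns text into token IDs plus an attention mask
# (1 = real token, 0 = padding)
inputs = tokenizer(["Hello world!", "A longer second sentence."],
                   padding=True, return_tensors="pt")
print(inputs["input_ids"])
print(inputs["attention_mask"])

# Forward pass; last_hidden_state holds one vector per input token
with torch.no_grad():
    outputs = model(**inputs)
print(outputs.last_hidden_state.shape)  # (batch, seq_len, hidden_size)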

This series is a casual guide to Hugging Face and Transformer models, focused on implementation rather than theory. Let me know if you enjoy it!

In future episodes, I will be retraining a model from the Transformers library (RoBERTa) on a downstream task: a multi-label classification problem, in an attempt to spot subtle sentiment attributes in online comments. Make sure to subscribe if you are interested.
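
For a rough idea of what that setup can look like, a hedged sketch (the checkpoint and label count below are placeholders, not the actual configuration used later in the series):

from transformers import AutoModelForSequenceClassification, AutoTokenizer

# problem_type="multi_label_classification" makes the model train with
# BCEWithLogitsLoss (a sigmoid per label), so each label is scored
# independently rather than competing in a softmax
model = AutoModelForSequenceClassification.from_pretrained(
    "roberta-base",
    num_labels=6,  # placeholder label count
    problem_type="multi_label_classification",
)
tokenizer = AutoTokenizer.from_pretrained("roberta-base")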

Check out my website: https://www.rupert.digital

Good learning material for theory (Transformers / BERT):
Attention is all you need paper: https://arxiv.org/abs/1706.03762
BERT paper: https://arxiv.org/abs/1810.04805
RoBERTa paper: https://arxiv.org/abs/1907.11692
Jay Alammar's illustrated articles: https://jalammar.github.io/illustrate... (check out his BERT one too)
Chris McCormick: https://mccormickml.com/ (check out his YouTube series on BERT / Transformers)
