I've spent the last two months familiarizing myself with Hugging Face Transformers and their other libraries, after reading extensively (research papers and all) on transformers and encoder-decoder architectures and implementing GPT-2 from scratch.
I have fine-tuned a number of models: BERT for NER and POS tagging, BlenderBot on a custom dialogue dataset, and BERT for masked language modelling. I've also trained a causal language model (GPT-2) from scratch. A minimal sketch of the NER setup is below.
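For anyone curious what that kind of fine-tuning looks like, here's a minimal sketch of BERT fine-tuned for token classification (NER) with the Trainer API. The dataset (conll2003) and hyperparameters are illustrative assumptions, not necessarily what I used:

```python
# Minimal sketch: fine-tune BERT for NER with the Hugging Face Trainer.
# Dataset and hyperparameters are illustrative, not my exact setup.
from datasets import load_dataset
from transformers import (AutoModelForTokenClassification, AutoTokenizer,
                          DataCollatorForTokenClassification, Trainer,
                          TrainingArguments)

dataset = load_dataset("conll2003")
label_names = dataset["train"].features["ner_tags"].feature.names

tokenizer = AutoTokenizer.from_pretrained("bert-base-cased")
model = AutoModelForTokenClassification.from_pretrained(
    "bert-base-cased", num_labels=len(label_names)
)

def tokenize_and_align(batch):
    # Tokenize pre-split words and align word-level NER tags with the
    # resulting subword tokens; special tokens and continuation pieces
    # get -100 so the loss ignores them.
    tokenized = tokenizer(batch["tokens"], truncation=True,
                          is_split_into_words=True)
    all_labels = []
    for i, tags in enumerate(batch["ner_tags"]):
        word_ids = tokenized.word_ids(batch_index=i)
        prev, ids = None, []
        for wid in word_ids:
            ids.append(-100 if wid is None or wid == prev else tags[wid])
            prev = wid
        all_labels.append(ids)
    tokenized["labels"] = all_labels
    return tokenized

tokenized = dataset.map(tokenize_and_align, batched=True)

trainer = Trainer(
    model=model,
    args=TrainingArguments(output_dir="bert-ner", num_train_epochs=3),
    train_dataset=tokenized["train"],
    eval_dataset=tokenized["validation"],
    data_collator=DataCollatorForTokenClassification(tokenizer),
)
trainer.train()
```

The label-alignment step is the fiddly part: because BERT's tokenizer splits words into subwords, each word-level tag has to be mapped onto the first subword, with the rest masked out.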
Check out my profile on Hugging Face to see the models I've trained: huggingface.co/Binaryy. For my next trick, I'll be taking part in the BabyLM Challenge by CoNLL; check it out at babylm.github.io.