Spent the last 2 months familiarizing myself with Hugging Face Transformers and their other libraries, after reading extensively (research papers and all) on transformers and encoder-decoder architectures, and implementing GPT-2 from scratch.
I have fine-tuned a number of models: BERT for NER and POS tagging, BlenderBot on a custom dialogue dataset, and BERT for masked language modelling. I've also trained a causal language model (GPT-2) from scratch.