BERT-Based Transformer Projects

Hate Speech Detection

This project is part of the liveProject series BERT-Based Transformer Projects.

prerequisites
intermediate Python and PyTorch • basics of natural language processing

skills learned
loading and configuring a pretrained ALBERT model using Hugging Face • building and training a text classifier using PyTorch Lightning • validating the performance of the model and quantifying the results
Rohan Khilnani
1 week · 5-7 hours per week · INTERMEDIATE
liveProjects give you the opportunity to learn new skills by completing real-world challenges in your local development environment. Solve practical problems, write working code, and analyze real data. With liveProject, you learn by doing. These self-paced projects also come with full liveBook access to select books for 90 days, plus permanent access to other select Manning products.
In this liveProject, you’ll use the ALBERT variation of the BERT Transformer to detect occurrences of hate speech in a data set. The ALBERT model uses fewer parameters than BERT, making it better suited to the unstructured, slang-heavy text of social media. You’ll load this powerful pretrained model using the Hugging Face library and fine-tune it for your specific needs with PyTorch Lightning. Because falsely tagging benign posts as hate speech can be a serious problem, the success of your model will be measured by calculating and optimizing its precision score. Your final product will run as a notebook on a GPU in the Google Colab environment.
This project is designed for learning purposes and is not a complete, production-ready application or solution.
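Since the project’s success metric is precision, here is a minimal sketch of how that score is computed. This is illustrative plain Python with made-up counts, not the project’s own code; in the project you would derive these counts from your model’s predictions on the validation set.

```python
def precision(true_positives: int, false_positives: int) -> float:
    """Precision = TP / (TP + FP): of all posts flagged as hate speech,
    the fraction that actually are. High precision means few benign
    posts are falsely tagged."""
    predicted_positive = true_positives + false_positives
    if predicted_positive == 0:
        return 0.0  # nothing was flagged; avoid division by zero
    return true_positives / predicted_positive

# Illustrative counts: 40 correctly flagged posts, 10 false alarms.
print(precision(40, 10))  # 0.8
```

Optimizing this score (for example, by raising the classification threshold) trades recall for fewer false positives, which is the trade-off the project asks you to manage.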

book resources

When you start your liveProject, you get full access to the following books for 90 days.

project author

Rohan Khilnani
Rohan Khilnani is a data scientist at Optum, United Health Group. He has filed two patents in the field of natural language processing and has also published a research paper on LSTMs with Attention at the COLING conference in 2018.


This liveProject is for intermediate Python and NLP practitioners who are interested in implementing pretrained BERT architectures and customizing them to solve real-world NLP problems. To begin this liveProject, you will need to be familiar with:

  • Intermediate Python
  • Intermediate PyTorch
  • Basics of Google Colab
  • Basics of machine learning
  • Basics of neural networks
  • Basics of natural language processing

you will learn

In this liveProject, you will develop hands-on experience in building a text classifier using PyTorch Lightning and Hugging Face. You’ll also get practical experience working on GPUs in the Google Colab environment.

  • Working with Jupyter Notebook on Google Colab
  • Loading and preprocessing a text data set
  • Tokenizing data using pretrained tokenizers
  • Creating dataloaders and tensor data sets
  • Loading and configuring a pretrained ALBERT model using Hugging Face
  • Building and training a text classifier using PyTorch Lightning
  • Validating the performance of the model and optimizing its precision score
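The dataloader and tensor data set steps above can be sketched with plain PyTorch. The token ids below are made up for illustration; in the project they would come from a pretrained ALBERT tokenizer loaded via Hugging Face (e.g. from the `albert-base-v2` checkpoint).

```python
import torch
from torch.utils.data import TensorDataset, DataLoader

# Hypothetical pre-tokenized batch; a real run would produce these
# with a pretrained ALBERT tokenizer rather than by hand.
input_ids = torch.tensor([[2, 13, 7, 3, 0],
                          [2, 9, 4, 11, 3]])
attention_mask = (input_ids != 0).long()  # mask out padding (id 0)
labels = torch.tensor([1, 0])             # 1 = hate speech, 0 = benign

# Bundle tensors into a dataset and wrap it in a batching dataloader.
dataset = TensorDataset(input_ids, attention_mask, labels)
loader = DataLoader(dataset, batch_size=2)

for ids, mask, y in loader:
    print(ids.shape, mask.shape, y.shape)
```

A PyTorch Lightning training loop would then consume batches from this loader inside its `training_step`.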


You choose the schedule and decide how much time to invest as you build your project.
Project roadmap
Each project is divided into several achievable steps.
Get Help
Inside the liveProject platform, you can get help from other participants and from our expert mentors.
Compare with others
For each step, compare your deliverable to the solutions by the author and other participants.
book resources
Get full access to select books for 90 days. Permanent access to excerpts from Manning products is also included, as well as references to other resources.