BERT-Based Transformer Projects

Hate Speech Detection

This project is part of the liveProject series BERT-Based Transformer Projects
prerequisites
intermediate Python and PyTorch • basics of natural language processing
skills learned
loading and configuring a pretrained ALBERT model using Hugging Face • building and training a text classifier using PyTorch Lightning • validating the performance of the model and quantifying the results
Rohan Khilnani
1 week · 5-7 hours per week · INTERMEDIATE

Look inside
In this liveProject, you’ll use the ALBERT variant of the BERT Transformer to detect occurrences of hate speech in a data set. The ALBERT model uses fewer parameters than BERT, making it more suitable for the unstructured, slang-heavy text of social media. You’ll load this powerful pretrained model with the Hugging Face library and fine-tune it for your specific needs with PyTorch Lightning. Because falsely flagging content as hate speech is a serious problem, the success of your model will be measured by calculating and optimizing its precision score. Your final product will run as a notebook on a GPU in the Google Colab environment.
This project is designed for learning purposes and is not a complete, production-ready application or solution.
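
The description above mentions loading a pretrained ALBERT model with the Hugging Face library. A minimal sketch of that step, assuming the transformers package is installed and using the publicly available albert-base-v2 checkpoint (which may not be the exact checkpoint used in the project), could look like this:

# Minimal sketch: load a pretrained ALBERT tokenizer and classification model
# with Hugging Face. The "albert-base-v2" checkpoint is an assumption; the
# project may use a different ALBERT variant.
from transformers import AlbertTokenizer, AlbertForSequenceClassification

tokenizer = AlbertTokenizer.from_pretrained("albert-base-v2")
model = AlbertForSequenceClassification.from_pretrained(
    "albert-base-v2",
    num_labels=2,  # hate speech vs. not hate speech
)

# Tokenize a sample post into the tensors the model expects.
encoding = tokenizer(
    "example social media post",
    padding="max_length",
    truncation=True,
    max_length=128,
    return_tensors="pt",
)
outputs = model(**encoding)
print(outputs.logits.shape)  # (1, 2): one score per class

AlbertForSequenceClassification adds a small classification head on top of the pretrained encoder, so only a handful of new weights are trained from scratch during fine-tuning.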

book resources

When you start your liveProject, you get full access to select books for 90 days.

project author

Rohan Khilnani
Rohan Khilnani is a data scientist at Optum, UnitedHealth Group. He has filed two patents in the field of natural language processing and has published a research paper on LSTMs with Attention at the COLING conference in 2018.

prerequisites

This liveProject is for intermediate Python and NLP practitioners interested in implementing pretrained BERT architectures and customizing them to solve real-world NLP problems. To begin this liveProject, you will need to be familiar with:

TOOLS
  • Intermediate Python
  • Intermediate PyTorch
  • Basics of Google Colab
TECHNIQUES
  • Basics of machine learning
  • Basics of neural networks
  • Basics of natural language processing

you will learn

In this liveProject, you will develop hands-on experience in building a text classifier using PyTorch Lightning and Hugging Face. You’ll also get practical experience working on GPUs in the Google Colab environment.

  • Working with Jupyter Notebook on Google Colab
  • Loading and preprocessing a text data set
  • Tokenizing data using pretrained tokenizers
  • Creating dataloaders and tensor data sets
  • Loading and configuring a pretrained ALBERT model using Hugging Face
  • Building and training a text classifier using PyTorch Lightning (see the sketch after this list)
  • Validating the model’s performance by calculating and optimizing its precision score
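
The classifier and precision check described above could be wrapped in a PyTorch Lightning module along the following lines. This is an illustrative sketch, not the project’s reference solution: the class name, hyperparameters, and the use of torchmetrics’ BinaryPrecision are assumptions, and batches are assumed to be dicts with input_ids, attention_mask, and labels keys.

# Illustrative sketch of a Lightning text classifier around pretrained ALBERT.
# Class name, hyperparameters, and metric choice are assumptions, not the
# project's reference solution. Requires recent pytorch_lightning/torchmetrics.
import pytorch_lightning as pl
import torch
from torchmetrics.classification import BinaryPrecision
from transformers import AlbertForSequenceClassification

class HateSpeechClassifier(pl.LightningModule):
    def __init__(self, model_name: str = "albert-base-v2", lr: float = 2e-5):
        super().__init__()
        self.model = AlbertForSequenceClassification.from_pretrained(
            model_name, num_labels=2
        )
        self.lr = lr
        self.val_precision = BinaryPrecision()

    def forward(self, input_ids, attention_mask, labels=None):
        # The Hugging Face model returns the loss when labels are provided.
        return self.model(
            input_ids=input_ids, attention_mask=attention_mask, labels=labels
        )

    def training_step(self, batch, batch_idx):
        outputs = self(**batch)
        self.log("train_loss", outputs.loss)
        return outputs.loss

    def validation_step(self, batch, batch_idx):
        outputs = self(**batch)
        preds = outputs.logits.argmax(dim=-1)
        # Precision: of the posts flagged as hate speech, how many were correct.
        self.val_precision(preds, batch["labels"])
        self.log("val_precision", self.val_precision, prog_bar=True)

    def configure_optimizers(self):
        return torch.optim.AdamW(self.parameters(), lr=self.lr)

With a recent PyTorch Lightning release, training on Colab’s GPU then amounts to something like pl.Trainer(accelerator="gpu", devices=1, max_epochs=3).fit(classifier, train_loader, val_loader), where the dataloaders yield dicts of input_ids, attention_mask, and labels built from the tokenized tensors.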

features

Self-paced
You choose the schedule and decide how much time to invest as you build your project.
Project roadmap
Each project is divided into several achievable steps.
Get Help
Within the liveProject platform, get help from other participants and our expert mentors.
Compare with others
For each step, compare your deliverable to the solutions by the author and other participants.
book resources
Get full access to select books for 90 days. Permanent access to excerpts from Manning products is also included, as well as references to other resources.
