Natural Language Processing in Action
Understanding, analyzing, and generating text with Python
Hobson Lane, Cole Howard, Hannes Hapke
  • MEAP began April 2017
  • Publication in Summer 2018 (estimated)
  • ISBN 9781617294631
  • 300 pages (estimated)
  • printed in black & white

Natural Language Processing in Action is your guide to creating machines that understand human language using the power of Python with its ecosystem of packages dedicated to NLP and AI! You'll start with a mental model of how a computer learns to read and interpret language. Then, you'll discover how to train a Python-based NLP machine to recognize patterns and extract information from text. As you explore the carefully chosen examples, you'll expand your machine's knowledge and apply it to a range of challenges, from building a search engine that can find documents based on their meaning rather than merely their keywords, to training a chatbot that uses deep learning to answer questions and participate in a conversation.

Table of Contents


Part 1: Wordy Machines

1 The Language of Thought

1.1 The Magic

1.2 Practical Applications

1.3 What is Driving NLP Advances?

1.4 Language through a Computer’s "Eyes"

1.4.1 The Language of Locks (Regular Expressions)

1.4.2 A Simple Chatbot

1.4.3 Another Way

1.5 A Brief Overflight of Hyperspace

1.6 Word Order and Grammar

1.7 A Chatbot Natural Language Pipeline

1.8 Processing in Depth

1.9 Natural Language IQ

1.10 Summary

2 Build Your Vocabulary

2.1 Building your vocabulary through tokenization

2.2 A Token Improvement

2.2.1 Contractions

2.3 Extending your vocabulary with N-grams

2.3.1 What are N-grams?

2.3.2 Stopwords

2.4 Normalizing your vocabulary

2.4.1 Case normalization

2.4.2 Stemming

2.4.3 Lemmatization

2.4.4 Use Cases

2.5 Summary

3 Math with Words

3.1 Bag of Words

3.2 Vectorizing

3.3 Zipf’s Law

3.4 Topic Modeling

3.4.1 Return of Zipf

3.4.2 Relevance Ranking

3.4.3 Tools

3.4.4 Alternatives

3.5 Summary

4 Finding Meaning in Word Counts

4.1 From Word Counts to Topics

4.2 Latent Semantic Analysis (LSA)

4.3 Singular Value Decomposition (SVD)

4.4 Truncated SVD

4.5 scikit-learn vs. Gensim

4.5.1 PCA on a Point Cloud

4.5.2 Let’s Stop Horsing Around and Get Back to NLP

4.5.3 PCA on SMS Messages

4.5.4 Truncated SVD

4.5.5 Which one is Better for Spam Classification?

4.5.6 A Simple Classifier

4.6 Latent Dirichlet Allocation (LDiA)

4.6.1 The LDiA Idea

4.6.2 LDiA Topic Model for SMS Messages

4.6.3 LDiA + LDA = Spam Classifier

4.6.4 A Fairer Comparison: 32 LDiA Topics

4.7 Distance and Similarity

4.8 Steering

4.8.1 Linear Discriminant Analysis (LDA)

4.9 Topic Vector Power

4.10 "Like" Prediction

4.11 Summary

Part 2: Deeper Learning

5 Baby Steps with Neural Networks

5.1 A Regression Digression

5.1.1 The Spaminess Topic

5.2 Neural Networks, the Ingredient List

5.2.1 Backpropagation

5.2.2 Let’s Go Skiing - The Error Surface

5.2.3 Off the Chair Lift, Onto the Slope

5.2.4 Let’s Shake Things Up a Bit

5.2.5 Keras: Neural Networks in Python

5.2.6 Onward and Deepward

5.2.7 Normalization: Input with Style

5.3 Summary

6 Reasoning with Word Vectors (Word2vec)

6.1 Vector-Oriented Reasoning

6.2 Applications for Word Vectors

6.3 How to compute the Word Vector Representations?

6.3.1 Skip-gram Approach

6.3.2 Continuous Bag of Words Approach

6.3.3 Skip-gram vs. Continuous Bag of Words: When to use which approach

6.3.4 Computational Tricks of Word2vec

6.4 How to use Gensim’s Word2vec module?

6.5 How to generate your own Word vector representations?

6.5.1 Preprocess Steps

6.5.2 Train your domain-specific Word2vec model

6.6 Word2vec vs GloVe (Global Vector)

6.7 Word2vec vs LSA

6.8 Doc2Vec

6.9 Summary

7 Getting Words in Order with Convolutional Neural Networks (CNNs)

7.1 Learning Meaning

7.1.1 Word Order

7.1.2 Word Proximity

7.2 Toolkit

7.3 Convolutional Neural Nets

7.3.1 Padding

7.3.2 Learning

7.4 Narrow Windows Indeed

7.4.1 Implementation in Keras: Prepping the Data

7.4.2 Convolutional Neural Network Architecture

7.4.3 Pooling

7.4.4 Dropout

7.4.5 The Cherry on the Sundae

7.4.6 Let’s Get to Learning (Training)

7.4.7 Using the Model in a Pipeline

7.4.8 Where Do We Go From Here?

7.5 Summary

8 Loopy (Recurrent) Neural Networks (RNNs)

8.1 Remembering with Recurrent Networks

8.2 Backpropagation Through Time

8.2.1 When Do We Update What?

8.2.2 There’s Always a Catch

8.2.3 RNN with Keras

8.3 Putting Things Together

8.4 Let’s Get to Learning Our Past Selves

8.5 Hyperparameters

8.6 Predicting

8.7 Two Way Street

8.8 Summary

9 Improving Retention with Long Short-Term Memory Networks (LSTMs)

10 Attention Networks, Translation, and Sequence to Sequence Models

Part 3: Getting Real

11 Information Extraction

12 Getting Chatty

13 Generative Models

14 Scaling Up


Appendix A: Acquiring Words

Appendix B: You Slice, I Choose

Appendix C: Vectors and Matrices (Linear Algebra)

Appendix D: Machine Learning

About the Technology

Most humans are pretty good at reading and interpreting text; computers...not so much. Natural Language Processing (NLP) is the discipline of teaching computers to read more like people, and you see examples of it in everything from chatbots to the speech-recognition software on your phone. Modern NLP techniques based on machine learning radically improve the ability of software to recognize patterns, use context to infer meaning, and accurately discern intent from poorly structured text. NLP promises to help you improve customer interactions, reduce costs, and reinvent text-intensive applications like search or product support.

What's inside

  • Working with Keras, TensorFlow, Gensim, scikit-learn, and more
  • Parsing and normalizing text
  • Rule-based (Grammar) NLP
  • Data-based (Machine Learning) NLP
  • Deep Learning NLP
  • End-to-end chatbot pipeline with training data
  • Scalable NLP pipelines
  • Hyperparameter optimization algorithms

About the reader

While all examples are written in Python, experience with any modern programming language will allow readers to get the most from this book. A basic understanding of machine learning will also be helpful.

About the authors

Hobson Lane has more than 15 years of experience building autonomous systems that make important decisions on behalf of humans. Hannes Hapke is an Electrical Engineer turned Data Scientist with experience in deep learning. Cole Howard is a carpenter and writer turned Deep Learning expert.

Manning Early Access Program (MEAP): Read chapters as they are written, get the finished eBook as soon as it's ready, and receive the pBook long before it's in bookstores.
MEAP combo $49.99 pBook + eBook + liveBook
MEAP eBook $39.99 pdf + ePub + kindle + liveBook

FREE domestic shipping on three or more pBooks