Natural Language Processing in Action
Understanding, analyzing, and generating text with Python
Hobson Lane, Cole Howard, Hannes Hapke
  • MEAP began April 2017
  • Publication in June 2018 (estimated)
  • ISBN 9781617294631
  • 300 pages (estimated)
  • printed in black & white

Natural Language Processing in Action is your guide to creating machines that understand human language using Python and its ecosystem of packages dedicated to NLP and AI! You'll start with a mental model of how a computer learns to read and interpret language. Then you'll discover how to train a Python-based NLP machine to recognize patterns and extract information from text. As you explore the carefully chosen examples, you'll expand your machine's knowledge and apply it to a range of challenges, from building a search engine that finds documents based on their meaning rather than mere keywords, to training a chatbot that uses deep learning to answer questions and hold a conversation.
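
To give a flavor of what "meaning-based" search involves, here is a minimal sketch (not taken from the book; the toy corpus and query are invented) that combines scikit-learn's TF-IDF vectorizer with truncated SVD, the latent semantic analysis technique covered in chapter 4:

    from sklearn.feature_extraction.text import TfidfVectorizer
    from sklearn.decomposition import TruncatedSVD
    from sklearn.metrics.pairwise import cosine_similarity

    # Toy corpus: two documents about food, one about finance.
    docs = [
        "The chef cooked a delicious meal for the guests.",
        "The chef prepared a tasty dinner for our guests.",
        "Stock prices fell sharply on Monday.",
    ]

    vectorizer = TfidfVectorizer()
    tfidf = vectorizer.fit_transform(docs)

    # Project the sparse word counts into a small, dense "topic" space (LSA).
    svd = TruncatedSVD(n_components=2)
    topic_docs = svd.fit_transform(tfidf)

    # The query overlaps in meaning with both food documents, even though
    # neither one contains all of its words.
    query = svd.transform(vectorizer.transform(["a tasty meal"]))
    print(cosine_similarity(query, topic_docs))
    # The two food documents should score far higher than the finance one.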

Table of Contents


Part 1: Wordy Machines

1 The Language of Thought

1.1 The Magic

1.2 Practical Applications

1.3 What is Driving NLP Advances?

1.4 Language through a Computer’s "Eyes"

1.4.1 The Language of Locks (Regular Expressions)

1.4.2 A Simple Chatbot

1.4.3 Another Way

1.5 A Brief Overflight of Hyperspace

1.6 Word Order and Grammar

1.7 A Chatbot Natural Language Pipeline

1.8 Processing in Depth

1.9 Natural Language IQ

1.10 Summary

2 Build Your Vocabulary

2.1 Building your vocabulary with a tokenizer

2.2 A Token Improvement

2.2.1 Contractions

2.3 Extending your vocabulary with n-grams

2.3.1 What are n-grams?

2.3.2 Stopwords

2.4 Normalizing your vocabulary

2.4.1 Case normalization

2.4.2 Stemming

2.4.3 Lemmatization

2.4.4 Use Cases

2.5 Sentiment

2.6 VADER — A Rule-based Sentiment Analyzer

2.7 Naive Bayes — A Machine Learning Sentiment Analyzer

2.8 Summary

3 Math with Words

3.1 Bag of Words

3.2 Vectorizing

3.2.1 Vector Spaces

3.3 Zipf’s Law

3.4 Topic Modeling

3.4.1 Return of Zipf

3.4.2 Relevance Ranking

3.4.3 Tools

3.4.4 Alternatives

3.4.5 Okapi BM25

3.5 Summary

4 Finding Meaning in Word Counts

4.1 From Word Counts to Topics

4.2 Latent Semantic Analysis (LSA)

4.3 Singular Value Decomposition (SVD)

4.4 Truncated SVD

4.5 scikit-learn vs. gensim

4.5.1 PCA on a Point Cloud

4.5.2 Let’s Stop Horsing Around and Get Back to NLP

4.5.3 PCA on SMS Messages

4.5.4 Truncated SVD

4.5.5 How well does LSA work for spam classification?

4.5.6 A Simple Classifier

4.6 Latent Dirichlet Allocation (LDiA)

4.6.1 The LDiA Idea

4.6.2 LDiA Topic Model for SMS Messages

4.6.3 LDiA + LDA = Spam Classifier

4.6.4 A Fairer Comparison: 32 LDiA Topics

4.7 Distance and Similarity

4.8 Steering

4.8.1 Linear Discriminant Analysis (LDA)

4.9 Topic Vector Power

4.10 "Like" Prediction

4.11 Summary

Part 2: Deeper Learning

5 Baby Steps with Neural Networks

5.1 Neural Networks, the Ingredient List

5.1.1 Perceptron

5.1.2 A Numerical Perceptron

5.1.3 Detour through Bias

5.1.4 A Pythonic Neuron

5.1.5 Class is in Session

5.1.6 Logic is a Fun Thing to Learn

5.1.7 Next Step

5.1.8 Emergence from the First AI Winter

5.1.9 Backpropagation

5.1.10 Derivative All the Things

5.1.11 Let’s Go Skiing: The Error Surface

5.1.12 Off the Chair Lift, Onto the Slope

5.1.13 Let’s Shake Things Up a Bit

5.1.14 Keras: Neural Networks in Python

5.1.15 Onward and Deepward

5.1.16 Normalization: Input with Style

5.2 Summary

6 Reasoning with Word Vectors

6.1 Semantic Queries and Analogies

6.2 Word Vectors

6.2.1 Vector-Oriented Reasoning

6.2.2 Word Vector Representations based on Word2Vec

6.2.3 How to use gensim’s word2vec module

6.2.4 How to generate your own word vector representations

6.2.5 Word2vec vs GloVe (Global Vectors)

6.2.6 fastText

6.2.7 Word2vec vs LSA

6.2.8 Visualizing Word Relationships

6.2.9 Unnatural Words

6.2.10 Document Similarity with Doc2vec

6.3 Summary

7 Getting Words in Order with Convolutional Neural Networks (CNNs)

7.1 Learning Meaning

7.1.1 Word Order

7.1.2 Word Proximity

7.2 Toolkit

7.3 Convolutional Neural Nets

7.3.1 Building Blocks

7.3.2 Step Size

7.3.3 Filter Composition

7.3.4 Padding

7.3.5 Learning

7.4 Narrow Windows Indeed

7.4.1 Implementation in Keras: Prepping the Data

7.4.2 Convolutional Neural Network Architecture

7.4.3 Pooling

7.4.4 Dropout

7.4.5 The Cherry on the Sundae

7.4.6 Let’s Get to Learning (Training)

7.4.7 Using the Model in a Pipeline

7.4.8 Where Do We Go From Here?

7.5 Summary

8 Loopy (Recurrent) Neural Networks

8.1 Remembering with Recurrent Networks

8.1.1 Backpropagation Through Time

8.1.2 When Do We Update What?

8.1.3 Recap

8.1.4 There’s Always a Catch

8.1.5 Recurrent Neural Net with Keras

8.2 Putting Things Together

8.3 Let’s Get to Learning Our Past Selves

8.4 Hyperparameters

8.5 Predicting

8.5.1 Statefulness

8.5.2 Two-Way Street

8.5.3 What is this thing?

8.6 Summary

9 Improving Retention with Long Short-Term Memory Networks (LSTMs)

9.1 LSTM

9.1.1 Backpropagation Through Time

9.1.2 In Practice

9.1.3 Where does the rubber hit the road?

9.1.4 Dirty Data

9.1.5 Back to Our Dirty Data

9.1.6 Words are hard. Letters are easier.

9.1.7 My Turn to Talk

9.1.8 My Turn to Speak More Clearly

9.1.9 Learned How to Say, but Not Yet What

9.1.10 Other Kinds of Memory

9.1.11 Going Deeper

9.2 Summary

10 Sequence-to-Sequence Models and Attention Mechanisms

10.1 Sequence-to-Sequence Networks

10.2 How are seq2seq networks implemented?

10.2.1 Preparing your dataset for sequence-to-sequence training

10.2.2 The Sequence-to-Sequence Encoder

10.2.3 The Sequence-to-Sequence Decoder

10.2.4 Assembling the Sequence-to-Sequence Network

10.3 Training the Sequence-to-Sequence Network

10.3.1 Generate output sequences

10.4 Building a chatbot using seq2seq networks

10.4.1 Preparing the corpus for our training

10.4.2 Building our character dictionary

10.4.3 Generate one-hot encoded training sets

10.4.4 Train your sequence-to-sequence chatbot

10.4.5 Assemble the model for sequence generation

10.4.6 Predicting a sequence

10.4.7 Generating a response

10.4.8 Converse with your chatbot

10.5 Enhancements

10.5.1 Reduce Training Complexity by Using Bucketing

10.5.2 Paying Attention

10.6 In the Real World

10.7 Summary

Part 3: Getting Real

11 Information Extraction

12 Getting Chatty

13 Generative Models

14 Scaling Up


Appendix A: Acquiring Words

Appendix B: You Slice, I Choose

Appendix C: Vectors and Matrices (Linear Algebra)

Appendix D: Machine Learning

About the Technology

Most humans are pretty good at reading and interpreting text; computers... not so much. Natural Language Processing (NLP) is the discipline of teaching computers to read more like people, and you see examples of it in everything from chatbots to the speech-recognition software on your phone. Modern NLP techniques based on machine learning radically improve the ability of software to recognize patterns, use context to infer meaning, and accurately discern intent from poorly structured text. NLP promises to help you improve customer interactions, reduce costs, and reinvent text-intensive applications like search and product support.
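
As one concrete (and hedged) illustration of discerning intent from informal text, chapter 2 introduces VADER, the rule-based sentiment analyzer that ships with NLTK. A sketch along those lines, with an invented example sentence, might look like this:

    import nltk
    from nltk.sentiment.vader import SentimentIntensityAnalyzer

    nltk.download("vader_lexicon")  # one-time download of VADER's word scores
    sia = SentimentIntensityAnalyzer()

    # VADER's rules account for punctuation, capitalization, and negation cues.
    print(sia.polarity_scores("The support bot actually solved my problem!"))
    # Returns neg/neu/pos proportions plus a 'compound' score in [-1, 1].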

What's inside

  • Working with Keras, TensorFlow, Gensim, scikit-learn, and more
  • Parsing and normalizing text (see the short sketch after this list)
  • Rule-based (Grammar) NLP
  • Data-based (Machine Learning) NLP
  • Deep Learning NLP
  • End-to-end chatbot pipeline with training data
  • Scalable NLP pipelines
  • Hyperparameter optimization algorithms
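
As a rough sketch of the "parsing and normalizing" step, here is what tokenization followed by stemming might look like with NLTK (the example sentence is invented; the book's own pipeline may differ):

    from nltk.tokenize import TreebankWordTokenizer
    from nltk.stem.porter import PorterStemmer

    tokenizer = TreebankWordTokenizer()
    stemmer = PorterStemmer()

    # Split the sentence into tokens, then reduce each to its stem.
    tokens = tokenizer.tokenize("Cats are chasing the mice.")
    print([stemmer.stem(token.lower()) for token in tokens])
    # Roughly: ['cat', 'are', 'chase', 'the', 'mice', '.']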

About the reader

All examples are written in Python, but experience with any modern programming language will let readers get the most from this book. A basic understanding of machine learning will also be helpful.

About the authors

Hobson Lane has more than 15 years of experience building autonomous systems that make important decisions on behalf of humans. Hannes Hapke is an electrical engineer turned data scientist with experience in deep learning. Cole Howard is a carpenter and writer turned deep learning expert.

Manning Early Access Program (MEAP): Read chapters as they are written, get the finished eBook as soon as it’s ready, and receive the pBook long before it’s in bookstores.

  • MEAP combo: $49.99 (pBook + eBook + liveBook)
  • MEAP eBook: $39.99 (PDF + ePub + Kindle + liveBook)
