Exploring Natural Language Processing
With chapters selected by Hobson Lane
  • July 2020
  • ISBN 9781617297342
  • 101 pages
Natural Language Processing (NLP), a machine’s ability to understand written or spoken text, is an exciting technology that is making a huge impact on our everyday lives, both personally and professionally. Personal virtual assistants, email autocorrect, spam filters, online chatbots, and real-time translators are all made possible by advances in NLP. Continuously improving at the subtleties of human communication, such as inferring meaning from context and discerning emotion and sentiment, this technology is even trusted in sensitive areas like aircraft maintenance and predictive policing. With its nearly limitless applications, NLP promises to significantly shape our lives, now and far into the future.

About the book

In Exploring Natural Language Processing, author and NLP engineer Hobson Lane has combined four chapters from Manning books that introduce you to this amazing technology. You’ll learn basic NLP concepts, including the impact of deep learning on NLP, and take a look at a few methods used to process language. You’ll explore word vectors—numerical representations of word meanings—to identify synonyms, antonyms, or words that belong to the same category. In a chapter on sequential labeling and language models, you’ll learn how these NLP processes help with tasks including linguistic analysis, speech recognition, and image captioning. You’ll also delve into mapping one text sequence to another with a neural network, and using encoder-decoder model architectures for translation and chat. This interesting and informative sampler will inspire you with ideas for applying NLP in your own innovative ways!
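
The vector-oriented reasoning described above can be sketched with hand-made toy vectors and plain Python. These three-dimensional vectors are invented purely for illustration; real Word2vec vectors have hundreds of dimensions and are learned from large corpora (the gensim tooling covered in the book is what you would use in practice):

```python
from math import sqrt

# Toy 3-D "word vectors", hand-made for illustration only.
vectors = {
    "king":  [0.9, 0.8, 0.1],
    "queen": [0.9, 0.1, 0.8],
    "man":   [0.1, 0.9, 0.1],
    "woman": [0.1, 0.1, 0.9],
}

def cosine(a, b):
    """Cosine similarity: 1.0 for identical directions, 0.0 for orthogonal."""
    dot = sum(x * y for x, y in zip(a, b))
    return dot / (sqrt(sum(x * x for x in a)) * sqrt(sum(y * y for y in b)))

def analogy(a, b, c):
    """Solve the classic analogy a - b + c ≈ ?, e.g. king - man + woman."""
    target = [x - y + z for x, y, z in zip(vectors[a], vectors[b], vectors[c])]
    candidates = [w for w in vectors if w not in (a, b, c)]
    return max(candidates, key=lambda w: cosine(vectors[w], target))

print(analogy("king", "man", "woman"))  # → queen
```

The same arithmetic also surfaces synonyms and category members: words used in similar contexts end up with nearby vectors, so a high cosine similarity signals related meaning.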
Table of Contents


Part 1: Deep learning and language: the basics

2.1 Basic architectures of deep learning

2.1.1 Deep multilayer perceptrons

2.1.2 Two basic operators: spatial and temporal

2.2 Deep learning and NLP: a new paradigm

2.3 Summary

Part 2: Reasoning with word vectors (Word2vec)

6.1 Semantic queries and analogies

6.2 Word vectors

6.2.1 Vector-oriented reasoning

6.2.2 How to compute Word2vec representations

6.2.3 How to use the gensim.word2vec module

6.2.4 How to generate your own word vector representations

6.2.5 Word2vec vs. GloVe (Global Vectors)

6.2.6 fastText

6.2.7 Word2vec vs. LSA

6.2.8 Visualizing word relationships

6.2.9 Unnatural words

6.2.10 Document similarity with Doc2vec

6.3 Summary

Part 3: Sequential labeling and language modeling

5.1 Introduction to sequential labeling

5.1.1 What is sequential labeling?

5.1.2 Using RNNs to encode sequences

5.1.3 Implementing a Seq2seq encoder in AllenNLP

5.2 Building a part-of-speech tagger

5.2.1 Reading a dataset

5.2.2 Defining the model and the loss

5.2.3 Building the training pipeline

5.3 Multi-layer and bidirectional RNNs

5.3.1 Multi-layer RNNs

5.3.2 Bidirectional RNNs

5.4 Named entity recognition

5.4.1 What is named entity recognition?

5.4.2 Tagging spans

5.4.3 Implementing a named entity recognizer

5.5 Modeling a language

5.5.1 What a language model is

5.5.2 Why language models are useful

5.5.3 Training an RNN language model

5.6 Text generation using RNNs

5.6.1 Feeding characters to an RNN

5.6.2 Evaluating text using a language model

5.6.3 Generating text using a language model

5.7 Summary

Part 4: Sequence-to-sequence models and attention

10.1 Encoder-decoder architecture

10.1.1 Decoding thought

10.1.2 Look familiar?

10.1.3 Sequence-to-sequence conversation

10.1.4 LSTM review

10.2 Assembling a sequence-to-sequence pipeline

10.2.1 Preparing your dataset for the sequence-to-sequence training

10.2.2 Sequence-to-sequence model in Keras

10.2.3 Sequence encoder

10.2.4 Thought decoder

10.2.5 Assembling the sequence-to-sequence network

10.3 Training the sequence-to-sequence network

10.3.1 Generate output sequences

10.4 Building a chatbot using sequence-to-sequence networks

10.4.1 Preparing the corpus for your training

10.4.2 Building your character dictionary

10.4.3 Generate one-hot encoded training sets

10.4.4 Train your sequence-to-sequence chatbot

10.4.5 Assemble the model for sequence generation

10.4.6 Predicting a sequence

10.4.7 Generating a response

10.4.8 Converse with your chatbot

10.5 Enhancements

10.5.1 Reduce training complexity with bucketing

10.5.2 Paying attention

10.6 In the real world

10.7 Summary

What's inside

  • “Deep learning and language: the basics” – Chapter 2 from Deep Learning for Natural Language Processing by Stephan Raaijmakers
  • “Reasoning with word vectors (Word2vec)” – Chapter 6 from Natural Language Processing in Action by Hobson Lane, Cole Howard, and Hannes Max Hapke
  • “Sequential labeling and language modeling” – Chapter 5 from Real-World Natural Language Processing by Masato Hagiwara
  • “Sequence-to-sequence models and attention” – Chapter 10 from Natural Language Processing in Action by Hobson Lane, Cole Howard, and Hannes Max Hapke

About the author

Hobson Lane is a seasoned data scientist, NLP engineer, and open source advocate who loves building helpful autonomous devices and systems.
