Near-lifelike chatbots, meaningful resume-to-job matches, and laser-focused product recommendations are just a few examples of what’s possible when you apply deep learning to natural language processing (NLP). Emerging NLP algorithms and machine learning techniques give these amazing systems the ability to determine emotional tone, infer meaning from context, summarize documents, and even generate helpful responses to new questions.
Exploring Deep Learning for Language is a collection of chapters from five Manning books, handpicked by machine learning expert Jeff Smith. This free eBook begins with an overview of natural language processing before moving on to techniques for working with language data. You’ll explore practical techniques like feature generation to help algorithms make sense of your unstructured data and generating synonyms to improve the relevance of query results. You’ll also get an overview of more advanced topics, like using artificial neural networks to model language and embedding natural language in the popular TensorFlow machine learning framework. These carefully selected chapters deliver a solid foundation for what you can do when you combine deep learning with natural language processing.
2.3.2 Setting up a Lucene index with synonym expansion
2.4 Generating synonyms
2.5 Feed-forward neural networks
2.5.1 How it works: Weights and activation functions
2.5.2 Backpropagation in a nutshell
2.6 Using word2vec
2.6.1 Setting up word2vec in Deeplearning4J
2.6.2 Word2vec-based synonym expansion
2.7 Evaluations and comparisons
2.8 Considerations for production systems
2.8.1 Synonyms vs. antonyms
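The chapter above uses word2vec in Deeplearning4J to expand queries with synonyms. As a minimal sketch of the core idea, the following Python snippet scores candidate synonyms by cosine similarity over word vectors (the vectors and words here are made-up toy values, not learned embeddings):

```python
import numpy as np

# Hypothetical toy word vectors for illustration only; a real system
# would learn these with word2vec (e.g. via Deeplearning4J or gensim).
embeddings = {
    "aircraft": np.array([0.90, 0.10, 0.00]),
    "plane":    np.array([0.85, 0.15, 0.05]),
    "airplane": np.array([0.88, 0.12, 0.02]),
    "banana":   np.array([0.05, 0.90, 0.30]),
}

def cosine(a, b):
    # Cosine similarity: angle-based closeness of two vectors.
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))

def expand_synonyms(term, top_n=2, min_sim=0.9):
    """Return up to top_n nearest neighbours of `term` above a similarity cutoff."""
    scored = [(w, cosine(embeddings[term], v))
              for w, v in embeddings.items() if w != term]
    scored.sort(key=lambda x: x[1], reverse=True)
    return [w for w, s in scored[:top_n] if s >= min_sim]

# Nearest neighbours act as query-time synonyms for the search index.
print(expand_synonyms("plane"))
```

At query time, the expanded terms would be OR-ed into the Lucene query alongside the original term, which is the role synonym expansion plays in section 2.3.2.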
Neural Networks that Understand Language: king - man + woman == ?
11.1 What does it mean to understand language?
What kinds of predictions do people make about language?
11.2 Natural language processing (NLP)
NLP is divided into a collection of tasks or challenges.
11.3 Supervised NLP
Words go in, and predictions come out.
11.4 IMDB movie reviews dataset
You can predict whether people post positive or negative reviews.
11.5 Capturing word correlation in input data
Bag of words: Given a review’s vocabulary, predict the sentiment.
11.6 Predicting movie reviews
With the encoding strategy and the previous network, you can predict sentiment.
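The bag-of-words encoding described in sections 11.5 and 11.6 can be sketched in a few lines. This is a toy illustration (the reviews and vocabulary are made up), showing how a review becomes a fixed-length binary vector the network can consume:

```python
# Toy reviews for illustration; the chapter uses the IMDB dataset.
reviews = ["this movie was great great fun", "this movie was terrible"]

# Build the vocabulary: one index per distinct word.
vocab = sorted({w for r in reviews for w in r.split()})
word2idx = {w: i for i, w in enumerate(vocab)}

def bag_of_words(text):
    """Binary vector: 1 if the vocabulary word occurs in the review, else 0."""
    vec = [0] * len(vocab)
    for w in text.split():
        vec[word2idx[w]] = 1
    return vec

x = bag_of_words(reviews[0])
# The network sees only which words occur, not their order or counts.
```

Because word order is discarded, the network can only learn correlations between the presence of words and the sentiment label, which is exactly the point made in section 11.5.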
11.7 Intro to an embedding layer
Here’s one more trick to make the network faster.
After running the previous code, run this code.
11.8 Interpreting the output
11.9 Neural architecture
How did the choice of architecture affect what the network learned?
What should you see in the weights connecting words and hidden neurons?
11.10 Comparing word embeddings
How can you visualize weight similarity?
11.11 What is the meaning of a neuron?
Meaning is entirely based on the target labels being predicted.
11.12 Filling in the blank
Learn richer meanings for words by having a richer signal to learn.
11.13 Meaning is derived from loss.
Neural networks don’t really learn data; they minimize the loss function.
The choice of loss function determines the neural network’s knowledge.
11.14 King - Man + Woman ~= Queen
Word analogies are an interesting consequence of the previously built network.
11.15 Word analogies
Linear compression of an existing property in the data
You’ve learned a lot about neural word embeddings and the impact of loss on learning.
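The famous "king - man + woman ~= queen" analogy from sections 11.14 and 11.15 is plain vector arithmetic. The sketch below uses hypothetical 2-d vectors in which one axis stands for royalty and one for gender; real embeddings learn such directions from data, in far more dimensions:

```python
import numpy as np

# Hypothetical toy embeddings: axes are (royalty, maleness).
vectors = {
    "king":  np.array([0.9, 0.8]),
    "queen": np.array([0.9, 0.1]),
    "man":   np.array([0.1, 0.8]),
    "woman": np.array([0.1, 0.1]),
}

# Subtracting "man" removes the gender component, adding "woman"
# restores the female component, and the royalty component is kept.
target = vectors["king"] - vectors["man"] + vectors["woman"]

def nearest(v, exclude=()):
    """Closest vocabulary word by Euclidean distance, skipping query words."""
    return min((w for w in vectors if w not in exclude),
               key=lambda w: np.linalg.norm(vectors[w] - v))

print(nearest(target, exclude={"king", "man", "woman"}))  # queen
```

The analogy works only to the extent that the loss during training pushed these regularities into linear directions, which is the "meaning is derived from loss" argument of section 11.13.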
Sequence-to-sequence models for chatbots
11.1 Building on classification and RNNs
11.2 Seq2seq architecture
11.3 Vector representation of symbols
11.4 Putting it all together
11.5 Gathering dialogue data
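Before a seq2seq model can consume dialogue, each symbol must become a vector (section 11.3). The simplest scheme is one-hot encoding, sketched below with a hypothetical vocabulary and illustrative special tokens for sequence start and end:

```python
# Toy vocabulary for illustration; "<GO>" and "<EOS>" mark the start
# and end of a sequence (names here are illustrative conventions).
symbols = ["<GO>", "<EOS>", "hello", "how", "are", "you"]
sym2idx = {s: i for i, s in enumerate(symbols)}

def one_hot(symbol):
    """Vector of zeros with a single 1.0 at the symbol's vocabulary index."""
    vec = [0.0] * len(symbols)
    vec[sym2idx[symbol]] = 1.0
    return vec

# A whole utterance becomes a sequence of such vectors that the
# encoder RNN consumes one time step at a time.
sequence = [one_hot(s) for s in ["<GO>", "hello", "<EOS>"]]
```

In practice one-hot vectors are usually replaced by learned embeddings, but they make the encoder/decoder data flow easy to see.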
"Packets of thought (NLP overview)" from Natural Language Processing in Action by Hobson Lane, Cole Howard, and Hannes Hapke
"Generating features" from Machine Learning Systems by Jeff Smith
"Generating synonyms" from from Deep Learning for Search by Tommaso Teofili
"Neural Networks that understand language" from Grokking Deep Learning by Andrew Trask
"Sequence-to-sequence models for chatbots" from Machine Learning with TensorFlow by Nishant Shukla
Jeff Smith builds powerful machine learning systems. For the past decade, he has been working on building data science applications, teams, and companies as part of various teams in New York, San Francisco, and Hong Kong. He blogs (https://medium.com/@jeffksmithjr), tweets (@jeffksmithjr), and speaks (www.jeffsmith.tech/speaking) about various aspects of building real-world machine learning systems. He’s the author of Machine Learning Systems from Manning.