Exploring Deep Learning for Language
With chapters selected by Jeff Smith
  • April 2019
  • ISBN 9781617296796
  • 160 pages
Near-lifelike chatbots, meaningful resume-to-job matches, and laser-focused product recommendations are just a few examples of what’s possible when you apply deep learning to natural language processing (NLP). Emerging NLP algorithms and machine learning techniques give these amazing systems the ability to determine emotional tone, infer meaning from context, summarize documents, and even generate helpful responses to new questions.

Exploring Deep Learning for Language is a collection of chapters from five Manning books, handpicked by machine learning expert Jeff Smith. This free eBook begins with an overview of natural language processing before moving on to techniques for working with language data. You’ll explore practical techniques like generating features to help algorithms make sense of your unstructured data and generating synonyms to improve the relevance of query results. You’ll also get an overview of more advanced topics, such as using artificial neural networks to model language and building sequence-to-sequence chatbot models in the popular TensorFlow machine learning framework. These carefully selected chapters deliver a solid foundation for what you can do when you combine deep learning with natural language processing.
Table of Contents

The Language of Thought

Packets of thought (NLP overview)

1.1 Natural language vs. programming language

1.2 The magic

1.2.1 Machines that converse

1.2.2 The math

1.3 Practical applications

1.4 Language through a computer’s “eyes”

1.4.1 The language of locks

1.4.2 Regular expressions

1.4.3 A simple chatbot

1.4.4 Another way

1.5 A brief overflight of hyperspace

1.6 Word order and grammar

1.7 A chatbot natural language pipeline

1.8 Processing in depth

1.9 Natural language IQ

1.10 Summary
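As a taste of the pattern-matching approach covered in sections 1.4.2 and 1.4.3, here is a minimal Python sketch (not taken from the chapter itself) of a regular-expression chatbot; the greeting pattern and canned responses are purely illustrative assumptions.

import re
import random

# Illustrative greeting pattern, in the spirit of the chapter's regex chatbot;
# the patterns used in the book itself may differ.
GREETING = re.compile(
    r"\b(hi|hello|hey|good (morning|afternoon|evening))\b", re.IGNORECASE
)

RESPONSES = ["Hello!", "Hi there.", "Hey, how can I help?"]

def reply(utterance: str) -> str:
    """Return a canned greeting if the input looks like a greeting."""
    if GREETING.search(utterance):
        return random.choice(RESPONSES)
    return "Sorry, I only understand greetings so far."

print(reply("Hey, good morning!"))   # one of the canned greetings
print(reply("What's the weather?"))  # fallback response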


Generating Features

Generating features

4.1 Spark ML

4.2 Extracting features

4.3 Transforming features

4.3.1 Common feature transforms

4.3.2 Transforming concepts

4.4 Selecting features

4.5 Structuring feature code

4.5.1 Feature generators

4.5.2 Feature set composition

4.6 Applications

4.7 Reactivities

4.8 Summary
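Chapter 4 builds its feature pipelines on Spark ML using Scala. Purely as a sketch of the same extract-and-transform flow, here is a PySpark version that assumes the standard Tokenizer, HashingTF, and IDF transformers; the data and column names are made up for illustration.

# Minimal PySpark sketch: extract tokens from raw text, then transform them
# into TF-IDF feature vectors. Not code from the book, which uses Scala.
from pyspark.sql import SparkSession
from pyspark.ml.feature import Tokenizer, HashingTF, IDF

spark = SparkSession.builder.appName("feature-generation-sketch").getOrCreate()

raw = spark.createDataFrame(
    [(0, "deep learning for language"), (1, "generating features with spark")],
    ["id", "text"],
)

tokenizer = Tokenizer(inputCol="text", outputCol="words")              # feature extraction
tf = HashingTF(inputCol="words", outputCol="tf", numFeatures=1 << 10)  # term frequencies
idf = IDF(inputCol="tf", outputCol="features")                         # reweight by IDF

words = tokenizer.transform(raw)
tf_df = tf.transform(words)
features = idf.fit(tf_df).transform(tf_df)
features.select("id", "features").show(truncate=False)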


Generating Synonyms

Generating synonyms

2.1 Introducing synonym expansion

2.2 Why synonyms?

2.3 Vocabulary-based synonym matching

2.3.1 A quick look at Apache Lucene

2.3.2 Setting up a Lucene index with synonym expansion

2.4 Generating synonyms

2.5 Feed-forward neural networks

2.5.1 How it works: Weights and activation functions

2.5.2 Backpropagation in a nutshell

2.6 Using word2vec

2.6.1 Setting up word2vec in Deeplearning4J

2.6.2 Word2vec-based synonym expansion

2.7 Evaluations and comparisons

2.8 Considerations for production systems

2.8.1 Synonyms vs. antonyms

2.9 Summary
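The chapter implements synonym expansion with Apache Lucene and Deeplearning4J in Java. As a rough sketch of the same idea in Python, here is word2vec-based expansion using the gensim library; the toy corpus below is far too small to learn real synonyms and exists only to make the snippet runnable.

# Word2vec-based synonym expansion, sketched with gensim instead of the
# chapter's Deeplearning4J setup. Train on your real search corpus in practice.
from gensim.models import Word2Vec

sentences = [
    ["cheap", "flights", "to", "rome"],
    ["low", "cost", "flights", "to", "rome"],
    ["cheap", "hotels", "in", "rome"],
    ["affordable", "hotels", "near", "the", "airport"],
]

model = Word2Vec(sentences, vector_size=50, window=3, min_count=1, epochs=200)

def expand(term, topn=3):
    """Return nearest-neighbor terms to add as query-time synonyms."""
    return [word for word, _ in model.wv.most_similar(term, topn=topn)]

print(expand("cheap"))  # nearby terms; quality depends entirely on the corpus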


Neural Networks that Understand Language

Neural networks that understand language: king - man + woman == ?

11.1 What does it mean to understand language?

What kinds of predictions do people make about language?

11.2 Natural language processing (NLP)

NLP is divided into a collection of tasks or challenges.

11.3 Supervised NLP

Words go in, and predictions come out.

11.4 IMDB movie reviews dataset

You can predict whether people post positive or negative reviews.

11.5 Capturing word correlation in input data

Bag of words: Given a review’s vocabulary, predict the sentiment.

11.6 Predicting movie reviews

With the encoding strategy and the previous network, you can predict sentiment.

11.7 Intro to an embedding layer

Here’s one more trick to make the network faster.

After running the previous code, run this code.

11.8 Interpreting the output

11.9 Neural architecture

How did the choice of architecture affect what the network learned?

What should you see in the weights connecting words and hidden neurons?

11.10 Comparing word embeddings

How can you visualize weight similarity?

11.11 What is the meaning of a neuron?

Meaning is entirely based on the target labels being predicted.

11.12 Filling in the blank

Learn richer meanings for words by having a richer signal to learn.

11.13 Meaning is derived from loss

Neural networks don’t really learn data; they minimize the loss function.

The choice of loss function determines the neural network’s knowledge.

11.14 King - Man + Woman ~= Queen

Word analogies are an interesting consequence of the previously built network.

11.15 Word analogies

Linear compression of an existing property in the data

11.16 Summary

You’ve learned a lot about neural word embeddings and the impact of loss on learning.
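To make section 11.14 concrete, here is a tiny numpy sketch of embedding arithmetic; the hand-made 3-dimensional vectors below are assumptions for illustration, not learned embeddings.

# "king - man + woman ~= queen" as vector arithmetic over toy word vectors.
import numpy as np

vectors = {
    "king":   np.array([0.8, 0.9, 0.1]),
    "man":    np.array([0.7, 0.1, 0.1]),
    "woman":  np.array([0.7, 0.1, 0.9]),
    "queen":  np.array([0.8, 0.9, 0.9]),
    "prince": np.array([0.75, 0.5, 0.1]),
}

def cosine(u, v):
    return float(u @ v / (np.linalg.norm(u) * np.linalg.norm(v)))

def analogy(a, b, c, vocab=vectors):
    """Return the word whose vector is nearest (by cosine) to a - b + c."""
    target = vocab[a] - vocab[b] + vocab[c]
    candidates = {w: v for w, v in vocab.items() if w not in (a, b, c)}
    return max(candidates, key=lambda w: cosine(candidates[w], target))

print(analogy("king", "man", "woman"))  # 'queen' with these toy vectors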

Sequence-to-Sequence Models for Chatbots

Sequence-to-sequence models for chatbots

11.1 Building on classification and RNNs

11.2 Seq2seq architecture

11.3 Vector representation of symbols

11.4 Putting it all together

11.5 Gathering dialogue data

11.6 Summary
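The chapter assembles its encoder-decoder (seq2seq) model in lower-level TensorFlow. Purely as a structural outline, here is the same shape of model sketched with Keras layers; the vocabulary size, embedding width, and hidden size are arbitrary assumptions, and no dialogue data or training loop is shown.

# Encoder-decoder sketch: the encoder's final LSTM state seeds the decoder,
# which predicts the reply one token at a time. Sizes are arbitrary.
from tensorflow import keras
from tensorflow.keras import layers

vocab_size, embed_dim, hidden = 1000, 64, 128

enc_in = keras.Input(shape=(None,), dtype="int32", name="encoder_tokens")
enc_emb = layers.Embedding(vocab_size, embed_dim)(enc_in)
_, state_h, state_c = layers.LSTM(hidden, return_state=True)(enc_emb)

dec_in = keras.Input(shape=(None,), dtype="int32", name="decoder_tokens")
dec_emb = layers.Embedding(vocab_size, embed_dim)(dec_in)
dec_seq, _, _ = layers.LSTM(hidden, return_sequences=True, return_state=True)(
    dec_emb, initial_state=[state_h, state_c]
)
logits = layers.Dense(vocab_size)(dec_seq)

model = keras.Model([enc_in, dec_in], logits)
model.compile(
    optimizer="adam",
    loss=keras.losses.SparseCategoricalCrossentropy(from_logits=True),
)
model.summary()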

What's inside

  • "Packets of thought (NLP overview)" from Natural Language Processing in Action by Hobson Lane, Cole Howard, and Hannes Hapke
  • "Generating features" from Machine Learning Systems by Jeff Smith
  • "Generating synonyms" from Deep Learning for Search by Tommaso Teofili
  • "Neural Networks that understand language" from Grokking Deep Learning by Andrew Trask
  • "Sequence-to-sequence models for chatbots" from Machine Learning with TensorFlow by Nishant Shukla

About the author

Jeff Smith builds powerful machine learning systems. For the past decade, he has helped build data science applications, teams, and companies in New York, San Francisco, and Hong Kong. He blogs (https://medium.com/@jeffksmithjr), tweets (@jeffksmithjr), and speaks (www.jeffsmith.tech/speaking) about various aspects of building real-world machine learning systems. He’s the author of Machine Learning Systems from Manning.

eBook: $0.00 (PDF only, plus liveBook access)