Deep Learning for Search
Tommaso Teofili
Foreword by Chris Mattmann
  • June 2019
  • ISBN 9781617294792
  • 328 pages
  • printed in black & white

"A practical approach that shows you the state of the art in using neural networks, AI, and deep learning in the development of search engines."

From the Foreword by Chris Mattmann, NASA JPL

Deep Learning for Search teaches you how to improve the effectiveness of your search by implementing neural network-based techniques. By the time you're finished with the book, you'll be ready to build amazing search engines that deliver the results your users need and that get better as time goes on!

About the Technology

Deep learning handles the toughest search challenges, including imprecise search terms, badly indexed data, and retrieving images with minimal metadata. And with modern tools like DL4J and TensorFlow, you can apply powerful DL techniques without a deep background in data science or natural language processing (NLP). This book will show you how.
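As a taste of the embedding-based matching the book covers (this is an illustrative plain-Java sketch, not code from the book, and not a real word2vec model): imprecise query terms can be matched against indexed terms by cosine similarity between word vectors. The three-dimensional vectors and the tiny vocabulary below are invented for illustration; a trained model would learn vectors with hundreds of dimensions from a corpus.

```java
import java.util.LinkedHashMap;
import java.util.Map;

public class EmbeddingMatch {
    // Toy "word vectors" -- invented for illustration, standing in for
    // vectors a word2vec model would learn from real text.
    static final Map<String, double[]> VECTORS = new LinkedHashMap<>();
    static {
        VECTORS.put("laptop",   new double[]{0.90, 0.10, 0.00});
        VECTORS.put("notebook", new double[]{0.85, 0.15, 0.05});
        VECTORS.put("banana",   new double[]{0.00, 0.20, 0.95});
    }

    // Cosine similarity: dot product of the vectors over the product
    // of their lengths; close to 1 for semantically similar terms.
    static double cosine(double[] a, double[] b) {
        double dot = 0, na = 0, nb = 0;
        for (int i = 0; i < a.length; i++) {
            dot += a[i] * b[i];
            na  += a[i] * a[i];
            nb  += b[i] * b[i];
        }
        return dot / (Math.sqrt(na) * Math.sqrt(nb));
    }

    // Return the indexed term whose vector is closest to the query term's.
    static String closestTo(String term) {
        double[] q = VECTORS.get(term);
        String best = null;
        double bestScore = -1;
        for (Map.Entry<String, double[]> e : VECTORS.entrySet()) {
            if (e.getKey().equals(term)) continue;
            double s = cosine(q, e.getValue());
            if (s > bestScore) { bestScore = s; best = e.getKey(); }
        }
        return best;
    }

    public static void main(String[] args) {
        // "notebook" is far closer to "laptop" than "banana" is
        System.out.println(closestTo("laptop"));
    }
}
```

The same nearest-neighbor idea, applied over vectors learned by a real model, is what powers the synonym expansion and ranking techniques in parts 1 and 2 of the book.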

About the book

Deep Learning for Search teaches you to improve your search results with neural networks. You’ll review how DL relates to search basics like indexing and ranking. Then, you’ll walk through in-depth examples to upgrade your search with DL techniques using Apache Lucene and Deeplearning4j. As the book progresses, you’ll explore advanced topics like searching through images, translating user queries, and designing search engines that improve as they learn!
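The search basics the book builds on (term frequency, inverse document frequency, ranking) can be sketched without any library. The toy corpus and scoring below are illustrative only; Lucene's actual ranking uses a more refined formula (BM25 by default), but the tf-idf intuition is the same.

```java
import java.util.Arrays;
import java.util.List;

public class TfIdfSketch {
    // A tiny corpus -- invented for illustration.
    static final String[] DOCS = {
        "deep learning for search engines",
        "search ranking with neural networks",
        "cooking pasta at home"
    };

    static List<String> tokenize(String text) {
        return Arrays.asList(text.toLowerCase().split("\\s+"));
    }

    // idf(term) = log(N / df), where df = number of docs containing the term;
    // rare terms get higher weight than common ones.
    static double idf(String term) {
        int df = 0;
        for (String d : DOCS) if (tokenize(d).contains(term)) df++;
        return df == 0 ? 0 : Math.log((double) DOCS.length / df);
    }

    // Score one document against a query: sum over query terms of tf * idf.
    static double score(String query, int docId) {
        List<String> docTokens = tokenize(DOCS[docId]);
        double s = 0;
        for (String term : tokenize(query)) {
            long tf = docTokens.stream().filter(term::equals).count();
            s += tf * idf(term);
        }
        return s;
    }

    // Rank: return the index of the highest-scoring document.
    static int bestDoc(String query) {
        int best = 0;
        for (int i = 1; i < DOCS.length; i++)
            if (score(query, i) > score(query, best)) best = i;
        return best;
    }

    public static void main(String[] args) {
        // Doc 1 matches all three query terms, so it scores highest
        System.out.println(bestDoc("neural search ranking"));
    }
}
```

Chapters 5 and 6 start from exactly this kind of lexical scoring and then replace or augment it with scores derived from word and document embeddings.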

Table of Contents

Part 1: Search meets deep learning

1 Neural search

1.1 Neural networks and deep learning

1.2 What is machine learning?

1.4 A roadmap for learning deep learning

1.5 Retrieving useful information

1.5.1 Text, tokens, terms, and search fundamentals

1.5.2 Relevance first

1.5.3 Classic retrieval models

1.5.4 Precision and recall

1.6 Unsolved problems

1.7 Opening the search engine black box

1.8 Deep learning to the rescue

1.9 Index, please meet neuron

1.10 Neural network training


2 Generating synonyms

2.1 Introduction to synonym expansion

2.1.1 Why synonyms?

2.1.2 Vocabulary-based synonym matching

2.2 The importance of context

2.3 Feed-forward neural networks

2.4 Using word2vec

2.4.1 Setting up word2vec in Deeplearning4j

2.4.2 Word2vec-based synonym expansion

2.5 Evaluations and comparisons

2.6 Considerations for production systems

2.6.1 Synonyms vs. antonyms


Part 2: Throwing neural nets at a search engine

3 From plain retrieval to text generation

3.1 Information need vs. query: Bridging the gap

3.1.1 Generating alternative queries

3.1.2 Data preparation

3.1.3 Wrap-up of generating data

3.2 Learning over sequences

3.3 Recurrent neural networks

3.3.1 RNN internals and dynamics

3.3.2 Long-term dependencies

3.3.3 Long short-term memory networks

3.4 LSTM networks for unsupervised text generation

3.4.1 Unsupervised query expansion

3.5 From unsupervised to supervised text generation

3.5.1 Sequence-to-sequence modeling

3.6 Considerations for production systems


4 More-sensitive query suggestions

4.1 Generating query suggestions

4.1.1 Suggesting while composing queries

4.1.2 Dictionary-based suggesters

4.2 Lucene Lookup APIs

4.3 Analyzed suggesters

4.4 Using language models

4.5 Content-based suggesters

4.6 Neural language models

4.7 Character-based neural language model for suggestions

4.8 Tuning the LSTM language model

4.9 Diversifying suggestions using word embeddings


5 Ranking search results with word embeddings

5.1 The importance of ranking

5.2 Retrieval models

5.2.1 TF-IDF and the vector space model

5.2.2 Ranking documents in Lucene

5.2.3 Probabilistic models

5.3 Neural information retrieval

5.4 From word to document vectors

5.5 Evaluations and comparisons

5.5.1 Similarity based on averaged word embeddings


6 Document embeddings for rankings and recommendations

6.1 From word to document embeddings

6.2 Using paragraph vectors in ranking

6.2.1 Paragraph vector-based similarity

6.3.2 Using frequent terms to find similar content

6.3.3 Retrieving similar content with paragraph vectors

6.3.4 Retrieving similar content with vectors from encoder-decoder models


Part 3: One step beyond

7 Searching across languages

7.1 Serving users who speak multiple languages

7.1.1 Translating documents vs. queries

7.1.3 Querying in multiple languages on top of Lucene

7.2 Statistical machine translation

7.2.1 Alignment

7.2.2 Phrase-based translation

7.3 Working with parallel corpora

7.4 Neural machine translation

7.4.1 Encoder-decoder models

7.4.2 Encoder-decoder for MT in DL4J

7.5 Word and document embeddings for multiple languages

7.5.1 Linear projected monolingual embeddings


8 Content-based image search

8.2 A look back: Text-based image retrieval

8.3 Understanding images

8.3.1 Image representations

8.3.2 Feature extraction

8.4 Deep learning for image representation

8.4.1 Convolutional neural networks

8.4.3 Locality-sensitive hashing

8.5 Working with unlabeled images


9 A peek at performance

9.1 Performance and the promises of deep learning

9.1.1 From model design to production

9.2 Indexes and neurons working together

9.3 Working with streams of data


What's inside

  • Accurate and relevant rankings
  • Searching across languages
  • Content-based image search
  • Search with recommendations

About the reader

For developers comfortable with Java or a similar language and search basics. No experience with deep learning or NLP needed.

About the author

Tommaso Teofili is a software engineer with a passion for open source and machine learning. As a member of the Apache Software Foundation, he contributes to a number of open source projects, ranging from information retrieval (Lucene, Solr) to natural language processing and machine translation (OpenNLP, Joshua, UIMA).

He currently works at Adobe, developing search and indexing infrastructure components and researching natural language processing, information retrieval, and deep learning. He has spoken on search and machine learning at conferences including Berlin Buzzwords, the International Conference on Computational Science, ApacheCon, and EclipseCon. You can find him on Twitter at @tteofili.

print book: $35.99 (list price $59.99), pBook + eBook + liveBook; additional shipping charges may apply

eBook: $38.39 (list price $47.99), 3 formats + liveBook
