Deep Learning with JavaScript
Neural networks in TensorFlow.js
Shanqing Cai, Stanley Bileschi, Eric D. Nielsen with François Chollet
  • MEAP began November 2018
  • Publication in Fall 2019 (estimated)
  • ISBN 9781617296178
  • 350 pages (estimated)
  • printed in black & white

This book inspires me to learn more about deep learning, especially now that I can use the language I am most familiar with to run experiments.

Evan Wallace
Deep learning has transformed the fields of computer vision, image processing, and natural language processing. Thanks to TensorFlow.js, JavaScript developers can now build deep learning apps without relying on Python or R. Deep Learning with JavaScript shows developers how they can bring deep learning technology to the web. Written by the main authors of the TensorFlow.js library, this new book provides fascinating use cases and in-depth instruction for building deep learning apps in JavaScript, in your browser or on Node.js.
Table of Contents

Part 1: Motivation and Basic Concepts

1 Deep Learning and JavaScript

1.1 Artificial intelligence, machine learning, neural networks, and deep learning

1.1.1 Artificial intelligence

1.1.2 Machine learning: How it differs from traditional programming

1.1.3 Neural networks and deep learning

1.1.4 Why deep learning? Why now?

1.2 Why combine JavaScript and machine learning?

1.2.1 Why TensorFlow.js?

1.2.2 What this book will and will not teach you about TensorFlow.js

1.3 Summary

1.4 Exercises

Part 2: A Gentle Introduction to TensorFlow.js

2 Getting Started: Simple Linear Regression in TensorFlow.js

2.1 Example 1: Predicting the duration of a download using TensorFlow.js

2.1.1 Project overview: Duration prediction

2.1.2 A note on code listings and console interactions

2.1.3 Creating and formatting the data

2.1.4 Defining a simple model

2.1.5 Fitting the model to the training data

2.1.6 Using our trained model to make predictions

2.1.7 Summary of our first example

2.2 Inside model.fit(): Dissecting gradient descent from Example 1

2.2.1 The intuitions behind gradient descent optimization

2.2.2 Backpropagation: Inside gradient descent

2.3 Linear regression with multiple input features

2.3.1 The Boston Housing Prices dataset

2.3.2 Getting and running the Boston-housing project from GitHub

2.3.3 Accessing the Boston-housing data

2.3.4 Precisely defining the Boston-housing problem

2.3.5 A slight diversion into data normalization

2.3.6 Linear regression on the Boston-housing data

2.4 How to interpret your model

2.4.1 Extracting meaning from learned weights

2.4.2 Extracting internal weights from the model

2.4.3 Caveats on interpretability

2.5 Summary

2.6 Exercises

3 Adding Nonlinearity: Beyond Weighted Sums

3.1 Nonlinearity: What It Is and What It Is Good For

3.1.1 Building the Intuition for Nonlinearity in Neural Networks

3.1.2 Hyperparameters and Hyperparameter Optimization

3.2 Nonlinearity at Output: Models for Classification

3.2.1 What Is Binary Classification?

3.2.2 Measuring the Quality of Binary Classifiers: Precision, Recall, Accuracy, and ROC Curves

3.2.3 The ROC Curve: Showing Tradeoffs in Binary Classification

3.2.4 Binary Cross Entropy: The Loss Function for Binary Classification

3.3 Multi-class Classification

3.3.1 One-hot Encoding of Categorical Data

3.3.2 Softmax Activation

3.3.3 Categorical Cross Entropy: The Loss Function for Multi-class Classification

3.3.4 Confusion Matrix: Fine-grained Analysis of Multi-class Classification

Summary

Exercises

4 Recognizing Images and Sounds Using Convolutional Neural Networks

4.1 From Vectors to Tensors: Representing Images

4.1.1 The MNIST Dataset

4.2 Your First Convolutional Neural Network

4.2.1 conv2d Layer

4.2.2 maxPooling2d Layer

4.2.3 Repeating Motifs of Convolution and Pooling

4.2.4 Flatten and Dense Layers

4.2.5 Training the Convnet

4.2.6 Using the Convnet to Make Predictions

4.3 Beyond Browsers: Training Models Faster Using Node.js

4.3.1 Dependencies and Imports for Using tfjs-node

4.3.2 Training an Enhanced Convnet for MNIST in tfjs-node

4.3.3 Saving the Model from Node.js and Loading It in the Browser

4.4 Spoken Word Recognition: Applying Convnets to Audio Data

4.4.1 Spectrograms: Representing Sounds as Images

Summary

Exercises

5 Transfer Learning: Reusing Pretrained Neural Networks

5.1 Introduction to transfer learning: Reusing pretrained models

5.1.1 Transfer learning based on compatible output shapes: Freezing layers

5.1.2 Transfer learning on incompatible output shapes: Creating a new model using outputs from the base model

5.1.3 Getting the most out of transfer learning through fine-tuning: An audio example

5.2 Object detection through transfer learning on a convnet

5.2.1 A simple object detection problem based on synthesized scenes

5.2.2 Deep dive into simple object detection

Summary

Exercises

Part 3: Advanced Deep Learning with TensorFlow.js

6 Working with Data

6.1 Data is rocket fuel

6.2 Using tf.data to manage data

6.2.1 The tf.data.Dataset object

6.2.2 Creating a tf.data.Dataset

6.2.3 Accessing the data in your Dataset

6.2.4 Manipulating tfjs-data Datasets

6.3 Training models with model.fitDataset

6.4 Common patterns for accessing data

6.4.1 Working with CSV format data

6.4.2 Accessing video data using tf.data.webcam

6.5 Your data is likely flawed: Dealing with problems in your data

6.5.1 Theory of data

6.5.2 Detecting and cleaning problems with data

6.6 Data augmentation

6.7 Summary

6.8 Exercises

7 Visualizing Data and Models

7.1 Data visualization

7.1.1 Visualizing data using tfjs-vis

7.2 Visualizing model training

7.2.1 Formulation of the temperature-prediction problem

7.2.2 Underfitting, overfitting, and countermeasures

7.3 Visualizing models after training

7.3.1 Visualizing the internal activations of a convnet

7.3.2 Visualizing what convolutional layers are sensitive to: Maximally-activating images

7.3.3 Visual interpretation of a convnet’s classification result

Summary

Exercises

8 Deep Learning for Sequences and Text

8.1 Second attempt at weather prediction: Introducing Recurrent Neural Networks

8.1.1 Why dense layers fail to model sequential order

8.1.2 How recurrent neural networks model sequential order

8.2 Building deep learning models for text

8.2.1 How text is represented in machine learning: One-hot and multi-hot encoding

8.2.2 First attempt at the sentiment analysis problem

8.2.3 A more efficient representation of text: Word embeddings

8.2.4 1D convolutional neural networks

8.3 Sequence-to-sequence tasks with the attention mechanism

8.3.1 Formulation of the sequence-to-sequence task

8.3.2 The encoder-decoder architecture and the attention mechanism

8.3.3 Deep dive into the attention-based encoder-decoder model

8.4 Summary

8.5 Materials for further reading

8.6 Exercises

9 Generative Deep Learning in the Browser

10 Reinforcement Learning in the Browser

Part 4: Summary and Closing Words

11 Quick Review

12 Final Words

Appendixes

Appendix A

A.1 Glossary

A.2 Comparing the Features of TensorFlow.js to Some Other JavaScript Deep-Learning Libraries

A.3 Installing tfjs-node-gpu and Its Dependencies

A.3.1 Installing tfjs-node-gpu on Linux

A.3.2 Installing tfjs-node-gpu on Windows

A.4 A Quick Tutorial of Tensors and Operations in TensorFlow.js

A.4.1 Tensor creation and tensor axis conventions

A.4.2 Basic tensor operations

A.4.3 Memory Management in TensorFlow.js: tf.dispose() and tf.tidy()

A.4.4 Exercises for Appendix A.4

About the Technology

TensorFlow.js is an open-source JavaScript library for defining, training, and deploying deep learning models in the web browser. It’s quickly gaining popularity with developers for its benefits, including scalability, responsiveness, modularity, and portability. And with JavaScript, deep learning applications can run on a wide variety of platforms, making them more accessible than ever before. TensorFlow-based applications are combating disease, detecting and translating speech in real time, and helping NASA identify near-Earth objects. Imagine what it can do for you!
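
To make that concrete, here is a minimal sketch of defining, training, and querying a model with the TensorFlow.js layers API. It assumes only that the @tensorflow/tfjs package has been installed from npm, and the tiny dataset is invented purely for illustration:

  // Minimal sketch: define, train, and query a one-layer model with TensorFlow.js.
  // Assumes @tensorflow/tfjs is installed (npm install @tensorflow/tfjs);
  // the training data below is invented purely for illustration.
  import * as tf from '@tensorflow/tfjs';

  async function run() {
    // A single dense unit implements linear regression: y = w * x + b.
    const model = tf.sequential();
    model.add(tf.layers.dense({units: 1, inputShape: [1]}));
    model.compile({optimizer: 'sgd', loss: 'meanSquaredError'});

    // Toy data following y = 2x - 1.
    const xs = tf.tensor2d([1, 2, 3, 4], [4, 1]);
    const ys = tf.tensor2d([1, 3, 5, 7], [4, 1]);

    await model.fit(xs, ys, {epochs: 200});           // gradient-descent training
    model.predict(tf.tensor2d([5], [1, 1])).print();  // prints a value approaching 9
  }

  run();

The same layers API also runs in Node.js by importing @tensorflow/tfjs-node instead, which is the approach the book takes when training needs to be faster than the browser allows.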

About the book

In Deep Learning with JavaScript, authors Shanqing Cai, Eric Nielsen, Stanley Bileschi, and François Chollet teach you how to use TensorFlow.js to build incredible deep learning applications in JavaScript. These seasoned deep learning experts make it easy to see why JavaScript lends itself so well to deep learning. After teaching you the basics of TensorFlow.js, they ease you into core concepts like client-side prediction and analytics, web-based sensors, and supervised machine learning. Then you’ll dive into image recognition, transfer learning, preparing data, deep learning for text and sequences, generative deep learning in the browser, and reinforcement learning in the browser. Interesting and relevant use cases, including recognizing speech commands and captioning images and videos, drive home the value of your new skills. By the end, you’ll be solving real-world problems with deep learning models and applications of your own!
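
To give a flavor of the client-side prediction described above, the following hedged sketch (not taken from the book) classifies an image entirely in the browser using the pretrained MobileNet model published alongside TensorFlow.js; the 'input-image' element id is a hypothetical placeholder:

  // Hedged sketch: client-side prediction with a pretrained model (not from the book).
  // Assumes @tensorflow/tfjs and @tensorflow-models/mobilenet are available to your bundler;
  // 'input-image' is a hypothetical <img> element id used only for illustration.
  import '@tensorflow/tfjs';
  import * as mobilenet from '@tensorflow-models/mobilenet';

  async function classifyImage() {
    const img = document.getElementById('input-image');
    const model = await mobilenet.load();           // downloads pretrained weights
    const predictions = await model.classify(img);  // inference runs in the browser
    console.log(predictions);                       // [{className, probability}, ...]
  }

  classifyImage();

Because the model runs locally, no image data has to leave the user's machine, which is one of the practical attractions of doing prediction on the client.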

What's inside

  • Deploying computer vision, audio, and natural language processing in the browser
  • Fine-tuning machine learning models with client-side data
  • Constructing and training a neural network
  • Interactive AI for browser games using deep reinforcement learning
  • Generative neural networks for creating music and pictures
  • Using TensorFlow.js with Cloud ML

About the reader

For web developers interested in deep learning.

About the author

Shanqing Cai and Eric Nielsen are senior software engineers on the Google Brain team. Stan Bileschi is the technical lead for Google’s TensorFlow Usability team, which built the TensorFlow Layers API. All three have advanced degrees from MIT. Together, they’re responsible for writing most of TensorFlow.js.

Manning Early Access Program (MEAP)
Read chapters as they are written, get the finished eBook as soon as it's ready, and receive the pBook long before it's in bookstores.
MEAP combo $49.99 pBook + eBook + liveBook
MEAP eBook $39.99 pdf + ePub + kindle + liveBook
