Deep Learning with JavaScript
Neural networks in TensorFlow.js
Shanqing Cai, Stanley Bileschi, Eric D. Nielsen with François Chollet
  • MEAP began November 2018
  • Publication in December 2019 (estimated)
  • ISBN 9781617296178
  • 350 pages (estimated)
  • printed in black & white

This book inspires me to learn more about deep learning, especially now that I can use the language I am most familiar with to run experiments.

Evan Wallace
Deep learning has transformed the fields of computer vision, image processing, and natural language processing. Thanks to TensorFlow.js, JavaScript developers can now build deep learning apps without relying on Python or R. Deep Learning with JavaScript shows developers how they can bring deep learning technology to the web. Written by the main authors of the TensorFlow.js library, this book provides fascinating use cases and in-depth instruction for building deep learning apps in JavaScript, in your browser or on Node.js.
Table of Contents

Part 1: Motivation and Basic Concepts

1 Deep Learning and JavaScript

1.1 Artificial intelligence, machine learning, neural networks and deep learning

1.1.1 Artificial intelligence

1.1.2 Machine learning: How it differs from traditional programming

1.1.3 Neural networks and deep learning

1.1.4 Why deep learning? Why now?

1.2 Why combine JavaScript and machine learning?

1.2.1 Why TensorFlow.js?

1.2.2 What this book will and will not teach you about TensorFlow.js

Summary

Exercises

Part 2: A Gentle Introduction to TensorFlow.js

2 Getting Started: Simple Linear Regression in TensorFlow.js

2.1 Example 1: Predicting the duration of a download using TensorFlow.js

2.1.1 Project Overview: Duration Prediction

2.1.2 A note on code listings and console interactions

2.1.3 Creating and formatting the data

2.1.4 Defining a simple model

2.1.5 Fitting the model to the training data

2.1.6 Using our trained model to make predictions

2.1.7 Summary of our first example

2.2 Inside model.fit(): Dissecting gradient descent from Example 1

2.2.1 The intuitions behind gradient descent optimization

2.2.2 Backpropagation: Inside gradient descent

2.3 Linear regression with multiple input features

2.3.1 The Boston Housing Prices dataset

2.3.2 Getting and running the Boston-housing project from GitHub

2.3.3 Accessing the Boston-housing data

2.3.4 Precisely defining the Boston-housing problem

2.3.5 A slight diversion into data normalization

2.3.6 Linear regression on the Boston-housing data

2.4 How to interpret your model

2.4.1 Extracting meaning from learned weights

2.4.2 Extracting internal weights from the model

2.4.3 Caveats on interpretability

Summary

Exercises

3 Adding Nonlinearity: Beyond Weighted Sums

3.1 Nonlinearity: What It Is and What It Is Good For

3.1.1 Building the Intuition for Nonlinearity in Neural Networks

3.2 Nonlinearity at Output: Models for Classification

3.2.2 Measuring the Quality of Binary Classifiers: Precision, Recall, Accuracy, and ROC Curves

3.2.4 Binary Cross Entropy: The Loss Function for Binary Classification

3.3 Multi-class Classification

3.3.1 One-hot Encoding of Categorical Data

3.3.2 Softmax Activation

3.3.3 Categorical Cross Entropy: The Loss Function for Multi-class Classification

3.3.4 Confusion Matrix: Fine-grained Analysis of Multi-class Classification

Summary

Exercises

4 Recognizing Images and Sounds Using Convolutional Neural Networks

4.1 From Vectors to Tensors: Representing Images

4.1.1 The MNIST Dataset

4.2 Your First Convolutional Neural Network

4.2.1 conv2d Layer

4.2.2 maxPooling2d Layer

4.2.3 Repeating Motifs of Convolution and Pooling

4.2.4 Flatten and Dense Layers

4.2.5 Training the Convnet

4.2.6 Using the Convnet to Make Predictions

4.3 Beyond Browsers: Training Models Faster Using Node.js

4.3.1 Dependencies and Imports for Using tfjs-node

4.3.2 Training an Enhanced Convnet for MNIST in tfjs-node

4.3.3 Saving the Model from Node.js and Loading It in the Browser

4.4 Spoken Word Recognition: Applying Convnets to Audio Data

4.4.1 Spectrograms: Representing Sounds as Images

Summary

Exercises

5 Transfer Learning: Reusing Pretrained Neural Networks

5.1 Introduction to transfer learning: Reusing pretrained models

5.1.1 Transfer learning based on compatible output shapes: Freezing layers

5.1.2 Transfer learning on incompatible output shapes: Creating a new model using outputs from the base model

5.1.3 Getting the most out of transfer-learning through fine-tuning: An audio example

5.2 Object detection through transfer learning on a convnet

5.2.1 A simple object detection problem based on synthesized scenes

5.2.2 Deep dive into simple object detection

Summary

Exercises

Part 3: Advanced Deep Learning with TensorFlow.js

6 Working with Data

6.1 Data is rocket fuel

6.2 Using tf.data to manage data

6.2.1 The tf.data.Dataset object

6.2.2 Creating a tf.data.Dataset

6.2.3 Accessing the data in your Dataset

6.2.4 Manipulating tfjs-data Datasets

6.3 Training models with model.fitDataset

6.4 Common patterns for accessing data

6.4.1 Working with CSV format data

6.4.2 Accessing video data using tf.data.webcam

6.4.3 Accessing audio data using tf.data.microphone

6.5 Your data is likely flawed: Dealing with problems in your data

6.5.1 Theory of data

6.5.2 Detecting and cleaning problems with data

6.6 Data augmentation

Summary

Exercises

7 Visualizing Data and Models

7.1 Data visualization

7.1.1 Visualizing data using tfjs-vis

7.2 Visualizing models after training

7.2.1 Visualizing the internal activations of a convnet

7.2.2 Visualizing what convolutional layers are sensitive to: Maximally-activating images

7.2.3 Visual interpretation of a convnet’s classification result

Summary

Materials for further reading and exploration

Exercises

8 Underfitting, Overfitting, and the Universal Workflow of Machine Learning

8.1 Visualizing model training

8.1.1 Formulation of the temperature-prediction problem

8.1.2 Underfitting, overfitting, and countermeasures

8.2 The universal workflow of machine learning

Summary

Exercises

9 Deep Learning for Sequences and Text

9.1 Second attempt at weather prediction: Introducing Recurrent Neural Networks

9.1.1 Why dense layers fail to model sequential order

9.1.2 How recurrent neural networks model sequential order

9.2 Building deep learning models for text

9.2.1 How text is represented in machine learning: One-hot and multi-hot encoding

9.2.2 First attempt at the sentiment analysis problem

9.2.3 A more efficient representation of text: Word embeddings

9.2.4 1D convolutional neural networks

9.3 Sequence-to-sequence tasks with attention mechanism

9.3.1 Formulation of the sequence-to-sequence task

9.3.2 The encoder-decoder architecture and the attention mechanism

9.3.3 Deep dive into the attention-based encoder-decoder model

Summary

Materials for further reading

Exercises

10 Generative Deep Learning

10.1 Generating text with LSTM

10.1.1 Next-character predictor: A simple way to generate text

10.1.2 The LSTM-text-generation example

10.1.3 Temperature: Adjustable randomness in the generated text

10.2 Variational autoencoders (VAEs): Finding an efficient and structured vector representation of images

10.2.1 Classical autoencoder and VAE: Basic ideas

10.2.2 A detailed example of VAE: The Fashion-MNIST example

10.3 Image generation with generative adversarial networks (GANs)

10.3.1 The basic idea behind GANs

10.3.2 The building blocks of ACGAN

10.3.3 Diving deeper into the training of ACGAN

10.3.4 Seeing the MNIST ACGAN training and generation

Summary

Materials for further reading

Exercises

11 Basics of Deep Reinforcement Learning

11.1 The formulation of reinforcement-learning problems

11.2 Policy networks and policy gradients: The cart-pole example

11.2.1 Cart-pole as a reinforcement-learning problem

11.2.2 Policy network

11.2.3 Training the policy network: The REINFORCE algorithm

11.3 Value networks and Q-learning: The snake game example

11.3.1 Snake as a reinforcement-learning problem

11.3.2 Markov decision process and Q-values

11.3.3 Deep Q-Network

11.3.4 Training the deep Q-network

Summary

Materials for further reading

Exercises

Part 4: Summary and Closing Words

12 Testing, Optimizing, and Deploying Models

12.1 Testing TensorFlow.js Models

12.1.1 Traditional unit testing

12.1.2 Testing with golden values

12.1.3 Considerations around continuous training

12.2 Model Optimization

12.2.1 Model-size optimization through post-training weight quantization

12.2.2 Inference speed optimization using GraphModel conversion

12.3 Deploying TensorFlow.js models on various platforms and environments

12.3.1 Additional considerations when deploying to the web

12.3.2 Deployment to cloud serving

12.3.3 Deploying to a browser extension, such as a Chrome extension

12.3.4 Deploying TensorFlow.js models in JavaScript-based mobile applications

12.3.5 Deploying TensorFlow.js models in JavaScript-based cross-platform desktop applications

12.3.6 Deploying TensorFlow.js models on WeChat and other JavaScript-based mobile app plugin systems

12.3.7 Deploying TensorFlow.js models on single-board computers

12.3.8 Summary of deployments

Summary

Materials for further reading

Exercises

13 Summary, Conclusions, and Beyond

13.1 Key concepts in review

13.1.1 Various approaches to AI

13.1.2 What makes deep learning stand out among the subfields of machine learning

13.1.3 How to think about deep learning at a high level

13.1.4 Key enabling technologies of deep learning

13.1.5 Applications and opportunities unlocked by deep learning in JavaScript

13.2 Quick overview of deep-learning workflow and algorithms in TensorFlow.js

13.2.1 The universal workflow of supervised deep learning

13.2.2 Reviewing model and layer types in TensorFlow.js: A quick reference

13.2.3 Leveraging pre-trained models from TensorFlow.js

13.2.4 The space of possibilities

13.2.5 Limitations of deep learning

13.4 Pointers for further exploration

13.5 Final words

Appendixes

Appendix A

A.1 Glossary

A.2 Comparing the Features of TensorFlow.js to Some Other JavaScript Deep-Learning Libraries

A.3 Installing tfjs-node-gpu and Its Dependencies

A.3.1 Installing tfjs-node-gpu on Linux

A.3.2 Installing tfjs-node-gpu on Windows

A.4 A Quick Tutorial of Tensors and Operations in TensorFlow.js

A.4.1 Tensor creation and tensor axis conventions

A.4.2 Basic tensor operations

A.4.3 Memory Management in TensorFlow.js: tf.dispose() and tf.tidy()

A.4.4 Calculating gradients

A.4.5 Exercises for Appendix A.4

About the Technology

TensorFlow.js is an open-source JavaScript library for defining, training, and deploying deep learning models in the web browser. It is quickly gaining popularity with developers for its amazing set of benefits, including scalability, responsiveness, modularity, and portability. And because it runs in JavaScript, deep learning applications can reach a wide variety of platforms, making the technology more accessible than ever before. TensorFlow-based applications are combating disease, detecting and translating speech in real time, and helping NASA identify near-Earth objects. Imagine what it can do for you!

About the book

In Deep Learning with JavaScript, authors Shanqing Cai, Eric Nielsen, Stanley Bileschi, and François Chollet teach you how to use TensorFlow.js to build incredible deep learning applications in JavaScript. These seasoned deep learning experts make it easy to see why JavaScript lends itself so well to deep learning. After teaching you the basics of TensorFlow.js, they ease you into core concepts like client-side prediction and analytics, web-based sensors, and supervised machine learning. Then you'll dive into image recognition, transfer learning, preparing data, deep learning for text and sequences, generative deep learning in the browser, and reinforcement learning in the browser. Interesting and relevant use cases, including recognizing speech commands and captioning images and videos, drive home the value of your new skills. By the end, you'll be solving real-world problems with deep learning models and applications of your own!

What's inside

  • Deploying computer vision, audio, and natural language processing in the browser
  • Fine-tuning machine learning models with client-side data
  • Constructing and training a neural network
  • Interactive AI for browser games using deep reinforcement learning
  • Generative neural networks to generate music and pictures
  • Using TensorFlow.js with Cloud ML

About the reader

For web developers interested in deep learning.

About the author

Shanqing Cai, Stanley Bileschi, and Eric Nielsen are senior engineers on the Google Brain team. They were the primary developers of the high-level API of TensorFlow.js, including many of the examples, the documentation, and the related tooling. They each have advanced degrees from MIT.

Manning Early Access Program (MEAP): Read chapters as they are written, get the finished eBook as soon as it's ready, and receive the pBook long before it's in bookstores.