Deep Learning Design Patterns
Andrew Ferlitsch
  • MEAP began July 2020
  • Publication in Spring 2021 (estimated)
  • ISBN 9781617298264
  • 400 pages (estimated)
  • printed in black & white

"A great resource for thinking about how to approach developing deep learning models in a very plug-n-play extendable way."
— Nick Vazquez
Deep learning has revealed ways to create algorithms for applications that we never dreamed were possible. For software developers, the challenge lies in taking cutting-edge technologies from R&D labs through to production. Deep Learning Design Patterns is here to help. In it, you'll find deep learning models presented in a unique new way: as extendable design patterns you can easily plug-and-play into your software projects. Written by Google deep learning expert Andrew Ferlitsch, it's filled with the latest deep learning insights and best practices from his work with Google Cloud AI. Each valuable technique is presented in a way that's easy to understand, illustrated with accessible diagrams and code samples.

About the Technology

You don't need to design your deep learning applications from scratch! By viewing cutting-edge deep learning models as design patterns, developers can speed up the creation of AI models and make those models easier to understand, both for themselves and for other users.
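To make the "models as design patterns" idea concrete, here is a minimal, framework-free sketch of the stem / learner / task decomposition that the book's table of contents uses to organize architectures (sections 2.2, 2.4, and 2.5). All function names and layer sizes here are illustrative assumptions, not code from the book; a real implementation would use Keras layers and would cache weights rather than re-sampling them on each call.

```python
import numpy as np

def dense(units, rng):
    """Return a callable toy dense layer with ReLU activation.

    Weights are re-sampled on every call for brevity; this sketch only
    demonstrates how components compose, not how training works.
    """
    def layer(x):
        w = rng.standard_normal((x.shape[-1], units)) * 0.01
        return np.maximum(x @ w, 0.0)
    return layer

def stem(rng):
    # Entry point of the model: coarse feature extraction.
    return dense(32, rng)

def learner(rng, depth=2):
    # Repeated representational blocks. Swapping this component for a
    # different block design changes the architecture without touching
    # the stem or the task head -- the plug-and-play idea.
    layers = [dense(32, rng) for _ in range(depth)]
    def component(x):
        for layer in layers:
            x = layer(x)
        return x
    return component

def task(rng, num_classes=10):
    # Task-specific head, e.g. a classifier.
    return dense(num_classes, rng)

def build_model(rng):
    # A model is just the composition of the three components.
    components = [stem(rng), learner(rng), task(rng)]
    def model(x):
        for component in components:
            x = component(x)
        return x
    return model

rng = np.random.default_rng(0)
model = build_model(rng)
out = model(np.ones((4, 8)))
print(out.shape)  # (4, 10): batch of 4, one score per class
```

Because each component is an independent callable, replacing `learner` with, say, a residual-block builder leaves the rest of the model untouched, which is the reuse the book's patterns aim for.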

About the book

Deep Learning Design Patterns distills models from the latest research papers into practical design patterns applicable to enterprise AI projects. Using diagrams, code samples, and easy-to-understand language, Google Cloud AI expert Andrew Ferlitsch shares insights from state-of-the-art neural networks. You'll learn how to integrate design patterns into deep learning systems through engaging examples, including a real-estate application that can evaluate house prices from uploaded photos alone and a speech AI capable of delivering live sports broadcasts. Building on your existing deep learning knowledge, you'll quickly learn to incorporate the very latest models and techniques into your apps as idiomatic, composable, and reusable design patterns.
Table of Contents

Part 1 Junior Data Scientist

1 Evolution in Production AI

1.1 The Evolution in Machine Learning Concepts

1.1.1 Classical AI vs Narrow AI

1.1.2 Next Steps: IA, machine design, model fusion, model amalgamation

1.2 The Evolution in Machine Learning Steps

1.2.1 Machine Learning as a Pipeline

1.2.2 Machine Learning as a CI/CD Production Process

1.2.3 Machine Learning as Intelligent Automation

1.2.4 Machine Learning as Model Amalgamation

1.2.5 Model Amalgamation in Production

1.3 The Evolution in Machine Learning Design Patterns

1.3.1 Procedural Reuse

1.3.2 Factory

1.3.3 Abstract Factory

1.3.4 How the Book is Organized

2 Procedural Design Pattern

2.1 Architecture

2.2 Stem Component

2.2.1 VGG

2.2.2 ResNet

2.2.3 ResNeXt

2.2.4 Xception

2.3 Pre-Stem

2.4 Learner Component

2.4.1 ResNet

2.4.2 DenseNet

2.5 Task Component

2.5.1 ResNet

2.5.2 Multi-Layer Output

2.5.3 SqueezeNet

2.6 Summary

3 Wide Convolutional Neural Networks

3.1 Inception V1 (GoogLeNet)

3.1.1 Naive Inception Module

3.1.2 Inception v1 Module

3.1.3 Stem

3.1.4 Learner

3.1.5 Auxiliary Classifiers

3.1.6 Classifier

3.2 Inception V2 - Factoring Convolutions

3.3 Inception V3 - Architecture Redesign

3.3.1 Inception Groups & Blocks

3.3.2 Normal Convolution

3.3.3 Spatial Separable Convolution

3.3.4 Stem

3.3.5 Auxiliary Classifier

3.4 ResNeXt - Wide Residual Neural Networks

3.4.1 ResNeXt Block

3.4.2 Architecture

3.5 Wide Residual Network

3.5.1 Architecture

3.5.2 Wide Residual Block

3.6 Summary

4 Alternative Connectivity Patterns

4.1 DenseNet - Densely Connected Convolutional Neural Network

4.1.1 Dense Group

4.1.2 Dense Block

4.1.3 Architecture

4.1.4 Transition Block

4.2 Xception - Extreme Inception

4.2.1 Architecture

4.2.2 Entry Flow

4.2.3 Middle Flow

4.2.4 Exit Flow

4.2.5 Depthwise Separable Convolution

4.2.6 Depthwise Convolution

4.2.7 Pointwise Convolution

4.3 SE-Net - Squeeze and Excitation

4.3.1 Architecture

4.3.2 Group & Block

4.4 Summary

5 Mobile Convolutional Neural Networks

5.1 MobileNet V1

5.1.1 Architecture

5.1.2 Width Multiplier

5.1.3 Resolution Multiplier

5.1.4 Stem

5.1.5 Learner

5.1.6 Classifier

5.2 MobileNet V2

5.2.1 Architecture

5.2.2 Stem

5.2.3 Learner

5.2.4 Classifier

5.3 SqueezeNet

5.3.1 Architecture

5.3.2 Stem

5.3.3 Learner

5.3.4 Classifier

5.3.5 Bypass Connections

5.4 ShuffleNet V1

5.4.1 Architecture

5.4.2 Stem

5.4.3 Learner

5.5 Deployment

5.5.1 Quantization

5.5.2 TFLite Conversion and Prediction

5.6 Summary

6 AutoEncoders

6.1 Deep Neural Network AutoEncoders

6.1.1 Architecture

6.1.2 Encoder

6.1.3 Decoder

6.1.4 Training

6.2 Convolutional AutoEncoders

6.2.1 Architecture

6.2.2 Encoder

6.2.3 Decoder

6.3 Sparse AutoEncoders

6.4 Denoising AutoEncoders

6.5 Super Resolution

6.5.1 Pre-Upsampling SR

6.5.2 Post-Upsampling SR

6.6 Pre-text Tasks

6.7 Summary

7 Hyperparameter Tuning

7.1 Weight Initialization

7.1.1 Weight Distributions

7.1.2 Lottery Hypothesis

7.2 Warmup (Numerical Stability)

7.3.1 Manual Method

7.3.4 KerasTuner

7.4 Learning Rate Scheduler

7.4.1 Keras Decay Parameter

7.4.2 Keras Learning Rate Scheduler

7.4.3 Ramp

7.4.4 Constant Step

7.4.5 Cosine Annealing

7.5 Regularization

7.5.1 Weight Regularization

7.5.2 Label Smoothing

7.6 Summary

8 Transfer Learning

9 Data Distributions

10 Data Pipeline

11 Training Pipeline

Part 2 Mid-level Data Scientist

12 AutoML by Design (Patterns)

13 Multi-Task Models

14 Network Architecture Search

15 Automatic Hyperparameter Search

16 Production Foundation (Training at Scale)

Part 3 Advanced Data Scientist

17 Model Amalgamation

18 Automatic Macro-Architecture Search

19 Knowledge Distillation (Student/Teacher)

20 Semi/Weakly Supervised Learning

21 Self Supervised Learning

What's inside

  • Internal functioning of modern convolutional neural networks
  • Procedural reuse design pattern for CNN architectures
  • Models for mobile and IoT devices
  • Composable design pattern for automatic learning methods
  • Assembling large-scale model deployments
  • Complete code samples and example notebooks
  • Accompanying YouTube videos

About the reader

For machine learning engineers familiar with Python and deep learning.

About the author

Andrew Ferlitsch is an expert on computer vision and deep learning at Google Cloud AI Developer Relations. He was formerly a principal research scientist for 20 years at Sharp Corporation of Japan, where he amassed 115 US patents and worked on emerging technologies in telepresence, augmented reality, digital signage, and autonomous vehicles. In his present role, he reaches out to developer communities, corporations and universities, teaching deep learning and evangelizing Google's AI technologies.
