Probabilistic Deep Learning with Python
Oliver Dürr, Beate Sick, Elvis Murina
  • MEAP began June 2019
  • Publication in Summer 2020 (estimated)
  • ISBN 9781617296079
  • 225 pages (estimated)
  • printed in black & white

A fresh approach to deep learning.

Al Krinker
Probabilistic Deep Learning with Python teaches the increasingly popular probabilistic approach to deep learning, which lets you tune and refine your results more quickly and accurately without as much trial-and-error testing. Emphasizing practical techniques that use the Python-based TensorFlow Probability framework, you’ll learn to build high-performance deep learning applications that can reliably handle the noise and uncertainty of real-world data.
Table of Contents

Part 1: First Steps

1 Introduction to deep learning

1.1 A first brief look at deep learning

1.1.1 A success story

1.1.2 A first example: face recognition (DL vs. traditional methods)

1.2 The principles of curve fitting

1.3 When to use and when not to use DL?

1.3.1 When not to use DL

1.3.2 When to use DL

1.4 What you’ll learn in this book

1.5 Summary

2 Neural network architectures

2.1 Fully connected neural networks

2.1.1 The biology that inspired the design of artificial NNs

2.1.2 Getting started with implementing an NN

2.1.3 Using a fully connected NN to classify images

2.2 2D convolutional NNs for image-like data

2.2.1 Main ideas in a CNN architecture

2.2.2 A minimal CNN for edge lovers

2.2.3 Biological inspiration for a CNN architecture

2.2.4 Building and understanding a CNN

2.3 One-dimensional CNNs for ordered data

2.3.1 Format of time-ordered data

2.3.2 What’s special about ordered data?

2.3.3 Architectures for time-ordered data

2.4 Summary

Part 2: Foundations of deep learning

3 Principles of curve fitting

3.1 “Hello world” in curve fitting

3.1.1 Fitting a linear regression model based on a loss function

3.2 Gradient descent method

3.2.1 Loss with one free model parameter

3.2.2 Loss with two free model parameters

3.3 Special DL sauce

3.3.1 Mini-batch gradient descent

3.3.2 Using SGD variants to speed up the learning

3.3.3 Automatic differentiation

3.4 Backpropagation in DL frameworks

3.4.1 Static graph frameworks

3.4.2 Dynamic graph frameworks

3.5 Summary

4 Building loss functions using statistical learning principles

5 Mastering nonstandard problems

6 Getting reliable predictions

7 Assigning uncertainties

About the Technology

Probabilistic deep learning models are better suited to dealing with the noise and uncertainty of real-world data, a crucial factor for self-driving cars, scientific research, finance, and other accuracy-critical applications. By using probabilistic techniques, deep learning engineers can judge how reliable their results are and gain a better understanding of how their algorithms function.

About the book

Probabilistic Deep Learning with Python shows how probabilistic deep learning models give you the tools to identify and account for uncertainty and potential errors in your results. Starting by applying the maximum likelihood principle that underlies curve fitting to deep learning, you’ll move on to the Python-based TensorFlow Probability framework and set up Bayesian neural networks that can state their own uncertainty. Hands-on code examples and illustrative Jupyter notebooks keep you focused on the practical applications of the abstract-but-powerful concepts of probabilistic deep learning. By the time you’re done, you’ll be able to build high-performance applications that account for inaccuracies without constantly running and re-running your models.
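The maximum likelihood principle mentioned above can be sketched in a few lines of plain Python (a hypothetical toy example, not code from the book): assuming Gaussian noise with a fixed spread, maximizing the likelihood of a linear model is equivalent to minimizing the familiar sum of squared errors.

```python
import math

# Toy data, roughly y = 2*x + 1 plus a little noise (made-up values)
xs = [0.0, 1.0, 2.0, 3.0, 4.0]
ys = [1.1, 2.9, 5.2, 6.8, 9.1]

def neg_log_likelihood(a, b, sigma=1.0):
    """Negative log-likelihood of the data under y ~ Normal(a*x + b, sigma)."""
    nll = 0.0
    for x, y in zip(xs, ys):
        mu = a * x + b
        nll += 0.5 * math.log(2 * math.pi * sigma**2) + (y - mu)**2 / (2 * sigma**2)
    return nll

def sum_squared_errors(a, b):
    """The classic least-squares criterion for the same linear model."""
    return sum((y - (a * x + b))**2 for x, y in zip(xs, ys))

# A coarse grid search over slope a and intercept b finds the same minimizer
# for both criteria: with fixed sigma, the NLL is a monotone function of the SSE.
grid = [i / 10 for i in range(-50, 51)]
best_nll = min(((a, b) for a in grid for b in grid), key=lambda p: neg_log_likelihood(*p))
best_sse = min(((a, b) for a in grid for b in grid), key=lambda p: sum_squared_errors(*p))
print(best_nll == best_sse)  # True: both criteria pick the same (a, b)
```

On this data both searches land on a slope near 2 and an intercept near 1, which is why least-squares fitting is the "hello world" of the maximum likelihood story the book builds on.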

What's inside

  • The maximum likelihood principle that underlies deep learning applications
  • Probabilistic DL models that can indicate the distribution of possible outcomes
  • Bayesian deep learning models that make it possible to capture the uncertainty occurring in real-world applications
  • Normalizing flows for modeling and generating complex distributions, such as images of human faces
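To give a flavor of the first two bullets: a probabilistic model returns a whole distribution rather than a single number. The sketch below is a hand-rolled illustration (not book code and not the TensorFlow Probability API); it treats a prediction as a Gaussian with hypothetical parameters and reads an interval and a tail probability off it.

```python
import math

def normal_cdf(x, mu, sigma):
    """Cumulative distribution function of a Normal(mu, sigma)."""
    return 0.5 * (1.0 + math.erf((x - mu) / (sigma * math.sqrt(2.0))))

# A point prediction says only "the outcome will be 300"; a probabilistic
# prediction says "Normal(mu=300, sigma=25)", which also quantifies uncertainty.
mu, sigma = 300.0, 25.0

# Central 95% interval: mu +/- 1.96 * sigma
lo, hi = mu - 1.96 * sigma, mu + 1.96 * sigma

# Probability the outcome exceeds 350 -- a question a point estimate cannot answer
p_above_350 = 1.0 - normal_cdf(350.0, mu, sigma)
print(f"95% interval: [{lo:.0f}, {hi:.0f}], P(y > 350) = {p_above_350:.3f}")
```

Bayesian deep learning models extend this idea by also putting distributions over the network's weights, so the stated uncertainty reflects both noisy data and limited training data.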

About the reader

Aimed at readers experienced in developing machine learning or deep learning applications.

About the authors

Oliver Dürr is a professor of data science at the University of Applied Sciences in Konstanz, Germany. Beate Sick holds a chair in applied statistics at ZHAW and works as a researcher and lecturer at the University of Zurich and as a lecturer at ETH Zurich. Elvis Murina is a research assistant, responsible for the extensive exercises that accompany this book.

Dürr and Sick are both experts in machine learning and statistics. They have supervised numerous bachelor's, master's, and PhD theses on the topic of deep learning, and have planned and taught several postgraduate and master's-level deep learning courses. All three authors have worked with deep learning methods since 2013 and have extensive experience in both teaching the topic and developing probabilistic deep learning models.

Manning Early Access Program (MEAP) Read chapters as they are written, get the finished eBook as soon as it’s ready, and receive the pBook long before it's in bookstores.
MEAP combo $49.99 pBook + eBook + liveBook
MEAP eBook $39.99 pdf + ePub + kindle + liveBook