8.1 The problem - A more complicated alien planet!
8.1.1 Solution - If one line is not enough, use two lines to classify your dataset
8.1.2 Why two lines? Is happiness not linear?
8.1.3 Perceptrons and how to combine them
8.1.4 From discrete perceptrons to continuous perceptrons - A trick to improve our training
8.2 The general scenario - Neural networks
8.2.1 The architecture of a neural network
8.2.2 Bias vs Threshold - Two equivalent ways of describing the constant term in the perceptron
8.3 Training neural networks
8.3.2 Backpropagation - The key step in reducing the error function in order to train the neural network
8.3.3 Potential problems with neural networks - From overfitting to vanishing gradients
8.3.4 Techniques for training your neural network - Dropout, regularization
8.3.5 Different activation functions - Sigmoid, hyperbolic tangent (tanh), and the rectified linear unit (ReLU)
8.3.7 Hyperparameters - What we fine-tune to improve our training
8.3.8 Can neural networks predict values instead of classes? Yes, they can! - Neural networks for regression
8.4 How to code a neural network in Keras
8.4.1 Categorizing our data - A way to turn categorical features into numbers
8.4.2 The architecture of a neural network that we'll use to train this dataset
8.4.3 Defining the model in Keras - Number of layers, size of each layer, and activation functions
8.4.4 Training the model in Keras
8.5 Other more complicated architectures and some sci-fi applications
8.5.1 How neural networks see - Image recognition
8.5.2 How neural networks talk - Natural language processing
8.5.3 How neural networks generate faces that look real - Generative adversarial networks
8.6 Summary