TensorFlow for Deep Learning

Table of contents

  1. Preface
    1. Conventions Used in This Book
    2. Using Code Examples
    3. O’Reilly Safari
    4. How to Contact Us
    5. Acknowledgments
  2. 1. Introduction to Deep Learning
    1. Machine Learning Eats Computer Science
    2. Deep Learning Primitives
      1. Fully Connected Layer
      2. Convolutional Layer
      3. Recurrent Neural Network Layers
      4. Long Short-Term Memory Cells
    3. Deep Learning Architectures
      1. LeNet
      2. AlexNet
      3. ResNet
      4. Neural Captioning Model
      5. Google Neural Machine Translation
      6. One-Shot Models
      7. AlphaGo
      8. Generative Adversarial Networks
      9. Neural Turing Machines
    4. Deep Learning Frameworks
      1. Limitations of TensorFlow
    5. Review
  3. 2. Introduction to TensorFlow Primitives
    1. Introducing Tensors
      1. Scalars, Vectors, and Matrices
      2. Matrix Mathematics
      3. Tensors
      4. Tensors in Physics
      5. Mathematical Asides
    2. Basic Computations in TensorFlow
      1. Installing TensorFlow and Getting Started
      2. Initializing Constant Tensors
      3. Sampling Random Tensors
      4. Tensor Addition and Scaling
      5. Matrix Operations
      6. Tensor Types
      7. Tensor Shape Manipulations
      8. Introduction to Broadcasting
    3. Imperative and Declarative Programming
      1. TensorFlow Graphs
      2. TensorFlow Sessions
      3. TensorFlow Variables
    4. Review
  4. 3. Linear and Logistic Regression with TensorFlow
    1. Mathematical Review
      1. Functions and Differentiability
      2. Loss Functions
      3. Gradient Descent
      4. Automatic Differentiation Systems
    2. Learning with TensorFlow
      1. Creating Toy Datasets
      2. New TensorFlow Concepts
    3. Training Linear and Logistic Models in TensorFlow
      1. Linear Regression in TensorFlow
      2. Logistic Regression in TensorFlow
    4. Review
  5. 4. Fully Connected Deep Networks
    1. What Is a Fully Connected Deep Network?
    2. “Neurons” in Fully Connected Networks
      1. Learning Fully Connected Networks with Backpropagation
      2. Universal Convergence Theorem
      3. Why Deep Networks?
    3. Training Fully Connected Neural Networks
      1. Learnable Representations
      2. Activations
      3. Fully Connected Networks Memorize
      4. Regularization
      5. Training Fully Connected Networks
    4. Implementation in TensorFlow
      1. Installing DeepChem
      2. Tox21 Dataset
      3. Accepting Minibatches of Placeholders
      4. Implementing a Hidden Layer
      5. Adding Dropout to a Hidden Layer
      6. Implementing Minibatching
      7. Evaluating Model Accuracy
      8. Using TensorBoard to Track Model Convergence
    5. Review
  6. 5. Hyperparameter Optimization
    1. Model Evaluation and Hyperparameter Optimization
    2. Metrics, Metrics, Metrics
      1. Binary Classification Metrics
      2. Multiclass Classification Metrics
      3. Regression Metrics
    3. Hyperparameter Optimization Algorithms
      1. Setting Up a Baseline
      2. Graduate Student Descent
      3. Grid Search
      4. Random Hyperparameter Search
      5. Challenge for the Reader
    4. Review
  7. 6. Convolutional Neural Networks
    1. Introduction to Convolutional Architectures
      1. Local Receptive Fields
      2. Convolutional Kernels
      3. Pooling Layers
      4. Constructing Convolutional Networks
      5. Dilated Convolutions
    2. Applications of Convolutional Networks
      1. Object Detection and Localization
      2. Image Segmentation
      3. Graph Convolutions
      4. Generating Images with Variational Autoencoders
    3. Training a Convolutional Network in TensorFlow
      1. The MNIST Dataset
      2. Loading MNIST
      3. TensorFlow Convolutional Primitives
      4. The Convolutional Architecture
      5. Evaluating Trained Models
      6. Challenge for the Reader
    4. Review
  8. 7. Recurrent Neural Networks
    1. Overview of Recurrent Architectures
    2. Recurrent Cells
      1. Long Short-Term Memory (LSTM)
      2. Gated Recurrent Units (GRU)
    3. Applications of Recurrent Models
      1. Sampling from Recurrent Networks
      2. Seq2seq Models
    4. Neural Turing Machines
    5. Working with Recurrent Neural Networks in Practice
    6. Processing the Penn Treebank Corpus
      1. Code for Preprocessing
      2. Loading Data into TensorFlow
      3. The Basic Recurrent Architecture
      4. Challenge for the Reader
    7. Review
  9. 8. Reinforcement Learning
    1. Markov Decision Processes
    2. Reinforcement Learning Algorithms
      1. Q-Learning
      2. Policy Learning
      3. Asynchronous Training
    3. Limits of Reinforcement Learning
    4. Playing Tic-Tac-Toe
      1. Object Orientation
      2. Abstract Environment
      3. Tic-Tac-Toe Environment
      4. The Layer Abstraction
      5. Defining a Graph of Layers
    5. The A3C Algorithm
      1. The A3C Loss Function
      2. Defining Workers
      3. Training the Policy
      4. Challenge for the Reader
    6. Review
  10. 9. Training Large Deep Networks
    1. Custom Hardware for Deep Networks
      1. CPU Training
      2. GPU Training
      3. Tensor Processing Units
      4. Field Programmable Gate Arrays
      5. Neuromorphic Chips
    2. Distributed Deep Network Training
      1. Data Parallelism
      2. Model Parallelism
    3. Data Parallel Training with Multiple GPUs on Cifar10
      1. Downloading and Loading the Data
      2. Deep Dive on the Architecture
      3. Training on Multiple GPUs
      4. Challenge for the Reader
    4. Review
  11. 10. The Future of Deep Learning
    1. Deep Learning Outside the Tech Industry
      1. Deep Learning in the Pharmaceutical Industry
      2. Deep Learning in Law
      3. Deep Learning for Robotics
      4. Deep Learning in Agriculture
    2. Using Deep Learning Ethically
    3. Is Artificial General Intelligence Imminent?
    4. Where to Go from Here?
  12. Index

Product information

  • Title: TensorFlow for Deep Learning
  • Publisher(s): O'Reilly Media, Inc.