The Resource Applied deep learning : a case-based approach to understanding deep neural networks, Umberto Michelucci

Label
Applied deep learning : a case-based approach to understanding deep neural networks
Title
Applied deep learning
Title remainder
a case-based approach to understanding deep neural networks
Statement of responsibility
Umberto Michelucci
Creator
Author
Subject
Language
eng
Summary
Work with advanced topics in deep learning, such as optimization algorithms, hyperparameter tuning, dropout, and error analysis, as well as strategies to address typical problems encountered when training deep neural networks. You'll begin by studying activation functions, mostly with a single neuron (ReLU, sigmoid, and Swish), seeing how to perform linear and logistic regression using TensorFlow, and choosing the right cost function. The next section covers more complicated neural network architectures with several layers and neurons and explores the problem of random initialization of weights. An entire chapter is dedicated to a complete overview of neural network error analysis, giving examples of solving problems originating from variance, bias, overfitting, and datasets coming from different distributions. Applied Deep Learning also shows how to implement logistic regression completely from scratch, without using any Python library except NumPy, to let you appreciate how libraries such as TensorFlow enable quick and efficient experiments. Case studies for each method put the theory into practice, and you'll discover tips and tricks for writing optimized Python code (for example, vectorizing loops with NumPy). What You Will Learn: implement advanced techniques correctly in Python and TensorFlow; debug and optimize advanced methods such as dropout and regularization; carry out error analysis to recognize whether you have a bias problem, a variance problem, a data-mismatch problem, and so on; set up a machine learning project focused on deep learning on a complex dataset. Who This Book Is For: readers with an intermediate understanding of machine learning, linear algebra, calculus, and basic Python programming
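The summary above mentions vectorizing loops with NumPy as one of the book's Python optimization tips. As a minimal illustrative sketch (not taken from the book; the toy data and names here are my own), the output of a single sigmoid neuron over a dataset can be computed with an explicit loop or with one vectorized matrix-vector product:

```python
import numpy as np

def sigmoid(z):
    """Sigmoid activation, applied element-wise."""
    return 1.0 / (1.0 + np.exp(-z))

# Toy inputs: 5 observations with 3 features each, plus weights and a bias.
rng = np.random.default_rng(0)
X = rng.standard_normal((5, 3))
w = rng.standard_normal(3)
b = 0.1

# Loop version: one neuron output per observation.
loop_out = np.array([sigmoid(X[i] @ w + b) for i in range(X.shape[0])])

# Vectorized version: a single matrix-vector product replaces the loop.
vec_out = sigmoid(X @ w + b)

# Both give the same result; the vectorized form avoids Python-level looping.
assert np.allclose(loop_out, vec_out)
```

On realistic dataset sizes the vectorized form is typically orders of magnitude faster, since the loop runs in compiled NumPy code rather than the Python interpreter.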
Cataloging source
N$T
http://library.link/vocab/creatorName
Michelucci, Umberto
Dewey number
006.3/1
Index
index present
LC call number
Q325.5
Literary form
non fiction
Nature of contents
dictionaries
http://library.link/vocab/subjectName
  • Machine learning
  • Neural networks (Computer science)
Label
Applied deep learning : a case-based approach to understanding deep neural networks, Umberto Michelucci
Instantiates
Publication
Distribution
Copyright
Note
Includes index
Antecedent source
unknown
Carrier category
online resource
Carrier category code
cr
Carrier MARC source
rdacarrier
Color
multicolored
Content category
text
Content type code
txt
Content type MARC source
rdacontent
Contents
  • Intro; Table of Contents; About the Author; About the Technical Reviewer; Acknowledgments; Introduction; Chapter 1: Computational Graphs and TensorFlow; How to Set Up Your Python Environment; Creating an Environment; Installing TensorFlow; Jupyter Notebooks; Basic Introduction to TensorFlow; Computational Graphs; Tensors; Creating and Running a Computational Graph; Computational Graph with tf.constant; Computational Graph with tf.Variable; Computational Graph with tf.placeholder; Differences Between run and eval; Dependencies Between Nodes; Tips on How to Create and Close a Session
  • Chapter 2: Single Neuron; The Structure of a Neuron; Matrix Notation; Python Implementation Tip: Loops and NumPy; Activation Functions; Identity Function; Sigmoid Function; Tanh (Hyperbolic Tangent Activation) Function; ReLU (Rectified Linear Unit) Activation Function; Leaky ReLU; Swish Activation Function; Other Activation Functions; Cost Function and Gradient Descent: The Quirks of the Learning Rate; Learning Rate in a Practical Example; Example of Linear Regression in TensorFlow; Dataset for Our Linear Regression Model; Neuron and Cost Function for Linear Regression
  • Satisficing and Optimizing a Metric; Example of Logistic Regression; Cost Function; Activation Function; The Dataset; TensorFlow Implementation; References; Chapter 3: Feedforward Neural Networks; Network Architecture; Output of Neurons; Summary of Matrix Dimensions; Example: Equations for a Network with Three Layers; Hyperparameters in Fully Connected Networks; softmax Function for Multiclass Classification; A Brief Digression: Overfitting; A Practical Example of Overfitting; Basic Error Analysis; The Zalando Dataset; Building a Model with TensorFlow; Network Architecture
  • Modifying Labels for the softmax Function: One-Hot Encoding; The TensorFlow Model; Gradient Descent Variations; Batch Gradient Descent; Stochastic Gradient Descent; Mini-Batch Gradient Descent; Comparison of the Variations; Examples of Wrong Predictions; Weight Initialization; Adding Many Layers Efficiently; Advantages of Additional Hidden Layers; Comparing Different Networks; Tips for Choosing the Right Network; Chapter 4: Training Neural Networks; Dynamic Learning Rate Decay; Iterations or Epochs?; Staircase Decay; Step Decay; Inverse Time Decay; Exponential Decay; Natural Exponential Decay
  • TensorFlow Implementation; Applying the Methods to the Zalando Dataset; Common Optimizers; Exponentially Weighted Averages; Momentum; RMSProp; Adam; Which Optimizer Should I Use?; Example of Self-Developed Optimizer; Chapter 5: Regularization; Complex Networks and Overfitting; What Is Regularization?; About Network Complexity; lp Norm; l2 Regularization; Theory of l2 Regularization; TensorFlow Implementation; l1 Regularization; Theory of l1 Regularization and TensorFlow Implementation; Are Weights Really Going to Zero?; Dropout; Early Stopping; Additional Methods; Chapter 6: Metric Analysis
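Chapter 4 in the contents above covers several dynamic learning rate decay schedules, including exponential decay. As a hedged sketch of the general idea (the function name and numeric settings here are illustrative, not the book's), exponential decay shrinks the learning rate by a fixed factor every fixed number of steps:

```python
def exponential_decay(lr0, decay_rate, step, decay_steps):
    """Exponentially decayed learning rate: lr0 * decay_rate ** (step / decay_steps)."""
    return lr0 * decay_rate ** (step / decay_steps)

# Hypothetical settings: start at 0.1 and halve the rate every 1000 steps.
lrs = [exponential_decay(0.1, 0.5, s, 1000) for s in (0, 1000, 2000)]
# lrs == [0.1, 0.05, 0.025]
```

The other schedules listed (staircase, step, inverse time, natural exponential) differ only in the formula applied to the step counter.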
Dimensions
unknown
Extent
1 online resource.
File format
unknown
Form of item
online
Isbn
9781484237892
Level of compression
unknown
Media category
computer
Media MARC source
rdamedia
Media type code
c
Quality assurance targets
not applicable
Reformatting quality
unknown
Sound
unknown sound
Specific material designation
remote
