The Resource: Artificial neural networks : an introduction, Kevin L. Priddy and Paul E. Keller, (electronic book)

Label
Artificial neural networks : an introduction
Title
Artificial neural networks
Title remainder
an introduction
Statement of responsibility
Kevin L. Priddy and Paul E. Keller
Language
eng
Summary
This tutorial text gives the reader an understanding of artificial neural networks (ANNs) and their applications, beginning with the biological systems that inspired them, moving through the learning methods that have been developed and the data-collection processes, and ending with the many ways ANNs are used today. The material is presented with a minimum of mathematics (the mathematical details are included in the appendices for interested readers) and a maximum of hands-on experience. All specialized terms are defined in a glossary. The result is a highly readable text that teaches the engineer the guiding principles needed to use and apply artificial neural networks.
Member of
Additional physical form
Also available in print.
Cataloging source
CaBNvSL
Creator name
Priddy, Kevin L
Dewey number
006.3/2
Illustrations
illustrations
Index
index present
LC call number
QA76.87
LC item number
.P736 2005e
Literary form
non fiction
Nature of contents
  • dictionaries
  • bibliography
Related work or contributor name
  • Keller, Paul E
  • Society of Photo-optical Instrumentation Engineers
Series statement
Tutorial texts in optical engineering
Series volume
TT68
Subject name
Neural networks (Computer science)
Target audience
  • adult
  • specialized
Label
Artificial neural networks : an introduction, Kevin L. Priddy and Paul E. Keller, (electronic book)
Instantiates
Publication
Bellingham, Wash. : SPIE, c2005
Note
  • "SPIE digital library."
  • Title from PDF t.p. (viewed on 8/23/09)
Bibliography note
Includes bibliographical references (p. [151]-162) and index
Color
black and white
Contents
  • Chapter 1. Introduction. 1.1. The neuron -- 1.2. Modeling neurons -- 1.3. The feedforward neural network -- 1.4. Historical perspective on computing with artificial neurons
  • Chapter 2. Learning methods. 2.1. Supervised training methods -- 2.2. Unsupervised training methods
  • Chapter 3. Data normalization. 3.1. Statistical or Z-score normalization -- 3.2. Min-max normalization -- 3.3. Sigmoidal or SoftMax normalization -- 3.4. Energy normalization -- 3.5. Principal components normalization
  • Chapter 4. Data collection, preparation, labeling, and input coding. 4.1. Data collection -- 4.2. Feature selection and extraction
  • Chapter 5. Output coding. 5.1. Classifier coding -- 5.2. Estimator coding
  • Chapter 6. Post-processing
  • Chapter 7. Supervised training methods. 7.1. The effects of training data on neural network performance -- 7.2. Rules of thumb for training neural networks -- 7.3. Training and testing
  • Chapter 8. Unsupervised training methods. 8.1. Self-organizing maps (SOMs) -- 8.2. Adaptive resonance theory network
  • Chapter 9. Recurrent neural networks. 9.1. Hopfield neural networks -- 9.2. The bidirectional associative memory (BAM) -- 9.3. The generalized linear neural network -- 9.4. Real-time recurrent network -- 9.5. Elman recurrent network
  • Chapter 10. A plethora of applications. 10.1. Function approximation -- 10.2. Function approximation: Boston housing example -- 10.3. Function approximation: cardiopulmonary modeling -- 10.4. Pattern recognition: tree classifier example -- 10.5. Pattern recognition: handwritten number recognition example -- 10.6. Pattern recognition: electronic nose example -- 10.7. Pattern recognition: airport scanner texture recognition example -- 10.8. Self-organization: serial killer data-mining example -- 10.9. Pulse-coupled neural networks: image segmentation example
  • Chapter 11. Dealing with limited amounts of data. 11.1. K-fold cross-validation -- 11.2. Leave-one-out cross-validation -- 11.3. Jackknife resampling -- 11.4. Bootstrap resampling
  • Appendix A. The feedforward neural network. A.1. Mathematics of the feedforward process -- A.2. The backpropagation algorithm -- A.3. Alternatives to backpropagation
  • Appendix B. Feature saliency
  • Appendix C. Matlab code for various neural networks. C.1. Matlab code for principal components normalization -- C.2. Hopfield network -- C.3. Generalized neural network -- C.4. Generalized neural network example -- C.5. ART-like network -- C.6. Simple perceptron algorithm -- C.7. Kohonen self-organizing feature map
  • Appendix D. Glossary of terms -- References -- Index
Dimensions
unknown
Extent
1 online resource (ix, 165 p. : ill.)
File format
multiple file formats
Form of item
electronic
Isbn
9780819478726
Isbn Type
(ebk)
Other physical details
digital file.
Reformatting quality
access
Reproduction note
Electronic resource.
Specific material designation
remote
System details
System requirements: Adobe Acrobat Reader