000 05630nam a2200241Ia 4500
003 NULRC
005 20250520102828.0
008 250520s2013 nju 000 0 eng d
020 _a9789814522731
040 _cNULRC
050 _aQA 76.87 .G73 2013
100 _aGraupe, Daniel
_eauthor
245 0 _aPrinciples of artificial neural networks /
_cDaniel Graupe.
250 _a3rd edition.
260 _a[New Jersey] :
_bWorld Scientific Publishing Company,
_cc2013
300 _axviii, 363 pages :
_billustrations ;
_c26 cm.
365 _bUSD170.97
504 _aIncludes bibliographical references (pages 349-356) and index.
505 _aCh. 1. Introduction and role of artificial neural networks -- ch. 2. Fundamentals of biological neural networks -- ch. 3. Basic principles of ANNs and their early structures. 3.1. Basic principles of ANN design. 3.2. Basic network structures. 3.3. The Perceptron's input-output principles. 3.4. The Adaline (ALC) -- ch. 4. The Perceptron. 4.1. The basic structure. 4.2. The single-layer representation problem. 4.3. The limitations of the single-layer Perceptron. 4.4. Many-layer Perceptrons. 4.A. Perceptron case study: identifying autoregressive parameters of a signal (AR time series identification) -- ch. 5. The Madaline. 5.1. Madaline training. 5.A. Madaline case study: character recognition -- ch. 6. Back propagation. 6.1. The back propagation learning procedure. 6.2. Derivation of the BP algorithm. 6.3. Modified BP algorithms. 6.A. Back propagation case study: character recognition. 6.B. Back propagation case study: the exclusive-OR (XOR) problem (2-layer BP). 6.C. Back propagation case study: the XOR problem (3-layer BP network). 6.D. Average monthly high and low temperature prediction using backpropagation neural networks -- ch. 7. Hopfield networks. 7.1. Introduction. 7.2. Binary Hopfield networks. 7.3. Setting of weights in Hopfield nets -- bidirectional associative memory (BAM) principle. 7.4. Walsh functions. 7.5. Network stability. 7.6. Summary of the procedure for implementing the Hopfield network. 7.7. Continuous Hopfield models. 7.8. The continuous energy (Lyapunov) function. 7.A. Hopfield network case study: character recognition. 7.B. Hopfield network case study: traveling salesman problem. 7.C. Cell shape detection using neural networks -- ch. 8. Counter propagation. 8.1. Introduction. 8.2. Kohonen self-organizing map (SOM) layer. 8.3. Grossberg layer. 8.4. Training of the Kohonen layer. 8.5. Training of Grossberg layers. 8.6. The combined counter propagation network. 8.A. Counter propagation network case study: character recognition -- ch. 9. Large scale memory storage and retrieval (LAMSTAR) network. 9.1. Motivation. 9.2. Basic principles of the LAMSTAR neural network. 9.3. Detailed outline of the LAMSTAR network. 9.4. Forgetting feature. 9.5. Training vs. operational runs. 9.6. Operation in face of missing data. 9.7. Advanced data analysis capabilities. 9.8. Modified version: normalized weights. 9.9. Concluding comments and discussion of applicability. 9.A. LAMSTAR network case study: character recognition. 9.B. Application to medical diagnosis problems. 9.C. Predicting price movement in market microstructure via LAMSTAR. 9.D. Constellation recognition -- ch. 10. Adaptive resonance theory. 10.1. Motivation. 10.2. The ART network structure. 10.3. Setting-up of the ART network. 10.4. Network operation. 10.5. Properties of ART. 10.6. Discussion and general comments on ART-I and ART-II. 10.A. ART-I network case study: character recognition. 10.B. ART-I case study: speech recognition -- ch. 11. The cognitron and the neocognitron. 11.1. Background of the cognitron. 11.2. The basic principles of the cognitron. 11.3. Network operation. 11.4. Cognitron's network training. 11.5. The neocognitron -- ch. 12. Statistical training. 12.1. Fundamental philosophy. 12.2. Annealing methods. 12.3. Simulated annealing by Boltzmann training of weights. 12.4. Stochastic determination of magnitude of weight change. 12.5. Temperature-equivalent setting. 12.6. Cauchy training of neural network. 12.A. Statistical training case study: a stochastic Hopfield network for character recognition. 12.B. Statistical training case study: identifying AR signal parameters with a stochastic Perceptron model -- ch. 13. Recurrent (time cycling) back propagation networks. 13.1. Recurrent/discrete time networks. 13.2. Fully recurrent networks. 13.3. Continuously recurrent back propagation networks. 13.A. Recurrent back propagation case study: character recognition.
520 _aArtificial neural networks are best suited to problems that are complex, ill-defined, highly nonlinear, stochastic, or involve many disparate variables. Such problems are abundant in medicine, finance, security, and beyond. This volume covers the basic theory and architecture of the major artificial neural networks. Uniquely, it presents 18 complete case studies of neural network applications in various fields, ranging from cell-shape classification to micro-trading in finance to constellation recognition, all with their respective source code. These case studies show the reader in detail how such applications are designed and executed and how their specific results are obtained. The book is written for a one-semester graduate or senior-level undergraduate course on artificial neural networks. It is also intended as a self-study and reference text for scientists, engineers, and researchers in medicine, finance, and data mining.
650 _aNEURAL NETWORKS (COMPUTER SCIENCE)
942 _2lcc
_cBK
999 _c16389
_d16389