## ARTIFICIAL NEURAL NETWORKS

Designed as an introductory-level textbook on Artificial Neural Networks for postgraduate and senior undergraduate students in any branch of engineering, this self-contained and well-organized book highlights the need for new models of computing based on the fundamental principles of neural networks. Professor Yegnanarayana compresses into a single volume his many years of experience in teaching and research in speech processing, image processing, artificial intelligence, and neural networks. He gives a masterly analysis of topics such as the basics of artificial neural networks, functional units of artificial neural networks for pattern recognition tasks, feedforward and feedback neural networks, and architectures for complex pattern recognition tasks. Throughout, the emphasis is on the pattern-processing capability of neural networks. In addition, the presentation of real-world applications lends a practical thrust to the discussion.
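The blurb's emphasis on feedforward networks for pattern recognition tasks can be illustrated with a minimal perceptron learning sketch. This is not taken from the book; the function name, data, and parameter values are illustrative, and the update rule shown is the standard perceptron learning law.

```python
import numpy as np

def train_perceptron(X, y, eta=0.1, epochs=50):
    """Perceptron learning law: w <- w + eta * (target - output) * x."""
    w = np.zeros(X.shape[1] + 1)          # weight vector, last entry is the bias
    for _ in range(epochs):
        errors = 0
        for x, target in zip(X, y):
            xa = np.append(x, 1.0)        # augment input with a constant bias input
            output = 1 if w @ xa > 0 else 0
            if output != target:
                w += eta * (target - output) * xa
                errors += 1
        if errors == 0:                   # converged: training set fully separated
            break
    return w

# Toy linearly separable task: logical AND of two binary inputs
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]])
y = np.array([0, 0, 0, 1])
w = train_perceptron(X, y)
preds = [1 if w @ np.append(x, 1.0) > 0 else 0 for x in X]
print(preds)  # → [0, 0, 0, 1]
```

For a linearly separable training set like this one, the perceptron convergence theorem guarantees that the loop terminates with zero errors.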


### Contents

| Section | Pages |
| --- | --- |
| BASICS OF ARTIFICIAL NEURAL NETWORKS | 15–39 |
| ACTIVATION AND SYNAPTIC DYNAMICS | 40–75 |
| FUNCTIONAL UNITS OF ANN FOR PATTERN RECOGNITION TASKS | 76–87 |
| FEEDFORWARD NEURAL NETWORKS | 88–141 |
| FEEDBACK NEURAL NETWORKS | 142–200 |
| COMPETITIVE LEARNING NEURAL NETWORKS | 201–232 |
| ARCHITECTURES FOR COMPLEX PATTERN RECOGNITION TASKS | 233–277 |
| APPLICATIONS OF ANN | 278–339 |
| Appendices | 341–397 |
| Bibliography | 399–431 |

### Common terms and phrases

activation dynamics, activation value, applications, architecture, artificial neural networks, associative memory, autoassociative, backpropagation, backpropagation learning, binary, biological neural network, Boltzmann learning, Boltzmann machine, competitive learning, computing, connections, constraint, convergence, corresponding, desired output, discussed, energy landscape, equations, error surface, external input, feature mapping, feedback layer, feedback network, feedforward network, feedforward neural network, fuzzy, gradient descent, Hebbian learning, hidden layer, hidden units, Hopfield, IEEE, input data, input layer, input pattern, input vector, input-output pattern pairs, instar, ith unit, learning algorithm, learning law, learning rate parameter, linear, method, minima, neural network models, neuron, nonlinear, optimization, output function, output layer, output pattern, output units, pattern association, pattern mapping, pattern recognition tasks, pattern storage, perceptron learning, performance, pixel, principal component, probability distribution, processing units, radial basis function, random, recall, shown in Figure, simulated annealing, stable, supervised learning, synaptic dynamics, temperature, term, training set, VC dimension, weight matrix, weight vector
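Several of the terms above (associative memory, Hebbian learning, weight matrix, recall, stable) come together in Hopfield-style pattern storage. The following is a minimal sketch, not the book's treatment: one bipolar pattern is stored with the Hebbian outer-product rule and then recalled from a corrupted probe. The pattern and update schedule are illustrative choices.

```python
import numpy as np

# Store one bipolar pattern with the Hebbian outer-product rule
pattern = np.array([1, -1, 1, -1, 1, -1, 1, -1])
W = np.outer(pattern, pattern).astype(float)
np.fill_diagonal(W, 0.0)                 # no self-connections

probe = pattern.copy()
probe[0] *= -1                           # flip two bits to corrupt the probe
probe[3] *= -1

state = probe.copy()
for _ in range(5):                       # synchronous updates until stable
    new_state = np.where(W @ state >= 0, 1, -1)
    if np.array_equal(new_state, state): # stable state reached
        break
    state = new_state

print(np.array_equal(state, pattern))    # → True: stored pattern recalled
```

With a single stored pattern the weight matrix has that pattern as a stable state, so the corrupted probe is pulled back to it in one update.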

### Popular passages

Page 406 - "Pattern classification and scene analysis", New York: Wiley, 1973.

Page 406 - Burke, B. (1989). Hardware architecture of a neural network model simulating pattern recognition by the olfactory bulb.

Page 425 - "Neurocomputing," in Artificial Neural Networks: Paradigms, Applications, and Hardware Implementations, E. Sanchez-Sinencio and C. Lau, Eds., IEEE Press, 1992, pp. 344–363. [69] E. Vittoz, H. Oguey, M. A. Maher, O. Nys, E. Dijkstra, and M. Chevroulet, "Analog Storage of Adjustable Synaptic Weights," in VLSI Design of Neural Networks, Norwell, MA: Kluwer Academic, pp. 47–63, 1991.

Page 420 - A. F. Rocha, Fuzzy-set based models of neurons and knowledge-based networks, IEEE Trans. on Fuzzy Systems, 1, 1993, 254–266.

Page 420 - "Chaos - A Tutorial for Engineers," Proceedings of the IEEE, vol.