Low price
CHF 79.20
Print on demand - the copy will be sourced for you.
This book describes how neural networks operate from the mathematical point of view. As a result, neural networks can be interpreted both as universal function approximators and as information processors. The book bridges the gap between the ideas and concepts of neural networks, which are nowadays used at an intuitive level, and the precise modern mathematical language, presenting the best practices of the former while enjoying the robustness and elegance of the latter.
This book can be used in a graduate course in deep learning, with the first few parts being accessible to senior undergraduates. In addition, the book will be of wide interest to machine learning researchers who are interested in a theoretical understanding of the subject.
Contains a fair number of end-of-chapter exercises
Full solutions provided to all exercises
Appendices covering topics needed for the book's exposition
Author
Ovidiu Calin, a graduate of the University of Toronto, is a professor at Eastern Michigan University and a former visiting professor at Princeton University and the University of Notre Dame. He has delivered numerous lectures at universities in Japan, Hong Kong, Taiwan, and Kuwait over the last 15 years. His publications include over 60 articles and 8 books in the fields of machine learning, computational finance, stochastic processes, variational calculus, and geometric analysis.
Contents
Introductory Problems
Activation Functions
Cost Functions
Finding Minima Algorithms
Abstract Neurons
Neural Networks
Approximation Theorems
Learning with One-dimensional Inputs
Universal Approximators
Exact Learning
Information Representation
Information Capacity Assessment
Output Manifolds
Neuromanifolds
Pooling
Convolutional Networks
Recurrent Neural Networks
Classification
Generative Models
Stochastic Networks
Hints and Solutions