Low price
CHF 109.60
This copy will be sourced for you.
No right of return!
This is the first text on pattern recognition to present the Bayesian viewpoint, one that has become increasingly popular in the last five years. It presents approximate inference algorithms that permit fast approximate answers in situations where exact answers are not feasible. It is the first text to use graphical models to describe probability distributions, at a time when no other books apply graphical models to machine learning. It is also the first four-color book on pattern recognition. The book is suitable for courses on machine learning, statistics, computer science, signal processing, computer vision, data mining, and bioinformatics. Extensive support is provided for course instructors, including more than 400 exercises, graded according to difficulty. Example solutions for a subset of the exercises are available from the book web site, while solutions for the remainder can be obtained by instructors from the publisher.
Author
Chris Bishop is a Microsoft Distinguished Scientist and the Laboratory Director at Microsoft Research Cambridge. He is also Professor of Computer Science at the University of Edinburgh, and a Fellow of Darwin College, Cambridge. In 2004, he was elected Fellow of the Royal Academy of Engineering, and in 2007 he was elected Fellow of the Royal Society of Edinburgh.
Chris obtained a BA in Physics from Oxford, and a PhD in Theoretical Physics from the University of Edinburgh, with a thesis on quantum field theory. He then joined Culham Laboratory where he worked on the theory of magnetically confined plasmas as part of the European controlled fusion programme.
Flap text
The dramatic growth in practical applications for machine learning over the last ten years has been accompanied by many important developments in the underlying algorithms and techniques. For example, Bayesian methods have grown from a specialist niche to become mainstream, while graphical models have emerged as a general framework for describing and applying probabilistic techniques. The practical applicability of Bayesian methods has been greatly enhanced by the development of a range of approximate inference algorithms such as variational Bayes and expectation propagation, while new models based on kernels have had a significant impact on both algorithms and applications.
This completely new textbook reflects these recent developments while providing a comprehensive introduction to the fields of pattern recognition and machine learning. It is aimed at advanced undergraduates or first-year PhD students, as well as researchers and practitioners. No previous knowledge of pattern recognition or machine learning concepts is assumed. Familiarity with multivariate calculus and basic linear algebra is required, and some experience in the use of probabilities would be helpful though not essential as the book includes a self-contained introduction to basic probability theory.
The book is suitable for courses on machine learning, statistics, computer science, signal processing, computer vision, data mining, and bioinformatics. Extensive support is provided for course instructors, including more than 400 exercises, graded according to difficulty. Example solutions for a subset of the exercises are available from the book web site, while solutions for the remainder can be obtained by instructors from the publisher. The book is supported by a great deal of additional material, and the reader is encouraged to visit the book web site for the latest information.
Christopher M. Bishop is Deputy Director of Microsoft Research Cambridge, and holds a Chair in Computer Science at the University of Edinburgh. He is a Fellow of Darwin College, Cambridge, a Fellow of the Royal Academy of Engineering, and a Fellow of the Royal Society of Edinburgh. His previous textbook "Neural Networks for Pattern Recognition" has been widely adopted.
Coming soon:
*For students, worked solutions to a subset of exercises available on a public web site (for exercises marked "www" in the text)
*For instructors, worked solutions to remaining exercises from the Springer web site
*Lecture slides to accompany each chapter
*Data sets available for download
Contents
Introduction.- Probability Distributions.- Linear Models for Regression.- Linear Models for Classification.- Neural Networks.- Kernel Methods.- Sparse Kernel Machines.- Graphical Models.- Mixture Models and EM.- Approximate Inference.- Sampling Methods.- Continuous Latent Variables.- Sequential Data.- Combining Models.