Pattern Recognition and Machine Learning
Information Science and Statistics

Akaike and Kitagawa: The Practice of Time Series Analysis
Bishop: Pattern Recognition and Machine Learning
Cowell, Dawid, Lauritzen, and Spiegelhalter: Probabilistic Networks and Expert Systems
Doucet, de Freitas, and Gordon: Sequential Monte Carlo Methods in Practice
Fine: Feedforward Neural Network Methodology
Hawkins and Olwell: Cumulative Sum Charts and Charting for Quality Improvement
Jensen: Bayesian Networks and Decision Graphs
Marchette: Computer Intrusion Detection and Network Monitoring: A Statistical Viewpoint
Rubinstein and Kroese: The Cross-Entropy Method: A Unified Approach to Combinatorial Optimization, Monte Carlo Simulation, and Machine Learning
Studeny: Probabilistic Conditional Independence Structures
Vapnik: The Nature of Statistical Learning Theory, Second Edition
Wallace: Statistical and Inductive Inference by Minimum Message Length

Christopher M. Bishop

Pattern Recognition and Machine Learning

Springer

Christopher M. Bishop FREng
Assistant Director
Microsoft Research Ltd
Cambridge CB3 0FB, U.K.
cmbishop@microsoft.com
http://research.microsoft.com/~cmbishop

Series Editors:
Michael Jordan, Department of Computer Science and Department of Statistics, University of California, Berkeley, Berkeley, CA 94720, USA
Professor Jon Kleinberg, Department of Computer Science, Cornell University, Ithaca, NY 14853, USA
Bernhard Schölkopf, Max Planck Institute for Biological Cybernetics, Spemannstrasse 38, 72076 Tübingen, Germany

Library of Congress Control Number: 2006922522

ISBN-10: 0-387-31073-8
ISBN-13: 978-0-387-31073-2

Printed on acid-free paper.

© 2006 Springer Science+Business Media, LLC

All rights reserved. This work may not be translated or copied in whole or in part without the written permission of the publisher (Springer Science+Business Media, LLC, 233 Spring Street, New York, NY 10013, USA), except for brief excerpts in connection with reviews or scholarly analysis.
Use in connection with any form of information storage and retrieval, electronic adaptation, computer software, or by similar or dissimilar methodology now known or hereafter developed is forbidden.

The use in this publication of trade names, trademarks, service marks, and similar terms, even if they are not identified as such, is not to be taken as an expression of opinion as to whether or not they are subject to proprietary rights.

Printed in Singapore.

springer.com

This book is dedicated to my family: Jenna, Mark, and Hugh

Total eclipse of the sun, Antalya, Turkey, 29 March 2006

Preface

Pattern recognition has its origins in engineering, whereas machine learning grew out of computer science. However, these activities can be viewed as two facets of the same field, and together they have undergone substantial development over the past ten years. In particular, Bayesian methods have grown from a specialist niche to become mainstream, while graphical models have emerged as a general framework for describing and applying probabilistic models. Also, the practical applicability of Bayesian methods has been greatly enhanced through the development of a range of approximate inference algorithms such as variational Bayes and expectation propagation. Similarly, new models based on kernels have had significant impact on both algorithms and applications.

This new textbook reflects these recent developments while providing a comprehensive introduction to the fields of pattern recognition and machine learning. It is aimed at advanced undergraduates or first-year PhD students, as well as researchers and practitioners, and assumes no previous knowledge of pattern recognition or machine learning concepts.
Knowledge of multivariate calculus and basic linear algebra is required, and some familiarity with probabilities would be helpful though not essential, as the book includes a self-contained introduction to basic probability theory.

Because this book has broad scope, it is impossible to provide a complete list of references, and in particular no attempt has been made to provide accurate historical attribution of ideas. Instead, the aim has been to give references that offer greater detail than is possible here and that hopefully provide entry points into what, in some cases, is a very extensive literature. For this reason the references are often to more recent textbooks and review articles rather than to original sources.

The book is supported by a great deal of additional material, including lecture slides as well as the complete set of figures used in the book, and the reader is encouraged to visit the book web site for the latest information:

http://research.microsoft.com/~cmbishop/prml

Exercises

The exercises that appear at the end of every chapter form an important component of the book. Each exercise has been carefully chosen to reinforce concepts explained in the text or to develop and generalize them in significant ways, and each is graded according to difficulty ranging from (*), which denotes a simple exercise taking a few minutes to complete, through to (***), which denotes a significantly more complex exercise.

It has been difficult to know to what extent these solutions should be made widely available. Those engaged in self-study will find worked solutions very beneficial, whereas many course tutors request that solutions be available only via the publisher so that the exercises may be used in class.
In order to try to meet these conflicting requirements, those exercises that help amplify key points in the text, or that fill in important details, have solutions that are available as a PDF file from the book web site. Such exercises are denoted by WWW. Solutions for the remaining exercises are available to course tutors by contacting the publisher (contact details are given on the book web site). Readers are strongly encouraged to work through the exercises unaided, and to turn to the solutions only as required.

Although this book focuses on concepts and principles, in a taught course the students should ideally have the opportunity to experiment with some of the key algorithms using appropriate data sets. A companion volume (Bishop and Nabney, 2008) will deal with practical aspects of pattern recognition and machine learning, and will be accompanied by Matlab software implementing most of the algorithms discussed in this book.

Acknowledgements

First of all I would like to express my sincere thanks to Markus Svensén, who has provided immense help with preparation of figures and with the typesetting of the book in LaTeX. His assistance has been invaluable.

I am very grateful to Microsoft Research for providing a highly stimulating research environment and for giving me the freedom to write this book (the views and opinions expressed in this book, however, are my own and are therefore not necessarily the same as those of Microsoft or its affiliates).

Springer has provided excellent support throughout the final stages of preparation of this book, and I would like to thank my commissioning editor John Kimmel for his support and professionalism, as well as Joseph Piliero for his help in designing the cover and the text format, and MaryAnn Brickner for her numerous contributions during the production phase.
The inspiration for the cover design came from a discussion with Antonio Criminisi.

I also wish to thank Oxford University Press for permission to reproduce excerpts from an earlier textbook, Neural Networks for Pattern Recognition (Bishop, 1995a). The images of the Mark 1 perceptron and of Frank Rosenblatt are reproduced with the permission of Arvin Calspan Advanced Technology Center. I would also like to thank Asela Gunawardana for plotting the spectrogram in Figure 13.1, and Bernhard Schölkopf for permission to use his kernel PCA code to plot Figure 12.17.

Many people have helped by proofreading draft material and providing comments and suggestions, including Shivani Agarwal, Cédric Archambeau, Arik Azran, Andrew Blake, Hakan Cevikalp, Michael Fourman, Brendan Frey, Zoubin Ghahramani, Thore Graepel, Katherine Heller, Ralf Herbrich, Geoffrey Hinton, Adam Johansen, Matthew Johnson, Michael Jordan, Eva Kalyvianaki, Anitha Kannan, Julia Lasserre, David Liu, Tom Minka, Ian Nabney, Tonatiuh Pena, Yuan Qi, Sam Roweis, Balaji Sanjiya, Toby Sharp, Ana Costa e Silva, David Spiegelhalter, Jay Stokes, Tara Symeonides, Martin Szummer, Marshall Tappen, Ilkay Ulusoy, Chris Williams, John Winn, and Andrew Zisserman.

Finally, I would like to thank my wife Jenna, who has been hugely supportive throughout the several years it has taken to write this book.

Chris Bishop
Cambridge
February 2006

Mathematical notation

I have tried to keep the mathematical content of the book to the minimum necessary to achieve a proper understanding of the field. However, this minimum level is nonzero, and it should be emphasized that a good grasp of calculus, linear algebra, and probability theory is essential for a clear understanding of modern pattern recognition and machine learning techniques.
Nevertheless, the emphasis in this book is on conveying the underlying concepts rather than on mathematical rigour.

I have tried to use a consistent notation throughout the book, although at times this means departing from some of the conventions used in the corresponding research literature. Vectors are denoted by lower-case bold roman letters such as x, and all vectors are assumed to be column vectors. A superscript T denotes the transpose of a matrix or vector, so that x^T will be a row vector. Upper-case bold roman letters, such as M, denote matrices. The notation (w_1, ..., w_M) denotes a row vector with M elements, while the corresponding column vector is written as w = (w_1, ..., w_M)^T.

The notation [a, b] is used to denote the closed interval from a to b, that is, the interval including the values a and b themselves, while (a, b) denotes the corresponding open interval, that is, the interval excluding a and b. Similarly, [a, b) denotes an interval that includes a but excludes b. For the most part, however, there will be little need to dwell on such refinements as whether the end points of an interval are open or closed.

The M x M identity matrix (also known as the unit matrix) is denoted I_M, which will be abbreviated to I where there is no ambiguity about its dimensionality. It has elements I_ij that equal 1 if i = j and 0 if i ≠ j.

A functional is denoted f[y], where y(x) is some function. The concept of a functional is discussed in Appendix D.

The notation g(x) = O(f(x)) denotes that |f(x)/g(x)| is bounded as x → ∞. For instance, if g(x) = 3x^2 + 2, then g(x) = O(x^2).

The expectation of a function f(x, y) with respect to a random variable x is denoted by E_x[f(x, y)]. In situations where there is no ambiguity as to which variable is being averaged over, this will be simplified by omitting the suffix, for instance E[x].
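As a brief illustration of the expectation notation just introduced (this worked form is not part of the original notation list, but follows the standard definitions used later in the book), the expectation with respect to x can be written out for continuous and discrete variables as:

```latex
% Expectation of f(x, y) with respect to a continuous random variable x
% with probability density p(x):
\mathbb{E}_x[f(x, y)] = \int f(x, y)\, p(x)\, \mathrm{d}x
% For a discrete variable x with probability distribution p(x),
% the integral becomes a sum:
\mathbb{E}_x[f(x, y)] = \sum_x f(x, y)\, p(x)
```

In both cases the averaging is over x only, so the result is in general still a function of y; when the variable being averaged over is clear from context, the subscript on the expectation is dropped.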