Bookswagon - 24x7 online bookstore
A First Course in Machine Learning, Second Edition


Out of Stock


Premium quality
Bookswagon upholds quality by delivering untarnished books. Quality, service and satisfaction are everything to us!
Easy return
Not satisfied with this product? Keep it in its original condition and packaging to avail of our easy return policy.
Certified product
First impressions matter! Check the book's certification page, ISBN, publisher's name, copyright page and print quality.
Secure checkout
Security at its finest! Logging in, browsing, purchasing and paying: every step is safe and secure.
Money-back guarantee
It's all about our customers! For any bad experience with the product, get your money back after returning it.
On-time delivery
At your doorstep on time! Get this book delivered without delay.

About the Book

"A First Course in Machine Learning by Simon Rogers and Mark Girolami is the best introductory book for ML currently available. It combines rigor and precision with accessibility, starts from a detailed explanation of the basic foundations of Bayesian analysis in the simplest of settings, and goes all the way to the frontiers of the subject such as infinite mixture models, GPs, and MCMC."
—Devdatt Dubhashi, Professor, Department of Computer Science and Engineering, Chalmers University, Sweden

"This textbook manages to be easier to read than other comparable books in the subject while retaining all the rigorous treatment needed. The new chapters put it at the forefront of the field by covering topics that have become mainstream in machine learning over the last decade."
—Daniel Barbara, George Mason University, Fairfax, Virginia, USA

"The new edition of A First Course in Machine Learning by Rogers and Girolami is an excellent introduction to the use of statistical methods in machine learning. The book introduces concepts such as mathematical modeling, inference, and prediction, providing 'just in time' the essential background on linear algebra, calculus, and probability theory that the reader needs to understand these concepts."
—Daniel Ortiz-Arroyo, Associate Professor, Aalborg University Esbjerg, Denmark

"I was impressed by how closely the material aligns with the needs of an introductory course on machine learning, which is its greatest strength…Overall, this is a pragmatic and helpful book, which is well-aligned to the needs of an introductory course and one that I will be looking at for my own students in coming months."
—David Clifton, University of Oxford, UK

"The first edition of this book was already an excellent introductory text on machine learning for an advanced undergraduate or taught masters level course, or indeed for anybody who wants to learn about an interesting and important field of computer science. The additional chapters of advanced material on Gaussian process, MCMC and mixture modeling provide an ideal basis for practical projects, without disturbing the very clear and readable exposition of the basics contained in the first part of the book." 
—Gavin Cawley, Senior Lecturer, School of Computing Sciences, University of East Anglia, UK

"This book could be used for junior/senior undergraduate students or first-year graduate students, as well as individuals who want to explore the field of machine learning…The book introduces not only the concepts but the underlying ideas on algorithm implementation from a critical thinking perspective."
—Guangzhi Qu, Oakland University, Rochester, Michigan, USA



Table of Contents

Linear Modelling: A Least Squares Approach
LINEAR MODELLING
Defining the model
Modelling assumptions
Defining a good model
The least squares solution—a worked example
Worked example
Least squares fit to the Olympic data
Summary
MAKING PREDICTIONS
A second Olympic dataset
Summary
VECTOR/MATRIX NOTATION
Example
Numerical example
Making predictions
Summary
NON-LINEAR RESPONSE FROM A LINEAR MODEL
GENERALISATION AND OVER-FITTING
Validation data
Cross-validation
Computational scaling of K-fold cross-validation
REGULARISED LEAST SQUARES
EXERCISES
FURTHER READING

Linear Modelling: A Maximum Likelihood Approach
ERRORS AS NOISE
Thinking generatively
RANDOM VARIABLES AND PROBABILITY
Random variables
Probability and distributions
Adding probabilities
Conditional probabilities
Joint probabilities
Marginalisation
Aside—Bayes' rule
Expectations
POPULAR DISCRETE DISTRIBUTIONS
Bernoulli distribution
Binomial distribution
Multinomial distribution
CONTINUOUS RANDOM VARIABLES—DENSITY FUNCTIONS
POPULAR CONTINUOUS DENSITY FUNCTIONS
The uniform density function
The beta density function
The Gaussian density function
Multivariate Gaussian
SUMMARY
THINKING GENERATIVELY...CONTINUED
LIKELIHOOD
Dataset likelihood
Maximum likelihood
Characteristics of the maximum likelihood solution
Maximum likelihood favours complex models
THE BIAS-VARIANCE TRADE-OFF
Summary
EFFECT OF NOISE ON PARAMETER ESTIMATES
Uncertainty in estimates
Comparison with empirical values
Variability in model parameters—Olympic data
VARIABILITY IN PREDICTIONS
Predictive variability—an example
Expected values of the estimators
CHAPTER SUMMARY
EXERCISES
FURTHER READING

The Bayesian Approach to Machine Learning
A COIN GAME
Counting heads
The Bayesian way
THE EXACT POSTERIOR
THE THREE SCENARIOS
No prior knowledge
The fair coin scenario
A biased coin
The three scenarios—a summary
Adding more data
MARGINAL LIKELIHOODS
Model comparison with the marginal likelihood
HYPERPARAMETERS
GRAPHICAL MODELS
SUMMARY
A BAYESIAN TREATMENT OF THE OLYMPIC 100m DATA
The model
The likelihood
The prior
The posterior
A first-order polynomial
Making predictions
MARGINAL LIKELIHOOD FOR POLYNOMIAL MODEL
ORDER SELECTION
CHAPTER SUMMARY
EXERCISES
FURTHER READING
Bayesian Inference
NON-CONJUGATE MODELS
BINARY RESPONSES
A model for binary responses
A POINT ESTIMATE—THE MAP SOLUTION
THE LAPLACE APPROXIMATION
Laplace approximation example: Approximating a gamma density
Laplace approximation for the binary response model
SAMPLING TECHNIQUES
Playing darts
The Metropolis–Hastings algorithm
The art of sampling
CHAPTER SUMMARY
EXERCISES
FURTHER READING

Classification
THE GENERAL PROBLEM
PROBABILISTIC CLASSIFIERS
The Bayes classifier
Likelihood—class-conditional distributions
Prior class distribution
Example—Gaussian class-conditionals
Making predictions
The naive-Bayes assumption
Example—classifying text
Smoothing
Logistic regression
Motivation
Non-linear decision functions
Non-parametric models—the Gaussian process
NON-PROBABILISTIC CLASSIFIERS
K-nearest neighbours
Choosing K
Support vector machines and other kernel methods
The margin
Maximising the margin
Making predictions
Support vectors
Soft margins
Kernels
Summary
ASSESSING CLASSIFICATION PERFORMANCE
Accuracy—0/1 loss
Sensitivity and specificity
The area under the ROC curve
Confusion matrices
DISCRIMINATIVE AND GENERATIVE CLASSIFIERS
CHAPTER SUMMARY
EXERCISES
FURTHER READING

Clustering
THE GENERAL PROBLEM
K-MEANS CLUSTERING
Choosing the number of clusters
Where K-means fails
Kernelised K-means
Summary
MIXTURE MODELS
A generative process
Mixture model likelihood
The EM algorithm
Updating πk
Updating μk
Updating Σk
Updating qnk
Some intuition
Example
EM finds local optima
Choosing the number of components
Other forms of mixture component
MAP estimates with EM
Bayesian mixture models
CHAPTER SUMMARY
EXERCISES
FURTHER READING

Principal Components Analysis and Latent Variable Models
THE GENERAL PROBLEM
Variance as a proxy for interest
PRINCIPAL COMPONENTS ANALYSIS
Choosing D
Limitations of PCA
LATENT VARIABLE MODELS
Mixture models as latent variable models
Summary
VARIATIONAL BAYES
Choosing Q(θ)
Optimising the bound
A PROBABILISTIC MODEL FOR PCA
Qτ(τ)
Qxn(xn)
Qwm(wm)
The required expectations
The algorithm
An example
MISSING VALUES
Missing values as latent variables
Predicting missing values
NON-REAL-VALUED DATA
Probit PPCA
Visualising parliamentary data
Aside—relationship to classification
CHAPTER SUMMARY
EXERCISES
FURTHER READING

Advanced Topics

Gaussian Processes
PROLOGUE—NON-PARAMETRIC MODELS
GAUSSIAN PROCESS REGRESSION
The Gaussian process prior
Noise-free regression
Noisy regression
Summary
Noisy regression—an alternative route
Alternative covariance functions
Linear
Polynomial
Neural network
ARD
Composite covariance functions
Summary
GAUSSIAN PROCESS CLASSIFICATION
A classification likelihood
A classification roadmap
The point estimate approximation
Propagating uncertainty through the sigmoid
The Laplace approximation
Summary
HYPERPARAMETER OPTIMISATION
EXTENSIONS
Non-zero mean
Multiclass classification
Other likelihood functions and models
Other inference schemes
CHAPTER SUMMARY
EXERCISES
FURTHER READING

Markov Chain Monte Carlo Sampling
GIBBS SAMPLING
EXAMPLE: GIBBS SAMPLING FOR GP CLASSIFICATION
Conditional densities for GP classification via Gibbs sampling
Summary
WHY DOES MCMC WORK?
SOME SAMPLING PROBLEMS AND SOLUTIONS
Burn-in and convergence
Autocorrelation
Summary
ADVANCED SAMPLING TECHNIQUES
Adaptive proposals and Hamiltonian Monte Carlo
Approximate Bayesian computation
Population MCMC and temperature schedules
Sequential Monte Carlo
CHAPTER SUMMARY
EXERCISES
FURTHER READING

Advanced Mixture Modelling
A GIBBS SAMPLER FOR MIXTURE MODELS
COLLAPSED GIBBS SAMPLING
AN INFINITE MIXTURE MODEL
The Chinese restaurant process
Inference in the infinite mixture model
Summary
DIRICHLET PROCESSES
Hierarchical Dirichlet processes
Summary
BEYOND STANDARD MIXTURES—TOPIC MODELS
CHAPTER SUMMARY
EXERCISES
FURTHER READING
Glossary
Index


Product Details
  • ISBN-13: 9781498738576
  • Publisher: Taylor & Francis Inc
  • Publisher Imprint: Chapman & Hall/CRC
  • Edition: New edition
  • No. of Pages: 397
  • ISBN-10: 1498738575
  • Publisher Date: 05 Aug 2016
  • Binding: Digital (delivered electronically)
  • Language: English

