The following set of tutorials focuses on many aspects of statistical data mining, including the foundations of probability, the foundations of statistical data analysis, and most of the classic machine learning and data mining algorithms.
These include classification algorithms such as decision trees, neural nets, Bayesian classifiers, Support Vector Machines, and case-based (aka non-parametric) learning; regression algorithms such as multivariate polynomial regression, MARS, Locally Weighted Regression, GMDH, and neural nets; and other data mining operations such as clustering (mixture models, k-means, and hierarchical), Bayesian networks, and Reinforcement Learning.
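As a small taste of one of the operations listed above, here is a minimal sketch of k-means clustering (this code is illustrative only and not taken from the tutorials; the function names and the optional `centroids` seeding parameter are my own):

```python
import random

def kmeans(points, k, iters=20, centroids=None):
    """Plain k-means: alternate nearest-centroid assignment and mean update.

    points: list of numeric tuples; k: number of clusters.
    centroids: optional initial centroids (random points if omitted).
    """
    if centroids is None:
        centroids = random.sample(points, k)
    for _ in range(iters):
        # Assignment step: attach each point to its nearest centroid.
        clusters = [[] for _ in range(k)]
        for p in points:
            i = min(range(k),
                    key=lambda j: sum((a - b) ** 2
                                      for a, b in zip(p, centroids[j])))
            clusters[i].append(p)
        # Update step: move each centroid to the mean of its cluster.
        for i, c in enumerate(clusters):
            if c:
                centroids[i] = tuple(sum(dim) / len(c) for dim in zip(*c))
    return centroids
```

For two well-separated blobs, e.g. `[(0, 0), (0, 1), (1, 0), (10, 10), (10, 11), (11, 10)]` with `k=2`, the returned centroids settle on the two cluster means.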

Tutorial Slides by Andrew Moore.
* Decision Trees
* Information Gain
* Probability for Data Miners
* Probability Density Functions
* Gaussians
* Maximum Likelihood Estimation
* Gaussian Bayes Classifiers
* Cross-Validation
* Neural Networks
* Instance-based learning (aka Case-based or Memory-based or non-parametric)
* Eight Regression Algorithms
* Predicting Real-valued Outputs: An introduction to regression
* Bayesian Networks
* Inference in Bayesian Networks (by Scott Davies and Andrew Moore)
* Learning Bayesian Networks
* A Short Intro to Naive Bayesian Classifiers
* Short Overview of Bayes Nets
* Gaussian Mixture Models
* K-means and Hierarchical Clustering
* Hidden Markov Models
* VC dimension
* Support Vector Machines
* PAC Learning
* Markov Decision Processes
* Reinforcement Learning
* Biosurveillance: An example
* Elementary probability and Naive Bayes classifiers
* Spatial Surveillance
* Time Series Methods
* Constraint Satisfaction Algorithms, with applications in Computer Vision and Scheduling
* Robot Motion Planning
* Hill Climbing, Simulated Annealing and Genetic Algorithms
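Information gain, which appears early in the list above as the splitting criterion behind decision trees, is easy to compute directly. Here is a short illustrative sketch (my own code, not from the slides) of the standard definition IG(Y; X) = H(Y) − Σ_v P(X = v) · H(Y | X = v):

```python
from collections import Counter
from math import log2

def entropy(labels):
    """Shannon entropy H(Y) in bits of a list of class labels."""
    n = len(labels)
    return -sum((c / n) * log2(c / n) for c in Counter(labels).values())

def information_gain(labels, attribute):
    """IG(Y; X): entropy of the labels minus the expected entropy
    after partitioning them by the attribute's values."""
    n = len(labels)
    groups = {}
    for y, x in zip(labels, attribute):
        groups.setdefault(x, []).append(y)
    remainder = sum(len(g) / n * entropy(g) for g in groups.values())
    return entropy(labels) - remainder
```

A perfectly predictive attribute yields the full entropy of the labels as gain (1 bit for a balanced binary class), while an attribute independent of the labels yields a gain of 0.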
