Introduction to Machine Learning (机器学习概论), Lecture 9
The final topic: Introduction to Machine Learning (overview)
2019/6/10

Overview
- Basic concepts
- Machine learning approaches

I. Basic Concepts

What's machine learning?
Learning = improving with experience at some task, characterized by:
- T (Task)
- P (Performance)
- E (Experience)

A generic system
[Figure: a system box mapping input variables to output variables through hidden variables]
- Input variables: x = (x1, x2, ..., xN)
- Hidden variables: h = (h1, h2, ..., hK)
- Output variables: y = (y1, y2, ..., yM)
- Control parameters
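The generic-system view above can be sketched in code. This is a minimal illustration under assumed details (a single hidden variable and a logistic output); none of the names below come from the slides:

```python
import math

def generic_system(x, theta):
    """A generic learning system: inputs -> hidden variables -> outputs.

    x:     input variables (x1, ..., xN)
    theta: control parameters adjusted by learning (one weight per input)
    Returns the output variables (y1, ..., yM); M = 1 in this sketch.
    """
    # Hidden variable: an internal representation of the inputs
    # (here a single weighted sum of the inputs).
    h = sum(w * xi for w, xi in zip(theta, x))
    # Output variable computed from the hidden state
    # (a logistic squashing function, chosen only for illustration).
    y = 1.0 / (1.0 + math.exp(-h))
    return (y,)

# Usage: two inputs, two control parameters.
print(generic_system((1.0, 2.0), (0.5, -0.25)))
```

Learning, in this picture, is the process of adjusting `theta` so that the system's outputs improve on the task according to the performance measure P.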
II. Machine learning approaches

The branches of machine learning
- Supervised Learning
- Unsupervised Learning
- Semi-supervised Learning
- Transfer Learning

Supervised and Unsupervised Learning

Supervised:
- Instances for learning: (X, Y) pairs, usually with human involvement
- Goal of learning: learning the relation between X and Y
- Measure of success: loss function
- Application: prediction (X = input, Y = output)

Unsupervised:
- Instances for learning: X only, usually without human involvement
- Goal of learning: learning the structure of X
- Measure of success: none
- Application: analysis (X = input)

II. Machine learning approaches (part I): Supervised Learning
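The contrast in the comparison above can be made concrete with a toy sketch. The data and thresholds below are invented purely for illustration: the supervised half fits a relation between X and Y under a 0/1 loss, while the unsupervised half only groups X, with no labels and no loss function.

```python
# Supervised: learn a relation between X and Y from (X, Y) pairs.
# Here: pick the threshold on X that best separates the labels.
pairs = [(1.0, -1), (2.0, -1), (6.0, +1), (7.0, +1)]  # invented toy data

def fit_threshold(pairs):
    """Return the threshold t maximizing accuracy of 'predict +1 if x > t'."""
    best_t, best_acc = None, -1.0
    for t in sorted(x for x, _ in pairs):
        acc = sum((1 if x > t else -1) == y for x, y in pairs) / len(pairs)
        if acc > best_acc:
            best_t, best_acc = t, acc
    return best_t

t = fit_threshold(pairs)

def predict(x):
    # Measure of success was the 0/1 loss used while fitting t.
    return 1 if x > t else -1

# Unsupervised: X only -- find structure, with no labels to score against.
# Here: split X into two groups around its mean (a crude 1-D clustering).
xs = [x for x, _ in pairs]
mean = sum(xs) / len(xs)
groups = {x: int(x > mean) for x in xs}

print(t, predict(6.5), groups)
```

Prediction (supervised) answers "what Y goes with this new X?"; analysis (unsupervised) answers "how is X itself organized?".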
Supervised learning
- Learning module: from training pairs (x1, y1), (x2, y2), ..., (xn, yn), learn the knowledge P(Y|X).
- Prediction module: given a new input x_{n+1}, use P(Y|X) to predict y_{n+1}.
- Typically Y is simple; X is complex.

A1. Decision tree
Data is effectively modeled via decision splits on attributes.
[Figure: a decision tree splitting on X at 3 and then on Y at 5 (yes/no branches), partitioning the (X, Y) plane into regions labeled +1 and -1]

A decision tree uses concepts/rules to represent the hypothesis:
- Intuitive: it is easy to obtain an explanation of the data from the learned hypothesis.
- But what if we cannot find obvious rules for the observed data? Use statistical approaches.
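A minimal sketch of the slide's toy tree. The split attributes and thresholds (X at 3, then Y at 5) follow the slide text; which leaf receives which label is an assumption, since the original figure did not survive extraction.

```python
def decision_tree(x, y):
    """Toy decision tree from the slide: split on X at 3, then on Y at 5.

    The split attributes and thresholds follow the slide; the +1/-1 leaf
    labels are assumptions, as the original figure is not recoverable.
    """
    if x <= 3:
        return -1                     # left branch: predict -1
    else:
        return +1 if y <= 5 else -1   # right branch: split again on Y

print(decision_tree(2, 9), decision_tree(5, 4), decision_tree(5, 8))
# -1 +1 -1 under the label assumptions above
```

Each root-to-leaf path is a readable rule (e.g. "if X > 3 and Y <= 5 then +1"), which is why the slide calls the learned hypothesis easy to explain.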
A2. Bayesian Learning
- Condition → Result: e.g., does pneumonia lead to lung cancer? Hard to tell directly.
- Reversed thinking (Result → Cause): e.g., how many lung cancer patients have suffered from pneumonia?

Bayesian Learning
- Bayes theorem: use the prior probability to infer the posterior probability:
  P(h|D) = P(D|h) P(h) / P(D)
- Maximum A Posteriori hypothesis (MAP, h_MAP): generally we want the most probable hypothesis given the training data.
- Maximum Likelihood hypothesis (ML, h_ML): "the smart man always learns the most from experience if he knows p(h)"; ML vs. LSE (Least Square Error).
- Naive Bayes (NB): independence assumption; NB vs. MAP.
- Minimum Description Length (MDL): tradeoff between hypothesis complexity and the errors made by h; MDL vs. MAP.
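The reversed-thinking example can be worked through with Bayes theorem. All probabilities below are invented purely for illustration; the point is that h_MAP (which weighs the prior P(h)) and h_ML (which ignores it) can disagree.

```python
# Bayes theorem: P(h|D) = P(D|h) * P(h) / P(D), on the slide's
# pneumonia / lung-cancer example. All numbers are invented.
priors = {"cancer": 0.01, "no_cancer": 0.99}       # P(h)
likelihood = {"cancer": 0.30, "no_cancer": 0.05}   # P(pneumonia | h)

# P(D): total probability of observing pneumonia.
p_d = sum(likelihood[h] * priors[h] for h in priors)

# Posterior P(h | pneumonia) for each hypothesis.
posterior = {h: likelihood[h] * priors[h] / p_d for h in priors}

# h_MAP: the most probable hypothesis given the data (uses the prior).
h_map = max(posterior, key=posterior.get)

# h_ML: maximizes the likelihood alone (ignores the prior).
h_ml = max(likelihood, key=likelihood.get)

print(posterior, h_map, h_ml)
```

With these invented numbers the likelihood favors "cancer", but the low prior P(cancer) makes the posterior favor "no_cancer": exactly the MAP-vs-ML distinction on the slide.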
A3. HMM
What if the observed data is only indirect evidence? A hidden state exists.
- Property 1 (Markov assumption): p(Xi | Xi-1, ..., X1) = p(Xi | Xi-1)
- Property 2 (time-invariance assumption): p(Xi+1 | Xi) = p(Xj+1 | Xj) for any i, j
- Property 3 (independent-observation assumption): p(O1, ..., OT | X1, ..., XT) = ∏_t p(Ot | Xt)
[Figure: HMM trellis of hidden states emitting observations o1, ..., ot-1, ot, ot+1, ..., oT]

HMM basic Problem 1: Estimation
Estimation problem: compute the probability of a given observation sequence.
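Problem 1 is commonly solved with the forward algorithm, which relies on all three properties above (Markov, time invariance, independent observations). A minimal sketch with an invented two-state model; states "A"/"B" and observations "x"/"y" are assumptions for illustration only.

```python
# HMM Problem 1 (estimation): compute P(O1..OT) via the forward algorithm.
# The 2-state model below is invented for illustration.
states = ["A", "B"]
start = {"A": 0.6, "B": 0.4}                 # P(X1)
trans = {"A": {"A": 0.7, "B": 0.3},          # P(X_{i+1} | X_i), time-invariant
         "B": {"A": 0.4, "B": 0.6}}
emit = {"A": {"x": 0.9, "y": 0.1},           # P(O_t | X_t)
        "B": {"x": 0.2, "y": 0.8}}

def forward(obs):
    """Return P(obs), summing over all hidden paths in O(T * |states|^2)."""
    # alpha[s] = P(O1..Ot, Xt = s)
    alpha = {s: start[s] * emit[s][obs[0]] for s in states}
    for o in obs[1:]:
        # Markov assumption: only the previous state matters.
        alpha = {s: emit[s][o] * sum(alpha[r] * trans[r][s] for r in states)
                 for s in states}
    return sum(alpha.values())

print(forward(["x", "y", "x"]))
```

Enumerating all hidden paths gives the same probability in O(|states|^T) time; the forward recursion is the efficient dynamic-programming answer to the estimation problem.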