Machine Learning - Andrew Ng
113 lessons, 19:28:58 total running time
Introduction
This course offers a broad introduction to machine learning, data mining, and statistical pattern recognition. Topics include: (i) supervised learning (parametric/non-parametric algorithms, support vector machines, kernels, neural networks); (ii) unsupervised learning (clustering, dimensionality reduction, recommender systems, deep learning); (iii) best practices in machine learning (bias/variance theory; the innovation process in machine learning and AI). The course draws on numerous case studies and applications, showing how to apply learning algorithms to smart robots (perception, control), text understanding (web search, anti-spam), computer vision, medical informatics, audio, data mining, and other areas.
Andrew Ng (吴恩达), Co-founder, Coursera; Adjunct Professor, Stanford University; formerly head of Baidu AI Group and Google Brain
Syllabus
- Lesson 1: Welcome (6:54)
- Lesson 2: What is Machine Learning (7:14)
- Lesson 3: Supervised Learning (12:29)
- Lesson 4: Unsupervised Learning (14:13)
- Lesson 5: Model Representation (8:10)
- Lesson 6: Cost Function (8:12)
- Lesson 7: Cost Function - Intuition I (11:09)
- Lesson 8: Cost Function - Intuition II (8:48)
- Lesson 9: Gradient Descent (11:30)
- Lesson 10: Gradient Descent Intuition (11:50)
- Lesson 11: Gradient Descent For Linear Regression (10:20) (see the sketch after the syllabus)
- Lesson 12: What's Next (5:49)
- Lesson 13: Matrices and Vectors (8:45)
- Lesson 14: Addition and Scalar Multiplication (6:53)
- Lesson 15: Matrix Vector Multiplication (13:39)
- Lesson 16: Matrix Matrix Multiplication (11:09)
- Lesson 17: Matrix Multiplication Properties (9:02)
- Lesson 18: Inverse and Transpose (11:12)
- Lesson 19: Multiple Features (8:22)
- Lesson 20: Gradient Descent for Multiple Variables (5:04)
- Lesson 21: Gradient Descent in Practice I - Feature Scaling (8:51)
- Lesson 22: Gradient Descent in Practice II - Learning Rate (8:58)
- Lesson 23: Features and Polynomial Regression (7:39)
- Lesson 24: Normal Equation (16:17)
- Lesson 25: Normal Equation Noninvertibility (Optional) (5:58)
- Lesson 26: Basic Operations (13:59)
- Lesson 27: Moving Data Around (16:07)
- Lesson 28: Computing on Data (13:14)
- Lesson 29: Plotting Data (9:38)
- Lesson 30: Control Statements: for, while, if statements (12:55)
- Lesson 31: Vectorization (13:48)
- Lesson 32: Working on and Submitting Programming Exercises (3:33)
- Lesson 33: Classification (8:08)
- Lesson 34: Hypothesis Representation (7:24)
- Lesson 35: Decision Boundary (14:49)
- Lesson 36: Cost Function (11:25)
- Lesson 37: Simplified Cost Function and Gradient Descent (10:14)
- Lesson 38: Advanced Optimization (14:06)
- Lesson 39: Multiclass Classification: One-vs-all (6:15)
- Lesson 40: The Problem of Overfitting (9:42)
- Lesson 41: Cost Function (10:10)
- Lesson 42: Regularized Linear Regression (10:40)
- Lesson 43: Regularized Logistic Regression (8:33)
- Lesson 44: Non-linear Hypotheses (9:35)
- Lesson 45: Neurons and the Brain (7:47)
- Lesson 46: Model Representation I (12:01)
- Lesson 47: Model Representation II (11:46)
- Lesson 48: Examples and Intuitions I (7:15)
- Lesson 49: Examples and Intuitions II (10:20)
- Lesson 50: Multiclass Classification (3:51)
- Lesson 51: Cost Function (6:43)
- Lesson 52: Backpropagation Algorithm (11:59)
- Lesson 53: Backpropagation Intuition (12:44)
- Lesson 54: Implementation Note: Unrolling Parameters (7:47)
- Lesson 55: Gradient Checking (11:37)
- Lesson 56: Random Initialization (6:51)
- Lesson 57: Putting It Together (13:23)
- Lesson 58: Autonomous Driving (6:30)
- Lesson 59: Deciding What to Try Next (5:50)
- Lesson 60: Evaluating a Hypothesis (7:35)
- Lesson 61: Model Selection and Train/Validation/Test Sets (12:03)
- Lesson 62: Diagnosing Bias vs. Variance (7:42)
- Lesson 63: Regularization and Bias/Variance (11:20)
- Lesson 64: Learning Curves (11:53)
- Lesson 65: Deciding What to Do Next Revisited (6:50)
- Lesson 66: Prioritizing What to Work On (9:29)
- Lesson 67: Error Analysis (13:11)
- Lesson 68: Error Metrics for Skewed Classes (11:35)
- Lesson 69: Trading Off Precision and Recall (14:05)
- Lesson 70: Data For Machine Learning (11:09)
- Lesson 71: Optimization Objective (14:47)
- Lesson 72: Large Margin Intuition (10:36)
- Lesson 73: Mathematics Behind Large Margin Classification (Optional) (19:41)
- Lesson 74: Kernels I (15:44)
- Lesson 75: Kernels II (15:43)
- Lesson 76: Using An SVM (21:02)
- Lesson 77: Unsupervised Learning: Introduction (3:16)
- Lesson 78: K-Means Algorithm (12:32)
- Lesson 79: Optimization Objective (7:04)
- Lesson 80: Random Initialization (7:49)
- Lesson 81: Choosing the Number of Clusters (8:22)
- Lesson 82: Motivation I: Data Compression (10:09)
- Lesson 83: Motivation II: Visualization (5:27)
- Lesson 84: Principal Component Analysis Problem Formulation (9:05)
- Lesson 85: Principal Component Analysis Algorithm (15:13)
- Lesson 86: Choosing the Number of Principal Components (10:30)
- Lesson 87: Reconstruction from Compressed Representation (3:54)
- Lesson 88: Advice for Applying PCA (12:48)
- Lesson 89: Problem Motivation (7:38)
- Lesson 90: Gaussian Distribution (10:27)
- Lesson 91: Algorithm (12:02)
- Lesson 92: Developing and Evaluating an Anomaly Detection System (13:07)
- Lesson 93: Anomaly Detection vs. Supervised Learning (7:36)
- Lesson 94: Choosing What Features to Use (12:17)
- Lesson 95: Multivariate Gaussian Distribution (Optional) (13:45)
- Lesson 96: Anomaly Detection using the Multivariate Gaussian Distribution (Optional) (14:03)
- Lesson 97: Problem Formulation (7:54)
- Lesson 98: Content Based Recommendations (14:31)
- Lesson 99: Collaborative Filtering (10:14)
- Lesson 100: Collaborative Filtering Algorithm (8:26)
- Lesson 101: Vectorization: Low Rank Matrix Factorization (8:27)
- Lesson 102: Implementational Detail: Mean Normalization (8:30)
- Lesson 103: Learning With Large Datasets (5:44)
- Lesson 104: Stochastic Gradient Descent (13:19)
- Lesson 105: Mini-Batch Gradient Descent (6:17)
- Lesson 106: Stochastic Gradient Descent Convergence (11:31)
- Lesson 107: Online Learning (12:50)
- Lesson 108: Map Reduce and Data Parallelism (14:08)
- Lesson 109: Problem Description and Pipeline (7:02)
- Lesson 110: Sliding Windows (14:39)
- Lesson 111: Getting Lots of Data and Artificial Data (16:20)
- Lesson 112: Ceiling Analysis: What Part of the Pipeline to Work on Next (13:50)
- Lesson 113: Summary and Thank You (4:41)
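As a small taste of the first programming topic in the syllabus, Lesson 11 applies gradient descent to linear regression. The course itself teaches this in Octave/MATLAB; the following is only a minimal sketch in Python, with illustrative function names and toy data that are not taken from the course materials:

```python
import numpy as np

def gradient_descent(X, y, alpha=0.02, num_iters=5000):
    """Batch gradient descent for linear regression.

    X: (m, n) design matrix whose first column is all ones
    y: (m,) target vector
    Returns the learned parameter vector theta of shape (n,).
    """
    m, n = X.shape
    theta = np.zeros(n)
    for _ in range(num_iters):
        # Simultaneous update: theta := theta - (alpha / m) * X^T (X theta - y)
        theta -= (alpha / m) * (X.T @ (X @ theta - y))
    return theta

# Toy data generated from y = 1 + 2x; the fit should recover roughly [1.0, 2.0].
x = np.linspace(0.0, 10.0, 50)
X = np.column_stack([np.ones_like(x), x])
y = 1.0 + 2.0 * x
print(gradient_descent(X, y))
```

Later lessons refine this same loop: feature scaling (Lesson 21) lets it converge with a larger learning rate, and the normal equation (Lesson 24) solves this particular problem in closed form.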