Math for Machine Learning

English | MP4 | AVC 1280×720 | AAC 44 kHz 2ch | 5h 43m | 504 MB

Would you like to learn a mathematics subject that is crucial for many high-demand, lucrative career fields such as Computer Science, Data Science, and Artificial Intelligence?

If you're looking to gain a solid foundation in Machine Learning to further your career goals, in a way that lets you study on your own schedule at a fraction of the cost of a traditional university, this online course is for you. Whether you're a working professional who needs a refresher on machine learning or a complete beginner learning Machine Learning for the first time, this online course is for you.

Why you should take this online course:
- You need to refresh your knowledge of machine learning for your career and earn a higher salary.
- You need to learn machine learning because it is a required mathematical subject for your chosen career field, such as data science or artificial intelligence.
- You intend to pursue a master's degree or PhD, and machine learning is a required or recommended subject.

Why you should choose this instructor:
- I earned my PhD in Mathematics from the University of California, Riverside.
- I have created many successful online math courses that students around the world have found invaluable, including courses in linear algebra, discrete math, and calculus.

Table of Contents

01 Course Promo
02 Course Introduction
03 Linear Regression
04 The Least Squares Method
05 Linear Algebra Solution to Least Squares Problem
06 Example Linear Regression
07 Summary Linear Regression
08 Classification
09 Linear Discriminant Analysis
10 The Posterior Probability Functions
11 Modelling the Posterior Probability Functions
12 Linear Discriminant Functions
13 Estimating the Linear Discriminant Functions
14 Classifying Data Points Using Linear Discriminant Functions
15 LDA Example 1
16 LDA Example 2
17 Summary Linear Discriminant Analysis
18 Logistic Regression
19 Logistic Regression Model of the Posterior Probability Function
20 Estimating the Posterior Probability Function
21 The Multivariate Newton-Raphson Method
22 Maximizing the Log-Likelihood Function
23 Logistic Regression Example
24 Summary Logistic Regression
25 Artificial Neural Networks
26 Neural Network Model of the Output Functions
27 Forward Propagation
28 Choosing Activation Functions
29 Estimating the Output Functions
30 Error Function for Regression
31 Error Function for Binary Classification
32 Error Function for Multiclass Classification
33 Minimizing the Error Function Using Gradient Descent
34 Backpropagation Equations
35 Summary of Backpropagation
36 Summary Artificial Neural Networks
37 Maximal Margin Classifier
38 Definitions of Separating Hyperplane and Margin
39 Proof 1
40 Maximizing the Margin
41 Definition of Maximal Margin Classifier
42 Reformulating the Optimization Problem
43 Proof 2
44 Proof 3
45 Proof 4
46 Proof 5
47 Solving the Convex Optimization Problem
48 KKT Conditions
49 Primal and Dual Problems
50 Solving the Dual Problem
51 The Coefficients for the Maximal Margin Hyperplane
52 The Support Vectors
53 Classifying Test Points
54 Maximal Margin Classifier Example 1
55 Maximal Margin Classifier Example 2
56 Summary Maximal Margin Classifier
57 Support Vector Classifier
58 Slack Variables Points on Correct Side of Hyperplane
59 Slack Variables Points on Wrong Side of Hyperplane
60 Formulating the Optimization Problem
61 Definition of Support Vector Classifier
62 A Convex Optimization Problem
63 Solving the Convex Optimization Problem (Soft Margin)
64 The Coefficients for the Soft Margin Hyperplane
65 Classifying Test Points (Soft Margin)
66 The Support Vectors (Soft Margin)
67 Support Vector Classifier Example 1
68 Support Vector Classifier Example 2
69 Summary Support Vector Classifier
70 Support Vector Machine Classifier
71 Enlarging the Feature Space
72 The Kernel Trick
73 Summary Support Vector Machine Classifier