English | MP4 | AVC 1920×1080 | AAC 44 kHz 2ch | 84 Lessons (11h 59m) | 1.90 GB
Launch your career in AI with this course that teaches you to build, train, and deploy your own AI models using two of the most important tools in real-world AI: AWS SageMaker and Hugging Face. No machine learning knowledge required!
Learn to build end-to-end AI applications with AWS SageMaker: from gathering and preparing your own data, to training and customizing your own models, to deploying and scaling your AI application in the real world.
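To give a feel for what that workflow looks like in code, here is a minimal sketch using the SageMaker Python SDK's Hugging Face estimator. The bucket path, script name, instance types, and hyperparameters below are illustrative placeholders, not the exact values used in the course.

```python
import sagemaker
from sagemaker.huggingface import HuggingFace

# SageMaker session and the IAM execution role set up earlier in the course
sess = sagemaker.Session()
role = sagemaker.get_execution_role()

# Training job definition: "script.py" and the hyperparameters are placeholders
estimator = HuggingFace(
    entry_point="script.py",            # your PyTorch training script
    source_dir="./code",                # folder containing the script and requirements
    instance_type="ml.p3.2xlarge",      # GPU training instance (example choice)
    instance_count=1,
    role=role,
    transformers_version="4.26",
    pytorch_version="1.13",
    py_version="py39",
    hyperparameters={"epochs": 2, "batch_size": 16},
)

# Start training on data previously uploaded to S3 (placeholder bucket/prefix)
estimator.fit({"train": "s3://your-bucket/train/"})

# Deploy the trained model to a real-time SageMaker endpoint and send a test request
predictor = estimator.deploy(initial_instance_count=1, instance_type="ml.m5.xlarge")
print(predictor.predict({"inputs": "SageMaker makes deployment straightforward."}))

# Clean up the endpoint to avoid ongoing charges
predictor.delete_endpoint()
```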
What you’ll learn
- Build and deploy cutting-edge artificial intelligence & machine learning models to the cloud
- Utilize powerful pre-trained models from Hugging Face with AWS SageMaker
- Uncover the mathematical secrets behind how Large Language Models work with a deep dive into the Transformer architecture, tokenization, and more (the core formulas are sketched after this list)
- Customize models to meet the needs of your AI applications using PyTorch to create unique solutions
- Train, validate, and test models to ensure they deliver accurate results
- Learn best practices for monitoring and optimizing your models, including load testing and scaling for massive user demand
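For reference, these are the standard formulas from the Transformer paper ("Attention Is All You Need") that the positional-encoding and attention lessons derive step by step: the sinusoidal positional encodings for even and odd embedding indices, and scaled dot-product attention over the query, key, and value matrices.

```latex
% Sinusoidal positional encodings (even and odd embedding indices)
PE_{(pos,\,2i)}   = \sin\!\left(\frac{pos}{10000^{\,2i/d_{\text{model}}}}\right), \qquad
PE_{(pos,\,2i+1)} = \cos\!\left(\frac{pos}{10000^{\,2i/d_{\text{model}}}}\right)

% Scaled dot-product attention: raw scores QK^T, scaled by sqrt(d_k),
% converted to a probability distribution with softmax, then applied to V
\mathrm{Attention}(Q, K, V) = \mathrm{softmax}\!\left(\frac{QK^{\top}}{\sqrt{d_k}}\right)V
```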
Table of Contents
1 AI Engineering Bootcamp: Learn AWS SageMaker with Patrik Szepesi
2 Course Introduction
3 Setting Up Our AWS Account
4 Set Up IAM Roles + Best Practices
5 AWS Security Best Practices
6 Set Up AWS SageMaker Domain
7 UI Domain Change
8 Setting Up SageMaker Environment
9 SageMaker Studio and Pricing
10 Set Up SageMaker Server + PyTorch
11 HuggingFace Models, Sentiment Analysis, and AutoScaling
12 Get Dataset for Multiclass Text Classification
13 Creating Our AWS S3 Bucket
14 Uploading Our Training Data to S3
15 Exploratory Data Analysis – Part 1
16 Exploratory Data Analysis – Part 2
17 Data Visualization and Best Practices
18 Setting Up Our Training Job Notebook + Reasons to Use SageMaker
19 Python Script for HuggingFace Estimator
20 Creating Our Optional Experiment Notebook – Part 1
21 Creating Our Optional Experiment Notebook – Part 2
22 Encoding Categorical Labels to Numeric Values
23 Understanding the Tokenization Vocabulary
24 Encoding Tokens
25 Practical Example of Tokenization and Encoding
26 Creating Our Dataset Loader Class
27 Setting Up the PyTorch DataLoader
28 Which Path Will You Take?
29 DistilBert vs. Bert Differences
30 Embeddings In A Continuous Vector Space
31 Introduction To Positional Encodings
32 Positional Encodings – Part 1
33 Positional Encodings – Part 2 (Even and Odd Indices)
34 Why Use Sine and Cosine Functions
35 Understanding the Nature of Sine and Cosine Functions
36 Visualizing Positional Encodings in Sine and Cosine Graphs
37 Solving the Equations to Get the Values for Positional Encodings
38 Introduction to Attention Mechanism
39 Query, Key and Value Matrix
40 Getting Started with Our Step by Step Attention Calculation
41 Calculating Key Vectors
42 Query Matrix Introduction
43 Calculating Raw Attention Scores
44 Understanding the Mathematics Behind Dot Products and Vector Alignment
45 Visualizing Raw Attention Scores in 2D
46 Converting Raw Attention Scores to Probability Distributions with Softmax
47 Normalization
48 Understanding the Value Matrix and Value Vector
49 Calculating the Final Context-Aware Rich Representation for the Word "River"
50 Understanding the Output
51 Understanding Multi Head Attention
52 Multi Head Attention Example and Subsequent Layers
53 Masked Language Learning
54 Exercise Imposter Syndrome
55 Creating Our Custom Model Architecture with PyTorch
56 Adding the Dropout, Linear Layer, and ReLU to Our Model
57 Creating Our Accuracy Function
58 Creating Our Train Function
59 Finishing Our Train Function
60 Setting Up the Validation Function
61 Passing Parameters In SageMaker
62 Setting Up Model Parameters For Training
63 Understanding The Mathematics Behind Cross Entropy Loss
64 Finishing Our Script.py File
65 Quota Increase
66 Starting Our Training Job
67 Debugging Our Training Job With AWS CloudWatch
68 Analyzing Our Training Job Results
69 Creating Our Inference Script For Our PyTorch Model
70 Finishing Our PyTorch Inference Script
71 Setting Up Our Deployment
72 Deploying Our Model To A SageMaker Endpoint
73 Introduction to Endpoint Load Testing
74 Creating Our Test Data for Load Testing
75 Upload Testing Data to S3
76 Creating Our Model for Load Testing
77 Starting Our Load Test Job
78 Analyze Load Test Results
79 Deploying Our Endpoint
80 Creating Lambda Function to Call Our Endpoint
81 Setting Up Our AWS API Gateway
82 Testing Our Model with Postman, API Gateway and Lambda
83 Cleaning Up Resources
84 Thank You!