Up and Running with PyTorch (Video Course)
English | MP4 | AVC 1280×720 | AAC 44KHz 2ch | 2h 55m | 705 MB

Up and Running with PyTorch begins with an introduction to PyTorch and to deep learning frameworks in general. You learn how the combination of automatic differentiation and transparent computation on GPUs has enabled the current explosion of deep learning research. The video shows how to use PyTorch to implement and train a linear regression model as a stepping stone to building much more complex neural networks. Finally, it demonstrates how to combine the components PyTorch provides to build a simple feedforward multi-layer perceptron.
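As a taste of what the course covers, the linear-regression stepping stone it describes can be sketched in a few lines of PyTorch: autograd computes the gradients and a manual SGD loop updates the parameters. This is an illustrative sketch, not code from the course; the target function y = 2x + 1 and the learning rate are made up for the example.

```python
import torch

# Synthetic data for the made-up target y = 2x + 1
X = torch.linspace(-1, 1, 64).unsqueeze(1)
y = 2 * X + 1

# Parameters tracked by autograd
w = torch.zeros(1, requires_grad=True)
b = torch.zeros(1, requires_grad=True)

lr = 0.1
for _ in range(200):
    loss = ((X * w + b - y) ** 2).mean()  # mean squared error
    loss.backward()                       # autograd fills w.grad and b.grad
    with torch.no_grad():                 # update without building a graph
        w -= lr * w.grad
        b -= lr * b.grad
        w.grad.zero_()                    # gradients accumulate, so reset them
        b.grad.zero_()

print(w.item(), b.item())  # close to 2.0 and 1.0
```

The same loop generalizes directly to the deeper networks the course builds later: only the model and its parameter list change, while autograd and the update rule stay the same.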

Table of Contents

Introduction
Up and Running with PyTorch: Introduction
Lesson 1: PyTorch for the Impatient
1.1 What Is PyTorch?
1.2 The PyTorch Layer Cake
1.3 The Deep Learning Software Trilemma
1.4 What Are Tensors, Really?
1.5 Tensors in PyTorch
1.6 Introduction to Computational Graphs
1.7 Backpropagation Is Just the Chain Rule
1.8 Effortless Backpropagation with torch.autograd
1.9 PyTorch’s Device Abstraction (i.e., GPUs)
1.10 Working with Devices
1.11 Components of a Learning Algorithm
1.12 Introduction to Gradient Descent
1.13 Getting to Stochastic Gradient Descent (SGD)
1.14 Comparing Gradient Descent and SGD
1.15 Linear Regression with PyTorch
1.16 Perceptrons and Neurons
1.17 Layers and Activations with torch.nn
1.18 Multi-layer Feedforward Neural Networks (MLP)
Summary
Up and Running with PyTorch: Summary
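The course's final build, a simple feedforward multi-layer perceptron, might look like this minimal torch.nn sketch (illustrative only; the layer sizes here are arbitrary, not taken from the course):

```python
import torch
from torch import nn

# Hypothetical shapes: 4 input features, 16 hidden units, 2 outputs
model = nn.Sequential(
    nn.Linear(4, 16),
    nn.ReLU(),
    nn.Linear(16, 2),
)

x = torch.randn(8, 4)   # a batch of 8 examples
logits = model(x)
print(logits.shape)     # torch.Size([8, 2])
```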