Build Your Own AI Lab (Video Course)

English | MP4 | AVC 1280×720 | AAC 44KHz 2ch | 3h 18m | 666 MB

Set up your own AI lab and infrastructure at home.

  • Hands-on guide to home and cloud-based AI labs and infrastructure
  • Learn to set up and optimize labs to research and experiment in a secure environment
  • Focused training on open-source large language models (LLMs) and generative AI

Build Your Own AI Lab shows how to create powerful and secure AI research environments. The course addresses both home-based and cloud-based AI labs, offering comprehensive guidance on hardware and software setup, security best practices, cost management, and scalability. Key topics include how to run open-source models available on Hugging Face, such as Llama 3, Phi 3, Mistral, and Gemma; how to integrate home and cloud environments and leverage the strengths of both for the flexibility to meet diverse research needs; and how to use Ollama to run these models easily at home. The course also surveys Amazon Bedrock, Amazon SageMaker, Google Vertex AI, Microsoft Azure AI Foundry, high-performance computing, and edge AI.
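
As a taste of the hands-on material, the sketch below (an illustration, not an excerpt from the course) queries a locally running Ollama server over its REST API from Python. It assumes Ollama is installed and listening on its default port 11434, and that the llama3 model has already been downloaded with "ollama pull llama3".

    # Minimal sketch: ask a local Ollama server for a completion via its REST API.
    # Assumes Ollama is running on the default port 11434 and that the "llama3"
    # model has already been pulled (ollama pull llama3).
    import json
    import urllib.request

    payload = {
        "model": "llama3",
        "prompt": "Explain retrieval-augmented generation in one sentence.",
        "stream": False,  # return the whole completion as a single JSON object
    }

    req = urllib.request.Request(
        "http://localhost:11434/api/generate",
        data=json.dumps(payload).encode("utf-8"),
        headers={"Content-Type": "application/json"},
    )

    with urllib.request.urlopen(req) as resp:
        print(json.load(resp)["response"])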

Table of Contents

Introduction
1 Build Your Own AI Lab Introduction

Lesson 1 Introduction to AI Labs and Sandboxes
2 Learning objectives
3 Overview of AI Labs and Sandboxes: Home-based vs. Cloud-based
4 Choosing the Right Hardware (GPUs, CPUs, Memory, etc.)
5 Building or Buying Pre-built Systems
6 Choosing the Operating Systems (Linux, Windows, macOS)
7 Surveying Essential Software (Python, Anaconda, Jupyter Notebooks, and Other Frameworks)
8 Introducing Hugging Face
9 Introducing Ollama
10 Installing Ollama
11 Ollama Integrations
12 Exploring the Ollama REST API
13 Introducing Retrieval Augmented Generation (RAG)
14 Leveraging RAGFlow

Lesson 2 Cloud-Based AI Labs and Sandboxes
15 Learning objectives
16 Advantages and Disadvantages of Cloud AI Labs and Sandboxes
17 Introducing Amazon Bedrock
18 Surveying Amazon SageMaker
19 Exploring Google Vertex AI
20 Using Microsoft Azure AI Foundry
21 Discussing Cost Management and Security
22 Using Terraform to Deploy Ollama in the Cloud

Lesson 3 Integrating and Leveraging AI Environments
23 Learning objectives
24 Using Hybrid AI Labs: Combining Home and Cloud Resources
25 Synchronizing Data and Projects
26 Leveraging the Strengths of Both Environments
27 Running Open-Source Models Available on Hugging Face
28 Introducing LangChain
29 Introducing LlamaIndex
30 Understanding Embedding Models
31 Using Vector Databases

Lesson 4 Advanced Topics
32 Learning objectives
33 Leveraging LangSmith and LangGraph
34 Using Fine-Tuning Frameworks
35 High-Performance Computing and Edge AI

Summary
36 Build Your Own AI Lab Summary
