Generative AI Engineering with LLMs Specialization

English | MP4 | AVC 1280×720 | AAC 44KHz 2ch | 77 Lessons (7h 48m) | 1.27 GB

Advance your ML career with Gen AI and LLMs. Master the essentials of Gen AI engineering and large language models (LLMs) in just 3 months.

What you’ll learn

  • In-demand, job-ready skills in Gen AI, NLP apps, and large language models in just 3 months.
  • How to tokenize and load text data to train LLMs, and how to deploy Skip-Gram, CBOW, Seq2Seq, RNN-based, and Transformer-based models with PyTorch (see the sketch after this list).
  • How to employ frameworks such as LangChain and pre-trained models such as Llama for training, developing, fine-tuning, and deploying LLM applications.
  • How to implement a question-answering NLP system by preparing, developing, and deploying NLP applications using RAG.
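
As a taste of the tokenization and data-loading skills in the second bullet, here is a minimal sketch (not taken from the course labs) of tokenizing a toy corpus and batching it with a PyTorch DataLoader; the whitespace tokenizer, vocabulary, and corpus are illustrative placeholders.

```python
# Minimal sketch: tokenize a toy corpus and batch it with a PyTorch DataLoader.
import torch
from torch.utils.data import DataLoader

corpus = ["the cat sat on the mat", "the dog barked"]  # toy data

def tokenize(text):
    return text.lower().split()  # whitespace tokenizer, for illustration only

# Build a vocabulary mapping each token to an integer id; id 0 is reserved for padding.
vocab = {"<pad>": 0}
for sentence in corpus:
    for tok in tokenize(sentence):
        vocab.setdefault(tok, len(vocab))

def encode(text):
    return torch.tensor([vocab[tok] for tok in tokenize(text)])

def collate(batch):
    # Pad variable-length sequences so they stack into one (batch, seq_len) tensor.
    return torch.nn.utils.rnn.pad_sequence(batch, batch_first=True, padding_value=0)

loader = DataLoader([encode(s) for s in corpus], batch_size=2, collate_fn=collate)
for batch in loader:
    print(batch.shape)  # torch.Size([2, 6])
```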

Skills you’ll gain

  • NLP Model Fine-Tuning
  • Large Language Models
  • Hugging Face Transformers
  • PyTorch
  • Generative AI Architecture

The Gen AI market is expected to grow 46% annually through 2030 (Source: Statista), and Gen AI engineers are in high demand. This program gives aspiring data scientists, machine learning engineers, and AI developers the essential skills in Gen AI, large language models (LLMs), and natural language processing (NLP) that employers need.

Gen AI engineers design systems that understand human language. They use LLMs and machine learning to build these systems.

During this program, you will develop the skills to build apps using frameworks and pre-trained foundation models such as BERT, GPT, and LLaMA. You'll use the Hugging Face Transformers library, the PyTorch deep learning library, RAG, and the LangChain framework to develop and deploy LLM-based NLP apps. Plus, you'll explore tokenization, data loaders, language and embedding models, transformer techniques, attention mechanisms, and prompt engineering.
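
For instance, loading a pre-trained foundation model with the Hugging Face Transformers library takes only a few lines. The sketch below is illustrative rather than course code; it assumes the transformers package is installed and uses GPT-2 simply because it is small and freely available.

```python
# Minimal sketch: load a pre-trained model and generate text with a Hugging Face pipeline.
from transformers import pipeline

generator = pipeline("text-generation", model="gpt2")  # downloads the checkpoint on first run
result = generator("Generative AI engineers build", max_new_tokens=20)
print(result[0]["generated_text"])
```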

Across the series of short courses in this specialization, you'll also gain practical experience through hands-on labs and a project, which makes great preparation for interviews.

This program is ideal for gaining the job-ready skills that Gen AI engineers, machine learning engineers, data scientists, and AI developers require. Note that you need a working knowledge of Python, machine learning, and neural networks; exposure to PyTorch is helpful.

Applied Learning Project

Through the hands-on labs and projects in each course, you will gain practical skills in using LLMs for developing NLP-based applications.

Labs and projects you will complete include:

  • creating an NLP data loader
  • developing and training a language model with a neural network
  • applying transformers for classification, and building and evaluating a translation model
  • engineering prompts and in-context learning (see the sketch after this list)
  • fine-tuning models
  • applying LangChain tools
  • building AI agents and applications with RAG and LangChain
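
To illustrate the prompt-engineering item above, here is a minimal sketch of an in-context (few-shot) prompt built with LangChain's PromptTemplate; the import path (langchain_core) and the example reviews are assumptions made for illustration, not course code.

```python
# Minimal sketch: a few-shot (in-context learning) prompt assembled with LangChain.
from langchain_core.prompts import PromptTemplate

few_shot = PromptTemplate.from_template(
    "Classify the sentiment of each review.\n"
    "Review: 'Loved the labs.' Sentiment: positive\n"
    "Review: 'The setup was confusing.' Sentiment: negative\n"
    "Review: '{review}' Sentiment:"
)

# Fill in the final review; the resulting string would be passed to an LLM
# through a LangChain chat-model or LLM wrapper.
print(few_shot.format(review="Clear videos and useful projects."))
```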

In the final course, you will complete a capstone project, applying what you have learned to develop a question-answering bot through a series of hands-on labs. You begin by loading your document from various sources, then apply text splitting strategies to enhance model responsiveness, and use watsonx for embedding. You’ll also implement RAG to improve retrieval and set up a Gradio interface to construct your QA bot. Finally, you will test and deploy your bot.
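
The sketch below condenses that capstone flow into a minimal, retrieval-only example. It is an assumption-laden illustration rather than the course solution: it swaps the watsonx embeddings for a Hugging Face sentence-transformer, uses a local FAISS index, skips the final answer-generation step, and assumes the langchain-text-splitters, langchain-community, langchain-huggingface, faiss-cpu, and gradio packages are installed.

```python
# Minimal sketch: load a document, split it, embed and index the chunks,
# retrieve relevant chunks for a question, and serve the retriever with Gradio.
import gradio as gr
from langchain_community.vectorstores import FAISS
from langchain_huggingface import HuggingFaceEmbeddings
from langchain_text_splitters import RecursiveCharacterTextSplitter

# 1. Load the source document (hypothetical local file) and split it into chunks.
with open("my_document.txt") as f:
    text = f.read()
splitter = RecursiveCharacterTextSplitter(chunk_size=500, chunk_overlap=50)
chunks = splitter.split_text(text)

# 2. Embed the chunks and index them in a FAISS vector store.
embeddings = HuggingFaceEmbeddings(model_name="sentence-transformers/all-MiniLM-L6-v2")
store = FAISS.from_texts(chunks, embeddings)

# 3. Retrieve the chunks most relevant to a question (the "R" in RAG).
def answer(question):
    docs = store.similarity_search(question, k=3)
    return "\n\n".join(doc.page_content for doc in docs)

# 4. Wrap the retriever in a simple Gradio web interface.
gr.Interface(fn=answer, inputs="text", outputs="text",
             title="QA bot (retrieval only)").launch()
```

In the full capstone, a generation step then feeds the retrieved chunks and the user's question to an LLM so the bot returns an answer rather than raw passages.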

Table of Contents

fundamentals-of-ai-agents-using-rag-and-langchain

rag-framework

welcome-to-the-course
1 course-introduction
2 course-overview_instructions
3 specialization-overview_instructions

introduction-to-rag
4 rag
5 rag-encoders-and-faiss
6 reading-summary-and-highlights_instructions

prompt-engineering-and-langchain

prompt-engineering-and-langchain
7 introduction-to-langchain
8 introduction-to-in-context-learning
9 advanced-methods-of-prompt-engineering
10 langchain-core-concepts
11 langchain-documents-for-building-rag-applications
12 langchain-chains-and-agents-for-building-applications
13 summary-and-highlights_instructions

course-wrap-up
14 course-conclusion_instructions
15 congratulations-and-next-steps_instructions
16 thanks-from-the-course-team_instructions

gen-ai-foundational-models-for-nlp-and-language-understanding

fundamentals-of-language-understanding

welcome
17 course-introduction
18 course-overview_instructions
19 specialization-overview_instructions

language-understanding-with-neural-networks
20 converting-words-to-features
21 document-categorization-prediction-with-torchtext
22 document-categorization-training-with-torchtext
23 training-the-model-in-pytorch
24 summary-and-highlights_instructions

n-gram-model
25 language-modeling-with-n-grams
26 n-grams-as-neural-networks-with-pytorch
27 summary-and-highlights_instructions

word2vec-and-sequence-to-sequence-models

word2vec-sequence-to-sequence-models-and-evaluation
28 word2vec-introduction-and-cbow-models
29 word2vec-skip-gram-and-pretrained-models
30 introduction-to-sequence-to-sequence-models-and-recurrent-neural-networks
31 encoder-decoder-rnn-models-training-and-inference
32 encoder-decoder-rnn-models-translation
33 metrics-for-evaluating-the-quality-of-generated-text
34 ethical-implications-of-word-embeddings-and-language-models_instructions
35 summary-and-highlights_instructions

course-wrap-up
36 course-conclusion_instructions
37 congratulations-and-next-steps_instructions
38 team-and-acknowledgements_instructions

generative-ai-advanced-fine-tuning-for-llms

different-approaches-to-fine-tuning

welcome-to-the-course
39 course-introduction
40 course-overview_instructions
41 specialization-overview_instructions

instruction-tuning-and-reward-modeling
42 basics-of-instruction-tuning
43 instruction-tuning-with-hugging-face
44 best-practices-for-instruction-tuning-large-language-models_instructions
45 reward-modeling-response-evaluation
46 reward-model-training
47 reward-modeling-with-hugging-face
48 summary-and-highlights_instructions

fine-tuning-causal-llms-with-human-feedback-and-direct-preference

ppo
49 large-language-models-llms-as-distributions
50 from-distributions-to-policies
51 reinforcement-learning-from-human-feedback-rlhf
52 proximal-policy-optimization-ppo
53 ppo-with-hugging-face
54 ppo-trainer
55 summary-and-highlights_instructions

dpo
56 dpo-partition-function
57 dpo-optimal-solution
58 from-optimal-policy-to-dpo
59 dpo-with-hugging-face
60 summary-and-highlights_instructions

course-wrap-up
61 course-conclusion_instructions
62 congratulations-and-next-steps_instructions
63 thanks-from-the-course-team_instructions

generative-ai-engineering-and-fine-tuning-transformers

transformers-and-fine-tuning

course-introduction
64 course-introduction
65 course-overview_instructions
66 specialization-overview_instructions
67 helpful-tips-for-course-completion_Helpful_Tips_IBM_SkillsNetwork
68 helpful-tips-for-course-completion_instructions

transfer-learning-in-nlp
69 hugging-face-vs-pytorch
70 using-pre-trained-transformers-and-fine-tuning
71 fine-tuning-with-pytorch
72 fine-tuning-with-hugging-face
73 reading-summary-and-highlights_instructions

parameter-efficient-fine-tuning-peft

parameter-efficient-fine-tuning-peft
74 video-introduction-to-peft
75 low-rank-adaptation-lora
76 lora-with-hugging-face-and-pytorch
77 from-quantization-to-qlora
78 ethical-considerations-in-fine-tuning-large-language-models_instructions
79 summary-and-highlights_instructions

course-wrap-up
80 course-conclusion_instructions
81 congratulations-and-next-steps_instructions
82 thanks-from-the-course-team_instructions

generative-ai-language-modeling-with-transformers

fundamental-concepts-of-transformer-architecture

welcome
83 course-introduction
84 course-overview_instructions
85 specialization-overview_instructions

positional-encoding-attention-and-application-in-classification
86 positional-encoding
87 attention-mechanism
88 self-attention-mechanism
89 from-attention-to-transformers
90 transformers-for-classification-encoder
91 optimization-techniques-for-efficient-transformer-training_instructions
92 summary-and-highlights_instructions

advanced-concepts-of-transformer-architecture

decoder-models
93 language-modeling-with-the-decoders-and-gpt-like-models
94 training-decoder-models
95 decoder-models-pytorch-implementation-causal-lm
96 decoder-models-pytorch-implementation-using-training-and-inference
97 summary-and-highlights_instructions

encoder-models
98 encoder-models-with-bert-pretraining-using-mlm
99 encoder-models-with-bert-pretraining-using-nsp
100 data-preparation-for-bert-with-pytorch
101 pretraining-bert-models-with-pytorch
102 summary-and-highlights_instructions

application-of-transformers-for-translation
103 transformer-architecture-for-language-translation
104 transformer-architecture-for-translation-pytorch-implementation
105 summary-and-highlights_instructions

course-wrap-up
106 course-conclusion_instructions
107 thanks-from-the-course-team_instructions
108 congratulations-and-next-steps_instructions

generative-ai-llm-architecture-data-preparation

generative-ai-architecture

welcome
109 overview-of-ai-engineering-with-llms
110 course-introduction
111 course-overview_instructions

generative-ai-overview-and-architecture
112 significance-of-generative-ai
113 generative-ai-architectures-and-models
114 generative-ai-for-nlp
115 summary-and-highlights_instructions

data-preparation-for-llms

preparing-data
116 tokenization
117 overview-of-data-loaders
118 data-quality-and-diversity-for-effective-llm-training_instructions
119 summary-and-highlights_instructions

course-wrap-up
120 course-conclusion_instructions
121 congratulations-and-next-steps_instructions
122 team-and-acknowledgments_instructions

project-generative-ai-applications-with-rag-and-langchain

document-loader-using-langchain

welcome-to-the-course
123 course-introduction
124 course-overview_instructions
125 specialization-overview_instructions

different-document-loaders-from-langchain
126 load-your-document-from-different-sources
127 best-practices-for-loading-documents-in-langchain-applications_instructions

text-splitter
128 strategies-for-splitting-text-for-optimal-processing

module-summary
129 reading-summary-and-highlights_instructions

rag-using-langchain

embedding-the-document
130 introduction-to-vector-databases-for-storing-embeddings

retriever
131 explore-advanced-retrievers-in-langchain-part-1
132 explore-advanced-retrievers-in-langchain-part-2

rag-using-langchain-summary
133 module-summary-rag-using-langchain_instructions

create-a-qa-bot-to-read-your-document

introduction-to-gradio
134 getting-started-with-gradio

build-a-qa-bot-web-app-summary
135 module-summary-create-a-qa-bot-to-read-your-document_instructions

course-wrap-up
136 course-conclusion_instructions
137 congratulations-and-next-steps_instructions
138 thanks-from-the-course-team_instructions
