Ensemble Methods for Machine Learning, Video Edition

English | MP4 | AVC 1280×720 | AAC 44.1 kHz 2ch | 55 Lessons (11h 05m) | 1.52 GB

Ensemble machine learning combines the power of multiple machine learning approaches, working together to deliver models that are highly performant and highly accurate.

Inside Ensemble Methods for Machine Learning you will find:

  • Methods for classification, regression, and recommendations
  • Sophisticated off-the-shelf ensemble implementations
  • Random forests, boosting, and gradient boosting
  • Feature engineering and ensemble diversity
  • Interpretability and explainability for ensemble methods

Ensemble machine learning trains a diverse group of machine learning models to work together, aggregating their output to deliver richer results than a single model. Now in Ensemble Methods for Machine Learning you’ll discover core ensemble methods that have proven records in both data science competitions and real-world applications. Hands-on case studies show you how each algorithm works in production. By the time you’re done, you’ll know the benefits, limitations, and practical methods of applying ensemble machine learning to real-world data, and be ready to build more explainable ML systems.

Automatically compare, contrast, and blend the output from multiple models to squeeze the best results from your data. Ensemble machine learning applies a “wisdom of crowds” method that dodges the inaccuracies and limitations of a single model. By basing responses on multiple perspectives, this innovative approach can deliver robust predictions even without massive datasets.
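The "wisdom of crowds" idea can be sketched in a few lines. This is a minimal illustration, assuming scikit-learn (the course's own case studies may use different tools and datasets): three models with different inductive biases each vote on every prediction, and the majority wins, so one model's mistakes can be outvoted by the other two.

```python
# Hedged sketch of majority-vote ensembling with scikit-learn (an assumption;
# not taken from the course). The dataset here is synthetic.
from sklearn.datasets import make_classification
from sklearn.ensemble import VotingClassifier
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split
from sklearn.naive_bayes import GaussianNB
from sklearn.tree import DecisionTreeClassifier

# Synthetic binary classification problem
X, y = make_classification(n_samples=500, n_features=20, random_state=42)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=42)

# Diverse base learners: different biases mean different mistakes,
# which is what makes aggregating their votes worthwhile
ensemble = VotingClassifier(
    estimators=[
        ("lr", LogisticRegression(max_iter=1000)),
        ("tree", DecisionTreeClassifier(max_depth=5, random_state=42)),
        ("nb", GaussianNB()),
    ],
    voting="hard",  # majority vote on predicted class labels
)
ensemble.fit(X_train, y_train)
print(f"Ensemble test accuracy: {ensemble.score(X_test, y_test):.3f}")
```

Swapping `voting="hard"` for `voting="soft"` averages the models' predicted probabilities instead of counting class votes, which is the other common way to blend outputs.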

Ensemble Methods for Machine Learning teaches you practical techniques for applying multiple ML approaches simultaneously. Each chapter contains a unique case study that demonstrates a fully functional ensemble method, with examples including medical diagnosis, sentiment analysis, handwriting classification, and more. There’s no complex math or theory—you’ll learn in a visuals-first manner, with ample code for easy experimentation!

What’s inside

  • Bagging, boosting, and gradient boosting
  • Methods for classification, regression, and retrieval
  • Interpretability and explainability for ensemble methods
  • Feature engineering and ensemble diversity

Table of Contents

1 Ensemble methods: Hype or hallelujah?
2 Fit vs. complexity in individual models
3 Our first ensemble
4 Summary
5 Terminology and taxonomy for ensemble methods
6 Why you should care about ensemble learning
7 Bagging: Bootstrap aggregating
8 Case study: Breast cancer diagnosis
9 Homogeneous parallel ensembles: Bagging and random forests
10 More homogeneous parallel ensembles
11 Random forests
12 Summary
13 Case study: Sentiment analysis
14 Combining predictions by meta-learning
15 Combining predictions by weighting
16 Heterogeneous parallel ensembles: Combining strong learners
17 Summary
18 AdaBoost: Adaptive boosting
19 AdaBoost in practice
20 Case study: Handwritten digit classification
21 LogitBoost: Boosting with the logistic loss
22 Sequential ensembles: Adaptive boosting
23 Summary
24 Case study: Document retrieval
25 Gradient boosting: Gradient descent + boosting
26 LightGBM: A framework for gradient boosting
27 LightGBM in practice
28 Sequential ensembles: Gradient boosting
29 Summary
30 Case study redux: Document retrieval
31 Newton boosting: Newton’s method + boosting
32 Sequential ensembles: Newton boosting
33 Summary
34 XGBoost: A framework for Newton boosting
35 XGBoost in practice
36 Case study: Demand forecasting
37 Learning with continuous and count labels
38 Parallel ensembles for regression
39 Sequential ensembles for regression
40 Summary
41 Case study: Income prediction
42 CatBoost: A framework for ordered boosting
43 Encoding high-cardinality string features
44 Learning with categorical features
45 Summary
46 Black-box methods for global explainability
47 Black-box methods for local explainability
48 Case study: Data-driven marketing
49 Explaining your ensembles
50 Glass-box ensembles: Training for interpretability
51 Summary
52 Epilogue
53 The basics of ensembles
54 Essential ensemble methods
55 Ensembles in the wild: Adapting ensemble methods to your data
