Files

Download

Download Full Text (11.9 MB)

Course

Machine Learning

Description

This presentation explores how to get the best out of models affected by bias and variance; meta-learning minimizes the resulting loss. Ensemble methods, including Bagging, AdaBoost, Random Forest, Gradient Boosting, and Stacking, are discussed. These methods perturb the data (X or Y) using techniques such as bootstrap sampling, k-fold sampling, weighted sampling, and random subspaces. Component models are trained in parallel or sequentially, and their outputs are aggregated by mean, mode, weighted response, or a metamodel. The presentation also touches upon deep learning and one-shot learning, and explains why distances become less meaningful in high dimensions.
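The bagging recipe described above (perturb the data by bootstrap sampling, train models in parallel, aggregate by mean) can be sketched as follows. This is a minimal illustration, not code from the presentation; the base learner here is a simple least-squares line fit standing in for a decision tree, and the function names are chosen for this example.

```python
import numpy as np

rng = np.random.default_rng(0)

def bootstrap_bagging(X, y, n_models, fit, predict):
    """Train n_models base learners on bootstrap resamples of (X, y)."""
    n = len(X)
    models = []
    for _ in range(n_models):
        # Bootstrap sample: draw n rows with replacement (perturbs X and Y).
        idx = rng.integers(0, n, size=n)
        models.append(fit(X[idx], y[idx]))

    def ensemble(Xq):
        # Aggregation strategy: mean of the parallel models' predictions.
        return np.mean([predict(m, Xq) for m in models], axis=0)

    return ensemble

# Illustrative base learner: degree-1 polynomial (line) fit.
fit = lambda X, y: np.polyfit(X, y, deg=1)
predict = lambda coeffs, Xq: np.polyval(coeffs, Xq)

# Toy data: y = 2x plus noise.
X = np.linspace(0.0, 1.0, 50)
y = 2.0 * X + rng.normal(scale=0.1, size=50)

ens = bootstrap_bagging(X, y, n_models=25, fit=fit, predict=predict)
print(ens(np.array([0.5])))
```

Averaging over bootstrap replicas mainly reduces the variance component of the error, which is why bagging helps most with unstable, low-bias learners.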

More details: https://events.vtools.ieee.org/m/315184

Video Recording: https://ieeetv.ieee.org/video/meta-algorithms-in-machine-learning

Publication Date

Spring 5-31-2022

Document Type

Presentation

Keywords

Machine Learning, Artificial Intelligence, Ensemble Methods, AdaBoost, Random Forest, Gradient Boosting, Stacking, Perturb data, Bootstrap sampling, K-fold sampling, Weighted sampling, Random subspaces, Parallel models, Sequential models, Aggregation strategies, Mean, Mode, Weighted response, Metamodel, Deep learning, One-shot learning, Distances, High dimensions

Disciplines

Computational Engineering | Computer Engineering | Engineering

Creative Commons License

This work is licensed under a Creative Commons Attribution-NoDerivatives 4.0 International License.

Meta-algorithms in Machine Learning
