Implement machine learning algorithms to build ensemble models using Keras, H2O, Scikit-Learn, Pandas and more
Key Features
Apply popular machine learning algorithms using a recipe-based approach
Implement boosting, bagging, and stacking ensemble methods to improve machine learning models
Discover real-world ensemble applications and encounter complex challenges in Kaggle competitions
Book Description
Ensemble modeling is an approach used to improve the performance of machine learning models. It combines two or more similar or dissimilar machine learning algorithms to deliver superior predictive power. This book will help you to implement popular machine learning algorithms to cover different paradigms of ensemble machine learning, such as boosting, bagging, and stacking.
The Ensemble Machine Learning Cookbook will start by getting you acquainted with the basics of ensemble techniques and exploratory data analysis. You'll then learn to implement tasks related to statistical and machine learning algorithms to understand how an ensemble of multiple heterogeneous algorithms works. It will also ensure that you don't miss out on key topics, such as resampling methods. As you progress, you'll get a better understanding of bagging, boosting, stacking, and working with the Random Forest algorithm using real-world examples. The book will highlight how these ensemble methods use multiple models to improve machine learning results compared to a single model. In the concluding chapters, you'll delve into advanced ensemble models using neural networks, natural language processing, and more. You'll also be able to implement models for tasks such as fraud detection, text categorization, and sentiment analysis.
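To make the contrast with a single model concrete, here is a minimal sketch (not taken from the book) that uses scikit-learn to compare a lone decision tree against a bagged ensemble of trees; the synthetic dataset, estimators, and parameters are illustrative assumptions only.

```python
# Illustrative sketch: a single decision tree versus a bagged ensemble of trees.
from sklearn.datasets import make_classification
from sklearn.model_selection import cross_val_score
from sklearn.tree import DecisionTreeClassifier
from sklearn.ensemble import BaggingClassifier

# Synthetic classification data stands in for a real-world dataset.
X, y = make_classification(n_samples=1000, n_features=20, random_state=42)

# Baseline: one decision tree.
single_tree = DecisionTreeClassifier(random_state=42)

# Bagging: 100 trees, each trained on a bootstrap sample of the data.
bagged_trees = BaggingClassifier(
    estimator=DecisionTreeClassifier(random_state=42),  # 'base_estimator' in older scikit-learn versions
    n_estimators=100,
    random_state=42,
)

print("Single tree :", cross_val_score(single_tree, X, y, cv=5).mean())
print("Bagged trees:", cross_val_score(bagged_trees, X, y, cv=5).mean())
```

The bagged ensemble typically scores higher here, because averaging over bootstrap samples reduces the variance of the individual trees.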
By the end of this book, you'll be able to harness ensemble techniques and the working mechanisms of machine learning algorithms to build intelligent models using individual recipes.
What you will learn
Understand how to use machine learning algorithms for regression and classification problems
Implement ensemble techniques such as averaging, weighted averaging, and max-voting (a short max-voting sketch follows this list)
Get to grips with advanced ensemble methods, such as bootstrapping, bagging, and stacking
Use Random Forest for tasks such as classification and regression
Implement an ensemble of homogeneous and heterogeneous machine learning algorithms
Learn and implement various boosting techniques, such as AdaBoost, Gradient Boosting Machine, and XGBoost
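As a flavour of the max-voting recipe mentioned above, the sketch below (an illustrative assumption, not the book's own code) combines three dissimilar scikit-learn classifiers into a hard-voting ensemble on the built-in breast-cancer dataset.

```python
# Illustrative sketch: max-voting over heterogeneous models with VotingClassifier.
from sklearn.datasets import load_breast_cancer
from sklearn.model_selection import train_test_split
from sklearn.linear_model import LogisticRegression
from sklearn.tree import DecisionTreeClassifier
from sklearn.ensemble import RandomForestClassifier, VotingClassifier

X, y = load_breast_cancer(return_X_y=True)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=42)

# Three dissimilar base learners vote on each prediction; the majority class wins.
voting_clf = VotingClassifier(
    estimators=[
        ("lr", LogisticRegression(max_iter=5000)),
        ("dt", DecisionTreeClassifier(random_state=42)),
        ("rf", RandomForestClassifier(n_estimators=100, random_state=42)),
    ],
    voting="hard",  # "hard" = max-voting; "soft" averages predicted probabilities
)

voting_clf.fit(X_train, y_train)
print("Voting ensemble accuracy:", voting_clf.score(X_test, y_test))
```

Switching voting="hard" to voting="soft" (optionally with the weights parameter) turns the same ensemble into a weighted average of predicted probabilities.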
Who this book is for
This book is designed for data scientists, machine learning developers, and deep learning enthusiasts who want to delve into machine learning algorithms to build powerful ensemble models. Working knowledge of Python programming and basic statistics is a must to help you grasp the concepts in the book.