Wine 🍷 Dataset Exploration, XGBoost Regression, and Hyperparameter Tuning with Optuna & AutoML

Part 1 - Exploratory Data Analysis (EDA) with dataprep

Objective: Understand the data distribution -> Python_file_EDA // Notebook_EDA
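
A minimal sketch of the dataprep step, assuming the wine reviews are loaded from a CSV file (the file name below is a placeholder, not the repository's actual path):

```python
import pandas as pd
from dataprep.eda import create_report

# Load the wine reviews (placeholder path -- adjust to the actual dataset file)
df = pd.read_csv("winemag-data.csv")

# Generate a full interactive EDA report: distributions, missing values, correlations
report = create_report(df)
report.save("wine_eda_report.html")
```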

Part 2 - Data Extraction / Feature Engineering / Data Encoding

Objective: Extract features from text fields and one-hot encode categorical features -> Data_Processing
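
A hedged sketch of the kind of processing this step performs, assuming columns named description, country, and province (the actual column names and transformations in Data_Processing may differ):

```python
import pandas as pd

# Placeholder path -- same dataset as in Part 1
df = pd.read_csv("winemag-data.csv")

# Simple numeric features extracted from the free-text description field
df["description_length"] = df["description"].str.len()
df["description_word_count"] = df["description"].str.split().str.len()

# One-hot encode categorical origin features
df = pd.get_dummies(df, columns=["country", "province"], drop_first=True)
```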

Part 3 - Train XGBoost to Predict Wine Score, Hyperparameter Tuning with Optuna, Feature Importance

Objectives: Train the model, tune it with Bayesian hyperparameter optimization (Optuna), and evaluate feature importance -> Notebook // Python_file

  • A) Create an XGBoost model to predict wine score from wine origin, price, and description features.

  • B) Use Optuna to tune hyperparameters and evaluate which hyperparameters matter most (see the sketch after this list).

  • C) Using the tuned parameters, evaluate model feature importance.
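
A minimal sketch of the Optuna study, assuming a feature matrix X and target y (the wine score) were built in Part 2; the search space and trial count are illustrative, not the repository's exact configuration:

```python
import optuna
import xgboost as xgb
from sklearn.metrics import mean_squared_error
from sklearn.model_selection import train_test_split

# X, y are the processed features and wine score from Part 2 (assumed to exist)
X_train, X_valid, y_train, y_valid = train_test_split(X, y, test_size=0.2, random_state=42)

def objective(trial):
    # Illustrative search space -- adjust ranges to the actual problem
    params = {
        "max_depth": trial.suggest_int("max_depth", 3, 10),
        "learning_rate": trial.suggest_float("learning_rate", 1e-3, 0.3, log=True),
        "n_estimators": trial.suggest_int("n_estimators", 100, 1000),
        "subsample": trial.suggest_float("subsample", 0.5, 1.0),
        "colsample_bytree": trial.suggest_float("colsample_bytree", 0.5, 1.0),
    }
    model = xgb.XGBRegressor(**params)
    model.fit(X_train, y_train)
    preds = model.predict(X_valid)
    return mean_squared_error(y_valid, preds)

study = optuna.create_study(direction="minimize")
study.optimize(objective, n_trials=50)

print(study.best_params)
# Which hyperparameters influenced the objective most during the search
print(optuna.importance.get_param_importances(study))

# Refit with the best parameters and inspect feature importance
best_model = xgb.XGBRegressor(**study.best_params).fit(X_train, y_train)
print(sorted(zip(best_model.feature_importances_, X_train.columns), reverse=True)[:10])
```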

Part 4 - AutoML with mljar-supervised

Objective: Run AutoML on the same wine dataset and compare performance -> Notebook
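
A short sketch of the mljar-supervised run, again assuming the X and y from Part 2; the mode and time limit are illustrative defaults, not the repository's settings:

```python
from sklearn.metrics import mean_squared_error
from sklearn.model_selection import train_test_split
from supervised.automl import AutoML

# Same processed features/target as in Part 3 (assumed to exist)
X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.2, random_state=42)

# "Compete" mode runs a broader model search; total_time_limit is in seconds
automl = AutoML(mode="Compete", total_time_limit=3600)
automl.fit(X_train, y_train)

predictions = automl.predict(X_test)
print(mean_squared_error(y_test, predictions))
```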
