
SONAR Rock vs Mine Prediction with 9 SkLearn & XGBoost Algorithms + Bayesian Hyperparameters [2024]

Dr. Maryam Miradi

#scikitlearn #xgboost #datascienceprojects #hyperparametertuning
This is the first video in a series on end-to-end data science projects with machine learning and deep learning. Transform your data science skills with this end-to-end project on the Kaggle SONAR submarine dataset (60 features, Mine vs. Rock): a supervised-learning classification project featuring 9 algorithms from the Python libraries Scikit-Learn and XGBoost, including AdaBoost, plus XGBoost hyperparameter tuning with Hyperopt. Learn essential techniques like feature engineering, k-fold cross-validation for model selection, and the area under the precision-recall curve for imbalanced data. Master Python libraries such as Scikit-Learn, XGBoost, and Seaborn, and build a strong data science portfolio with the full code included.
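A minimal data-loading sketch, assuming the Kaggle SONAR CSV (no header row, 60 numeric columns followed by an R/M label column); the file name and split settings are assumptions, not necessarily what the video uses:

import pandas as pd
from sklearn.model_selection import train_test_split

# Kaggle SONAR CSV: 60 numeric columns followed by an 'R'/'M' label column.
# The file name is an assumption; adjust it to your download.
df = pd.read_csv("sonar.all-data.csv", header=None)
X = df.iloc[:, :60]                      # 60 sonar energy features
y = (df.iloc[:, 60] == "M").astype(int)  # 1 = Mine, 0 = Rock

# Stratified hold-out split keeps the Rock/Mine ratio in the test set
X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.2, stratify=y, random_state=42
)
print(X_train.shape, y_train.value_counts().to_dict())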

===========================
Get Access to 20+ Years Experience in AI:
===========================
⚡Free guide: https://www.maryammiradi.com/freeguide
⚡AI Training: https://www.maryammiradi.com/training

===========================
Connect with Me!
=============================
LinkedIn ➡   / maryammiradi

===========================
Mentioned in this Video
=============================
Link to Python Code ➡
https://colab.research.google.com/dri...
Data Science Projects ➡    • Data Science Projects End to End  

⏰ Timecodes ⏰

0:00 Introduction
0:09 Start of Hands-on Tutorial in Google Colab: Data Exploration
8:12 Data Preparation, including Data Cleaning and Formatting, and Visualizations using Seaborn and Matplotlib
14:11 Feature Engineering, including Feature Transformation, Encoding and a Custom Scikit-Learn Pipeline (sketch below)
26:00 Building 9 Classification Models with Scikit-Learn and XGBoost and Model Comparison (sketch below)
34:33 Hyperparameter Search for AdaBoost and Logistic Regression with Grid Search (sketch below)
39:03 Improving the Area-under-Precision-Recall Plot with the Savitzky-Golay Filter from the SciPy Library (sketch below)
40:25 Model Selection using the AUC-PR Metric and k-Fold Cross-Validation, with Results in a Confusion Matrix
44:20 Model Fine-Tuning: XGBoost Hyperparameter Tuning with Hyperopt using Bayesian Optimization (sketch below)
51:16 Building the Final XGBoost Model in a Scikit-Learn Pipeline
51:30 Testing XGBoost Results on the Test Set using a Confusion Matrix and the Area under the Precision-Recall Curve (AUC-PR) (sketch below)
53:51 Outro
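===========================
Code Sketches (Illustrative)
===========================

The sketches below follow the steps in the timecodes and continue from the data-loading sketch above. They are outlines under stated assumptions, not the exact code from the video (the full notebook is linked above).

Feature engineering with a custom Scikit-Learn pipeline. The specific transformations used in the video are not listed in this description, so the log-scaling step here is a hypothetical example:

import numpy as np
from sklearn.base import BaseEstimator, TransformerMixin
from sklearn.pipeline import Pipeline
from sklearn.preprocessing import StandardScaler

class Log1pTransformer(BaseEstimator, TransformerMixin):
    """Apply log(1 + x) to compress the dynamic range of the sonar energies."""
    def fit(self, X, y=None):
        return self
    def transform(self, X):
        return np.log1p(X)

feature_pipeline = Pipeline(steps=[
    ("log", Log1pTransformer()),   # hypothetical transformation step
    ("scale", StandardScaler()),   # zero mean / unit variance
])

X_train_prepared = feature_pipeline.fit_transform(X_train)
X_test_prepared = feature_pipeline.transform(X_test)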
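Building 9 classification models and comparing them with stratified k-fold cross-validation on the AUC-PR (average-precision) metric. The exact nine algorithms used in the video are not listed in the description, so this line-up is an assumption:

from sklearn.model_selection import StratifiedKFold, cross_val_score
from sklearn.linear_model import LogisticRegression
from sklearn.neighbors import KNeighborsClassifier
from sklearn.svm import SVC
from sklearn.tree import DecisionTreeClassifier
from sklearn.ensemble import (RandomForestClassifier, AdaBoostClassifier,
                              GradientBoostingClassifier)
from sklearn.naive_bayes import GaussianNB
from xgboost import XGBClassifier

models = {
    "LogisticRegression": LogisticRegression(max_iter=1000),
    "KNN": KNeighborsClassifier(),
    "SVC": SVC(probability=True),
    "DecisionTree": DecisionTreeClassifier(random_state=42),
    "RandomForest": RandomForestClassifier(random_state=42),
    "AdaBoost": AdaBoostClassifier(random_state=42),
    "GradientBoosting": GradientBoostingClassifier(random_state=42),
    "GaussianNB": GaussianNB(),
    "XGBoost": XGBClassifier(eval_metric="logloss", random_state=42),
}

cv = StratifiedKFold(n_splits=5, shuffle=True, random_state=42)
for name, model in models.items():
    # 'average_precision' approximates the area under the precision-recall curve
    scores = cross_val_score(model, X_train_prepared, y_train,
                             cv=cv, scoring="average_precision")
    print(f"{name:20s} AUC-PR = {scores.mean():.3f} +/- {scores.std():.3f}")

Average precision (AUC-PR) is used as the scoring metric because it is more informative than accuracy when the Rock/Mine classes are not perfectly balanced.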
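Grid search for AdaBoost and Logistic Regression; the parameter grids below are assumed, not the ones shown in the video:

from sklearn.model_selection import GridSearchCV
from sklearn.ensemble import AdaBoostClassifier
from sklearn.linear_model import LogisticRegression

grids = {
    "AdaBoost": (AdaBoostClassifier(random_state=42),
                 {"n_estimators": [50, 100, 200],
                  "learning_rate": [0.01, 0.1, 1.0]}),
    "LogisticRegression": (LogisticRegression(max_iter=1000),
                           {"C": [0.01, 0.1, 1.0, 10.0]}),
}

for name, (estimator, param_grid) in grids.items():
    search = GridSearchCV(estimator, param_grid, cv=cv,
                          scoring="average_precision", n_jobs=-1)
    search.fit(X_train_prepared, y_train)
    print(name, search.best_params_, f"AUC-PR = {search.best_score_:.3f}")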
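Smoothing a jagged precision-recall curve with SciPy's Savitzky-Golay filter for a cleaner plot; the random-forest stand-in model, window length and polynomial order are placeholders:

import matplotlib.pyplot as plt
from scipy.signal import savgol_filter
from sklearn.ensemble import RandomForestClassifier
from sklearn.metrics import precision_recall_curve

# Any fitted probabilistic model will do; a random forest stands in here.
model = RandomForestClassifier(random_state=42).fit(X_train_prepared, y_train)
probs = model.predict_proba(X_test_prepared)[:, 1]
precision, recall, _ = precision_recall_curve(y_test, probs)

# window_length must be odd and <= len(precision); 11 and 3 are assumed values
smooth_precision = savgol_filter(precision, window_length=11, polyorder=3)

plt.plot(recall, precision, alpha=0.4, label="raw")
plt.plot(recall, smooth_precision, label="Savitzky-Golay smoothed")
plt.xlabel("Recall")
plt.ylabel("Precision")
plt.legend()
plt.show()

The smoothing only tidies the plot; AUC-PR itself should still be computed from the raw precision and recall values.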
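XGBoost hyperparameter tuning with Hyperopt's Tree-structured Parzen Estimator (Bayesian optimization); the search space and number of evaluations are assumptions:

import numpy as np
from hyperopt import fmin, tpe, hp, STATUS_OK, Trials
from sklearn.model_selection import cross_val_score
from xgboost import XGBClassifier

space = {
    "max_depth": hp.quniform("max_depth", 3, 10, 1),
    "learning_rate": hp.loguniform("learning_rate", np.log(0.01), np.log(0.3)),
    "n_estimators": hp.quniform("n_estimators", 100, 500, 50),
    "subsample": hp.uniform("subsample", 0.6, 1.0),
    "colsample_bytree": hp.uniform("colsample_bytree", 0.6, 1.0),
}

def objective(params):
    model = XGBClassifier(
        max_depth=int(params["max_depth"]),
        learning_rate=params["learning_rate"],
        n_estimators=int(params["n_estimators"]),
        subsample=params["subsample"],
        colsample_bytree=params["colsample_bytree"],
        eval_metric="logloss",
        random_state=42,
    )
    score = cross_val_score(model, X_train_prepared, y_train,
                            cv=cv, scoring="average_precision").mean()
    return {"loss": -score, "status": STATUS_OK}  # Hyperopt minimizes the loss

trials = Trials()
best = fmin(fn=objective, space=space, algo=tpe.suggest,
            max_evals=50, trials=trials)
print("Best parameters found:", best)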
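Final model: the feature pipeline and a tuned XGBoost classifier wrapped in one Scikit-Learn Pipeline, evaluated on the held-out test set with a confusion matrix and AUC-PR. The hyperparameter values below are placeholders for whatever the Hyperopt search returns:

from sklearn.pipeline import Pipeline
from sklearn.metrics import (confusion_matrix, classification_report,
                             average_precision_score)
from xgboost import XGBClassifier

final_model = Pipeline(steps=[
    ("features", feature_pipeline),  # same preprocessing as during training
    ("xgb", XGBClassifier(
        max_depth=4, learning_rate=0.1, n_estimators=300,
        subsample=0.9, colsample_bytree=0.8,
        eval_metric="logloss", random_state=42)),
])

# Fit on the raw training features; the pipeline applies the transformations,
# which keeps preprocessing and model together and avoids leakage at test time.
final_model.fit(X_train, y_train)
test_preds = final_model.predict(X_test)
test_probs = final_model.predict_proba(X_test)[:, 1]

print(confusion_matrix(y_test, test_preds))
print(classification_report(y_test, test_preds, target_names=["Rock", "Mine"]))
print("Test AUC-PR:", round(average_precision_score(y_test, test_probs), 3))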

====================

✍ Please leave any questions you have about AI & Data Science in the comments!

#ai #machinelearning #python
