Talks # 4: Sebastien Fischman - Pytorch-TabNet: Beating XGBoost on Tabular Data Using Deep Learning
Speaker: Sebastien Fischman (https://www.linkedin.com/in/sebastienfischman/)
Title : Pytorch-tabnet : Beating XGBoost on tabular data with deep learning?
Abstract: #DeepLearning has set new benchmarks in Computer Vision, NLP, Speech, and Reinforcement Learning over the past few years.
However, tabular data competitions are still dominated by gradient boosted tree (GBT) libraries such as XGBoost, LightGBM, and CatBoost.
TabNet is a promising new deep learning architecture based on sequential attention, proposed by Arik & Pfister, that aims to close the gap between GBTs and neural networks.
Pytorch-tabnet is an open source library that provides a scikit-learn-like interface for training a TabNetClassifier or TabNetRegressor. Its ease of use allows any developer to quickly try the #TabNet architecture on any dataset, hopefully setting new benchmarks.
Bio: Sebastien has worked as a Data Scientist in France and Australia on a wide range of topics:
- user segmentation based on shopping habits for Woolworths @Quantium
- real-time bidding advertising @Tradelab
- stock market predictions based on sentiment analysis of social media @SESAMm
- AutoML platform with explainable AI @DreamQuark
- now working on early-stage cancer detection from new OCT-3D images @DamaeMedical
To give a talk in Talks, fill out this form here: https://bit.ly/AbhishekTalks
----
Follow me on:
Twitter: https://twitter.com/abhi1thakur
LinkedIn: https://www.linkedin.com/in/abhi1thakur/
Kaggle: https://kaggle.com/abhishek
Video: Talks # 4: Sebastien Fischman - Pytorch-TabNet: Beating XGBoost on Tabular Data Using Deep Learning, from the channel Abhishek Thakur