Chris Moody introduces lda2vec
Chris speaks at data.bythebay.io!
Q&A with Chris and Alexy: https://youtu.be/GXAgzxivze4
Standard natural language processing (NLP) is a messy and difficult affair. It requires teaching a computer about English-specific word ambiguities as well as the hierarchical, sparse nature of words in sentences. At Stitch Fix, word vectors help computers learn from the raw text in customer notes. Our systems need to identify a medical professional when she writes that she 'used to wear scrubs to work', and distill 'taking a trip' into a Fix for vacation clothing. Applied appropriately, word vectors are dramatically more meaningful and more flexible than current techniques and let computers peer into text in a fundamentally new way. I'll try to convince you that word vectors give us a simple and flexible platform for understanding text as I speak about word2vec and LDA and introduce our hybrid algorithm, lda2vec.
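To give a flavor of what "meaningful" means here: word vectors support arithmetic, so relationships between words become directions in vector space. Below is a minimal sketch of the classic analogy test using hand-picked toy vectors (real word2vec embeddings are learned from a corpus and have hundreds of dimensions; these 3-d vectors and the toy vocabulary are illustrative assumptions, not output of any real model).

```python
import numpy as np

# Hand-picked toy "word vectors" for illustration only; real embeddings
# are learned from data, not written down by hand.
vectors = {
    "king":  np.array([0.9, 0.8, 0.1]),
    "queen": np.array([0.9, 0.1, 0.8]),
    "man":   np.array([0.1, 0.9, 0.1]),
    "woman": np.array([0.1, 0.1, 0.9]),
    "apple": np.array([0.0, 0.2, 0.1]),
}

def cosine(a, b):
    # Cosine similarity: 1.0 means the vectors point the same way.
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))

def nearest(target, exclude):
    # The vocabulary word whose vector is most similar to `target`.
    return max((w for w in vectors if w not in exclude),
               key=lambda w: cosine(vectors[w], target))

# The classic analogy: king - man + woman ≈ queen
result = nearest(vectors["king"] - vectors["man"] + vectors["woman"],
                 exclude={"king", "man", "woman"})
print(result)
```

With learned embeddings the same arithmetic recovers many such analogies; LDA, by contrast, models documents as mixtures of topics, and lda2vec combines the two views.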
Scalæ By the Bay 2016 conference
http://scala.bythebay.io
-- is held on November 11-13, 2016 at Twitter, San Francisco, to share best practices in building data pipelines, with three tracks:
* Functional and Type-safe Programming
* Reactive Microservices and Streaming Architectures
* Data Pipelines for Machine Learning and AI
Video "Chris Moody introduces lda2vec" from the FunctionalTV channel