Fireside Chat with Christopher Manning
Christopher Manning is a Professor of Computer Science and Linguistics at Stanford University. He received his Ph.D. from Stanford in 1995 and held faculty positions at Carnegie Mellon University and the University of Sydney before returning to Stanford. He is a fellow of the ACM, AAAI, and the Association for Computational Linguistics. Manning has coauthored leading textbooks on statistical approaches to natural language processing (Manning and Schuetze, 1999) and information retrieval (Manning, Raghavan, and Schuetze, 2008). His recent work has concentrated on probabilistic approaches to natural language processing (NLP) and computational semantics, including statistical parsing, robust textual inference, machine translation, large-scale joint inference for NLP, computational pragmatics, and hierarchical deep learning for NLP.
See more at https://www.microsoft.com/en-us/research/video/fireside-chat-with-christopher-manning/
Video: Fireside Chat with Christopher Manning, from the Microsoft Research channel.