Data Are Gold, So Why Share Your LLM?
Merged LLMs are the future, and in this episode @JonKrohnLearns explores how with Mark McQuade and Charles Goddard of Arcee AI. Learn how to combine multiple LLMs without adding bulk, train more efficiently, and compare expert approaches such as Mixture of Experts (MoE) and Mixture of Agents. Discover how smaller models can outperform larger ones and how open-source projects can deliver big enterprise wins. This episode is packed with must-know insights for data scientists and ML engineers. Don’t miss out!
Watch the full interview “801: Merged LLMs Are Smaller And More Capable — with Arcee AI's Mark McQuade and Charles Goddard” here: https://www.superdatascience.com/801
Topics covered: small language models (SLMs) vs. large language models, Mixture of Experts (MoE) vs. Mixture of Agents, MergeKit, model merging and Evolutionary Model Merging, and the Spectrum project, with Charles Goddard, Chief of Frontier Research at Arcee AI. A minimal merging illustration follows.
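For a concrete picture of what model merging means, here is a minimal sketch of the simplest technique, linear weight averaging, which combines two fine-tunes of the same base model into a single model of unchanged size. The checkpoint names are hypothetical placeholders, and this illustrates the general idea rather than the MergeKit pipeline discussed in the episode.

```python
# A minimal sketch of linear model merging (simple weight averaging).
# NOTE: "org/model-a" and "org/model-b" are hypothetical placeholders;
# both must be fine-tunes of the same base architecture so parameter
# names and shapes line up. This shows the core idea behind merging,
# not Arcee AI's MergeKit implementation.
import torch
from transformers import AutoModelForCausalLM

model_a = AutoModelForCausalLM.from_pretrained("org/model-a", torch_dtype=torch.float32)
model_b = AutoModelForCausalLM.from_pretrained("org/model-b", torch_dtype=torch.float32)

alpha = 0.5  # interpolation weight between the two checkpoints
state_b = model_b.state_dict()
merged_state = {
    name: alpha * param + (1.0 - alpha) * state_b[name]
    for name, param in model_a.state_dict().items()
}

# The merged model is the same size as either parent:
# combined capabilities without adding bulk.
model_a.load_state_dict(merged_state)
model_a.save_pretrained("./merged-model")
```

More sophisticated methods (SLERP, TIES, Evolutionary Model Merging) refine how the parameters are blended, but all share this property: the result is one model, no larger than its parents.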
Published: July 19, 2024 · Duration: 00:00:57