Meta, NVIDIA & Hugging Face on Open-Source LLMs, Fine-Tuning & the Future of AI

AI is evolving fast. Are you keeping up?

During Smallcon, I sat down with amazing guests:
🔹 Diego Guerra Orozco (Meta) – Discussing why open-source AI is the future
🔹 Loubna Ben Allal (Hugging Face) – Breaking down the rise of small models & on-device AI
🔹 Pavlo Molchanov (NVIDIA) – Exploring efficient architectures & the future of AI reasoning

We talk about:
✅ The $200 billion AI race – and why Meta, Microsoft, and Amazon are betting big
✅ Why small AI models are taking over (and when they outperform GPT-4)
✅ The biggest mistake AI teams make when choosing models
✅ The future of fine-tuning – why we’re moving beyond prompt engineering

🔔 Don’t miss this deep dive into the next frontier of AI 👉 https://www.youtube.com/@DevIntheDetails

👉 Subscribe to my newsletter for AI insights and notes on building an AI startup: https://devinthedetail.substack.com/

👥 Follow me on LinkedIn: https://www.linkedin.com/in/devvret-rishi-b0857684/
🚀 Check out what I’m building at Predibase: https://predibase.com/

#opensourceai #finetuning #llm #aiarchitecture #AIResearch #metaai #huggingface #nvidia #llama3 #aifordevelopers #aicommunity #techtalk #aiinnovation #futureofai

00:00 – Intro: The Future of AI Panel
01:10 – Why Meta is Betting on Open-Source AI
04:50 – What Defines a "Small Model" in AI?
08:15 – Why Small Models Are Exploding in Popularity
12:00 – NVIDIA’s Take: The Future of Efficient AI Architectures
15:40 – Should You Use a Small Model or a Large Model?
19:30 – Why Open-Source AI Will Eventually Beat Closed-Source AI
23:50 – How Agentic Workflows Are Changing AI
27:30 – 2025 AI Predictions: What’s Next for LLMs?
34:00 – Closing Thoughts & Takeaways

Video: Meta, NVIDIA & Hugging Face on Open-Source LLMs, Fine-Tuning & the Future of AI, from the channel Dev In the Details