
Run 45,000 Hugging Face Models Locally With Ollama

If you use Ollama, you can run thousands of models that aren't on Ollama's official list — they're on Hugging Face, and one command pulls them straight in.

Syntax: ollama run hf.co/{username}/{repository}
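A concrete example of that syntax, assuming you have Ollama installed. The repository name below (bartowski/Llama-3.2-1B-Instruct-GGUF) is just one public GGUF repo used for illustration; substitute any GGUF repository you want:

```shell
# Build the model reference in the hf.co/{username}/{repository} form
MODEL="hf.co/bartowski/Llama-3.2-1B-Instruct-GGUF"
echo "ollama run $MODEL"

# You can also pin a specific quantization by appending a tag,
# e.g. :Q8_0 — the tag must match a quant file in the repo
echo "ollama run $MODEL:Q8_0"
```

Running the printed commands pulls the GGUF weights straight from Hugging Face, with no need for the model to appear in Ollama's own library.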

▶ LINKS

▸ Use Ollama with any GGUF model on Hugging Face — official docs:
https://huggingface.co/docs/hub/ollama

▸ Local LLM playlist on this channel:
https://www.youtube.com/playlist?list=PLQP5dDPLts67psUzW096fJdMNHJB16Sob

▸ Ollama MLX blog post:
https://ollama.com/blog/mlx

▸ 1:1 help with your local AI setup:
https://cloudyeti.io/meet

▶ RELATED VIDEOS

- Ollama MLX 2x Faster on Mac: https://youtu.be/HDlMRaJq8FE
- Qwen 3.6 + Ollama (Local Claude Code Alternative): https://youtu.be/VjCPqmESUCg

▶ CHAPTERS

0:00 Thousands of models you didn't know about
0:16 Right model for the right task
1:05 Open-source isn't automatically uncensored
1:26 The command (live demo)
2:05 Roasting my own culture (Claude refuses)
2:55 Uncensored model answers
3:19 Why this trade-off matters
3:45 Who to trust on Hugging Face
4:07 What doesn't always work
4:31 Wrap up

#ollama #localai #huggingface #localllm #cloudyeti

Video "Run 45,000 Hugging Face Models Locally With Ollama" from the channel CloudYeti | Local AI & AI ROI