How to run Local AI Models, in 5 minutes, using LM Studio
You don't always have to pay for the OpenAI or Anthropic APIs.
In this video, I'll show you how to set up and run Open Source LLMs on your machine with LM Studio.
It is completely free and private too!
📥 Download LM Studio: lmstudio.ai
I downloaded Llama 3 8B and have experimented with LLaVA.
Browse more models on Hugging Face: huggingface.co
Alternatives:
* Ollama: ollama.com
* Llama.cpp: github.com/ggerganov/llama.cpp
💡 Comment below with what you plan to build with these models.
👍 If you found this video helpful, please like, subscribe, and share it with your friends or colleagues.
I have more videos coming, including one on using your own data with local models (RAG and other techniques).
🖥️ Make sure your machine meets the minimum system requirements.
I'm running on a MacBook Air M1 with 16GB RAM.
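A rough way to check whether a model will fit in your RAM is parameter count × bits per weight (quantized GGUF models, the format LM Studio downloads, commonly use around 4-5 bits per weight). A minimal sketch, assuming ~4.5 bits/weight and counting weights only:

```python
def approx_model_ram_gb(params_billion, bits_per_weight=4.5):
    """Rough RAM estimate in GB for a quantized model's weights.

    Counts weights only -- the KV cache and runtime overhead add more,
    so leave a few GB of headroom beyond this number.
    """
    return params_billion * bits_per_weight / 8

# An 8B model at ~4.5 bits/weight needs roughly 4.5 GB for weights alone,
# which fits comfortably on a 16GB machine.
print(approx_model_ram_gb(8))
```

This is only a ballpark; LM Studio also shows a fit-in-memory indicator next to each downloadable model file.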
We can also chat on:
Threads: www.threads.net/@diogosnows
X: x.com/DiogoSnows
🔴 Watch me live on Twitch: / diogosnows
🕒 Timestamps:
0:00 - Introduction
1:14 - Downloading LM Studio
1:42 - Alternatives
1:56 - Choosing Model
2:44 - Chat with Local AI
4:34 - More on LM Studio
5:04 - Running AI Model Server
6:25 - Calling from Python
7:39 - Thoughts and see you soon
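The "Calling from Python" chapter can be sketched as follows. LM Studio's local server exposes an OpenAI-compatible API; this example assumes the default address http://localhost:1234/v1 (check the Local Server tab in LM Studio for yours) and uses only the standard library:

```python
# Sketch: calling LM Studio's local server (OpenAI-compatible API).
# Assumes the server is running on the default port 1234.
import json
import urllib.request

def build_chat_request(prompt, base_url="http://localhost:1234/v1"):
    """Build an HTTP request for a chat completion against the local server."""
    payload = {
        "messages": [{"role": "user", "content": prompt}],
        "temperature": 0.7,
    }
    return urllib.request.Request(
        f"{base_url}/chat/completions",
        data=json.dumps(payload).encode("utf-8"),
        headers={"Content-Type": "application/json"},
    )

# Send the request (only works while the LM Studio server is running):
# with urllib.request.urlopen(build_chat_request("Hello!")) as resp:
#     reply = json.loads(resp.read())
#     print(reply["choices"][0]["message"]["content"])
```

Because the endpoint mirrors OpenAI's chat completions API, you can also point the official `openai` Python client at the local base URL instead of hand-rolling requests.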