Free and Local AI in Home Assistant using Ollama

23,273 views

KPeyanski

3 months ago

► MY HOME ASSISTANT INSTALLATION METHODS FREE WEBINAR - automatelike.pro/webinar
► DOWNLOAD MY FREE SMART HOME GLOSSARY - automatelike.pro/glossary
► MY RECORDING GEAR
MAIN CAMERA: amzn.to/3Ln8qzb
MAIN & 2ND ANGLE LENS: amzn.to/48bhxMZ
2ND ANGLE CAMERA: amzn.to/44RjRWs
SD CARDS: amzn.to/3sT7fRy & amzn.to/3sS0wHu
MICROPHONE: amzn.to/466Kxne
BACKUP MIC: amzn.to/468BSkb
EDITING MACHINE: amzn.to/45LWdvS
► SUPPORT MY WORK
Paypal - www.paypal.me/kpeyanski
Patreon - / kpeyanski
Bitcoin - 1GnUtPEXaeCUVWdJxCfDaKkvcwf247akva
Revolut - revolut.me/kiriltk3x
Join this channel to get access to perks - / @kpeyanski
✅ Don't Forget to like 👍 comment ✍ and subscribe to my channel!
► MY ARTICLE ABOUT THAT TOPIC - peyanski.com/home-assistant-o...
► DISCLAIMER
Some of the links above are affiliate links. If you click on these links and purchase an item, I will earn a small commission at no additional cost to you. Of course, you don't have to use them if you don't want to support my work!

Comments: 71
@KPeyanski 3 months ago
Are you going to try this Home Assistant Ollama Integration? And if yes, on what kind of device are you going to install the Ollama software?
@bugsub 2 months ago
Wow! Fantastic tutorial! Really appreciate your channel!
@KPeyanski 2 months ago
Glad it was helpful and thanks for the kind words!
@joeking5211 2 months ago
Looks like a fantastic vid. Will keep an eye open for the Windows tutorial and come back then.
@KPeyanski 2 months ago
It is almost the same for Windows. You just have to install the Ollama Windows version, and everything else is the same.
@FrankGraffagnino 2 months ago
I _REALLY_ appreciate a tutorial that shows how to do this with a local LLM... very cool. Thanks!
@KPeyanski 2 months ago
You're very welcome! Are you going to try it and on what device?
@FrankGraffagnino 2 months ago
@@KPeyanski probably not yet. But I just love when consumers can be better educated about local control. Thanks!
@KPeyanski 2 months ago
Yes, I also prefer local. Unfortunately it is not always an option.
@AlonsoVPR 3 months ago
I was waiting for someone to make a video about this! thank you sir!!
@KPeyanski 3 months ago
Glad it was helpful! On what kind of device are you going to install the Ollama software?
@AlonsoVPR 3 months ago
@@KPeyanski I don't have enough horsepower for this at the moment; I'm focused on low power consumption right now. But I'm thinking of getting a Proxmox server with a dedicated GPU. At the moment my whole house runs on a 2012 i5 Mac mini with 8 GB of RAM, also using Proxmox.
@KPeyanski 3 months ago
I understand, low power consumption is important, but an i5 is not that bad and you can try Ollama on it. If it is not OK, just delete/uninstall it!
@AlonsoVPR 3 months ago
@@KPeyanski Maybe when I get a better server with more RAM :P Sadly my old Mac mini has 8 GB of RAM soldered to the motherboard, and all my services are using about 72% of it at the moment :P Now I'm struggling to find a good Zigbee mmWave sensor that doesn't spam the network :/ Any recommendations? I have tried the TUYA-M100 and the MTG275-ZB-RL. Although the MTG275-ZB-RL is way better than the TUYA, it's still spamming my Zigbee network several times per second.
@ecotts 2 months ago
I'm waiting for someone to make a video about all the data that META stole from your system as a result of the installation and then sold on to some random companies.
@RocketBoom1966 2 months ago
Thank you, excellent content as usual. I have set up Ollama running in a Docker container on my Unraid server. The server has a low-power Nvidia GPU which I use to speed up responses. Another fun thing to try is to modify the end of the prompt template with something like this: "Answer the user's questions using the information about this smart home. Keep your answers brief and do not apologize. Speak in the style of Captain Picard from Star Trek." Yes, my assistant will respond with answers in the style of Captain Picard.
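Customizations like the one above end up as the system prompt sent to the model. As a minimal sketch of what that looks like on the wire, here is how a request to Ollama's documented /api/chat REST endpoint could be assembled; the model name, base prompt, and question are illustrative placeholders, not taken from the video:

```python
import json

# The commenter's style suffix, appended to a (hypothetical) base prompt.
STYLE_SUFFIX = (
    "Answer the user's questions using the information about this smart home. "
    "Keep your answers brief and do not apologize. "
    "Speak in the style of Captain Picard from Star Trek."
)

def build_chat_payload(model: str, base_prompt: str, user_text: str) -> dict:
    """Assemble the JSON body for a POST to http://<ollama-host>:11434/api/chat."""
    return {
        "model": model,
        "stream": False,  # ask for one complete reply instead of streamed chunks
        "messages": [
            {"role": "system", "content": f"{base_prompt} {STYLE_SUFFIX}"},
            {"role": "user", "content": user_text},
        ],
    }

payload = build_chat_payload(
    "llama2", "You are a smart home assistant.", "Is the kitchen light on?"
)
print(json.dumps(payload, indent=2))
```

The same payload shape works for any model Ollama serves; only the "model" field changes.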
@KPeyanski 2 months ago
Oh, that is very interesting, thanks for the info. But how do you make the HA Ollama integration answer with voice?
@RocketBoom1966 2 months ago
@@KPeyanski I have seen it done, however I have struggled to make it work. My modified prompt template only responds in text form as you explained in your video. Things are moving so fast with these AI integrations, I imagine it won't be long until Home Assistant includes powerful AI tools by default. Exciting times.
@KPeyanski 2 months ago
exciting times indeed :)
@EvgenMo1111 15 days ago
hi, what size is your LLM?
@BrettVilnis 3 months ago
Thanks, excellent video.
@KPeyanski 2 months ago
Glad you enjoyed it! Are you going to try it?
@BrettVilnis 2 months ago
@@KPeyanski When voice is working
@KPeyanski 2 months ago
no idea, hopefully soon
@SmartTechArabic 19 days ago
Thanks for the informative tutorial. I have set up an Ollama server on a separate machine, and the local LLM is working well through the open web UI. I set up the Ollama integration in Home Assistant and configured a Home Assistant Assist pipeline to use Ollama. But unfortunately, whenever I ask a question, I am not getting any response. What am I missing?
@KPeyanski 16 days ago
Try debugging your pipeline and check what is going on...
@fdb-you 2 months ago
So for Ollama I need a second device that is always on? Is it possible to install it directly on a Home Assistant server?
@KPeyanski 2 months ago
No, with this integration this is not possible. At least for now...
@miguelcid1965 2 months ago
With Ollama, is it able to turn on lights or entities in general? I read on the Hassio integration page that with the Ollama integration it isn't possible, but maybe that was before? Thanks.
@marcomow 1 month ago
now it's possible, upgrade HA to 2024.06!
@PauloAbreu 2 months ago
Great tutorial! Thanks. Is English the only language available?
@KPeyanski 2 months ago
not sure about that, but I think yes!
@Palleri 2 months ago
Could you share the prompt template you are using?
@keviincosmos 2 months ago
Would like that too 💪
@michaelthompson657 3 months ago
I'm assuming that since it can be installed on Linux, you could have this on a separate Pi running Raspberry Pi OS Lite and connect it to your other Pi running HA? I have HA on a Pi 4 and a spare Pi 3; just wondering if the Pi 3 would be powerful enough to run Ollama?
@KPeyanski 3 months ago
This is interesting indeed, but I guess you have to try it out. It will be best if you share the result!
@michaelthompson657 3 months ago
@@KPeyanski Do you think I could install it on Raspberry Pi OS Lite? I'm very inexperienced with Pi OS.
@KPeyanski 2 months ago
I don't know, you can try...
@michaelthompson657 2 months ago
@@KPeyanski I’m not that good 🤣
@fred7flinstone 1 month ago
I am getting "Unexpected error during intent recognition".
@danninoash 2 months ago
Hi, great video first of all, THANKS!! What's missing for me is the BT proxy... How do I configure it? Is it a must? Why isn't this part mentioned in the video? :(
@KPeyanski 2 months ago
A BT proxy is not needed at all here. The communication between Home Assistant and Ollama is over the IP network, so just follow the steps from the video and you will have it; nothing additional is needed.
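Since HA talks to Ollama over plain HTTP, all the integration needs is the server's URL. A minimal sketch of building and probing that URL (the LAN IP below is a hypothetical example; 11434 is Ollama's default port, and a bare GET on the base URL answers with HTTP 200 when the server is up):

```python
import urllib.request

OLLAMA_HOST = "192.168.1.50"  # hypothetical LAN IP of the machine running Ollama
OLLAMA_PORT = 11434           # Ollama's default listening port

def ollama_base_url(host: str, port: int = 11434) -> str:
    """Build the base URL that the Home Assistant Ollama integration asks for."""
    return f"http://{host}:{port}"

def is_ollama_reachable(base_url: str, timeout: float = 3.0) -> bool:
    """Probe the server with a plain GET; any connection failure means not reachable."""
    try:
        with urllib.request.urlopen(base_url, timeout=timeout) as resp:
            return resp.status == 200
    except OSError:
        return False

if __name__ == "__main__":
    url = ollama_base_url(OLLAMA_HOST, OLLAMA_PORT)
    print(url, "reachable:", is_ollama_reachable(url))
```

If the probe fails from the HA machine, the usual suspects are a firewall or Ollama listening only on localhost.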
@danninoash 2 months ago
@@KPeyanski SORRY!! I confused my question with another video of yours - the one about setting up an Apple Watch as a device in HA LOL :))
@danninoash 2 months ago
@@KPeyanski What I actually wanted to ask here is - will I have to have a machine that is turned on 24/7 (whether it's Windows/Linux/macOS)? I didn't fully understand what I should do with it after I connect my HA with the Ollama integration. Question #2 please - does it somehow interfere with my Alexa, or does it work alongside it? THANKS!!
@danninoash 2 months ago
???
@jacquesdupontd 2 months ago
Thanks for the very good video. I know that you can now make a pretty good integration of GPT in HA and have trigger and speech exchanges. I imagine it's gonna be even easier (and creepier at the same time) with GPT-4o. I'm sure we'll be able to control devices and have speech and triggers for Ollama soon. I subscribed to your channel.
@KPeyanski 2 months ago
Thanks for subscribing! Yes, integrating GPT into Home Assistant is becoming increasingly seamless, and GPT-4 will likely make it even more intuitive and powerful. It's exciting (and a bit creepy) to think about how advanced and interactive our smart homes can become soon. Stay tuned for more updates!
@jacquesdupontd 2 months ago
@@KPeyanski I'm doing the research to build some kind of Amazon Echo with a local LLM, maybe with a screen. A bit like the ESP32-S3-BOX but better. Not for commercialization for now (I'm sure there are tons of projects like that being developed). I'm still not sure what device to use to handle the local LLM. A GPU is a huge plus but takes too much space. The best would be a Mac Mini M1; Ollama LLMs work wonders on it. I have to check how well Asahi Linux works and if I can pack everything into it (personal home server, Home Assistant, Ollama, voice assistant).
@jacquesdupontd 26 days ago
Little update. I now have a few ESP32s (KORVO, S3, Atom Echo) and I've been playing a bit (you can check my latest videos to see my little setup). For now I'm only using external AI, because Ollama is not able to control our devices yet and it is still quite slow compared to Google or GPT. It's working great. My next project is to take a Bluetooth speaker and hack it with an ESP32-S3 to turn it into a voice assistant device like a Google Nest or Amazon Echo Dot.
@sirmax91 2 months ago
Can you make it run on a Raspberry Pi 5 and link it to Home Assistant?
@KPeyanski 2 months ago
I think yes, but I guess you have to try it.
@markrgriffin 2 months ago
Probably a dumb question, but how do I expose Ollama on my network if I install it on Windows? The instructions are not very specific.
@KPeyanski 2 months ago
Follow the instructions from the Ollama documentation and set the Ollama address in your OLLAMA_HOST variable. These are the steps (on Windows, Ollama inherits your user and system environment variables):
1. First, quit Ollama by clicking on it in the task bar.
2. Edit system environment variables from the Control Panel.
3. Edit or create new variable(s) for your user account for OLLAMA_HOST, OLLAMA_MODELS, etc.
4. Click OK/Apply to save.
5. Run ollama from a new terminal window.
@markrgriffin 2 months ago
@KPeyanski Thanks for the reply. So just add the two variable names? With no values? That's where I'm stuck, unfortunately. Do I not need to add a path for OLLAMA_MODELS and an IP for the host as values?
@MichaelDomer 2 months ago
Get rid of that Llama 2; version 3, which was just released, completely destroys it.
@KPeyanski 2 months ago
sounds good, are you using it already? And for what exactly?
@hpsfresh 1 month ago
This video needs chapter time codes.
@KPeyanski 28 days ago
sorry, I'm too lazy for that right now and there is no one willing to help either...
@OrlandoPaco 3 months ago
Add voice!
@KPeyanski 3 months ago
Yes, voice is needed here... Maybe in the next release!
@KubedPixel 2 months ago
Under NO CIRCUMSTANCES is anything facebook related going ANYWHERE near my network, offline/local or not.
@KPeyanski 2 months ago
No problem, you can select another model that has nothing in common with Meta & Facebook.
@andrewtfluck 2 months ago
Ollama, the tool, is separate from Facebook/Meta. You can run Llama on it, but you have a variety of other LLMs to choose from.
@KubedPixel 2 months ago
@@andrewtfluck WhatsApp WAS a separate tool to Facebook... not any more. Ollama was developed by Meta (Facebook) and I'm 99% sure there are 'call home' beacons in the code somewhere. Also, just out of principle, I will not use anything Facebook related.
@Busy_Paws 1 month ago
Paranoia
@ecotts 2 months ago
I will never in my life add anything META related intentionally on any of my systems. Hell No!! 😂
@rude_people_die_young 2 months ago
Shouldn’t be hard to do function calling hey
@KPeyanski 2 months ago
You mean the voice function, or something else?
@rude_people_die_young 2 months ago
@@KPeyanski I mean where the LLM emits valid JSON that can be used in commands or API calls. It’s a confusing AI term.
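The function-calling idea described above can be sketched in a few lines: prompt the model to reply with pure JSON describing a service call, then parse and validate it before acting. The service names follow Home Assistant's real `domain.service` convention, but the allow-list, entity IDs, and the sample model output below are hypothetical:

```python
import json

# Hypothetical raw reply from the LLM, which was prompted to emit only JSON.
RAW_LLM_OUTPUT = '{"service": "light.turn_on", "entity_id": "light.kitchen", "brightness": 128}'

# Only ever execute calls from a fixed allow-list; never trust the model blindly.
ALLOWED_SERVICES = {"light.turn_on", "light.turn_off", "switch.turn_on", "switch.turn_off"}

def parse_tool_call(raw: str) -> dict:
    """Parse the model's JSON reply and reject anything outside the allow-list."""
    call = json.loads(raw)  # raises ValueError if the model emitted invalid JSON
    if call.get("service") not in ALLOWED_SERVICES:
        raise ValueError(f"service not allowed: {call.get('service')}")
    if not call.get("entity_id"):
        raise ValueError("missing entity_id")
    return call

call = parse_tool_call(RAW_LLM_OUTPUT)
print(call["service"], "->", call["entity_id"])
```

The validated dict could then be handed to an API client; the allow-list is what keeps a hallucinated or malicious reply from triggering arbitrary actions.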