Level Up Your Typescript Skills: Adding Ollama To Your Apps!

  24,676 views

Matt Williams

4 months ago

The JavaScript library for Ollama makes it so much easier to build cool applications with AI. This video will get you up to speed on everything you need to know.
Find the code here: github.com/tec... then go to videoprojects and find the folder for this video.
Be sure to sign up for my monthly newsletter at technovangelis...
And if you're interested in supporting me, sign up for my Patreon at / technovangelist

Comments: 44
@userou-ig1ze 4 months ago
This channel is top of my list, lol. I'm addicted to the casual-seeming yet perfect presentations (besides the interesting content and the impressive successive iterations of code...). The flow is clear, concise, and intuitive, and it's so calm that I find myself not always using 2x speed.
@technovangelist 4 months ago
Thanks so much. Nice to know it’s resonating with folks
@brogeby 3 months ago
This is exactly the channel I was searching for. Your presentations are very well made, with great examples, and you talk about very relevant topics. Thanks!
@DevynDowler-Lewis 4 months ago
This is remarkably well presented
@technovangelist 4 months ago
Thanks so much
@yoemasey 4 months ago
Thank you for explaining the details so carefully and making them easy to understand. Can't wait to try out Ollama :)
@lucioussmoothy 4 months ago
Nice overview. I'd love to see how we can use JS to train and fine-tune an Ollama model (if you haven't posted one yet).
@moonchildeverlasting9904 4 months ago
This is great, Matt. You probably forgot that someone made a demo showing Mario Kart in JS. I think you get a bit further with knowledge than with AI, but we all think differently, right?
@photorealm 4 months ago
Best video I have seen on this subject by far. Great usable information.
@technovangelist 4 months ago
Glad it was helpful!
@jacknimble8331 4 months ago
Thank you very much for these helpful videos! If you have the bandwidth, could you go over incorporating embeddings for RAG in this app?
@technovangelist 4 months ago
I'll have that in a video coming next week.
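Until that video arrives, here is a rough sketch of the embedding half of RAG against Ollama's `/api/embeddings` endpoint. The `nomic-embed-text` model name and the helper names are my own assumptions; the cosine-similarity function is plain math, and the live call is guarded behind an env var since it needs a local Ollama server:

```typescript
// Cosine similarity between two embedding vectors (pure math helper).
function cosineSimilarity(a: number[], b: number[]): number {
  let dot = 0, normA = 0, normB = 0;
  for (let i = 0; i < a.length; i++) {
    dot += a[i] * b[i];
    normA += a[i] * a[i];
    normB += b[i] * b[i];
  }
  return dot / (Math.sqrt(normA) * Math.sqrt(normB));
}

// Get an embedding from a local Ollama server (assumes it is running on
// the default port with an embedding model already pulled).
async function embed(text: string): Promise<number[]> {
  const res = await fetch("http://localhost:11434/api/embeddings", {
    method: "POST",
    body: JSON.stringify({ model: "nomic-embed-text", prompt: text }),
  });
  const json = await res.json();
  return json.embedding;
}

// Demo: rank documents by similarity to a query. Guarded so this file
// can be loaded without a server present.
async function demo() {
  const docs = ["Ollama runs models locally", "The sky is blue"];
  const queryVec = await embed("local LLM runtime");
  for (const doc of docs) {
    console.log(doc, cosineSimilarity(queryVec, await embed(doc)));
  }
}
if (process.env.OLLAMA_DEMO) demo().catch(console.error);
```

Real RAG would also chunk documents and store the vectors, but ranking by cosine similarity is the core of the retrieval step.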
@DC-xt1ry 4 months ago
Thank you for sharing, Matt! I did not realize that JS was an option :-). Any plans to add a video showing Ollama + LangChain?
@technovangelist 4 months ago
Yes. But for most projects it doesn’t add anything. I have to think of a good use case to leverage it.
@adityasingh017 1 month ago
Hey, does anybody know how I could use this library in my BROWSER?
@technovangelist 1 month ago
What do you want to do? What have you tried? Best to ask on the Discord.
@LucianoTonet 4 months ago
I'm learning so much from this channel. @technovangelist it would be better if you added some syntax highlighting to your code, I think. Thanks!
@MuhammadAzhar-eq3fi 4 months ago
Thanks a lot, Sir. ❤
@CookerSingh 4 months ago
Which models currently support function calling, and is there a function calling proxy available for any of these models, or does Ollama provide one?
@technovangelist 4 months ago
All of them. Well, except the embedding models. But it's not a feature the model itself needs to support.
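As the reply notes, this style of function calling lives in the application rather than the model. One common pattern is to request JSON output (Ollama's API accepts a `format: "json"` option) and parse a tool call out of the reply. A rough sketch; the `ToolCall` shape, the prompt wording, and the `llama2` model name are my own placeholders, not an Ollama-defined schema:

```typescript
// Shape of a tool call we ask the model to emit (our own convention).
interface ToolCall {
  tool: string;
  args: Record<string, unknown>;
}

// Parse the model's JSON reply into a tool call; returns null when the
// reply is not the shape we asked for.
function parseToolCall(text: string): ToolCall | null {
  try {
    const parsed = JSON.parse(text);
    if (
      typeof parsed.tool === "string" &&
      typeof parsed.args === "object" &&
      parsed.args !== null
    ) {
      return { tool: parsed.tool, args: parsed.args };
    }
    return null;
  } catch {
    return null;
  }
}

// Ask a model for a tool call, using Ollama's JSON mode to constrain
// the output to valid JSON. Needs a local Ollama server.
async function requestToolCall(prompt: string): Promise<ToolCall | null> {
  const res = await fetch("http://localhost:11434/api/generate", {
    method: "POST",
    body: JSON.stringify({
      model: "llama2",
      format: "json",
      stream: false,
      prompt: `Reply ONLY with JSON like {"tool": "...", "args": {...}}. ${prompt}`,
    }),
  });
  const json = await res.json();
  return parseToolCall(json.response);
}
if (process.env.OLLAMA_DEMO) {
  requestToolCall("What's the weather in Paris?").then(console.log);
}
```

Because the contract is enforced by the prompt and the parser, any capable general model can participate; that's why it isn't a per-model feature.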
@grigorikochanov3244 4 months ago
Sorry, at 7:00 you say that in the chat endpoint the parameters are replaced with messages, but you show a code sample with the /api/generate endpoint. The chat endpoint for Ollama is /api/chat. Is it a typo?
@technovangelist 4 months ago
Doh. Good catch. Yes, a typo. It's right in the GitHub repo.
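For anyone following along: `/api/generate` takes a single `prompt` string, while `/api/chat` takes a `messages` array of role-tagged turns. A minimal sketch of the two request bodies; the helper names and the `llama2` model are placeholders, and the live call is guarded since it needs a local Ollama server:

```typescript
type Message = { role: "system" | "user" | "assistant"; content: string };

// Body for /api/generate: a single prompt string.
function generateBody(model: string, prompt: string) {
  return { model, prompt, stream: false };
}

// Body for /api/chat: an array of role-tagged messages, which lets the
// server apply the model's chat template and carry conversation history.
function chatBody(model: string, messages: Message[]) {
  return { model, messages, stream: false };
}

// Guarded demo call against a local Ollama server.
async function demo() {
  const res = await fetch("http://localhost:11434/api/chat", {
    method: "POST",
    body: JSON.stringify(
      chatBody("llama2", [{ role: "user", content: "Why is the sky blue?" }])
    ),
  });
  const json = await res.json();
  // /api/chat replies with a message object; /api/generate uses "response".
  console.log(json.message.content);
}
if (process.env.OLLAMA_DEMO) demo().catch(console.error);
```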
@Pdizzle-ic5sk 4 months ago
I downloaded the web UI Docker container (a ChatGPT-like interface) about 3 months ago. Everything worked great; downloading models was super convenient and easy. Then I tried to download the latest image on a new machine and the interface got a lot more complex; I don't even know how to download a model anymore. Any chance you can make a video on the web UI Docker container? Easy docker compose startup, downloading the different models, and prompting them? This tutorial was great! Thanks.
@technovangelist 4 months ago
Personally I’m not a big fan of most of the web interfaces I have seen. But I should try again.
@abdulrahimannaufal5678 4 months ago
Ollama doesn't stream the output; instead it prints everything at once when I try it with ngrok over HTTPS. Is it the same with Tailscale?
@technovangelist 4 months ago
I haven't used ngrok before, so I don't know how to configure it correctly; you should work on fixing that configuration. Tailscale definitely doesn't mess with the output of applications.
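For context on that symptom: Ollama streams its responses as newline-delimited JSON objects, so a proxy that buffers the connection delivers all the lines in one flush, but they still parse the same way. A sketch of consuming that stream; the `llama2` model name and helper name are placeholders, and the live call is guarded since it needs a local server:

```typescript
// Ollama streams newline-delimited JSON, each object carrying a partial
// "response" field. A buffering proxy (e.g. a misconfigured tunnel) just
// hands you many lines in one chunk; the parsing is identical.
function joinStreamedResponse(ndjson: string): string {
  return ndjson
    .split("\n")
    .filter((line) => line.trim().length > 0)
    .map((line) => JSON.parse(line))
    .map((obj) => obj.response ?? "")
    .join("");
}

// Guarded live demo: print tokens as they arrive from a local server.
// Note: a network chunk can end mid-line; a robust reader would buffer
// partial lines before parsing. This is a sketch.
async function streamDemo() {
  const res = await fetch("http://localhost:11434/api/generate", {
    method: "POST",
    body: JSON.stringify({ model: "llama2", prompt: "Hi", stream: true }),
  });
  const reader = res.body!.getReader();
  const decoder = new TextDecoder();
  for (;;) {
    const { done, value } = await reader.read();
    if (done) break;
    process.stdout.write(joinStreamedResponse(decoder.decode(value)));
  }
}
if (process.env.OLLAMA_DEMO) streamDemo().catch(console.error);
```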
@mshonle 4 months ago
This is random and only tangentially related, but: do you know how to do that thing where your browser sees the localhost as usernames dot machine and there’s also some local CA cert configuration so you can test using https? (Random, yes, but I comment because I care.)
@technovangelist 4 months ago
Not sure, but I think you are talking about wildcard certs. You can do this with certs from Let's Encrypt using Traefik. TechnoTim did a video about this a couple of years ago, I think, as did Christian L-something; last name starts with L. Lempa? There are other alternatives with nginx instead of Traefik, I think.
@joshuandala7669 2 months ago
The link isn't working, but I was able to find it. Hopefully you see this comment so the error can be fixed.
@technovangelist 2 months ago
Interesting. I think I fixed it, though.
@AINMEisONE 4 months ago
Hi Matt, here are some questions: 1. I have Ollama running but it will not see my eGPU, an RTX 4090. I use LM Studio but it's always full of bugs and creates strange replies or does not even load the model; when it does, it sees the CUDA GPU, uses 100% of it, and wow, it runs the 70b model perfectly. 2. I want to be able to move the Ollama models I downloaded on my Intel MacBook Pro so I do not need to re-download them again and again. Can you make a video for this? You know that most apps now are not on Intel unless you do Boot Camp Windows; then it runs, and the best thing is the RTX 4090 eGPU works incredibly! Help, please.
@technovangelist 4 months ago
I don't have a good answer for you. I don't think the team has put any effort into supporting Intel Macs; when we were first building it, Apple silicon is what inspired us to create Ollama. Supporting eGPUs may be a bit further out. If you load up Linux on there, it may have a chance.
@AINMEisONE 4 months ago
@technovangelist I have an M2 Max with 96GB of RAM; the Intel machine with the RTX 4090 is faster, by a lot, with LM Studio for now. But there are too many bugs in their stuff. It has gotten a lot better, but it's still not as good as Ollama; big models (34b and up) do not work at all, they get stuck or give very low-level responses. I will maybe send some other questions in the meantime to fix some things.
@AINMEisONE 4 months ago
@technovangelist OK, which Linux version should I install? I tried Ubuntu 20.04/22.04 and it said it did not support Ollama. I used the command from the Ollama site and it never worked. Thanks!
@technovangelist 4 months ago
Asking in the discord will have better results. I know nothing about that.
@truehighs7845 4 months ago
Great video as usual, but did you say we can run inference with fine-tuned models from Ollama?
@technovangelist 4 months ago
Yes, absolutely. Ollama doesn’t do the fine tuning but it can use the adapters.
@truehighs7845 4 months ago
@technovangelist How do you load them, with 'custom models'?
@technovangelist 4 months ago
The docs show how: create a Modelfile and use the ADAPTER instruction.
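A rough sketch of that flow, assuming a local Ollama server: `FROM` and `ADAPTER` are real Modelfile instructions, while the model name and adapter path below are placeholders. This version registers the model programmatically via `/api/create` with an inline Modelfile string (the CLI equivalent would be `ollama create`):

```typescript
// Build a minimal Modelfile that layers a fine-tuned adapter on top of
// a base model. The paths and names here are placeholders.
function buildModelfile(baseModel: string, adapterPath: string): string {
  return `FROM ${baseModel}\nADAPTER ${adapterPath}\n`;
}

// Register the composed model with a local Ollama server.
async function createModel(name: string, modelfile: string): Promise<void> {
  await fetch("http://localhost:11434/api/create", {
    method: "POST",
    body: JSON.stringify({ name, modelfile }),
  });
}

// Guarded demo: needs a running server and a real adapter file on disk.
if (process.env.OLLAMA_DEMO) {
  createModel(
    "my-tuned-model",
    buildModelfile("llama2", "./lora-adapter.bin")
  ).catch(console.error);
}
```

After creation, the model is used like any other: `ollama run my-tuned-model` or the same chat/generate endpoints with that model name.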
@truehighs7845 4 months ago
@technovangelist OK, thanks!
@florentflote 4 months ago
@DoNotKillThePresiden 4 months ago
Thanks Matt 😁
@technovangelist 4 months ago
You bet!
@RickySupriyadi 4 months ago
Matt Williams' channel just got EXP! 🆙 💫 +100 EXP 💫 +100 EXP 💫 +100 EXP 💫 +100 EXP 💫 +100 EXP LVL 2 (100/1000) EXP. Matt Williams unlocked titles: 👐 **The Caring One** Lv. 1 (cares for all types of audiences, taking care of newbies) 🧑‍🏫 **The Guru** Lv. 1 (makes complex topics easy to understand). My likes for this channel have skyrocketed!