Is Twinny an Even Better Local Copilot

  21,251 views

Matt Williams

6 months ago

I was pretty enamored with Llama Coder and Continue, but Twinny may be even better, as it combines the best of both... maybe it's even the best local Copilot....
Be sure to sign up to my monthly newsletter at technovangelis...
And if interested in supporting me, sign up for my patreon at / technovangelist

Comments: 62
@richardmacarthy8569 6 months ago
Hey Matt, Richard here, the author of twinny. Thank you ever so much for the mention and support; it means a lot. Your videos are great, and I love that we have a personal connection in Gower, Wales. It's awesome. The name twinny is a play on twins, as in a brother / pair programmer. New features and fixes are coming soon. All the best!
@zscoder 6 months ago
Excited to try it, just checked out your plugin on VS Code 🙌
@AaronHiltonSPD 6 months ago
Wonderful work! Switching to twinny today.
@maulikmadhavi 4 months ago
Thanks for your contribution to the community.
@SuperWabo 3 months ago
And thank you for making Twinny! ❤ Exactly the balance I have been looking for. Twinny is probably what most people want - they just haven't heard of it or tried it. Works perfectly with llama3 in ollama locally. No mandatory logins/accounts. No paywalled "Pro level feature" and so on. It is 10/10. Please organize a team around your vision for this extension - a no-nonsense, locally or API-hosted AI code completion. Again - Thank you for your contribution 🙏
@conFIGuredAPK 3 months ago
Works great! But how do I connect it to my AI server? My AI is on another machine and I can't get it to connect after changing the settings to the IP and port.
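A hedged sketch for anyone hitting this, assuming the backend is Ollama: by default it binds only to 127.0.0.1, so a remote client can't reach it until the server listens on the network interface.

```sh
# On the machine running Ollama: bind to all interfaces instead of loopback only
OLLAMA_HOST=0.0.0.0 ollama serve

# From the client machine, verify the API is reachable before touching the
# extension's settings (192.168.1.50 is a placeholder for the server's LAN IP)
curl http://192.168.1.50:11434/api/tags
```

If the curl check fails, a firewall rule blocking port 11434 on the server is the next thing to look at.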
@tiredofeverythingnew 6 months ago
I just love the awkward silence at the end of each video; it makes me chuckle every time.
@envivomedia 6 months ago
Agreed! Very funny.
@Tarbard 6 months ago
I know nothing's going to happen at the end, but I watch it anyway for some reason.
@technovangelist 6 months ago
Sweet. Maybe I should stand there staring at the camera for 4 hours and just rake in the ad revenue.
@rhyscampbell4178 6 months ago
Haha it’s my favourite
@HistoryIsAbsurd 6 months ago
Infinite money glitch, you have found. @technovangelist
@piero957 6 months ago
Thank you Matt, it's really a pleasure to watch and learn from your videos. I'm from Italy and learned English at school more than 60 years ago; nonetheless, I really enjoy your very clear, understandable, and even relaxing explanations. Keep going, your format works perfectly even outside the US/UK ecosystem, IMO.
@technovangelist 6 months ago
Thanks so much. I used to drive through Italy on my way to Konstanz from Amsterdam. There are more direct routes, but I loved the drive. I also did a few cheap weekends in Rome and then two weeks in Florence and Cedri in Tuscany. Hoping to take our 5-year-old daughter to see Italy soon. My wife did a semester in Rome while working on her architecture master's.
@lancemarchetti8673 6 months ago
Thanks Matt for your research in the AI coders field. It's so helpful.
@mbottambotta 6 months ago
Another great video, thank you Matt! If I could suggest a topic, it would be how to use Ollama embeddings for RAG purposes 🙏
@ErikTaraldsen 6 months ago
One thing I like about Continue is the ability to set a "System Message" per model, or indeed different messages per model. The models seem to prefer Python, and if not given enough context, their response will be Python code. So I like to add whatever programming language I mostly use to the system message: "You are an expert programmer that writes simple, concise code and short explanations. You prefer Perl." Tips on good system messages, or how to avoid needing to set them, would make for a great video in my opinion. Other videos I'd like to see: config tweaks I don't know about.
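For reference, what Erik describes might look like this in Continue's config.json. This is a sketch, assuming the systemMessage field of a model entry as documented in Continue's config schema; check the docs for your version.

```json
{
  "models": [
    {
      "title": "CodeLlama via Ollama",
      "provider": "ollama",
      "model": "codellama:7b-instruct",
      "systemMessage": "You are an expert programmer that writes simple, concise code and short explanations. You prefer Perl."
    }
  ]
}
```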
@technovangelist 6 months ago
I think I saw the Twinny prompts say to respond in whatever language the current file is set to in VS Code.
@ErikTaraldsen 6 months ago
That would be smoother than my current system of having to remember to choose language manually. Guess I have to test out Twinny.
@simplir5644 6 months ago
Thanks Matt. Sure, sharing the search function is cool, and adding this feature to Ollama would be even cooler.
@gpp2000 1 month ago
Awesome stuff Matt. Curious how you have found the "generate tests" feature in Twinny now that it's been a few months? I'm mostly coding in Ruby/Rails, and I've been trying out different models to see if I can get one that does a bang-up job with RSpecs.
@vpd825 6 months ago
Hey, Matt. Love your videos! I hear twins calling each other "twinny" where I am... and I'm not in Wales, nor even on the same continent as Wales. Maybe the creator meant something similar to "my pair programmer"?
@technovangelist 6 months ago
Yup. Richard let me know it's meant to be your twin developer.
@HistoryIsAbsurd 6 months ago
Great vid! Thanks a lot. I swear I subbed to you last week... thanks YouTube!
@muhammadumarsotvoldiev8768 6 months ago
Thank you very much! Very useful for me!
@technovangelist 6 months ago
Glad to hear that!
@envivomedia 6 months ago
I admire your unique approach to utilizing the code editor, which differs significantly from mine. As a Product Owner for a Sitecore site that faces significant setup challenges, my options for visual components are quite restricted. I frequently rely on Copilot to generate HTML/CSS/JS snippets. In my role, I'm accustomed to drafting backlog items for developers, so I find myself interacting with the AI much like I would with a developer, often requesting it to generate complete functions for convenient copy-and-paste application.
@technovangelist 6 months ago
Then it sounds like twinny and its chat are perfect for you. The ability to generate large sections and preview them before adding them to your code is fantastic.
@envivomedia 6 months ago
@technovangelist I'll definitely give it a whirl! Thanks :D
@TimothyGraupmann 6 months ago
(2:10) Is there a code cleanup support group? AI was struggling with the ffmpeg syntax to get a frame from an MP4 video in Python today. But really, there should be an extension to report code cleanup issues, with a history that AI can learn from. Maybe if something like that existed, we'd be doing less code cleanup. Command+M and Command+L are useful. We need a Command+J for "just go ahead and clean up my code." "Twinny" sounds like something from "The Dark Tower" (Stephen King) series. Twinners are doppelgangers. A thinny is a weak spot in the multiverse.
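For what it's worth, the ffmpeg incantation in question is a one-liner; a sketch with placeholder paths, which a Python script could wrap in subprocess.run:

```sh
# Grab a single frame from an MP4 at the 5-second mark and save it as a PNG
ffmpeg -ss 00:00:05 -i input.mp4 -frames:v 1 frame.png
```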
@kevyyar 6 months ago
How do you set it up? I've downloaded the extension but I get no autocomplete. I have also downloaded Ollama and installed the llamacoder model. But how do you set it up in VS Code?
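A minimal sketch of the moving parts twinny expects, assuming an Ollama backend; the model names below are examples, not the extension's required defaults:

```sh
# Pull a fill-in-the-middle model for inline autocomplete
ollama pull codellama:7b-code
# Pull an instruct model for the chat sidebar
ollama pull codellama:7b-instruct
# Make sure the Ollama API is up; twinny talks to it on 127.0.0.1:11434 by default
ollama serve
```

If autocomplete still doesn't fire, the extension's settings should let you point each feature at a specific model and host.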
@dionisii93 6 months ago
How about Pieces?
@technovangelist 6 months ago
Pieces? Never heard of it. Hmm, snippets with AI. Interesting.
@dionisii93 6 months ago
@technovangelist It has quite a number of supported IDEs, local + cloud models, attachments, and storage.
@technovangelist 6 months ago
Tried them, but they are being way too cryptic about how they work. Seems super sketchy; deleted.
@masnwilliams 5 months ago
@technovangelist Hey! I'm Mason from the Pieces team, and I apologize that we came across as cryptic to you! I'm not sure exactly what you are referring to as cryptic, but I really would love your feedback on our product as a whole. We try to be extremely transparent through livestreams and open source meetups, but something obviously fell through the cracks here, and this isn't how we want our brand perceived by anyone, haha. If you'd be open to it, I'd love to hop on a call and chat sometime about your thoughts.
@rtpHarry 6 months ago
I just noticed the subscribe button under the video get a little animated flair as you said "like and subscribe." Is that some feature of YouTube, or a coincidence that you said it right before the end of the video and maybe it just does it then? Never noticed it before.
@technovangelist 6 months ago
I think it had to be a coincidence. I didn't do anything. But it would be cool if it did.
@actorjohanmatsfredkarlsson2293 6 months ago
Found Privy recently.
@niquedegraaff 6 months ago
I feel like a vampire when looking at your bright VS Code theme... aaaargglgghhhhsssshss
@technovangelist 6 months ago
I can’t stand dark mode. Hurts my eyes to look at for long.
@Machiuka 6 months ago
It needs Ollama, which is not available on Windows yet. Anyway, very interesting project.
@technovangelist 6 months ago
Ollama has been available on Windows for 3 months or so. Turn on Microsoft's own WSL2 feature and then install Ollama.
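The steps Matt describes, as a rough sketch; the install-script URL is Ollama's published one, and the model name is just an example:

```sh
# From an elevated PowerShell prompt: enable WSL2 with an Ubuntu distro
wsl --install

# Then, inside the WSL2 shell, install Ollama via its official script
curl -fsSL https://ollama.com/install.sh | sh

# Pull and run a model to confirm everything works
ollama run llama3
```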
@f2ame536 6 months ago
I'm very new to all these AI coders and have only used GitHub's Copilot. How do these two compare?
@technovangelist 6 months ago
It's often just as capable, without the massive privacy and security risk. A huge number of companies ban the use of tools like Copilot.
@Tymon0000 6 months ago
Anyone know how to use twinny on Windows? I started ollama serve with WSL and twinny doesn't connect to it. It is running on 127.0.0.1:11434, just like in the settings.
@technovangelist 6 months ago
I definitely need some windows content. Thanks.
@technovangelist 6 months ago
But I think the developer of twinny does it on Windows.
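For anyone stuck where Tymon is, a debugging sketch under the assumption that ollama serve runs inside WSL2 while VS Code runs on Windows: WSL2's localhost forwarding usually exposes the port, but when it doesn't, binding to all interfaces and using the WSL VM's own address tends to work.

```sh
# From Windows (PowerShell), check whether the WSL-hosted server is reachable;
# a healthy Ollama responds with "Ollama is running"
curl http://127.0.0.1:11434

# If that fails, restart the server inside WSL bound to all interfaces...
OLLAMA_HOST=0.0.0.0 ollama serve
# ...and point twinny at the WSL VM's address instead of 127.0.0.1
hostname -I   # run inside WSL; prints the IP to use in twinny's settings
```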
@GrantCelley 3 months ago
Continue can do code completion now, but it's in beta. Works great, but still, it's in beta.
@technovangelist 3 months ago
It wasn't when the video was made.
@sumitmamoria 6 months ago
Is there an extension that also generates commit messages?
@technovangelist 6 months ago
I was using something that did it, but I can't remember what it was.
@kokizzu 6 months ago
Can't Cody work with local Ollama?
@technovangelist 6 months ago
Yes, but it's not fully supported, and the tweets in the video are from the CEO saying it may not stay offline and will probably involve a subscription and syncing with online services.
@geomorillo 6 months ago
Ollama still doesn't work on Windows...
@technovangelist 6 months ago
It has worked on Windows for 3 or 4 months.
@testales 6 months ago
I just installed it on WSL2 under Windows 10 and ran Mixtral on it. It took over a day to update this sh*tty Windows so that my GPU became available under WSL; it then took like 5 minutes to install Ubuntu and another 2 minutes to install Ollama. Also, even the quantized version of Mixtral is still 26GB. That Ollama can load it so quickly, and that it performs well enough to be totally usable, seems like magic to me, as my GPU only has 24GB.
@technovangelist 6 months ago
Yeah, if you don't have the memory in the GPU it's going to be slow. Mixtral with its context is going to need 30-40 GB, and there isn't much that can be done there. Stick with models that perform well on your hardware.
@testales 6 months ago
@technovangelist No, it works, as I said; to me it's a little miracle, as I had nearly given up on any MoE model. GPU memory of the 3090 in this system was mostly filled and CUDA usage was surprisingly low, but the model responds quickly enough that a fluid chat is possible. I have no idea how they made it work so well. I recently had a model of 27 or 28GB running in LM Studio and had to offload some layers so it could fit into VRAM; it was 60 layers in total or so, Yi 6-bit if I remember correctly. I could load up to 54 or so, but it was still painfully slow. I chatted with Mixtral for well over an hour; the text generation didn't slow down much, and it completed my question pool like no other model before. Quite insane, I have to say: it answered questions that no other local model has solved so far, including the Yi model I barely got running. The context window size seemed to be a lot lower than 8k, though, but I'm not sure. I haven't figured out yet where all the settings are in Ollama. This time I preferred to just enjoy the model running so well out of the box without having to juggle chat templates and all kinds of parameters. :-)