Chat with your PDF Using Ollama Llama3 - RAG

1,777 views

Sanjjushri Varshini

1 day ago

Discover how to effortlessly extract answers from PDFs in just 8 simple steps!
Useful Links:
📂 GitHub Repository: github.com/San...
🗄️ PDF Document: arxiv.org/abs/...
📖 Medium Article: / rag-pdf-q-a-using-llam...
Don't Forget to:
👍 Like this video if you found it helpful!
🔔 Subscribe for more tutorials on advanced AI!
💬 Comment below if you have any questions or suggestions!
Connect:
LinkedIn: / sanjjushri
Twitter: x.com/Sanjjush...
#chatwithpdf #rag #ollama #llama3 #MachineLearning #EvidentlyAI #DataScience #Python #Tutorial #AI #ML
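The video's 8-step pipeline is in the linked GitHub repository and Medium article; as a rough sketch, the core RAG idea is: split the PDF text into chunks, retrieve the chunks most relevant to a question, and ask a locally running Llama 3 through Ollama's REST endpoint, with no cloud API key involved. All function names below are illustrative, and the keyword-overlap retriever is a stand-in assumption for a real embedding-based retriever:

```python
# Minimal RAG sketch. Assumptions: Ollama is running locally at
# http://localhost:11434 with the llama3 model pulled; function names
# are hypothetical, not the video's actual code.
import json
import urllib.request


def chunk_text(text, size=500, overlap=50):
    """Split text into overlapping character chunks for retrieval."""
    step = size - overlap
    return [text[start:start + size]
            for start in range(0, max(len(text) - overlap, 1), step)]


def retrieve(question, chunks, top_k=3):
    """Naive keyword-overlap retrieval (a real setup would use embeddings)."""
    q_words = set(question.lower().split())
    scored = sorted(chunks,
                    key=lambda c: -len(q_words & set(c.lower().split())))
    return scored[:top_k]


def ask_llama3(question, context):
    """Call the local Ollama REST API -- no API key required."""
    prompt = f"Answer using only this context:\n{context}\n\nQuestion: {question}"
    req = urllib.request.Request(
        "http://localhost:11434/api/generate",
        data=json.dumps({"model": "llama3",
                         "prompt": prompt,
                         "stream": False}).encode(),
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        return json.loads(resp.read())["response"]
```

To answer a question you would chunk the extracted PDF text, call `retrieve` on it, join the top chunks into the context, and pass both to `ask_llama3`. Because Ollama serves the model on localhost, everything stays on your own machine.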

Comments: 15
@sharsha315 13 hours ago
Thanks for the wonderful video about RAG. Great job!!!
@sanjjushri 4 hours ago
Glad the videos helped you! Thanks!
@luizcamillo9933 1 month ago
I have been watching a lot of videos but this is by far the best! For the first time I was able to effectively chat with my documents and handle them efficiently. Thank you! Great job!
@sanjjushri 1 month ago
Thanks! Glad the videos helped you!
@Lordoss 1 month ago
That was pretty useful! Many thanks! Please keep bringing more!
@sanjjushri 1 month ago
Sure!
@pavankurapati6628 16 days ago
You used the Llama 3 LLM, right? Does this project require any API?
@sanjjushri 13 days ago
I did it without using an API, but you can use one if you want. I have included the Medium article and GitHub link for the code.
@saideepj6998 1 month ago
Hi, thanks for the video. Did you run the program locally? I had a long delay in the response from the LLM, and I don't have a GPU on my laptop. Please share how to improve the response speed. Did your laptop have a GPU? Please let me know.
@sanjjushri 1 month ago
The first run takes time; I will upload videos on how to reduce the response time as well.
@saideepj6998 1 month ago
@@sanjjushri Thanks 🙂
@saideepj6998 1 month ago
@@sanjjushri What are the PC/laptop specifications you are running the program on?
@sanjjushri 1 month ago
@@saideepj6998 I use a MacBook Air M1 with 16 GB RAM.
@saideepj6998 1 month ago
@@sanjjushri Thanks for the response.