I want to create an LLM-based application that is trained on several PDFs and can generate answers to questions asked by the end user. I do not have paid API keys. I want to use Ollama to build it. Can I use RAG with Ollama?
Yes, you can. I recently wrote an article on how to create a very basic RAG setup using Ollama; maybe it helps:
Ollama (LLM) + RAG (Elasticsearch embedding store) + LangChain
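To make the idea concrete, here is a minimal sketch of the RAG flow the stack above implements: split your PDFs into text chunks, retrieve the chunks most relevant to the question, and send an augmented prompt to a locally running model. The retriever below is a toy word-overlap ranker purely for illustration; in a real setup you would use embeddings stored in a vector store (e.g. Elasticsearch, as in the linked article) and POST the final prompt to Ollama's local HTTP API (`http://localhost:11434/api/generate` by default). The function names and example chunks here are hypothetical, not from any specific library.

```python
# Toy RAG sketch: retrieve relevant PDF chunks, then build the prompt
# that would be sent to a local Ollama model. No network calls here.

def retrieve(question: str, chunks: list[str], k: int = 1) -> list[str]:
    """Rank text chunks by naive word overlap with the question.
    A real pipeline would use vector similarity over embeddings instead."""
    q_words = set(question.lower().split())
    scored = sorted(
        chunks,
        key=lambda c: len(q_words & set(c.lower().split())),
        reverse=True,
    )
    return scored[:k]

def build_prompt(question: str, context: list[str]) -> str:
    """Assemble the retrieval-augmented prompt for the LLM."""
    joined = "\n".join(context)
    return f"Answer using only this context:\n{joined}\n\nQuestion: {question}"

# Hypothetical chunks, standing in for text extracted from your PDFs.
chunks = [
    "Ollama runs large language models locally without paid API keys.",
    "PDF files can be split into text chunks for retrieval.",
]
question = "Does Ollama need paid API keys?"
prompt = build_prompt(question, retrieve(question, chunks))
print(prompt)
```

In practice you would replace the `print` with a request to the Ollama API (or use a wrapper such as LangChain's Ollama integration) and feed it the same augmented prompt; because everything runs locally, no paid API key is needed.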