This is a submission for the Open Source AI Challenge with pgai and Ollama
What I Built
An AI Writing Assistant for Science Fiction Authors to help overcome writer's block with plot ideas based on their current writing. To build this project, I used PostgreSQL with the pgai extension and Ollama.
For this challenge, an author can enter text in the editor. The "AI assistant" uses this text to help the author brainstorm what could happen next and to generate new text. I also attempted to add functionality for saving and embedding story text so that it could later be queried via embeddings.
The frontend and backend are separate services, each spun up as a Docker container. The frontend is built with Nuxt.js/Vue/TailwindCSS/DaisyUI; the backend is built on FastAPI/Python/PostgreSQL, leveraging the pgai and pgvector extensions with Ollama serving the generative AI model.
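To make that flow concrete, here is a minimal sketch of how the brainstorm endpoint could look, assuming pgai's `ai.ollama_generate` function is installed in the database; the route name, prompt, and connection details are hypothetical, and the actual project wiring may differ.

```python
# Minimal sketch (hypothetical names): forward the author's draft to Ollama
# through pgai's ai.ollama_generate() SQL function and return the suggestion.
import os

import psycopg
from fastapi import FastAPI
from pydantic import BaseModel

app = FastAPI()
# Assumes the compose network exposes Postgres as "db" and Ollama as "ollama".
DB_URL = os.environ.get("DATABASE_URL", "postgresql://postgres:postgres@db:5432/postgres")
OLLAMA_HOST = os.environ.get("OLLAMA_HOST", "http://ollama:11434")


class BrainstormRequest(BaseModel):
    text: str  # what the author has written so far


@app.post("/brainstorm")
def brainstorm(req: BrainstormRequest) -> dict:
    prompt = (
        "You are a science fiction writing assistant. "
        "Suggest what could happen next in this story:\n\n" + req.text
    )
    with psycopg.connect(DB_URL) as conn:
        # ai.ollama_generate returns the Ollama response as jsonb;
        # the generated text is under the 'response' key.
        row = conn.execute(
            "SELECT ai.ollama_generate(%s, %s, host => %s)->>'response'",
            ("llama3.2", prompt, OLLAMA_HOST),
        ).fetchone()
    return {"suggestion": row[0]}
```

Posting the current draft to `/brainstorm` then returns a continuation suggestion generated by the llama3.2 model running in the Ollama container.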
I attempted to work with pgvector, pgvectorscale, and the pgai vectorizer, but ran into several challenges along the way and ultimately didn't have enough time to overcome them before the deadline.
Demo
Quick Video Demo:
Code:
meditatingdragon / ollama-pgai-dev-challenge
Science Fiction Writing Assistant powered by AI
AI Writing Assistant for Sci-fi Authors
This project was developed for the Open Source AI Challenge with pgai and Ollama
It is a tool for science fiction authors to overcome writer's block by using AI as an assistant to help complete the story.
Tools
Dev Challenge Tools
- pgai / postgresql
- ollama
Other technologies
- Nuxt.js
- FastAPI
- TailwindCSS
- DaisyUI
Getting Started
This project can be run with docker:
docker compose up -d
This should start several containers: the frontend, the backend, the Ollama model server, and the PostgreSQL database. There is also a container for the vectorizer worker, though it is not used in this project since I did not end up using OpenAI.
For more details on the project, check out the challenge submission.
Tools Used
- Database: Timescale Docker Image
- Frameworks: FastAPI (Python), Nuxt.js (JavaScript/Vue.js)
- Model: llama3.2 (served via Ollama)
- Hackathon Tools: pgai, ollama, pgvector, pgvectorscale
Final Thoughts
This was one of my first AI projects where I dug further into the end-to-end implementation of AI applications. The idea of a writing assistant has been on my mind for many years, though, and I'm glad I had the chance to tinker with and explore it.
Since I hadn't really worked with vector databases before, there was a bit of a learning curve in understanding how to embed and store vector data. If I had more time, I would implement search functionality that leverages the power of embeddings and vector databases.
I managed to store text embeddings when saving a story, but did not have enough time to overcome some challenges I ran into while trying to query those embeddings. I think this would have required configuration updates at several points in the application, and with the project set up the way it was, that was not a simple change. This would be the next feature I would tackle if I had more time.
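For context, here is roughly what the save-and-query path could look like with pgai's `ai.ollama_embed` and pgvector's cosine-distance operator. The table, column, and embedding-model names below are placeholders for illustration rather than the project's actual schema, and the query half is exactly the part I did not get working.

```python
# Sketch (placeholder schema): store an embedding alongside a story chunk,
# then rank chunks by similarity to a query using pgvector's <=> operator.
import psycopg

DB_URL = "postgresql://postgres:postgres@db:5432/postgres"
OLLAMA = "http://ollama:11434"
EMBED_MODEL = "nomic-embed-text"  # placeholder embedding model pulled into Ollama


def save_chunk(story_id: int, chunk: str) -> None:
    with psycopg.connect(DB_URL) as conn:
        # ai.ollama_embed returns a pgvector value for the input text.
        conn.execute(
            """
            INSERT INTO story_chunks (story_id, chunk, embedding)
            VALUES (%s, %s, ai.ollama_embed(%s, %s, host => %s))
            """,
            (story_id, chunk, EMBED_MODEL, chunk, OLLAMA),
        )


def similar_chunks(query: str, limit: int = 5) -> list[str]:
    with psycopg.connect(DB_URL) as conn:
        rows = conn.execute(
            """
            SELECT chunk
            FROM story_chunks
            ORDER BY embedding <=> ai.ollama_embed(%s, %s, host => %s)
            LIMIT %s
            """,
            (EMBED_MODEL, query, OLLAMA, limit),
        ).fetchall()
    return [row[0] for row in rows]
```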
I also wanted to experiment more with fine-tuning models for this particular use case of supporting science fiction authors, and with running a custom model via a Modelfile in Ollama.
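As an illustration of that direction, a custom model could be defined with an Ollama Modelfile along these lines; the model name, system prompt, and parameters here are only examples, not something I actually shipped.

```
# Modelfile (example): a sci-fi writing assistant layered on llama3.2
FROM llama3.2
PARAMETER temperature 0.9
SYSTEM """You are a writing assistant for science fiction authors. Given a
passage from a story, suggest plausible continuations, plot twists, and new
story elements that stay consistent with the author's voice."""
```

It could then be built and run locally with `ollama create scifi-assistant -f Modelfile` followed by `ollama run scifi-assistant`.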
The "product vision" would be to be able to fine tune which elements of a story can be used for "inspiration" for the author (different story elements, such as location, characters, events, etc.) and to include AI generated graphics to help with visualizing story assets.
Overall, I learned a lot from this project and now have a better understanding of how pgai and Ollama can be used to build unique AI applications.
Prize Categories:
I aspired to all three prize categories, but could only incorporate one:
- Open-source Models from Ollama (Ollama via docker image with llama3.2 model)