GitCone.com: Chat with all the repositories from this challenge, or with any other repository

This is a submission for the Open Source AI Challenge with pgai and Ollama

What I Built

I picked an application that would resonate with the dev community, one that demonstrates how AI applications are built using embeddings and vector databases, and that also provides a robust starting point for AI projects, using Docker and a Docker Compose file that includes TimescaleDB and Ollama:

  • TimescaleDB
  • pgvector and pgai
  • an Ollama container
  • FastAPI, which provides the APIs and serves the static frontend

I wanted to explore all the options, so the application works with either Ollama or OpenAI via the pgai extension, and it can also work with a cloud database from Timescale using the pgai Vectorizer.
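As a rough sketch of what that switch looks like in practice (the environment variables and model names below are placeholders, not GitCone's actual code, and pgai function signatures vary between versions), the embedding call is just a SQL function that can target either backend:

```python
# Minimal sketch: switching between Ollama and OpenAI embeddings through pgai.
# The environment variables and model names are placeholders; check the pgai
# docs for the exact function signatures of your version.
import os
import psycopg

PROVIDER = os.getenv("EMBEDDING_PROVIDER", "ollama")  # "ollama" or "openai"

def embed_sql(provider: str) -> str:
    """Return the pgai embedding call for the chosen provider."""
    if provider == "openai":
        # Uses the OpenAI API key configured for the pgai extension.
        return "SELECT ai.openai_embed('text-embedding-3-small', %s)"
    # Ollama runs as a sibling container in the Compose stack.
    return "SELECT ai.ollama_embed('nomic-embed-text', %s)"

with psycopg.connect(os.environ["DATABASE_URL"]) as conn:
    row = conn.execute(embed_sql(PROVIDER), ("def hello(): ...",)).fetchone()
    print(row[0])  # a pgvector value computed inside the database
```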

The online version is hosted on a simple VPS instance, so due to resource constraints it is configured to use the OpenAI API.

Demo

I decided to publish the prototype as a site:

https://gitcone.com/


Tools Used

The website uses Python to segment all the files from a repository into smaller chunks, which are stored in the database. It can use either the local TimescaleDB instance, with pgai (and pgvector) included, or a Timescale cloud database to store the chunks together with their respective embeddings.
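The indexing step looks roughly like the sketch below. This is a minimal illustration, not GitCone's actual code: the `repo_chunks` table, the character-based splitter, and the embedding model are all assumptions.

```python
# Minimal sketch of the chunking + storage step: split repository files into
# chunks and store them with embeddings computed in-database by pgai.
# Table name, chunk size, and model are assumptions, not the real schema.
import os
from pathlib import Path
import psycopg

CHUNK_SIZE = 1500  # characters per chunk; a deliberately simple splitter

def chunk_file(path: Path) -> list[str]:
    text = path.read_text(errors="ignore")
    return [text[i:i + CHUNK_SIZE] for i in range(0, len(text), CHUNK_SIZE)]

with psycopg.connect(os.environ["DATABASE_URL"]) as conn:
    conn.execute("""
        CREATE TABLE IF NOT EXISTS repo_chunks (
            id        bigint GENERATED ALWAYS AS IDENTITY PRIMARY KEY,
            repo      text,
            file_path text,
            chunk     text,
            embedding vector(768)  -- must match the embedding model's dimension
        )
    """)
    repo_dir = Path("./cloned_repo")  # hypothetical local clone of the repository
    for path in repo_dir.rglob("*"):
        if not path.is_file():
            continue
        for chunk in chunk_file(path):
            # pgai computes the embedding inside the database
            conn.execute(
                """INSERT INTO repo_chunks (repo, file_path, chunk, embedding)
                   VALUES (%s, %s, %s, ai.ollama_embed('nomic-embed-text', %s))""",
                (repo_dir.name, str(path), chunk, chunk),
            )
```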

The application makes use of the pgai functions to call either Ollama or OpenAI, which makes switching between configurations very easy.
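Answering a question then comes down to one similarity search plus one chat-completion call, both expressed in SQL. Here is a rough sketch, reusing the hypothetical `repo_chunks` table from above; the response shape of `ai.openai_chat_complete` is assumed to follow OpenAI's chat-completion format, so check the pgai docs for your version.

```python
# Minimal RAG sketch: retrieve the most relevant chunks with pgvector's cosine
# distance operator, then ask the model through pgai.
import json
import os
import psycopg

QUESTION = "How does this project configure its database connection?"

with psycopg.connect(os.environ["DATABASE_URL"]) as conn:
    # 1. Retrieve the top matching chunks for the question.
    rows = conn.execute(
        """SELECT chunk
             FROM repo_chunks
            ORDER BY embedding <=> ai.ollama_embed('nomic-embed-text', %s)
            LIMIT 5""",
        (QUESTION,),
    ).fetchall()
    context = "\n---\n".join(r[0] for r in rows)

    # 2. Pass the retrieved chunks to the model as context.
    messages = json.dumps([
        {"role": "system", "content": "Answer using only the provided repository context."},
        {"role": "user", "content": f"Context:\n{context}\n\nQuestion: {QUESTION}"},
    ])
    answer = conn.execute(
        """SELECT ai.openai_chat_complete('gpt-4o-mini', %s::jsonb)
                  -> 'choices' -> 0 -> 'message' ->> 'content'""",
        (messages,),
    ).fetchone()[0]
    print(answer)
```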

In an early version the application used the pgai Vectorizer with the Timescale cloud database, but because it worked only with OpenAI, I switched to the standard pgai approach with a separate embeddings table.

Final Thoughts

I was pleasantly surprised by how easy it was to develop this prototype and by how smoothly it worked. Dockerizing all of the components was challenging, but it brings a lot of benefits.

Prize Categories:
Open-source Models from Ollama, Vectorizer Vibe, All the Extensions

Further developments

To select the context better, I'm looking into adding extra queries and extra fields to the data chunks that would help extract the relevant data based on the user's intent.

Timescale with pgai seems to fit multi-agent architectures very well, since those API calls can simply be issued as database queries, without additional configuration.
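As a sketch of that idea (everything here is hypothetical and reuses the `repo_chunks` table from the earlier sketches), an agent "tool" can be a single database round-trip that does both retrieval and generation:

```python
# Sketch of an agent "tool" that is just one database query, combining
# retrieval and generation through pgai. Names are hypothetical.
import os
import psycopg

def repo_qa_tool(question: str) -> str:
    """Answer a question about the indexed repository in one round-trip."""
    with psycopg.connect(os.environ["DATABASE_URL"]) as conn:
        return conn.execute(
            """
            WITH ctx AS (
                SELECT string_agg(chunk, ' --- ') AS context
                  FROM (SELECT chunk
                          FROM repo_chunks
                         ORDER BY embedding <=> ai.ollama_embed('nomic-embed-text', %(q)s)
                         LIMIT 5) best
            )
            SELECT ai.openai_chat_complete(
                       'gpt-4o-mini',
                       jsonb_build_array(jsonb_build_object(
                           'role', 'user',
                           'content', 'Context: ' || ctx.context
                                      || ' Question: ' || %(q)s))
                   ) -> 'choices' -> 0 -> 'message' ->> 'content'
              FROM ctx
            """,
            {"q": question},
        ).fetchone()[0]
```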

Thanks for the challenge!
