DEV Community

adriens


🆓 Local & Open Source AI: a kind ollama & LlamaIndex intro

❔ About

Sometimes, you may need a convenient yet powerful way to run many LLMs locally with:

  • Only a CPU (e.g. an Intel i5)
  • Little RAM (e.g. ≤ 8 GB)
  • The ability to plug in third-party frameworks (LangChain, LlamaIndex) so you can build complex projects
  • Ease of use (few lines of code, powerful results)

👉 That's exactly what this post is about.

🎯 What you'll learn

In this short demo, you'll see how to:

  • Run on Kaggle (CPU)
  • Use ollama to run open source models
  • Play with a first LlamaIndex example

💡 Benefits & opportunities

Get rid of the weekly GPU usage limits on Kaggle's free plan:


With this CPU-only approach, you can schedule AI-based workflows for free (as long as each run stays within Kaggle's 12-hour session limit).

🍿 Demo

Enough teasing, let's jump into the demo:

📜 Notebook


🔭 Further, stronger

To go further (48 GB of RAM and a GPU required), see Running Mixtral 8x7 locally with LlamaIndex, a full example built around Mixtral.




👋 Kindness is contagious

Please leave a ❤️ or a friendly comment on this post if you found it helpful!
