
Alberto Barrago

RefAIne: Transform Casual Prompts into Expert-Level AI Instructions

A free, open-source FastAPI microservice that converts simple prompts into detailed, production-ready AI instructions. Supports multiple LLM providers including Claude, OpenAI, Ollama, and Groq.

The Problem

Getting AI assistants to understand exactly what you need often takes multiple iterations. Ask for "a REST API" and you might get a bare-bones example, when what you actually need includes authentication, error handling, validation, and production-ready code.

This gap between brief prompts and detailed requirements costs developers significant time in refinement cycles.

The Solution

RefAIne is a FastAPI microservice (approximately 145 lines) that transforms casual prompts into comprehensive engineering instructions.

Example Input: "make a function to sort a list"

Example Output:

Create a Python function that sorts a list with the following requirements:

  1. Function signature: Accept a list of comparable elements as input
  2. Return a new sorted list (do not modify the original)
  3. Use Python's built-in sorting (efficient O(n log n) Timsort)
  4. Add type hints and docstrings with examples
  5. Include error handling for None or non-list inputs
  6. Handle edge cases: empty list, single element, already sorted
  7. Make it generic to work with any comparable types

Provide clean, PEP 8 compliant code with tests.

RefAIne automatically adds technical context, specificity, best practices, and edge case considerations.
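The post doesn't show RefAIne's actual prompt template, but the general pattern is to wrap the short user prompt in a refinement instruction before sending it to the LLM. Here is a minimal sketch of that idea; the template text and function name are illustrative, not RefAIne's implementation:

# Illustrative only: wrap the casual prompt in a refinement instruction.
# This template is NOT RefAIne's actual prompt; it just shows the pattern.
REFINEMENT_TEMPLATE = """You are a senior software engineer.
Rewrite the user's request as a detailed, production-ready specification.
Add technical context, explicit requirements, best practices, and edge cases.

User request: {prompt}
"""

def build_refinement_prompt(prompt: str) -> str:
    """Return the expanded instruction that will be sent to the LLM."""
    return REFINEMENT_TEMPLATE.format(prompt=prompt)

print(build_refinement_prompt("make a function to sort a list"))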

Key Features

Free Testing Options

Test locally with Ollama at no cost before using paid APIs. No credit card or API key required for initial testing.

Multiple Provider Support

Switch between LLM providers with a single environment variable (a sketch of this dispatch pattern follows the list):

  • Ollama (free, local, private)
  • Groq (free tier available)
  • Anthropic Claude (paid)
  • OpenAI (paid)
  • Any OpenAI-compatible API
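The environment variable names below come from the Quick Start .env; the dispatch itself is a hedged sketch of the pattern, not RefAIne's actual code. Ollama and Groq both expose OpenAI-compatible endpoints, which is why a single OpenAI client can cover them by pointing base_url at a different server:

# Sketch of env-driven provider selection (assumed structure, not the
# real implementation). Variable names match the Quick Start .env.
import os

from openai import OpenAI  # covers OpenAI, Ollama, and Groq via base_url

def make_client() -> OpenAI:
    provider = os.getenv("LLM_PROVIDER", "openai")
    if provider == "openai":
        # Any OpenAI-compatible endpoint: api.openai.com, a local Ollama
        # server (http://localhost:11434/v1), or Groq's compatible API.
        return OpenAI(
            base_url=os.getenv("OPENAI_BASE_URL"),
            api_key=os.getenv("OPENAI_API_KEY"),
        )
    # A real service would add branches here, e.g. wrapping the Anthropic SDK
    # when LLM_PROVIDER is set to anthropic.
    raise ValueError(f"Unsupported LLM_PROVIDER: {provider}")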

Production Ready

Includes Docker configuration, comprehensive documentation, and environment-based configuration.

Focused Design

Single-purpose service with one endpoint and minimal dependencies.
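To give a feel for how little such a service needs, here is a simplified single-endpoint sketch. The /refine route and the "prompt" request field match the curl example in the Quick Start; the system message and the response key "refined_prompt" are assumptions, not the actual ~145-line implementation:

# Simplified sketch of a single-endpoint refinement service (assumed
# structure; the response field name "refined_prompt" is hypothetical).
import os

from fastapi import FastAPI
from openai import OpenAI
from pydantic import BaseModel

app = FastAPI()
client = OpenAI(
    base_url=os.getenv("OPENAI_BASE_URL"),
    api_key=os.getenv("OPENAI_API_KEY"),
)

class RefineRequest(BaseModel):
    prompt: str

@app.post("/refine")
def refine(req: RefineRequest) -> dict:
    # Ask the configured model to expand the casual prompt into a detailed spec.
    completion = client.chat.completions.create(
        model=os.getenv("OPENAI_MODEL", "llama3.1"),
        messages=[
            {"role": "system", "content": "Rewrite the user's request as a "
                                          "detailed, production-ready engineering instruction."},
            {"role": "user", "content": req.prompt},
        ],
    )
    return {"refined_prompt": completion.choices[0].message.content}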

About the Project

RefAIne was created by Alberto Barrago to automate the prompt refinement process. The project is open source under the MIT license, which permits use in commercial projects.

Supporting Development

RefAIne is free to use and will remain so. However, maintaining open-source software requires ongoing investment:

  • Server costs for testing and demonstrations
  • API credits for development across multiple providers
  • Time for bug fixes, features, and support
  • Documentation and example creation

If RefAIne has been useful in your work, consider supporting the project:

  • Star the repository: github.com/AlbertoBarrago/RefAIne
  • Contribute financially via GitHub Sponsors or donation platforms
  • Share the project with other developers
  • Report issues or contribute code improvements

Financial support helps maintain active development, add new provider integrations, improve refinement quality, and respond to community needs.

Quick Start

# Clone repository
git clone https://github.com/AlbertoBarrago/RefAIne.git
cd RefAIne

# Install and configure Ollama (for free local testing)
curl -fsSL https://ollama.com/install.sh | sh
ollama pull llama3.1

# Set up environment (Ollama serves an OpenAI-compatible API, so the
# openai provider settings below point at the local Ollama server)
cat > .env << EOF
LLM_PROVIDER=openai
OPENAI_BASE_URL=http://localhost:11434/v1
OPENAI_API_KEY=ollama
OPENAI_MODEL=llama3.1
EOF

# Install and run
uv pip install -r pyproject.toml
uvicorn main:app --reload

# Test the endpoint
curl -X POST http://localhost:8000/refine \
  -H "Content-Type: application/json" \
  -d '{"prompt": "create a REST API"}'

Interactive API documentation is available at http://localhost:8000/docs.
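You can also call the service from Python instead of curl. A minimal client sketch, assuming the default host and port used above (it prints the raw JSON rather than a specific field, since the exact response schema lives in /docs):

# Minimal Python client for the /refine endpoint (host/port from the
# Quick Start above; inspect /docs for the exact response schema).
import requests

resp = requests.post(
    "http://localhost:8000/refine",
    json={"prompt": "create a REST API"},
    timeout=60,
)
resp.raise_for_status()
print(resp.json())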

Resources

  • GitHub repository: github.com/AlbertoBarrago/RefAIne
  • Interactive API docs (when running locally): http://localhost:8000/docs

Contributions and feedback are welcome.
