I've released LLM Context, an open-source CLI tool that streamlines context management when working with web-based LLM interfaces for software development. It addresses the challenge of efficiently providing relevant project information to AI assistants, and it is particularly useful for models without native file attachment support (such as OpenAI's new o1 models).
Key features:
- Rapid context updates using smart file selection (.gitignore-aware; see the sketch after this list)
- Clipboard integration for seamless pasting into web LLM chats
- Support for various LLMs and project types
- Minimal workflow disruption
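To make the first two bullets concrete, here is a minimal, illustrative Python sketch of .gitignore-aware file selection plus clipboard hand-off. It assumes the third-party `pathspec` and `pyperclip` packages and is not necessarily how llm-context implements this internally:

```python
# Illustrative sketch only -- assumes pathspec and pyperclip are installed;
# llm-context's real selection logic may differ.
from pathlib import Path

import pathspec
import pyperclip


def select_files(root: Path) -> list[Path]:
    """Collect project files, skipping the .git directory and anything
    matched by the project's .gitignore patterns."""
    patterns = (root / ".gitignore").read_text().splitlines()
    spec = pathspec.PathSpec.from_lines("gitwildmatch", patterns)
    return [
        p
        for p in sorted(root.rglob("*"))
        if p.is_file()
        and ".git" not in p.parts
        and not spec.match_file(str(p.relative_to(root)))
    ]


if __name__ == "__main__":
    root = Path(".")
    # Concatenate file contents with simple headers and copy to the clipboard,
    # ready to paste into a web LLM chat.
    context = "\n\n".join(
        f"## {p.relative_to(root)}\n{p.read_text(errors='ignore')}"
        for p in select_files(root)
    )
    pyperclip.copy(context)
```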
Technical details:
- Written in Python, installable via pipx
- Leverages .gitignore patterns for intelligent file selection
- Uses tree-sitter for an experimental file-outlining feature (sketched below)
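For a sense of how the experimental outlining might work, the sketch below extracts top-level class and function headers with tree-sitter. It assumes the current `tree-sitter` and `tree-sitter-python` Python packages (the binding's API has changed across releases), and the `outline` helper is a name invented for this example, not an llm-context API:

```python
# Illustrative sketch only -- assumes recent tree-sitter and
# tree-sitter-python packages; the binding's API differs between versions.
import tree_sitter_python as tspython
from tree_sitter import Language, Parser

PY_LANGUAGE = Language(tspython.language())
parser = Parser(PY_LANGUAGE)


def outline(source: bytes) -> list[str]:
    """Return the first line of each top-level class or function definition."""
    tree = parser.parse(source)
    lines = source.decode("utf-8", errors="ignore").splitlines()
    return [
        lines[node.start_point[0]]
        for node in tree.root_node.children
        if node.type in ("class_definition", "function_definition")
    ]


if __name__ == "__main__":
    with open("example.py", "rb") as f:
        print("\n".join(outline(f.read())))
```

Languages other than Python would need their own grammar packages and node-type names; the real feature handles more than this toy example.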
Quick start:
```shell
pipx install llm-context
cd /path/to/your/project
lc-init
lc-sel-files
lc-context
# Paste output into your LLM chat
```
The tool was developed using itself, demonstrating its practical application in AI-assisted development. We've documented real-world usage examples, including how it was used to build and structure our website.
I'm particularly interested in feedback from developers working with models like o1-preview and o1-mini, or those who prefer web interfaces for AI-assisted development.
GitHub: https://github.com/cyberchitta/llm-context.py
Detailed rationale and examples: https://www.cyberchitta.cc/articles/llm-ctx-why.html
If you have any questions or feedback, please feel free to follow up in the comments.