<?xml version="1.0" encoding="UTF-8"?>
<rss version="2.0" xmlns:atom="http://www.w3.org/2005/Atom" xmlns:dc="http://purl.org/dc/elements/1.1/">
  <channel>
    <title>DEV Community: restlessronin</title>
    <description>The latest articles on DEV Community by restlessronin (@restlessronin).</description>
    <link>https://dev.to/restlessronin</link>
    <image>
      <url>https://media2.dev.to/dynamic/image/width=90,height=90,fit=cover,gravity=auto,format=auto/https:%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Fuser%2Fprofile_image%2F2179737%2Fe19d7a83-ae47-4ab6-9f7f-02df1afdb2db.jpeg</url>
      <title>DEV Community: restlessronin</title>
      <link>https://dev.to/restlessronin</link>
    </image>
    <atom:link rel="self" type="application/rss+xml" href="https://dev.to/feed/restlessronin"/>
    <language>en</language>
    <item>
      <title>LLM Context: A CLI tool for efficient AI-assisted development</title>
      <dc:creator>restlessronin</dc:creator>
      <pubDate>Wed, 09 Oct 2024 04:27:08 +0000</pubDate>
      <link>https://dev.to/restlessronin/llm-context-a-cli-tool-for-efficient-ai-assisted-development-48a</link>
      <guid>https://dev.to/restlessronin/llm-context-a-cli-tool-for-efficient-ai-assisted-development-48a</guid>
<description>&lt;p&gt;I've released LLM Context, an open-source CLI tool designed to streamline context management when working with web-based LLM interfaces for software development. It addresses the challenge of efficiently providing relevant project information to AI assistants, and is particularly useful for models without native file-attachment support (such as OpenAI's new o1 models).&lt;/p&gt;

&lt;p&gt;Key features:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;Rapid context updates using smart file selection (.gitignore-aware)&lt;/li&gt;
&lt;li&gt;Clipboard integration for seamless pasting into web LLM chats&lt;/li&gt;
&lt;li&gt;Support for various LLMs and project types&lt;/li&gt;
&lt;li&gt;Minimal workflow disruption&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;Technical details:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;Written in Python, installable via pipx&lt;/li&gt;
&lt;li&gt;Leverages .gitignore patterns for intelligent file selection&lt;/li&gt;
&lt;li&gt;Uses tree-sitter for experimental file outlining feature&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;Quick start:&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;pipx install llm-context
cd /path/to/your/project
lc-init
lc-sel-files
lc-context
# Paste output into your LLM chat
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;The tool was developed using itself, demonstrating its practical application in AI-assisted development. We've documented real-world usage examples, including how it was used to build and structure our website.&lt;/p&gt;

&lt;p&gt;I'm particularly interested in feedback from developers working with models like o1-preview and o1-mini, or those who prefer web interfaces for AI-assisted development.&lt;/p&gt;

&lt;p&gt;GitHub: &lt;a href="https://github.com/cyberchitta/llm-context.py" rel="noopener noreferrer"&gt;https://github.com/cyberchitta/llm-context.py&lt;/a&gt;&lt;br&gt;
Detailed rationale and examples: &lt;a href="https://www.cyberchitta.cc/articles/llm-ctx-why.html" rel="noopener noreferrer"&gt;https://www.cyberchitta.cc/articles/llm-ctx-why.html&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;If you have any questions or feedback, please feel free to follow up in the comments.&lt;/p&gt;

</description>
    </item>
  </channel>
</rss>
