<?xml version="1.0" encoding="UTF-8"?>
<rss version="2.0" xmlns:atom="http://www.w3.org/2005/Atom" xmlns:dc="http://purl.org/dc/elements/1.1/">
  <channel>
    <title>DEV Community: ddamno</title>
    <description>The latest articles on DEV Community by ddamno (@ddamno_f4be48b6d302c9491c).</description>
    <link>https://dev.to/ddamno_f4be48b6d302c9491c</link>
    <image>
      <url>https://media2.dev.to/dynamic/image/width=90,height=90,fit=cover,gravity=auto,format=auto/https:%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Fuser%2Fprofile_image%2F3392192%2F80f51074-a3e3-4125-8f97-1fa1c594aee3.png</url>
      <title>DEV Community: ddamno</title>
      <link>https://dev.to/ddamno_f4be48b6d302c9491c</link>
    </image>
    <atom:link rel="self" type="application/rss+xml" href="https://dev.to/feed/ddamno_f4be48b6d302c9491c"/>
    <language>en</language>
    <item>
      <title>InCoder | JetBrains Plugin</title>
      <dc:creator>ddamno</dc:creator>
      <pubDate>Sun, 27 Jul 2025 18:11:27 +0000</pubDate>
      <link>https://dev.to/ddamno_f4be48b6d302c9491c/incoder-jetbrains-plugin-2jhp</link>
      <guid>https://dev.to/ddamno_f4be48b6d302c9491c/incoder-jetbrains-plugin-2jhp</guid>
      <description>&lt;h2&gt;
  
  
  What is &lt;a href="https://plugins.jetbrains.com/plugin/26037-incoder" rel="noopener noreferrer"&gt;InCoder&lt;/a&gt;?
&lt;/h2&gt;

&lt;p&gt;InCoder is a plugin designed for JetBrains IDEs — including IntelliJ IDEA, PyCharm, and others in the ecosystem. It seamlessly integrates advanced Large Language Models (LLMs) into your development workflow, offering in-editor code generation, comprehension, and editing assistance.&lt;/p&gt;

&lt;p&gt;Whether you're using OpenAI, Claude, or a local model with Ollama, InCoder keeps everything inside your IDE — no external chat windows, no token quotas, no distractions.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F6mbqhagde6emnrscse7m.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F6mbqhagde6emnrscse7m.png" alt="demo chat" width="800" height="483"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;h2&gt;
  
  
  Why I Built InCoder
&lt;/h2&gt;

&lt;p&gt;I spend most of my day inside IntelliJ Community Edition. I didn’t want to pay for Ultimate just to access AI features. And I didn’t want to rely on bloated, cloud-locked tools that felt disconnected from how I actually code.&lt;/p&gt;

&lt;h2&gt;
  
  
  What I really wanted was:
&lt;/h2&gt;

&lt;ul&gt;
&lt;li&gt;Full control over the LLM and its behavior&lt;/li&gt;
&lt;li&gt;The ability to use local models (Ollama) or Claude via API&lt;/li&gt;
&lt;li&gt;A workflow that stays in the IDE and understands my entire project&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;So I built InCoder.&lt;/p&gt;

&lt;h2&gt;
  
  
  What It Does
&lt;/h2&gt;

&lt;ul&gt;
&lt;li&gt;Connects to OpenAI, Anthropic (Claude), and Ollama (local models such as LLaMA 3 and Mistral)&lt;/li&gt;
&lt;li&gt;Can be extended to support other LLM providers&lt;/li&gt;
&lt;li&gt;Reads project files automatically to understand context&lt;/li&gt;
&lt;li&gt;Lets you define and tweak system prompts&lt;/li&gt;
&lt;li&gt;Gives LLMs access to tools for:

&lt;ul&gt;
&lt;li&gt;File reading/writing&lt;/li&gt;
&lt;li&gt;Creating patches or merge requests&lt;/li&gt;
&lt;li&gt;Running CLI commands (with user approval)&lt;/li&gt;
&lt;/ul&gt;


&lt;/li&gt;

&lt;/ul&gt;
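
&lt;p&gt;To make the "with user approval" part concrete, here is a minimal sketch of that gating pattern. This is illustrative only — the function and callback names are hypothetical, not InCoder's actual API:&lt;/p&gt;

```python
import subprocess

# Hypothetical sketch of the "run CLI commands with user approval" pattern.
# Nothing here is InCoder's real API; it only illustrates the gating idea:
# the LLM proposes a command, and it executes only if the user confirms.
def run_with_approval(command: list[str], approve) -> str:
    """Run a CLI command only if the approve() callback (e.g. an IDE
    confirmation dialog) returns True for the proposed command line."""
    if not approve(" ".join(command)):
        return "command rejected by user"
    result = subprocess.run(command, capture_output=True, text=True)
    return result.stdout
```

&lt;p&gt;In an IDE plugin the &lt;code&gt;approve&lt;/code&gt; callback would be a confirmation dialog rather than a lambda.&lt;/p&gt;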

&lt;h2&gt;
  
  
  Who It's For
&lt;/h2&gt;

&lt;ul&gt;
&lt;li&gt;Developers using JetBrains Community Edition (or any JetBrains IDE)&lt;/li&gt;
&lt;li&gt;Those who want full control over their AI assistant&lt;/li&gt;
&lt;li&gt;Anyone frustrated with the limitations of Junie, Copilot, or similar plugins&lt;/li&gt;
&lt;li&gt;Tinkerers who want to experiment with local models or self-hosted APIs&lt;/li&gt;
&lt;/ul&gt;

&lt;h2&gt;
  
  
  My Setup
&lt;/h2&gt;

&lt;p&gt;Right now, I use Claude Sonnet via Anthropic’s API — and it works beautifully. I’ve also tested InCoder with Ollama running local LLMs like LLaMA 3. The performance is solid, and the integration is seamless.&lt;/p&gt;

&lt;p&gt;But what really makes it powerful is flexibility.&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;I can run custom CLI commands from prompts (with confirmation)&lt;/li&gt;
&lt;li&gt;I can scaffold components, refactor, or generate boilerplate code&lt;/li&gt;
&lt;li&gt;I’ve built full React + TypeScript frontends this way — with the LLM guiding architecture, structure, and code&lt;/li&gt;
&lt;/ul&gt;
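
&lt;p&gt;If you want to try the local route yourself, Ollama serves models over a local HTTP API (&lt;code&gt;POST /api/generate&lt;/code&gt; on port 11434 by default). This sketch only builds the request payload — the model name is just an example, and sending it assumes an Ollama server is running:&lt;/p&gt;

```python
import json

# Default endpoint of a locally running Ollama server.
OLLAMA_URL = "http://localhost:11434/api/generate"

def build_request(prompt: str, model: str = "llama3") -> str:
    """Serialize a non-streaming generation request for Ollama's
    /api/generate route ("llama3" is an example model name)."""
    return json.dumps({"model": model, "prompt": prompt, "stream": False})

payload = build_request("Explain this function")
```

&lt;p&gt;Any HTTP client can then POST that payload to &lt;code&gt;OLLAMA_URL&lt;/code&gt;, which is what makes local models easy to plug into an IDE workflow.&lt;/p&gt;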

&lt;h2&gt;
  
  
  Final Thoughts
&lt;/h2&gt;

&lt;p&gt;I didn’t set out to build a product — I built InCoder because I needed a tool that just worked the way I code. It's fast. Hackable. Private. And designed to integrate natively with the JetBrains environment.&lt;/p&gt;

&lt;p&gt;If that sounds like your style, give it a try.&lt;/p&gt;

&lt;p&gt;👉 Install &lt;a href="https://plugins.jetbrains.com/plugin/26037-incoder" rel="noopener noreferrer"&gt;InCoder&lt;/a&gt;&lt;br&gt;
🛠️ Contribute or check out the &lt;a href="https://github.com/damiano1996/incoder-plugin" rel="noopener noreferrer"&gt;code&lt;/a&gt;&lt;/p&gt;




&lt;p&gt;Happy hacking!&lt;br&gt;
— Damiano&lt;/p&gt;

</description>
      <category>langchain</category>
      <category>ai</category>
      <category>opensource</category>
      <category>github</category>
    </item>
  </channel>
</rss>
