<?xml version="1.0" encoding="UTF-8"?>
<rss version="2.0" xmlns:atom="http://www.w3.org/2005/Atom" xmlns:dc="http://purl.org/dc/elements/1.1/">
  <channel>
    <title>DEV Community: Donovan So</title>
    <description>The latest articles on DEV Community by Donovan So (@donfour).</description>
    <link>https://dev.to/donfour</link>
    <image>
      <url>https://media2.dev.to/dynamic/image/width=90,height=90,fit=cover,gravity=auto,format=auto/https:%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Fuser%2Fprofile_image%2F1288561%2Fde4129d8-76e4-42ef-9bbe-bd912cf562bc.png</url>
      <title>DEV Community: Donovan So</title>
      <link>https://dev.to/donfour</link>
    </image>
    <atom:link rel="self" type="application/rss+xml" href="https://dev.to/feed/donfour"/>
    <language>en</language>
    <item>
      <title>I built an open-source tool that helps add usage-based billing for your LLM projects</title>
      <dc:creator>Donovan So</dc:creator>
      <pubDate>Mon, 01 Apr 2024 21:51:20 +0000</pubDate>
      <link>https://dev.to/donfour/i-built-an-open-source-tool-that-helps-add-usage-based-billing-for-your-llm-projects-4f56</link>
      <guid>https://dev.to/donfour/i-built-an-open-source-tool-that-helps-add-usage-based-billing-for-your-llm-projects-4f56</guid>
      <description>&lt;p&gt;Nowadays, it is a huge hassle for projects built on top of OpenAI and Anthropic to implement monetization. You have to figure out:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;What are my OpenAI and Anthropic costs for each user?&lt;/li&gt;
&lt;li&gt;How much should I charge each user?&lt;/li&gt;
&lt;li&gt;How do I impose a usage limit on each user to ensure profitability for each pricing tier?&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;BricksLLM helps you answer all of these questions via a highly scalable API gateway built specifically for LLMs.&lt;/p&gt;

&lt;p&gt;Here is a quick walkthrough.&lt;/p&gt;

&lt;p&gt;For each user, you could create a proxy API key (through the REST endpoint) with a spend limit of $100/month and a rate limit of 10,000 requests/month:&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media.dev.to/cdn-cgi/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fkz1r9budyme8wuf1cxql.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media.dev.to/cdn-cgi/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fkz1r9budyme8wuf1cxql.png" alt="Creating an API key with a monthly spend limit and rate limit" width="800" height="609"&gt;&lt;/a&gt;&lt;/p&gt;
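&lt;p&gt;As a sketch, key creation could look like the following curl call. The endpoint path, admin port, and field names here are illustrative assumptions; check the repo's API docs for the exact schema:&lt;/p&gt;

&lt;pre&gt;&lt;code&gt;# Illustrative sketch: the exact endpoint, port, and field names
# are documented in the BricksLLM repo.
curl -X PUT http://localhost:8001/api/key-management/keys \
  -H "Content-Type: application/json" \
  -d '{"name": "user-123", "costLimitInUsd": 100, "rateLimit": 10000}'
&lt;/code&gt;&lt;/pre&gt;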

&lt;p&gt;Then, you can redirect your OpenAI/Anthropic requests to us and start using the key:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight javascript"&gt;&lt;code&gt;&lt;span class="c1"&gt;// OpenAI Node SDK v4&lt;/span&gt;
&lt;span class="k"&gt;import&lt;/span&gt; &lt;span class="nx"&gt;OpenAI&lt;/span&gt; &lt;span class="k"&gt;from&lt;/span&gt; &lt;span class="dl"&gt;'&lt;/span&gt;&lt;span class="s1"&gt;openai&lt;/span&gt;&lt;span class="dl"&gt;'&lt;/span&gt;&lt;span class="p"&gt;;&lt;/span&gt;

&lt;span class="kd"&gt;const&lt;/span&gt; &lt;span class="nx"&gt;openai&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="k"&gt;new&lt;/span&gt; &lt;span class="nc"&gt;OpenAI&lt;/span&gt;&lt;span class="p"&gt;({&lt;/span&gt;
 &lt;span class="na"&gt;apiKey&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="s2"&gt;MY-SECRET-KEY&lt;/span&gt;&lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="c1"&gt;// key created earlier&lt;/span&gt;
 &lt;span class="na"&gt;baseURL&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="s2"&gt;http://localhost:8002/api/providers/openai/v1&lt;/span&gt;&lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="c1"&gt;// redirect to us&lt;/span&gt;
&lt;span class="p"&gt;});&lt;/span&gt;
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;That's it. Just start using OpenAI/Anthropic as you would normally.&lt;/p&gt;
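&lt;p&gt;From here, requests flow through the gateway unchanged. For instance, a regular chat completion call (the model name below is just an example) works as-is:&lt;/p&gt;

&lt;pre&gt;&lt;code&gt;// A standard OpenAI Node SDK v4 call; nothing BricksLLM-specific.
// Uses the `openai` client configured above with the proxy baseURL.
const completion = await openai.chat.completions.create({
  model: "gpt-3.5-turbo",
  messages: [{ role: "user", content: "Hello!" }],
});
console.log(completion.choices[0].message.content);
&lt;/code&gt;&lt;/pre&gt;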

&lt;p&gt;You can query usage metrics by key ID, model, custom ID, and user ID:&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media.dev.to/cdn-cgi/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F0ysrzxlzfej3m3diggry.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media.dev.to/cdn-cgi/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F0ysrzxlzfej3m3diggry.png" alt="Retrieving usage metrics from our API" width="800" height="471"&gt;&lt;/a&gt;&lt;/p&gt;
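&lt;p&gt;As a rough sketch, such a query might look like this; the path and parameter name are hypothetical, so consult the repo for the real reporting API:&lt;/p&gt;

&lt;pre&gt;&lt;code&gt;# Hypothetical example: the path and query parameter are illustrative only.
curl "http://localhost:8001/api/reporting/events?keyIds=user-123"
&lt;/code&gt;&lt;/pre&gt;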

&lt;p&gt;The usage data can feed both analytics and a Stripe integration. BricksLLM is free and open source, and you can spin it up with a single Docker command. Under the hood, it's just a Go web server backed by a PostgreSQL database and a Redis cache.&lt;/p&gt;
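&lt;p&gt;To try it locally, the quickstart is roughly the following (assuming Docker Compose is installed and the repo ships a compose file):&lt;/p&gt;

&lt;pre&gt;&lt;code&gt;# Clone the repo and start the gateway with its PostgreSQL and Redis deps.
git clone https://github.com/bricks-cloud/bricksllm
cd bricksllm
docker compose up -d
&lt;/code&gt;&lt;/pre&gt;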

&lt;p&gt;Check us out and let me know what you think!&lt;/p&gt;

&lt;p&gt;Here is the repo if you want to learn more about it: &lt;a href="https://github.com/bricks-cloud/bricksllm"&gt;https://github.com/bricks-cloud/bricksllm&lt;/a&gt;&lt;/p&gt;

</description>
      <category>openai</category>
      <category>anthropic</category>
      <category>llm</category>
      <category>ai</category>
    </item>
  </channel>
</rss>
