<?xml version="1.0" encoding="UTF-8"?>
<rss version="2.0" xmlns:atom="http://www.w3.org/2005/Atom" xmlns:dc="http://purl.org/dc/elements/1.1/">
  <channel>
    <title>DEV Community: hellovai</title>
    <description>The latest articles on DEV Community by hellovai (@hellovai).</description>
    <link>https://dev.to/hellovai</link>
    <image>
      <url>https://media2.dev.to/dynamic/image/width=90,height=90,fit=cover,gravity=auto,format=auto/https:%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Fuser%2Fprofile_image%2F1385955%2F7889e8b2-e37f-4009-acbc-12b08f0d6103.jpeg</url>
      <title>DEV Community: hellovai</title>
      <link>https://dev.to/hellovai</link>
    </image>
    <atom:link rel="self" type="application/rss+xml" href="https://dev.to/feed/hellovai"/>
    <language>en</language>
    <item>
      <title>BAML: A new programming language for using LLMs + a VSCode Playground</title>
      <dc:creator>hellovai</dc:creator>
      <pubDate>Tue, 26 Mar 2024 15:22:51 +0000</pubDate>
      <link>https://dev.to/hellovai/baml-a-new-programming-language-for-using-llms-with-a-vscode-playground-mp</link>
      <guid>https://dev.to/hellovai/baml-a-new-programming-language-for-using-llms-with-a-vscode-playground-mp</guid>
<description>&lt;p&gt;All we wanted was our prompts in our codebase, not on some website. Then just to see the prompts before we ran the code. Then comments inside prompts. Then fewer strings everywhere, and more type safety...&lt;/p&gt;

&lt;p&gt;At some point, it became &lt;a href="https://github.com/boundaryml/baml"&gt;BAML&lt;/a&gt;: a type-safe, self-contained way to call LLMs from Python and/or TypeScript.&lt;/p&gt;

&lt;p&gt;BAML encapsulates all the boilerplate for:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;flexible parsing of LLM responses into your exact data model&lt;/li&gt;
&lt;li&gt;streaming LLM responses as partial JSON&lt;/li&gt;
&lt;li&gt;wrapping LLM calls with retries and fallback strategies&lt;/li&gt;
&lt;/ul&gt;
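&lt;p&gt;To make the first bullet concrete, here's a minimal sketch of the idea in plain Python — this is &lt;em&gt;not&lt;/em&gt; BAML's actual parser, just an illustration of coercing a raw LLM reply (which may wrap its JSON in markdown fences or extra prose) into an exact, typed data model:&lt;/p&gt;

```python
# Conceptual sketch only -- not BAML's parser. Illustrates flexible parsing
# of a raw LLM reply into an exact data model using only the stdlib.
import json
import re
from dataclasses import dataclass, field
from typing import List, Optional

@dataclass
class Education:
    university: str
    start_year: int
    end_year: Optional[int] = None  # unset if still in school

@dataclass
class Resume:
    name: str
    education: List[Education] = field(default_factory=list)

def parse_resume(raw_reply: str) -> Resume:
    # Tolerate replies like "```json\n{...}\n```" or prose around the object:
    # grab the outermost {...} span and parse just that.
    match = re.search(r"\{.*\}", raw_reply, re.DOTALL)
    if match is None:
        raise ValueError("no JSON object found in LLM reply")
    data = json.loads(match.group(0))
    return Resume(
        name=data["name"],
        education=[Education(**e) for e in data.get("education", [])],
    )

reply = (
    "Sure, here is the extracted resume:\n"
    '```json\n{"name": "Ada", "education": '
    '[{"university": "MIT", "start_year": 2020}]}\n```'
)
resume = parse_resume(reply)
```

&lt;p&gt;BAML generates this kind of coercion (and the retry/streaming plumbing) for you from the type definitions, so none of it lives in your application code.&lt;/p&gt;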

&lt;p&gt;Our VSCode extension provides:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;real-time prompt previews,&lt;/li&gt;
&lt;li&gt;an LLM testing playground, and&lt;/li&gt;
&lt;li&gt;syntax highlighting (of course)&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;&lt;a href="https://media.dev.to/cdn-cgi/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Ftj1kteg2per5eyj8qmo8.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media.dev.to/cdn-cgi/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Ftj1kteg2per5eyj8qmo8.png" alt="Testing in vscode" width="800" height="425"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;Here's a short BAML snippet for extracting a resume (with syntax highlighting):&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media.dev.to/cdn-cgi/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Ft6gt1880ez64wu7nmsnc.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media.dev.to/cdn-cgi/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Ft6gt1880ez64wu7nmsnc.png" alt="Image description" width="800" height="340"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;Or in code form:&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight rust"&gt;&lt;code&gt;&lt;span class="c1"&gt;// extract_resume.baml&lt;/span&gt;

&lt;span class="c1"&gt;// 1. Define the type for the output&lt;/span&gt;
&lt;span class="n"&gt;class&lt;/span&gt; &lt;span class="n"&gt;Resume&lt;/span&gt; &lt;span class="p"&gt;{&lt;/span&gt;
  &lt;span class="n"&gt;name&lt;/span&gt; &lt;span class="n"&gt;string&lt;/span&gt;
  &lt;span class="c1"&gt;// Use an array to get multiple education histories&lt;/span&gt;
  &lt;span class="n"&gt;education&lt;/span&gt; &lt;span class="n"&gt;Education&lt;/span&gt;&lt;span class="p"&gt;[]&lt;/span&gt;
&lt;span class="p"&gt;}&lt;/span&gt;

&lt;span class="c1"&gt;// A nested class&lt;/span&gt;
&lt;span class="n"&gt;class&lt;/span&gt; &lt;span class="n"&gt;Education&lt;/span&gt; &lt;span class="p"&gt;{&lt;/span&gt;
  &lt;span class="n"&gt;university&lt;/span&gt; &lt;span class="n"&gt;string&lt;/span&gt;
  &lt;span class="n"&gt;start_year&lt;/span&gt; &lt;span class="nb"&gt;int&lt;/span&gt;
  &lt;span class="c1"&gt;// @description injects context into the prompt about this field&lt;/span&gt;
  &lt;span class="n"&gt;end_year&lt;/span&gt; &lt;span class="nb"&gt;int&lt;/span&gt;&lt;span class="o"&gt;?&lt;/span&gt;
  &lt;span class="o"&gt;@&lt;/span&gt;&lt;span class="nf"&gt;description&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="s"&gt;"unset if still in school"&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt;
&lt;span class="p"&gt;}&lt;/span&gt;

&lt;span class="c1"&gt;// 2. Define the function signature&lt;/span&gt;
&lt;span class="c1"&gt;// This function takes in a single paramater&lt;/span&gt;
&lt;span class="c1"&gt;// Outputs a Resume type&lt;/span&gt;
&lt;span class="n"&gt;function&lt;/span&gt; &lt;span class="n"&gt;ExtractResume&lt;/span&gt; &lt;span class="p"&gt;{&lt;/span&gt;
  &lt;span class="nf"&gt;input&lt;/span&gt; &lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="n"&gt;resume_text&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="n"&gt;string&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt;
  &lt;span class="n"&gt;output&lt;/span&gt; &lt;span class="n"&gt;Resume&lt;/span&gt;
&lt;span class="p"&gt;}&lt;/span&gt;

&lt;span class="c1"&gt;// 3. Use an llm to implement ExtractResume.&lt;/span&gt;
&lt;span class="c1"&gt;// We'll name this impl 'version1'.&lt;/span&gt;
&lt;span class="k"&gt;impl&lt;/span&gt;&lt;span class="o"&gt;&amp;lt;&lt;/span&gt;&lt;span class="n"&gt;llm&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="n"&gt;ExtractResume&lt;/span&gt;&lt;span class="o"&gt;&amp;gt;&lt;/span&gt; &lt;span class="n"&gt;version1&lt;/span&gt; &lt;span class="p"&gt;{&lt;/span&gt;
  &lt;span class="n"&gt;client&lt;/span&gt; &lt;span class="n"&gt;GPT4&lt;/span&gt;
  &lt;span class="n"&gt;prompt&lt;/span&gt; &lt;span class="err"&gt;#&lt;/span&gt;&lt;span class="s"&gt;"
    Extract the resume from:
    ###
    {// This macro injects your input param //}
    {#input.resume_text}
    ###

    Output JSON Schema:
    {// This macro prints out your schema //}
    {#print_type(output)}
  "&lt;/span&gt;&lt;span class="err"&gt;#&lt;/span&gt;
&lt;span class="p"&gt;}&lt;/span&gt;
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;
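&lt;p&gt;The snippet above doesn't show the retry/fallback wrapping mentioned earlier. Conceptually it looks something like this minimal Python sketch — &lt;code&gt;call_gpt4&lt;/code&gt; and &lt;code&gt;call_claude&lt;/code&gt; are hypothetical stand-ins for real clients, and this is not BAML's implementation:&lt;/p&gt;

```python
# Sketch of retry-then-fallback wrapping -- not BAML's implementation.
# The two client functions below are hypothetical stand-ins: the first
# always fails to simulate a transient outage, the second succeeds.
import time

def call_gpt4(prompt: str) -> str:
    raise TimeoutError("simulated transient failure")

def call_claude(prompt: str) -> str:
    return '{"name": "Ada", "education": []}'

def call_with_policy(prompt, clients, retries=2, backoff=0.0):
    # Try each client in order; retry transient failures with exponential
    # backoff before falling back to the next client in the list.
    last_error = None
    for client in clients:
        for attempt in range(retries + 1):
            try:
                return client(prompt)
            except Exception as err:
                last_error = err
                time.sleep(backoff * (2 ** attempt))
    raise RuntimeError("all clients failed") from last_error

reply = call_with_policy("Extract the resume from: ...", [call_gpt4, call_claude])
```

&lt;p&gt;In BAML this policy is declared alongside the client configuration instead of being hand-rolled at every call site.&lt;/p&gt;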



&lt;p&gt;The whole premise really just boils down to:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;preferring to use types, not strings&lt;/li&gt;
&lt;li&gt;not wanting to leave VSCode just to do some LLM testing&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;We also have a bunch of cool features in the works: conditionals and loops in our prompt templates, image support, and more powerful types.&lt;/p&gt;

&lt;p&gt;We're still pretty early and would love to hear your feedback. To get started:&lt;/p&gt;

&lt;p&gt;&lt;a href="https://github.com/boundaryml/baml"&gt;https://github.com/boundaryml/baml&lt;/a&gt;&lt;/p&gt;

</description>
      <category>ai</category>
      <category>opensource</category>
      <category>news</category>
      <category>vscode</category>
    </item>
  </channel>
</rss>
