DEV Community

Alex Aslam

Embracing AI in Ruby with Langchainrb: A Developer’s Delight

As artificial intelligence reshapes the tech landscape, Ruby developers now have a charming ally to join the journey: the langchainrb gem. This library elegantly bridges Ruby with powerful AI models, letting you infuse applications with natural language processing without leaving your beloved syntax. Let’s explore how to weave AI magic into Ruby projects with langchainrb, complete with practical examples and Ruby-flavored insights.


Why Langchainrb?

Langchainrb brings AI capabilities to Ruby with minimal friction. It abstracts API complexities, offering intuitive interfaces for tasks like text generation, embeddings, and document analysis. Whether querying data, analyzing sentiments, or building chatbots, langchainrb feels like a natural extension of Ruby—expressive, flexible, and joyful.


Getting Started

1. Installation

Add the gem to your Gemfile:

gem "langchainrb"

Then run:

bundle install

2. Configure Your AI Provider

For OpenAI, set your API key as an environment variable:

export OPENAI_API_KEY="your-api-key"

Or pass the key explicitly when creating the client:

llm = Langchain::LLM::OpenAI.new(api_key: ENV["OPENAI_API_KEY"])

First Conversations with AI

Let’s start with a friendly chat. Ask the AI about Ruby’s origins:

require "langchain"

llm = Langchain::LLM::OpenAI.new(api_key: ENV["OPENAI_API_KEY"])

response = llm.chat(
  messages: [
    {role: "user", content: "Who created the Ruby programming language?"}
  ]
)

puts response.chat_completion
# => "Ruby was created by Yukihiro Matsumoto (Matz) in the mid-1990s."

The chat method sends a prompt and returns a structured response. Easy as puts "Hello, World!".
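Because each chat call is stateless, a multi-turn conversation is just an array you grow yourself and resend each turn. A minimal sketch (the assistant's reply is stubbed here so the snippet runs without an API key):

```ruby
# Multi-turn chat sketch: langchainrb's chat call is stateless, so you keep
# the conversation history yourself and pass it back on every call.
history = [
  { role: "user", content: "Who created the Ruby programming language?" }
]

# With a configured client this would be:
#   reply = llm.chat(messages: history).chat_completion
reply = "Ruby was created by Yukihiro Matsumoto (Matz)." # stubbed for illustration

history << { role: "assistant", content: reply }
history << { role: "user", content: "When was it first released?" }

# The next chat call now carries the full context of the conversation
puts history.map { |m| m[:role] }.join(" -> ")
# => "user -> assistant -> user"
```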


Understanding Text with Embeddings

Embeddings convert text into numerical vectors, enabling semantic analysis. Use them to compare text similarity:

embedding = llm.embed(text: "Ruby is elegant and expressive.").embedding

# Compare with another string
another = llm.embed(text: "The Ruby language prioritizes developer happiness.").embedding

# Cosine similarity: dot product divided by the product of magnitudes
dot = embedding.zip(another).sum { |a, b| a * b }
similarity = dot / (Math.sqrt(embedding.sum { |a| a * a }) * Math.sqrt(another.sum { |b| b * b }))
puts "Similarity score: #{similarity}" # scores near 1.0 mean closely related meanings

High similarity scores indicate related meanings—perfect for recommendation systems or search engines.
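To make the ranking idea concrete without calling an API, here is a tiny semantic search over hand-made three-dimensional vectors. The vectors and texts are invented for illustration; real embeddings have hundreds of dimensions, but the logic is identical:

```ruby
# Toy semantic search: rank documents by cosine similarity to a query vector.
# The vectors below are made up; in practice each would come from
# llm.embed(text: ...).embedding.
def cosine_similarity(a, b)
  dot = a.zip(b).sum { |x, y| x * y }
  dot / (Math.sqrt(a.sum { |x| x * x }) * Math.sqrt(b.sum { |y| y * y }))
end

docs = {
  "Ruby is elegant"      => [0.9, 0.1, 0.2],
  "Rails is a framework" => [0.6, 0.4, 0.3],
  "Recipes for pasta"    => [0.1, 0.9, 0.1]
}
query_vec = [0.85, 0.15, 0.25] # pretend embedding for "Tell me about Ruby"

# The best match is the document whose vector points in the same direction
ranked = docs.max_by { |_text, vec| cosine_similarity(query_vec, vec) }
puts ranked.first
# => "Ruby is elegant"
```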


Building a Custom Q&A Bot

Let’s create a bot that answers questions using your documentation!

1. Load Documents

Load content from a URL or file:

loader = Langchain::Loader.new("https://guides.rubyonrails.org/getting_started.html")
documents = loader.load

2. Process and Store Embeddings

Split text into chunks and generate embeddings:

# RecursiveText expects a string, so pass the loaded document's text
splitter = Langchain::Chunker::RecursiveText.new(documents.value, chunk_size: 500)
chunks = splitter.chunks

# Store embeddings (simplified in-memory example)
vector_store = []
chunks.each do |chunk|
  embedding = llm.embed(text: chunk.text).embedding
  vector_store << { embedding: embedding, text: chunk.text }
end

3. Query Your Knowledge Base

Find relevant text for a user’s question:

question = "What is Rails?"
question_embedding = llm.embed(text: question).embedding

# Cosine similarity between two embedding vectors
def cosine_similarity(a, b)
  dot = a.zip(b).sum { |x, y| x * y }
  dot / (Math.sqrt(a.sum { |x| x * x }) * Math.sqrt(b.sum { |y| y * y }))
end

# Find closest match
closest = vector_store.max_by do |entry|
  cosine_similarity(question_embedding, entry[:embedding])
end

# Ask AI to generate an answer grounded in the retrieved context
answer = llm.chat(
  messages: [
    {role: "system", content: "Answer based on the context: #{closest[:text]}"},
    {role: "user", content: question}
  ]
)

puts answer.chat_completion
# => "Rails is a web application framework written in Ruby..."

Advanced Possibilities

  • Multi-Model Support: Switch providers (Hugging Face, Cohere) with minimal code changes.
  • Chains: Design workflows combining multiple AI steps, like summarizing text then generating a tweet.
  • Vector Databases: Integrate with Redis or Pinecone for scalable embedding storage.
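A "chain" can be as simple as feeding one model call's output into the next. Here is a hand-rolled sketch of the summarize-then-tweet workflow; `summarize_then_tweet`, `FakeResponse`, and `FakeLLM` are illustrative names, not langchainrb APIs, and the stub exists only so the chain runs offline:

```ruby
# A hand-rolled two-step "chain": summarize a text, then turn the summary
# into a tweet. In production, pass a Langchain::LLM::OpenAI instance
# instead of the FakeLLM stub below.
def summarize_then_tweet(llm, text)
  summary = llm.chat(
    messages: [{ role: "user", content: "Summarize in one sentence: #{text}" }]
  ).chat_completion

  llm.chat(
    messages: [{ role: "user", content: "Write a tweet about: #{summary}" }]
  ).chat_completion
end

# Offline stub: echoes the last prompt so the chain can run without an API key
FakeResponse = Struct.new(:chat_completion)
class FakeLLM
  def chat(messages:)
    FakeResponse.new("echo: #{messages.last[:content]}")
  end
end

puts summarize_then_tweet(FakeLLM.new, "Ruby favors developer happiness.")
```

Any object that responds to `chat` works here, which also makes chains easy to unit-test.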

Best Practices

  • Cost Awareness: Monitor API usage to avoid surprises.
  • Error Handling: Wrap calls in begin-rescue blocks to handle rate limits or timeouts.
  • Caching: Store frequent queries to reduce API calls.
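The error-handling and caching practices above can be sketched in a few lines of plain Ruby. `with_retries` and `cached_chat` are hypothetical helpers, not langchainrb APIs, and the flaky stub stands in for a real client hitting a rate limit:

```ruby
# Retry transient failures with a brief backoff before giving up
def with_retries(max_attempts: 3)
  attempts = 0
  begin
    yield
  rescue StandardError
    attempts += 1
    raise if attempts >= max_attempts
    sleep(0.1 * attempts) # brief backoff before retrying
    retry
  end
end

# Memoize answers by prompt so repeat questions cost nothing
CACHE = {}
def cached_chat(llm, prompt)
  CACHE[prompt] ||= with_retries do
    llm.chat(messages: [{ role: "user", content: prompt }]).chat_completion
  end
end

# Offline stub that fails once, then succeeds - the retry absorbs the failure
FlakyResponse = Struct.new(:chat_completion)
class FlakyLLM
  attr_reader :calls

  def initialize
    @calls = 0
  end

  def chat(messages:)
    @calls += 1
    raise "simulated timeout" if @calls == 1
    FlakyResponse.new("pong")
  end
end

llm = FlakyLLM.new
puts cached_chat(llm, "ping") # => "pong" (after one retry)
puts cached_chat(llm, "ping") # => "pong" (served from cache, no API call)
puts llm.calls                # => 2
```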

Conclusion

Langchainrb invites Rubyists to explore AI with the same elegance and joy that defines Ruby. Whether enhancing apps with smart features or prototyping AI concepts, this gem turns complexity into simplicity. So why not let Ruby’s “developer happiness” philosophy guide your next AI adventure?

# Here’s to delightful coding with AI! 🚀
puts "Happy coding, Matz would approve!"


Now, go forth and build something brilliant! 🌟
