DEV Community

# llm

Posts

- You're Already Routing Claude Code. You're Just Doing It Manually. (1 comment · 3 min read)
- Chat Completions vs OpenAI Responses API: What Actually Changed (10 reactions · 4 min read)
- What It Actually Takes to Run a RAG System in Production (2 min read)
- I Finally Found the Words for How I've Always Programmed: Nāma-Rūpa and Latent Space (5 min read)
- AI Factory: Automating AI Agent Project Setup (5 min read)
- When MCP Is Not The Right Choice (1 reaction · 7 min read)
- Prompt Management Is Infrastructure: Requirements, Tools, and Patterns (6 reactions · 3 comments · 11 min read)
- Never Get Caught by an LLM Deprecation Again: A Guide to llm-model-deprecation (1 comment · 12 min read)
- I built an LLM Request Cascade proxy that auto-switches models before you ever timeout (4 comments · 5 min read)
- Stop Passing Raw DataFrames to Your LLM — Here's a Better Way (1 reaction · 4 min read)
- Embedding Accessibility into AI based software development (29 reactions · 2 comments · 5 min read)
- Why Most AI Agents Fail (And How to Design Them Right) (31 reactions · 1 comment · 6 min read)
- Building a Production RAG Server with Ollama, Open WebUI and Chroma DB (1 min read)
- What “Production-Ready LLM Feature” Really Means (1 reaction · 2 comments · 5 min read)
- Self-Hosted AI Models: A Practical Guide to Running LLMs Locally (2026) (1 reaction · 19 min read)