Alex Spinov

Mistral AI Has a Free API: Here's How to Use Europe's Best Open-Source LLM

What is Mistral AI?

Mistral AI is Europe's leading AI company, building open-source and commercial LLMs that rival GPT-4 at a fraction of the cost. Their models — Mistral 7B, Mixtral 8x7B, Mistral Large, and Codestral — are available through a generous free tier API.

Why Mistral Over OpenAI?

  • Free tier — 1M tokens/month free on La Plateforme (limits may change; check the pricing page)
  • Open-source models — Mistral 7B and Mixtral are Apache 2.0 licensed
  • EU data residency — GDPR-compliant by default
  • Mixture of Experts — Mixtral activates only 2 of its 8 experts per token, so it is fast and cheap
  • Function calling — native tool use on all models
  • Codestral — a specialized coding model that is competitive with GPT-4 on code benchmarks

Quick Start

pip install mistralai
import os

from mistralai import Mistral

# Get a free key at console.mistral.ai; avoid hardcoding it in source
client = Mistral(api_key=os.environ["MISTRAL_API_KEY"])

# Simple chat
response = client.chat.complete(
    model="mistral-large-latest",
    messages=[{"role": "user", "content": "Explain Kubernetes pods in 3 sentences"}]
)
print(response.choices[0].message.content)
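The free tier enforces per-second and per-month rate limits, so real applications should retry on HTTP 429. A minimal sketch — the exact exception class the SDK raises isn't pinned down here, so it matches on the error message; adjust to the SDK's actual exception type in your code:

```python
import random
import time

def with_backoff(call, max_retries=5, base_delay=1.0):
    """Retry `call` with exponential backoff plus jitter on rate-limit errors."""
    for attempt in range(max_retries):
        try:
            return call()
        except Exception as exc:
            # Retry only on rate limits; re-raise anything else or the final failure
            if "429" not in str(exc) or attempt == max_retries - 1:
                raise
            # 1s, 2s, 4s, ... plus a little random jitter to avoid thundering herds
            time.sleep(base_delay * 2 ** attempt + random.uniform(0, base_delay))

# Usage (with the client from the quick start):
# response = with_backoff(lambda: client.chat.complete(
#     model="mistral-large-latest",
#     messages=[{"role": "user", "content": "Hello"}],
# ))
```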

Function Calling (Tool Use)

import json

tools = [{
    "type": "function",
    "function": {
        "name": "get_weather",
        "description": "Get current weather for a city",
        "parameters": {
            "type": "object",
            "properties": {
                "city": {"type": "string", "description": "City name"},
                "unit": {"type": "string", "enum": ["celsius", "fahrenheit"]}
            },
            "required": ["city"]
        }
    }
}]

response = client.chat.complete(
    model="mistral-large-latest",
    messages=[{"role": "user", "content": "What's the weather in Paris?"}],
    tools=tools
)

tool_call = response.choices[0].message.tool_calls[0]
args = json.loads(tool_call.function.arguments)  # arguments come back as a JSON string
print(f"Function: {tool_call.function.name}")
print(f"Args: {args}")
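Once the model returns a tool call, your code has to actually run the function. A minimal local dispatch sketch — `get_weather` here is a hypothetical stand-in, not a real weather API:

```python
import json

def get_weather(city: str, unit: str = "celsius") -> dict:
    # Hypothetical stand-in for a real weather lookup
    return {"city": city, "temp": 18, "unit": unit}

# Map tool names the model may call to local implementations
TOOL_REGISTRY = {"get_weather": get_weather}

def run_tool_call(name: str, arguments: str) -> str:
    """Parse the JSON argument string and dispatch to the matching function."""
    args = json.loads(arguments)
    return json.dumps(TOOL_REGISTRY[name](**args))

# With the tool_call from above:
# result = run_tool_call(tool_call.function.name, tool_call.function.arguments)
```

To complete the loop, append the result to the conversation as a message with role "tool" (including the `tool_call_id`) and call `client.chat.complete` again so the model can phrase the final answer.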

Streaming Responses

stream = client.chat.stream(
    model="mistral-small-latest",
    messages=[{"role": "user", "content": "Write a Python web scraper"}]
)

for chunk in stream:
    if chunk.data.choices[0].delta.content:
        print(chunk.data.choices[0].delta.content, end="", flush=True)
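If you need the full text as well as live output, collect the deltas as they arrive. This helper assumes only the chunk shape shown above (`chunk.data.choices[0].delta.content`), so a simulated stream works just as well:

```python
def collect_stream(events):
    """Print deltas as they arrive and return the full response text."""
    parts = []
    for chunk in events:
        delta = chunk.data.choices[0].delta.content
        if delta:
            print(delta, end="", flush=True)  # live output, token by token
            parts.append(delta)
    return "".join(parts)

# Usage: full_text = collect_stream(stream)
```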

Embeddings for RAG

response = client.embeddings.create(
    model="mistral-embed",
    inputs=["How to deploy Kubernetes", "Docker vs Podman comparison"]
)

# 1024-dimensional vectors for semantic search
vectors = [item.embedding for item in response.data]
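With the vectors in hand, semantic search is just cosine similarity between the query embedding and each document embedding. A dependency-free sketch (in practice you'd reach for numpy or a vector database):

```python
import math

def cosine_similarity(a, b):
    """Cosine similarity between two equal-length vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(y * y for y in b))
    return dot / (norm_a * norm_b)

def rank_documents(query_vec, doc_vecs):
    """Return document indices sorted from most to least similar to the query."""
    scores = [cosine_similarity(query_vec, v) for v in doc_vecs]
    return sorted(range(len(doc_vecs)), key=lambda i: scores[i], reverse=True)
```

Embed the user's query with the same `mistral-embed` model, then feed the query vector and the stored document vectors to `rank_documents` to pick the chunks to stuff into your RAG prompt.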

Code Generation with Codestral

response = client.chat.complete(
    model="codestral-latest",
    messages=[{
        "role": "user",
        "content": "Write a FastAPI endpoint that accepts a CSV file upload, validates columns, and returns JSON summary statistics"
    }]
)
print(response.choices[0].message.content)
# Codestral typically returns complete, runnable code with error handling
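Codestral usually wraps generated code in markdown code fences, so a small parser helps when you want just the code, not the surrounding explanation. A sketch assuming the usual fencing with an optional language tag:

```python
import re

def extract_code_blocks(markdown_text):
    """Pull fenced code block bodies out of an LLM's markdown reply."""
    # Match an opening fence with an optional language tag, then capture
    # everything (non-greedily) up to the closing fence
    pattern = r"```(?:[\w+-]*)\n(.*?)```"
    return [block.strip() for block in re.findall(pattern, markdown_text, re.DOTALL)]

# Usage: code = extract_code_blocks(response.choices[0].message.content)
```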

Mistral Models Comparison

Model           Parameters             Speed     Best For                  Cost (per 1M input tokens)
Mistral 7B      7B                     Fastest   Simple tasks, chatbots    Free (self-hosted)
Mixtral 8x7B    46.7B (12.9B active)   Fast      General purpose           Free (self-hosted)
Mistral Small   -                      Fast      Classification, routing   $0.20
Mistral Large   -                      Medium    Complex reasoning         $2.00
Codestral       -                      Fast      Code generation           $0.30
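One practical takeaway from the table: route each request to the cheapest model that can handle it. A hypothetical router — the model names are the real API identifiers, but the task taxonomy is made up for illustration:

```python
# Cheapest adequate model per task class, following the comparison table
MODEL_FOR_TASK = {
    "classification": "mistral-small-latest",
    "routing": "mistral-small-latest",
    "chat": "mistral-small-latest",
    "code": "codestral-latest",
    "reasoning": "mistral-large-latest",
}

def pick_model(task: str) -> str:
    """Fall back to the large model only when the task is unknown or demanding."""
    return MODEL_FOR_TASK.get(task, "mistral-large-latest")
```

Pairing this with the chat call from the quick start keeps most of your traffic on the $0.20 tier while reserving Mistral Large for genuinely hard requests.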

Real-World Use Case

A European healthtech startup needed GDPR-compliant AI for patient data analysis. OpenAI meant US data processing — a legal nightmare. They switched to Mistral with EU hosting: same quality responses, 60% lower costs with Mixtral, full GDPR compliance, and their legal team could finally sleep at night.


Building AI applications with European data compliance? I help teams integrate LLMs into production systems. Reach out at spinov001@gmail.com or explore my data automation tools on Apify.
