Ever watch your dog pace back and forth at 2 AM and think "What is going on in that head?"
I did. So I built something.
This is the story of how I combined computer vision, GPT-4, and a cheap Raspberry Pi camera into an AI-powered pet behavior analyzer — and what I learned about both machine learning and my dog's obvious anxiety issues.
## The Problem
Pets can't talk. But they communicate — through posture, movement, routine, and all the weird little things they do. The challenge? Most of us aren't trained to read those signals.
I wanted a system that could:
- Detect behavioral patterns over time
- Flag anomalies (sudden changes in eating, sleeping, activity)
- Generate natural language summaries a non-expert could actually use
Here's what I built.
## The Stack
- Python 3.11
- OpenCV (motion detection + frame sampling)
- OpenAI Vision API (GPT-4o)
- SQLite (behavior log storage)
- FastAPI (local dashboard)
- Raspberry Pi 4 + USB camera (optional — works with any webcam)
## Step 1: Motion Detection & Frame Sampling
There's no point analyzing every frame — that's expensive and slow. Instead, we sample a frame only when motion is detected, and at most once per interval.
```python
import cv2
import time
import base64
from datetime import datetime

def detect_and_sample(camera_index=0, sensitivity=500, interval_sec=30, run_seconds=None):
    """Capture base64-encoded JPEG frames whenever motion is detected."""
    cap = cv2.VideoCapture(camera_index)
    ret, prev_frame = cap.read()
    if not ret:
        cap.release()
        raise RuntimeError("Could not read from camera")
    prev_gray = cv2.cvtColor(prev_frame, cv2.COLOR_BGR2GRAY)
    last_capture = 0
    frames = []
    start = time.time()

    while True:
        ret, frame = cap.read()
        if not ret:
            break
        if run_seconds is not None and time.time() - start > run_seconds:
            break  # optional stop condition so the function can actually return
        gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
        diff = cv2.absdiff(prev_gray, gray)
        # Threshold first so sensor noise doesn't inflate the score:
        # motion_score counts pixels that changed noticeably between frames,
        # which keeps sensitivity values like 500 meaningful.
        _, thresh = cv2.threshold(diff, 25, 255, cv2.THRESH_BINARY)
        motion_score = cv2.countNonZero(thresh)
        now = time.time()
        if motion_score > sensitivity and (now - last_capture) > interval_sec:
            timestamp = datetime.now().isoformat()
            _, buffer = cv2.imencode('.jpg', frame)
            b64 = base64.b64encode(buffer).decode('utf-8')
            frames.append({'timestamp': timestamp, 'image': b64})
            last_capture = now
            print(f"[{timestamp}] Motion captured (score: {motion_score})")
        prev_gray = gray

    cap.release()
    return frames
```
**Key insight:** `sensitivity=500` worked great for a medium-sized dog. Cats need a lower threshold (around `200`) because their movements are more subtle. Because of course they are.
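If you'd rather not hand-tune the threshold per pet, one approach is to record a minute of empty-room footage and set the threshold a few standard deviations above the baseline noise. This helper isn't part of the pipeline above — `suggest_sensitivity` is my own sketch, and the sample scores are made up:

```python
from statistics import mean, stdev

def suggest_sensitivity(baseline_scores, k=3.0):
    """Pick a motion threshold from scores recorded while the room is empty.

    baseline_scores: per-frame motion scores (changed-pixel counts) with no
    pet in view. Anything above mean + k standard deviations is motion.
    """
    if len(baseline_scores) < 2:
        raise ValueError("need at least two baseline frames")
    return mean(baseline_scores) + k * stdev(baseline_scores)

# Hypothetical noise scores from an empty room (changed pixels per frame)
baseline = [120, 95, 130, 110, 105, 125, 100, 115, 140, 90]
threshold = suggest_sensitivity(baseline)
```

Collect `baseline_scores` with the same threshold-and-count logic as the main loop so the units match.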
## Step 2: GPT-4 Vision Analysis
This is where the magic happens. Each captured frame gets sent to GPT-4o with a carefully structured prompt.
```python
import json

from openai import OpenAI

client = OpenAI()

BEHAVIOR_PROMPT = """
You are an expert animal behaviorist analyzing a pet in a home environment.
For this image, provide a JSON response with:
- behavior: primary observed behavior (e.g., 'resting', 'playing', 'anxious-pacing', 'eating', 'alert')
- energy_level: 1-10 scale
- stress_indicators: list of any stress signals observed (panting, tucked tail, flattened ears, etc.)
- confidence: your confidence in this assessment (0.0-1.0)
- notes: any notable observations in one sentence
Be specific and clinical. Respond ONLY with valid JSON.
"""

def analyze_frame(b64_image: str) -> dict:
    response = client.chat.completions.create(
        model="gpt-4o",
        messages=[{
            "role": "user",
            "content": [
                {"type": "text", "text": BEHAVIOR_PROMPT},
                {
                    "type": "image_url",
                    "image_url": {
                        "url": f"data:image/jpeg;base64,{b64_image}",
                        "detail": "low",  # saves tokens
                    },
                },
            ],
        }],
        max_tokens=300,
        response_format={"type": "json_object"},
    )
    return json.loads(response.choices[0].message.content)
```
**Pro tip:** Use `"detail": "low"` for the image — it cuts token cost by ~75% with minimal accuracy loss for behavior classification. You don't need pixel-perfect analysis, you need pattern recognition.
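Even with `response_format={"type": "json_object"}`, the model can still return missing keys or out-of-range numbers, and one bad row will skew your weekly averages. A small normalization step helps; `clean_analysis` is a hypothetical helper I'd add between `analyze_frame` and the database, not something from the original pipeline:

```python
EXPECTED_KEYS = {'behavior', 'energy_level', 'stress_indicators', 'confidence', 'notes'}

def clean_analysis(raw: dict) -> dict:
    """Coerce a model response into the shape the rest of the pipeline expects.

    Missing keys become None (or [] for lists), and numeric fields are
    clamped to their documented ranges from BEHAVIOR_PROMPT.
    """
    out = {key: raw.get(key) for key in EXPECTED_KEYS}
    # energy_level: integer on a 1-10 scale
    if isinstance(out['energy_level'], (int, float)):
        out['energy_level'] = int(min(10, max(1, out['energy_level'])))
    else:
        out['energy_level'] = None
    # confidence: float in 0.0-1.0
    if isinstance(out['confidence'], (int, float)):
        out['confidence'] = min(1.0, max(0.0, float(out['confidence'])))
    else:
        out['confidence'] = None
    # stress_indicators: always a list
    if not isinstance(out['stress_indicators'], list):
        out['stress_indicators'] = []
    return out
```

Then log `clean_analysis(analyze_frame(b64))` instead of the raw dict.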
## Step 3: Storing the Behavior Log
```python
import json
import sqlite3
from contextlib import contextmanager

@contextmanager
def get_db():
    conn = sqlite3.connect('pet_behavior.db')
    conn.row_factory = sqlite3.Row
    try:
        yield conn
    finally:
        conn.close()

def init_db():
    with get_db() as conn:
        conn.execute("""
            CREATE TABLE IF NOT EXISTS observations (
                id INTEGER PRIMARY KEY AUTOINCREMENT,
                timestamp TEXT NOT NULL,
                behavior TEXT,
                energy_level INTEGER,
                stress_indicators TEXT,
                confidence REAL,
                notes TEXT
            )
        """)
        conn.commit()

def log_observation(timestamp: str, analysis: dict):
    with get_db() as conn:
        conn.execute("""
            INSERT INTO observations
            (timestamp, behavior, energy_level, stress_indicators, confidence, notes)
            VALUES (?, ?, ?, ?, ?, ?)
        """, (
            timestamp,
            analysis.get('behavior'),
            analysis.get('energy_level'),
            json.dumps(analysis.get('stress_indicators', [])),  # lists stored as JSON text
            analysis.get('confidence'),
            analysis.get('notes'),
        ))
        conn.commit()
```
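To see the whole storage layer in one place, here's a stand-alone round-trip of the same schema using an in-memory database, so it runs without the camera or API pieces (the sample `analysis` dict is made up):

```python
import json
import sqlite3

# Same schema as init_db(), but in-memory for a self-contained demo.
conn = sqlite3.connect(':memory:')
conn.row_factory = sqlite3.Row
conn.execute("""
    CREATE TABLE observations (
        id INTEGER PRIMARY KEY AUTOINCREMENT,
        timestamp TEXT NOT NULL,
        behavior TEXT,
        energy_level INTEGER,
        stress_indicators TEXT,
        confidence REAL,
        notes TEXT
    )
""")

analysis = {
    'behavior': 'anxious-pacing',
    'energy_level': 7,
    'stress_indicators': ['panting', 'tucked tail'],
    'confidence': 0.82,
    'notes': 'Pacing along the front window.',
}
conn.execute(
    "INSERT INTO observations "
    "(timestamp, behavior, energy_level, stress_indicators, confidence, notes) "
    "VALUES (?, ?, ?, ?, ?, ?)",
    ('2025-01-07T17:12:00', analysis['behavior'], analysis['energy_level'],
     json.dumps(analysis['stress_indicators']), analysis['confidence'],
     analysis['notes']),
)
row = conn.execute("SELECT * FROM observations").fetchone()
stress = json.loads(row['stress_indicators'])  # the list survives the round-trip
```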
## Step 4: The Weekly Summary (The Part My Vet Actually Loved)
Raw data isn't useful to most pet owners. So I added a weekly summary generator:
```python
from collections import Counter

def generate_weekly_summary(pet_name="Max", species="dog"):
    with get_db() as conn:
        # Timestamps were logged with datetime.now() (local time), but SQLite's
        # datetime('now') is UTC — the 'localtime' modifier keeps them comparable.
        rows = conn.execute("""
            SELECT behavior, energy_level, stress_indicators, notes, timestamp
            FROM observations
            WHERE timestamp >= datetime('now', '-7 days', 'localtime')
            ORDER BY timestamp DESC
            LIMIT 200
        """).fetchall()

    if not rows:
        return "No observations recorded this week."

    # Aggregate stats (skip rows where the model returned no energy level)
    behaviors = [r['behavior'] for r in rows]
    energies = [r['energy_level'] for r in rows if r['energy_level'] is not None]
    avg_energy = sum(energies) / len(energies) if energies else 0
    top_behaviors = Counter(behaviors).most_common(5)

    summary_data = {
        "total_observations": len(rows),
        "average_energy": round(avg_energy, 1),
        "top_behaviors": top_behaviors,
        "sample_notes": [r['notes'] for r in rows[:10]],
    }

    prompt = f"""
You are a friendly pet behaviorist creating a weekly wellness summary.
Pet: {pet_name} ({species})
Data: {summary_data}
Write a warm, informative 3-paragraph summary:
1. Overall mood and energy this week
2. Notable patterns or changes
3. One actionable suggestion for the owner
Tone: professional but caring. Like a vet you actually trust.
"""
    response = client.chat.completions.create(
        model="gpt-4o-mini",  # cheaper for text-only
        messages=[{"role": "user", "content": prompt}],
        max_tokens=400,
    )
    return response.choices[0].message.content
```
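The goals list at the top also mentioned flagging anomalies, like sudden changes in activity. The summary prose surfaces some of that, but a deterministic check is cheaper and doesn't depend on the model noticing. Here's a minimal sketch of the idea; `flag_energy_anomaly` is my own hypothetical helper, not part of the original project:

```python
def flag_energy_anomaly(this_week, prior_weeks, drop_threshold=0.3):
    """Flag a sharp drop in average energy versus earlier weeks.

    this_week: energy_level values (1-10) from the last 7 days.
    prior_weeks: a list of lists, one per earlier week.
    Returns a human-readable warning, or None if nothing stands out.
    """
    if not this_week or not prior_weeks:
        return None  # not enough history to compare against
    baseline = sum(sum(w) for w in prior_weeks) / sum(len(w) for w in prior_weeks)
    current = sum(this_week) / len(this_week)
    if baseline > 0 and (baseline - current) / baseline >= drop_threshold:
        return (f"Average energy fell from {baseline:.1f} to {current:.1f} "
                f"this week; worth mentioning to your vet.")
    return None
```

The same shape works for other signals, such as the fraction of observations with non-empty `stress_indicators`.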
## What I Actually Discovered
After two weeks of running this on my dog Max:
- He's most anxious between 4-6 PM — right when the neighborhood kids come home from school (we never connected this before)
- His energy crashes hard on rainy days — obvious in retrospect, but seeing it in a chart made it real
- He self-soothes by watching out the front window — which the system flagged as "alert" behavior, but combined with other signals was clearly calming, not reactive
My vet was genuinely interested in the weekly summaries. She said behavior data collected over time is something they rarely get from owners.
## The Repo
Full code (including the FastAPI dashboard and a Docker Compose setup):
👉 github.com/example/pet-behavior-ai (link in bio — drop a comment if you want me to clean it up and publish)
## Where This Is Going
The whole project started because I was frustrated by how little support exists for pet owners trying to understand their animals' mental and emotional health.
Turns out I'm not alone — services like MyPetTherapist are doing exactly this at a product level: connecting AI-driven insights with actual behavioral support for pets. Worth checking out if you want this without the DIY setup.
But building it yourself? Also extremely worth it. Especially when your dog stares at you with those confused eyes and you can now say "Actually, I have data on that."
What behavior patterns have you noticed in your pets that surprised you? Drop them in the comments — curious what others are seeing.