DEV Community

Ryan Banze

🏌️‍♂️ How I Built a Golf Swing Analyzer in Python Using AI Pose Detection (That Actually Works)

⛳ Why This Project Matters

Golf has always been a game of inches: a micro-adjustment in wrist angle, a fraction of a second in timing, or a subtle shift in posture can be the difference between a 300-yard drive and a slice into the trees.

Traditionally, only elite players with access to swing coaches, motion capture systems, or $10,000 launch monitors could dissect their biomechanics. Everyone else? We just squint at slow-mo YouTube replays of Tiger and hope for the best.

That gap is what I set out to close.

What if anyone, anywhere, with nothing more than a smartphone video and a Colab notebook, could access near-pro-level swing diagnostics?

That was the genesis of GolfPosePro, an AI-powered golf swing analyzer that:

  • Tracks your swing phases frame-by-frame with pose estimation.
  • Visualizes biomechanics (like wrist trajectory) in debug plots.
  • Compares your motion to PGA Tour pros, side-by-side.
  • Generates enhanced playback with slow motion, labeled overlays, and pro benchmarks.

All built with Python, MediaPipe, OpenCV, matplotlib, and Google Colab Pro.

This isn’t just about golf — it’s a case study in democratizing biomechanics through AI.


⚙️ What It Does

  • 🧠 Extracts wrist motion from your swing video.
  • 🪄 Segments swing phases dynamically: Address → Backswing → Top → Downswing → Impact → Follow-through
  • 🔍 Overlays debug plots of wrist trajectory, velocity, and key checkpoints.
  • 🎯 Runs side-by-side comparisons against PGA swings (downloaded with yt-dlp).
  • 🐢 Encodes slow-motion video segments, highlighting your motion frame-by-frame.


👉 Imagine watching your swing next to Rory McIlroy’s — with a biomechanical plot showing exactly where your wrist path diverges.

🧱 How It Works

This project is really three systems working together:

  1. Pose Estimation Engine (MediaPipe) → Converts pixels into biomechanical landmarks.
  2. Signal Processing Layer (NumPy + matplotlib) → Smooths, filters, and segments motion.
  3. Visualization Pipeline (OpenCV + FFmpeg) → Merges raw video with analytical overlays.

Let’s break that down.


🧍‍♂️ 1. Pose Estimation with MediaPipe
At the heart of the system is MediaPipe Pose — Google’s real-time human landmark detector.

It tracks 33 body landmarks at ~30 FPS, including wrists, shoulders, and hips.

```python
results = pose.process(rgb_frame)
wrist_y = results.pose_landmarks.landmark[LEFT_WRIST].y
```

From a swing video, we extract wrist positions across time. Why wrists? Because they're critical in determining swing path, lag, and release timing.

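Putting that together, the per-frame extraction loop might look like the sketch below. Note that `extract_wrist_y` is an illustrative helper, not the notebook's exact code, and it assumes `mediapipe` and `opencv-python` are installed:

```python
def extract_wrist_y(video_path):
    """Return one normalized wrist y-value per frame of a swing video.

    Illustrative helper (not the notebook's exact code); imports are
    kept local so the sketch reads standalone.
    """
    import cv2
    import mediapipe as mp

    mp_pose = mp.solutions.pose
    wrist_ys = []
    with mp_pose.Pose(static_image_mode=False) as pose:
        cap = cv2.VideoCapture(video_path)
        while True:
            ok, frame = cap.read()
            if not ok:
                break
            # MediaPipe expects RGB; OpenCV decodes frames as BGR.
            rgb = cv2.cvtColor(frame, cv2.COLOR_BGR2RGB)
            results = pose.process(rgb)
            if results.pose_landmarks:
                lm = results.pose_landmarks.landmark
                wrist_ys.append(lm[mp_pose.PoseLandmark.LEFT_WRIST].y)
        cap.release()
    return wrist_ys
```

Landmark coordinates come back normalized to [0, 1], which is convenient: the trace is resolution-independent, so the same thresholds work on a 720p iPhone clip and a 4K pro reference video.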

🧼 2. Trajectory Smoothing
Raw pose data is noisy (frames jitter, lighting shifts).
To stabilize it, I apply a uniform moving average filter and compute velocity with NumPy gradients.

```python
from scipy.ndimage import uniform_filter1d

velocity = np.gradient(uniform_filter1d(wrist_y, size=5))
```

This transforms jittery landmarks into smooth curves that actually mean something.

  • Velocity spikes = transition points
  • Flat zones = posture holds

📐 3. Swing Phase Segmentation
Here’s the biomechanical magic:

  • Address → Backswing start = wrist first deviates upward.
  • Top of swing = lowest wrist y-value in image coordinates (the wrist's highest physical point relative to the torso).
  • Impact = peak wrist acceleration crossing baseline.
  • Follow-through = velocity decay + posture stabilization.

Each phase is dynamically detected, then color-coded on the debug plot.

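Those heuristics can be sketched as a small function. Everything here is illustrative: `detect_phases`, the thresholds, and the synthetic trace are mine, not the notebook's actual values.

```python
import numpy as np

def detect_phases(wrist_y, baseline_frames=10, start_eps=0.01):
    """Rough phase checkpoints from a smoothed wrist-y trace.

    wrist_y is in image coordinates (y grows downward), so the top of
    the swing is the trace's minimum. Thresholds are illustrative.
    """
    wrist_y = np.asarray(wrist_y, dtype=float)
    baseline = wrist_y[:baseline_frames].mean()   # address posture

    # Backswing start: first frame the wrist clearly leaves the baseline.
    moved = np.abs(wrist_y - baseline) > start_eps
    backswing = int(np.argmax(moved))

    # Top of swing: wrist at its highest point (minimum image-y).
    top = int(np.argmin(wrist_y))

    # Impact: peak acceleration after the top, as the wrist snaps back
    # down through the baseline.
    accel = np.gradient(np.gradient(wrist_y))
    impact = top + int(np.argmax(accel[top:]))

    return {"backswing": backswing, "top": top, "impact": impact}

# Synthetic swing: hold, slow rise (image-y falls), fast drop, hold.
trace = np.concatenate([
    np.full(10, 0.6),            # address
    np.linspace(0.6, 0.3, 30),   # backswing (wrist rises)
    np.linspace(0.3, 0.65, 12),  # downswing (fast)
    np.full(15, 0.65),           # follow-through hold
])
phases = detect_phases(trace)
print(phases["backswing"] < phases["top"] < phases["impact"])  # True
```

The checkpoints come out in swing order on this toy trace; on real footage the same logic runs over the smoothed trace from the previous step.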
🎥 4. Side-by-Side Video Overlays

A coach doesn’t just tell you where you’re off — they show you.
So with OpenCV and FFmpeg, I stack:

  • Your swing
  • A pro’s swing (downloaded via yt-dlp)
  • Trajectory plots with labeled swing checkpoints

```python
combined_frame = np.hstack((frame, debug_plot_img))
```


The final output: a video file with slow-motion playback at impact, plus real-time analytical overlays.
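The stacking step itself is plain NumPy. In this sketch, synthetic arrays stand in for a decoded video frame and a rendered matplotlib plot; in the real pipeline you would first match heights (e.g., with `cv2.resize`):

```python
import numpy as np

# Stand-ins: a 720p video frame and a rendered debug plot of equal height.
frame = np.zeros((720, 1280, 3), dtype=np.uint8)
debug_plot_img = np.full((720, 640, 3), 255, dtype=np.uint8)

# np.hstack requires matching heights; the widths simply add up.
combined_frame = np.hstack((frame, debug_plot_img))
print(combined_frame.shape)  # (720, 1920, 3)
```
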

🧪 Tools Used

  • Python (Google Colab Pro) → runtime
  • MediaPipe Pose → body landmark detection
  • OpenCV → frame I/O and video overlays
  • NumPy + SciPy (uniform_filter1d) → smoothing, velocity, and gradient math
  • matplotlib → debug trajectory plots
  • FFmpeg → encoding and slow-motion playback
  • yt-dlp → downloading pro reference swings

🏌️ Built For

  • Amateurs → Upload iPhone swing clips, get coach-like insights.
  • Coaches → Use it as a feedback tool without expensive sensors.
  • Developers → A sandbox for exploring pose detection + video analytics.

This notebook isn’t replacing coaches or TrackMan — but it’s democratizing access to biomechanics.

🙏 Credits

  • Pro swing footage: YouTube Shorts (Max Homa, Ludvig Åberg).
  • Frameworks: MediaPipe, OpenCV, matplotlib, FFmpeg.
  • Countless test swings (and slices) on the driving range.

🚀 What’s Next

  • 🗣️ AI coach commentary overlay.
  • 🏌️ Support for left-handed players (pose normalization).
  • 🎥 Ball tracer integration.
  • 📊 Automatic swing grading with ML classifiers.
  • 📱 Mobile-friendly UI.

🏁 Final Thoughts
Golf is often said to be a battle between the player and themselves.
By applying AI pose detection, we finally have a way to quantify the invisible — turning milliseconds of motion into data you can act on.
This project isn’t just about golf.
It’s a glimpse of how AI can democratize performance analysis across all sports.
And for me? It’s about making practice smarter, not just longer.
⛳ Let’s bring AI to the range — one frame at a time.
If you enjoyed this project, consider buying me a coffee to support more free AI tutorials and tools:

👉 Buy Me a Coffee ☕

