DEV Community

Ryan Banze


๐ŸŒ๏ธโ€โ™‚๏ธ How I Built a Golf Swing Analyzer in Python Using AI Pose Detection (That Actually Works)

⛳ Why This Project Matters

Golf has always been a game of inches: a micro-adjustment in wrist angle, a fraction of a second in timing, or a subtle shift in posture can be the difference between a 300-yard drive and a slice into the trees.

Traditionally, only elite players with access to swing coaches, motion capture systems, or $10,000 launch monitors could dissect their biomechanics. Everyone else? We just squint at slow-mo YouTube replays of Tiger and hope for the best.

That gap is what I set out to solve.

What if anyone, anywhere, with nothing more than a smartphone video and a Colab notebook, could access near-pro-level swing diagnostics?


That was the genesis of GolfPosePro, an AI-powered golf swing analyzer that:

  • Tracks your swing phases frame-by-frame with pose estimation.
  • Visualizes biomechanics (like wrist trajectory) in debug plots.
  • Compares your motion to PGA Tour pros, side by side.
  • Generates enhanced playback with slow motion, labeled overlays, and pro benchmarks.

All built with Python, MediaPipe, OpenCV, matplotlib, and Google Colab Pro.

This isn't just about golf; it's a case study in democratizing biomechanics through AI.


โš™๏ธ What It Does

  • 🧠 Extracts wrist motion from your swing video.
  • 🪄 Segments swing phases dynamically: Address → Backswing → Top → Downswing → Impact → Follow-through
  • 🔍 Overlays debug plots of wrist trajectory, velocity, and key checkpoints.
  • 🎯 Runs side-by-side comparisons against PGA swings (downloaded with yt-dlp).
  • 🐢 Encodes slow-motion video segments, highlighting your motion frame-by-frame.


👉 Imagine watching your swing next to Rory McIlroy's, with a biomechanical plot showing exactly where your wrist path diverges.

🧱 How It Works

This project is really three systems working together:

  1. Pose Estimation Engine (MediaPipe) → Converts pixels into biomechanical landmarks.
  2. Signal Processing Layer (NumPy + matplotlib) → Smooths, filters, and segments motion.
  3. Visualization Pipeline (OpenCV + FFmpeg) → Merges raw video with analytical overlays.

Let's break that down.


๐Ÿงโ€โ™‚๏ธ 1. Pose Estimation with MediaPipe
At the heart of the system is MediaPipe Pose โ€” Googleโ€™s real-time human landmark detector.

It tracks 33 body landmarks at ~30 FPS, including wrists, shoulders, and hips.

```python
# LEFT_WRIST is mp.solutions.pose.PoseLandmark.LEFT_WRIST
results = pose.process(rgb_frame)  # frame must be RGB (OpenCV decodes to BGR)
wrist_y = results.pose_landmarks.landmark[LEFT_WRIST].y  # normalized to [0, 1]
```

From a swing video, we extract wrist positions across time. Why wrists? Because they're critical in determining swing path, lag, and release timing.


🧼 2. Trajectory Smoothing

Raw pose data is noisy (frames jitter, lighting shifts). To stabilize it, I apply a uniform moving-average filter and compute velocity with NumPy gradients.



```python
import numpy as np
from scipy.ndimage import uniform_filter1d

velocity = np.gradient(uniform_filter1d(wrist_y, size=5))
```


This transforms jittery landmarks into smooth curves that actually mean something.

  • Velocity spikes = transition points
  • Flat zones = posture holds
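Those two bullets can be turned into a small spike detector. Here is a simplified, illustrative version; the `transition_frames` helper and the two-standard-deviation threshold are my own choices, not necessarily the notebook's exact logic:

```python
import numpy as np
from scipy.ndimage import uniform_filter1d

def transition_frames(wrist_y, size=5, k=2.0):
    """Return frame indices where wrist speed spikes k std-devs above the mean."""
    y = uniform_filter1d(np.asarray(wrist_y, dtype=float), size=size)
    speed = np.abs(np.gradient(y))  # per-frame speed of the smoothed trace
    return np.flatnonzero(speed > speed.mean() + k * speed.std())
```

On a trace that is flat, ramps, then goes flat again, only the ramp frames are flagged, which is exactly the "velocity spikes = transition points" behavior described above.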

๐Ÿ“ 3. Swing Phase Segmentation
Hereโ€™s the biomechanical magic:

  • Address โ†’ Backswing start = wrist first deviates upward.
  • Top of swing = lowest wrist point (relative to torso).
  • Impact = peak wrist acceleration crossing baseline.
  • Follow-through = velocity decay + posture stabilization. Each phase is dynamically detected, then color-coded on the debug plot.
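A minimal sketch of how those rules might look in code. Note that MediaPipe's normalized y grows downward, so the top of the swing is the minimum y value. The `segment_phases` helper and its 25% motion threshold are illustrative assumptions, and impact is approximated here by peak downward velocity rather than acceleration:

```python
import numpy as np

def segment_phases(wrist_y):
    """Rough phase boundaries from a smoothed wrist-height trace (frame indices)."""
    y = np.asarray(wrist_y, dtype=float)
    v = np.gradient(y)
    top = int(np.argmin(y))                        # top of swing: minimum y
    moving = np.abs(v) > 0.25 * np.abs(v).max()
    backswing_start = int(np.argmax(moving))       # first significant motion
    impact = top + int(np.argmax(v[top:]))         # fastest downward move after top
    return {"backswing_start": backswing_start, "top": top, "impact": impact}
```

Feeding it a synthetic trace (address hold, rising wrist, fast downswing) yields boundaries in the expected order: backswing start, then top, then impact.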

🎥 4. Side-by-Side Video Overlays

A coach doesn't just tell you where you're off; they show you.
So with OpenCV and FFmpeg, I stack:

  • Your swing
  • A pro's swing (downloaded via yt-dlp)
  • Trajectory plots with labeled swing checkpoints

```python
combined_frame = np.hstack((frame, debug_plot_img))  # both must share height and dtype
```


The final output: a video file with slow-motion playback at impact, plus real-time analytical overlays.
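As a rough sketch of the two ideas, stacking and slow motion, in NumPy alone (the real pipeline uses OpenCV and FFmpeg; `side_by_side` and `slow_motion` are hypothetical helper names):

```python
import numpy as np

def side_by_side(frame_a, frame_b):
    """Stack two frames horizontally, padding the shorter one with black rows."""
    h = max(frame_a.shape[0], frame_b.shape[0])
    def pad(f):
        out = np.zeros((h, f.shape[1], f.shape[2]), dtype=f.dtype)
        out[: f.shape[0]] = f
        return out
    return np.hstack((pad(frame_a), pad(frame_b)))

def slow_motion(frames, factor=4):
    """Duplicate each frame; at a fixed output fps this plays at 1/factor speed."""
    return [f for f in frames for _ in range(factor)]
```

Duplicating frames is the simplest way to get slow motion without re-timing the container; FFmpeg can achieve the same effect more cleanly when encoding the final clip.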



🧪 Tools Used

  • Python (Google Colab Pro)
  • MediaPipe Pose (landmark detection)
  • OpenCV + FFmpeg (video processing and encoding)
  • NumPy + matplotlib (signal smoothing and debug plots)
  • yt-dlp (downloading pro reference swings)


๐ŸŒ๏ธ Built For

  • Amateurs → Upload iPhone swing clips, get coach-like insights.
  • Coaches → Use it as a feedback tool without expensive sensors.
  • Developers → A sandbox for exploring pose detection + video analytics.

This notebook isn't replacing coaches or TrackMan, but it is democratizing access to biomechanics.

๐Ÿ™ Credits

  • Pro swing footage: YouTube Shorts (Max Homa, Ludvig Åberg).
  • Frameworks: MediaPipe, OpenCV, matplotlib, FFmpeg.
  • Countless test swings (and slices) on the driving range.

🚀 What's Next

  • 🗣️ AI coach commentary overlay.
  • 🏌️ Support for left-handed players (pose normalization).
  • 🎥 Ball tracer integration.
  • 📊 Automatic swing grading with ML classifiers.
  • 📱 Mobile-friendly UI.

Full Video Tutorial

๐Ÿ Final Thoughts
Golf is often said to be a battle between the player and themselves.โ€จBy applying AI pose detection, we finally have a way to quantify the invisible , turning milliseconds of motion into data you can act on.
This project isnโ€™t just about golf.โ€จItโ€™s a glimpse of how AI can democratize performance analysis across all sports.
And for me? Itโ€™s about making practice smarter, not just longer.


⛳ Let's bring AI to the range, one frame at a time.

If you enjoyed this project, consider buying me a coffee to support more free AI tutorials and tools:

📂 Source Code & Notebook: https://github.com/ryanboscobanze/GolfPosePro


👉 Buy Me a Coffee ☕


📱 Follow Me

