Building SignSpeak: Real-time ASL Recognition with React and AI 🤟

Ever wanted to build something that could break down communication barriers? That's exactly what led me to create SignSpeak - a real-time ASL gesture recognition web app.

🚀 What I Built

SignSpeak translates American Sign Language gestures into text and speech, all running in your browser. Point your hand at the camera, make an ASL gesture, and watch it get recognized instantly with 85%+ accuracy.

🔗 Live Demo: signgesture.netlify.app
📚 GitHub: github.com/shivas1432/Sign_Gesture_Speak

🛠️ The Tech Stack

Here's what powers SignSpeak:

  • React 18 + TypeScript - For the UI and type safety
  • MediaPipe Hands - Google's hand landmark detection
  • TensorFlow.js - Client-side gesture classification
  • Tailwind CSS - Modern, responsive styling
  • Web Speech API - Text-to-speech functionality
  • Vite - Lightning-fast development

At a high level, a camera hook and a gesture-recognition hook feed a few presentational components:

// Core architecture
const App = () => {
  const { videoRef, isLoading } = useCamera();
  // the recognition hook also exposes the raw landmarks for the overlay
  const { recognizedGesture, confidence, handLandmarks } = useGestureRecognition(videoRef);

  return (
    <div className="app">
      <Camera ref={videoRef} />
      <GestureOverlay landmarks={handLandmarks} />
      <ResultsPanel gesture={recognizedGesture} confidence={confidence} />
    </div>
  );
};

🎯 How It Works

1. Hand Detection

MediaPipe analyzes the camera feed and extracts 21 hand landmarks per hand in real time:

const handsConfig = {
  maxNumHands: 2,               // track up to two hands at once
  modelComplexity: 1,           // trade-off between accuracy and speed
  minDetectionConfidence: 0.7,  // ignore weak detections
  minTrackingConfidence: 0.5    // keep tracking a hand between frames
};
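
For context, this is roughly how that config gets applied when the detector is created. It's a minimal sketch assuming the official @mediapipe/hands package served from a CDN; the repo's actual wiring may differ, and drawOverlay is just a placeholder for whatever consumes the landmarks:

import { Hands } from '@mediapipe/hands';

// Tell MediaPipe where to fetch its model and wasm files
const hands = new Hands({
  locateFile: (file) => `https://cdn.jsdelivr.net/npm/@mediapipe/hands/${file}`,
});

hands.setOptions(handsConfig);

// Landmarks arrive through this callback for every frame passed to hands.send()
hands.onResults((results) => {
  drawOverlay(results.multiHandLandmarks); // placeholder consumer
});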

2. Gesture Recognition

Custom algorithms analyze hand landmark positions to identify specific gestures:

const recognizeGesture = (landmarks) => {
  // Example: detect thumbs up.
  // MediaPipe coordinates are normalized with y increasing downward,
  // so a smaller y means the point is higher in the frame.
  const thumbTip = landmarks[4];
  const thumbIP = landmarks[3];
  const indexMCP = landmarks[5];

  if (thumbTip.y < thumbIP.y && thumbTip.y < indexMCP.y) {
    return { gesture: 'thumbs_up', confidence: 0.92 };
  }

  // More gesture logic...

  return null; // nothing recognized in this frame
};
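
The same landmark-index approach extends to the ASL letters. As a rough illustration (these are not the exact rules in the repo), a letter like "L" can be approximated by checking which fingers are extended:

// Hypothetical helper: a finger counts as extended when its tip is
// farther from the wrist (landmark 0) than its PIP joint
const isExtended = (landmarks, tipIdx, pipIdx) => {
  const wrist = landmarks[0];
  const dist = (p) => Math.hypot(p.x - wrist.x, p.y - wrist.y);
  return dist(landmarks[tipIdx]) > dist(landmarks[pipIdx]);
};

const looksLikeLetterL = (landmarks) =>
  isExtended(landmarks, 8, 6) &&    // index finger extended
  isExtended(landmarks, 4, 3) &&    // thumb extended
  !isExtended(landmarks, 12, 10) && // middle finger curled
  !isExtended(landmarks, 16, 14) && // ring finger curled
  !isExtended(landmarks, 20, 18);   // pinky curled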

3. Real-time Processing

Everything runs at 30 FPS with minimal latency:

useEffect(() => {
  // MediaPipe delivers landmarks through onResults, not the send() promise
  hands.onResults((results) => {
    const [landmarks] = results.multiHandLandmarks ?? [];
    if (landmarks) {
      setRecognizedGesture(recognizeGesture(landmarks));
    }
  });

  let frameId;
  const processFrame = async () => {
    if (videoRef.current) {
      await hands.send({ image: videoRef.current });
    }
    frameId = requestAnimationFrame(processFrame);
  };
  processFrame();

  return () => cancelAnimationFrame(frameId);
}, []);
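
Since requestAnimationFrame fires at the display's refresh rate (often 60 Hz), one way to cap processing near 30 FPS is a simple timestamp check. This is a sketch of that pattern, not necessarily how the repo does it:

const TARGET_FPS = 30;
const frameInterval = 1000 / TARGET_FPS;
let lastFrameTime = 0;

const processFrame = async (now) => {
  // Only send a frame to the model if enough time has passed
  if (now - lastFrameTime >= frameInterval && videoRef.current) {
    lastFrameTime = now;
    await hands.send({ image: videoRef.current });
  }
  requestAnimationFrame(processFrame);
};
requestAnimationFrame(processFrame);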

🎨 Key Features I'm Proud Of

Real-time Hand Tracking

  • 21-point hand landmark detection
  • Visual overlay showing hand skeleton
  • Multi-hand support

Gesture Library

Currently recognizes:

  • ASL Letters: A, L, V
  • Common gestures: 👍 👋 ✌️
  • Confidence scoring for accuracy

Accessibility First

  • Text-to-speech with customizable voices (see the sketch after this list)
  • Keyboard navigation support
  • High contrast mode
  • Mobile-responsive design
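
The text-to-speech piece leans on the browser's built-in Web Speech API. Here's a minimal sketch of speaking a recognized gesture; the voice name and settings source are just examples, not the repo's exact code:

const speak = (text, preferredVoiceName) => {
  const utterance = new SpeechSynthesisUtterance(text);

  // Pick a specific voice if the user chose one; otherwise use the default
  const voices = window.speechSynthesis.getVoices();
  const match = voices.find((v) => v.name === preferredVoiceName);
  if (match) utterance.voice = match;

  utterance.rate = 1.0;
  window.speechSynthesis.speak(utterance);
};

// Example usage with a voice name that exists in Chrome
speak('Thumbs up', 'Google US English');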

Performance Optimized

  • Client-side processing (no server calls)
  • WebAssembly acceleration
  • Efficient canvas rendering
  • Bundle size < 2MB

🔧 Development Challenges

1. Hand Landmark Accuracy

MediaPipe sometimes struggles with complex hand positions. Solution: Added confidence thresholds and gesture smoothing.
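
Gesture smoothing can be as simple as a rolling majority vote over the last few frames, so one noisy frame doesn't flip the output. A sketch of that idea (the repo's exact smoothing may differ):

const HISTORY_SIZE = 10;
const recentGestures = [];

const smoothGesture = (gestureName) => {
  recentGestures.push(gestureName ?? 'none');
  if (recentGestures.length > HISTORY_SIZE) recentGestures.shift();

  // Count how often each gesture appeared in the recent window
  const counts = {};
  for (const g of recentGestures) counts[g] = (counts[g] ?? 0) + 1;
  const [winner, count] = Object.entries(counts).sort((a, b) => b[1] - a[1])[0];

  // Only report a gesture when it clearly dominates the window
  return winner !== 'none' && count >= HISTORY_SIZE * 0.6 ? winner : null;
};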

2. Real-time Performance

Processing 30 FPS while running ML models is intensive. Used Web Workers and optimized the recognition pipeline.
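
The Web Worker part boils down to moving classification off the main thread so the video and UI stay responsive. A rough sketch of that split, assuming Vite's worker syntax (gestureWorker.ts is a hypothetical file name):

// main thread
const worker = new Worker(new URL('./gestureWorker.ts', import.meta.url), {
  type: 'module',
});

worker.onmessage = (event) => setRecognizedGesture(event.data);

// send each frame's landmarks instead of classifying on the UI thread
worker.postMessage({ landmarks });

// gestureWorker.ts
self.onmessage = (event) => {
  const result = recognizeGesture(event.data.landmarks);
  self.postMessage(result);
};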

3. Cross-browser Compatibility

Different browsers handle camera permissions differently. Implemented robust error handling and fallbacks.
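
In practice that mostly means wrapping getUserMedia in a try/catch and mapping browser-specific errors to a friendly message. Something along these lines, where showError is a placeholder for the app's own error UI:

const startCamera = async (videoElement) => {
  try {
    const stream = await navigator.mediaDevices.getUserMedia({
      video: { facingMode: 'user', width: 640, height: 480 },
      audio: false,
    });
    videoElement.srcObject = stream;
    await videoElement.play();
  } catch (err) {
    if (err.name === 'NotAllowedError') {
      showError('Camera permission denied. Please allow access and reload.');
    } else if (err.name === 'NotFoundError') {
      showError('No camera found on this device.');
    } else {
      showError(`Camera error: ${err.message}`);
    }
  }
};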

📱 Progressive Web App

SignSpeak works offline and can be installed on any device:

{
  "name": "SignSpeak",
  "start_url": "/",
  "display": "standalone",
  "background_color": "#1f2937",
  "theme_color": "#3b82f6"
}
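
The manifest handles installability; the offline part comes from a service worker. The registration side is only a few lines (the /sw.js path is an assumption here, and a plugin such as vite-plugin-pwa could generate the worker instead):

// Register the service worker once the page has loaded
if ('serviceWorker' in navigator) {
  window.addEventListener('load', async () => {
    try {
      await navigator.serviceWorker.register('/sw.js');
    } catch (err) {
      console.warn('Service worker registration failed:', err);
    }
  });
}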

🎯 What's Next?

  • More ASL vocabulary - expanding beyond individual letters
  • Sentence formation - combining gestures into phrases
  • Custom gesture training - let users add their own gestures
  • Mobile app - native iOS/Android versions

💡 Key Learnings

  1. Start simple - I began with basic gestures before adding complexity
  2. User feedback is gold - tested with actual ASL users early on
  3. Performance matters - real-time apps need careful optimization
  4. Accessibility can't be an afterthought - built it in from day one

🤝 Want to Contribute?

The project is open source and I'd love your help! Here's how:

🌟 Star the repo: Show some love on GitHub

🍴 Fork & contribute:

git clone https://github.com/shivas1432/Sign_Gesture_Speak.git
cd Sign_Gesture_Speak
npm install
npm run dev

Ideas for contributors:

  • Add new ASL gestures
  • Improve recognition accuracy
  • Mobile UI enhancements
  • Documentation improvements

🎉 Try It Out!

Visit signgesture.netlify.app and see ASL recognition in action. Works best in good lighting with your hand clearly visible.

🤔 Discussion

What other accessibility challenges could we solve with web technology? I'm always looking for new project ideas that can make a difference.



Building inclusive technology, one gesture at a time 🤟

Top comments (4)

Todd Sharp

Error in dev tools in your demo:

index-CoF8kfhv.js:157 Failed to initialize hand detection: TypeError: _v.Hands is not a constructor
    at index-CoF8kfhv.js:157:24758
    at index-CoF8kfhv.js:157:25187
    at Cl (index-CoF8kfhv.js:40:24263)
    at Zn (index-CoF8kfhv.js:40:42318)
    at index-CoF8kfhv.js:40:40661
    at V (index-CoF8kfhv.js:25:1582)
    at MessagePort.G (index-CoF8kfhv.js:25:1952)
shiva shanker

That's awesome, thanks for the feedback! I think it's a library loading error with MediaPipe. I'll fix the initialization timing and push an update ASAP.

Todd Sharp

Not sure if you published that update, but I'm still receiving the same error.

shiva shanker

You're correct. I haven't made any changes to the code yet due to lack of time, and I'm really sorry about that. I'll fix it as soon as I can. If you're interested, I'd be very happy to welcome a contribution to the repository: github.com/shivas1432/Sign_Gesture... Thank you so much for taking the time to try my app; it means a lot to me. Once again, I'm sorry, and I'll let you know when I push the changes.