Jaji

Mastering React Performance: Web Workers and Generator Functions

Part of the Web Performance Optimization Series

When building data-intensive React applications, you might encounter scenarios where processing large datasets causes your UI to freeze. This is because JavaScript runs on a single thread, meaning heavy computations can block user interactions. Let's explore how to solve this using Generator Functions and Web Workers with a real-world example.
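You can see this single-threaded blocking directly, without React: a long synchronous loop delays every callback queued behind it, including a 0ms timer. `busyWait` below is a throwaway helper for illustration, standing in for any heavy computation.

```javascript
// busyWait simulates a heavy synchronous computation
function busyWait(ms) {
  const end = Date.now() + ms;
  while (Date.now() < end) {} // nothing else can run on this thread
}

const scheduled = Date.now();
setTimeout(() => {
  // This 0ms timer cannot fire until the synchronous work below returns
  console.log(`timer fired ${Date.now() - scheduled}ms late`);
}, 0);
busyWait(50); // in a browser, clicks and scrolls queue up the same way
```

In a browser, user events sit in the same queue as that timer, which is exactly why the UI freezes.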

The Problem: UI Freezing During Heavy Computations

Imagine you're building an events analytics dashboard that needs to process thousands of events with complex calculations. Here's what typically happens:

function EventsDashboard() {
  const [events, setEvents] = useState([]);

  // This function blocks the UI thread
  const processEvents = (rawEvents) => {
    return rawEvents.map(event => {
      // Complex calculations that take time
      const score = calculateEventScore(event);  // ~2ms per event
      const sentiment = analyzeSentiment(event); // ~3ms per event
      const category = classifyEvent(event);     // ~1ms per event

      // With 10,000 events, this takes: 
      // 10,000 * (2 + 3 + 1) = 60,000ms = 60 seconds!
      return {
        ...event,
        score,
        sentiment,
        category
      };
    });
  };

  const processAndDisplay = () => {
    const processedEvents = processEvents(events);
    setEvents(processedEvents);
  };

  return (
    <div>
      <button onClick={processAndDisplay}>Process Events</button>
      <EventsTable events={events} />
    </div>
  );
}

The problem? With 10,000 events, your UI freezes for 60 seconds! During this time:

  • Users can't click buttons
  • Scrolling is jerky
  • Input fields don't respond
  • Animations freeze

Solution 1: Generator Functions for Chunked Processing

Generator functions allow us to process data in chunks, yielding control back to the main thread periodically:

function* eventProcessor(events, chunkSize = 100) {
  // Process events in chunks of 100
  for (let i = 0; i < events.length; i += chunkSize) {
    const chunk = events.slice(i, i + chunkSize);

    const processedChunk = chunk.map(event => ({
      ...event,
      score: calculateEventScore(event),
      sentiment: analyzeSentiment(event),
      category: classifyEvent(event)
    }));

    // Yield each processed chunk
    yield processedChunk;
  }
}

function EventsDashboard() {
  const [events, setEvents] = useState([]);
  const [progress, setProgress] = useState(0);
  const [isProcessing, setIsProcessing] = useState(false);

  const processEventsInChunks = async () => {
    setIsProcessing(true);
    const generator = eventProcessor(events);
    let processedEvents = [];

    try {
      while (true) {
        const { value: chunk, done } = generator.next();
        if (done) break;

        processedEvents = [...processedEvents, ...chunk];

        // Update progress (avoid shadowing the `progress` state variable)
        setProgress((processedEvents.length / events.length) * 100);

        // Let the UI breathe
        await new Promise(resolve => setTimeout(resolve, 0));

        // Update UI with processed events so far
        setEvents(processedEvents);
      }
    } finally {
      setIsProcessing(false);
    }
  };

  return (
    <div>
      <button 
        onClick={processEventsInChunks}
        disabled={isProcessing}
      >
        {isProcessing ? 'Processing...' : 'Process Events'}
      </button>

      {isProcessing && (
        <ProgressBar 
          value={progress} 
          label={`Processing: ${Math.round(progress)}%`} 
        />
      )}

      <EventsTable events={events} />
    </div>
  );
}
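The chunking behavior of the generator can be verified in isolation, outside React. The scoring helpers below are stand-in stubs, since the article's real `calculateEventScore`, `analyzeSentiment`, and `classifyEvent` are not shown:

```javascript
// Stub implementations (assumptions) so the example runs on its own
const calculateEventScore = (event) => event.data.length;
const analyzeSentiment = () => 'neutral';
const classifyEvent = () => 'general';

function* eventProcessor(events, chunkSize = 100) {
  for (let i = 0; i < events.length; i += chunkSize) {
    const chunk = events.slice(i, i + chunkSize);
    yield chunk.map(event => ({
      ...event,
      score: calculateEventScore(event),
      sentiment: analyzeSentiment(event),
      category: classifyEvent(event),
    }));
  }
}

const events = Array.from({ length: 250 }, (_, id) => ({ id, data: 'x' }));
const chunks = [...eventProcessor(events)];
console.log(chunks.map(c => c.length)); // [ 100, 100, 50 ]
```

Note that spreading the generator like this processes everything eagerly; the dashboard code above instead pulls one chunk at a time with `generator.next()` so it can yield to the UI between chunks.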

Solution 2: Web Workers for True Parallel Processing

Web Workers allow us to run computations in a separate thread:

// eventWorker.ts
type Event = {
  id: string;
  name: string;
  timestamp: number;
  data: any;
};

type ProcessedEvent = Event & {
  score: number;
  sentiment: string;
  category: string;
};

type WorkerMessage = {
  type: 'PROCESS_CHUNK';
  payload: Event[];
};

type WorkerResponse = {
  type: 'CHUNK_PROCESSED' | 'PROCESSING_COMPLETE' | 'ERROR';
  payload: ProcessedEvent[] | Error | null;
  progress?: number;
};

self.onmessage = (e: MessageEvent<WorkerMessage>) => {
  const { type, payload: events } = e.data;

  if (type === 'PROCESS_CHUNK') {
    try {
      let processedCount = 0;
      const totalEvents = events.length;
      const chunkSize = 100;

      // Process in smaller chunks to report progress
      for (let i = 0; i < events.length; i += chunkSize) {
        const chunk = events.slice(i, i + chunkSize);

        const processedChunk = chunk.map(event => ({
          ...event,
          score: calculateEventScore(event),
          sentiment: analyzeSentiment(event),
          category: classifyEvent(event)
        }));

        processedCount += chunk.length;

        // Report progress
        self.postMessage({
          type: 'CHUNK_PROCESSED',
          payload: processedChunk,
          progress: (processedCount / totalEvents) * 100
        });
      }

      // Chunks were already streamed above, so no payload is needed here
      self.postMessage({
        type: 'PROCESSING_COMPLETE',
        payload: null
      });
    } catch (error) {
      self.postMessage({
        type: 'ERROR',
        payload: error
      });
    }
  }
};
// EventsDashboard.tsx
function EventsDashboard() {
  const [events, setEvents] = useState([]);
  const [progress, setProgress] = useState(0);
  const [error, setError] = useState(null);
  const workerRef = useRef(null);

  useEffect(() => {
    // Initialize worker (bundlers such as Vite or webpack 5 resolve the file
    // when it is referenced via new URL(..., import.meta.url))
    workerRef.current = new Worker(new URL('./eventWorker.ts', import.meta.url));

    // Handle worker messages
    workerRef.current.onmessage = (e) => {
      const { type, payload, progress } = e.data;

      switch (type) {
        case 'CHUNK_PROCESSED':
          setEvents(current => [...current, ...payload]);
          setProgress(progress);
          break;

        case 'PROCESSING_COMPLETE':
          setProgress(100);
          break;

        case 'ERROR':
          setError(payload);
          break;
      }
    };

    return () => workerRef.current?.terminate();
  }, []);

  const processEvents = () => {
    setEvents([]);
    setProgress(0);
    setError(null);

    workerRef.current.postMessage({
      type: 'PROCESS_CHUNK',
      payload: events
    });
  };

  return (
    <div className="p-4">
      <div className="mb-4">
        <button 
          onClick={processEvents}
          disabled={progress > 0 && progress < 100}
          className="px-4 py-2 bg-blue-500 text-white rounded"
        >
          Process Events
        </button>
      </div>

      {progress > 0 && progress < 100 && (
        <div className="mb-4">
          <ProgressBar 
            value={progress} 
            label={`Processing: ${Math.round(progress)}%`}
          />
        </div>
      )}

      {error && (
        <div className="mb-4 p-4 bg-red-100 text-red-700">
          Error: {error.message}
        </div>
      )}

      <EventsTable 
        events={events}
        isLoading={progress > 0 && progress < 100}
      />
    </div>
  );
}
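The switch statement in the `onmessage` handler above is plain state-transition logic, so it can be factored into a pure reducer and unit-tested without React or a real Worker. `workerReducer` is our own name for this sketch, not part of the dashboard code:

```javascript
// Pure reducer mirroring the onmessage switch above
function workerReducer(state, message) {
  switch (message.type) {
    case 'CHUNK_PROCESSED':
      return {
        ...state,
        events: [...state.events, ...message.payload],
        progress: message.progress,
      };
    case 'PROCESSING_COMPLETE':
      return { ...state, progress: 100 };
    case 'ERROR':
      return { ...state, error: message.payload };
    default:
      return state;
  }
}

// Replaying a message sequence exercises the same transitions the worker drives
let state = { events: [], progress: 0, error: null };
state = workerReducer(state, { type: 'CHUNK_PROCESSED', payload: [{ id: 'a' }], progress: 50 });
state = workerReducer(state, { type: 'PROCESSING_COMPLETE' });
console.log(state.events.length, state.progress); // 1 100
```

In the component this pairs naturally with `useReducer` instead of three separate `useState` calls, which also guarantees the progress and events updates stay consistent.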

The Ultimate Solution: Combining Both Approaches

For optimal performance, especially with very large datasets (100,000+ events), combine both approaches:

  1. Use Web Worker for parallel processing
  2. Use Generator Functions inside the worker for chunked processing
  3. Stream results back to the main thread

Here's the worker side of the implementation:

// advancedEventWorker.ts
function* processInChunks(events: Event[], chunkSize: number) {
  for (let i = 0; i < events.length; i += chunkSize) {
    const chunk = events.slice(i, i + chunkSize);
    yield chunk;
  }
}

self.onmessage = async (e: MessageEvent<{ type: 'PROCESS_EVENTS'; payload: Event[] }>) => {
  const { type, payload: events } = e.data;

  if (type === 'PROCESS_EVENTS') {
    try {
      const CHUNK_SIZE = 100;
      const chunks = processInChunks(events, CHUNK_SIZE);
      let processedCount = 0;
      const totalEvents = events.length;

      for (const chunk of chunks) {
        // Process each chunk
        const processedChunk = await Promise.all(
          chunk.map(async event => ({
            ...event,
            score: await calculateEventScore(event),
            sentiment: await analyzeSentiment(event),
            category: await classifyEvent(event)
          }))
        );

        processedCount += chunk.length;

        // Stream results back to main thread
        self.postMessage({
          type: 'CHUNK_PROCESSED',
          payload: processedChunk,
          progress: (processedCount / totalEvents) * 100
        });

        // Yield to the worker's event loop so queued messages (e.g. a cancel signal) can be handled
        await new Promise(resolve => setTimeout(resolve, 0));
      }

      self.postMessage({
        type: 'PROCESSING_COMPLETE',
        payload: null,
        progress: 100
      });
    } catch (error) {
      self.postMessage({
        type: 'ERROR',
        payload: error
      });
    }
  }
};

Performance Monitoring

To measure the impact of these optimizations:

// Before processing
performance.mark('processStart');

// After processing
performance.mark('processEnd');
performance.measure(
  'eventProcessing',
  'processStart',
  'processEnd'
);

// Log metrics
const metrics = performance.getEntriesByName('eventProcessing')[0];
console.log(`Processing took ${metrics.duration}ms`);
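Those mark/measure calls can be wrapped in a small reusable helper, so any synchronous function gets timed the same way. `measureSync` is our own sketch, not a standard API:

```javascript
// Wraps the User Timing calls shown above around an arbitrary function
function measureSync(name, fn) {
  performance.mark(`${name}:start`);
  const result = fn();
  performance.mark(`${name}:end`);
  performance.measure(name, `${name}:start`, `${name}:end`);
  // Read back the most recent measure with this name
  const [entry] = performance.getEntriesByName(name).slice(-1);
  return { result, duration: entry.duration };
}

const { result, duration } = measureSync('sum', () =>
  Array.from({ length: 1000 }, (_, i) => i).reduce((a, b) => a + b, 0)
);
console.log(result, `${duration.toFixed(2)}ms`); // 499500 and a small duration
```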

Best Practices and Tips

  1. Choose the Right Chunk Size:

    • Too small: Overhead from frequent updates
    • Too large: UI becomes unresponsive
    • Start with 100 items per chunk and adjust based on performance metrics
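One way to apply that advice is a simple feedback loop: time each chunk, then grow or shrink the next chunk size toward a time budget. `nextChunkSize`, the 12ms target (roughly one frame with some render headroom), and the 1.5x/0.5x thresholds are illustrative assumptions, not measured constants:

```javascript
// Hypothetical tuner: keep each chunk's processing time near targetMs
function nextChunkSize(currentSize, lastChunkMs, targetMs = 12, min = 10, max = 2000) {
  if (lastChunkMs > targetMs * 1.5) return Math.max(min, Math.floor(currentSize / 2)); // too slow: halve
  if (lastChunkMs < targetMs * 0.5) return Math.min(max, currentSize * 2);             // too fast: double
  return currentSize;                                                                   // within budget: keep
}

console.log(nextChunkSize(100, 30)); // slow chunk -> 50
console.log(nextChunkSize(100, 2));  // fast chunk -> 200
```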
  2. Memory Management:

// Preallocate the output array and fill it chunk by chunk
const processedEvents = new Array(totalEvents);
let chunkIndex = 0;
for (const chunk of chunks) {
  const processedChunk = chunk.map(processEvent);
  processedEvents.splice(chunkIndex * CHUNK_SIZE, CHUNK_SIZE, ...processedChunk);
  // Drop the raw chunk's contents to help garbage collection
  chunk.length = 0;
  chunkIndex++;
}
  3. Error Handling:
const safeProcess = async (event) => {
  try {
    return await processEvent(event);
  } catch (error) {
    console.error(`Failed to process event ${event.id}:`, error);
    return {
      ...event,
      error: error.message
    };
  }
};
  4. Cancellation:
function EventsDashboard() {
  const cancelRef = useRef(false);

  useEffect(() => {
    return () => {
      cancelRef.current = true;
    };
  }, []);

  const processEvents = async () => {
    for (const chunk of chunks) {
      if (cancelRef.current) break;
      // Process chunk...
    }
  };
}
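The `cancelRef` pattern above covers the main-thread loop; for the worker-based pipeline, the simplest cancellation is `Worker.terminate()`, which stops the thread immediately. The wrapper below is a hypothetical sketch: `createCancellableProcessor` and `workerFactory` are our own names, where `workerFactory` is assumed to return a fresh `Worker`.

```javascript
// workerFactory: e.g. () => new Worker(new URL('./eventWorker.ts', import.meta.url))
function createCancellableProcessor(workerFactory) {
  let worker = null;
  return {
    start(events, onMessage) {
      worker = workerFactory();
      worker.onmessage = onMessage;
      worker.postMessage({ type: 'PROCESS_EVENTS', payload: events });
    },
    cancel() {
      // terminate() kills the worker thread at once, discarding in-flight work
      worker?.terminate();
      worker = null;
    },
  };
}
```

Because a terminated worker cannot be reused, each `start()` builds a fresh one; this avoids threading a cancellation flag through the worker's processing loop.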

Real-World Performance Improvements

With this implementation:

  • Processing 10,000 events: 60s → 3s
  • UI remains responsive throughout
  • Memory usage stays consistent
  • Users see progressive updates
  • Processing can be cancelled

Conclusion

By combining Generator Functions and Web Workers, we can handle intensive data processing tasks while maintaining a smooth user experience. This pattern is particularly valuable for:

  • Data visualization applications
  • Real-time analytics dashboards
  • Large dataset processing
  • Complex calculations
  • File processing applications

The key is to break down large tasks into manageable chunks and process them in a way that doesn't block the main thread, while keeping the user informed of progress.

Remember to always measure performance before and after implementing these optimizations to ensure they provide meaningful benefits for your specific use case.
