
Nadine

What I learned from the AI Showdown❤️

Over the weekend of 14-15 June, Lovable hosted the AI Showdown: a public comparison of the world's leading code-generating AI models.

OpenAI, Anthropic, and Google all partnered with Lovable for the event.

While working on a project in Bolt using Claude Sonnet, I struggled to troubleshoot errors because the model would bundle most of the application code into a single file, making it difficult to maintain or follow.


The Problem: Monolithic Bundles

A large JavaScript bundle forces the user's browser to download and parse all the application's code upfront. This leads to:

  • Slower Initial Load Times: The user sees a blank screen or a loading spinner for longer, impacting perceived performance.
  • Increased Bandwidth Usage: Unnecessary code is downloaded, which can be a particular issue for users on slower networks or with data caps.
  • Higher Memory Consumption: More code means more memory used by the browser, potentially affecting less powerful devices.

I decided to use the Showdown to test the three models on this problem. Switching to Gemini to refactor the code into smaller files improved maintainability, and GPT could do the same, refactoring without compromising functionality.


How I Fixed My App's Perceived Loading Issues

Code Splitting with React.lazy and React.Suspense

Code splitting breaks the application's JavaScript into smaller "chunks" that can be loaded on demand. React.lazy and React.Suspense provide a convenient way to implement this in React:

  • React.lazy(): This allows you to render a dynamic import as a regular component. Instead of importing a component directly at the top of your file, you can "lazy-load" it, meaning its code chunk will only be fetched when the component is actually rendered.
  • React.Suspense: This works in conjunction with React.lazy. It lets you specify a "fallback" UI (e.g., a loading spinner, skeleton screen, or simple text like "Loading...") to display while the lazy-loaded component's code is being fetched and prepared.
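The idea behind `React.lazy` can be sketched in plain JavaScript: the component's code is only fetched and evaluated the first time it is rendered, then cached for every render after that. Note this is a simplified model, not the real React API; the names `lazyComponent`, `HeavyChart`, and `chunkLoads` here are all hypothetical:

```javascript
// Sketch of the lazy-loading idea: the "chunk" is a loader function
// whose code is only evaluated on first render, then cached.
function lazyComponent(loadChunk) {
  let component = null;
  return function render(props) {
    if (component === null) {
      component = loadChunk(); // first render: fetch and evaluate the chunk
    }
    return component(props);   // later renders reuse the cached component
  };
}

// Hypothetical heavy component that would normally live in its own file;
// chunkLoads counts how often its "chunk" is actually evaluated.
let chunkLoads = 0;
const HeavyChart = lazyComponent(() => {
  chunkLoads += 1;
  return (props) => `chart(${props.data})`;
});

console.log(chunkLoads);                 // 0, nothing is loaded up front
console.log(HeavyChart({ data: 'q1' })); // "chart(q1)", first render loads it
console.log(HeavyChart({ data: 'q2' })); // "chart(q2)", served from the cache
console.log(chunkLoads);                 // 1, the chunk was loaded exactly once
```

In a real React app, `React.lazy(() => import('./HeavyChart'))` does this with a dynamic `import()`, which is what lets the bundler split the component into its own chunk, and `React.Suspense` supplies the fallback UI while that import is still resolving.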

Experimenting with different models helped me find a solution that dramatically improved the user experience by providing immediate feedback and preventing a blank page.
