<?xml version="1.0" encoding="UTF-8"?>
<rss version="2.0" xmlns:atom="http://www.w3.org/2005/Atom" xmlns:dc="http://purl.org/dc/elements/1.1/">
  <channel>
    <title>DEV Community: zenlesscodes</title>
    <description>The latest articles on DEV Community by zenlesscodes (@zenlesscodes).</description>
    <link>https://dev.to/zenlesscodes</link>
    <image>
      <url>https://media2.dev.to/dynamic/image/width=90,height=90,fit=cover,gravity=auto,format=auto/https:%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Fuser%2Fprofile_image%2F3746653%2F4774633d-4eba-4352-96be-2bbc10970d67.png</url>
      <title>DEV Community: zenlesscodes</title>
      <link>https://dev.to/zenlesscodes</link>
    </image>
    <atom:link rel="self" type="application/rss+xml" href="https://dev.to/feed/zenlesscodes"/>
    <language>en</language>
    <item>
      <title>I Made My ZZZ Code Site 100x Faster by Removing Flask</title>
      <dc:creator>zenlesscodes</dc:creator>
      <pubDate>Wed, 04 Feb 2026 09:26:39 +0000</pubDate>
      <link>https://dev.to/zenlesscodes/i-made-my-zzz-code-site-100x-faster-by-removing-flask-1nh2</link>
      <guid>https://dev.to/zenlesscodes/i-made-my-zzz-code-site-100x-faster-by-removing-flask-1nh2</guid>
      <description>&lt;p&gt;Remember that &lt;a href="https://zenlesscodes.com" rel="noopener noreferrer"&gt;ZZZ code aggregator&lt;/a&gt; I built? Well, I looked at my VPS metrics and realized something dumb: I was running Python on every single request for data that updates &lt;em&gt;once per hour&lt;/em&gt;.&lt;/p&gt;

&lt;p&gt;So I ripped out Flask entirely. Here's what happened.&lt;/p&gt;

&lt;h2&gt;The Problem&lt;/h2&gt;

&lt;p&gt;My original setup was standard Flask + Gunicorn behind Nginx:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;Request → Nginx → Gunicorn → Flask → Response
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;It worked fine. But the site was just... rendering the same JSON data every time. The codes only change when HoYoverse drops new ones (roughly weekly, sometimes monthly). Running a Python process for every visitor felt wasteful on a $3 VPS.&lt;/p&gt;

&lt;p&gt;Memory usage hovered around 100MB. Not terrible, but not great either.&lt;/p&gt;

&lt;h2&gt;The Fix: Static File Generation&lt;/h2&gt;

&lt;p&gt;The new architecture is embarrassingly simple:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;Request → Nginx → static file (HTML/JSON)
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;A background daemon generates static files every hour. Nginx serves them directly. That's it.&lt;/p&gt;

&lt;p&gt;The key insight: &lt;strong&gt;if your data doesn't change between requests, don't recompute it on every request.&lt;/strong&gt;&lt;/p&gt;
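&lt;p&gt;To make that concrete, here's roughly what the hourly generation step looks like. This is a sketch, not the site's actual code - the template, output names, and &lt;code&gt;publish()&lt;/code&gt; are made up - but the shape (render once, write atomically, let Nginx do the rest) is the whole idea:&lt;/p&gt;

```python
# Hypothetical sketch of the hourly generation step (names and paths are
# illustrative, not the site's actual code): render once, publish atomically.
import json
import os
from pathlib import Path

from jinja2 import Template  # the one remaining dependency

PAGE = Template("Active ZZZ codes:\n{% for code in codes %}- {{ code }}\n{% endfor %}")
OUT_DIR = Path("public")

def publish(codes):
    """Render the page and JSON feed once, then atomically swap them into place."""
    OUT_DIR.mkdir(exist_ok=True)
    outputs = {
        "index.txt": PAGE.render(codes=codes),
        "codes.json": json.dumps({"codes": codes}),
    }
    for name, content in outputs.items():
        target = OUT_DIR / name
        tmp = target.with_name(target.name + ".tmp")
        tmp.write_text(content, encoding="utf-8")
        os.replace(tmp, target)  # atomic on POSIX: readers never see a partial file
```

&lt;p&gt;The daemon just calls &lt;code&gt;publish()&lt;/code&gt; from its scheduled job; everything Nginx serves is a plain file on disk.&lt;/p&gt;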

&lt;h2&gt;The Code That Made It Work&lt;/h2&gt;

&lt;h3&gt;Atomic File Writes&lt;/h3&gt;

&lt;p&gt;The trickiest part was ensuring users never see a half-written file. POSIX gives us atomic renames within a single filesystem, so:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight python"&gt;&lt;code&gt;&lt;span class="k"&gt;def&lt;/span&gt; &lt;span class="nf"&gt;atomic_write&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="n"&gt;filepath&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="n"&gt;Path&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="n"&gt;content&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="nb"&gt;str&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt; &lt;span class="o"&gt;-&amp;gt;&lt;/span&gt; &lt;span class="bp"&gt;None&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;
    &lt;span class="n"&gt;temp_path&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="n"&gt;filepath&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;with_suffix&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="s"&gt;.tmp&lt;/span&gt;&lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt;
    &lt;span class="k"&gt;try&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;
        &lt;span class="n"&gt;temp_path&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;write_text&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="n"&gt;content&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="n"&gt;encoding&lt;/span&gt;&lt;span class="o"&gt;=&lt;/span&gt;&lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="s"&gt;utf-8&lt;/span&gt;&lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt;
        &lt;span class="n"&gt;temp_path&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;rename&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="n"&gt;filepath&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt;  &lt;span class="c1"&gt;# Atomic on Linux
&lt;/span&gt;    &lt;span class="k"&gt;except&lt;/span&gt; &lt;span class="nb"&gt;Exception&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;
        &lt;span class="k"&gt;if&lt;/span&gt; &lt;span class="n"&gt;temp_path&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;exists&lt;/span&gt;&lt;span class="p"&gt;():&lt;/span&gt;
            &lt;span class="n"&gt;temp_path&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;unlink&lt;/span&gt;&lt;span class="p"&gt;()&lt;/span&gt;
        &lt;span class="k"&gt;raise&lt;/span&gt;  &lt;span class="c1"&gt;# bare raise keeps the original traceback&lt;/span&gt;
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;Write to a temp file, then rename it over the real one. The rename is atomic as long as both paths sit on the same filesystem (they do here - the temp file lives next to its target), so a reader either gets the old file or the new file, never a partial write.&lt;/p&gt;

&lt;h3&gt;Scheduler Config&lt;/h3&gt;

&lt;p&gt;APScheduler handles the hourly updates, but I needed to prevent overlapping runs if one takes too long:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight python"&gt;&lt;code&gt;&lt;span class="n"&gt;scheduler&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;add_job&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;
    &lt;span class="n"&gt;update_codes_task&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
    &lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="s"&gt;interval&lt;/span&gt;&lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
    &lt;span class="n"&gt;minutes&lt;/span&gt;&lt;span class="o"&gt;=&lt;/span&gt;&lt;span class="mi"&gt;60&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
    &lt;span class="n"&gt;max_instances&lt;/span&gt;&lt;span class="o"&gt;=&lt;/span&gt;&lt;span class="mi"&gt;1&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;    &lt;span class="c1"&gt;# Prevent overlapping
&lt;/span&gt;    &lt;span class="n"&gt;coalesce&lt;/span&gt;&lt;span class="o"&gt;=&lt;/span&gt;&lt;span class="bp"&gt;True&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;      &lt;span class="c1"&gt;# Combine missed runs
&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt;
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;&lt;code&gt;max_instances=1&lt;/code&gt; ensures only one update runs at a time. &lt;code&gt;coalesce=True&lt;/code&gt; means if the server was down and missed 3 runs, it only runs once when it comes back up (not 3 times in a row).&lt;/p&gt;

&lt;h2&gt;The Results&lt;/h2&gt;

&lt;div class="table-wrapper-paragraph"&gt;&lt;table&gt;
&lt;thead&gt;
&lt;tr&gt;
&lt;th&gt;Metric&lt;/th&gt;
&lt;th&gt;Before (Flask)&lt;/th&gt;
&lt;th&gt;After (Static)&lt;/th&gt;
&lt;/tr&gt;
&lt;/thead&gt;
&lt;tbody&gt;
&lt;tr&gt;
&lt;td&gt;Memory&lt;/td&gt;
&lt;td&gt;~100MB&lt;/td&gt;
&lt;td&gt;~20MB&lt;/td&gt;
&lt;/tr&gt;
&lt;tr&gt;
&lt;td&gt;Requests/sec&lt;/td&gt;
&lt;td&gt;~100-500&lt;/td&gt;
&lt;td&gt;~10,000+&lt;/td&gt;
&lt;/tr&gt;
&lt;tr&gt;
&lt;td&gt;Dependencies&lt;/td&gt;
&lt;td&gt;flask, gunicorn&lt;/td&gt;
&lt;td&gt;jinja2&lt;/td&gt;
&lt;/tr&gt;
&lt;/tbody&gt;
&lt;/table&gt;&lt;/div&gt;

&lt;p&gt;That requests/sec number is from Nginx serving static files directly. Your mileage will vary based on file size and server specs, but the point is: it's &lt;em&gt;way&lt;/em&gt; faster.&lt;/p&gt;

&lt;h2&gt;Bonus: Image Optimization&lt;/h2&gt;

&lt;p&gt;While I was at it, I converted all the reward icons from remote PNG URLs to self-hosted WebP files:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight javascript"&gt;&lt;code&gt;&lt;span class="kd"&gt;const&lt;/span&gt; &lt;span class="nx"&gt;ICON_MAP&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="p"&gt;{&lt;/span&gt;
  &lt;span class="dl"&gt;'&lt;/span&gt;&lt;span class="s1"&gt;Polychrome&lt;/span&gt;&lt;span class="dl"&gt;'&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="dl"&gt;'&lt;/span&gt;&lt;span class="s1"&gt;/static/e6ee639872c119aa6895758f3a755d3b.webp&lt;/span&gt;&lt;span class="dl"&gt;'&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
  &lt;span class="dl"&gt;'&lt;/span&gt;&lt;span class="s1"&gt;Denny&lt;/span&gt;&lt;span class="dl"&gt;'&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="dl"&gt;'&lt;/span&gt;&lt;span class="s1"&gt;/static/7db931d2138edcfb9e155907503f2fbe.webp&lt;/span&gt;&lt;span class="dl"&gt;'&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
  &lt;span class="dl"&gt;'&lt;/span&gt;&lt;span class="s1"&gt;Senior Investigator Log&lt;/span&gt;&lt;span class="dl"&gt;'&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="dl"&gt;'&lt;/span&gt;&lt;span class="s1"&gt;/static/ab0406e53b7f8c4afe08096a2f7aa587.webp&lt;/span&gt;&lt;span class="dl"&gt;'&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
  &lt;span class="c1"&gt;// ... 11 reward icons total&lt;/span&gt;
&lt;span class="p"&gt;};&lt;/span&gt;
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;The annoying part? I was loading these from game8.co on every page view. Now they're:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;
&lt;strong&gt;25-34% smaller&lt;/strong&gt; (WebP compression vs PNG)&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Self-hosted&lt;/strong&gt; (no external dependencies)&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Content-hashed&lt;/strong&gt; for cache-busting&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;I kept PNG fallbacks for older browsers, but modern browsers get the lighter WebP versions.&lt;/p&gt;

&lt;h2&gt;Unexpected Benefits&lt;/h2&gt;

&lt;p&gt;&lt;strong&gt;Crash-proof&lt;/strong&gt;: If my Python daemon dies, the last-generated files keep getting served. Users see stale data (as old as the last successful run) instead of an error page.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Simpler deployment&lt;/strong&gt;: No WSGI server to configure. Just run the Python script as a systemd service and point Nginx at a directory.&lt;/p&gt;
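&lt;p&gt;For illustration, the unit file can be as small as this - the name and paths here are invented, not the site's actual service:&lt;/p&gt;

```ini
# /etc/systemd/system/zzz-generator.service (hypothetical name and paths)
[Unit]
Description=ZZZ codes static site generator
After=network-online.target

[Service]
ExecStart=/usr/bin/python3 /opt/zenlesscodes/generate.py
Restart=on-failure
User=www-data

[Install]
WantedBy=multi-user.target
```

&lt;p&gt;Nginx then only needs a &lt;code&gt;root&lt;/code&gt; directive pointing at the output directory.&lt;/p&gt;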

&lt;p&gt;&lt;strong&gt;IndexNow integration&lt;/strong&gt;: When codes actually change, I ping search engines. But it only happens when there's real new content, not on every request.&lt;/p&gt;

&lt;h2&gt;The Takeaway&lt;/h2&gt;

&lt;p&gt;The best optimization is often eliminating unnecessary work entirely.&lt;/p&gt;

&lt;p&gt;Before: request hits Gunicorn → Flask routes it → fetch cached data → render the template → return the response.&lt;/p&gt;

&lt;p&gt;After: Return file.&lt;/p&gt;

&lt;p&gt;This pattern works whenever your data updates less frequently than your traffic. Blog posts, documentation sites, dashboards with hourly data - all good candidates.&lt;/p&gt;

&lt;h2&gt;Cost Breakdown&lt;/h2&gt;

&lt;p&gt;Still the same as before:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;Domain: ~$10/year&lt;/li&gt;
&lt;li&gt;VPS: ~$3/month&lt;/li&gt;
&lt;li&gt;Cloudflare: Free tier&lt;/li&gt;
&lt;/ul&gt;




&lt;p&gt;&lt;strong&gt;&lt;a href="https://zenlesscodes.com" rel="noopener noreferrer"&gt;zenlesscodes.com&lt;/a&gt;&lt;/strong&gt; - still aggregating ZZZ codes, now 100x faster at it.&lt;/p&gt;

</description>
      <category>python</category>
      <category>webdev</category>
      <category>performance</category>
      <category>beginners</category>
    </item>
    <item>
      <title>How I Built a ZZZ Code Aggregator in a Weekend</title>
      <dc:creator>zenlesscodes</dc:creator>
      <pubDate>Mon, 02 Feb 2026 06:39:35 +0000</pubDate>
      <link>https://dev.to/zenlesscodes/how-i-built-a-zzz-code-aggregator-in-a-weekend-2ldc</link>
      <guid>https://dev.to/zenlesscodes/how-i-built-a-zzz-code-aggregator-in-a-weekend-2ldc</guid>
      <description>&lt;p&gt;I play &lt;strong&gt;Zenless Zone Zero&lt;/strong&gt;, and like most gacha games, HoYoverse drops redemption codes for free currency. The annoying part? &lt;strong&gt;I kept missing codes.&lt;/strong&gt;&lt;/p&gt;

&lt;p&gt;So I built &lt;a href="https://zenlesscodes.com" rel="noopener noreferrer"&gt;zenlesscodes.com&lt;/a&gt; - a simple aggregator that pulls from multiple sources and shows all active codes in one place.&lt;/p&gt;

&lt;h2&gt;The Stack&lt;/h2&gt;

&lt;p&gt;Nothing fancy:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;
&lt;strong&gt;Python/Flask&lt;/strong&gt; for the backend&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;BeautifulSoup&lt;/strong&gt; for scraping (respectfully) the Fandom wiki&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;APScheduler&lt;/strong&gt; for hourly background updates&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Gunicorn&lt;/strong&gt; as the WSGI server&lt;/li&gt;
&lt;li&gt;Runs on a &lt;strong&gt;$3/month VPS&lt;/strong&gt;
&lt;/li&gt;
&lt;/ul&gt;

&lt;h2&gt;How It Works&lt;/h2&gt;

&lt;p&gt;The site fetches codes from 3 sources every hour:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight python"&gt;&lt;code&gt;&lt;span class="k"&gt;def&lt;/span&gt; &lt;span class="nf"&gt;fetch_all_codes&lt;/span&gt;&lt;span class="p"&gt;():&lt;/span&gt;
    &lt;span class="n"&gt;primary_codes&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="nf"&gt;fetch_primary_api&lt;/span&gt;&lt;span class="p"&gt;()&lt;/span&gt;      &lt;span class="c1"&gt;# hoyo-codes API
&lt;/span&gt;    &lt;span class="n"&gt;github_codes&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="nf"&gt;fetch_github&lt;/span&gt;&lt;span class="p"&gt;()&lt;/span&gt;            &lt;span class="c1"&gt;# community GitHub list
&lt;/span&gt;    &lt;span class="n"&gt;fandom_codes&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="n"&gt;expired&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="nf"&gt;fetch_fandom&lt;/span&gt;&lt;span class="p"&gt;()&lt;/span&gt;   &lt;span class="c1"&gt;# wiki scraping
&lt;/span&gt;
    &lt;span class="k"&gt;return&lt;/span&gt; &lt;span class="nf"&gt;aggregate_codes&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="n"&gt;primary_codes&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="n"&gt;github_codes&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="n"&gt;fandom_codes&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="n"&gt;expired&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt;
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;The tricky part was &lt;strong&gt;deduplication and expiration detection&lt;/strong&gt;. Different sources report the same codes with slightly different formatting, and the Fandom wiki is actually the most reliable for knowing when codes expire.&lt;/p&gt;
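&lt;p&gt;The merge itself can stay small. A sketch of the normalize-and-merge idea - the helper names are mine, not the site's actual code:&lt;/p&gt;

```python
# Sketch of the dedup/expiry merge (helper names are illustrative): collapse
# formatting differences like "zzz-free" vs "ZZZFREE " into one canonical key,
# and let the expired list win over any "active" report.
def normalize(code):
    """Canonical form: uppercase, alphanumerics only."""
    return "".join(ch for ch in code.upper() if ch.isalnum())

def aggregate_codes(*sources, expired=()):
    """Merge all sources into a dict of canonical code -> is_active."""
    dead = {normalize(c) for c in expired}
    merged = {}
    for source in sources:
        for code in source:
            key = normalize(code)
            merged[key] = key not in dead
    return merged
```

&lt;p&gt;Expired wins over active here because the wiki is the source that actually tracks expiry; the others mostly only ever say "active".&lt;/p&gt;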

&lt;h2&gt;The Fandom Scraping&lt;/h2&gt;

&lt;p&gt;Fandom uses a MediaWiki backend, so instead of fighting Cloudflare, I just use their API:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight python"&gt;&lt;code&gt;&lt;span class="k"&gt;def&lt;/span&gt; &lt;span class="nf"&gt;fetch_fandom_via_api&lt;/span&gt;&lt;span class="p"&gt;():&lt;/span&gt;
    &lt;span class="n"&gt;params&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="p"&gt;{&lt;/span&gt;
        &lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="s"&gt;action&lt;/span&gt;&lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="s"&gt;parse&lt;/span&gt;&lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
        &lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="s"&gt;page&lt;/span&gt;&lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="s"&gt;Redemption_Code&lt;/span&gt;&lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
        &lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="s"&gt;format&lt;/span&gt;&lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="s"&gt;json&lt;/span&gt;&lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
        &lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="s"&gt;prop&lt;/span&gt;&lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="s"&gt;text&lt;/span&gt;&lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
    &lt;span class="p"&gt;}&lt;/span&gt;
    &lt;span class="n"&gt;response&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="n"&gt;requests&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;get&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="n"&gt;FANDOM_API_URL&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="n"&gt;params&lt;/span&gt;&lt;span class="o"&gt;=&lt;/span&gt;&lt;span class="n"&gt;params&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt;
    &lt;span class="k"&gt;return&lt;/span&gt; &lt;span class="n"&gt;response&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;json&lt;/span&gt;&lt;span class="p"&gt;()[&lt;/span&gt;&lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="s"&gt;parse&lt;/span&gt;&lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="p"&gt;][&lt;/span&gt;&lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="s"&gt;text&lt;/span&gt;&lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="p"&gt;][&lt;/span&gt;&lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="s"&gt;*&lt;/span&gt;&lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="p"&gt;]&lt;/span&gt;
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;Then I parse the HTML table, checking for &lt;code&gt;bg-green&lt;/code&gt; (active) vs &lt;code&gt;bg-red&lt;/code&gt; (expired) CSS classes on the status cells.&lt;/p&gt;
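&lt;p&gt;That check is a few lines of BeautifulSoup. A sketch - the real wiki table has more columns, but the class convention is the part that matters:&lt;/p&gt;

```python
# Sketch of the status-cell check (the real wiki table has more columns; the
# bg-green/bg-red class convention is the part that matters).
from bs4 import BeautifulSoup

def parse_codes(html):
    """Split table rows into (active, expired) by the status cell's CSS class."""
    active, expired = [], []
    for row in BeautifulSoup(html, "html.parser").find_all("tr"):
        cells = row.find_all("td")
        if len(cells) >= 2:
            code = cells[0].get_text(strip=True)
            classes = cells[1].get("class") or []
            if "bg-green" in classes:
                active.append(code)
            elif "bg-red" in classes:
                expired.append(code)
    return active, expired
```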

&lt;h2&gt;SEO Stuff&lt;/h2&gt;

&lt;p&gt;Threw in &lt;strong&gt;IndexNow&lt;/strong&gt; pings so search engines know when content updates. It's literally one API call to Bing, and Bing shares it with the other participating engines.&lt;/p&gt;
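&lt;p&gt;For the curious, the call is small enough to sketch with just the standard library. The endpoint below is the shared IndexNow one (per-engine endpoints also exist), and the host/key/URLs are placeholders - building the request is split from sending it so the payload is easy to see:&lt;/p&gt;

```python
# Stdlib-only sketch of an IndexNow ping. The endpoint is the shared one
# (per-engine endpoints like Bing's also exist); host/key/URLs are
# placeholders. Per the protocol, the key must match a key file served
# from the host being submitted.
import json
import urllib.request

INDEXNOW_ENDPOINT = "https://api.indexnow.org/indexnow"

def build_ping(host, key, urls):
    """Construct the one-shot IndexNow POST request."""
    payload = {"host": host, "key": key, "urlList": urls}
    return urllib.request.Request(
        INDEXNOW_ENDPOINT,
        data=json.dumps(payload).encode("utf-8"),
        headers={"Content-Type": "application/json; charset=utf-8"},
        method="POST",
    )

# Actually sending it is one line:
# urllib.request.urlopen(build_ping("example.com", "my-key", ["https://example.com/"]), timeout=10)
```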

&lt;h2&gt;What I'd Do Differently&lt;/h2&gt;

&lt;p&gt;Honestly, not much. Flask might be overkill - this could probably be a static site generated by a cron job. But the Flask setup gives me a &lt;code&gt;/api/codes&lt;/code&gt; endpoint for free, and the scheduling is cleaner.&lt;/p&gt;

&lt;h2&gt;Cost&lt;/h2&gt;

&lt;ul&gt;
&lt;li&gt;Domain: ~$10/year&lt;/li&gt;
&lt;li&gt;VPS: ~$3/month (already had this, prepaid for 3 years)&lt;/li&gt;
&lt;li&gt;Cloudflare: Free tier&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;That's it. Launched a couple days ago.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;&lt;a href="https://zenlesscodes.com" rel="noopener noreferrer"&gt;zenlesscodes.com&lt;/a&gt;&lt;/strong&gt; if you play ZZZ and want to stop missing codes.&lt;/p&gt;

</description>
      <category>python</category>
      <category>webdev</category>
      <category>flask</category>
      <category>beginners</category>
    </item>
  </channel>
</rss>
