<?xml version="1.0" encoding="UTF-8"?>
<rss version="2.0" xmlns:atom="http://www.w3.org/2005/Atom" xmlns:dc="http://purl.org/dc/elements/1.1/">
  <channel>
    <title>DEV Community: Ahmad Sharabati</title>
    <description>The latest articles on DEV Community by Ahmad Sharabati (@ahmad212o).</description>
    <link>https://dev.to/ahmad212o</link>
    <image>
      <url>https://media2.dev.to/dynamic/image/width=90,height=90,fit=cover,gravity=auto,format=auto/https:%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Fuser%2Fprofile_image%2F3897947%2F7a6bb9d9-0ba6-4066-acf5-75dfe44b774e.png</url>
      <title>DEV Community: Ahmad Sharabati</title>
      <link>https://dev.to/ahmad212o</link>
    </image>
    <atom:link rel="self" type="application/rss+xml" href="https://dev.to/feed/ahmad212o"/>
    <language>en</language>
    <item>
      <title>Stop flying blind on flaky tests — pytest-cloudreport gives you HTML reports and cross-run history</title>
      <dc:creator>Ahmad Sharabati</dc:creator>
      <pubDate>Sun, 26 Apr 2026 07:04:23 +0000</pubDate>
      <link>https://dev.to/ahmad212o/stop-flying-blind-on-flaky-tests-pytest-cloudreport-gives-you-html-reports-and-cross-run-history-5c0k</link>
      <guid>https://dev.to/ahmad212o/stop-flying-blind-on-flaky-tests-pytest-cloudreport-gives-you-html-reports-and-cross-run-history-5c0k</guid>
      <description>&lt;p&gt;If you've ever stared at a CI failure that passes locally, or watched the same test flap for the third time this week with no idea &lt;em&gt;how often&lt;/em&gt; it's actually broken — this post is for you.&lt;/p&gt;

&lt;h2&gt;The problem&lt;/h2&gt;

&lt;p&gt;pytest's built-in output is great for a single run. But it tells you nothing across runs:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;Which tests are consistently slow?&lt;/li&gt;
&lt;li&gt;Is this failure new or has it been flapping for two weeks?&lt;/li&gt;
&lt;li&gt;Did the last deploy make things better or worse?&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;You end up either ignoring flakiness until it becomes critical, or building custom tooling you don't have time to maintain.&lt;/p&gt;

&lt;h2&gt;What pytest-cloudreport does&lt;/h2&gt;

&lt;p&gt;It's a pytest plugin with two modes:&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;1. Local HTML report&lt;/strong&gt; — zero config, zero account needed.&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight shell"&gt;&lt;code&gt;pip &lt;span class="nb"&gt;install &lt;/span&gt;pytest-cloudreport
pytest &lt;span class="nt"&gt;--cloudreport-local&lt;/span&gt;
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;After your test run you get a self-contained &lt;code&gt;cloudreport.html&lt;/code&gt; with a full breakdown: pass/fail/skip counts, duration per test, error output, environment info. Open it in a browser, share it with a teammate, attach it to a ticket.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;2. Cloud dashboard&lt;/strong&gt; — cross-run history, flaky-test detection, team access.&lt;/p&gt;

&lt;p&gt;Set an API key and every run uploads automatically:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight shell"&gt;&lt;code&gt;&lt;span class="nb"&gt;export &lt;/span&gt;&lt;span class="nv"&gt;PYTEST_CLOUD_API_KEY&lt;/span&gt;&lt;span class="o"&gt;=&lt;/span&gt;your_key_here
pytest
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;No &lt;code&gt;--cloudreport&lt;/code&gt; flag needed — the presence of the key enables it. The upload runs in a background thread with a 5-second timeout, so it never slows down or stalls your CI pipeline.&lt;/p&gt;
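&lt;p&gt;The fire-and-forget behaviour described above can be sketched like this. This is a simplified model, not the plugin's actual code; &lt;code&gt;send&lt;/code&gt; stands in for the real HTTP call:&lt;/p&gt;

```python
import threading

def upload_report(payload, send):
    # Run the network call on a daemon thread and wait at most
    # 5 seconds for it, so a slow or dead endpoint cannot stall the
    # test session. (Sketch; `send` stands in for the real HTTP call.)
    t = threading.Thread(target=send, args=(payload,), daemon=True)
    t.start()
    t.join(timeout=5.0)
    return not t.is_alive()  # True if the upload finished in time
```

Because the thread is a daemon, even an upload that never returns won't keep the pytest process alive past the end of the session.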

&lt;h2&gt;Flaky test detection&lt;/h2&gt;

&lt;p&gt;This was the original reason I built it. The dashboard tracks pass/fail across runs and surfaces tests with inconsistent results — not just "it failed today" but "this test has failed 4 out of the last 20 runs."&lt;/p&gt;
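&lt;p&gt;As a rough sketch of that kind of heuristic (illustrative only, not the dashboard's actual algorithm): a test is flaky when its recent window contains both passes and failures, since a test that always fails is a plain regression, not flakiness.&lt;/p&gt;

```python
def flaky_score(outcomes, window=20):
    # Look at the most recent `window` outcomes for one test. It is
    # "flaky" if the window contains both passes and failures; a
    # consistent failure is a regression, not flakiness.
    # (Illustrative heuristic, not the dashboard's real algorithm.)
    recent = outcomes[-window:]
    failures = sum(1 for o in recent if o == "failed")
    is_flaky = failures != 0 and failures != len(recent)
    return failures, len(recent), is_flaky
```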

&lt;p&gt;It also integrates cleanly with &lt;code&gt;pytest-rerunfailures&lt;/code&gt;: intermediate retries are ignored and only the final outcome is recorded, so a test that passes on the third attempt counts as one flaky event, not two failures.&lt;/p&gt;
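&lt;p&gt;The final-outcome rule can be sketched as follows (again illustrative, not the plugin's internals): with &lt;code&gt;pytest-rerunfailures&lt;/code&gt;, one test may produce several attempt outcomes, and only the last one is recorded.&lt;/p&gt;

```python
def record_final_outcome(attempts):
    # `attempts` is the per-attempt outcome list for one test, e.g.
    # ["failed", "failed", "passed"] under pytest-rerunfailures.
    # Only the final attempt is recorded; a retried test that ends up
    # passing counts as a single flaky event, not multiple failures.
    # (Sketch of the behaviour described above, not plugin internals.)
    final = attempts[-1]
    was_retried = len(attempts) != 1
    flaky_event = was_retried and final == "passed"
    return {"outcome": final, "flaky_event": flaky_event}
```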

&lt;h2&gt;CI setup (GitHub Actions example)&lt;/h2&gt;



&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight yaml"&gt;&lt;code&gt;&lt;span class="pi"&gt;-&lt;/span&gt; &lt;span class="na"&gt;name&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt; &lt;span class="s"&gt;Run tests&lt;/span&gt;
  &lt;span class="na"&gt;env&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt;
    &lt;span class="na"&gt;PYTEST_CLOUD_API_KEY&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt; &lt;span class="s"&gt;${{ secrets.PYTEST_CLOUD_API_KEY }}&lt;/span&gt;
  &lt;span class="na"&gt;run&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt; &lt;span class="s"&gt;pytest --cloudreport-local&lt;/span&gt;
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;That's it. You get both the local HTML artifact and the cloud upload in one command.&lt;/p&gt;

&lt;p&gt;To attach the report as a CI artifact:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight yaml"&gt;&lt;code&gt;&lt;span class="pi"&gt;-&lt;/span&gt; &lt;span class="na"&gt;uses&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt; &lt;span class="s"&gt;actions/upload-artifact@v4&lt;/span&gt;
  &lt;span class="na"&gt;if&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt; &lt;span class="s"&gt;always()&lt;/span&gt;
  &lt;span class="na"&gt;with&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt;
    &lt;span class="na"&gt;name&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt; &lt;span class="s"&gt;test-report&lt;/span&gt;
    &lt;span class="na"&gt;path&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt; &lt;span class="s"&gt;cloudreport.html&lt;/span&gt;
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;h2&gt;pytest-xdist support&lt;/h2&gt;

&lt;p&gt;If you run tests in parallel with &lt;code&gt;pytest-xdist&lt;/code&gt;, the plugin automatically detects worker processes and only uploads from the controller. No configuration needed, no duplicate uploads.&lt;/p&gt;
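&lt;p&gt;pytest-xdist marks each worker process with the &lt;code&gt;PYTEST_XDIST_WORKER&lt;/code&gt; environment variable (e.g. &lt;code&gt;gw0&lt;/code&gt;), which is absent in the controller, so the detection could be sketched as:&lt;/p&gt;

```python
import os

def is_xdist_controller():
    # pytest-xdist sets PYTEST_XDIST_WORKER (e.g. "gw0") in each
    # worker process; the controller does not have it. Uploading only
    # when the variable is absent avoids duplicate uploads.
    # (Sketch of the detection described above.)
    return os.environ.get("PYTEST_XDIST_WORKER") is None
```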

&lt;h2&gt;Escape hatch&lt;/h2&gt;

&lt;p&gt;For teams where the cloud upload is opt-in only:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight shell"&gt;&lt;code&gt;&lt;span class="nv"&gt;CLOUDREPORT_DISABLE&lt;/span&gt;&lt;span class="o"&gt;=&lt;/span&gt;1 pytest
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;This hard-disables the plugin without uninstalling it — useful if you want it enabled by default in CI but off for local dev.&lt;/p&gt;
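&lt;p&gt;A sketch of that gating logic as described in this post. The exact truthy values accepted for &lt;code&gt;CLOUDREPORT_DISABLE&lt;/code&gt; are an assumption here; check the project README:&lt;/p&gt;

```python
import os

def plugin_enabled():
    # CLOUDREPORT_DISABLE=1 wins over everything; otherwise the cloud
    # upload is on whenever an API key is present. (Sketch of the
    # gating described above; accepting only the literal "1" is an
    # assumption, not confirmed behaviour.)
    if os.environ.get("CLOUDREPORT_DISABLE") == "1":
        return False
    return os.environ.get("PYTEST_CLOUD_API_KEY") is not None
```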

&lt;h2&gt;Environment labelling&lt;/h2&gt;

&lt;p&gt;Tag runs by environment so you can compare CI vs staging vs production results:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight ini"&gt;&lt;code&gt;&lt;span class="c"&gt;# pytest.ini
&lt;/span&gt;&lt;span class="nn"&gt;[pytest]&lt;/span&gt;
&lt;span class="py"&gt;cloudreport_environment&lt;/span&gt; &lt;span class="p"&gt;=&lt;/span&gt; &lt;span class="s"&gt;staging&lt;/span&gt;
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;Or via env var: &lt;code&gt;PYTEST_CLOUD_ENV=staging&lt;/code&gt;.&lt;/p&gt;
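&lt;p&gt;Assuming the environment variable takes precedence over the ini setting, which the post does not state explicitly, the label resolution might look like:&lt;/p&gt;

```python
import os

def resolve_environment(ini_value=None):
    # Assumed precedence for the run label: PYTEST_CLOUD_ENV overrides
    # the cloudreport_environment ini setting, falling back to
    # "default" when neither is set. (The precedence order is an
    # assumption; this sketch picks env-var-wins.)
    return os.environ.get("PYTEST_CLOUD_ENV") or ini_value or "default"
```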

&lt;h2&gt;Getting started&lt;/h2&gt;



&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight shell"&gt;&lt;code&gt;pip &lt;span class="nb"&gt;install &lt;/span&gt;pytest-cloudreport

&lt;span class="c"&gt;# Local report only&lt;/span&gt;
pytest &lt;span class="nt"&gt;--cloudreport-local&lt;/span&gt;

&lt;span class="c"&gt;# With cross-run history (local SQLite)&lt;/span&gt;
pytest &lt;span class="nt"&gt;--cloudreport-local&lt;/span&gt; &lt;span class="nt"&gt;--accumulate&lt;/span&gt;

&lt;span class="c"&gt;# With cloud dashboard&lt;/span&gt;
&lt;span class="nb"&gt;export &lt;/span&gt;&lt;span class="nv"&gt;PYTEST_CLOUD_API_KEY&lt;/span&gt;&lt;span class="o"&gt;=&lt;/span&gt;your_key
pytest
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;Free tier covers 1 project and 5,000 tests per day — enough for most projects without a credit card.&lt;/p&gt;

&lt;p&gt;Source: &lt;a href="https://github.com/ahmad212o/pytest-cloudreport" rel="noopener noreferrer"&gt;github.com/ahmad212o/pytest-cloudreport&lt;/a&gt;&lt;br&gt;
PyPI: &lt;a href="https://pypi.org/project/pytest-cloudreport/" rel="noopener noreferrer"&gt;pytest-cloudreport&lt;/a&gt;&lt;br&gt;
Dashboard: &lt;a href="https://cloudreport.dev" rel="noopener noreferrer"&gt;cloudreport.dev&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;Feedback welcome — especially on the flaky-test heuristic and any edge cases you hit with unusual pytest setups.&lt;/p&gt;

</description>
      <category>python</category>
      <category>pytest</category>
      <category>testing</category>
      <category>devops</category>
    </item>
  </channel>
</rss>
