Oleg
The Phantom Quota: Reclaiming GitHub Actions Storage for Peak Software Developer Efficiency

In the fast-paced world of software development, optimizing every aspect of our workflow is crucial for maintaining high software developer efficiency. Yet, sometimes, hidden costs and inefficiencies lurk in unexpected corners. A recent discussion on the GitHub Community forum, initiated by user jefflongo, brought to light a perplexing issue: an organization hitting its GitHub Actions storage quota despite not running any actions for months and having artifact expiration configured. This scenario isn't just a minor inconvenience; it's a drain on resources, budget, and ultimately, developer focus.

The Phantom Quota: When "Expired" Doesn't Mean Gone

Jefflongo's dilemma was straightforward: an email warning about nearing their Actions storage quota, with usage reports showing approximately 45 gigabyte-hours per day from a single repository. The confounding factor? This repository hadn't executed any actions in over 10 months. Its few old workflow runs had artifacts explicitly set to expire in one day and were indeed marked as expired in the UI. The logical expectation was zero storage impact, yet the billing meter kept ticking.

The community's responses quickly unraveled this mystery, revealing a nuanced and sometimes frustrating reality of GitHub Actions storage management. It turns out, "expired" doesn't always mean "deleted" in the immediate, billable sense.

A server rack showing an "expired" hard drive still physically present and consuming space, illustrating the ghost file glitch.

Why Your Storage Bill Might Not Reflect Reality

- **Delayed Cleanup Processes:** GitHub, like any large-scale platform, relies on background jobs for cleanup. Even when artifacts are configured to expire, there can be a significant delay between an artifact being marked 'expired' and its physical deletion from storage. During this limbo period, the data still occupies disk space and, critically, counts against your quota.

- **The "Ghost File" Glitch:** As highlighted by community member BryanBradfo, a known backend issue exists where GitHub's UI correctly marks artifacts as "expired," but the underlying physical files remain on the server. This means the billing system continues to count these "ghost files" against your quota, directly impacting your organization's budget and perceived control over your **development productivity tools**.

- **Beyond Artifacts:** The storage quota isn't solely for build artifacts. Workflow logs, caches, and other metadata associated with workflow runs also consume space. These elements can persist even after artifacts are gone or marked expired, contributing to the cumulative storage footprint.

- **Time-Weighted Billing:** As ash-iiiiish noted, artifact storage billing is based on time-weighted usage, not just point-in-time snapshots. Deleting artifacts stops *future* accrual but doesn't retroactively erase the storage already consumed up to that point. This makes timely cleanup even more critical.
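The time-weighted point is worth making concrete. A minimal sketch, using a simplified hypothetical accrual model (GitHub's actual billing granularity and rounding may differ):

```python
# Simplified, hypothetical model of time-weighted storage accrual.
# Illustrates why deleting an artifact stops *future* accrual but
# does not erase the GB-hours already billed.
from datetime import datetime

def billed_gb_hours(size_gb: float, created: datetime, deleted: datetime) -> float:
    """GB-hours accrued by one artifact between creation and deletion."""
    hours_stored = (deleted - created).total_seconds() / 3600
    return size_gb * hours_stored

# A 2 GB artifact kept for one full day accrues 2 * 24 = 48 GB-hours.
# Deleting it now prevents tomorrow's 48 GB-hours, but today's are spent.
accrued = billed_gb_hours(2, datetime(2024, 1, 1), datetime(2024, 1, 2))
print(accrued)  # 48.0
```

This is why a "ghost file" that lingers for weeks after expiring keeps inflating the bill: the clock runs until the physical deletion, not the expiry marker.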

Reclaiming Your Space: Immediate Actions and Proactive Strategies

For dev teams, product managers, and CTOs focused on optimizing developer OKRs and ensuring efficient resource allocation, understanding and mitigating these storage anomalies is paramount. The good news is that while the problem can be perplexing, the solutions are actionable.

Immediate Resolution: Manual Intervention

When faced with an immediate quota warning due to "ghost files" or delayed cleanup, manual intervention is often the fastest path to resolution. As several community members advised, deleting old workflow runs directly forces the system to purge associated artifacts and logs:

- **Via GitHub UI:** Navigate to the Actions tab in the affected repository. Click on each old workflow run and use the delete option. This process removes associated artifacts and logs.

- **Using GitHub CLI:** For repositories with numerous old runs, manual deletion via the UI can be tedious. The GitHub CLI offers a more efficient way to bulk delete stale workflow data. A command like `gh run list --status completed --limit 500 --json databaseId,createdAt | jq '.[] | select(.createdAt < "2023-01-01T00:00:00Z") | .databaseId' | xargs -I {} gh run delete {}` can be adapted to target runs older than a specific date (note that `gh run list` returns only 20 runs by default, so raise `--limit` to cover older history).

After deletion, expect a delay of 6-24 hours for GitHub's storage recalculation to complete. Your usage should drop once the system recognizes the deletions.
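For readability, the date-filtering step of that CLI pipeline can be sketched in plain Python. This assumes run records carry ISO-8601 `createdAt` timestamps, as the CLI's `--json` output does:

```python
# Sketch: select workflow-run IDs older than a cutoff, mirroring the
# jq filter used with `gh run list`.
from datetime import datetime, timezone

def stale_run_ids(runs: list[dict], cutoff: datetime) -> list[int]:
    """Return databaseId values for runs created before the cutoff."""
    stale = []
    for run in runs:
        # Normalize the trailing "Z" so fromisoformat accepts the timestamp.
        created = datetime.fromisoformat(run["createdAt"].replace("Z", "+00:00"))
        if created < cutoff:
            stale.append(run["databaseId"])
    return stale

runs = [
    {"databaseId": 101, "createdAt": "2022-06-01T12:00:00Z"},
    {"databaseId": 102, "createdAt": "2023-05-01T12:00:00Z"},
]
cutoff = datetime(2023, 1, 1, tzinfo=timezone.utc)
print(stale_run_ids(runs, cutoff))  # [101]
```

Each returned ID can then be fed to `gh run delete`, which removes the run along with its artifacts and logs.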

A developer using a command-line interface to delete old workflow runs, with symbols of cleared cloud storage and set retention policies.

Proactive Prevention: Setting Up for Sustainable Efficiency

While manual deletion is a good reactive measure, true software developer efficiency comes from proactive management. Implement these strategies to prevent future phantom quota issues:

- **Aggressive Organization-Level Retention Policies:** Don't rely solely on per-workflow settings, which can be inconsistent. Configure aggressive artifact and log retention policies at the organization level. Go to Organization Settings → Actions → General and adjust "Artifact and log retention" to the shortest practical duration for your needs (e.g., 7-14 days). This ensures consistency across all repositories.

- **Regular Cache Clearing:** GitHub Actions caches can accumulate significant data over time. Periodically review and clear stale caches to free up space. While not directly billed as "storage," large caches can impact performance and contribute to your overall data footprint.

- **Consider Alternative Storage for Large Outputs:** For build outputs or artifacts that need longer retention or are exceptionally large, GitHub Actions storage might not be the most cost-effective solution. Explore alternatives such as:

  - **GitHub Releases:** Ideal for storing production binaries and release assets.

  - **Cloud Storage:** Integrate with AWS S3, Google Cloud Storage, Azure Blob Storage, or similar services for long-term, cost-optimized storage of large artifacts.

- **Monitor Usage Regularly:** Don't wait for an email warning. Regularly review your organization's GitHub Actions usage reports to identify repositories with unexpectedly high storage consumption. Early detection can prevent significant costs.
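Monitoring can start with the Actions artifacts API (`GET /repos/{owner}/{repo}/actions/artifacts`). A minimal sketch that tallies a response into live versus expired-but-still-listed storage; the field names (`size_in_bytes`, `expired`) follow the documented response shape, but verify them against the API version you use:

```python
# Sketch: tally artifact storage from an Actions artifacts API response.
def storage_report(artifacts: list[dict]) -> dict:
    """Total bytes held, split into live vs expired-but-still-listed artifacts."""
    report = {"live_bytes": 0, "expired_bytes": 0}
    for artifact in artifacts:
        key = "expired_bytes" if artifact.get("expired") else "live_bytes"
        report[key] += artifact.get("size_in_bytes", 0)
    return report

# Hypothetical sample payload for illustration.
sample = [
    {"name": "build", "size_in_bytes": 5_000_000, "expired": False},
    {"name": "old-build", "size_in_bytes": 9_000_000, "expired": True},
]
print(storage_report(sample))  # {'live_bytes': 5000000, 'expired_bytes': 9000000}
```

A large `expired_bytes` figure alongside a climbing bill is exactly the "ghost file" signature described above, and a strong signal to start deleting old runs manually.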

Beyond the Quota: A Call to Action for Technical Leaders

For CTOs, delivery managers, and engineering leaders, this discussion underscores the importance of not just enabling development productivity tools but also understanding their operational nuances. Implementing robust policies around GitHub Actions storage isn't just about saving money; it's about fostering a culture of resource awareness and preventing distractions that pull developers away from their core tasks.

By proactively managing GitHub Actions storage, teams can ensure their CI/CD pipelines remain efficient, predictable, and cost-effective, directly contributing to higher software developer efficiency and better alignment with strategic developer OKRs. Don't let phantom quotas haunt your budget or your team's productivity.
