When I first built my projects page, I pulled data directly from the GitHub API. It worked fine locally, but once I deployed it to a static hosting platform, a problem appeared: API rate limits.
GitHub’s REST API allows only 60 unauthenticated requests per hour per IP address. On static hosting (like GitHub Pages), there’s no server-side environment to hide API keys or manage authentication, so every visitor’s browser hit the GitHub API directly. A handful of visitors was enough to exhaust the limit and break the project feed.
This challenge made me rethink the architecture. Instead of relying on live API calls, I built a static JSON caching system:
- A Node.js script (`update-projects.js`) fetches my repositories from the GitHub API and saves the data to `static/projects.json` (see the first sketch after this list).
- The frontend loads data from this JSON file instead of calling the API directly (second sketch below).
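Here is a minimal sketch of what that script can look like. The username, the fields I keep, and the optional `GITHUB_TOKEN` environment variable are illustrative assumptions, not my exact implementation. It targets Node 18+, which ships a global `fetch`:

```js
// update-projects.js — a minimal sketch; username and fields are placeholders.
const fs = require("fs");

const USERNAME = "your-github-username"; // hypothetical placeholder
const OUT_FILE = "static/projects.json";

async function main() {
  const res = await fetch(
    `https://api.github.com/users/${USERNAME}/repos?sort=updated&per_page=100`,
    {
      headers: {
        Accept: "application/vnd.github+json",
        // In CI, the workflow's built-in token raises the rate limit;
        // the script still works unauthenticated for local testing.
        ...(process.env.GITHUB_TOKEN && {
          Authorization: `Bearer ${process.env.GITHUB_TOKEN}`,
        }),
      },
    }
  );
  if (!res.ok) throw new Error(`GitHub API responded ${res.status}`);

  const repos = await res.json();
  // Keep only the fields the frontend needs, so the JSON stays small
  // and diffs are easy to review.
  const projects = repos.map((r) => ({
    name: r.name,
    description: r.description,
    url: r.html_url,
    stars: r.stargazers_count,
    language: r.language,
  }));

  fs.writeFileSync(OUT_FILE, JSON.stringify(projects, null, 2) + "\n");
  console.log(`Wrote ${projects.length} projects to ${OUT_FILE}`);
}

main().catch((err) => {
  console.error(err);
  process.exit(1);
});
```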
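On the frontend, loading the cached file is then just a `fetch` of a static asset. The container id and rendering below are hypothetical, not my actual markup:

```js
// Frontend sketch: read the cached JSON instead of hitting the GitHub API.
async function loadProjects() {
  const res = await fetch("/projects.json"); // served from static/
  const projects = await res.json();
  const list = document.getElementById("projects"); // hypothetical container
  list.innerHTML = projects
    .map((p) => `<li><a href="${p.url}">${p.name}</a>: ${p.description ?? ""}</li>`)
    .join("");
}
loadProjects();
```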
To keep the data fresh, I added a GitHub Actions workflow (sketched after this list) that:
- Runs every Monday at midnight UTC.
- Fetches the latest repositories.
- Updates static/projects.json only if changes are detected.
- Commits and pushes updates back to the repo, which triggers redeployment.
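A sketch of such a workflow, assuming the script lives at the repo root and the job uses the built-in `GITHUB_TOKEN`; file names and the commit message are placeholders, not my exact configuration:

```yaml
# .github/workflows/update-projects.yml — illustrative sketch
name: Update projects feed

on:
  schedule:
    - cron: "0 0 * * 1" # every Monday at 00:00 UTC
  workflow_dispatch: # allow manual runs too

jobs:
  update:
    runs-on: ubuntu-latest
    permissions:
      contents: write # needed to push the updated JSON
    steps:
      - uses: actions/checkout@v4
      - uses: actions/setup-node@v4
        with:
          node-version: 20
      - run: node update-projects.js
        env:
          GITHUB_TOKEN: ${{ secrets.GITHUB_TOKEN }}
      - name: Commit and push if changed
        run: |
          if ! git diff --quiet static/projects.json; then
            git config user.name "github-actions[bot]"
            git config user.email "github-actions[bot]@users.noreply.github.com"
            git add static/projects.json
            git commit -m "chore: refresh projects.json"
            git push
          fi
```

The `git diff --quiet` guard is what keeps the repo quiet: the job only commits when `static/projects.json` actually changed, so the deploy pipeline isn't triggered on weeks with no repository activity.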
This approach gives me the best of both worlds:
- Fast, reliable static hosting without API failures.
- Fresh data automatically updated on schedule.
- Zero risk of hitting API limits for visitors.
What started as a limitation turned into a neat solution that blends static hosting with automation.
Check out my portfolio at gitongacode.co.ke.