Ever spent hours manually uploading test files to staging environments only to have them fail in CI pipelines? I've been there: my dev team used to spend 20 minutes per test cycle just moving CSVs between servers. Manual uploads are error-prone, slow, and constantly break your flow. That's why I built this tiny Python script to handle test file uploads automatically.
Here's how it works: the script uses HTTP requests to push files to a staging API endpoint. No fancy libraries, just requests (install it with pip install requests). It sends an authentication token as a Bearer header and retries failed uploads with exponential backoff. I wrote it because I needed to skip the manual steps in my own pipeline, and it's been a lifesaver for my team.
First, here’s a core function that uploads files with basic error handling:
```python
import requests
import time

def upload_test_file(file_path, api_url, api_key, max_retries=3):
    """Upload a test file to the staging API, retrying with exponential backoff."""
    headers = {"Authorization": f"Bearer {api_key}"}
    for attempt in range(max_retries):
        try:
            with open(file_path, "rb") as f:
                response = requests.post(
                    api_url,
                    headers=headers,
                    files={"file": f},
                    timeout=30,
                )
            response.raise_for_status()
            return response.json()
        except requests.RequestException as e:
            print(f"Upload failed (attempt {attempt + 1}/{max_retries}): {e}")
            time.sleep(2 ** attempt)  # Exponential backoff: 1s, 2s, 4s...
    raise RuntimeError(f"Giving up on {file_path} after {max_retries} attempts")
```
This handles authentication via a Bearer token (replace api_key with your actual token), reads files in binary mode, and retries on failure, which is critical on unstable networks. The exponential backoff between retries is a small but useful touch to avoid tripping API rate limits.
For real-world usage, here’s a demo script that uploads a test CSV:
```python
if __name__ == "__main__":
    upload_test_file(
        file_path="tests/sample_data.csv",
        api_url="https://your-staging-api.com/upload",
        api_key="your_actual_api_key_here",
    )
```
You can add more features later—like validating file types or logging—without rewriting the whole script. I kept it simple because most teams need just this: a reliable way to move test files without manual intervention.
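As one example of that kind of extension, here's a minimal sketch of a validation step you could call before uploading. The allowed extensions and logger name are my own assumptions for illustration, not part of the original script:

```python
import logging
from pathlib import Path

logging.basicConfig(level=logging.INFO)
logger = logging.getLogger("test_uploader")  # hypothetical logger name

# Assumed allow-list; adjust to whatever your staging API accepts
ALLOWED_EXTENSIONS = {".csv", ".json", ".png"}

def validate_test_file(file_path):
    """Return True if the file exists and has an allowed extension."""
    path = Path(file_path)
    if not path.is_file():
        logger.error("File not found: %s", path)
        return False
    if path.suffix.lower() not in ALLOWED_EXTENSIONS:
        logger.error("Unsupported file type: %s", path.suffix)
        return False
    logger.info("Validated %s (%d bytes)", path, path.stat().st_size)
    return True
```

Checking before the upload fails fast on typo'd paths instead of burning retries on a request that can never succeed.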
Why does this matter? In CI/CD pipelines, test uploads often block progress. This script cuts that time from 20+ minutes to seconds. It’s especially useful for:
- Teams using Git-based workflows (upload test files after git push)
- Projects with frequent test cycles (no more manual file transfers)
- Avoiding human errors (like wrong file paths or permissions)
I’ve tested it with CSVs, JSON, and small images—no special handling needed. It’s not a replacement for your full CI pipeline, but it solves a specific pain point: reducing manual file transfers in test environments.
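If you want to push a whole folder of mixed test files in one go, a thin wrapper is enough. This is a sketch under my own assumptions: the extension allow-list is illustrative, and upload_fn is injected so you can pass in the upload function with your own URL and key already bound:

```python
from pathlib import Path

def collect_test_files(directory, extensions=(".csv", ".json", ".png")):
    """Gather test files with the given extensions, sorted for a stable order."""
    return sorted(
        p for p in Path(directory).iterdir()
        if p.is_file() and p.suffix.lower() in extensions
    )

def upload_all(directory, upload_fn):
    """Apply upload_fn to every collected file; map each path to its result."""
    return {str(p): upload_fn(str(p)) for p in collect_test_files(directory)}
```

Injecting upload_fn (e.g. lambda p: upload_test_file(p, api_url, api_key)) keeps the directory walk testable without touching the network.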
This is a tiny script I built for my own workflow, but it's become a go-to tool for junior devs and automation beginners. If you've ever struggled with similar manual steps, this could save you hours. Grab the full script here: https://7982180762074.gumroad.com/l/kowerv
Have you automated any file uploads in your workflow? What’s the simplest thing you’ve built to save time? Share below—I’d love to hear your story!