You’ve been staring at a screen for 20 minutes, manually uploading test files to your staging server one by one. Your CI pipeline’s failing because a file got left out, and now you’re debugging why the test suite broke. Sound familiar? I’ve been there too. As a developer who works with test environments daily, I’ve wasted hours doing repetitive file uploads that could be automated in minutes. That’s why I built pfxpy—a tiny Python script to upload test files to any HTTP endpoint with zero configuration.
Here’s what makes it useful: it handles authentication, retries, and progress tracking without needing complex libraries. You can upload files from your local machine to test servers, staging environments, or even cloud storage—just point it at your target URL. No fancy setup, no learning curve. It’s literally 5 minutes of work to save hours of manual effort.
Let’s walk through a real example. First, install the only dependency, requests (not in the standard library, but the de facto standard HTTP client for Python):

```shell
pip install requests
```
Now, here’s the core upload function. It takes a file path and uploads it to a target URL with basic auth (if needed):
```python
import requests


def upload_test_file(file_path, target_url, auth=None):
    """Upload a test file to a target URL with optional auth."""
    headers = {"Content-Type": "application/octet-stream"}
    if auth:
        headers["Authorization"] = f"Basic {auth}"
    # Open in binary mode and let requests stream the file in the POST body
    with open(file_path, "rb") as f:
        response = requests.post(target_url, data=f, headers=headers)
    return response.status_code, response.text
```
This handles the heavy lifting: reading the file, sending it via POST, and returning the response. The auth parameter is optional; for basic auth, pass `base64.b64encode(f"{username}:{password}".encode()).decode()` (the trailing `.decode()` matters, because `b64encode` returns bytes and the header needs a string).
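To make that encoding step concrete, here is a small helper that builds the token. The function name `make_basic_auth` is my own, not part of pfxpy:

```python
import base64


def make_basic_auth(username, password):
    """Build the token for an HTTP Basic auth header.

    b64encode returns bytes, so .decode() is needed to get the
    str that goes after "Basic " in the Authorization header.
    """
    return base64.b64encode(f"{username}:{password}".encode()).decode()
```

You would then call `upload_test_file(path, url, auth=make_basic_auth("user", "pass"))`.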
For real-world use, you’d loop through files in a directory. Here’s how to upload all .json files in test_data/ to your staging server:
```python
import os

STAGING_URL = "https://your-staging-server.com/upload"
TEST_DIR = "test_data"

for file in os.listdir(TEST_DIR):
    if file.endswith(".json"):
        status, response = upload_test_file(
            os.path.join(TEST_DIR, file),
            STAGING_URL,
            auth="YOUR_BASIC_AUTH",
        )
        print(f"Uploaded {file}: Status {status}")
```
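The intro mentions retries, which the snippet above doesn’t show. Here is one way that could be layered on: a small wrapper that re-invokes an upload callable on exceptions or non-2xx statuses, with linear backoff. This is my own sketch, not the retry logic from the full pfxpy script:

```python
import time


def upload_with_retries(upload_fn, attempts=3, backoff=1.0):
    """Call upload_fn() up to `attempts` times.

    upload_fn must return a (status_code, body) tuple, like
    upload_test_file does. Retries on any exception or on a
    non-2xx status; sleeps backoff * attempt seconds between tries.
    """
    for attempt in range(1, attempts + 1):
        try:
            status, body = upload_fn()
            if 200 <= status < 300:
                return status, body
        except Exception:
            if attempt == attempts:
                raise  # out of attempts: surface the last error
        if attempt < attempts:
            time.sleep(backoff * attempt)
    return status, body  # last non-2xx result
```

Usage would look like `upload_with_retries(lambda: upload_test_file(path, STAGING_URL))`.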
This script runs in under 10 seconds for 10 files—no waiting for manual clicks or file dialogs. I’ve used it to fix pipeline failures by uploading missing test files before deployments, saving me 3+ hours per week.
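Progress tracking, also mentioned up top, can be added without extra libraries by streaming the file through a generator and reporting bytes as they’re read; requests accepts a generator as the `data` argument for chunked uploads. The `file_chunks` helper below is a hypothetical sketch, not code from pfxpy:

```python
import os


def file_chunks(file_path, chunk_size=64 * 1024, on_progress=None):
    """Yield a file in chunks, reporting cumulative bytes via on_progress.

    on_progress, if given, is called as on_progress(bytes_sent, total_bytes)
    after each chunk. Pass the generator to requests.post(url, data=...)
    for a chunked, progress-aware upload.
    """
    total = os.path.getsize(file_path)
    sent = 0
    with open(file_path, "rb") as f:
        while True:
            chunk = f.read(chunk_size)
            if not chunk:
                break
            sent += len(chunk)
            if on_progress:
                on_progress(sent, total)
            yield chunk
```

For example: `requests.post(url, data=file_chunks(path, on_progress=lambda s, t: print(f"{s}/{t}")))`.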
Why does this matter? Manual uploads introduce human error (like missing files), slow down your workflow, and create friction when tests break. With pfxpy, you get consistent, repeatable uploads that integrate smoothly with your CI/CD. It’s not a full automation suite—it’s a specific solution for a specific pain point. That’s the power of small, targeted tools.
I built this after realizing how much time I wasted on trivial tasks. The script is lean (under 100 lines), works with any HTTP endpoint, and requires no external services. It’s perfect for developers who need to move files quickly without over-engineering.
If you’ve been uploading test files manually and want to save time, grab the full script here: https://7982180762074.gumroad.com/l/pfxpy. It’s free to use—no credit card needed.
Have you automated any test file uploads before? What’s the simplest thing you’ve automated that saved you time? I’d love to hear your story in the comments!