DEV Community

Darian Vance

Posted on • Originally published at wp.me

Solved: Where can I host an API for free so a friend can pentest it?

🚀 Executive Summary

TL;DR: Exposing a local API for external pentesting is challenging due to the localhost bubble and Network Address Translation (NAT). This guide presents three effective methods: quick tunneling with ngrok, utilizing cloud free tiers for a realistic environment, or deploying via serverless functions for a modern, cost-effective solution.

🎯 Key Takeaways

  • Local APIs running on localhost (127.0.0.1) are inherently inaccessible from the external internet due to the loopback address and Network Address Translation (NAT) performed by routers.
  • Tunneling services like ngrok provide a rapid, temporary public URL for local APIs, ideal for quick demos or one-off testing, but require the local machine to remain active and generate new URLs per session.
  • Cloud free tiers (e.g., AWS EC2 t2.micro) offer a more realistic, persistent sandbox environment for API hosting, while serverless functions (e.g., AWS Lambda, Cloudflare Workers) provide a highly scalable and cost-effective solution with no server management.

Exposing a local API for a friend to pentest seems simple, but network barriers make it tricky. Learn three effective methods: quick tunneling with ngrok, using cloud free tiers for a realistic setup, or going serverless for a modern, cost-effective solution.

So, You Need to Host an API for a Friend to Pentest? A Senior Engineer’s Guide.

I still remember the day. A junior engineer, brilliant kid, comes to my desk absolutely defeated. He’d spent six hours trying to get a firewall change request approved by corporate security just so our new product manager, who was working from home, could see the new API endpoint he’d built. He was wrestling with VPN clients, internal DNS, and ticketing systems. I walked him through setting up a tunnel in five minutes, and the look on his face was a mix of relief and “Why didn’t anyone tell me this before?” It’s a classic problem: your code runs perfectly on localhost:8000, but to the rest of the world, that address might as well be on the moon.

The “Why”: Understanding the “localhost” Bubble

Before we dive into the fixes, let’s quickly get on the same page. When you run a server on your machine, it binds to an address like localhost or 127.0.0.1. This is a special loopback address that only your computer understands. It’s designed for local development and can’t be reached from the outside internet.

Your machine is also sitting behind a router (your home WiFi, the office network) which uses a technology called Network Address Translation (NAT). This NAT gateway acts like a receptionist for all the devices on your network, giving them a single public IP address to share. Without specific instructions (port forwarding), the router has no idea that a request from your friend should be sent to your laptop’s port 8000. It just drops the request. This is a security feature, but it’s a pain when you need to share something.

So, the challenge is simple: how do we give your friend a public, internet-accessible URL that securely forwards traffic to the little server running in your localhost bubble?
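To see the loopback bubble concretely, here's a small Python sketch contrasting the two bind addresses. It binds one socket to `127.0.0.1` (reachable only from this machine) and one to `0.0.0.0` (all interfaces, though NAT still blocks outside traffic without forwarding or a tunnel):

```python
import socket

# A server bound to 127.0.0.1 only accepts connections that
# originate on this same machine (the loopback interface).
loopback = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
loopback.bind(("127.0.0.1", 0))  # port 0 = let the OS pick a free port
loopback.listen()
print("loopback server:", loopback.getsockname())

# 0.0.0.0 tells the OS to accept connections on every network
# interface -- but behind NAT, the outside world still can't
# reach it without port forwarding or a tunnel.
all_ifaces = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
all_ifaces.bind(("0.0.0.0", 0))
all_ifaces.listen()
print("all-interfaces server:", all_ifaces.getsockname())

loopback.close()
all_ifaces.close()
```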

Solution 1: The Quick & Dirty Fix (Tunneling)

This is my go-to for quick demos, webhooks, or exactly this kind of one-off testing scenario. We use a service that creates a secure tunnel from a public endpoint to your local machine. The most popular tool for this is ngrok.

How it works: You run a small command-line tool that connects to the ngrok cloud service. It gives you a unique public URL (like https://random-string.ngrok.io). Any traffic sent to that URL is securely tunneled to your local server.

Steps:

  1. Download ngrok and unzip it.
  2. Run your API locally (e.g., on port 8000).
  3. Open your terminal and run the command:
./ngrok http 8000

Ngrok will print a public URL. Send that to your friend. Done. When you’re finished, just close the ngrok process (Ctrl+C) and the tunnel is gone.
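The steps above assume you already have an API listening on port 8000. If you just need something to point ngrok at, here's a minimal stand-in using only Python's standard library (a sketch; a Flask or Express app on the same port works identically):

```python
import json
from http.server import BaseHTTPRequestHandler, HTTPServer

class ApiHandler(BaseHTTPRequestHandler):
    def do_GET(self):
        # Respond to every GET with a small JSON payload.
        body = json.dumps({"status": "ok", "path": self.path}).encode()
        self.send_response(200)
        self.send_header("Content-Type", "application/json")
        self.send_header("Content-Length", str(len(body)))
        self.end_headers()
        self.wfile.write(body)

def run(port=8000):
    # Listens on localhost:8000 -- the port you pass to `./ngrok http 8000`.
    HTTPServer(("127.0.0.1", port), ApiHandler).serve_forever()
```

Call `run()` in one terminal, start ngrok in another, and your friend can hit the public URL.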

Pro Tip: Be aware that the free version of ngrok generates a new random URL every time you start it. It’s perfect for temporary work but not for anything permanent. Also, remember you are punching a hole into your local machine—only do this with people you trust and shut it down when you’re done.

Solution 2: The “Real World” Sandbox (Cloud Free Tiers)

If you want to simulate a more realistic production environment, or need something that stays online longer, you can’t beat the free tiers from major cloud providers. You’re essentially spinning up a tiny virtual server in the cloud that has its own public IP address.

My recommendation is usually an AWS EC2 t2.micro or t3.micro instance, as both are covered by AWS's free tier for the first 12 months. Google Cloud (e2-micro) and Azure have similar offerings.

How it works: You’ll launch a virtual machine, configure its firewall (called a “Security Group” in AWS), install your application’s dependencies, and run your API server there.

Basic AWS EC2 Steps:

  1. Sign up for an AWS Free Tier account.
  2. Navigate to the EC2 service and click “Launch Instance”.
  3. Choose an Amazon Machine Image (AMI), like “Ubuntu Server”.
  4. Select an instance type eligible for the free tier (e.g., t2.micro).
  5. In the “Network Settings” or “Security Group” section, you MUST add a rule to allow inbound traffic. For an API on port 8000, you’d add a “Custom TCP” rule for Port Range “8000” from Source “Anywhere” (0.0.0.0/0).
  6. Launch the instance, connect to it via SSH, install your code (e.g., using Git), and run your server.

Your friend can now hit your instance’s public IP address (e.g., http://54.12.34.56:8000). This is a much better simulation of a real deployment.
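One gotcha worth calling out at step 6: the Security Group only opens the cloud-side firewall. Your server process must also bind to 0.0.0.0 rather than 127.0.0.1, or requests arriving at the public IP will still be refused. A minimal sketch in Python's standard library:

```python
from http.server import BaseHTTPRequestHandler, HTTPServer

class PingHandler(BaseHTTPRequestHandler):
    def do_GET(self):
        self.send_response(200)
        self.send_header("Content-Type", "text/plain")
        self.end_headers()
        self.wfile.write(b"pong\n")

def serve(host="0.0.0.0", port=8000):
    # "0.0.0.0" listens on every interface, so traffic hitting the
    # instance's public IP reaches the server. Binding to "127.0.0.1"
    # would silently reject it even with the Security Group open.
    HTTPServer((host, port), PingHandler).serve_forever()
```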

Solution 3: The Serverless Hero (Functions-as-a-Service)

This is the modern, and often cheapest, way to do it. Instead of managing a whole server, you just upload your code as a “function” and the cloud provider handles everything else. It’s incredibly cost-effective because you only pay for the milliseconds your code is actually running, and the free tiers are often so generous you’ll likely pay nothing.

Options include AWS Lambda with API Gateway, Google Cloud Functions, or my personal favorite for simplicity, Cloudflare Workers.

How it works (using Cloudflare Workers as an example):

  1. Sign up for a free Cloudflare account.
  2. Use their command-line tool, wrangler, to create a new project.
  3. Write your API logic in a single file (usually JavaScript/TypeScript).
  4. Run one command to deploy:
npx wrangler deploy

Cloudflare instantly gives you a public URL. Your function is now running on their global edge network. There’s no server to patch, no OS to worry about, and it scales automatically.

Warning: This requires a shift in mindset. You’re not running a persistent server process like you do with Express or Flask. You’re writing stateless functions that respond to HTTP requests. It’s a different way of building, but for a simple API, it’s unbelievably powerful and efficient.
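To make that stateless mindset concrete, here's roughly what a handler looks like for AWS Lambda behind API Gateway's proxy integration (a Python sketch; Cloudflare Workers applies the same idea with a JavaScript `fetch` handler):

```python
import json

def handler(event, context):
    # Each invocation is independent: there is no long-running server
    # process and no in-memory state you can rely on between requests.
    name = (event.get("queryStringParameters") or {}).get("name", "world")
    return {
        "statusCode": 200,
        "headers": {"Content-Type": "application/json"},
        "body": json.dumps({"message": f"hello, {name}"}),
    }
```

Notice there's no `listen()` or port anywhere: the platform receives the HTTP request, invokes your function with it, and turns the returned dict back into a response.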

Which Path Should You Choose? A Quick Comparison

Here’s how I break it down for my team when they ask:

| Solution | Best For | Pros | Cons |
| --- | --- | --- | --- |
| Tunneling (ngrok) | Quick, temporary sharing & demos | 5-minute setup; no server config needed; free for basic use | Temporary URL; relies on your local machine being on; not a production setup |
| Cloud VM (EC2) | Realistic testing & longer-term projects | Simulates production; stable public IP; full control over the environment | More complex setup (firewalls, SSH); easy to forget and get billed after the free tier |
| Serverless (Lambda/Workers) | Modern APIs, cost-sensitive projects | Extremely cheap/free; scales automatically; no server management | Different programming model (stateless); steeper initial learning curve |

There’s no single “right” answer. If your friend is waiting right now, use ngrok. If you’re building a portfolio piece you want to show off for a few months, use a free-tier EC2 instance. If you’re learning modern cloud-native development, dive headfirst into serverless. Just don’t spend six hours fighting a firewall ticket.



👉 Read the original article on TechResolve.blog


☕ Support my work

If this article helped you, you can buy me a coffee:

👉 https://buymeacoffee.com/darianvance
