Update (March 2026): Since this article was published, I have been using Claude Code + n8n-MCP to build and debug n8n workflows directly from the terminal. The combo makes the FastAPI approach below even more powerful: Claude Code scaffolds the workflow, n8n runs it, Python handles the heavy lifting. I wrote about the full setup here.
Sometimes, you need to call a Python script from an n8n workflow. While n8n provides a Code node that supports Python, its implementation is limited. If you want to leverage the full power of Python — including external libraries, complex computations, or machine learning models — you need a dedicated Python environment.
TL;DR: Use FastAPI to wrap Python scripts as API endpoints, containerize with Docker, and call from n8n workflows via HTTP requests. This bypasses n8n's limited Python Code node to access full Python capabilities including external libraries.
In this guide, I’ll show you how to integrate a Python script into an n8n workflow by using FastAPI to create an API endpoint. Whether you want to process images, manipulate data, or perform any custom task, this approach lets you harness Python’s power within n8n’s automation capabilities.
We’ll cover:
- Wrapping a Python script in a FastAPI application.
- Containerizing it with Docker.
- Calling it from an n8n workflow.
By the end, you’ll have a working setup where n8n triggers your Python script via an HTTP request.
Prerequisites
Before starting, ensure you have:
- Docker installed on your machine or server.
- n8n running (preferably in Docker, though other setups work too).
- Basic familiarity with Python, Docker, and n8n workflows.
- A jar of Tiger Balm to boost courage.
Step 1: Create Your Python Script
Let’s start with a simple Python script. For this example, we’ll use an image resizing script, but you can swap it out for any Python logic you need.
Example Script: resize.py
This script resizes an image to a 1000x1000 square while maintaining its aspect ratio and adding a white background.
I chose this example because the image resize node in n8n isn’t working properly at the moment. n8n Guys 😵💫, you need to fix this!
```python
from PIL import Image
import argparse

def resize_image(input_path, output_path, size=1000):
    # Open the original image
    original_image = Image.open(input_path)
    width, height = original_image.size

    # Calculate new dimensions while preserving aspect ratio
    ratio = min(size / width, size / height)
    new_width = int(width * ratio)
    new_height = int(height * ratio)

    # Resize the image
    resized_image = original_image.resize((new_width, new_height), Image.LANCZOS)

    # Create a white 1000x1000 background and paste the resized image
    final_image = Image.new("RGB", (size, size), (255, 255, 255))
    offset_x = (size - new_width) // 2
    offset_y = (size - new_height) // 2
    final_image.paste(resized_image, (offset_x, offset_y))

    # Save the result
    final_image.save(output_path)

if __name__ == "__main__":
    parser = argparse.ArgumentParser(description="Resize image to 1000x1000")
    parser.add_argument("input_image", help="Path to input image")
    parser.add_argument("output_image", help="Path to output image")
    args = parser.parse_args()
    resize_image(args.input_image, args.output_image)
```
Save this as resize.py. It takes an input image path and an output path as arguments.
Step 2: Create a FastAPI Wrapper
Instead of just building a FastAPI application, we’re actually wrapping our Python script with an API, making it accessible to n8n. This wrapper will allow n8n to interact with the script via HTTP requests, ensuring better flexibility and scalability.
💡 Another option would have been to use Flask, which is a popular lightweight framework for building APIs. However, we chose FastAPI because it offers asynchronous support, automatic data validation, and built-in OpenAPI documentation, making it faster and more efficient for handling API requests — especially in automation scenarios like n8n.
One of the key advantages of FastAPI is how easy it is to extend. Adding more scripts is as simple as defining new endpoints. For example, if you need another script to apply a grayscale filter to an image, you can just create a new route like /grayscale and call the corresponding script. This modular approach makes it easy to scale your API and integrate multiple Python scripts into your n8n workflow without modifying existing endpoints.
```python
# app.py
from fastapi import FastAPI, UploadFile, File
from fastapi.responses import StreamingResponse
import subprocess
import tempfile
import os
import io

app = FastAPI()

@app.post("/resize")
async def resize_image(file: UploadFile = File(...)):
    # Create temporary files for input and output
    with tempfile.NamedTemporaryFile(delete=False, suffix=".jpg") as input_file:
        input_path = input_file.name
        input_file.write(await file.read())
    with tempfile.NamedTemporaryFile(delete=False, suffix=".jpg") as output_file:
        output_path = output_file.name

    try:
        # Run the resize script
        subprocess.run(["python", "resize.py", input_path, output_path], check=True)
        # Read the resized image
        with open(output_path, "rb") as f:
            image_data = f.read()
    except subprocess.CalledProcessError as e:
        return {"error": f"Failed to resize image: {e}"}
    finally:
        # Clean up temporary files in every case, including failures
        os.remove(input_path)
        os.remove(output_path)

    # Return the image as a response
    return StreamingResponse(io.BytesIO(image_data), media_type="image/jpeg")
```
How It Works:
- The /resize endpoint accepts an uploaded file.
- It saves the file temporarily, calls resize.py using subprocess, and returns the resized image.
- Temporary files are deleted after processing to avoid clutter.
Save this as app.py in the same directory as resize.py.
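As mentioned above, adding another script is just a matter of adding a route. Here's a hedged sketch of what a second endpoint could look like — `grayscale.py` is an assumed script (not part of this article's code) that takes the same `(input_path, output_path)` CLI arguments as resize.py:

```python
# Hypothetical extension of app.py: a second route reusing the same subprocess
# pattern. "grayscale.py" is an assumed script with the same CLI as resize.py.
import subprocess
import sys

def run_cli_script(script: str, input_path: str, output_path: str) -> None:
    """Run a two-argument CLI script in a subprocess; raises CalledProcessError on failure."""
    subprocess.run([sys.executable, script, input_path, output_path], check=True)

# The matching route in app.py would mirror /resize:
#
# @app.post("/grayscale")
# async def grayscale_image(file: UploadFile = File(...)):
#     ...save the upload to input_path, then:
#     run_cli_script("grayscale.py", input_path, output_path)
#     ...and return the result as a StreamingResponse, exactly as /resize does
```

Factoring the subprocess call into a helper like this keeps each new route down to a few lines.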
Note: Reflection on the FastAPI Approach with subprocess vs Python Class 🥊
Jazys pointed out to me: “Well, for a developer, that kind of line stings the eyes. The best would be to create a Python class and call that class. Anyway, when you want to add new APIs someday, you’ll have to modify your app.py, so you might as well integrate Python all the way.”
He’s absolutely right from a purist perspective: using a Python class (like ImageProcessor) to encapsulate the logic, integrated directly into the FastAPI application, enhances robustness, performance, and maintainability, especially for an evolving application with new routes. For a developer seeking a clean, modular, and well-documented solution (with automatic Swagger documentation), this approach is ideal (e.g., importing a class into app.py with reusable methods).
However, as a pragmatic tinkerer and non-developer, I prefer my subprocess solution: it allows me to quickly call independent small scripts without worrying about robustness or scalability, which suits my simple needs. Thanks for the sharp insight, Jazys — your dev wisdom shines through! 😉
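For readers who want to try Jazys's suggestion, here's a minimal sketch of the class-based approach. The class and method names are illustrative (they don't come from the original scripts); only the aspect-ratio math is lifted from resize.py:

```python
# Sketch of the class-based alternative: encapsulate the resize math in a
# reusable class imported by app.py, instead of shelling out to a script.
# "ImageProcessor" and "fit_dimensions" are illustrative names.
class ImageProcessor:
    def __init__(self, size: int = 1000):
        self.size = size

    def fit_dimensions(self, width: int, height: int) -> tuple:
        """Dimensions that fit inside a size x size square, preserving aspect ratio."""
        ratio = min(self.size / width, self.size / height)
        return int(width * ratio), int(height * ratio)

# In app.py, the route would then call the class directly (no subprocess):
#
# processor = ImageProcessor(size=1000)
#
# @app.post("/resize")
# async def resize_image(file: UploadFile = File(...)):
#     ...open the upload with PIL, resize to processor.fit_dimensions(w, h),
#     paste onto a white background, and stream the result back...
```

The in-process version avoids the overhead of spawning an interpreter per request, which matters more as traffic grows.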
Step 3: Define Dependencies
Create a requirements.txt file to list the Python packages we need. (By the way, if you're as lazy as I am, you can generate a requirements.txt automatically by running pip freeze > requirements.txt in your project directory.)
```text
# requirements.txt
fastapi
uvicorn
pillow
python-multipart
```
Save this as requirements.txt.
Step 4: Containerize with Docker
To ensure seamless integration with n8n and maintain consistency across environments, we’ll containerize our FastAPI application using Docker. Since n8n is likely running in a Docker container, we’ll connect our Python API to the same Docker network from the start.
Additionally, we’ll explore two approaches: using a standalone Dockerfile or leveraging docker compose for more complex setups. In specific cases, such as when processing a large number of files, we’ll also discuss using a shared volume.
Network Setup
Both n8n and our Python API need to communicate within the same Docker network. First, identify the network your n8n container is using:
docker inspect <n8n_container_name> | grep Network
Replace <n8n_container_name> with the name of your n8n container (e.g., n8n-akc0oo0ogc0gkog0g8ww). Look for the NetworkID or Networks section to find the network name (e.g., coolify or n8n_default). We'll use this network so the containers can talk to each other using their container names.
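If you'd rather script this lookup than grep through the output, docker inspect emits JSON you can parse. A small sketch (the container name "n8n" is a placeholder):

```python
# Parse `docker inspect` output to list the networks a container is attached to.
import json

def network_names(inspect_json: str) -> list:
    """Return the sorted network names from `docker inspect <container>` JSON."""
    data = json.loads(inspect_json)
    return sorted(data[0]["NetworkSettings"]["Networks"].keys())

if __name__ == "__main__":
    import subprocess
    raw = subprocess.run(
        ["docker", "inspect", "n8n"],  # replace "n8n" with your container name
        capture_output=True, text=True, check=True,
    ).stdout
    print(network_names(raw))
```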
Remark: We could have chosen to expose our Python scripts to the internet by mapping a public port (e.g., -p 8000:8000) and securing it with HTTPS and authentication. However, in this case, we’ve opted to keep them internal to the VPS. This simplifies the setup and avoids security complexities that arise when services are exposed to the world, such as managing firewalls, certificates, reverse proxy, or potential attacks.
Option 1: Using a Standalone Dockerfile
This is the simplest approach, ideal for a single-service setup.
Create a Dockerfile in your project directory:
```dockerfile
FROM python:3.9
WORKDIR /app
COPY requirements.txt .
RUN pip install -r requirements.txt
COPY app.py .
COPY resize.py .
CMD ["uvicorn", "app:app", "--host", "0.0.0.0", "--port", "8000"]
```
Build the Docker image:
docker build -t fastapi-python-app .
Run the Container
Run the container, connecting it to the n8n network (replace <n8n_network> with the actual network name):
docker run -d --name python-api --network <n8n_network> fastapi-python-app
- --name python-api: Names the container for easy reference.
- --network <n8n_network>: Connects it to n8n's network, allowing communication via http://python-api:8000.
The API will be accessible at http://python-api:8000/resize within the network.
Specific Case: Shared Volume
If your workflow involves processing a large number of files (e.g., resizing hundreds of images), you might want to avoid repeatedly transferring files over HTTP. Instead, you can use a shared Docker volume to store files accessible to both n8n and the Python API. For example:
docker run -d --name python-api --network <n8n_network> -v /path/to/shared/folder:/data fastapi-python-app
- -v /path/to/shared/folder:/data: Mounts a directory from the host (/path/to/shared/folder) to /data in the container.
- Update app.py to read/write files from /data instead of using temporary files, and ensure n8n can write to the same volume. This is an exception rather than the norm, as it couples the containers more tightly, but it’s efficient for bulk file operations.
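To illustrate the shared-volume variant, here's a hedged sketch of the path handling app.py would need. The /data mount point comes from the docker run command above; the endpoint shape, SHARED_DIR constant, and parameter names are assumptions for illustration:

```python
# Sketch of a path-based variant for the shared-volume setup: instead of
# uploading bytes, n8n passes a filename and both containers read/write
# under /data. Names below are illustrative.
from pathlib import Path

SHARED_DIR = Path("/data")

def shared_paths(filename: str) -> tuple:
    """Resolve input/output paths inside the shared volume, refusing
    anything that escapes it (e.g. '../etc/passwd')."""
    input_path = (SHARED_DIR / filename).resolve()
    if SHARED_DIR.resolve() not in input_path.parents:
        raise ValueError(f"{filename!r} is outside the shared volume")
    output_path = input_path.with_name(f"resized_{input_path.name}")
    return input_path, output_path

# The route in app.py would then become something like:
#
# @app.post("/resize-on-disk")
# async def resize_on_disk(filename: str):
#     input_path, output_path = shared_paths(filename)
#     subprocess.run(["python", "resize.py", str(input_path), str(output_path)], check=True)
#     return {"output": output_path.name}
```

The path check matters here: once n8n passes filenames instead of bytes, a malformed filename could otherwise read or write outside the shared folder.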
Option 2: Using Docker Compose
For a more structured setup, especially if you’re managing multiple services or want to define the network explicitly, use docker-compose.
Create a docker-compose.yml file:
```yaml
services:
  python-api:
    build: .
    image: fastapi-python-app
    container_name: python-api
    command: uvicorn app:app --host 0.0.0.0 --port 8000
    networks:
      - n8n_network

networks:
  n8n_network:
    name: <n8n_network>
    external: true
```
- build: .: Builds the image from the Dockerfile in the current directory.
- networks: Connects the service to the existing n8n network (replace <n8n_network> with the actual name from docker network ls).
Build and Run
Start the service:
docker compose up -d
This launches the python-api container in the specified network.
Shared Volume with Docker Compose
For scenarios with many files, add a volume to docker-compose.yml:
```yaml
services:
  python-api:
    build: .
    image: fastapi-python-app
    container_name: python-api
    command: uvicorn app:app --host 0.0.0.0 --port 8000
    volumes:
      - shared-data:/data
    networks:
      - n8n_network

volumes:
  shared-data:

networks:
  n8n_network:
    name: <n8n_network>
    external: true
```
- volumes: Defines a named volume (shared-data) mounted at /data in the container.
- If n8n is also managed by this docker-compose.yml, add it as a service and attach it to the same volume and network. Otherwise, ensure n8n’s container mounts the same volume separately.
Managing with Docker Compose
Stop or restart with:
docker compose down
docker compose up -d
Step 5: Verify and Test the Container
Before integrating the Python API with n8n, let’s confirm that the container runs correctly and the API responds as expected. We’ll test it with curl in a way that you can later copy-paste directly into n8n’s HTTP Request node for a seamless transition.
Check Container Status
Confirm the container is running:
docker ps
Look for python-api in the list with a status of Up and no exposed ports (since we’re using an internal network). If it’s not running, check the logs:
docker logs python-api
Fix any errors (e.g., missing files, dependency issues) by rebuilding the image if needed:
docker build -t fastapi-python-app .
docker run -d --name python-api --network <n8n_network> fastapi-python-app
Test the API Locally
Test the endpoint from within the Docker network using curl:
docker run --rm --network <n8n_network> curlimages/curl -X POST -F "file=@/path/to/test.jpg" http://python-api:8000/resize -o output.jpg
- Replace /path/to/test.jpg with a test image path. Since the file must exist inside the curl container, mount a host directory if necessary (e.g., -v /local/path:/data, then use file=@/data/test.jpg and -o /data/output.jpg so the output survives the ephemeral container).
- Verify output.jpg contains the resized image.
Bonus: n8n-Compatible curl Command
Here’s a curl command you can run directly on your host (if the port is exposed) or adapt for n8n:
curl -X POST -F "file=@/path/to/test.jpg" http://python-api:8000/resize -o output.jpg
- Why it’s useful: This exact command can be copied into n8n’s HTTP Request node via the Import cURL option by pasting it into the “Raw Request” field (after adjusting the file path to match n8n’s binary data). It’s a quick way to prototype the request before fine-tuning in n8n.
- If you exposed the port locally (e.g., with -p 8000:8000), run this from your VPS to test outside Docker.
If this works, the API is ready for n8n to call it 😀.
Step 6: Configure n8n to Call the API
In your n8n instance:
- Open the workflow editor.
- Add an HTTP Request node.
- Configure it like this:
- URL: http://python-api:8000/resize (using the container name as the hostname).
- Method: POST.
- Send Binary Data: Enable this option.
- Binary Property: Set to data (or the name of your binary input field).
Adding an Image File
If you’re resizing an image from disk:
- Add a Read Binary File node before the HTTP Request node.
- Set the file path to your input image.
- Connect it to the HTTP Request node, ensuring the binary data flows through.
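If you want to prototype the call from Python before wiring up n8n, here's a rough equivalent of what the HTTP Request node sends. This is a sketch: the requests library is a third-party dependency, and the URL/filenames are placeholders you'd adjust to your setup:

```python
# Rough Python equivalent of the n8n HTTP Request node's multipart POST.
# The URL assumes the "python-api" container name resolves on the Docker
# network; use localhost:8000 instead if you exposed the port.

def build_multipart(field_name: str, filename: str, payload: bytes) -> dict:
    """Shape the files mapping the way the /resize endpoint expects it."""
    return {field_name: (filename, payload, "image/jpeg")}

if __name__ == "__main__":
    import requests  # third-party: pip install requests

    with open("test.jpg", "rb") as f:
        files = build_multipart("file", "test.jpg", f.read())
    resp = requests.post("http://python-api:8000/resize", files=files, timeout=60)
    resp.raise_for_status()
    with open("output.jpg", "wb") as out:
        out.write(resp.content)
```

The field name "file" must match the parameter name in the FastAPI endpoint, which is also what the Binary Property setting in n8n maps to.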
If this content is useful, your claps 👏, highlights 🖍️, or comments 💬 help us produce more relevant material.
Step 7: Test the Workflow
- Execute the n8n workflow.
- Check the HTTP Request node’s output: You should see the resized image returned as binary data.
- Admire this masterpiece! 😆🔥
Troubleshooting
- “Connection refused”: Ensure both containers are on the same Docker network and the URL matches the container name (python-api).
- “Module not found”: Double-check requirements.txt and rebuild the Docker image.
- File issues: If n8n can’t find the input file, verify the path is accessible within its container.
Now You’re All Set!
You’re now capable of securely calling Python scripts from n8n without any side effects on your automation environment. Plus, you could even use these scripts outside of n8n — but that’s a whole other story!
Happy automating! Let me know in the comments if you run into any issues or have questions. 🚀
I documented the CLI structure, production CLAUDE.md, and agent constraints I use across all my automation projects in a free kit. 3 files, 10 minutes.

