# The Problem: 45-Minute Backtests for a Simple Moving Average Strategy
I had a backtest that took 45 minutes to run. The strategy was trivial — a dual moving average crossover on SPY, testing across 200 combinations of short and long windows. Nothing fancy. Just buying when the 20-day MA crossed above the 50-day, selling when it crossed below.
The code looked clean. Pandas rolling windows, a simple loop over parameter combinations, position sizing logic. But every time I added another year of data or expanded the parameter grid, the runtime exploded.
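The setup described above might look roughly like this. The exact window ranges and the synthetic price series are my assumptions; the article only specifies SPY, the 20/50 pair, and a grid of about 200 combinations:

```python
import itertools
import numpy as np
import pandas as pd

# Hypothetical parameter grid: made-up ranges that include the 20/50
# pair and give roughly the 200 combinations the article mentions
# (185 after requiring short < long).
shorts = range(5, 55, 5)     # 5, 10, ..., 50
longs = range(30, 130, 5)    # 30, 35, ..., 125
grid = [(s, l) for s, l in itertools.product(shorts, longs) if s < l]

# Synthetic daily closes standing in for roughly ten years of SPY data.
rng = np.random.default_rng(0)
prices = pd.Series(100 + rng.normal(0, 1, 2500).cumsum())

for short, long_ in grid:
    short_ma = prices.rolling(short).mean()
    long_ma = prices.rolling(long_).mean()
    # ... generate signals and run the backtest for this pair
```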
Turns out the bottleneck wasn't the strategy logic — it was how I was using arrays.
## Trick 1: Vectorize Position Calculations with np.where Instead of Loops
My original position logic looked like this:
```python
import pandas as pd
import numpy as np

# positions was initialized as zeros
for i in range(1, len(signals)):
    if signals[i] == 1 and positions[i - 1] == 0:
        positions[i] = 1  # Enter long
    elif signals[i] == -1 and positions[i - 1] == 1:
        positions[i] = 0  # Exit
    else:
        positions[i] = positions[i - 1]  # Carry the position forward
```
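The whole loop collapses into a single `np.where` call once you notice that the position is just "long whenever the short MA is above the long MA." A minimal sketch, using synthetic prices in place of SPY (the 20/50 windows are the ones named above):

```python
import numpy as np
import pandas as pd

# Synthetic daily closes standing in for SPY.
rng = np.random.default_rng(42)
prices = pd.Series(100 + rng.normal(0, 1, 500).cumsum())

short_ma = prices.rolling(20).mean()
long_ma = prices.rolling(50).mean()

# One np.where replaces the loop: long (1) whenever the short MA
# sits above the long MA, flat (0) otherwise. NaN comparisons are
# False, so the rolling-window warm-up period stays flat.
positions = np.where(short_ma > long_ma, 1, 0)

# If discrete trade signals are still needed: +1 on the cross up,
# -1 on the cross down, 0 otherwise.
signals = np.diff(positions, prepend=0)
```

Because the per-bar state machine only ever toggles between flat and long, forward-filling is unnecessary here; the comparison itself encodes the state.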
---
*Continue reading the full article on [TildAlice](https://tildalice.io/numpy-tricks-portfolio-backtest-10x-faster/)*
