I maintain several large Laravel applications. Last year I was profiling an import endpoint that accepts JSON payloads with up to a hundred items. Each item has around 47 fields, many with conditional rules like exclude_unless and required_if. The endpoint was taking 3.4 seconds. I assumed it was database queries.
It wasn't. Validation alone was taking 3.2 of those 3.4 seconds. The database work was 200ms.
I spent the better part of a day finding out why. Once I understood the cause, I submitted 10 performance PRs to Laravel. Most were closed. As Taylor always says: "Consider releasing it as a package." So I did, and the package brings that 3.2s of validation time down to 83ms.
What actually happens when you validate arrays
When you write this:
```php
'items.*.name' => 'required|string|max:255',
'items.*.qty' => 'required|numeric|min:1',
```
Laravel's `explodeWildcardRules()` expands the wildcards into concrete rules for every item in the data:
```
items.0.name => required|string|max:255
items.0.qty => required|numeric|min:1
items.1.name => required|string|max:255
items.1.qty => required|numeric|min:1
items.2.name => required|string|max:255
...
items.499.name => required|string|max:255
items.499.qty => required|numeric|min:1
```
For a simpler example: 500 items with 7 fields gives you 3,500 concrete rules. The expansion works by matching each wildcard pattern against every key in the flattened data array. That's where the O(n²) comes from — for every wildcard rule, it scans every data key.
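To make the quadratic shape concrete, here is a minimal sketch of that expansion strategy in plain PHP. It is simplified from Laravel's actual implementation, and the function name is illustrative:

```php
<?php

// Simplified model of wildcard expansion: each wildcard rule is turned
// into a regex and matched against EVERY flattened data key, so the cost
// is O(patterns × keys). Illustrative only — not Laravel's real API.
function expandWildcards(array $rules, array $flatKeys): array
{
    $expanded = [];
    foreach ($rules as $pattern => $rule) {
        // 'items.*.name' becomes the regex /^items\.[^\.]*\.name$/
        $regex = '/^' . str_replace('\*', '[^\.]*', preg_quote($pattern, '/')) . '$/';
        foreach ($flatKeys as $key) {          // scans ALL keys per pattern
            if (preg_match($regex, $key)) {
                $expanded[$key] = $rule;
            }
        }
    }
    return $expanded;
}

// 3 items × 2 fields → 6 flattened keys, matched against 2 wildcard patterns
$flatKeys = [];
foreach (range(0, 2) as $i) {
    $flatKeys[] = "items.$i.name";
    $flatKeys[] = "items.$i.qty";
}

$expanded = expandWildcards([
    'items.*.name' => 'required|string|max:255',
    'items.*.qty'  => 'required|numeric|min:1',
], $flatKeys);
// $expanded holds 6 concrete rules, e.g. 'items.0.name' => 'required|string|max:255'
```

With 500 items and 7 patterns, the inner loop runs 7 × 3,500 times; the work grows with the product of patterns and keys, not their sum.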
That's not the only cost. During validation, each attribute runs through validateAttribute() which evaluates every rule — including exclude_unless and exclude_if. These exclusion rules call dependent-rule parameter resolution and accumulate excluded attributes. After the main loop, shouldBeExcluded() checks each attribute against that list and removes excluded ones. For conditional-heavy payloads, the cost of evaluating all those exclusion rules dominates. With 100 items and 47 conditional patterns, you end up with 4,700 concrete rules, each evaluating its exclusion conditions.
Here's a simplified view of what the validator is doing:
```
for each of 4,700 expanded attributes:
    for each rule on this attribute:
        parse the rule string              ← repeated for identical rules
        resolve the value from nested data ← Arr::dot() on every access
        evaluate exclude_unless/exclude_if ← dependent-rule parameter resolution
        run the validation logic
    check if attribute was excluded        ← scans excludeAttributes list
```
Multiply that inner loop by 4,700 and you have your 3.2 seconds.
How I found it
I started with Laravel Telescope, which showed the validation taking most of the request time but didn't show why. So I switched to Xdebug's profiler and opened the trace in KCachegrind.
explodeWildcardRules() was eating 35% of validation time — converting Arr::dot() output to regex patterns, then matching every wildcard against every key. Exclusion rule evaluation (validateExcludeUnless, validateExcludeIf, and the dependent-rule parameter resolution they trigger) took another 28%. ValidationRuleParser::parse() accounted for 15%, parsing the same rule strings hundreds of times because there's no caching between items.
The remaining 22% was scattered: BigNumber for simple integer comparisons that could've been native PHP, repeated hasRule() calls, message placeholder replacement on strings that had no placeholders.
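The re-parsing cost in particular is easy to picture. A sketch of the caching idea (my own illustrative code, not Laravel's `ValidationRuleParser`): identical rule strings like `required|string|max:255` recur once per item, so parse once and reuse.

```php
<?php

// Memoized rule-string parsing. The same string appears once per item,
// so the parse work should be paid once, not hundreds of times.
function parseRules(string $rules, array &$cache): array
{
    return $cache[$rules] ??= array_map(
        function (string $rule): array {
            // 'max:255' → ['max', ['255']]; 'required' → ['required', []]
            [$name, $params] = array_pad(explode(':', $rule, 2), 2, null);
            return [$name, $params === null ? [] : explode(',', $params)];
        },
        explode('|', $rules)
    );
}

$cache = [];
$a = parseRules('required|string|max:255', $cache);
$b = parseRules('required|string|max:255', $cache); // cache hit — no re-parse
// $a === $b, and $cache holds a single entry
```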
What I tried upstream
On March 15, 2026, I submitted 10 performance PRs to laravel/framework:
- Traverse data directly for wildcard expansion instead of `Arr::dot()` + regex
- Cache parsed string rules to avoid re-parsing identical rules across items
- Pre-compute bail/nullable/sometimes flags instead of calling `hasRule()` repeatedly
- Use native PHP comparison for size validation instead of BigNumber
- Short-circuit `shouldStopValidating()` when no errors exist yet
- Cache `Arr::undot()` results in `Rule::compile()`
- Skip placeholder replacement when the message template has none
- Use `array_push` with spread in `MessageBag::all()`
- Cache route instances in `CompiledRouteCollection::getByName()`
- Avoid redundant `Util::getParameterClassName()` calls in the container
The last four were merged. The six validation-specific ones were all closed. I resubmitted the wildcard traversal PR (#59287) against the 13.x branch with benchmarks from a real application. Closed again.
What I tried that didn't work
Before building a package, I tried a few things.
Splitting the payload into chunks and validating each chunk separately. Helps with memory, but the O(n²) expansion still runs per chunk, and error messages lose their original indices.
Rule::forEach() instead of wildcard rules. It uses NestedRules::compile() to generate per-item rules, avoiding explodeWildcardRules(). But the compilation overhead per item — instantiating rule objects, re-parsing strings, resolving callbacks — meant it was actually slower for simple cases.
Pre-expanding rules manually before passing them to the validator. This skips the expansion cost but you still pay for exclusion rule evaluation during validation. Half the problem, not the whole thing.
I also tried caching the validator instance across requests, but rules change with every payload — different item counts mean different expanded rules.
The fundamental issue is that Laravel's validator is designed for forms with maybe 50-100 rules. Beyond that, the per-rule linear scans compound into quadratic work.
The fix
Since the fixes couldn't go upstream, I built them into laravel-fluent-validation. The HasFluentRules trait plugs into your existing FormRequest and applies three optimizations.
The first optimization replaces the O(n²) wildcard expansion with O(n) tree traversal. Instead of flattening the data with Arr::dot() and matching regex patterns against every key, WildcardExpander walks the data tree once and emits concrete paths as it descends. For 500 items with 7 fields, all those regex matches become a single tree walk. It also skips the intermediate Arr::dot() array, which saves memory on large payloads. About 20% faster by itself.
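A sketch of the tree-walk idea in plain PHP (illustrative only; the package's `WildcardExpander` handles many more cases): split the wildcard pattern into segments, then descend the nested data once, emitting a concrete path wherever the segments match.

```php
<?php

// O(n) wildcard expansion: walk the nested data once per pattern,
// emitting concrete dotted paths as we descend. No Arr::dot(), no regex.
function walkWildcard(array $data, array $segments, string $rule, string $prefix = ''): array
{
    if ($segments === []) {
        return [ltrim($prefix, '.') => $rule];
    }

    [$head, $rest] = [$segments[0], array_slice($segments, 1)];
    $out = [];

    if ($head === '*') {
        // Descend into every element at this level — a single pass.
        foreach ($data as $key => $value) {
            $out += walkWildcard(is_array($value) ? $value : [], $rest, $rule, "$prefix.$key");
        }
    } elseif (array_key_exists($head, $data)) {
        $out += walkWildcard(is_array($data[$head]) ? $data[$head] : [], $rest, $rule, "$prefix.$head");
    }

    return $out;
}

$data = ['items' => [
    ['name' => 'a', 'qty' => 1],
    ['name' => 'b', 'qty' => 2],
]];

$rules = walkWildcard($data, explode('.', 'items.*.name'), 'required|string|max:255');
// ['items.0.name' => '...', 'items.1.name' => '...']
```

Each data node is visited at most once per pattern, so the cost scales with the size of the payload rather than with patterns × flattened keys.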
The second optimization makes the biggest difference on simple rules. The package compiles PHP closures for 25 common rules — string, numeric, integer, boolean, email, url, ip, uuid, in, regex, and others. Each closure is a few is_string() / strlen() / in_array() calls. During validation, each item hits the closure first. Pass? Laravel's validator never sees it. Fail? That item goes through Laravel for the correct error message.
`is_string($v) && strlen($v) <= 255` takes nanoseconds. Laravel's validator doing the same check goes through rule parsing, method dispatch, BigNumber size comparison, and message formatting — even when the value is valid. Custom Rule classes, closures, and anything the package doesn't recognize pass through to Laravel untouched.
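The shape of that fast path can be sketched in a few lines. This is my own illustration of the idea, not the package's internals:

```php
<?php

// Fast-path sketch: compile a cheap closure per rule string. Values that
// pass never reach the full validator; values that fail fall through to
// Laravel for the correct error message. Illustrative — a real
// implementation compiles closures per-rule, covering ~25 rule types.
function compileFastCheck(string $rules): ?callable
{
    if ($rules === 'required|string|max:255') {
        return fn ($v) => is_string($v) && $v !== '' && strlen($v) <= 255;
    }
    if ($rules === 'required|numeric|min:1') {
        return fn ($v) => is_numeric($v) && $v >= 1;
    }
    return null; // unknown rule → always use the full validator
}

$check = compileFastCheck('required|string|max:255');

$items = [['name' => 'ok'], ['name' => str_repeat('x', 300)], ['name' => 42]];
$needFullValidation = [];

foreach ($items as $i => $item) {
    if (!$check($item['name'])) {
        // Only failures are handed to Laravel for the real error message.
        $needFullValidation[] = $i;
    }
}
// $needFullValidation === [1, 2]
```

In the common case — a payload where everything is valid — the validator proper never runs at all.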
The third optimization targets conditional rules. For exclude_unless and exclude_if, the package evaluates the conditions before the validator starts and removes excluded attributes from the rule set entirely. Instead of 4,700 rules where each attribute evaluates its exclusion conditions at validation time, the validator only sees the ~200 rules that actually apply.
You don't need to change your rules. Just add the trait:
```php
use SanderMuller\FluentValidation\HasFluentRules;

class ImportRequest extends FormRequest
{
    use HasFluentRules;

    // your existing rules() method, unchanged
}
```
The numbers
These benchmarks run in CI on every PR. The data is randomly generated each run; the numbers are the median of three runs.
| Scenario | Native Laravel | With HasFluentRules | Speedup |
|---|---|---|---|
| 500 items × 7 simple fields (string, numeric, in) | ~200ms | ~2ms | 97x |
| 500 items × 7 mixed fields (string + date comparison) | ~200ms | ~20ms | 10x |
| 100 items × 47 conditional fields (exclude_unless) | ~3,200ms | ~83ms | 39x |
97x on simple rules because every field is fast-checkable and the validator only runs for items that fail. 10x on mixed rules because string and numeric fields use closures while date comparisons still go through Laravel. 39x on conditional rules comes almost entirely from pre-evaluation removing excluded attributes before the validator starts.
If your form has 5 fields, validation takes ~1ms either way. Nobody cares. But if you're validating a CSV import with 500 rows or an API endpoint accepting a batch of orders, 200ms vs 2ms is the difference between a page that feels instant and one that feels broken.
Is this safe?
The closures implement the same logic as Laravel's validator — is_string(), is_numeric(), filter_var(), the same checks. If a fast-check passes, the value would have passed Laravel's validator too. If it fails, that value goes through Laravel's full validation pipeline for the correct error message and rule identifier.
The package has 516 tests and 1,020 assertions, and benchmarks run in CI on every PR. It's Octane-safe (factory resolver restored via try/finally, no static state between requests).
When you don't need this
If your form requests have fewer than ~50 expanded rules, the standard validator is fine. The overhead is negligible at that scale. This package solves a specific problem: array validation with hundreds or thousands of items where wildcard expansion and per-attribute scanning become the bottleneck.
If you're not sure whether validation is your bottleneck, profile first. Laravel Telescope shows total request time breakdowns. If validation isn't in the top 3, you have bigger fish to fry.
Try it
```bash
composer require sandermuller/laravel-fluent-validation
```
Add `use HasFluentRules;` to a FormRequest with wildcard rules and run your benchmarks. The performance optimization works with your existing string rules; just add the trait. There's also a fluent rule builder API with IDE autocompletion (`FluentRule::string()->required()->max(255)`), but the performance fix is the real reason you want this.
I've deployed this across all the large applications I maintain — around 80 files converted, thousands of tests passing, zero behavioral regressions. Each codebase surfaced different edge cases (Livewire's rule key pre-reading breaking each(), Filament's trait collision with the validation override) and the package got better each time.
If you've ever wondered why your import endpoint is slow, check validation before reaching for query optimization. It might be where 90% of the time goes.
https://github.com/SanderMuller/laravel-fluent-validation
These optimizations belong in the framework. I'll keep advocating for that upstream while maintaining the package. The performance issue is tracked in laravel/framework#49375.