Schrödinger's URL: Breaking vLLM with Parser Differentials (CVE-2026-24779)
Vulnerability ID: CVE-2026-24779
CVSS Score: 7.1
Published: 2026-01-28
A high-severity Server-Side Request Forgery (SSRF) vulnerability exists in vLLM versions prior to 0.14.1. The flaw stems from a parser differential: the validation logic (using Python's urllib) and the execution logic (using urllib3/requests) interpret the same URL string differently. This discrepancy allows attackers to bypass domain allowlists by crafting URLs that look innocent to the validator but resolve to internal network targets during execution. In containerized AI environments, this exposes cloud metadata, internal APIs, and can enable denial of service.
TL;DR
vLLM used two different libraries to parse URLs: one to decide whether a URL is safe, and another to actually fetch it. Attackers can exploit this disagreement to slip past the bouncer and force the inference server to reach internal network resources, including cloud metadata services.
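The vulnerable pattern can be sketched in a few lines of Python. The function and variable names below are illustrative, not vLLM's actual code; the point is that the allowlist check and the fetch each parse the URL independently, with different libraries.

```python
from urllib.parse import urlparse

# Hypothetical allowlist; vLLM's real configuration differs.
ALLOWED_HOSTS = {"images.example.com"}

def is_allowed(url: str) -> bool:
    # Validation view of the URL: urllib.parse decides what the host is.
    return urlparse(url).hostname in ALLOWED_HOSTS

def fetch_media(url: str) -> bytes:
    if not is_allowed(url):
        raise ValueError("host not in allowlist")
    # Execution view of the URL: requests delegates parsing to urllib3,
    # which re-parses the raw string and may derive a different host
    # than the one the validator just approved.
    import requests
    return requests.get(url, timeout=5).content
```

Any URL string that urllib.parse and urllib3 disagree about (edge cases around userinfo, separators, or encoding) turns that disagreement into an allowlist bypass: the validator approves one host while the HTTP client connects to another.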
⚠️ Exploit Status: PoC available
Technical Details
- CWE ID: CWE-918
- Attack Vector: Network (AV:N)
- CVSS Score: 7.1 (High)
- Exploit Status: PoC Available
- Impact: Information Disclosure / DoS
- Key Fix: Standardize URL parsing on urllib3 for both validation and fetching
Affected Systems
- vLLM inference server (versions < 0.14.1)
- Applications using vLLM for multimodal generation
- Kubernetes clusters hosting vulnerable vLLM pods
- vLLM: < 0.14.1 (fixed in 0.14.1)
Code Analysis
Commit f46d576: "Fix SSRF vulnerability by using urllib3 for both validation and fetching"
- from urllib.parse import ParseResult, urlparse
+ from urllib3.util import Url, parse_url
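A minimal sketch of the patched approach (names are illustrative, not vLLM's actual code): validation derives the host with urllib3's parse_url, the same parser requests/urllib3 use when opening the connection, so the host that is checked is the host that is contacted.

```python
from urllib3.util import parse_url

# Hypothetical allowlist; vLLM's real configuration differs.
ALLOWED_HOSTS = {"images.example.com"}

def is_allowed(url: str) -> bool:
    # parse_url is urllib3's own parser, so validation now sees the URL
    # exactly as the HTTP client will interpret it on the wire.
    return parse_url(url).host in ALLOWED_HOSTS
```

Standardizing on a single parser eliminates the differential by construction: there is no second interpretation of the string for an attacker to hide behind.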
Exploit Details
- GitHub Advisory: Original advisory containing the parser differential concept and payload structure.
Mitigation Strategies
- Upgrade vLLM to version 0.14.1 or later.
- Implement Kubernetes NetworkPolicies to block egress to internal subnets (10.0.0.0/8, 172.16.0.0/12, 192.168.0.0/16).
- Block access to cloud metadata services (169.254.169.254) at the network level.
- Disable HTTP redirects in vLLM configuration.
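The redirect hardening in the last bullet can also be enforced at the application layer. A sketch, assuming requests is used for the fetch (function names are illustrative):

```python
import requests

def ensure_no_redirect(resp: requests.Response) -> requests.Response:
    # Refuse to follow any redirect: a 3xx from an allowlisted host
    # could point the server at an internal target such as
    # http://169.254.169.254/.
    if resp.is_redirect or resp.is_permanent_redirect:
        raise ValueError("redirects are not allowed for media URLs")
    return resp

def fetch_media(url: str) -> bytes:
    # allow_redirects=False makes requests return the 3xx response
    # itself instead of silently following its Location header.
    return ensure_no_redirect(
        requests.get(url, timeout=5, allow_redirects=False)
    ).content
```

This closes the common SSRF escalation where URL validation passes but the allowlisted server replies with a redirect into the internal network.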
Remediation Steps:
- Identify all running instances of vLLM.
- Pull the latest Docker image: docker pull vllm/vllm:v0.14.1.
- Set the environment variable VLLM_MEDIA_URL_ALLOW_REDIRECTS=False.
- Redeploy the service.