LLMs Hallucinate Nonexistent Software Packages: A Supply Chain Vulnerability

2025-04-29

Researchers have identified a concerning vulnerability in large language models (LLMs): during code generation, they hallucinate nonexistent software packages. These hallucinations are not random; specific nonexistent package names recur across prompts, making the pattern predictable. An attacker could exploit this by publishing malware under those hallucinated names and waiting for developers to install them, mounting a supply chain attack. Open-source LLMs exhibited higher rates of this "package hallucination" than commercial models, and Python code showed fewer hallucinated packages than JavaScript.
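One practical mitigation is to vet LLM-suggested dependencies against an allowlist before installing them, rather than merely checking that the name exists on a registry (an attacker may have already registered the hallucinated name). The sketch below is a minimal, hypothetical illustration: the allowlist contents and the example package name `fastjsonlib` are invented for demonstration, not taken from the research.

```python
# Hypothetical defensive sketch: before installing packages suggested by an
# LLM, flag any name not in a vetted allowlist for manual review.
# Checking mere registry existence is insufficient, since an attacker may
# have registered the hallucinated name with a malicious payload.

# Illustrative allowlist; a real one would come from an internal vetting process.
KNOWN_GOOD = {"requests", "numpy", "flask", "pandas"}

def vet_packages(suggested):
    """Split LLM-suggested package names into (approved, needs_review)."""
    approved = [p for p in suggested if p.lower() in KNOWN_GOOD]
    needs_review = [p for p in suggested if p.lower() not in KNOWN_GOOD]
    return approved, needs_review

if __name__ == "__main__":
    # "fastjsonlib" stands in for a plausible-sounding hallucinated name.
    ok, review = vet_packages(["requests", "fastjsonlib"])
    print("approved:", ok)
    print("needs review:", review)
```

Running the sketch approves `requests` and routes the unfamiliar name to manual review, which is where a hallucinated (and possibly attacker-registered) package would be caught.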

AI