AI Code Generation's Hallucinations: A New Software Supply Chain Threat

The rise of AI-powered code generation tools is revolutionizing software development, but it is also introducing new risks to the software supply chain. These tools sometimes 'hallucinate' packages that do not exist, and attackers are exploiting the flaw: they register malicious packages under the invented names on public registries such as PyPI or npm. When a model later suggests the same name and a developer installs the dependency, the malware executes.

Studies put the scale of the problem in focus: roughly 5.2% of package suggestions from commercial models referenced non-existent packages, compared to 21.7% for open-source models. The hallucinations also follow a bimodal pattern: some invented names reappear consistently across prompts, making them dependable targets for attackers, while others never recur.

This variant of typosquatting, dubbed 'slopsquatting', means developers must carefully vet every AI-suggested dependency before installing it. The Python Software Foundation is actively working to mitigate these risks.
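The vetting step described above can be automated in part. The sketch below, a minimal illustration rather than a complete defense, checks AI-suggested requirement names against a vetted allowlist before installation; the allowlist contents here are hypothetical, and in practice it might come from a reviewed lockfile or a private package index. Name comparison uses the PEP 503 normalization rule so that spelling variants like `Flask_JWT` and `flask-jwt` are treated as the same name.

```python
import re

# Hypothetical allowlist of packages an organization has already reviewed.
VETTED = {"requests", "numpy", "flask"}

def canonical(name: str) -> str:
    """Normalize a package name per PEP 503: lowercase, and collapse
    runs of '-', '_', and '.' into a single dash."""
    return re.sub(r"[-_.]+", "-", name).lower()

def flag_unvetted(requirements: list[str]) -> list[str]:
    """Return the requirement names that are not on the allowlist and
    therefore need manual review before 'pip install' is run."""
    allowed = {canonical(p) for p in VETTED}
    return [r for r in requirements if canonical(r) not in allowed]

# A misspelled name and an unknown name are both flagged for review.
print(flag_unvetted(["requests", "reqeusts", "flask-jwt"]))
# → ['reqeusts', 'flask-jwt']
```

A check like this does not prove a flagged name is malicious, only that it is unfamiliar; the point is to force a human review step before a plausible-sounding hallucinated package ever reaches `pip install`.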