Fighting Back Against AI Music Theft: Poisoning the Well with Adversarial Noise

2025-04-15

Benn Jordan's latest video proposes a novel way to fight back against generative AI music services that scrape musicians' work for their training datasets: adversarial noise poisoning. The technique embeds specially crafted, near-inaudible noise in a track, disrupting a model's training so that it cannot learn accurately from the poisoned audio. Generating the noise currently demands high-end GPUs and substantial compute, but the demonstrated results suggest real promise, and more efficient methods may emerge. The approach raises important questions about AI music copyright and dataset security, and offers musicians a potential new defense against unauthorized use of their work.
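The video doesn't publish an exact recipe, but the core idea resembles classic adversarial-example crafting: perturb the audio in the direction that most increases a model's loss. Here is a minimal NumPy sketch using a toy linear classifier as a stand-in surrogate model; every name, size, and parameter below is an illustrative assumption, not the method from the video:

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy stand-in for an audio classifier: one linear layer + logistic loss.
# w: surrogate model weights; x: a 1-second mono clip at 16 kHz (hypothetical).
n = 16_000
w = rng.standard_normal(n) / np.sqrt(n)
x = rng.standard_normal(n) * 0.1   # the "clean" audio samples
y = 1.0                            # the clip's true label

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def loss(x):
    # Logistic loss of the surrogate model on input x.
    p = sigmoid(w @ x)
    return -(y * np.log(p) + (1 - y) * np.log(1 - p))

def grad_x(x):
    # Analytic gradient of the logistic loss with respect to the input.
    p = sigmoid(w @ x)
    return (p - y) * w

# FGSM-style poisoning step: nudge every sample by +/- eps in the
# direction that increases the surrogate model's loss, keeping the
# perturbation small enough to stay near-inaudible.
eps = 0.01
x_poisoned = x + eps * np.sign(grad_x(x))

print(loss(x), loss(x_poisoned))  # the loss rises on the poisoned clip
```

A real attack would target a far larger audio model (hence the GPU requirements Jordan mentions), but the principle is the same: the perturbation is tiny for human ears yet steers the model's gradients in the wrong direction.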
