
Hugging Face Models Now Ship With Free Backdoor Hugs
In 2026, AI model supply chain attacks surged 156% year-over-year, creating an attack surface that extends well beyond traditional software supply chains. Attackers now target the AI development ecosystem itself, exploiting training datasets, model weights, and fine-tuning adapters: a single poisoned dataset can propagate into thousands of downstream models, and a compromised Low-Rank Adaptation (LoRA) adapter can carry hidden malicious code into every model it is merged with. CloudBorne and SockPuppet attacks add further pressure, with attackers compromising cloud infrastructure in the former and creating fake developer personas to slip backdoors into open-source AI projects in the latter.

Traditional supply chain security measures are not enough on their own. Organizations need defenses that cover the full AI development lifecycle, including cryptographic signing of model artifacts, AI bills of materials, and zero-trust principles applied to models, datasets, and adapters. Recent incidents, such as the Wondershare RepairIt incident, show the real-world impact of AI supply chain attacks. Defending against them means prioritizing security throughout the AI development lifecycle and investing in specialized AI security capabilities and industry-leading security platforms like SentinelOne.
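To make the artifact-pinning and adapter-scanning ideas above concrete, here is a minimal Python sketch, not production tooling. The PINNED_DIGESTS map, the module denylist, and the function names are illustrative assumptions: the digests stand in for values you would record in an AI bill of materials when an artifact is first vetted. The sketch uses only the standard library (hashlib, pickletools, zipfile) to flag a downloaded pickle-based checkpoint or LoRA adapter whose hash has drifted from its pin or whose pickle stream would import suspicious modules when loaded.

```python
import hashlib
import os
import pickletools
import sys
import zipfile

# Hypothetical pinned digests: values you would record in an AI bill of
# materials (AIBOM) when the artifact is first vetted. Placeholders only.
PINNED_DIGESTS = {
    "adapter_model.bin": "replace-with-known-good-sha256-hex-digest",
}

# Modules and callables a benign checkpoint has no reason to touch while unpickling.
SUSPICIOUS_MODULES = {"os", "posix", "nt", "subprocess", "runpy", "socket"}
SUSPICIOUS_CALLS = {"builtins.eval", "builtins.exec"}


def sha256_of(path: str) -> str:
    """Stream the file through SHA-256 so large checkpoints fit in memory."""
    digest = hashlib.sha256()
    with open(path, "rb") as fh:
        for chunk in iter(lambda: fh.read(1 << 20), b""):
            digest.update(chunk)
    return digest.hexdigest()


def pickle_payloads(path: str):
    """Yield (name, bytes) for pickle streams in a raw .pkl or zip-based .pt/.bin file."""
    if zipfile.is_zipfile(path):
        with zipfile.ZipFile(path) as zf:
            for name in zf.namelist():
                if name.endswith(".pkl"):
                    yield name, zf.read(name)
    else:
        with open(path, "rb") as fh:
            yield os.path.basename(path), fh.read()


def extract_globals(raw: bytes) -> list:
    """List the module.name references the pickle would import if it were loaded."""
    refs, recent = [], []
    for opcode, arg, _pos in pickletools.genops(raw):
        if opcode.name in ("SHORT_BINUNICODE", "BINUNICODE", "UNICODE"):
            recent = (recent + [str(arg)])[-2:]        # STACK_GLOBAL reads these
        elif opcode.name == "GLOBAL":
            refs.append(str(arg).replace(" ", "."))    # "module name" -> "module.name"
        elif opcode.name == "STACK_GLOBAL" and len(recent) == 2:
            refs.append(".".join(recent))
    return refs


def audit(path: str) -> int:
    """Return the number of findings: digest mismatches plus suspicious imports."""
    findings = 0
    expected = PINNED_DIGESTS.get(os.path.basename(path))
    if expected and sha256_of(path) != expected:
        print(f"[!] {path}: digest does not match the AIBOM pin")
        findings += 1
    for member, raw in pickle_payloads(path):
        for ref in extract_globals(raw):
            if ref.split(".")[0] in SUSPICIOUS_MODULES or ref in SUSPICIOUS_CALLS:
                print(f"[!] {member}: unpickling would import {ref}")
                findings += 1
    return findings


if __name__ == "__main__":
    sys.exit(1 if audit(sys.argv[1]) else 0)
```

In practice you would pair a check like this with safetensors-format weights, signature verification, and a dedicated scanner rather than a hand-rolled denylist; the point is simply that both the artifact hash and the imports an unpickle would trigger can be inspected before a model ever reaches torch.load.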