The popular Python Pickle serialization format, commonly used to distribute AI models, gives attackers ways to ...
The technique, called nullifAI, lets malicious models slip past Hugging Face’s protective measures against malicious AI models ...
Researchers discovered two malicious ML models on Hugging Face exploiting “broken” pickle files to evade detection, bypassing ...
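The underlying risk these attacks exploit is a documented property of pickle itself: deserializing a pickle stream can invoke arbitrary callables via an object's `__reduce__` hook. A minimal sketch (using a harmless `print` in place of a real payload) illustrates why loading an untrusted model file is equivalent to running untrusted code:

```python
import pickle


class Payload:
    # pickle records the (callable, args) pair returned by __reduce__;
    # pickle.loads then CALLS that callable to "reconstruct" the object.
    # Here the callable is a harmless print, but an attacker could just
    # as easily return os.system with a shell command.
    def __reduce__(self):
        return (print, ("code executed during unpickling",))


blob = pickle.dumps(Payload())
pickle.loads(blob)  # side effect: runs print(...) before returning
```

This is why scanners such as Hugging Face's tooling inspect pickle opcodes for suspicious imports, and why the reported attacks focus on malforming the stream so those scanners fail while the interpreter still executes the payload.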