Model Signing: Securing the Integrity of ML Models

With the explosive growth of machine learning applications, model security has become a critical concern. This project secures the integrity and provenance of machine learning models through model signing. It uses tools such as Sigstore to generate model signatures and provides both a CLI and an API, supporting several signing methods, including Sigstore, public keys, and certificates. Users can independently verify that a model has not been tampered with after training. The project also integrates with SLSA (Supply-chain Levels for Software Artifacts) to further strengthen the security of the machine learning model supply chain.
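
As a rough illustration of the workflow described above, the sketch below hashes every file in a model directory into a manifest and signs that manifest with a key pair, which is in the spirit of the public-key signing method mentioned here. It is a minimal sketch built on the `cryptography` package, not the project's own CLI or API; the helper names (`hash_model_dir`, `sign_manifest`, `verify_manifest`) and the `my_model` path are hypothetical.

```python
# Illustrative sketch only: hash model files into a manifest, sign it,
# and later verify it. Not the project's actual API.
import hashlib
import json
from pathlib import Path

from cryptography.hazmat.primitives.asymmetric.ed25519 import (
    Ed25519PrivateKey,
    Ed25519PublicKey,
)


def hash_model_dir(model_dir: Path) -> dict[str, str]:
    """Build a manifest mapping each model file to its SHA-256 digest."""
    manifest = {}
    for path in sorted(model_dir.rglob("*")):
        if path.is_file():
            digest = hashlib.sha256(path.read_bytes()).hexdigest()
            manifest[str(path.relative_to(model_dir))] = digest
    return manifest


def sign_manifest(manifest: dict[str, str], key: Ed25519PrivateKey) -> bytes:
    """Sign the canonical JSON encoding of the manifest."""
    payload = json.dumps(manifest, sort_keys=True).encode()
    return key.sign(payload)


def verify_manifest(model_dir: Path, signature: bytes,
                    public_key: Ed25519PublicKey) -> None:
    """Recompute the manifest and check the signature.

    Raises cryptography.exceptions.InvalidSignature if any file changed.
    """
    payload = json.dumps(hash_model_dir(model_dir), sort_keys=True).encode()
    public_key.verify(signature, payload)


if __name__ == "__main__":
    private_key = Ed25519PrivateKey.generate()
    model_dir = Path("my_model")  # hypothetical directory of model artifacts
    signature = sign_manifest(hash_model_dir(model_dir), private_key)
    verify_manifest(model_dir, signature, private_key.public_key())
    print("model verified: no files changed since signing")
```

In the real project, the signature would typically come from Sigstore rather than a locally managed key, which removes the need to distribute and protect private keys; the verification step plays the same role either way: recompute the digests and reject the model if they no longer match what was signed.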