
Substantial Modification

Under the EU AI Act, a Substantial Modification is a change made to a high-risk AI system after it has been placed on the market or put into service that is not foreseen in the provider's initial conformity assessment and that either affects the system's compliance with the Act's mandatory requirements or alters the intended purpose for which the system was assessed.

This concept creates a critical governance checkpoint for the lifecycle of AI systems, acknowledging that software, unlike traditional hardware, is inherently dynamic. It distinguishes between routine updates, bug fixes, or security patches—which do not trigger reassessment—and fundamental changes that reintroduce pre-market levels of risk. The determination hinges on whether the modification could affect the system’s behavior in relation to the essential health, safety, and fundamental rights requirements. Once a change is classified as “substantial,” the modified system is treated as a new product entry, requiring a full or partial repeat of the conformity assessment before it can be legally returned to the market.

Determining whether a modification is "substantial" requires a structured risk-based evaluation, typically considering factors such as:

Change to Intended Purpose: Expanding or altering the system's stated objective or the context in which it operates (e.g., modifying a CV-screening tool to also make automated hiring decisions).

Performance Specification Shift: Modifications that alter the system's accuracy, robustness, or cybersecurity levels, either intentionally or as an unforeseen side-effect.

Core Algorithm or Logic Update: Significant changes to the model architecture, retraining with new or fundamentally different data, or altering the decision-making logic in a way that affects outputs.

Affected Requirements: Any change that impacts the system's conformity with the mandatory requirements for high-risk systems in Chapter III, Section 2 of the Act, such as those for transparency, human oversight, or data governance.
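As an illustration only, not a legal determination, the factors above can be sketched as a triage helper inside a change-control workflow; the class and field names here are hypothetical:

```python
from dataclasses import dataclass

@dataclass
class ChangeAssessment:
    """Hypothetical triage record for a proposed change to a high-risk AI system."""
    alters_intended_purpose: bool      # new use context or expanded decision scope
    shifts_performance_profile: bool   # accuracy, robustness, or cybersecurity levels
    changes_core_logic: bool           # architecture, retraining data, decision logic
    affects_act_requirements: bool     # transparency, human oversight, data governance

    def is_substantial(self) -> bool:
        """Any single factor flags the change for a new conformity assessment."""
        return any([
            self.alters_intended_purpose,
            self.shifts_performance_profile,
            self.changes_core_logic,
            self.affects_act_requirements,
        ])

# A routine security patch touches none of the factors:
patch = ChangeAssessment(False, False, False, False)
print(patch.is_substantial())  # → False

# Expanding a CV-screening tool to make automated hiring decisions:
repurpose = ChangeAssessment(True, False, True, True)
print(repurpose.is_substantial())  # → True
```

In practice each factor would be a judgment backed by evidence and documented rationale, not a boolean, but the "any one factor triggers reassessment" logic reflects the risk-based posture the Act expects.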

Regulatory Context: Article 43(4) of the EU AI Act requires a high-risk AI system that has already undergone a conformity assessment to undergo a new one whenever it is substantially modified, regardless of whether the modified system is intended for further distribution or continued use by the current deployer. Recital 128 clarifies that changes affecting the system's compliance or its intended purpose trigger this obligation. The provider is responsible for documenting the rationale for each "substantial" or "non-substantial" classification.

Compliance and Strategic Impact: Misclassifying a substantial modification as a minor update is a major compliance risk, since it can place a non-conforming high-risk system on the market. Proactively managing this process through a formal change control procedure within the Quality Management System is essential. It allows providers to innovate and improve their systems responsibly while maintaining a continuous chain of compliance, avoiding regulatory penalties, forced market withdrawals, and liability for incidents caused by an unassessed, modified system.
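A minimal sketch of what an auditable change-control entry might capture, assuming a simple JSON-based record; the identifier, field names, and values are all hypothetical:

```python
import json
from datetime import date

# Hypothetical QMS change-control entry: the provider records the
# classification and its rationale so the decision is auditable later.
change_record = {
    "change_id": "CR-2025-014",           # illustrative identifier
    "date": date(2025, 3, 1).isoformat(),
    "description": "Retrained model on a new applicant dataset",
    "classification": "substantial",
    "rationale": "Retraining altered decision logic and accuracy profile",
    "action": "New conformity assessment initiated before redeployment",
}
print(json.dumps(change_record, indent=2))
```

Whatever the storage format, the point is that every change, including those classified as non-substantial, leaves a documented trail linking the decision to the evaluation factors above.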


Ready to see what security-first GRC really looks like?

The Scrut Platform helps you move fast, stay compliant, and build securely from the start.

Book a Demo