
Law 25 compliance checklist for AI tools in 2026

Essential Law 25 compliance requirements for AI tools. Data residency, consent frameworks, and penalty avoidance for Quebec organizations.

By Augure

Law 25 compliance for AI tools requires specific attention to data residency, consent mechanisms, and algorithmic transparency under Quebec's provincial privacy legislation. Quebec organizations must navigate cross-border data transfer restrictions per Section 17, conduct privacy impact assessments for AI deployments under Section 3.3, and implement technical safeguards that meet the Commission d'accès à l'information du Québec's enforcement standards. Non-compliance penalties reach C$10 million or 2% of worldwide turnover under Section 89, with repeat offenses facing penalties up to C$25 million under Section 93.1.

The regulatory landscape has matured significantly since Law 25's full enforcement began in September 2024. Organizations using AI tools now face routine audits and must demonstrate compliance through documented processes, not just policy statements.


Data residency and cross-border transfers

Law 25's Section 17 creates strict requirements for transferring personal information outside Quebec. This directly impacts AI tool selection, as most commercial platforms process data through US-based infrastructure subject to foreign surveillance laws including the CLOUD Act.

Organizations must obtain explicit consent before using AI tools that transfer data internationally. The consent must specify the destination country, purpose of transfer, and associated risks per Section 17's requirements. Generic privacy policies don't satisfy this standard under Quebec provincial law.

"Under Law 25 Section 17, organizations cannot assume implied consent for cross-border AI processing. Each transfer requires explicit, informed consent that details the specific risks of foreign jurisdiction processing, including potential exposure to foreign surveillance laws."

The Commission d'accès à l'information du Québec has clarified that cloud-based AI services constituting "communication" of personal information trigger Section 17 requirements. This includes uploading documents to AI chat interfaces, knowledge base tools, and automated analysis platforms.

Canadian data residency eliminates these consent complications entirely. Platforms like Augure, which maintain 100% Canadian infrastructure and processing, avoid cross-border transfer issues while providing equivalent AI capabilities through models like Ossington 3 and Tofino 2.5.


Privacy impact assessment requirements

Section 3.3 of Law 25 mandates privacy impact assessments (PIAs) for processing that presents "elevated risks to privacy." AI tools typically qualify due to their automated decision-making capabilities and potential for profiling under Quebec's provincial privacy framework.

The PIA must address:

• Data sources and collection methods - Document what personal information the AI processes and how it's obtained
• Processing purposes and legal basis - Specify why AI analysis is necessary and the lawful basis under Sections 12-14
• Algorithmic decision-making scope - Identify which processes involve automated analysis or recommendations
• Risk mitigation measures - Detail technical and organizational safeguards

PIAs must be completed before AI deployment and updated when processing activities change materially. The Commission can request PIA documentation during investigations under Section 70, making thorough documentation essential.

Financial services firms have faced particular scrutiny. A Quebec credit union received a C$50,000 penalty in late 2025 for deploying customer analytics AI without completing required impact assessments under Section 3.3.


Consent frameworks for AI processing

Law 25 Section 14 requires clear, specific consent for AI processing of personal information. Blanket consent clauses referencing "data analytics" or "business intelligence" don't meet the legislation's specificity requirements.

Valid consent must explain:

• The specific AI capabilities being used (chat, document analysis, predictive modeling)
• What personal information will be processed
• How long information will be retained per Section 10
• Whether processing involves automated decision-making affecting individuals

"Consent for AI processing under Law 25 Section 14 must be granular and specific. Organizations cannot rely on broad 'data processing' language that was acceptable under previous privacy frameworks or federal PIPEDA standards."

Employee consent presents additional complexity under Quebec provincial law. Section 12 establishes higher standards for employer-employee data relationships, requiring demonstration that AI processing serves legitimate business interests while minimizing privacy impact.

Professional services firms using AI for client matter analysis must obtain explicit client consent beyond standard engagement letters. The Quebec bar association issued guidance in 2025 requiring separate AI processing disclosures in retainer agreements to meet Section 14 requirements.


Technical safeguards and data protection

Law 25's Section 8 requires "security measures adapted to the sensitivity of the personal information." AI tools processing Quebec personal information must implement appropriate technical safeguards based on the data's sensitivity and processing scope.

Minimum technical requirements include:

• Encryption in transit and at rest - All personal information must be encrypted during transmission and storage per Section 8
• Access controls and authentication - Multi-factor authentication for AI tool access, with role-based permissions
• Audit logging and monitoring - Comprehensive logs of data access, processing activities, and user actions
• Data retention controls - Automated deletion capabilities aligned with Section 10 retention policies
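The retention-control requirement above can be operationalized as a scheduled deletion job. A minimal sketch, assuming each record carries a creation timestamp and a policy-defined retention period (the record layout is illustrative):

```python
from datetime import datetime, timedelta

# Illustrative records; in practice these come from the AI platform's data store.
records = [
    {"id": "doc-1", "created": datetime(2024, 1, 10), "retention_days": 365},
    {"id": "doc-2", "created": datetime(2026, 1, 5), "retention_days": 365},
]

def expired(record: dict, now: datetime) -> bool:
    """A record is past retention once its age exceeds the policy period."""
    return now - record["created"] > timedelta(days=record["retention_days"])

now = datetime(2026, 2, 1)
to_delete = [r["id"] for r in records if expired(r, now)]
print(to_delete)  # ['doc-1']
```

In a real deployment the deletion step would be logged to the audit trail, so the same run satisfies both the retention and audit-logging items above.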

Organizations must also address AI-specific risks like model training data exposure and prompt injection vulnerabilities. The Commission d'accès à l'information du Québec has indicated that standard cybersecurity measures may be insufficient for AI systems handling sensitive personal information under Quebec provincial requirements.

"Technical safeguards for AI tools under Law 25 Section 8 must address both traditional data security risks and AI-specific vulnerabilities like training data exposure and algorithmic bias that could impact individual privacy rights under Quebec's provincial privacy framework."

Augure's architecture demonstrates compliant technical implementation through built-in encryption, Canadian data residency, and granular access controls designed specifically for regulated organizations operating under provincial privacy legislation.


Vendor due diligence and contracts

Section 18 of Law 25 makes organizations responsible for their service providers' privacy practices. This creates specific due diligence obligations when selecting AI tools and platforms under Quebec provincial law.

Vendor assessment must cover:

• Data processing location - Verify where personal information will be processed and stored
• Subprocessor arrangements - Identify all parties with potential access to personal information
• Security certifications and audits - Review SOC 2, ISO 27001, or equivalent security assessments
• Breach notification procedures - Ensure 72-hour notification capability required under Section 63
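The assessment criteria above can be encoded as a machine-checkable gap analysis so each vendor review produces a consistent record. A hedged sketch; the check names are illustrative assumptions, not a standard taxonomy:

```python
# Criteria mirror the due-diligence checklist above; names are illustrative.
REQUIRED_CHECKS = {
    "canadian_processing",       # data processed and stored in Canada
    "subprocessors_disclosed",   # all parties with access identified
    "security_certification",    # e.g. SOC 2 or ISO 27001
    "breach_notification_72h",   # notification capability per Section 63
}

def gaps(vendor_checks: set[str]) -> set[str]:
    """Checks from the due-diligence list the vendor has not yet satisfied."""
    return REQUIRED_CHECKS - vendor_checks

vendor = {"canadian_processing", "security_certification"}
print(sorted(gaps(vendor)))  # ['breach_notification_72h', 'subprocessors_disclosed']
```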

Contractual provisions must address data processing limitations, return or deletion obligations per Section 25, and audit rights. Standard software licenses rarely provide adequate privacy protections for Law 25 compliance.

US-based AI providers present particular challenges due to CLOUD Act exposure and foreign intelligence surveillance risks. Organizations must assess whether adequate contractual protections can address jurisdictional risks or whether Canadian alternatives provide better compliance positioning under Quebec provincial law.


Breach notification and incident response

Law 25's breach notification requirements under Sections 63-68 apply fully to AI-related incidents. Organizations must notify the Commission d'accès à l'information du Québec within 72 hours of discovering breaches that could cause serious injury per Section 63.

AI-specific breach scenarios include:

• Training data exposure - Personal information used in model training becomes accessible through model outputs
• Prompt injection attacks - Malicious inputs extract personal information from AI systems
• Model inference attacks - Adversarial techniques reveal personal information about training data subjects
• Unauthorized access to AI-processed data - Compromise of AI platforms containing personal information

Incident response plans must address AI tool-specific risks and notification procedures. This includes identifying when AI processing incidents trigger the "serious injury" threshold requiring public notification under Section 67.
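The 72-hour notification window can be tracked mechanically from the moment of discovery, so an incident response runbook never depends on someone remembering the deadline. A minimal sketch with illustrative function names:

```python
from datetime import datetime, timedelta

NOTIFICATION_WINDOW = timedelta(hours=72)  # window cited in this checklist

def notification_deadline(discovered_at: datetime) -> datetime:
    """Regulator notification is due within the window after discovery."""
    return discovered_at + NOTIFICATION_WINDOW

def is_overdue(discovered_at: datetime, now: datetime) -> bool:
    return now > notification_deadline(discovered_at)

discovered = datetime(2026, 3, 1, 9, 0)
print(notification_deadline(discovered))          # 2026-03-04 09:00:00
print(is_overdue(discovered, datetime(2026, 3, 2, 9, 0)))  # False
```

Logging the discovery timestamp the moment an AI incident is confirmed is what makes this calculation meaningful; the clock runs from discovery, not from remediation.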

Healthcare organizations using AI for patient data analysis have faced particular challenges under Quebec provincial law. A Montreal hospital network received a C$75,000 penalty in 2025 for failing to properly notify regulators about an AI system breach that exposed patient diagnostic information, as required under Section 63.


Ongoing monitoring and compliance maintenance

Law 25 compliance requires continuous monitoring, not just initial deployment assessments. Organizations must track AI tool usage, update privacy impact assessments per Section 3.3, and maintain current documentation of processing activities.

Regular compliance activities include:

• Quarterly access reviews - Verify that AI tool permissions align with current job responsibilities
• Annual privacy impact assessment updates - Review and update PIAs for material changes in AI processing per Section 3.3
• Vendor security assessment reviews - Monitor third-party security postures and certifications
• Employee training updates - Ensure staff understand privacy obligations when using AI tools
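The quarterly access review above can be partly automated by comparing granted permissions against a role baseline and flagging anything in excess. A minimal sketch with hypothetical role and permission names:

```python
# Hypothetical role-to-permission baseline; names are illustrative.
ROLE_PERMISSIONS = {
    "analyst": {"chat", "document_analysis"},
    "admin": {"chat", "document_analysis", "knowledge_base_admin"},
}

users = [
    {"name": "jdoe", "role": "analyst", "granted": {"chat", "knowledge_base_admin"}},
    {"name": "asmith", "role": "admin", "granted": {"chat"}},
]

def excess_permissions(user: dict) -> set[str]:
    """Permissions granted beyond what the user's current role allows."""
    return user["granted"] - ROLE_PERMISSIONS[user["role"]]

flags = {u["name"]: sorted(excess_permissions(u))
         for u in users if excess_permissions(u)}
print(flags)  # {'jdoe': ['knowledge_base_admin']}
```

Automated flagging only surfaces candidates; a human reviewer still decides whether each excess grant reflects a changed role or a permission to revoke.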

The Commission d'accès à l'information du Québec has indicated that routine compliance audits will become more common in 2026 under its enforcement powers in Section 70. Organizations must maintain readily accessible documentation demonstrating ongoing Law 25 compliance.

Documentation should include current privacy impact assessments, vendor due diligence records, consent management logs per Section 14, and incident response procedures specifically addressing AI-related privacy risks.


Law 25 compliance for AI tools requires proactive planning and ongoing vigilance under Quebec's provincial privacy framework. Organizations must balance AI capabilities with privacy protection obligations through careful vendor selection, thorough impact assessments per Section 3.3, and robust technical safeguards meeting Section 8 requirements.

Canadian AI platforms provide a compliance-forward approach by eliminating cross-border data transfer complexities under Section 17 while delivering enterprise-grade AI capabilities. Learn more about compliant AI infrastructure at augureai.ca.


About Augure

Augure is a sovereign AI platform for regulated Canadian organizations. Chat, knowledge base, and compliance tools — all running on Canadian infrastructure.
