Compliance

Law 25 compliance checklist for AI tools in 2026

Essential Law 25 requirements for Quebec organizations using AI tools: data residency, consent, impact assessments, and vendor compliance.

By Augure

Law 25 compliance for AI tools requires specific attention to data residency, consent mechanisms, and privacy impact assessments. Quebec organizations must evaluate each AI platform against sections 3(5), 17, and 12-14 of the Act respecting the protection of personal information in the private sector. The key question isn't whether you use AI — it's whether your chosen tools meet Quebec's enhanced privacy standards that took full effect in September 2024.


Privacy impact assessment requirements

Law 25 section 3(5) mandates privacy impact assessments (PIAs) for AI tools that present "high risk to the privacy of the persons concerned." Most enterprise AI platforms qualify under this threshold, with section 3(6) specifying that automated decision-making systems explicitly require PIAs.

Your PIA must address three core elements before deployment. First, document the specific personal information your AI tool will process, including any indirect data collection through conversation logs or document uploads. Second, assess the necessity and proportionality of this processing against your legitimate business interests under section 12 requirements.

Third, identify technical and organizational safeguards to minimize privacy risks. This includes data minimization settings, access controls, and retention policies specific to your AI implementation that satisfy section 10's accuracy and security obligations.
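The three core elements above are easiest to verify when each assessment is captured as a structured record rather than free-form prose. A minimal sketch in Python; the field names are illustrative choices of ours, not terms prescribed by the Act:

```python
from dataclasses import dataclass, field

@dataclass
class PiaRecord:
    """Illustrative privacy impact assessment record for one AI tool."""
    tool_name: str
    # 1. Personal information processed, including indirect collection
    data_categories: list = field(default_factory=list)
    indirect_collection: list = field(default_factory=list)  # e.g. chat logs, uploads
    # 2. Necessity and proportionality of the processing
    purpose: str = ""
    necessity_justification: str = ""
    # 3. Technical and organizational safeguards
    safeguards: list = field(default_factory=list)

    def is_complete(self) -> bool:
        """All three core elements must be documented before deployment."""
        return bool(self.data_categories and self.purpose
                    and self.necessity_justification and self.safeguards)

pia = PiaRecord(
    tool_name="document-summarizer",
    data_categories=["client names", "contract terms"],
    indirect_collection=["conversation logs"],
    purpose="summarize legal documents for internal review",
    necessity_justification="manual review infeasible at current volume",
    safeguards=["role-based access", "30-day retention", "no model training"],
)
assert pia.is_complete()
```

A completeness check like this makes gaps visible before an AI tool ships, rather than during a regulator's review.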

Law 25 section 3(5) requires PIAs for "high risk" processing activities, not just high-risk tools. The same AI platform may require different assessments depending on how each department uses it, with penalties under section 91 reaching C$25 million for non-compliance.

The Commission d'accès à l'information du Québec (CAI) has indicated that automated decision-making, extensive profiling, and processing of sensitive categories under section 12 trigger the PIA requirement. AI tools used for HR screening, customer segmentation, or legal document analysis typically fall within this scope.


Data residency and transfer requirements

Section 17 of Law 25 sets strict conditions for transferring personal information outside Quebec. AI platforms routing data through US infrastructure face particular scrutiny under the enhanced framework, with additional requirements beyond federal PIPEDA transfer provisions.

You must ensure adequate protection equivalent to Law 25 standards. For US-based AI providers, this typically requires explicit consent from individuals under section 14 or demonstration of compelling legitimate interest under section 18, plus robust contractual safeguards.

Canadian data residency eliminates most transfer compliance complexity. Platforms like Augure that maintain 100% Canadian infrastructure avoid section 17 requirements entirely, as data never crosses international boundaries while still meeting both Law 25 and PIPEDA obligations.

Document your transfer assessment in writing. Include the legal basis (consent, legitimate interest, or necessity), destination country analysis, and specific contractual protections. The CAI expects this documentation during compliance reviews under section 70 inspection powers.
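A written transfer assessment can follow the same pattern: record the legal basis, destination analysis, and contractual protections in one place so the file is ready when the CAI asks. A hypothetical sketch (the field names and the periodic-review flag are our assumptions, not regulatory terms):

```python
ALLOWED_BASES = {"consent", "legitimate_interest", "necessity"}

def transfer_assessment(destination: str, legal_basis: str,
                        adequacy_notes: str, contractual_safeguards: list) -> dict:
    """Build a written transfer-assessment record; reject unrecognized legal bases."""
    if legal_basis not in ALLOWED_BASES:
        raise ValueError(f"unsupported legal basis: {legal_basis}")
    return {
        "destination_country": destination,
        "legal_basis": legal_basis,
        "adequacy_analysis": adequacy_notes,
        "contractual_safeguards": contractual_safeguards,
        # Foreign transfers need periodic re-assessment, not a one-time sign-off
        "requires_periodic_review": destination.lower() != "canada",
    }

record = transfer_assessment(
    destination="United States",
    legal_basis="consent",
    adequacy_notes="CLOUD Act exposure; provider offers contractual safeguards",
    contractual_safeguards=["encryption in transit and at rest",
                            "breach notification within 72 hours"],
)
```

The `requires_periodic_review` flag encodes the point made below: a foreign-transfer determination is never final.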

International data transfers under Law 25 section 17 require ongoing monitoring beyond initial adequacy assessments. Changes to foreign surveillance laws or provider corporate structure can invalidate your original compliance determination, creating ongoing liability under section 91 penalties.


Consent and transparency obligations

Law 25 sections 12-14 strengthen consent requirements for AI processing beyond federal PIPEDA standards. The "clear and simple language" standard in section 13 applies directly to how you explain AI capabilities to users and customers.

Your privacy notice must specify the AI tool's purpose, data sources, and any automated decision-making capabilities under section 8 collection notice requirements. Generic language about "improving services through technology" doesn't meet Law 25's specificity requirements and invites section 91 penalties.

For workplace AI deployments, employee consent remains valid only when truly voluntary under section 14. The CAI has signaled that mandatory AI tools for performance evaluation or monitoring require alternative legal bases under section 12's legitimate interest provisions.

Implement granular consent controls where feasible. Users should understand whether their data trains AI models, gets retained for persistent memory features, or processes through third-party APIs. These distinctions matter for section 13 compliance and avoiding section 91 penalties up to C$25 million.
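Granular consent is easiest to enforce when each processing purpose is a separate, default-off flag rather than one blanket checkbox. A minimal sketch, with purpose names chosen for illustration:

```python
from dataclasses import dataclass

@dataclass
class ConsentFlags:
    """Per-user consent switches for distinct AI processing purposes."""
    model_training: bool = False       # may the user's data train AI models?
    persistent_memory: bool = False    # may it be retained for memory features?
    third_party_apis: bool = False     # may it flow through external APIs?

def permitted_purposes(flags: ConsentFlags) -> set:
    """Return only the purposes the user has actively opted into."""
    return {name for name, granted in vars(flags).items() if granted}

flags = ConsentFlags(persistent_memory=True)
assert permitted_purposes(flags) == {"persistent_memory"}
```

Defaulting every flag to `False` means a missing or stale consent record can never silently authorize processing.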


Vendor due diligence framework

Law 25 section 3.3 establishes your responsibility for third-party processors, including AI platform providers. Due diligence goes beyond standard contract terms to assess actual privacy practices under Quebec's provincial jurisdiction.

Request detailed information about data processing locations, sub-processor arrangements, and security certifications meeting section 10 requirements. AI companies operating under foreign jurisdiction (particularly US CLOUD Act exposure) require enhanced scrutiny beyond federal PIPEDA due diligence standards.

Evaluate the provider's privacy governance framework. Look for dedicated privacy officers, regular compliance audits, and transparent incident response procedures meeting section 3.5 breach notification requirements. Generic privacy policies don't substitute for detailed processing agreements under section 18.

Establish data processing agreements (DPAs) that specify Law 25 compliance requirements. Include data subject rights procedures under sections 23-37, breach notification timelines per section 3.5, and audit provisions. The agreement should address AI-specific risks like model training and algorithmic bias.

Your Law 25 liability under section 91 doesn't transfer to AI vendors. Section 3.3 makes clear that Quebec organizations remain accountable for third-party processing, with penalties up to C$25 million regardless of contractual arrangements with processors.


Data subject rights implementation

Sections 23-37 of Law 25 expand individual rights beyond federal PIPEDA requirements, and those expanded rights directly impact AI tool selection and configuration. Your chosen platform must support these rights technically, not just contractually, to avoid section 91 penalties.

Right of access under section 23 requires the ability to identify all personal information processed by AI systems, including derived insights and automated decision outcomes. Generic responses don't satisfy this requirement and can expose enterprises to administrative penalties of up to C$10 million.

Right of rectification under section 26 becomes complex with AI systems that create persistent profiles or memory. You need mechanisms to correct not just source data, but any AI-generated inferences or classifications affecting individual rights.

Right to data portability under section 28 applies to AI-processed information in structured format. Consider how your AI platform handles export requests for conversation histories, document annotations, or user preference profiles within Quebec's provincial framework.

Implement automated data subject request workflows where possible. Manual processes become unmanageable as AI usage scales across your organization.
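An automated workflow can start as simply as tracking each request against its statutory clock: Law 25, like PIPEDA, expects a response within 30 days. A minimal sketch (request-type names are ours):

```python
from datetime import date, timedelta

RESPONSE_WINDOW = timedelta(days=30)  # statutory 30-day response deadline

def open_request(kind: str, received: date) -> dict:
    """Create a tracked data-subject request with its due date."""
    if kind not in {"access", "rectification", "portability"}:
        raise ValueError(f"unknown request type: {kind}")
    return {"kind": kind, "received": received,
            "due": received + RESPONSE_WINDOW, "status": "open"}

def overdue(requests: list, today: date) -> list:
    """Surface open requests past their due date for escalation."""
    return [r for r in requests if r["status"] == "open" and today > r["due"]]

reqs = [open_request("access", date(2026, 1, 5)),
        open_request("portability", date(2026, 2, 20))]
late = overdue(reqs, date(2026, 2, 10))
assert [r["kind"] for r in late] == ["access"]
```

Even this skeleton gives you an auditable queue; a production version would add identity verification and fulfillment tracking per request type.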


Automated decision-making controls

Law 25 section 12.1 requires explicit notification when AI systems make decisions that significantly affect individuals. This extends beyond traditional automated processing to include AI-assisted decision-making under Quebec's provincial privacy regime.

Document your AI decision-making processes clearly. Include human oversight procedures, appeal mechanisms, and explanation capabilities meeting section 12.1 requirements. The law requires "meaningful information about the logic involved" in AI decisions, with section 91 penalties for non-compliance.

For high-impact decisions (hiring, credit, healthcare), implement human review requirements under section 12.1. The AI recommendation should inform human judgment, not replace it entirely. Document this distinction in your procedures to satisfy CAI inspection requirements.
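Keeping the AI recommendation and the human determination in one record makes that distinction documentable. A hypothetical sketch: the score, threshold, and field names are illustrative, and the key property is that no high-impact outcome exists without a named reviewer:

```python
def decide(ai_score: float, reviewer: str, threshold: float = 0.5) -> dict:
    """High-impact decisions: the AI output informs, a named human decides.

    Returns a record capturing both the recommendation and the logic
    behind it, supporting 'meaningful information about the logic
    involved' explanation requirements.
    """
    if not reviewer:
        raise ValueError("high-impact decision requires a named human reviewer")
    recommendation = "approve" if ai_score >= threshold else "reject"
    return {
        "ai_recommendation": recommendation,
        "ai_score": ai_score,
        "reviewed_by": reviewer,
        "logic_summary": f"score {ai_score:.2f} vs threshold {threshold}",
    }

record = decide(ai_score=0.72, reviewer="hr-lead@example.com")
assert record["ai_recommendation"] == "approve"
```

The human reviewer remains free to override the recommendation; the record simply guarantees both inputs to the final decision are preserved.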

Consider algorithmic bias monitoring for AI tools processing protected characteristics. While Law 25 doesn't explicitly require bias testing, section 9's accuracy principle creates implicit obligations for fair AI outcomes.


Record-keeping and audit requirements

Law 25 section 3.2 requires detailed records of processing activities involving AI tools. These records must demonstrate compliance with all applicable Law 25 provisions, not just document general AI usage, to withstand section 70 CAI inspections.

Maintain processing registers that specify AI tool purposes, data categories, retention periods under section 11, and international transfers per section 17 for each implementation. Generic entries for "artificial intelligence" don't meet regulatory expectations.
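Register entries stay specific when the schema forces the point. A hypothetical sketch that rejects both incomplete entries and the generic "artificial intelligence" label (required-field names are our assumptions):

```python
REQUIRED_FIELDS = {"purpose", "data_categories", "retention_period",
                   "international_transfers"}

def register_entry(tool: str, **fields) -> dict:
    """One processing-register entry per AI implementation."""
    missing = REQUIRED_FIELDS - fields.keys()
    if missing:
        raise ValueError(f"incomplete entry for {tool}: missing {sorted(missing)}")
    if fields["purpose"].strip().lower() in {"artificial intelligence", "ai"}:
        raise ValueError("purpose must be specific, not a generic 'AI' label")
    return {"tool": tool, **fields}

entry = register_entry(
    "contract-review-assistant",
    purpose="flag non-standard clauses in supplier contracts",
    data_categories=["supplier contact details", "contract terms"],
    retention_period="12 months",
    international_transfers="none - Canadian infrastructure",
)
```

Validating at write time is cheaper than discovering a vague register during a section 70 inspection.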

Document privacy impact assessments per section 3(5), consent collection procedures under sections 12-14, and data subject rights responses per sections 23-37 specific to AI processing. The CAI expects these records during inspections or complaint investigations under section 70 powers.

Establish audit procedures for AI processing activities. Regular compliance reviews should assess consent validity, data minimization effectiveness under section 5, and security control adequacy per section 10. These reviews provide defensive documentation for regulatory inquiries and section 91 penalty mitigation.

Augure's Canadian infrastructure and built-in Law 25 compliance controls simplify record-keeping requirements under section 3.2 by eliminating international transfer complexity and providing native Quebec privacy law safeguards without US CLOUD Act exposure.


Implementation timeline and priorities

Start with inventory and risk assessment of current AI tools against Law 25 section 3(5) PIA requirements. Many organizations discover compliance gaps in existing implementations that require immediate attention to avoid section 91 penalties.

Prioritize high-risk AI applications first: those processing sensitive information under section 12, making automated decisions per section 12.1, or involving extensive personal data analysis. These typically require full PIA processes and enhanced safeguards under Quebec's provincial framework.

Develop standardized evaluation criteria for new AI tool acquisitions that include Law 25 compliance as a mandatory requirement, not an optional consideration. This prevents compliance debt accumulation as AI adoption expands while maintaining both provincial Law 25 and federal PIPEDA obligations.

Plan for ongoing compliance monitoring under section 70 inspection readiness, not just initial implementation. AI capabilities evolve rapidly, and new features may trigger additional Law 25 obligations that weren't present during original deployment.

Quebec organizations need AI platforms built for Canadian regulatory requirements, not retrofitted for compliance. The framework exists to protect individual privacy while enabling legitimate business innovation — choose tools that support both objectives from the ground up.

Ready to evaluate your AI compliance posture? Visit augureai.ca to explore how sovereign Canadian AI infrastructure simplifies Law 25 compliance requirements.


About Augure

Augure is a sovereign AI platform for regulated Canadian organizations. Chat, knowledge base, and compliance tools — all running on Canadian infrastructure.
