The AI Competency Gap Analysis: FCA Compliance Checklist
Your 10-Point Framework for SM&CR Compliance in the Age of AI
The FCA regulates AI through its existing principles: the risk of AI deployment rests on your competence to govern it. This checklist identifies the 10 most critical accountability gaps that could expose you or your firm under the Senior Managers & Certification Regime (SM&CR) and the Consumer Duty.
Section A: Accountability and Governance (SM&CR Focus)
The Risk: Delegating work to AI tools does not delegate accountability. You must demonstrate "reasonable steps" to oversee AI risk (Senior Manager Conduct Rules SC2 and SC3).
Competency Assessment Points 1-3
Point 1: Designated Responsibility
Competency Question:
Is a specific Senior Manager (SMF holder) clearly and formally responsible for the end-to-end governance and competency standards relating to all AI tools used by the firm?
Score (Yes/No/N/A): ☐ Yes ☐ No ☐ N/A
Gap/Risk Area:
SM&CR Accountability Map: Failure to clearly assign AI risk ownership.
Point 2: Policy & Usage
Competency Question:
Does your firm have a current, published, and enforced AI Acceptable Use Policy that specifically restricts employees from uploading confidential client data (e.g., in a fact find) into public, unvetted GenAI tools (e.g., ChatGPT)?
Score (Yes/No/N/A): ☐ Yes ☐ No ☐ N/A
Gap/Risk Area:
Data Protection / Confidentiality: Uncontrolled 'Shadow AI' usage creating data leakage risk.
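Illustrative sketch (optional): one way a firm might technically back up such a policy is a screening layer placed in front of any external GenAI call. The Python sketch below is a minimal, assumption-laden example; the regex patterns and the submit_to_public_genai gatekeeper are hypothetical and are no substitute for a vetted data loss prevention tool.

import re

# Hypothetical patterns a firm might treat as confidential client data.
# A real control would rely on a vetted DLP tool, not three illustrative regexes.
BLOCKED_PATTERNS = {
    "possible National Insurance number": re.compile(r"\b[A-Z]{2}\d{6}[A-D]\b", re.IGNORECASE),
    "possible UK sort code": re.compile(r"\b\d{2}-\d{2}-\d{2}\b"),
    "email address": re.compile(r"\b[\w.+-]+@[\w-]+\.[\w.-]+\b"),
}

def screen_prompt(prompt: str) -> list[str]:
    """Return the reasons this prompt must not leave the firm's perimeter."""
    return [label for label, pattern in BLOCKED_PATTERNS.items() if pattern.search(prompt)]

def submit_to_public_genai(prompt: str) -> str:
    """Gatekeeper in front of any call to an external, unvetted GenAI tool."""
    findings = screen_prompt(prompt)
    if findings:
        # Block and record the attempt rather than transmit confidential client data.
        raise PermissionError("Blocked by AI Acceptable Use Policy: " + ", ".join(findings))
    return f"Prompt of {len(prompt)} characters cleared for external submission."

if __name__ == "__main__":
    try:
        submit_to_public_genai("Summarise the fact find for QQ123456C, contact jo.bloggs@example.com")
    except PermissionError as exc:
        print(exc)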
Point 3: Staff Certification
Competency Question:
For Certified Staff (e.g., Financial Advisers) using AI to generate client-facing content (e.g., a suitability report draft), can the firm demonstrate that they have received formal, auditable training on the specific risks (e.g., hallucination, bias) of that tool?
Score (Yes/No/N/A): ☐ Yes ☐ No ☐ N/A
Gap/Risk Area:
Certification Regime: Failure to assess fitness & propriety for AI-enhanced roles.
Section B: Transparency and Traceability of Rationale (Consumer Duty Focus)
The Risk: You must be able to explain the rationale behind advice given, even when it was supported by an AI tool. A lack of traceability undermines the obligations to ensure fair value and avoid foreseeable harm under the Consumer Duty (Principle 12).
Competency Assessment Points 4-6

Point 4: Traceability of Rationale
Competency Question:
For any AI tool that directly influences a client-facing decision (e.g., portfolio allocation or risk scoring), is there a documented process that allows the adviser to capture and verify the AI's underlying rationale?
Score (Yes/No/N/A): ☐ Yes ☐ No ☐ N/A
Gap/Risk Area:
Model Transparency: Inability to provide a 'human-readable' audit trail for AI-driven advice.
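Illustrative sketch (optional): a documented process for Point 4 could be as simple as a structured audit record that captures the tool's stated rationale alongside the adviser's own verification. The Python sketch below, using only the standard library, shows one hypothetical shape for such a record; the RationaleRecord fields and the fingerprint hash are assumptions, not a prescribed format.

from dataclasses import dataclass, field, asdict
from datetime import datetime, timezone
import hashlib
import json

@dataclass
class RationaleRecord:
    """One auditable entry linking an AI-influenced decision to its stated rationale."""
    client_ref: str
    tool_name: str
    tool_version: str
    ai_recommendation: str
    ai_rationale: str            # the tool's reasoning, captured verbatim at the time
    adviser_id: str
    adviser_verification: str    # the adviser's note confirming (or overriding) the rationale
    created_at: str = field(default_factory=lambda: datetime.now(timezone.utc).isoformat())

    def fingerprint(self) -> str:
        """SHA-256 of the record so later tampering with the audit trail is detectable."""
        return hashlib.sha256(json.dumps(asdict(self), sort_keys=True).encode()).hexdigest()

# Hypothetical example: adviser records why the tool suggested an allocation and confirms it.
record = RationaleRecord(
    client_ref="CLIENT-0042",
    tool_name="portfolio_tool",      # hypothetical in-house tool
    tool_version="2.3.1",
    ai_recommendation="60/40 equity/bond allocation",
    ai_rationale="Medium risk tolerance, 15-year horizon, no near-term liquidity needs.",
    adviser_id="ADV-117",
    adviser_verification="Checked against the fact find; rationale consistent, recommendation adopted.",
)
print(record.fingerprint())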

Point 5: Output Verification
Competency Question:
Is a 'human in the loop' mandatory, with documented sign-off, for every AI-generated output (text, data, or recommendation) that is presented to a client or used for regulatory reporting?
Score (Yes/No/N/A): ☐ Yes ☐ No ☐ N/A
Gap/Risk Area:
Hallucination Risk: Reliance on unverified AI outputs leading to client detriment or misreporting.
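Illustrative sketch (optional): the 'human in the loop' requirement in Point 5 can be enforced in software by refusing to release any AI-generated draft that lacks a named, timestamped sign-off. The Python sketch below is a minimal illustration; AiOutput, sign_off, and release_to_client are hypothetical names, not a reference to any particular system.

from dataclasses import dataclass
from datetime import datetime, timezone
from typing import Optional

@dataclass
class AiOutput:
    """An AI-generated draft that must not reach a client without sign-off."""
    content: str
    generated_by: str                  # e.g. the tool and version that produced the draft
    approved_by: Optional[str] = None
    approved_at: Optional[str] = None

    def sign_off(self, reviewer_id: str) -> None:
        """Record the named human reviewer and a UTC timestamp."""
        self.approved_by = reviewer_id
        self.approved_at = datetime.now(timezone.utc).isoformat()

def release_to_client(output: AiOutput) -> str:
    """Refuse to release anything that lacks a documented human sign-off."""
    if output.approved_by is None:
        raise RuntimeError("Release blocked: no documented human-in-the-loop sign-off.")
    return f"Released; approved by {output.approved_by} at {output.approved_at}"

draft = AiOutput(content="Draft suitability report ...", generated_by="report_drafter v1.4")
draft.sign_off(reviewer_id="ADV-117")
print(release_to_client(draft))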

Point 6: Client Disclosure
Competency Question:
Do your client agreements or terms of business clearly and accurately disclose the extent to which AI tools are used in the provision of advice (including any limitations or risks associated with that use)?
Score (Yes/No/N/A): ☐ Yes ☐ No ☐ N/A
Gap/Risk Area:
Misleading Communication: Breach of Consumer Duty by obscuring the role and limits of AI.
Section C: Fairness and Robustness (Operational Risk Focus)
The Risk: Bias in AI can amplify discrimination, leading to unfair outcomes for vulnerable customers or to market abuse. Poor testing risks breaching Principles 2 and 3 (skill, care and diligence; management and control).
Competency Assessment Points 7-10
Point 7: Bias Audit
Competency Question:
Has the AI model been tested with datasets representing diverse client demographics to proactively check for algorithmic bias that could disadvantage groups of vulnerable customers (e.g., on pricing, access, or advice)?
Score (Yes/No/N/A): ☐ Yes ☐ No ☐ N/A
Gap/Risk Area:
Fairness / Discrimination: AI embedding or amplifying bias into business practice.
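Illustrative sketch (optional): a first-pass bias screen for Point 7 might compare favourable-outcome rates across client groups and flag any group that falls well below the best-served one. The Python sketch below assumes hypothetical decision records with a 'group' and a 'favourable' field; the 0.8 threshold is a rule-of-thumb screen, not a full fairness assessment.

from collections import defaultdict

def outcome_rates(decisions: list[dict]) -> dict[str, float]:
    """Favourable-outcome rate per demographic group."""
    totals, favourable = defaultdict(int), defaultdict(int)
    for d in decisions:
        totals[d["group"]] += 1
        favourable[d["group"]] += int(d["favourable"])
    return {group: favourable[group] / totals[group] for group in totals}

def disparity_flags(rates: dict[str, float], threshold: float = 0.8) -> list[str]:
    """Flag groups whose rate falls below `threshold` times the best-served group's rate."""
    best = max(rates.values())
    return [group for group, rate in rates.items() if best > 0 and rate / best < threshold]

# Illustrative records: each row is one AI-influenced decision about a client.
sample = [
    {"group": "under_40", "favourable": True},
    {"group": "under_40", "favourable": True},
    {"group": "under_40", "favourable": True},
    {"group": "over_70", "favourable": True},
    {"group": "over_70", "favourable": False},
    {"group": "over_70", "favourable": False},
]
rates = outcome_rates(sample)
print(rates, "flagged:", disparity_flags(rates))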

Point 8: Vendor Oversight
Competency Question:
If you use a third-party AI provider, do your contracts require the vendor to notify you immediately of any significant model updates, retraining cycles, or changes in performance that could impact your compliance?
Score (Yes/No/N/A): ☐ Yes ☐ No ☐ N/A
Gap/Risk Area:
Outsourcing Risk: Failure to actively monitor the operational and compliance health of third-party AI.
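Illustrative sketch (optional): vendor oversight for Point 8 can be supported by recording what was last verified about the third-party model and flagging any change for compliance review. The Python sketch below uses hypothetical VendorModelSnapshot fields and vendor details; the 2% benchmark tolerance is an arbitrary illustration.

from dataclasses import dataclass
from datetime import date

@dataclass(frozen=True)
class VendorModelSnapshot:
    """What the firm last verified about a third-party AI service."""
    vendor: str
    model_version: str
    last_retrained: date
    benchmark_score: float      # the firm's own acceptance-test result

def review_required(baseline: VendorModelSnapshot, latest: VendorModelSnapshot,
                    tolerance: float = 0.02) -> list[str]:
    """Return the reasons a compliance review should be triggered."""
    reasons = []
    if latest.model_version != baseline.model_version:
        reasons.append("model version changed")
    if latest.last_retrained != baseline.last_retrained:
        reasons.append("retraining cycle occurred")
    if baseline.benchmark_score - latest.benchmark_score > tolerance:
        reasons.append("benchmark performance degraded beyond tolerance")
    return reasons

baseline = VendorModelSnapshot("ExampleVendor", "4.1", date(2025, 1, 10), 0.93)
latest = VendorModelSnapshot("ExampleVendor", "4.2", date(2025, 4, 2), 0.90)
print(review_required(baseline, latest))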
Point 9: Data Quality
Competency Question:
Do you have robust processes to ensure the data used to train, refine, or prompt your AI tools is accurate, timely, and free from material errors that could lead to non-compliant or poor-quality advice?
Score (Yes/No/N/A): ☐ Yes ☐ No ☐ N/A
Gap/Risk Area:
Garbage In, Regulatory Fine Out: Poor data integrity undermining model reliability.
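Illustrative sketch (optional): a basic data quality screen for Point 9 could check for missing mandatory fields and stale records before client data feeds an AI tool. The Python sketch below assumes hypothetical client records with a 'last_reviewed' date; a real pipeline would need far richer validation.

from datetime import date, timedelta

def data_quality_issues(records: list[dict], required_fields: tuple[str, ...],
                        max_age_days: int = 365) -> list[str]:
    """Basic screen for missing fields and stale data before records feed an AI tool."""
    issues = []
    today = date.today()
    for i, rec in enumerate(records):
        for field_name in required_fields:
            if rec.get(field_name) in (None, ""):
                issues.append(f"record {i}: missing '{field_name}'")
        reviewed = rec.get("last_reviewed")
        if reviewed and (today - reviewed) > timedelta(days=max_age_days):
            issues.append(f"record {i}: last reviewed more than {max_age_days} days ago")
    return issues

# Hypothetical client records drawn from a fact-find system.
clients = [
    {"client_ref": "CLIENT-0042", "risk_profile": "medium", "last_reviewed": date(2023, 6, 1)},
    {"client_ref": "CLIENT-0043", "risk_profile": "", "last_reviewed": date.today()},
]
print(data_quality_issues(clients, required_fields=("client_ref", "risk_profile")))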

Point 10: Regulatory Pace
Competency Question:
Has your firm established a system to track the rapid advancement of AI models (e.g., the reported ~100% capability increase every 3.3 months) and mandate a corresponding review of your internal AI policies every quarter?
Score (Yes/No/N/A): ☐ Yes ☐ No ☐ N/A
Gap/Risk Area:
Pace vs. Policy Gap: Policy and governance lagging dangerously behind the speed of technological evolution.
Next Step: The Critical Realisation
Total the "No" scores above. Each 'No' represents a tangible, documented failure point against the FCA's existing expectations for governance, competence, and customer protection.
Your firm may be innovating, but is it protected?
The Solution
The AI Literacy Toolkit for Financial Professionals provides the CPD-accredited, sector-specific training and governance framework required to answer "YES" to all 10 of these crucial questions, turning an existential risk into an auditable compliance advantage.
© 2025 Clarendon Training Ireland Limited
Disclaimer:
No liability is accepted for losses or regulatory consequences arising from use of this checklist. It should be customised to your firm's specific requirements and regulatory obligations, and professional legal and compliance advice should be sought before implementation. Regular updates are essential as the regulatory landscape continues to evolve.