AI GRC in the Age of Artificial Intelligence: More Questions and More Actions!

It is well known that the Financial Services Industry (FSI) is perhaps the most heavily regulated sector in the world. Artificial Intelligence Governance, Risk and Compliance (AI GRC) is being widely discussed across the FSI in all parts of the globe. To cut through the clutter and focus on relevant actions, Financial Technology Frontiers (FTF) reviewed several articles, regulations and industry trends in AI GRC. We summarize our key findings here.

 

  • Financial Institutions carry a great deal of responsibility and accountability, not only to meet complex regulatory requirements (e.g., multi-jurisdictional obligations such as the EU AI Act, GDPR, Basel III/IV, OSFI, RBI, SEC and FDIC requirements) but also to operate with finesse and avoid the reputational damage that follows lapses. The key challenge is to harmonize these overlapping obligations across AI use cases and blend them into actionable outcomes. This is indeed an uphill task!

 

  • Financial Institutions function on two important human attributes: Trust and Faith. Any dent to these would spell a death knell for an institution, no matter how large or small it may be. In the AI world, these translate into “bias”, “favouritism”, “ethics” and other similar human behavioural attributes. Examples include pricing of products and services, legal liabilities, waiver of clauses for select transactions (and why?), etc. The challenge lies in identifying, assessing and mitigating bias in all AI models in use, throughout the lifecycle of those models, and not just in the design and implementation stages. The objective is to demonstrate fairness and non-discrimination when decisions are based on AI model outputs (a minimal illustrative sketch of one such check appears after this list). Something for the 2nd and 3rd Lines of Defense to ponder.

 

  • Financial Institutions need to ensure that traceability, explainability and auditability are key components whenever AI models are used for their business activities. The challenge lies not just in designing and deploying AI models, but in having answers to “how” the AI models actually work and in the ease with which their actions can be demonstrated. For example, AI models used for credit underwriting, fraud detection, credit scoring or trading decisions must be traceable, auditable and explainable to ensure the outcomes are “acceptable” beyond reasonable doubt (see the audit-trail sketch after this list). A black-box approach just won’t work.

 

  • Financial Institutions rely heavily on resources, services and tools supplied by Third-Party providers, including vendors, outsourced agencies and contractors. The resulting Third-Party risk is inherent. With the proliferation of AI in vendor ecosystems, the challenge lies in undertaking AI-focused vendor due diligence and understanding vendor compliance regimes for AI-driven products and service offerings. Implementing controls to safeguard institutional interests, balancing risks against business objectives and demonstrating a healthy risk posture are non-negotiable. For example, how robust and dependable are the AI models used in common business activities such as Know Your Customer (KYC), lending decision-making and investments? How are these vetted?

 

  • Financial Institutions have mandates to oversee and govern their Cybersecurity risks in the same way as other financial and non-financial risks (e.g., capital adequacy, liquidity, asset-liability, operations, customer service and others). An Audit Board study in 2024 indicates that 43% of EU-based financial institutions report using AI for proactive risk tracking, while 50% struggle to establish incident reporting practices that comply with the EU AI Act. Recent requirements include the need to define and report “material Cybersecurity incidents” to regulators within prescribed timeframes (e.g., SEC, OSFI); a simple deadline-tracking sketch appears after this list. The challenge lies not just in detecting Cybersecurity incidents but in proactively monitoring institutional IT systems and remediating risks within defined timeframes. This sounds easy, but it calls for close coordination among several business functions beyond technology and cybersecurity. Not to be left out are the key role played by external Third-Parties in the overall governance mechanisms and the transfer of risk through Cybersecurity insurance.

 

  • Financial Institutions are governed by their Boards at the highest level. Demonstrating Board-level accountability is a fundamental challenge that is regularly noted in regulatory examinations as well. In the context of AI, two pertinent questions arise: One, how are Board members upskilling and equipping themselves with the requisite background in order to demonstrate their control over the institution? Two, what strategies are Boards adopting to review management decisions on AI model outputs and sign off on those strategic decisions?

 

  • Financial Institutions operate a complex mix of technologies, both old and new. This is unavoidable, since the evolution of banking technologies and their subsequent migrations are time- and cost-intensive. The term “obsolete technology usage” is not uncommon in the FSI space. The challenge lies in connecting this layered technology estate to modern risk management processes driven by AI tools (e.g., AI-based GRC automation tools). Compatibility, process re-engineering, customer-service impact, reporting activities and several other dimensions directly influence an institution’s metamorphosis into a modern and scalable architecture.

 

  • Financial Institutions have new competitors that now offer similar services at unmatched speed and price. Money Service Businesses, crypto-based money managers, private lenders, mortgage companies and Fintech entities make up the bulk of this competition. The McKinsey Global Banking Annual Review, 2025 predicts that banks risk losing up to 28% of their revenue by 2030 to Fintechs and tech firms in the areas of SME banking, payments and unsecured lending. The EY Global Fintech Adoption Index, 2025 pegs global consumer usage of at least one Fintech platform at 68%, and the Federal Reserve Fintech Credit Report, May 2025 states that US SME lending from Fintechs stood at 34% of all SME loans in 2024-25. While these statistics are alarming, they are not surprising. New-age businesses such as Fintech entities use cutting-edge technologies, largely driven by AI. From a customer’s perspective, it is no longer about “brand loyalty” but about the “efficiency of tailored, customized service at the lowest cost and the fastest speed”. The challenge for traditionally established financial institutions is to break the barriers of traditional banking services and embrace the new style of doing business, all the while managing and evolving legacy technologies and processes.

 

  • Financial Institutions employ a large number of skilled resources, both on-roll and outsourced. Awareness and training activities are Business-As-Usual (BAU) in the FSI. The challenge lies in training and retaining talent in an ever-growing, dynamic industry that is witnessing explosive growth in career options. The Deloitte Future of Banking Workforce, 2025 records that 80% of North American financial institutions cite difficulty in retaining tech talent among their top 3 internal risks. The MaRS Discovery District Fintech Talent Report, 2025 remarks that 28% of experienced tech and product staff working in Canadian financial services in 2023–2025 have moved to fintechs, neobanks or tech-driven startups. It is not an exaggeration to state that AI is opening novel and faster methods to upskill oneself in much shorter timeframes than ever before. Combine this with the fast-emerging concept of “solopreneurs”, which enables an individual to control multiple activities using AI Agents, taking productivity levels to new highs. Where is this leading? Will we see employee numbers shrink, or will the nature of employment change to keep pace with the AI-driven world? A “people” risk appears imminent. So, how are Human Resources teams in Financial Institutions gearing up for this shift in human knowledge?
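
To make the fairness point above more concrete, here is a minimal Python sketch of one widely cited check, a disparate-impact ratio computed over grouped model decisions. The group labels, sample data and the 0.8 threshold are illustrative assumptions for this sketch only; a real fairness assessment would span the full model lifecycle and many more metrics and jurisdictions.

```python
# Minimal illustrative sketch: disparate-impact check on AI model decisions.
# Group labels, sample data and the 0.8 threshold are assumptions, not a
# regulatory standard.
from collections import defaultdict

def approval_rates(decisions):
    """decisions: iterable of (group_label, approved: bool) pairs."""
    totals, approved = defaultdict(int), defaultdict(int)
    for group, ok in decisions:
        totals[group] += 1
        approved[group] += int(ok)
    return {g: approved[g] / totals[g] for g in totals}

def disparate_impact_ratios(decisions, reference_group):
    """Each group's approval rate relative to the reference group's."""
    rates = approval_rates(decisions)
    ref = rates[reference_group]
    return {g: rate / ref for g, rate in rates.items()}

if __name__ == "__main__":
    sample = [("A", True), ("A", True), ("A", False),
              ("B", True), ("B", False), ("B", False)]
    ratios = disparate_impact_ratios(sample, reference_group="A")
    flagged = {g: r for g, r in ratios.items() if r < 0.8}  # "four-fifths" heuristic
    print(ratios)   # {'A': 1.0, 'B': 0.5}
    print(flagged)  # {'B': 0.5} -> warrants review by the 2nd Line of Defense
```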
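
The traceability and auditability point can be illustrated in a similar way: an append-only audit record that ties each AI model decision to the model version, a hash of its inputs and an explanation payload. The field names, the hypothetical model name and the shape of the explanation are assumptions for illustration, not a specific product or library API.

```python
# Minimal illustrative sketch: a traceable audit record per AI model decision.
# Field names and the explanation format are assumptions, not a product API.
import hashlib
import json
from datetime import datetime, timezone

def audit_record(model_name, model_version, features, output, explanation):
    """Build one record linking inputs, model version, output and rationale."""
    payload = json.dumps(features, sort_keys=True).encode()
    return {
        "timestamp": datetime.now(timezone.utc).isoformat(),
        "model": model_name,
        "version": model_version,
        "input_hash": hashlib.sha256(payload).hexdigest(),  # tamper-evident link to inputs
        "output": output,
        "explanation": explanation,  # e.g., top feature attributions from the model
    }

if __name__ == "__main__":
    record = audit_record(
        model_name="credit_underwriting",  # hypothetical model name
        model_version="2.3.1",
        features={"income": 85000, "tenure_months": 30},
        output={"decision": "approve", "score": 0.81},
        explanation=[["income", 0.42], ["tenure_months", 0.17]],
    )
    print(json.dumps(record, indent=2))  # would be appended to an immutable audit store
```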
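
Finally, the incident-reporting point can be made concrete with a small deadline-tracking sketch. The reporting windows below are illustrative assumptions only; actual obligations (e.g., the SEC’s business-day rules or OSFI’s reporting triggers) differ in detail and must be confirmed against the current regulatory texts.

```python
# Minimal illustrative sketch: regulator-specific reporting deadlines once an
# incident is assessed as material / reportable. The windows are assumptions
# for illustration; real rules (business days, triggers) differ by regulator.
from datetime import datetime, timedelta, timezone

REPORTING_WINDOWS = {
    "SEC": timedelta(days=4),     # illustrative; ignores business-day nuances
    "OSFI": timedelta(hours=24),  # illustrative
}

def reporting_deadlines(determination_time):
    """Map each regulator to the latest time a report should be filed."""
    return {regulator: determination_time + window
            for regulator, window in REPORTING_WINDOWS.items()}

if __name__ == "__main__":
    determined = datetime(2025, 3, 3, 15, 0, tzinfo=timezone.utc)
    for regulator, deadline in reporting_deadlines(determined).items():
        print(f"{regulator}: report by {deadline.isoformat()}")
```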

 

Conclusion:

 

This is a condensed list of FTF’s findings and analysis of the challenges the FSI is facing.

Financial Institutions have a substantial list of strategic and tactical objectives on hand, and they are cognizant of them. It remains to be seen how intelligently and smartly financial institutions navigate the new paradigm of an AI-driven financial services space. Which of these challenges and needs should be prioritized over the others? The questions are daunting and difficult to answer. Collaboration, ecosystem building, practice sharing and a consistent approach across the FSI appear to be good pathways to tackling these challenges.

At FTF, we encourage you to share your views and contribute to building a solid financial services ecosystem. Our Hi2AI program is our storefront for discussing these findings and several others. So what is your take? Get in touch for engaging and immersive conversations on these findings, and add your own.

 

Authored by: Narasimham Nittala.