- Written by - Phil Cuming
If you’re still a sceptic when it comes to AI’s growing dominance in the financial services industry, head over to Google Trends, type in ‘AI finance’, set the dropdowns to ‘Worldwide’ and ‘Past 5 Years’ and you should see the following:
Note the date this search term started to trend upward. December 2022, roughly a month after Chat GPT became available to the public.
The Age of AI has begun. And leading the charge is Generative AI (Gen AI), which uses machine learning models to learn patterns from large sets of data and then uses those patterns to generate new content by predicting what comes next.
The application of this technology in the context of financial services is vast. It’s already fundamentally changing everything from risk management and reporting to financial planning and personalising the customer experience.
Make no mistake, we are in the midst of an AI Gold Rush.
According to a recent Sapio Research report, a staggering 63% of finance professionals are currently using AI. The second-highest group is ‘IT’ at 44%.
By now, most finance professionals are well aware of the standard set of risks associated with Gen AI. In this piece, we’re going one level deeper to uncover some of the nuances you may not be aware of in the following three categories:
- Data Privacy and Security Concerns
- Risk of Bias and Discrimination
- Job Displacement and Workforce Adaptation
However, it’s not all doom and gloom. From an opportunities standpoint, there’s a lot to get excited about when it comes to Gen AI, namely:
- Enhancing Customer Service and Personalisation
- Optimising Investment Strategies
- Improving Operational Efficiency
Let’s dive in…
The Top 3 Generative AI Risks for Finance Professionals
1. Data Privacy and Security Concerns
Of all the potential minefields to navigate when adopting Gen AI, data privacy and security are perhaps the biggest ones.
We’ve identified three pitfalls to be aware of and have included some smart questions to ask your Gen AI provider that could save your firm countless millions in potential fines, not to mention the reputational damage that could occur.
Compliance with Data Protection Laws
Risk: Financial institutions may face severe fines (e.g., up to €20 million or 4% of annual turnover under GDPR) if they cannot demonstrate compliance with data protection regulations when deploying AI.
Specific concerns:
- Cross-Border Data Transfers: Many AI tools are hosted in global cloud environments. This poses risks related to transferring data across jurisdictions with differing legal standards, potentially violating privacy laws.
- Auditability and Explainability: AI models are often described as “black boxes,” making it difficult to audit their decisions or trace how data was processed, a requirement under GDPR Article 22.
Smart Questions to Ask Your Gen AI Provider:
- How does your platform ensure compliance with GDPR, CCPA, and other global data protection laws, particularly for cross-border data transfers?
- Can you provide a detailed explanation of how the model processes, stores, and deletes data to align with regulatory requirements?
- Does your system include features to ensure audibility and explainability, such as logs or tools for understanding decision-making processes?
- Where are your servers located, and how do you address data residency requirements for jurisdictions with strict local storage laws?
Adversarial Attacks and Model Exploitation
Risk: Without robust defences like differential privacy and secure API access, adversaries could compromise AI models, leading to data breaches.
This is because AI systems are vulnerable to adversarial attacks, where malicious actors manipulate inputs to trick the AI into revealing sensitive information or making erroneous predictions.
Specific concerns:
- Model Inversion Attacks: Hackers can reverse-engineer AI models to extract training data, potentially exposing sensitive financial data used during training.
- Prompt Injection Attacks: Malicious prompts can exploit AI to reveal restricted information or perform unauthorised actions.
Smart Questions to Ask Your Gen AI Provider:
- What measures are in place to prevent and mitigate model inversion attacks, and how do you secure sensitive training data from reverse engineering?
- How does your system defend against prompt injection attacks or other adversarial inputs designed to manipulate the AI?
- Does your platform implement differential privacy or similar techniques to anonymise training data and limit data leakage?
- What real-time monitoring and alert systems do you have to detect and respond to potential adversarial attacks?
- Can you provide examples of how your system has been tested for vulnerabilities, such as penetration testing or adversarial attack simulations?
Data Residency Requirements
Risk: Ignoring data residency laws can result in operational disruptions, regulatory sanctions, or forced cessation of AI deployments in specific regions.
Many countries have laws mandating that financial data must be stored and processed within their borders (e.g., China’s Cybersecurity Law). Gen AI systems hosted on global cloud platforms may inadvertently breach these requirements.
Specific concerns:
- Cloud Data Sovereignty: Hosting sensitive data on servers located outside a country’s jurisdiction could lead to non-compliance with local laws.
- Vendor Dependency: Financial institutions relying on third-party AI providers may lack control over data storage locations.
Smart Questions to Ask Your Gen AI Provider:
- Can you confirm the exact locations of your data centers and how they align with data residency requirements in the jurisdictions where we operate?
- What mechanisms are in place to restrict data storage and processing to specific regions or countries?
- Does your platform allow for on-premises or hybrid cloud deployments to ensure compliance with strict local data residency laws?
- How do you handle cross-border data flows, and what measures are in place to ensure compliance with regional data sovereignty regulations?
- In the event of regulatory changes, how quickly can you adapt to ensure ongoing compliance with updated data residency requirements?
2. Risk of Bias and Discrimination
We see it in the world of recruitment all the time. Businesses adopt Gen AI tech to help them weed through hundreds of job applications and shortlist worthy candidates only for that model to return biased or discriminatory outcomes due to systemic biases and flaws in the data it was trained on.
The two most important pitfalls to be aware of are:
Bias in Training Data
Risk: If training datasets contain historical biases, these will likely be reflected in the AI’s outputs. In financial services, this can result in discriminatory practices, such as inequitable loan approvals or credit scoring, perpetuating systemic inequities.
Specific Concerns:
- Historical Inequities: AI systems may reinforce historical patterns of discrimination found in financial data (e.g., redlining in mortgage lending).
- Blind Spots in Data: Missing data about specific populations or scenarios can result in models that fail to generalise fairly.
- Amplification of Bias: Gen AI may unintentionally magnify subtle biases during iterative processes like model fine-tuning.
Smart Questions to Ask Your Generative AI Provider
- How do you ensure that training datasets are free from historical or systemic biases?
- What techniques, such as re-sampling or synthetic data generation, do you use to address imbalanced data representation?
- How do you evaluate your models for fairness, and can you share metrics or case studies demonstrating bias mitigation?
- Are there any processes in place for regular auditing of models to identify and address emerging biases over time?
Lack of Transparency in Decision-Making
Risk: Gen AI models, often described as “black boxes,” lack explainability, making it difficult to identify how biases influence outputs. In finance, this lack of transparency can erode trust among clients and regulators, especially in critical decisions like fraud detection, creditworthiness, or compliance checks.
Specific Concerns:
- Opaque Models: Financial institutions may struggle to justify AI-driven decisions, especially when they affect customers adversely (e.g., loan rejections).
- Regulatory Compliance: Laws like GDPR require explainability for decisions made by automated systems, creating legal risks if AI outputs cannot be justified.
- Loss of Stakeholder Trust: A lack of transparency can erode trust among customers, auditors, and regulators, especially when outcomes appear unfair.
Smart Questions to Ask Your Generative AI Provider
- What tools or frameworks do you offer for ensuring transparency and explainability in the AI decision-making process?
- How does your platform support compliance with explainability requirements under GDPR and other regulations?
- Can your system provide detailed output logs that show how specific decisions are reached?
- What steps do you take to ensure that explainability efforts are not just technical but also accessible to non-technical stakeholders?
3. Job Displacement and Workforce Adaptation
It’s a story as old as the invention of automated textile equipment in the late 1700s.
For hundreds of years, humans have feared job loss as a result of automation. This debate is reaching a fever pitch as a result of AI because unlike automation, which is designed merely to perform certain tasks faster and more efficiently than humans, AI is capable of complicated reasoning.
In simple terms, automation is designed primarily to DO. AI, on the other hand, is designed to both think AND do.
If your firm is adopting Gen AI, here are three potential pitfalls to be aware of.
Automation of Repetitive Tasks
Risk: Gen AI excels at automating repetitive, rule-based tasks, such as data entry, report generation, and routine customer service. While this increases efficiency, it also risks displacing employees performing these roles, creating workforce instability and potential resistance to AI adoption.
Specific Concerns:
- Role Redundancy: Employees in operational roles like data processing or low-complexity client servicing may see their roles diminish.
- Devaluation of Skills: Workers who rely on routine tasks may face challenges in transitioning to higher-value roles without proper reskilling opportunities.
- Employee Resistance: Fear of job losses can result in resistance to AI implementation, reducing project success rates.
- Loss of Institutional Knowledge: Automation without human oversight risks losing valuable insights held by experienced employees.
Smart Questions to Ask Your Generative AI Provider
- How can your platform support the augmentation of human roles rather than the outright replacement of tasks?
- Can you share examples of successful implementations where AI automation enhanced employee productivity instead of displacing roles?
- What features are available to ensure a smooth handoff between automated systems and human teams?
Skills Gap and Reskilling Needs
Risk: The introduction of Gen AI into financial workflows creates demand for new skills, such as AI oversight, data analysis, and strategic interpretation. Without proper training, existing employees may struggle to adapt, leading to talent gaps and inefficiencies.
Specific Concerns:
- Rapid Skill Obsolescence: Employees in traditional roles may find their expertise outdated as AI becomes a core part of operations.
- Limited Reskilling Programs: Financial institutions often lack structured programs to retrain staff for AI-enhanced roles.
- Over-reliance on Specialists: Companies may rely heavily on external AI specialists instead of upskilling internal teams, increasing costs and dependency.
- Morale and Retention Risks: Employees may feel undervalued if they are not supported in acquiring new skills, leading to retention challenges.
Smart Questions to Ask Your Generative AI Provider
- Do you offer training programs or resources to help our workforce integrate Gen AI into their daily workflows?
- How user-friendly is your platform for non-technical employees, and what support is available to reduce the learning curve?
- Can your system provide role-specific recommendations for upskilling based on the tasks it automates?
- What partnerships or certifications do you offer to support ongoing employee education in AI-related skills?
Organisational Culture and Change Management
Risk: The integration of Gen AI requires a cultural shift within financial institutions. Employees must view AI as a tool for empowerment rather than a threat. Failure to manage this transition can lead to resistance, diminished collaboration, and strained leadership-employee relationships.
Specific Concerns:
- Communication Gaps: Poorly communicated AI initiatives can foster fear and misunderstanding among employees.
- Leadership Buy-In: Without strong support from leadership, change initiatives are less likely to succeed.
- Uneven Adoption Rates: Different teams or departments may adopt AI solutions at varying speeds, leading to operational misalignment.
- Trust Issues: Employees may distrust AI outputs, especially in critical decision-making scenarios.
Smart Questions to Ask Your Generative AI Provider
- How can your platform support transparent communication about AI’s role in the organisation?
- What tools or features do you offer to foster collaboration between AI systems and human teams?
- Can you provide examples of how AI adoption was successfully aligned with company culture in other organisations?
The Top 3 Generative AI Opportunities for Finance Professionals
1. Enhancing Customer Service and Personalisation
Gen AI has the potential to unlock 1:1 personalisation, thereby transforming how financial institutions interact with customers by delivering exceptional service and hyper-personalised experiences at scale.
Proactive Customer Support
Agentforce from Salesforce is a game-changer for proactive customer support, enabling financial institutions to anticipate and address customer needs with precision.
By equipping agents with a unified view of customer data and leveraging AI-powered insights, Agentforce empowers teams to identify patterns and predict issues before they arise. For example, it can alert agents to potential account issues, flag customer dissatisfaction trends, or recommend personalised solutions during interactions.
Specific Opportunities:
- Predictive Insights: AI-driven systems can predict common customer issues and preemptively offer solutions.
- Reduced Response Times: Automated systems handle routine inquiries instantly, improving customer satisfaction.
- Customer Retention: Personalised outreach based on predicted needs enhances loyalty.
- Seamless Escalation: AI integrates with human agents to address complex issues without delays.
Best Salesforce Product for Enhanced Customer Support: Service Cloud
Service Cloud’s AI capabilities, including Einstein Bots, streamline customer service by automating responses and enabling real-time escalations, making it ideal for proactive customer support.
Hyper-Personalisation at Scale
With Gen AI, financial institutions can tailor interactions to individual preferences and histories, creating meaningful and personalised client experiences.
Specific Opportunities:
- Customised Offers: Generate financial product recommendations tailored to individual goals.
- Enhanced Marketing Campaigns: Deliver personalised messaging that resonates with each customer.
- Behaviour-Based Engagement: Analyse transactional data to recommend relevant products or services.
- Cross-Selling and Upselling: Identify opportunities to offer complementary services to existing clients.
Best Salesforce Product to Unlock Personalisation: Financial Services Cloud (FSC)
FSC centralises customer financial data, enabling AI-driven insights for hyper-personalised service and tailored financial recommendations.
Sentiment Analysis for Customer Satisfaction
Gen AI can analyse customer communications to detect sentiment, allowing institutions to proactively address dissatisfaction or capitalise on positive feedback.
Specific Opportunities:
- Customer Sentiment Tracking: Continuously monitor how customers feel about services.
- Proactive Issue Resolution: Identify and address dissatisfaction before it escalates.
- Loyalty Program Enhancement: Tailor loyalty offers based on positive feedback.
- Brand Perception Management: Use sentiment insights to refine communication strategies.
Best Salesforce Product to Analyse Customer Sentiment: Einstein for Marketing Cloud
Einstein analyses customer sentiment in real-time, allowing marketing teams to adjust campaigns and outreach based on emotional insights.
2. Optimising Investment Strategies
Gen AI has the potential to equip financial professionals with tools to make informed decisions by analysing market trends, simulating strategies, and generating actionable insights.
Predictive Market Insights
Gen AI identifies patterns in market data to predict trends and opportunities, helping financial institutions stay ahead in a competitive landscape.
Specific Opportunities:
- Early Trend Detection: Identify shifts in market conditions before competitors.
- Data-Driven Decision Making: Base strategies on actionable, AI-generated insights.
- Market Volatility Analysis: Evaluate scenarios for mitigating risks during fluctuations.
- Customisable Dashboards: Present predictions in formats tailored to stakeholder needs.
Best Salesforce Product for Optimising Investment Strategies: Tableau
>Tableau’s advanced analytics capabilities allow users to visualise market trends and AI-driven predictions, making data actionable for investment strategies.
Portfolio Optimisation
Gen AI helps create and simulate diverse portfolio strategies, optimising returns while minimising risks based on individual investor profiles.
Specific Opportunities:
- Dynamic Portfolio Adjustments: Respond quickly to real-time market changes.
- Risk Profiling: Align portfolio strategies with individual risk tolerance.
- Ethical Investing: Incorporate ESG (Environmental, Social, Governance) factors into portfolio decisions.
- Scenario Planning: Simulate multiple investment strategies to evaluate outcomes.
Best Salesforce Product to Optimise Portfolios: Financial Services Cloud (FSC)
FSC integrates client financial profiles and AI-driven analytics, enabling advisors to provide tailored, optimised portfolio recommendations.
Risk Management Simulations
Gen AI enables institutions to simulate stress test scenarios, evaluating how investments perform under various economic conditions.
Specific Opportunities:
- Stress Testing: Assess portfolio resilience during market downturns.
- Risk Diversification: Model the impact of diverse asset allocations.
- Scenario Comparison: Evaluate multiple “what-if” scenarios to choose the best strategy.
Best Salesforce Product to Run Risk Management Simulations: Einstein Analytics (Tableau CRM)
Einstein Analytics integrates risk modelling into interactive dashboards, providing clear insights to assess portfolio performance under different scenarios.
3. Improving Operational Efficiency
Another huge plus of Gen AI is the ability to streamline financial operations, automate processes, and enable teams to focus on higher-value tasks. The three biggest wins Gen Ai unlocks here are:
Workflow Automation
Gen AI automates repetitive tasks, such as document generation and data processing, freeing up employees to focus on strategic initiatives.
Specific Opportunities:
- Document Processing: Automate creation, review, and compliance checks for contracts.
- Expense Reduction: Reduce operational costs through automation.
- Faster Turnaround Times: Complete administrative tasks in seconds rather than hours.
- Improved Accuracy: Minimise human errors in routine processes.
Best Salesforce Product to Automate Workflows: Salesforce Flow
Salesforce Flow automates complex workflows across systems, ensuring efficiency and consistency in repetitive processes.
Streamlined Onboarding
Gen AI has a huge role to play in simplifying customer and employee onboarding processes by automating identity verification, KYC (Know Your Customer), and documentation.
Specific Opportunities:
- Faster Verification: Automate ID checks and compliance validation.
- Centralised Documentation: Generate and store onboarding materials securely.
- Personalised Onboarding Journeys: Tailor onboarding steps based on client or employee needs.
- Error Reduction: Ensure accurate compliance with regulatory standards.
- Salesforce Product: Comnexa Onboarding Accelerator
Our very own Onboarding Accelerator, powered by Salesforce, automates customer data capture, ID verification, and KYC, accelerating onboarding while ensuring compliance.
Find out more here: LINK TBC
Real-Time Performance Monitoring
Gen AI enhances operational oversight by analysing workflows and providing actionable insights to improve efficiency in real-time.
Specific Opportunities:
- Bottleneck Identification: Pinpoint inefficiencies in workflows.
- Resource Allocation: Optimise resource use based on real-time demands.
- Performance Dashboards: Deliver insights on operational performance metrics.
- Continuous Improvement: Provide data-driven recommendations for process refinement.
Best Salesforce Product to Enable Real-Time Performance Monitoring: Einstein for Service Cloud
Einstein enables real-time monitoring of workflows and operations, offering actionable insights to improve efficiency and customer outcomes.
In Conclusion
So, is Gen AI all hype? Hardly. From speeding up processes to delivering personalised experiences and predicting risks, GenAI is driving a transformation across retail banking, insurance, and wealth management.
However, harnessing its full potential requires navigating challenges like data privacy, bias mitigation, and workforce adaptation with precision.
By asking the right questions and leveraging robust tools like Salesforce’s Financial Services Cloud, Tableau, and the Comnexa Onboarding Accelerator, financial professionals can unlock unprecedented value while safeguarding against risks.
Ready to explore how AI can drive innovation and efficiency in your organisation? Get in touch with us today to discover tailored solutions that align with your goals and future-proof your business.