finance technology | Global | 13-03-2026

AI and Cybersecurity: Why Lenders Need a Holistic Approach to Operational Resilience

David Raw, Managing Director of Commercial Finance at UK Finance, shared insights from a roundtable discussion on cybersecurity, resilience and AI at the UK Invoice Finance and Asset-Based Lending Summit in Birmingham in early March. Raw reflected on how lenders are beginning to weigh the opportunities and risks that AI introduces into operational resilience strategies.

As AI becomes more widely adopted across financial services, cybersecurity and resilience are rising rapidly up the agenda for lenders and technology providers. At the roundtable, industry participants explored how AI could strengthen resilience within lending organisations while also introducing new risks that need careful management.

According to Raw, one of the most important themes from the discussion was the need for organisations to take a holistic approach to AI and cybersecurity. Rather than viewing AI purely as a technology initiative, businesses need to ensure that discussions about its risks and opportunities involve teams across the organisation.

"We talked about the importance of taking a holistic approach," Raw explained, noting that AI should not be treated as something that sits solely within IT or technology departments.
Understanding AI Risks in a Rapidly Evolving Landscape

One of the central challenges highlighted during the roundtable was the constantly evolving nature of cybersecurity threats, particularly as AI technologies become more sophisticated.

Participants discussed how specialist vendors and cybersecurity providers are playing an increasingly important role in helping organisations identify vulnerabilities and strengthen their defences. Some vendors actively test systems by simulating cyberattacks, allowing organisations to understand how attackers might exploit weaknesses and to respond before those vulnerabilities can be used in real-world situations. This proactive testing helps lenders keep pace with the fast-changing threat environment.

As AI tools become more widely integrated into financial systems, organisations will need to ensure that their security frameworks evolve at the same pace.

Defining the Business Outcomes of AI Adoption

While cybersecurity risks are a major consideration, the roundtable also emphasised the importance of clearly defining what organisations hope to achieve through AI adoption. Investments in AI capabilities are often discussed in terms of return on investment, but participants suggested that the benefits may not always be immediately measurable. Instead, lenders may need to consider broader objectives, such as improving operational resilience, strengthening risk management processes or enhancing internal efficiency.

Monitoring AI initiatives against clearly defined business outcomes will help organisations understand whether the technology is delivering the intended value.
The Role of Staff Training and Responsible Experimentation

Another key topic raised during the discussion was the importance of training employees to understand both AI and cybersecurity risks. Participants emphasised that staff across the organisation should be equipped with the knowledge needed to identify potential threats and use AI tools responsibly.

At the same time, organisations should not discourage experimentation. Allowing employees to explore how AI tools might improve workflows can help businesses identify practical use cases and unlock new efficiencies. Striking the right balance between structured governance and innovation will be essential as adoption grows.

Why Human Oversight Still Matters

Despite the potential of AI to automate many processes, Raw stressed that human expertise will remain essential. AI systems can analyse vast amounts of data and identify patterns, but experienced professionals are still needed to interpret results and recognise when something does not look right. For example, risk managers may detect subtle warning signs based on their experience and contextual understanding, insights that automated systems alone may struggle to replicate.

Human oversight also plays a critical role in addressing the ethical implications of AI, ensuring that models produce outcomes that are fair, transparent and aligned with organisational values.
AI’s Growing Role in Operational Resilience

Looking ahead, Raw suggested that AI is likely to become an increasingly important component of operational resilience across the financial sector. In his role co-chairing the Cross-Market Operational Resilience Group with the Bank of England, Raw noted that discussions around AI are becoming more frequent among regulators and industry participants. Both the opportunities and the risks associated with AI are now central to conversations about the resilience of financial institutions.

AI could potentially help organisations detect threats more quickly, respond to incidents faster and strengthen the stability of their operational systems. At the same time, businesses must also prepare to defend against new forms of AI-enabled cyber threats.

A Sector Watching the Rapid Evolution of AI

While many questions remain about how AI will ultimately reshape cybersecurity and operational resilience, one conclusion from the roundtable was clear: the industry is entering a period of rapid technological change. Financial institutions are actively exploring how AI might improve their resilience while also recognising the need to strengthen governance, security and oversight frameworks.

For lenders operating in invoice finance and asset-based lending, the challenge will be to harness the benefits of AI while ensuring that cybersecurity and operational resilience remain robust. As Raw observed, the role of AI in shaping the future resilience of financial services is becoming impossible to ignore.

Tags: AI, asset-based lending, cybersecurity, financial services, financial technology, invoice finance, lenders, operational resilience, risk management, UK Finance