DORA and AI Act: Boost your cybersecurity posture

by Isabella Sacchi - ICT compliance consultant

Financial entities must manage data assets, assess risks, and ensure transparency to comply with DORA and the AI Act when deploying high-risk AI solutions.

Financial entities are going the extra mile to meet the compliance requirements of the Digital Operational Resilience Act (DORA), a European Union regulation aimed at enhancing the cyber resilience of the region’s financial sector. The deadline of January 17, 2025, is fast approaching: by then, financial entities subject to the regulation are expected to have established an information and communication technology (ICT) risk management framework and strategy in line with DORA requirements. 

While financial entities are busy implementing DORA compliance programs, the EU has introduced a new, groundbreaking regulation: the AI Act. In force since August 2024, the AI Act sets out rules to ensure more trustworthy, safe, and robust AI products, particularly those considered "high risk." 

Although the two regulations address different topics, they share substantial common ground and one purpose: making new technologies more secure. Financial entities should therefore consider the obligations arising from the AI Act that will impact their ICT risk management framework and integrate them into their cybersecurity program. 

What should financial entities consider? 

To understand how DORA and the AI Act interact, let's consider a financial entity deploying an AI solution for credit scoring of individuals or for pricing health insurance. This type of solution is considered high risk under the AI Act (Annex III). In such cases, there are obligations stemming from both DORA and the AI Act that the financial entity must consider and prepare for: 

Know the data assets

Robust data governance is the foundation of compliance with both DORA and the AI Act. DORA requires financial entities to inventory their ICT and information assets, particularly those supporting critical business functions: a financial entity cannot adequately assess and mitigate the associated risks without a clear understanding of its assets. The AI Act, meanwhile, mandates mapping out the data used to feed the AI solution, implementing security measures, and ensuring compliance with the General Data Protection Regulation (GDPR) and the European Banking Authority (EBA) guidelines on loan origination and monitoring. In addition, the new Consumer Credit Directive will introduce further data protection requirements for consumer lending.

For instance, when deploying a credit scoring solution, the financial entity must track the information that feeds the credit scoring AI model: clients' credit accounts, payment history, credit utilization, and outstanding debt. Under both regulations, together with the GDPR and the Consumer Credit Directive, a clear view of information assets is the starting point for compliant data use. 
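To make this concrete, here is a minimal Python sketch of what an inventory entry linking DORA, GDPR, and AI Act concerns could look like. The schema, field names, and the `credit_scoring_v2` model reference are illustrative assumptions, not a prescribed format:

```python
from dataclasses import dataclass, field
from enum import Enum


class Criticality(Enum):
    LOW = "low"
    MEDIUM = "medium"
    HIGH = "high"  # supports a critical or important business function (DORA)


@dataclass
class DataAsset:
    """One entry in the ICT/information asset inventory (hypothetical schema)."""
    name: str
    owner: str                      # accountable business owner
    source_system: str              # where the data originates
    criticality: Criticality        # DORA: criticality for business functions
    contains_personal_data: bool    # triggers GDPR obligations
    gdpr_lawful_basis: str | None = None
    feeds_ai_models: list[str] = field(default_factory=list)  # AI Act traceability


# Example: one of the assets feeding the credit scoring model
inventory = [
    DataAsset(
        name="payment_history",
        owner="Retail Credit",
        source_system="core_banking",
        criticality=Criticality.HIGH,
        contains_personal_data=True,
        gdpr_lawful_basis="contract",
        feeds_ai_models=["credit_scoring_v2"],
    ),
]

# A quick check: every personal-data asset feeding an AI model must have a
# documented lawful basis before it is used.
for asset in inventory:
    if asset.contains_personal_data and asset.feeds_ai_models:
        assert asset.gdpr_lawful_basis, f"{asset.name}: missing lawful basis"
```

Recording which AI models each asset feeds means the traceability questions raised by the AI Act can be answered from the same inventory DORA already requires.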

Assess the risks

DORA requires continuous ICT risk assessment and imposes strict risk management requirements. In parallel, the AI Act requires assessing the risks of AI systems and their intended purposes to prevent potential harm to users or infrastructure. In the example above, the financial entity using a credit scoring solution must evaluate and manage the associated digital risks from both a DORA and an AI Act perspective. Risks related to data integrity, data availability, low-quality data feeding the AI algorithm, algorithmic bias, and so on are relevant not only from a security perspective but also to ensure that the output is fair and that no rights are harmed. It is crucial that the AI system complies with the security and resilience standards defined by both DORA and the AI Act, and that DORA's ICT risk management processes are integrated with AI risk assessment. 
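As an illustration of one such fairness check, the sketch below computes per-group approval rates on credit scoring outcomes and a simple disparate impact ratio. The four-fifths threshold is a common screening heuristic, not a requirement of either regulation:

```python
from collections import defaultdict


def approval_rates(decisions: list[tuple[str, bool]]) -> dict[str, float]:
    """Approval rate per group, from (group, approved) decision pairs."""
    totals: dict[str, int] = defaultdict(int)
    approved: dict[str, int] = defaultdict(int)
    for group, ok in decisions:
        totals[group] += 1
        approved[group] += ok
    return {g: approved[g] / totals[g] for g in totals}


def disparate_impact_ratio(rates: dict[str, float]) -> float:
    """Ratio of the lowest to the highest group approval rate (1.0 = parity)."""
    return min(rates.values()) / max(rates.values())


# Hypothetical outcomes produced by the credit scoring system
decisions = (
    [("group_a", True)] * 80 + [("group_a", False)] * 20
    + [("group_b", True)] * 55 + [("group_b", False)] * 45
)
rates = approval_rates(decisions)
ratio = disparate_impact_ratio(rates)
print(rates, f"ratio={ratio:.2f}")  # 0.55 / 0.80 -> ratio of about 0.69
if ratio < 0.8:  # screening heuristic only
    print("Potential bias: escalate through the risk management process")
```

A check like this feeds naturally into the DORA risk register: a failed threshold is simply another risk finding to be assessed, treated, and documented.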

Manage supply chain risks

Because AI services are rarely sourced or operated centrally, tracking the risks they pose to an organization can be challenging. Both DORA and the AI Act set out strict rules for managing supply chain risks. Under DORA, the financial entity using a credit scoring solution must perform ICT due diligence before entering into a contract with the solution provider. Additionally, it must manage the ICT risk arising from the AI provider (and its subcontractors), for which the financial entity remains accountable. The AI Act requires the financial entity to use the credit scoring system properly, monitor its performance, and report any risks or incidents, including those related to third-party providers. Furthermore, the financial entity is implicitly required to conduct due diligence on third-party AI solutions as part of its overall responsibility to ensure the safe and compliant use of high-risk AI systems. Setting up a proper third-party risk management framework and due diligence practices is therefore essential for compliance with both regulations. 
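A register of providers and their subcontracting chain is the usual backbone of such a framework. The sketch below is a minimal, assumed structure; the fields and one-year review cycle are illustrative, loosely inspired by DORA's register of information rather than a reproduction of it:

```python
from dataclasses import dataclass, field
from datetime import date


@dataclass
class Provider:
    """One entry in a (hypothetical) third-party provider register."""
    name: str
    service: str
    supports_critical_function: bool
    due_diligence_completed: date | None = None
    subcontractors: list["Provider"] = field(default_factory=list)


def overdue(p: Provider, today: date, max_age_days: int = 365) -> list[Provider]:
    """Providers, including subcontractors, with missing or stale due diligence."""
    flagged = []
    done = p.due_diligence_completed
    if done is None or (today - done).days > max_age_days:
        flagged.append(p)
    for sub in p.subcontractors:
        flagged.extend(overdue(sub, today, max_age_days))
    return flagged


# Hypothetical AI vendor with one subcontractor in its chain
ai_vendor = Provider(
    name="ScoringAI Ltd",
    service="credit scoring model",
    supports_critical_function=True,
    due_diligence_completed=date(2024, 3, 1),
    subcontractors=[Provider("CloudHost", "model hosting", True)],
)

for p in overdue(ai_vendor, date.today()):
    print(f"Due diligence review required: {p.name} ({p.service})")
```

Walking the subcontractor chain recursively matters because DORA holds the financial entity accountable for risks arising down the chain, not just from the direct contractor.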

Test and audit

Both regulations emphasize continuous monitoring through testing and auditing. DORA mandates digital operational resilience testing and regular audits, while the AI Act requires regular validation of AI systems. The credit scoring solution must undergo performance testing to verify that it can provide timely and accurate credit scores under varying operational conditions. This type of testing is crucial for detecting ICT weaknesses, as required under DORA, and for ensuring, as per the AI Act, that the AI model remains accurate and reliable and does not degrade over time. 
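One widely used check for such degradation is the Population Stability Index (PSI), which compares the distribution of production scores against the baseline observed at validation time. The sketch below is illustrative; the sample data and the 0.25 threshold are assumptions, not values taken from either regulation:

```python
import math


def psi(baseline: list[float], current: list[float], bins: int = 10) -> float:
    """Population Stability Index between two score samples (0 = identical)."""
    lo = min(min(baseline), min(current))
    hi = max(max(baseline), max(current))
    width = (hi - lo) / bins or 1.0  # avoid zero width for constant samples

    def histogram(xs: list[float]) -> list[float]:
        counts = [0] * bins
        for x in xs:
            idx = min(int((x - lo) / width), bins - 1)
            counts[idx] += 1
        # Floor at a small epsilon so empty bins don't break the log below
        return [max(c / len(xs), 1e-6) for c in counts]

    b, c = histogram(baseline), histogram(current)
    return sum((ci - bi) * math.log(ci / bi) for bi, ci in zip(b, c))


# Hypothetical score samples: validation baseline vs. recent production
baseline_scores = [0.20, 0.35, 0.40, 0.55, 0.60, 0.62, 0.70, 0.75, 0.80, 0.90]
current_scores = [0.10, 0.15, 0.20, 0.25, 0.30, 0.32, 0.40, 0.45, 0.50, 0.55]

value = psi(baseline_scores, current_scores)
print(f"PSI = {value:.2f}")  # values above ~0.25 are commonly treated as drift
```

Run on a schedule, a drift metric like this gives the audit trail a concrete, repeatable piece of evidence that the model is still performing as validated.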

Be transparent and uphold accountability

Two principles common to most recent digital regulations are transparency and accountability (the GDPR serves as a model here). These principles should guide financial organizations deploying credit scoring solutions when dealing with DORA and AI Act compliance. DORA emphasizes transparency in ICT risk management processes and in ICT-related incidents that may affect the credit scoring solution, while the AI Act demands transparency from the financial entity in its use of that solution. In both cases, the financial entity will be held accountable. To this end, it must ensure that all relevant ICT and AI processes are mapped, documented, kept up to date, and reported on. Strong governance will be key to complying with these two principles. 
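As a final illustration, the sketch below logs each AI-assisted decision as a structured record linking back to the asset inventory and the validated model version. The field names are hypothetical; the point is that transparency and accountability obligations are far easier to meet when this evidence is produced automatically:

```python
import json
import logging
from datetime import datetime, timezone

logging.basicConfig(level=logging.INFO, format="%(message)s")
audit_log = logging.getLogger("credit_scoring.audit")


def log_decision(applicant_id: str, model_version: str,
                 input_assets: list[str], score: float, decision: str) -> None:
    """Record one credit scoring decision as a structured, auditable event."""
    audit_log.info(json.dumps({
        "timestamp": datetime.now(timezone.utc).isoformat(),
        "applicant_id": applicant_id,    # pseudonymized reference, not raw PII
        "model_version": model_version,  # traceability to the validated model
        "input_assets": input_assets,    # links back to the asset inventory
        "score": score,
        "decision": decision,
    }))


log_decision("app-00123", "credit_scoring_v2",
             ["payment_history", "credit_utilization"], 0.72, "approved")
```

In practice these records would go to tamper-evident storage with a defined retention period, so they can support both DORA incident reporting and AI Act record-keeping.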

Building a comprehensive roadmap integrating DORA and AI Act requirements 

DORA and the AI Act can be two formidable allies in boosting your security posture. However, to achieve robust cyber resilience and ensure compliance, it is crucial to view these regulations not as standalone mandates but as complementary components of a comprehensive framework. Integrating both from the outset can be challenging. 

At Sopra Steria, we believe this vision should already be incorporated into the roadmap of financial services companies. We are working on a strategy from conception to implementation that will help you: 

  • Adopt a holistic regulatory approach. 
  • Strengthen your governance and ICT risk management by integrating AI and DORA requirements. 
  • Develop documentation and due diligence procedures and conduct audit activities in the context of third-party risk management. 
  • Define your resilience testing strategy and support its implementation. 

Our practical approach is delivered by a cross-functional team of multidisciplinary experts combining compliance and cybersecurity expertise, capitalizing on in-house tools, frameworks, and solutions already deployed in numerous DORA-related projects. 

Isabella Sacchi is an ICT compliance consultant with specialized expertise in privacy and security legislation, including DORA and the AI Act. Based in Brussels, she is part of the Compliance team of the Cybersecurity business line in Belgium. ISO 27001-certified, she combines legal and cyber expertise to support you in your ICT compliance journey. 

