AI Act could lead to a rush of half-baked products

by Robert Nordan - Solutions Architect

The rush to market to beat the various deadlines of the new AI Act could be disastrous, says Robert Nordan, solutions architect at Sopra Steria.

Anyone planning to launch products developed with AI, or containing AI, in the near future should have noted the dates of the new AI Act, which came into force in August.

The EU's new regulation for artificial intelligence, the AI Act, is in many ways AI's answer to GDPR. Those of us who experienced the introduction of GDPR still remember how much time was spent justifying, documenting, and changing the use of personal data in various systems in the run-up to the 25 May 2018 deadline.

The AI Act will require similar work for all existing and future AI systems, with one important difference: systems already in operation before a given deadline get several extra years to complete this work. If a system is put into operation after the deadline, however, the work must be done before it can be launched.

These are the requirements you must meet 

For AI systems classified as high-risk, you must document the risk management system, the processing of training data, and the quality of the system's response. Additionally, you must prepare detailed technical documentation, write a declaration of conformity, and obtain CE marking for the system. A professional entity likely has much of this in place through GDPR and industry standards like ISO 27001, but there are still many new requirements to meet. 

Since this can mean many months of extra work, it would not be surprising if AI system launches cluster tightly in the months before the relevant deadlines, followed by a lull in the months after. One might also imagine the deadline shifting the trade-off between quality and time-to-market, on the assumption that things can always be "patched up" later, before the AI Act documentation has to be tackled.

Deadlines for AI systems 

Which dates should you be aware of? The EU AI Act has now been adopted by the European Parliament and came into force in August. A close reading of Articles 111 and 113 of the regulation (there is quite a bit to delve into) suggests the following deadlines:

  • Turn of 2024/2025: Six months after entry into force, systems that are not permitted in the EEA must be discontinued. These are systems that deliberately manipulate citizens or groups of citizens, carry out real-time biometric surveillance, or calculate "social scores". It does not matter whether they were launched before this date; they must be removed regardless.
  • Summer 2025: Twelve months after entry into force, the rules apply to AI systems with limited risk, such as general chatbots and AI-generated content. These must make it clear that AI is being used. If launched before this date, they get two additional years to comply.
  • Summer 2026: Twenty-four months after entry into force, all AI Act rules apply. This includes high-risk systems, such as those monitoring critical infrastructure or evaluating applications for loans, visas, and the like. If launched before this date, they only need to comply if the system later undergoes significant design changes. Systems used by public authorities must comply within four years regardless.
  • Summer 2027: Thirty-six months after entry into force is the deadline for two high-risk exceptions. One is AI used as a safety component in regulated goods such as medical devices, cars, and planes: physical products with long development cycles. Here, the documentation cannot be deferred until after the deadline.

The other exception is the EU's own large-scale systems for Schengen border security, currently under development. These must comply by 31 December 2030, regardless of when the AI Act came into force. The extended deadline for the EU's own prioritised projects gives some indication of how much impact such a deadline can have on project progress.

Start planning now 

These dates should go in your calendar and be factored into your project planning. This applies whether you are launching something AI-based yourself or your business is considering purchasing a system released close to a deadline.

The new documentation requirements should be addressed continuously during development, so that the work does not push past the deadline and force you to choose between reduced quality and even greater delays.

The post was first published on digi.no on 14 May 2024. 

