The adoption of Artificial Intelligence (AI) systems holds immense potential for societal benefit, economic growth, and EU innovation and competitiveness. At the same time, it is widely recognized that certain AI systems raise concerns, particularly regarding safety, security, and the protection of fundamental rights.
The European Commission’s Proposal
In April 2021, the European Commission introduced a proposal for a new Artificial Intelligence Act (AI Act). This proposal aims to create a legal framework that addresses the challenges posed by AI while fostering innovation. Below you can find what you need to know about it.
The AI Act seeks to establish a technology-neutral definition of AI systems, accommodating various AI approaches.
The proposal categorizes AI systems into four levels of risk: unacceptable risk (practices that are prohibited outright), high risk (subject to strict requirements before being placed on the market), limited risk (subject to transparency obligations), and minimal or no risk (largely left unregulated).
In December 2022, the Council narrowed the definition of AI systems and extended the prohibition on AI-based social scoring to private actors. It also introduced an additional layer to the high-risk classification, intended to keep AI systems posing only limited risk out of the most stringent regulatory requirements.
The European Parliament made substantial amendments to the Commission’s proposal in June 2023, aligning the definition with OECD standards. They also expanded the list of banned AI systems, introduced criteria for high-risk designation, and emphasized fundamental rights and transparency.
Negotiations to finalize the AI Act are underway: the first and second trilogue meetings took place in June and July 2023, respectively. EU lawmakers are working to strike a balance between promoting AI innovation and safeguarding rights, safety, and ethics.
The AI Act represents a significant step in shaping the future of AI regulation in the EU, aiming to harness its potential while ensuring responsible use.
Interested in finding out more? Check out:
Funded by the European Union. Views and opinions expressed are however those of the author(s) only and do not necessarily reflect those of the European Union or the European Research Executive Agency. Neither the European Union nor the granting authority can be held responsible for them.