Compliance | June 26, 2024

Internal audit's role in the European Union's new Artificial Intelligence Act

What is the European Union Artificial Intelligence Act?

The European Union's Artificial Intelligence (AI) Act is a landmark regulation that seeks to define and harmonize comprehensive rules for the development and use of AI systems across the EU. As AI technologies become increasingly integrated into the work that organizations do every day, the regulation aims to ensure that AI development is based on a human-centric and trustworthy approach aligned with the fundamental rights and values defined by the EU. Internal auditors will play a critical role both in helping their organizations understand the risks and opportunities in their use of AI and in navigating the challenges of an increasingly complex regulatory environment.

Scope and applicability

The EU AI Act provides a clear definition of what makes up an AI system, encompassing machine learning, logic-based and knowledge-based approaches, and systems capable of inference from data. Internal auditors must ensure that their organizations' AI systems either align directly with or can be mapped to these definitions. Understanding these distinctions is the first step in providing assurance and insights related to compliance.
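As a starting point for that mapping, the short Python sketch below shows one way an audit team might record an AI system inventory and flag which entries use a technique named in the Act's definition. The category names, example systems, and in_scope helper are hypothetical illustrations for this article, not terms taken from the regulation.

```python
from dataclasses import dataclass

# Simplified labels for the techniques the Act's definition is commonly read to
# cover (machine learning, logic- and knowledge-based approaches, inference).
# This set is an illustrative assumption, not the legal text.
COVERED_TECHNIQUES = {"machine_learning", "logic_based", "knowledge_based", "statistical_inference"}

@dataclass
class AISystemRecord:
    name: str        # internal system name (hypothetical)
    technique: str   # how the system produces its outputs
    owner: str       # accountable business owner

    def in_scope(self) -> bool:
        """Flag whether the recorded technique maps to a covered category."""
        return self.technique in COVERED_TECHNIQUES

inventory = [
    AISystemRecord("invoice-classifier", "machine_learning", "Finance"),
    AISystemRecord("static-report-macro", "scripted_macro", "Operations"),
]

for record in inventory:
    status = "maps to the Act's definition" if record.in_scope() else "review manually"
    print(f"{record.name}: {status}")
```

Even a simple inventory like this gives auditors a defensible record of which systems were considered and why each was judged in or out of scope.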

Embracing a risk-based approach

The European Union AI Act takes a risk-based approach, categorizing AI systems based on the level of risk they pose to health, safety, and fundamental human rights. One of the first steps for internal auditors is to verify that their organizations are not engaging in prohibited AI practices, such as subliminal, manipulative, or deceptive techniques, discriminatory biometric categorization, or the expansion of facial recognition databases through untargeted scraping of images. Identifying, understanding, and assessing high-risk AI systems, especially those used in critical areas like healthcare, law enforcement, and essential services, is vital.
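To make the tiering concrete, here is a minimal sketch of a provisional risk-classification helper, assuming a hypothetical keyword map from use-case descriptions to the Act's tiers. Any real classification requires a legal reading of the specific use case against the Act itself; the point of the sketch is only that unknown cases should default to scrutiny rather than to a low tier.

```python
from enum import Enum

class RiskTier(Enum):
    PROHIBITED = "prohibited"
    HIGH = "high"
    LIMITED = "limited"
    MINIMAL = "minimal"

# Hypothetical mapping from use-case descriptions to tiers, for illustration only.
TIER_BY_USE_CASE = {
    "subliminal manipulation": RiskTier.PROHIBITED,
    "untargeted facial image scraping": RiskTier.PROHIBITED,
    "medical triage support": RiskTier.HIGH,
    "law enforcement risk scoring": RiskTier.HIGH,
    "customer service chatbot": RiskTier.LIMITED,
    "spam filtering": RiskTier.MINIMAL,
}

def provisional_tier(use_case: str) -> RiskTier:
    """Return a provisional tier, defaulting to HIGH so unmapped cases get scrutiny."""
    return TIER_BY_USE_CASE.get(use_case, RiskTier.HIGH)

print(provisional_tier("medical triage support").value)  # high
print(provisional_tier("novel hiring screener").value)   # high (unknown -> scrutiny)
```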

Meeting mandatory requirements for high-risk AI systems

High-risk AI systems are subject to stringent requirements under the AI Act. Internal auditors must assess whether robust risk management systems are in place. These systems should include processes for identifying, assessing, and mitigating risks associated with high-risk AI systems. The data used by AI systems is also important, so assessing the organization's data governance structures and processes is essential. Auditors should verify that high-quality data is used, appropriate documentation is maintained, and applicable record-keeping practices are followed. Auditors should consider the following questions (a simple illustrative check is sketched after the list):

  • Where did the data come from?
  • Are the processes and controls that produced the data designed and operating effectively?
  • Is the data complete, accurate, and reliable?
  • How is the data being used by the AI?
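A minimal sketch of how a few of these questions could be translated into automatable checks is shown below. The field names, sample data, and checks are hypothetical and would need to be tailored to the organization's own data governance framework; the sketch only illustrates turning audit questions into repeatable tests.

```python
from dataclasses import dataclass

@dataclass
class DataQualityFinding:
    check: str
    passed: bool
    detail: str

def assess_training_data(records: list[dict], required_fields: list[str],
                         documented_source: str | None) -> list[DataQualityFinding]:
    """Translate a few of the audit questions above into simple checks."""
    findings = []

    # Where did the data come from? -- provenance should be documented.
    findings.append(DataQualityFinding(
        "provenance documented", documented_source is not None,
        documented_source or "no documented source"))

    # Is the data complete? -- every record carries every required field.
    missing = sum(1 for r in records for f in required_fields if r.get(f) in (None, ""))
    findings.append(DataQualityFinding(
        "completeness", missing == 0, f"{missing} missing values"))

    # Is the data reliable? -- duplicates can silently bias a model.
    unique = len({tuple(sorted(r.items())) for r in records})
    findings.append(DataQualityFinding(
        "no duplicate records", unique == len(records),
        f"{len(records) - unique} duplicates"))

    return findings

sample = [{"id": 1, "amount": 120.0}, {"id": 2, "amount": None}]
for f in assess_training_data(sample, ["id", "amount"], "ERP export, May 2024"):
    print(f"{f.check}: {'PASS' if f.passed else 'FAIL'} ({f.detail})")
```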

Ensuring compliance and continuous monitoring

Compliance with the EU AI Act does not end with the initial deployment of AI systems. Continuous monitoring is necessary to ensure ongoing compliance. Internal auditors must verify that high-risk AI systems undergo the required conformity assessments and understand when external evaluations are needed. Auditors should also assess whether proper mechanisms are in place for continuous monitoring, including incident reporting and timely corrective actions. By taking a proactive approach, auditors can help ensure that potential risks are being addressed and that these systems remain compliant throughout their lifecycle.
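As one illustration of what such a mechanism might look like in practice, the sketch below models a simple incident log with an overdue-correction check. The 30-day window, field names, and example incident are hypothetical internal parameters, not deadlines set by the Act.

```python
from dataclasses import dataclass, field
from datetime import datetime, timedelta, timezone

@dataclass
class Incident:
    system: str
    description: str
    reported_at: datetime
    corrected_at: datetime | None = None  # None means still open

@dataclass
class MonitoringLog:
    incidents: list[Incident] = field(default_factory=list)

    def overdue_corrections(self, max_age: timedelta) -> list[Incident]:
        """Incidents still open past the allowed correction window."""
        now = datetime.now(timezone.utc)
        return [i for i in self.incidents
                if i.corrected_at is None and now - i.reported_at > max_age]

log = MonitoringLog([
    Incident("credit-scoring-model", "drift above tolerance",
             datetime.now(timezone.utc) - timedelta(days=45)),
])

# Flag anything open longer than a hypothetical 30-day internal deadline.
for incident in log.overdue_corrections(timedelta(days=30)):
    print(f"OVERDUE: {incident.system} - {incident.description}")
```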

Upholding human oversight

To ensure that organizations can prevent unintended consequences and maintain trust in their AI systems, human oversight is a critical aspect of the EU AI Act. Internal auditors can confirm that AI systems are designed to enhance human decision-making by verifying that measures for human control are built in throughout the system lifecycle. An important component of this is verifying that users of AI systems are adequately trained to understand and manage these complex systems.


Biometric data

The EU AI Act imposes specific restrictions on the use of biometric data and remote biometric identification systems. Internal auditors need to ensure that the use of biometric data for identification or categorization complies with these restrictions and is limited to permitted scenarios, such as security or authentication. There are also specific rules governing the use of remote biometric identification systems, especially in publicly accessible spaces and for law enforcement purposes. Auditors will need to understand the rules applicable to their organization and verify compliance.

Ethical considerations and fundamental rights

The European Union's AI Act places significant focus on ethical considerations and the protection of fundamental rights. To that end, internal auditors should evaluate whether AI systems are developed and used in a non-discriminatory manner, promote equality, and encourage cultural diversity. Areas for internal auditors to consider with respect to fundamental rights include privacy, data protection, freedom of expression, and non-discrimination.

Documentation and reporting

Comprehensive documentation and reporting are crucial for demonstrating compliance with the EU AI Act. Internal auditors should provide assurance that detailed technical documentation is maintained for all AI systems, providing a clear trail of their development, functioning, and compliance measures. Additionally, auditors should verify that their organizations comply with regulatory reporting obligations, including incident reporting and annual compliance statements.
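A lightweight way to test documentation completeness is sketched below. The artifact names are a hypothetical shortlist assumed for illustration; the Act's own annexes remain the authoritative reference for what must actually be on file.

```python
# Hypothetical shortlist of documentation artifacts an auditor might expect to
# locate for each high-risk system (illustrative, not the Act's required list).
EXPECTED_ARTIFACTS = {
    "system_description", "intended_purpose", "risk_management_file",
    "training_data_summary", "human_oversight_measures", "post_market_monitoring_plan",
}

def documentation_gaps(artifacts_on_file: set[str]) -> set[str]:
    """Return the expected artifacts that cannot be located."""
    return EXPECTED_ARTIFACTS - artifacts_on_file

on_file = {"system_description", "intended_purpose", "training_data_summary"}
print(sorted(documentation_gaps(on_file)))
```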

Conclusion

As AI technologies continue to evolve, internal auditors play a crucial role in ensuring that their organizations navigate the regulatory complexities of the EU's AI Act. By focusing on these key areas, auditors can help ensure their organizations are appropriately managing AI-related risks and are developing and deploying AI systems responsibly and ethically in compliance with these new requirements.


Jim Pelletier
Senior Product Manager, Wolters Kluwer TeamMate
Jim has over 20 years of internal auditing experience in both the public and private sectors.