AI Compliance

Compliance with all legal frameworks in the deployment and application of AI systems and tools is an important part of the development of the UU AI policy. The policy is developed using a risk-based approach. Every part of the policy fits into the AI roadmap, which is based on the holistic framework for AI in education.

We define AI as all AI systems and tools referred to in the EU AI Act, as well as impactful algorithms that could pose risks to education, our educational systems, or human rights.

The European AI Regulation and risk categories

The European AI Regulation (EU AI Act) distinguishes the following risk categories:

  1. Prohibited AI
  2. High-risk AI
  3. Low and minimal risk AI

Certified AI Compliance Officers (CAICOs) at the faculties

CAICO stands for Certified AI Compliance Officer, a certification that involves training and an exam. Obtaining this certification demonstrates that the knowledge and competence required to determine the risks of AI systems and tools are in place. Where needed, the CAICO will involve additional expertise to arrive at a well-informed decision, among others via the AI compliance team of SO&O and the UB.

For general questions about AI compliance in or for education, you can contact: