What AI training has your organisation rolled out? The EU AI Act (Article 4) demands that providers and deployers of AI systems take measures to ensure a sufficient level of AI literacy of their staff and anyone else using AI systems in the EU on the organisation's behalf. This obligation began to apply on 2 February 2025. Many organisations launched AI literacy training programmes, unsure whether what they were doing was enough to comply. In May 2025, the European Commission released AI Literacy – Questions & Answers (Q&As), clarifying the scope of the AI literacy requirement. This briefing sets out key takeaways from the Q&As.
What are the minimum steps required to meet the AI literacy requirement?
Is AI training necessary to comply, or can the AI literacy requirement be satisfied by other means?
The Commission's answer is "it depends": on factors such as the role of each organisation in the AI value chain, the risk level of the systems in use, and the existing knowledge of staff. However, the Commission makes the point that, in many cases, simply asking staff to read an AI system's instructions for use may be ineffective and insufficient.
In deciding what training your organisation needs to provide and to whom, it is also important to consider potential crossover with other (explicit and implicit) training obligations under the AI Act, which will soon begin to apply. For example, if your organisation uses a "high-risk" system which, from August 2026, will require human oversight, the individuals responsible for that oversight will need the necessary training and support for the task (Article 26 of the EU AI Act).
There is, in summary, no "one size fits all" answer to what will be deemed a sufficient level of AI literacy, and the assessment will be approached flexibly, given the breadth and fast-evolving nature of AI.
The Commission does, however, set out what organisations should be doing as a minimum:
- Ensure general AI understanding. Staff should know what AI is, how it works, which AI systems are in use, and the associated opportunities and risks.
- Define organisational roles. Clarify whether the organisation develops AI systems or simply uses systems supplied by others.
- Identify risks. Explain the risks associated with the AI systems and the necessary mitigations. Consider the context and the users of the systems.
- Tailor training. Check what staff already know and adjust the programme to their technical knowledge, experience, education, and prior training.
- Incorporate legal and ethical issues. Training should cover the AI Act, other relevant laws, and the principles of ethics and governance.
There is no obligation to measure employees' AI literacy levels, although it is important to record what training has been undertaken. Nor is any specific governance structure required: organisations do not need to appoint an AI officer or create an AI governance board to meet the AI literacy requirement.
When will the AI literacy requirement be enforced?
The Commission confirms that the supervision and enforcement of the AI literacy rules start on 2 August 2026. When asked if penalties could be imposed retrospectively from 2 February 2025, the Commission did not give a clear answer.
Regulators are more likely to treat a lack of AI training or guidance for staff as an aggravating factor when enforcing other breaches of the EU AI Act than to pursue standalone enforcement of the AI literacy requirement.
How far does the AI literacy responsibility extend?
Article 4 requires organisations to take measures to ensure a sufficient level of AI literacy both of their staff and "other persons dealing with the operation and use of AI systems on their behalf". The Commission regards "other persons" as those broadly under the organisational remit, and the Q&As provide examples of contractors, service providers and clients. The notion that clients use AI systems on their supplier's behalf is perhaps a surprising interpretation of the wording. However, if clients fall within the purview of Article 4, a degree of proportionality will likely apply, with providers or deployers being required to do more for their employees than for groups not directly under their control.
Should a service provider using AI have a contractual obligation to demonstrate AI literacy? The Commission offers an equivocal answer, remarking that, in general, people working for a service provider or contractor need to have the appropriate AI skills to fulfil the task in question (in the same way as employees).
What other guidance is available?
- The Q&As are a living document, so organisations should check them regularly for updates.
- There is a living repository of real examples of ongoing AI literacy practices among AI Pact signatories.
- A webinar discussing Article 4 and sharing practices, with accompanying slides, is also available.
AI literacy as an opportunity
The Commission's approach to the AI literacy requirement is a flexible one. It enables organisations to take a proportionate approach to training their staff and others using AI systems on their behalf. Rather than treating AI literacy as another task to tick off the compliance to-do list, organisations would do better to view it as an opportunity to get the most out of their AI systems and an important safeguard against AI-related risks.
Our team at Travers Smith have been supporting clients with their AI literacy programmes.
The content of this article is intended to provide a general guide to the subject matter. Specialist advice should be sought about your specific circumstances.