Following our publications on the introductory set of guides (Guides 1 and 2) issued by the Spanish Artificial Intelligence Supervisory Agency (Agencia Española de Supervisión de la Inteligencia Artificial, or AESIA) to support compliance with the European Artificial Intelligence Act (the AI Act), today we begin our analysis of the second set of guides: Specialised Technical Guides 3 to 15.
Specifically, today we will summarise the key aspects that should be taken into account in Guides 3 and 4:
1. Guide 3, titled "Conformity Assessment", provides a detailed and indicative overview of the mandatory conformity assessment process that high-risk Artificial Intelligence (AI) systems must undergo under the AI Act (Article 43) before being placed on the market or put into service in the European Union. Its purpose is to provide a practical roadmap for providers (the party legally responsible for the process) to demonstrate that their systems comply with the AI Act's essential security, transparency and data governance requirements.
Guide 3 clarifies the two main procedures for conducting a conformity assessment:
- Internal control (self-assessment): This is the general rule for most high-risk systems (points 2 to 8 of Annex III), such as systems connected to employment, education, and financial services. Under this procedure, the provider itself verifies internally that its Quality Management System (QMS) and technical documentation comply with the AI Act; or
- Third-party intervention (notified body): Mandatory in the case of biometric identification systems (Annex III, point 1) when the provider has not implemented or has only partially implemented the available harmonised standards or common specifications. In these cases, an independent body must audit and certify the system.
Guide 3 sets out the technical and documentary requirements that must be met to accredit compliance with the AI Act, integrating aspects such as QMS implementation, the development of technical documentation, and the design of post-market monitoring plans. It also provides examples and methodologies to facilitate implementation of these requirements by means of instruments such as harmonised standards and common specifications (respectively, technical standards created by standardisation bodies at the request of the Commission, and technical rules issued directly by the Commission itself when the former do not exist or are insufficient).
A successful process culminates in the drawing up of an EU Declaration of Conformity and the affixing of the CE mark (a key indicator, but not in itself absolute proof, of a product's compliance with EU legislation).
2. Guide 4, titled "Quality Management System", articulates all of the AI Act's operational obligations within an organisation. Its purpose is to analyse the organisational and technical measures that will enable providers (and, in specific cases of joint development, deployers) to comply with Article 17 of the AI Act. The purpose of the QMS is to ensure that high-risk AI systems are secure, reliable and auditable throughout their lifecycle.
Guide 4 sets out the key elements that an entity must integrate to adequately comply with the AI Act through policies, procedures and instructions. The following should be highlighted from among the 13 mandatory sections in Article 17: the development of a clear accountability and governance framework that defines the responsibilities of management and technical staff; documented processes for the design, development, testing and validation of systems; and cybersecurity and accuracy measures integrated from the design stage.
One fundamental aspect is that QMS implementation should be proportionate to the size of the provider's organisation. This principle is designed to balance the administrative burden with the company's capacity.
- Read the first ebulletin 'New AESIA Guidelines to support compliance with the AI Act' (02 February 2026)
- Read the second ebulletin 'AESIA introductory guides to support compliance with the AI Act' (09 February 2026)
The content of this article is intended to provide a general guide to the subject matter. Specialist advice should be sought about your specific circumstances.