Expert's opinion

The DPO’s role in the AI Act era

In this initial phase of enforcement of the AI Act, many organisations are struggling to find clear guidance on who will have to oversee compliance with the new rules. The absence of roles formally provided for by EU legislation for the internal governance of AI systems creates an operational void that risks translating into inefficiencies or disorganised approaches. The figure of the Chief Artificial Intelligence Officer (CAIO) is often evoked in the debate, but to date it appears more as a theoretical construct than a function that can actually be implemented in business organisations. In this scenario, the DPO stands out as a reference point that has been present in companies for years, equipped with a transversal vision and regulatory skills that, although not exhaustive, can be leveraged to provide an initial response to compliance needs.

Although the role is not explicitly provided for in the AI Act, the DPO’s remit can be seen as extending “naturally” towards AI, especially with regard to the requirements of transparency, traceability, documentation and human oversight. Even though this extension is not without critical aspects (the DPO remains formally responsible for monitoring compliance with the GDPR, not with all of the provisions of the AI Act), it currently represents a pragmatic solution, pending the definition of more structured and sector-specific roles.

It is true that the AI Act introduces obligations that go beyond the scope of personal data protection, touching on complex technical and organisational aspects. Nevertheless, precisely because of their experience in risk assessment, document management and the promotion of practices inspired by the principle of accountability, DPOs can effectively contribute, right from the start, to the integration of AI requirements into existing business processes, acting as a link between regulatory compliance and operational governance.

The risk of functional ambiguity, if not clearly governed, remains real: it is crucial that the DPO’s involvement does not turn into an improper delegation or an overload of responsibilities in areas that require interdisciplinary skills. However, if supported by adequate structures and complementary professionals (e.g. experts in AI, risk management and applied ethics), the DPO can act as a catalyst for internal compliance processes, contributing to the definition of policies, integrated impact assessments and proportionate audit mechanisms.

In conclusion, far from being granted a regulatory centrality that they do not currently possess, DPOs can still play an active, realistic and supervisory role in the implementation of the provisions of the AI Act. Pending the definition of new institutional figures responsible for the supervision of artificial intelligence systems, their contribution is a valuable resource for transitional but responsible governance, provided that the relevant limits are respected and specific skills are valued.

AI Act: risks and opportunities

Download the PDF [1728 kb]