What are the requirements for healthcare users?

Marguerite Brac de La Perrière, Tuesday 24 May 2022

The previous section presented the obligations of professionals using artificial intelligence (AI) tools in the context of prescription, prevention, diagnosis or care. The purpose of this paper is to focus on the obligations of the healthcare organization when the AI system is supplied by a publisher (software vendor).

First, before using the AI system to process personal data, it is up to the healthcare organization, in its capacity as data controller within the meaning of the GDPR, to ensure that the publisher of the system, in its capacity as processor, provides sufficient guarantees as to the implementation of appropriate technical and organizational measures so that the processing meets the requirements of the GDPR. To this end, any healthcare organization should request from the publisher a security assurance plan and/or a data protection impact assessment covering the relevant operational scope, as well as the relevant technical documentation, given the specific risks that AI systems present compared with traditional information systems: the ability to automate learning increases the “attack surface” of these systems and introduces many new vulnerabilities (see the LINC file published by the CNIL).

Of course, each AI system must make it possible to comply with the provisions of the PGSSI-S and must be hosted by a certified health data host (HDS) in Europe or, where the provider is American, must be subject to additional safeguards, such as excluding the possibility of access to the data by US intelligence services.

As a reminder, the procurement contract and/or a “Data Processing Agreement” must define the characteristics of the entrusted processing and the commitments of the publisher, in particular regarding the use of further sub-processors, and must include as an appendix the health data hosting clauses required by the regulations.

In addition to the security measures specific to the AI system used, the usual security measures must be applied to the processing of personal data and, where the processing or its results present risks for individuals, assessed within the framework of an impact analysis.

Particular attention must be paid to the conditions under which the publisher may reuse the data processed by means of the AI system in order to improve its algorithm, a reuse for which the organization, in its capacity as data controller, remains solely responsible.

In addition, and as is customary, healthcare organizations must ensure the appropriate level of medical CE marking: Class IIa in accordance with the Medical Device Regulation for diagnostic or therapeutic aid software, Class I no longer allowing certain diagnostic aid devices to remain on the market.

Finally, the technical documentation for high-risk AI systems must comply with the requirements of the draft AI Regulation. The instructions for use must not only enable the organization's users to interpret the system's results and use it appropriately, i.e. in accordance with its intended purpose, but also enable them to exercise human oversight.

It should be noted that the penalty regimes for data protection and for artificial intelligence are similar, although the latter can reach €30 million for breaches of the data governance requirements.


Author

Marguerite Brac de La Perrière, associate lawyer at Lerins & BCW, is a digital health expert.
She assists healthcare players with their regulatory compliance, development and growth, particularly regarding data processing or reuse and contracts.
mbracdelaperriere@lerinsbcw.com

#Therapeutic #GDPR #Security
