On 18 July 2023, the Personal Data Protection Commission (PDPC) published a set of Proposed Advisory Guidelines on Use of Personal Data in AI Recommendation and Decision Systems (Proposed AI Guidelines) for public consultation.
The Proposed AI Guidelines aim to clarify how the Personal Data Protection Act 2012 (PDPA) applies to organisations that collect and use personal data to develop and deploy systems embedding machine learning (ML) models (AI Systems), whether those systems make decisions autonomously or assist a human decision-maker through recommendations and predictions.
Summary
The Proposed AI Guidelines particularly seek to address:
How organisations can rely on exceptions under the PDPA to use personal data in the development of AI Systems; and
How to meet the PDPA's consent, notification and accountability requirements when collecting personal data for use in AI Systems that produce decisions, recommendations or predictions.
The Proposed AI Guidelines include guidance on the following areas:
Business Improvement Exception and Research Exception. The Proposed AI Guidelines contain illustrations of when organisations may not need to obtain consent to collect or use personal data in AI Systems for business improvement purposes (Business Improvement Exception) or research purposes (Research Exception).
The Business Improvement Exception may be relevant (among other scenarios) where organisations collect or use personal data to develop AI Systems with the aim of improving operational efficiency by supporting decision-making, or to offer new or more personalised products and/or services. Examples include developing or enhancing recommendation engines in social media services that offer targeted content to users based on their browsing history, or job assignment systems that automatically assign jobs to platform workers.
The Research Exception may be relevant (among other scenarios) where organisations conduct commercial research to advance science and engineering without a product development roadmap.
Considerations when Using Personal Data. Organisations should implement legal, technical and process controls for data protection in designing, training, testing or monitoring AI Systems. For example, organisations should practise data minimisation by using only personal data containing attributes required to train and improve the AI System or ML model, and only use the volume of personal data necessary to train the AI System or ML model based on relevant time periods and other filters. Organisations should also anonymise datasets as far as possible instead of using personal data.
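By way of illustration, the sketch below shows one way these data minimisation steps could be applied to a tabular training dataset in Python. The column names, the two-year window and the salted-hash step are assumptions made for illustration rather than techniques prescribed by the Proposed AI Guidelines; note also that salted hashing is pseudonymisation rather than full anonymisation, so further de-identification may still be needed.

```python
import hashlib
import pandas as pd

# Illustrative assumptions: the attribute names and the two-year window are not
# taken from the Proposed AI Guidelines.
REQUIRED_ATTRIBUTES = ["user_id", "viewing_history", "watch_completion", "timestamp"]
TRAINING_WINDOW = pd.Timedelta(days=730)


def minimise_and_pseudonymise(raw: pd.DataFrame, salt: str) -> pd.DataFrame:
    """Keep only the attributes and time period needed to train the model,
    then replace the direct identifier with a salted hash."""
    df = raw[REQUIRED_ATTRIBUTES].copy()           # data minimisation: attributes
    cutoff = df["timestamp"].max() - TRAINING_WINDOW
    df = df[df["timestamp"] >= cutoff]             # data minimisation: volume / time period
    df["user_id"] = df["user_id"].astype(str).apply(
        lambda uid: hashlib.sha256((salt + uid).encode()).hexdigest()
    )                                              # remove the direct identifier
    return df
```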
Consent and Notification. Organisations that deploy AI Systems to provide recommendations, predictions or decisions based on individuals’ personal data must comply with PDPA consent and notification requirements unless relevant exceptions apply. Organisations should include the following information in notices to individuals: (i) product functions that require processing of personal data (e.g. recommendation of movies); (ii) types of personal data that will be processed (e.g. movie viewing history); (iii) how personal data processing is relevant to product features (e.g. analysis of users’ viewing history to make movie recommendations); and (iv) specific features of personal data that are more likely to influence product features (e.g. whether a movie was viewed to completion or multiple times). A structured sketch of these notice elements follows.
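Purely as a rough illustration, the four notice elements above could be captured in a structured form along the following lines, reusing the movie-recommendation example from the Proposed AI Guidelines; the class and field names are assumptions and not part of the guidelines.

```python
from dataclasses import dataclass
from typing import List


@dataclass
class ProcessingNotice:
    """Illustrative structure for the four notice elements described above."""
    product_function: str             # (i) function that requires personal data
    data_types: List[str]             # (ii) types of personal data processed
    relevance: str                    # (iii) how processing supports the feature
    influential_features: List[str]   # (iv) data features most likely to influence output


movie_notice = ProcessingNotice(
    product_function="Recommendation of movies",
    data_types=["Movie viewing history"],
    relevance="Users' viewing history is analysed to recommend movies they may enjoy",
    influential_features=["Whether a movie was viewed to completion", "Repeat views"],
)
```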
Accountability. Organisations that use AI Systems should be transparent and include relevant practices and safeguards in their written personal data policies, which should contain a level of detail proportionate to the risks in each use case (e.g. taking into account potential harm to individuals and the level of autonomy in the AI System).
Procurement of AI Systems. Service providers that are engaged by other organisations to develop and deploy bespoke or customised AI Systems (e.g. system integrators) may be regarded as data intermediaries (i.e. processors), in which case they should: (i) at the pre-processing stage, use techniques such as data mapping and labelling to keep track of data used in the training dataset; and (ii) maintain a provenance record to document the lineage of training data, as sketched below.
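The sketch below shows one minimal way a data intermediary might keep such a provenance record in code; the structure and field names are illustrative assumptions rather than anything specified in the Proposed AI Guidelines.

```python
from dataclasses import dataclass, field
from datetime import datetime, timezone
from typing import List


@dataclass
class ProvenanceEntry:
    """One step in the lineage of a training dataset."""
    dataset_name: str
    source: str           # e.g. the upstream system or controller supplying the data
    transformation: str   # e.g. "filtered to last 24 months", "pseudonymised user_id"
    performed_by: str
    performed_at: datetime = field(default_factory=lambda: datetime.now(timezone.utc))


@dataclass
class ProvenanceRecord:
    """Running log of every dataset and transformation used in training."""
    entries: List[ProvenanceEntry] = field(default_factory=list)

    def log(self, entry: ProvenanceEntry) -> None:
        self.entries.append(entry)


# Hypothetical usage: record what went into a training dataset and how it was prepared.
record = ProvenanceRecord()
record.log(ProvenanceEntry(
    dataset_name="viewing_history_v3",
    source="controller_export_2023_06",
    transformation="kept required attributes only; pseudonymised user_id",
    performed_by="data-engineering",
))
```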
How could it be relevant for you?
PDPC guidelines are not legally binding, but set out how the PDPC intends to interpret and enforce the PDPA. Companies that deploy AI Systems should therefore take into account the PDPC’s final guidelines in planning and structuring their activities.
Next steps
The public consultation on the Proposed AI Guidelines closed on 31 August 2023. The PDPC can be expected to review the comments received before publishing its final guidelines. Companies that deploy or intend to deploy AI Systems should monitor this space for further developments.