23 September, 2018
On 5 June 2018, the Personal Data Protection Commission (PDPC) in Singapore released a discussion paper on Artificial Intelligence (AI) and Personal Data. The PDPC is Singapore's data protection regulator, which oversees how organisations collect, use and disclose personal data. The discussion paper takes a technology-neutral and sector-agnostic approach and is intended to apply to a wide cross-section of organisations and industry bodies. This update outlines how a regulator is approaching Artificial Intelligence and will be of interest to any company that makes or uses automated data systems.
The framework divides the AI ecosystem into three components: those who make AI ("AI Developers"); those who use AI processes or sell AI-enabled devices ("User Companies"); and consumers. The obligations of the framework rest primarily on AI Developers and User Companies.
The obligations set out by the framework broadly fall into the following categories:
- The ability to explain how your AI-enabled product works (i.e. explainability) or, where that is not possible, to supervise the AI system to ensure that the results are accurate (i.e. verifiability);
- Good data practices for organisations, including knowing the provenance of data and its movement (data lineage), keeping good records throughout the AI value chain, and minimising the risk of inherent or latent biases in the dataset; and
- Open and transparent communication, both between AI Developers and User Companies as well as with consumers, with a view towards building trust in the AI ecosystem.
The discussion paper also outlines governance measures for organisations to consider that would allow them to be accountable to regulators for their AI decision-making processes, and suggests ways of building trust and managing relationships with consumers who interact with AI decision-making.
While no binding requirements have been imposed at this stage, the discussion paper offers insight into potential regulatory touchpoints and considerations. It may also give legal counsel and compliance officers a basis for securing resources to improve data practices in organisations that invest significantly in, or rely heavily on, AI.
By issuing a discussion paper, the regulator has made clear that it has its eye on the increasing use of AI across various sectors. The regulator's current opinion is that "governance frameworks around AI should be … 'light-touch'". The industry's response will determine how this view develops in future.
A copy of the discussion paper on Artificial Intelligence (AI) and Personal Data – Fostering responsible development and adoption of AI is available at this link.
For further information, please contact:
Andy Leck, Principal, Baker & McKenzie.Wong & Leow
andy.leck@bakermckenzie.com