Data protection by design is mandated by GDPR Article 25, Saudi PDPL Article 20, and EU AI Act Article 9. Yet the canonical Privacy by Design (PbD) framework (Cavoukian, 2009) was formulated a decade before modern machine learning: it treats data as discrete records flowing through processing steps, not as statistical patterns learned from millions of individuals simultaneously. Four AI-specific challenges render classical PbD insufficient:

- The learning problem: training embeds personal data into model weights in non-erasable form.
- The inference problem: models generate new sensitive personal information from innocuous inputs.
- The opacity problem: AI decision processes are not transparent to affected individuals.
- The aggregation problem: combining individually innocuous data sources reveals sensitive personal information through AI inference.

No published framework provides AI-specific PbD architectural guidance with measurable compliance metrics and a systematic mapping to both PDPL and GDPR.
Source: Best Practices in Artificial Intelligence, Data Privacy. Privacy by Design in AI-Enhanced Personal Data Processing (PDF), g51286802e84