The Regulatory Landscape Has Changed
In 2026, AI systems processing personal data face overlapping regulations: GDPR in Europe, the CCPA in California, India's DPDP Act, and the EU AI Act (now in enforcement). Together these impose AI-specific obligations that did not exist when the GDPR was drafted. Builders who treated privacy as a legal formality in 2023 are now facing enforcement actions.
The GDPR-AI Intersection
Key GDPR requirements that AI systems must navigate: restrictions on solely automated decision-making with legal or similarly significant effects, often read as a right to explanation (Art. 22); data minimisation, processing only what you need (Art. 5(1)(c)); purpose limitation, meaning data collected for one purpose cannot be used for model training without a fresh legal basis such as consent (Art. 5(1)(b)); and the right to erasure (Art. 17), which raises the hard question of whether you can delete a user's data from a trained model.
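Purpose limitation in particular is easy to enforce mechanically if consent is recorded alongside the data. A minimal sketch (the `DataRecord` type and purpose names are hypothetical, not from any specific library):

```python
from dataclasses import dataclass, field

@dataclass
class DataRecord:
    subject_id: str
    value: str
    # Purposes the data subject consented to at collection time.
    consented_purposes: set = field(default_factory=set)

def allowed_for(record: DataRecord, purpose: str) -> bool:
    """Purpose limitation: a record may only be used for purposes
    covered by the consent captured when it was collected."""
    return purpose in record.consented_purposes

record = DataRecord("user-1", "jane@example.com", {"billing"})
assert allowed_for(record, "billing")
assert not allowed_for(record, "model_training")  # would need fresh consent
```

Checking this at the point where training batches are assembled, rather than relying on policy documents, turns purpose limitation from a legal promise into an enforced invariant.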
PII Detection and Redaction
Before data enters AI pipelines, PII (personally identifiable information) must be detected and handled appropriately. The Elastic Edge AI suite includes an `elastic-pii-detect` plugin that identifies PII at ingest time using ONNX-based NER, enabling automatic redaction or flagging before data is indexed or used for training.
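The general pattern, independent of any particular plugin, is detect-then-redact before data is persisted. A simplified sketch using regular expressions (illustrative patterns only; this is not the `elastic-pii-detect` API, and production systems use NER models that catch far more than these two PII types):

```python
import re

# Toy patterns for two common PII types; real detectors use ML-based NER.
PII_PATTERNS = {
    "EMAIL": re.compile(r"[\w.+-]+@[\w-]+\.[\w.]+"),
    "SSN": re.compile(r"\b\d{3}-\d{2}-\d{4}\b"),
}

def redact(text: str) -> str:
    """Replace detected PII with typed placeholders before indexing."""
    for label, pattern in PII_PATTERNS.items():
        text = pattern.sub(f"[{label}]", text)
    return text

print(redact("Contact jane.doe@example.com, SSN 123-45-6789"))
# → Contact [EMAIL], SSN [SSN]
```

Keeping the placeholder typed (`[EMAIL]` rather than `***`) preserves analytical value while removing the identifier itself.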
The Training Data Problem
If you fine-tune a model on customer data, that data is potentially embedded in the model weights. Responding to a deletion request by removing the fine-tuning data is insufficient — you may need to retrain or use machine unlearning techniques. Design your training data pipeline with deletion in mind from day one.
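One concrete way to design for deletion is to keep a ledger mapping each data subject to the model versions their data trained, so an erasure request immediately identifies which models need retraining. A minimal sketch (the `TrainingLedger` class is hypothetical, for illustration):

```python
from collections import defaultdict

class TrainingLedger:
    """Maps data subjects to the model versions their data contributed to,
    so a deletion request resolves to a concrete list of retrain jobs."""

    def __init__(self):
        self._contributions = defaultdict(set)

    def record(self, subject_id: str, model_version: str) -> None:
        """Called whenever a subject's data enters a training run."""
        self._contributions[subject_id].add(model_version)

    def models_to_retrain(self, subject_id: str) -> list:
        """Handle an erasure request: return (and forget) the affected models."""
        return sorted(self._contributions.pop(subject_id, set()))

ledger = TrainingLedger()
ledger.record("user-42", "v1.0")
ledger.record("user-42", "v1.1")
print(ledger.models_to_retrain("user-42"))  # → ['v1.0', 'v1.1']
```

Without this kind of provenance tracking, honouring a deletion request means either retraining everything or guessing, both of which are expensive.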
Privacy-Preserving AI Techniques
For high-sensitivity applications, consider: differential privacy (adding calibrated noise to training to prevent memorisation), federated learning (training on data that never leaves the user’s device), and on-device inference (running models locally, never sending data to a server). These are no longer academic — production implementations exist for all three.
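To make the differential privacy idea concrete, here is a sketch of the classic Laplace mechanism applied to a count query: noise calibrated to the query's sensitivity (one person changes a count by at most 1) and a privacy budget epsilon. This illustrates the principle only; real DP training uses mechanisms like DP-SGD, not this simple query-level noise:

```python
import math
import random

def dp_count(values, predicate, epsilon=1.0):
    """Differentially private count via the Laplace mechanism.
    Sensitivity is 1 for a count, so noise scale is 1/epsilon."""
    true_count = sum(1 for v in values if predicate(v))
    # Sample Laplace(0, 1/epsilon) noise by inverse-CDF sampling.
    u = random.random() - 0.5
    scale = 1.0 / epsilon
    noise = -scale * math.copysign(1.0, u) * math.log(1 - 2 * abs(u))
    return true_count + noise
```

Smaller epsilon means more noise and stronger privacy; individual queries become noisier, but repeated queries average out to the true value, which is exactly why DP systems must also track cumulative budget spend.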