Privacy by Design Under GDPR Article 25: Implementation Guide for Product and Engineering Teams
Article 25 of the GDPR establishes two complementary obligations: data protection by design and data protection by default. These are not aspirational goals. They are legally binding requirements that apply to every controller, enforceable with fines of up to EUR 10 million or 2% of total annual worldwide turnover, whichever is higher, under Article 83(4). Yet for many product and engineering teams, translating these legal obligations into practical development workflows remains a challenge.
This guide bridges the gap between the regulatory text and day-to-day software development. It provides a practical framework for embedding privacy into your product lifecycle, from initial design through deployment and beyond.
Understanding Article 25
Article 25(1) requires the controller to implement appropriate technical and organisational measures designed to implement data protection principles in an effective manner and to integrate the necessary safeguards into the processing. This obligation applies at the time of the determination of the means for processing and at the time of the processing itself.
Article 25(2) requires that, by default, only personal data which are necessary for each specific purpose of the processing are processed. This applies to the amount of data collected, the extent of processing, the period of storage, and accessibility. In particular, personal data must not, by default, be made accessible to an indefinite number of natural persons without the individual's intervention.
The EDPB Guidelines 4/2019 on Article 25 clarify that the obligation to implement data protection by design is an ongoing, dynamic requirement that must be assessed continuously throughout the lifecycle of the processing.
The 7 Foundational Principles
Ann Cavoukian's foundational framework for Privacy by Design, originally developed in the 1990s, remains the most widely referenced conceptual model. These seven principles provide the philosophical foundation for Article 25:
- Proactive not reactive; preventative not remedial. Anticipate and prevent privacy-invasive events before they happen. Do not wait for privacy risks to materialise.
- Privacy as the default setting. Ensure that personal data is automatically protected in any given system. No action should be required by the individual to protect their privacy.
- Privacy embedded into design. Build privacy into the design and architecture of IT systems and business practices. Privacy is not a bolt-on addition.
- Full functionality: positive-sum, not zero-sum. Accommodate all legitimate interests and objectives. Privacy should not come at the cost of functionality, and functionality should not come at the cost of privacy.
- End-to-end security: full lifecycle protection. Ensure security measures adequate to the data are in place throughout the entire lifecycle, from collection through deletion.
- Visibility and transparency: keep it open. Assure all stakeholders that processes involving personal data are operating according to stated promises and objectives, subject to independent verification.
- Respect for user privacy: keep it user-centric. Architects and operators must keep the interests of the individual uppermost by offering strong privacy defaults, appropriate notice, and user-friendly options.
Integrating Privacy by Design into Agile Workflows
Privacy by Design is often perceived as incompatible with agile development. In practice, it integrates naturally into sprint-based workflows when approached correctly.
Sprint Planning
During sprint planning, every user story that involves personal data should trigger a privacy assessment. Add a privacy checklist to your story template:
- Does this feature collect new personal data?
- Does this feature change how existing personal data is processed?
- Does this feature share personal data with new recipients?
- Does this feature involve automated decision-making or profiling?
- Can the feature be implemented with less personal data (data minimisation)?
If any answer is yes, the story requires privacy review before implementation begins. This review can be performed by the DPO, a privacy engineer, or a trained team member, depending on complexity.
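The triage logic above can be expressed directly in code, which makes it easy to embed in story templates or tooling. The sketch below is a minimal illustration; the class and field names are hypothetical, not part of any standard:

```python
from dataclasses import dataclass, fields

@dataclass
class PrivacyChecklist:
    """Privacy triage questions for a user story (True = "yes")."""
    collects_new_personal_data: bool = False
    changes_existing_processing: bool = False
    shares_with_new_recipients: bool = False
    automated_decision_making_or_profiling: bool = False
    could_use_less_personal_data: bool = False

    def requires_privacy_review(self) -> bool:
        # Any "yes" answer escalates the story to privacy review
        # before implementation begins.
        return any(getattr(self, f.name) for f in fields(self))
```

A story that answers yes to even one question (`PrivacyChecklist(collects_new_personal_data=True)`) is flagged for review; a story with all defaults passes through.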
User Stories with Privacy Requirements
Privacy requirements should be explicit in user stories. For example:
Standard user story: "As a user, I want to view my order history so that I can track my purchases."
Privacy-enhanced: "As a user, I want to view my order history so that I can track my purchases. Acceptance criteria: (1) Only orders belonging to the authenticated user are displayed. (2) Order data is retrieved with the minimum fields necessary for display. (3) Delivery addresses are partially masked by default. (4) Access is logged for audit purposes."
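Acceptance criterion (3), partial masking by default, is straightforward to implement and test. A minimal sketch (the function name and prefix length are illustrative choices, not a prescribed standard):

```python
def mask_address(address: str, visible_prefix: int = 6) -> str:
    """Partially mask a delivery address for default display.

    Only the first few characters are shown; the remainder is replaced
    with asterisks so the full address is not exposed unless the user
    explicitly requests it.
    """
    if len(address) <= visible_prefix:
        return "*" * len(address)
    return address[:visible_prefix] + "*" * (len(address) - visible_prefix)
```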
Definition of Done
Your team's definition of done should include privacy criteria:
- Privacy checklist reviewed and completed
- Data minimisation confirmed (no unnecessary data collection or retention)
- Access controls implemented and tested
- Privacy notice updated if new data collection is introduced
- Data retention aligned with documented retention policy
- DPIA updated if the feature changes the risk profile
Data Minimisation Patterns
Data minimisation (Article 5(1)(c)) is the most practical expression of Privacy by Design. It requires that personal data be adequate, relevant, and limited to what is necessary for the purposes of processing. Effective data minimisation patterns include:
Collection minimisation:
- Collect only the fields required for the stated purpose
- Use progressive disclosure: collect minimal data initially, request additional data only when needed
- Avoid "nice to have" fields: if the data is not required for processing, do not collect it
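Collection minimisation can be enforced server-side, so that extra fields are dropped even if a client sends them. A minimal sketch, assuming a hypothetical signup endpoint whose only required fields are email and password:

```python
# Hypothetical minimal field set for the stated purpose (account signup).
REQUIRED_SIGNUP_FIELDS = {"email", "password"}

def minimise_submission(form: dict) -> dict:
    """Drop any submitted field not required for the stated purpose.

    Even if the client sends "nice to have" fields (phone, date of
    birth, ...), they are discarded before anything is stored.
    """
    return {k: v for k, v in form.items() if k in REQUIRED_SIGNUP_FIELDS}
```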
Processing minimisation:
- Process data at the highest level of aggregation that serves the purpose
- Use pseudonymisation where individual identification is not required for processing
- Implement data separation: store identifiers separately from behavioural data
Retention minimisation:
- Define and enforce retention periods for every data category
- Implement automated deletion or anonymisation at the end of the retention period
- Review retention periods annually and reduce them where possible
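Automated retention enforcement usually takes the form of a scheduled job that compares each record's age against a per-category policy. A minimal sketch, with hypothetical category names and retention periods:

```python
from datetime import datetime, timedelta, timezone
from typing import Optional

# Hypothetical retention policy: maximum age in days per data category.
RETENTION_DAYS = {"order_history": 365 * 3, "support_tickets": 365, "web_logs": 90}

def is_expired(category: str, created_at: datetime,
               now: Optional[datetime] = None) -> bool:
    """Return True when a record has outlived its retention period."""
    now = now or datetime.now(timezone.utc)
    return now - created_at > timedelta(days=RETENTION_DAYS[category])

def purge(records: list) -> list:
    """Keep only records still within retention; in production the
    expired records would be deleted or anonymised, not just filtered."""
    return [r for r in records if not is_expired(r["category"], r["created_at"])]
```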
Access minimisation:
- Apply role-based access controls: users should only access the data they need for their role
- Implement time-based access: temporary access for specific tasks, automatically revoked
- Log and audit all access to personal data
Privacy-Enhancing Technologies
Article 25(1) and Recital 78 of the GDPR specifically reference pseudonymisation as a technical measure, and Article 32(1)(a) adds encryption. A broader set of privacy-enhancing technologies (PETs) can strengthen your Privacy by Design implementation:
Pseudonymisation (Article 4(5)):
Replace directly identifying data (names, email addresses) with pseudonymous identifiers. The mapping between pseudonyms and identities is stored separately with strict access controls. Pseudonymisation reduces risk in the event of a breach while still allowing data to be re-identified when necessary for legitimate processing.
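One common implementation is a keyed hash: the pseudonym is deterministic (so records can still be joined) while re-identification requires the key, which lives in a separate, access-controlled store. A minimal sketch using HMAC-SHA-256 from the standard library; the key value here is a placeholder:

```python
import hashlib
import hmac

# In production, fetch this from a separate, access-controlled secret
# store (e.g. a KMS). Anyone holding the key can re-link pseudonyms
# to identities, so its storage must be segregated from the data.
PSEUDONYM_KEY = b"placeholder-key-stored-separately"

def pseudonymise(identifier: str) -> str:
    """Replace a direct identifier (e.g. an email address) with a
    stable keyed pseudonym. The same input always yields the same
    pseudonym, so joins across datasets still work, but reversal
    requires the separately held key."""
    return hmac.new(PSEUDONYM_KEY, identifier.encode("utf-8"),
                    hashlib.sha256).hexdigest()
```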
Encryption:
Implement encryption at rest (AES-256 or equivalent) for stored personal data and encryption in transit (TLS 1.3) for all data transmissions. Encryption renders data unintelligible to unauthorised parties and is explicitly recognised by Article 34(3)(a) as a factor that may eliminate the obligation to notify data subjects of a breach.
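Encryption at rest is normally handled at the database or disk layer (or via a library such as `cryptography`), but the in-transit requirement can be enforced directly in application code. A minimal sketch using Python's standard `ssl` module to refuse anything below TLS 1.3:

```python
import ssl

def strict_tls_context() -> ssl.SSLContext:
    """Client-side TLS context that refuses connections below TLS 1.3.

    create_default_context() already enables certificate and hostname
    verification; raising minimum_version closes off older protocol
    versions entirely.
    """
    ctx = ssl.create_default_context()
    ctx.minimum_version = ssl.TLSVersion.TLSv1_3
    return ctx
```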
Access controls:
Implement granular, role-based access controls with the principle of least privilege. Combine with multi-factor authentication for sensitive data access. Monitor and log all access for audit and anomaly detection.
Anonymisation:
Where personal data is no longer needed for its original purpose but the underlying patterns are valuable (e.g., analytics), apply anonymisation techniques that render re-identification impossible. Truly anonymised data falls outside GDPR scope entirely. However, be cautious: the Article 29 Working Party Opinion 05/2014 warns that many purported anonymisation techniques can be reversed with auxiliary data.
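One widely used check on this risk is k-anonymity: every combination of quasi-identifier values (postcode, age band, and so on) must be shared by at least k records. A minimal sketch of the measurement, not a complete anonymisation pipeline:

```python
from collections import Counter

def k_anonymity(records: list, quasi_identifiers: list) -> int:
    """Return the smallest equivalence-class size over the given
    quasi-identifiers. A dataset is k-anonymous for the returned k;
    higher k means harder (though, per the Working Party opinion,
    never provably impossible) re-identification."""
    groups = Counter(tuple(r[q] for q in quasi_identifiers) for r in records)
    return min(groups.values())
```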
Differential privacy:
For analytics and machine learning use cases, differential privacy adds calibrated noise to datasets to prevent individual records from being inferred. This technique allows statistical analysis while providing mathematical guarantees against re-identification.
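For a count query, the standard mechanism is Laplace noise with scale 1/ε, since a count changes by at most 1 when one individual is added or removed (sensitivity 1). A minimal stdlib sketch using inverse transform sampling; function names are illustrative:

```python
import math
import random

def laplace_noise(scale: float) -> float:
    """Sample from Laplace(0, scale) via inverse transform sampling."""
    u = random.random() - 0.5
    return -scale * math.copysign(1.0, u) * math.log(1.0 - 2.0 * abs(u))

def dp_count(true_count: int, epsilon: float) -> float:
    """epsilon-differentially private count: a count query has
    sensitivity 1, so Laplace noise with scale 1/epsilon gives the
    standard epsilon-DP guarantee."""
    return true_count + laplace_noise(1.0 / epsilon)
```

Smaller ε means stronger privacy but noisier answers; individual noisy counts fluctuate while the average over many releases stays close to the true value.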
DPIA Integration
Privacy by Design and Data Protection Impact Assessments (Article 35) are complementary processes. Where a new feature or system triggers a DPIA requirement, the Privacy by Design analysis feeds directly into the DPIA:
- The data minimisation assessment informs the DPIA necessity and proportionality analysis
- The PET selection informs the DPIA risk mitigation section
- The access control design informs the DPIA security measures section
- The retention policy informs the DPIA storage limitation analysis
Integrate DPIA reviews into your sprint cycle for features that involve high-risk processing. This avoids the common anti-pattern of conducting DPIAs retrospectively, after a system is already built and deployed.
Measuring Privacy by Design Maturity
To move beyond ad hoc privacy integration, establish metrics that track your organisation's Privacy by Design maturity:
- Percentage of user stories with completed privacy checklists
- Number of features deployed without privacy review (target: zero)
- Mean time to resolve privacy findings from code review
- Data collection delta: number of new personal data fields added versus removed per quarter
- Retention compliance rate: percentage of data categories within their defined retention period
- DPIA completion rate: percentage of high-risk features with completed DPIAs before deployment
Building a Privacy Engineering Culture
Privacy by Design ultimately succeeds or fails based on whether engineering teams internalise privacy as a design constraint. This requires:
- Training: Regular privacy engineering training for developers, QA, and product managers
- Tooling: Privacy linting tools, data flow mapping integrations, and automated PIA triggers in your CI/CD pipeline
- Champions: Designate privacy champions within each engineering team who serve as the first point of contact for privacy questions
- Incentives: Recognise and reward privacy-protective design decisions. Privacy should be a quality attribute, not a bureaucratic obstacle.
- Feedback loops: Share enforcement actions, breach case studies, and audit findings with engineering teams to maintain awareness of real-world consequences
Article 25 is not a checkbox exercise. It is an ongoing commitment to building systems that respect individual privacy by design and by default. Organisations that embed this commitment into their engineering culture will find that privacy becomes an enabler, not a constraint: building user trust, reducing breach risk, and demonstrating the accountability that GDPR demands.