# Data Protection Impact Assessment (DPIA) Template

## Ask Eve AI

**Date of Assessment**: [Date]
**Assessed By**: [Name, Role]
**Review Date**: [Date - recommend annual review]

---

## 1. Executive Summary

| Field | Details |
|-------|---------|
| **Processing Activity Name** | [e.g., "Job Candidate Assessment Specialist"] |
| **Brief Description** | [1-2 sentence summary] |
| **Risk Level** | ☐ Low ☐ Medium ☐ High |
| **DPIA Required?** | ☐ Yes ☐ No |
| **Status** | ☐ Draft ☐ Under Review ☐ Approved ☐ Requires Revision |

---

## 2. Description of the Processing

### 2.1 Nature of the Processing

**What Personal Data will be processed?**

- [ ] Contact information (name, email, phone)
- [ ] Identification data (ID numbers, passport)
- [ ] Professional data (CV, work history, qualifications)
- [ ] Assessment results or scores
- [ ] Communication records
- [ ] Behavioral data (how users interact with the system)
- [ ] Technical data (IP addresses, device information)
- [ ] Other: _______________

**Categories of Data Subjects:**

- [ ] Job applicants/candidates
- [ ] Employees
- [ ] Customers
- [ ] End users/consumers
- [ ] Other: _______________

**Volume of Data Subjects:**

- [ ] < 100
- [ ] 100-1,000
- [ ] 1,000-10,000
- [ ] > 10,000

### 2.2 Scope of the Processing

**What is the purpose of the processing?**

[Describe the specific business purpose, e.g., "To assess job candidates' suitability for specific roles by analyzing their responses to standardized questions"]

**How will the data be collected?**

- [ ] Directly from data subjects (forms, interviews)
- [ ] From third parties (recruiters, references)
- [ ] Automated collection (web forms, chatbots)
- [ ] Other: _______________

**Where will data be stored?**

- [ ] EU (specify: France - Scaleway)
- [ ] Non-EU (specify and justify): _______________

### 2.3 Context of the Processing

**Is this processing new or existing?**

- [ ] New processing activity
- [ ] Modification of existing processing
- [ ] Existing processing (periodic review)

**Who has access to the Personal Data?**

- [ ] Ask Eve AI employees (specify roles): _______________
- [ ] Customer/Tenant employees
- [ ] Partners (specify): _______________
- [ ] Sub-Processors (list): _______________
- [ ] Other: _______________

**How long will data be retained?**

[Specify retention period and justification, e.g., "Candidate data retained for 12 months to comply with recruitment record-keeping requirements"]
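Where the retention period above is enforced automatically (see also the "automated deletion process" item under Storage Limitation in section 3.5), a minimal sketch of a scheduled purge job is shown below. The `candidate_records` table, the `created_at` column, and the SQLite storage are hypothetical placeholders; adapt them to the actual data store and the retention period documented for this processing activity.

```python
"""Minimal sketch of an automated retention job (assumptions: a hypothetical
`candidate_records` table with a `created_at` ISO timestamp column)."""
import sqlite3
from datetime import datetime, timedelta, timezone

RETENTION_DAYS = 365  # e.g. the 12-month retention period cited in section 2.3


def purge_expired_records(db_path: str) -> int:
    """Delete records older than the documented retention period.

    Returns the number of rows removed so each run can be recorded for
    accountability purposes (Article 5(2)).
    """
    cutoff = datetime.now(timezone.utc) - timedelta(days=RETENTION_DAYS)
    # The connection context manager commits on success and rolls back on error.
    with sqlite3.connect(db_path) as conn:
        cursor = conn.execute(
            "DELETE FROM candidate_records WHERE created_at < ?",
            (cutoff.isoformat(),),
        )
    return cursor.rowcount


if __name__ == "__main__":
    deleted = purge_expired_records("askeve.db")  # hypothetical database file
    print(f"{datetime.now(timezone.utc).isoformat()} purged {deleted} expired records")
```

Running the job on a schedule (e.g., a daily cron entry) and logging the returned count gives documented evidence for the Storage Limitation checks in sections 3.5 and 9.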
---

## 3. Necessity and Proportionality Assessment

### 3.1 Lawful Basis

**What is the lawful basis for processing? (Article 6 GDPR)**

- [ ] **Consent** - Data subject has given explicit consent
- [ ] **Contract** - Processing necessary for contract performance
- [ ] **Legal obligation** - Required by law
- [ ] **Vital interests** - Necessary to protect someone's life
- [ ] **Public task** - Performing a public interest task
- [ ] **Legitimate interests** - Necessary for legitimate interests (requires balancing test)

**Justification:**

[Explain why this lawful basis applies]

### 3.2 Special Categories of Data (if applicable)

**Does the processing involve special categories of data? (Article 9 GDPR)**

- [ ] No
- [ ] Yes - racial or ethnic origin
- [ ] Yes - political opinions
- [ ] Yes - religious or philosophical beliefs
- [ ] Yes - trade union membership
- [ ] Yes - genetic data
- [ ] Yes - biometric data for identification
- [ ] Yes - health data
- [ ] Yes - sex life or sexual orientation data

**If yes, what is the additional lawful basis?**

[Article 9(2) provides specific conditions - specify which applies]

### 3.3 Automated Decision-Making

**Does the processing involve automated decision-making or profiling?**

- [ ] No
- [ ] Yes - automated decision-making WITH human oversight
- [ ] Yes - fully automated decision-making (no human intervention)

**If yes:**

**Does it produce legal effects or similarly significant effects?**

- [ ] No
- [ ] Yes (explain): _______________

**What safeguards are in place?**

- [ ] Right to obtain human intervention
- [ ] Right to express point of view
- [ ] Right to contest the decision
- [ ] Regular accuracy reviews
- [ ] Transparency about logic involved
- [ ] Other: _______________

### 3.4 Necessity Test

**Is the processing necessary to achieve the stated purpose?** ☐ Yes ☐ No

**Justification:**

[Explain why this specific processing is necessary and whether less intrusive alternatives were considered]

**Could the purpose be achieved with less data or through other means?**

☐ Yes (explain why not pursued): _______________
☐ No

### 3.5 Proportionality Test

**Is the processing proportionate to the purpose?** ☐ Yes ☐ No

**Data Minimization:**

- Are you collecting only the minimum data necessary? ☐ Yes ☐ No
- Have you considered pseudonymization or anonymization? ☐ Yes ☐ No ☐ N/A
- Can data be aggregated instead of individual records? ☐ Yes ☐ No ☐ N/A

**Storage Limitation:**

- Is the retention period justified and documented? ☐ Yes ☐ No
- Is there an automated deletion process? ☐ Yes ☐ No ☐ Planned

---

## 4. Stakeholder Consultation

### 4.1 Data Subject Consultation

**Have data subjects been consulted about this processing?** ☐ Yes ☐ No ☐ Not required

**If yes, how were they consulted?**

[Describe consultation method: surveys, focus groups, user research, etc.]

**Key concerns raised by data subjects:**

[List any concerns and how they were addressed]

### 4.2 DPO or Security Contact Consultation

**Has the DPO or security contact been consulted?** ☐ Yes ☐ No ☐ N/A (no formal DPO)

**Comments from DPO/Security Contact:**

[Record any recommendations or concerns]

---

## 5. Risk Assessment

### 5.1 Risk Identification

For each risk, assess:

- **Likelihood**: Negligible / Low / Medium / High
- **Severity**: Negligible / Low / Medium / High
- **Overall Risk**: Low / Medium / High / Very High

**Risk 1: Unauthorized Access or Data Breach**

**Description**: Personal data could be accessed by unauthorized parties due to security vulnerabilities.

| Assessment | Rating |
|------------|--------|
| Likelihood | ☐ Negligible ☐ Low ☐ Medium ☐ High |
| Severity (if occurs) | ☐ Negligible ☐ Low ☐ Medium ☐ High |
| **Overall Risk** | ☐ Low ☐ Medium ☐ High ☐ Very High |

**Risk 2: Discrimination or Bias in Automated Decisions**

**Description**: Automated processing could result in discriminatory outcomes or unfair treatment.
| Assessment | Rating |
|------------|--------|
| Likelihood | ☐ Negligible ☐ Low ☐ Medium ☐ High |
| Severity (if occurs) | ☐ Negligible ☐ Low ☐ Medium ☐ High |
| **Overall Risk** | ☐ Low ☐ Medium ☐ High ☐ Very High |

**Risk 3: Lack of Transparency**

**Description**: Data subjects may not understand how their data is processed or decisions are made.

| Assessment | Rating |
|------------|--------|
| Likelihood | ☐ Negligible ☐ Low ☐ Medium ☐ High |
| Severity (if occurs) | ☐ Negligible ☐ Low ☐ Medium ☐ High |
| **Overall Risk** | ☐ Low ☐ Medium ☐ High ☐ Very High |

**Risk 4: Inability to Exercise Data Subject Rights**

**Description**: Data subjects may have difficulty exercising their rights (access, erasure, portability, etc.).

| Assessment | Rating |
|------------|--------|
| Likelihood | ☐ Negligible ☐ Low ☐ Medium ☐ High |
| Severity (if occurs) | ☐ Negligible ☐ Low ☐ Medium ☐ High |
| **Overall Risk** | ☐ Low ☐ Medium ☐ High ☐ Very High |

**Risk 5: Data Quality Issues**

**Description**: Inaccurate or outdated data could lead to incorrect decisions or outcomes.

| Assessment | Rating |
|------------|--------|
| Likelihood | ☐ Negligible ☐ Low ☐ Medium ☐ High |
| Severity (if occurs) | ☐ Negligible ☐ Low ☐ Medium ☐ High |
| **Overall Risk** | ☐ Low ☐ Medium ☐ High ☐ Very High |

**Risk 6: Function Creep / Scope Expansion**

**Description**: Data collected for one purpose could be used for other purposes without consent.

| Assessment | Rating |
|------------|--------|
| Likelihood | ☐ Negligible ☐ Low ☐ Medium ☐ High |
| Severity (if occurs) | ☐ Negligible ☐ Low ☐ Medium ☐ High |
| **Overall Risk** | ☐ Low ☐ Medium ☐ High ☐ Very High |

**Additional Risks:**

[Add any processing-specific risks]

---

## 6. Mitigation Measures

For each identified risk, document mitigation measures:

### Risk 1: Unauthorized Access or Data Breach

**Mitigation Measures:**

- [ ] Encryption in transit (TLS 1.2+)
- [ ] Encryption at rest
- [ ] Multi-factor authentication
- [ ] Access controls (RBAC)
- [ ] Regular security audits
- [ ] WAF and DDoS protection (Bunny.net Shield)
- [ ] Multi-tenant data isolation
- [ ] Regular security training
- [ ] Incident response plan
- [ ] Other: _______________

**Residual Risk After Mitigation:** ☐ Low ☐ Medium ☐ High ☐ Very High

### Risk 2: Discrimination or Bias in Automated Decisions

**Mitigation Measures:**

- [ ] Regular bias testing of AI models
- [ ] Diverse training data sets
- [ ] Human review of automated decisions
- [ ] Clear criteria for decision-making
- [ ] Right to contest decisions
- [ ] Transparency about decision logic
- [ ] Regular fairness audits
- [ ] Monitoring of outcomes by demographic groups
- [ ] Ability to request explanation
- [ ] Other: _______________

**Residual Risk After Mitigation:** ☐ Low ☐ Medium ☐ High ☐ Very High
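The "monitoring of outcomes by demographic groups" measure listed above can be made concrete with a periodic pass-rate comparison. The sketch below is one common screening heuristic (the "four-fifths" rule of thumb, not a legal threshold): it flags any group whose pass rate falls below 80% of the best-performing group's rate. The record fields are hypothetical, and note that the demographic attribute itself is personal data (potentially a special category under Article 9) that would need its own lawful basis and should be collected on a voluntary, self-declared basis.

```python
"""Minimal sketch of outcome monitoring by demographic group (Risk 2 mitigation)."""
from collections import defaultdict


def pass_rates_by_group(assessments: list[dict]) -> dict[str, float]:
    """assessments: [{'group': 'A', 'passed': True}, ...] - fields are illustrative."""
    totals, passes = defaultdict(int), defaultdict(int)
    for a in assessments:
        totals[a["group"]] += 1
        if a["passed"]:
            passes[a["group"]] += 1
    return {group: passes[group] / totals[group] for group in totals}


def flag_disparate_impact(rates: dict[str, float], threshold: float = 0.8) -> list[str]:
    """Return groups whose pass rate is below `threshold` x the highest group rate."""
    if not rates:
        return []
    best = max(rates.values())
    return [g for g, r in rates.items() if best > 0 and r / best < threshold]


# Example: group B (0.40) is below 0.8 x group A (0.60), so it is flagged for review.
rates = pass_rates_by_group(
    [{"group": "A", "passed": True}] * 6 + [{"group": "A", "passed": False}] * 4
    + [{"group": "B", "passed": True}] * 4 + [{"group": "B", "passed": False}] * 6
)
print(rates, flag_disparate_impact(rates))
```

A flagged group does not prove discrimination; it triggers the human fairness review and bias testing already listed as mitigation measures, and the result should be recorded as evidence for section 9.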
### Risk 3: Lack of Transparency

**Mitigation Measures:**

- [ ] Clear Privacy Policy explaining processing
- [ ] Explicit consent mechanisms
- [ ] Plain language explanations
- [ ] Information provided before data collection
- [ ] Explanation of automated decision logic
- [ ] Contact information for questions
- [ ] Regular communication with data subjects
- [ ] Privacy-by-design approach (anonymous until consent)
- [ ] Other: _______________

**Residual Risk After Mitigation:** ☐ Low ☐ Medium ☐ High ☐ Very High

### Risk 4: Inability to Exercise Data Subject Rights

**Mitigation Measures:**

- [ ] Clear procedures for rights requests
- [ ] Multiple request channels (email, helpdesk)
- [ ] 30-day response timeframe
- [ ] Technical capability to extract data
- [ ] Data portability in standard formats
- [ ] Secure deletion processes
- [ ] Account disabling/restriction capability
- [ ] Identity verification procedures
- [ ] Other: _______________

**Residual Risk After Mitigation:** ☐ Low ☐ Medium ☐ High ☐ Very High

### Risk 5: Data Quality Issues

**Mitigation Measures:**

- [ ] Data validation on input
- [ ] Regular data accuracy reviews
- [ ] Ability for data subjects to correct errors
- [ ] Clear data update procedures
- [ ] Data quality monitoring
- [ ] Source verification for third-party data
- [ ] Archiving of outdated data
- [ ] Other: _______________

**Residual Risk After Mitigation:** ☐ Low ☐ Medium ☐ High ☐ Very High

### Risk 6: Function Creep / Scope Expansion

**Mitigation Measures:**

- [ ] Documented purpose limitation
- [ ] Access controls preventing unauthorized use
- [ ] Regular compliance audits
- [ ] Privacy Policy clearly states purposes
- [ ] Consent required for new purposes
- [ ] Technical controls preventing misuse
- [ ] Staff training on data protection
- [ ] Other: _______________

**Residual Risk After Mitigation:** ☐ Low ☐ Medium ☐ High ☐ Very High

### Additional Mitigation Measures

[Document any additional mitigation measures not covered above]

---

## 7. Data Subject Rights Implementation

**How will you ensure data subjects can exercise their rights?**

### Right of Access (Article 15)

- [ ] Procedure documented
- [ ] Technical capability implemented
- [ ] Response within 30 days
- Method: _______________

### Right to Rectification (Article 16)

- [ ] Procedure documented
- [ ] Technical capability implemented
- [ ] Response within 30 days
- Method: _______________

### Right to Erasure (Article 17)

- [ ] Procedure documented
- [ ] Technical capability implemented
- [ ] Response within 30 days
- Method: _______________
- Limitations: _______________

### Right to Restriction (Article 18)

- [ ] Procedure documented
- [ ] Technical capability implemented (account disabling)
- [ ] Response within 30 days

### Right to Data Portability (Article 20)

- [ ] Procedure documented
- [ ] Technical capability implemented
- Export format: JSON / CSV / XML / Other: _______________

### Right to Object (Article 21)

- [ ] Procedure documented
- [ ] Opt-out mechanisms implemented
- [ ] Clear in Privacy Policy

### Rights Related to Automated Decision-Making (Article 22)

- [ ] Human intervention available
- [ ] Explanation of logic provided
- [ ] Right to contest implemented
- [ ] Documented in Privacy Policy

---

## 8. Privacy by Design and Default

**Privacy Enhancing Technologies Implemented:**

- [ ] Data minimization (collect only necessary data)
- [ ] Pseudonymization (where applicable)
- [ ] Anonymization (where applicable)
- [ ] Anonymous interaction until consent (privacy-by-design)
- [ ] Encryption (in transit and at rest)
- [ ] Access controls and authentication
- [ ] Audit logging
- [ ] Secure deletion
- [ ] Data isolation (multi-tenant architecture)
- [ ] Other: _______________

**Default Settings:**

- [ ] Most privacy-protective settings by default
- [ ] Opt-in (not opt-out) for non-essential processing
- [ ] Clear consent mechanisms before data collection
- [ ] Limited data sharing by default
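For the pseudonymization item listed above, one common approach is to replace a direct identifier with a keyed hash so records can still be linked for analytics without exposing the identifier. The sketch below is illustrative only: the `PSEUDONYM_KEY` environment variable and the example identifier are assumptions, and in practice the key must be generated securely and stored separately from the pseudonymized data (pseudonymized data remains personal data under the GDPR).

```python
"""Minimal sketch of pseudonymization via a keyed hash (HMAC-SHA256)."""
import hashlib
import hmac
import os

# Hypothetical key source; in production this would come from a secrets manager,
# held separately from the pseudonymized records.
PSEUDONYM_KEY = os.environ.get("PSEUDONYM_KEY", "change-me").encode()


def pseudonymize(identifier: str) -> str:
    """Deterministic pseudonym for a direct identifier such as an email address."""
    return hmac.new(PSEUDONYM_KEY, identifier.strip().lower().encode(), hashlib.sha256).hexdigest()


# The same input always maps to the same pseudonym, so aggregate analysis still
# works; only a party holding the separately stored key can reproduce the mapping.
print(pseudonymize("candidate@example.com"))
```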
---

## 9. Compliance with Principles

**For each GDPR principle, confirm compliance:**

### Lawfulness, Fairness, Transparency (Article 5(1)(a))

- [ ] Lawful basis identified and documented
- [ ] Processing is fair and transparent
- [ ] Privacy Policy clearly explains processing
- Evidence: _______________

### Purpose Limitation (Article 5(1)(b))

- [ ] Specific purposes documented
- [ ] Data not used for incompatible purposes
- [ ] New purposes require new consent/legal basis
- Evidence: _______________

### Data Minimization (Article 5(1)(c))

- [ ] Only necessary data collected
- [ ] Regular review of data collected
- [ ] Excess data not retained
- Evidence: _______________

### Accuracy (Article 5(1)(d))

- [ ] Mechanisms to ensure data accuracy
- [ ] Ability to correct inaccurate data
- [ ] Regular data quality reviews
- Evidence: _______________

### Storage Limitation (Article 5(1)(e))

- [ ] Retention periods defined and documented
- [ ] Automated deletion where appropriate
- [ ] Justification for retention documented
- Evidence: _______________

### Integrity and Confidentiality (Article 5(1)(f))

- [ ] Appropriate security measures implemented
- [ ] Protection against unauthorized access
- [ ] Encryption and access controls in place
- Evidence: See Annex 2 of DPA

### Accountability (Article 5(2))

- [ ] Documentation of compliance measures
- [ ] Records of processing activities maintained
- [ ] DPIA conducted and documented
- [ ] DPA in place with processors
- Evidence: This DPIA, DPA with customers

---

## 10. International Transfers

**Does this processing involve transfer to third countries?**

☐ No - all processing within EU
☐ Yes (complete below)

**If yes:**

**Country/Region:** _______________

**Transfer Mechanism:**

- [ ] Adequacy decision (Article 45)
- [ ] Standard Contractual Clauses (Article 46)
- [ ] Binding Corporate Rules (Article 47)
- [ ] Other: _______________

**Transfer Impact Assessment Completed?** ☐ Yes ☐ No

**Additional Safeguards:**

[Document supplementary measures to ensure adequate protection]

---

## 11. Documentation and Records

**Documentation Maintained:**

- [ ] This DPIA
- [ ] Privacy Policy
- [ ] Data Processing Agreement
- [ ] Consent records (if applicable)
- [ ] Records of processing activities (Article 30)
- [ ] Data breach register
- [ ] Data Subject rights request log
- [ ] Staff training records
- [ ] Sub-processor agreements

**Record of Processing Activities (Article 30) Completed?** ☐ Yes ☐ No ☐ In Progress
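The Data Subject rights request log listed above pairs naturally with deadline tracking against the 30-day response window used in section 7. A minimal sketch of such a register is shown below; the field names, request identifiers, and in-memory storage are illustrative only, and a real register would persist entries and link to identity-verification evidence.

```python
"""Minimal sketch of a rights-request register with response-deadline tracking."""
from dataclasses import dataclass
from datetime import date, timedelta
from typing import Optional

RESPONSE_WINDOW_DAYS = 30  # response window used throughout this template


@dataclass
class RightsRequest:
    request_id: str            # e.g. "DSR-001" (hypothetical numbering scheme)
    right: str                 # e.g. "access", "erasure", "portability"
    received: date
    channel: str               # e.g. "email", "helpdesk"
    completed: Optional[date] = None

    @property
    def due(self) -> date:
        return self.received + timedelta(days=RESPONSE_WINDOW_DAYS)

    @property
    def overdue(self) -> bool:
        return self.completed is None and date.today() > self.due


register = [RightsRequest("DSR-001", "access", date(2024, 1, 10), "email")]
for request in register:
    print(request.request_id, request.right, "due", request.due, "overdue:", request.overdue)
```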
---

## 12. Outcomes and Recommendations

### 12.1 Overall Risk Assessment

**After implementing mitigation measures, what is the residual risk level?**

☐ Low - processing can proceed
☐ Medium - additional measures recommended
☐ High - significant concerns, consult DPO/legal counsel
☐ Very High - processing should not proceed without major changes

### 12.2 Recommendations

**Recommended Actions Before Processing Begins:**

1. [Action item 1]
2. [Action item 2]
3. [Action item 3]

**Recommended Monitoring/Review Activities:**

1. [Monitoring item 1]
2. [Monitoring item 2]
3. [Monitoring item 3]

### 12.3 Consultation with Supervisory Authority

**Is consultation with the supervisory authority required?**

☐ No - residual risk is acceptable
☐ Yes - high residual risk remains despite mitigation (Article 36)

**If yes, when will consultation occur?** _______________

### 12.4 Sign-Off

**DPIA Completed By:**

Name: _______________
Role: _______________
Date: _______________
Signature: _______________

**Reviewed and Approved By:**

Name: _______________
Role: _______________
Date: _______________
Signature: _______________

**Next Review Date:** _______________
*(Recommend annual review or when significant changes occur)*

---

## Appendix A: Completed Example - Job Candidate Assessment

This appendix provides a completed example for reference.

### Example: Job Candidate Assessment Specialist

**Processing Activity**: AI-powered job candidate assessment tool

**Personal Data Processed**:

- Assessment responses (text)
- Communication records (chatbot interactions)
- Contact information (name, email) - collected AFTER assessment with consent
- Assessment scores/results

**Purpose**: To assess candidates' suitability for job roles based on their responses to standardized questions

**Lawful Basis**:

- Consent (candidates explicitly consent before providing contact information)
- Contract (processing necessary to take steps at the request of the data subject prior to entering into a contract)

**Automated Decision-Making**: Yes, with human oversight. Candidates are assessed by AI, but:

- Contact information is only collected AFTER a positive assessment
- A human recruiter makes final hiring decisions
- Candidates can restart the assessment at any time
- Candidates are informed about the AI assessment before beginning

**Key Risks Identified**:

1. Bias/discrimination in assessment algorithms - MEDIUM risk
2. Lack of transparency about assessment criteria - MEDIUM risk
3. Data breach exposing candidate information - LOW risk (after mitigation)

**Key Mitigation Measures**:

- Anonymous assessment until consent obtained
- Clear explanation of assessment process
- Right to contest results
- Human review of all final decisions
- Regular bias testing of algorithms
- Strong technical security measures (encryption, access controls)
- 12-month retention period with secure deletion

**Residual Risk**: LOW - processing can proceed

**Special Considerations**:

- Candidates must be informed about automated decision-making
- Privacy Policy must explain assessment logic
- Contact information collected only after explicit consent
- Right to human intervention clearly communicated

---

## Appendix B: Resources and References

**GDPR Articles Referenced:**

- Article 5: Principles relating to processing
- Article 6: Lawfulness of processing
- Article 9: Special categories of data
- Articles 13-14: Information to be provided
- Articles 15-22: Data subject rights
- Article 22: Automated decision-making
- Article 28: Processor obligations
- Article 30: Records of processing activities
- Articles 33-34: Data breach notification
- Article 35: Data Protection Impact Assessment
- Article 36: Prior consultation with the supervisory authority
- Articles 45-46: International transfers

**Additional Guidance:**

- WP29 Guidelines on DPIAs (WP 248)
- WP29 Guidelines on Automated Decision-Making (WP 251)
- ICO DPIA Guidance
- EDPB Guidelines on processing personal data for scientific research
- Belgian DPA Guidance (https://www.gegevensbeschermingsautoriteit.be)

**Internal Documents:**

- Ask Eve AI Data Protection Agreement
- Ask Eve AI Privacy Policy
- Technical and Organizational Measures (DPA Annex 2)

---

**End of DPIA Template**