Seminar: Information Privacy Awareness

(1)

Frank Dawson/Nokia, Director information privacy standards

Ecole Polytech Nice – Sophia Antipolis 2015-01-22

Information Privacy Awareness

Seminar

(2)

Information Privacy Awareness

1. WHY – The Privacy Imperative

Privacy Triangle of Trust

Privacy incidents

Regulatory impact for businesses

2. WHAT – EU GDPR and ISO 29100

Terminology

Roles within the Privacy Framework

Privacy Principles

Essence of privacy

Privacy data lifecycle

Personally Identifiable Information and Identifiability

3. HOW – Compliance or Accountability

Elements of an ACCOUNTABLE privacy program

Privacy intent of the organization

Foundation principles of PbD

Privacy program roles & responsibilities

Privacy activities across the product life cycle

4. HOW – Privacy Engineering & Assurance simplified

Applying Privacy Engineering

Privacy Engineering steps

Privacy Assurance steps

Privacy impact assessment

Privacy risk management

Assessing privacy maturity

Privacy related business processes

(3)

Privacy Triangle of Trust

[Diagram: the Privacy Triangle of Trust, with the three forces that influence privacy being Technology / Industry, Advocacy / Consumers, and Policy / Regulatory.]

(4)

No privacy without security

HTC Case

HTC was punished by the US government for negligence in security engineering:

No security training for staff

No security reviews or testing for vulnerabilities

Not following well-known secure coding practices

No process for receiving & addressing vulnerability reports from 3rd parties

Millions of devices with vulnerabilities in so many ways (read the fine print)

Impact: HTC is required to fix all of these and to establish a comprehensive security program …AND undergo independent security assessments every other year for a 20-year period

(5)

Keep your privacy promises to your consumers

• Google / DoubleClick Case

Circumvented Apple privacy safeguards on Safari browsers

Stanford research discovers DoubleClick over-riding cookie control

Millions of consumers affected

Impact: FTC imposes record fine

Prompts EU investigations

Google pays a record $22.5m fine to the US Federal Trade Commission (FTC) after it tracked Apple iPhone, iPad and Mac computer users by circumventing privacy protections on the Safari browser for several months in late 2011 and early 2012.

The fine is the largest ever paid by a single company to the FTC, which had already imposed a 20-year privacy order on Google in March 2011 after concerns about the launch of the Google Buzz social network.

Jon Leibowitz, FTC Chairman, said: “The record-setting penalty in this matter sends a clear message to all companies under an FTC privacy order. No matter how big or small, all companies must abide by FTC orders against them and keep their privacy promises to consumers, or they will end up paying many times what it would have cost…”

(6)

Provide consumer fair notice and choice

The Facebook-owned Instagram photo-sharing social network changed its Terms of Use so it could exploit members' photographs for profit, without compensating the owners

Impact: Daily active users fell from almost 16.3 million to about 7.6 million, and some brand damage was caused (”to do a Zuckerberg” and ”to be Instagrammed” were coined as marketing terms)

(7)

Follow data minimization & purpose specification

The Pentium III included a unique, retrievable identification number, called the PSN (Processor Serial Number), that could be read by software through the CPUID instruction if the feature was not disabled in the BIOS

Impact: Product design decisions had far-reaching impact on consumers' online privacy. Intel's market dominance, coupled with the lack of accurate material about the privacy implications of the PSN, and the inability of individuals to control the use of the PSN, placed consumer privacy at risk.

Regulatory response: the EU Parliament moved to keep the chips out of computers destined for EU consumers and public procurement. A formal inquiry was averted by Intel's decision to remove the PSN feature on Tualatin-based Pentium IIIs, and the feature was not carried forward to the Pentium 4 or Pentium M.

(8)

Regulatory potential for businesses

Authorities are doing joint-enforcement on major companies

Example: Facebook

Canadian, US, Nordic, Irish regulators investigated complaints and found violations

Increasing public policy maker interest in mobile technologies

Example: Positioning technologies

More and more laws globally

Enforcement Actions:

Fines of up to 2% of global revenue

Penalties

Cost of remediation

Forced privacy program

20-year external audit

Deletion of unlawfully collected data

Sales stops, recalls

(9)

Information Privacy Awareness

1. WHY – The Privacy Imperative

Privacy Triangle of Trust

Privacy incidents

Regulatory impact for businesses

2. WHAT – EU GDPR and ISO 29100

Terminology

Roles within the Privacy Framework

Privacy Principles

Essence of privacy

Privacy data lifecycle

Personally Identifiable Information and Identifiability

3. HOW – Compliance or Accountability

Elements of an ACCOUNTABLE privacy program

Privacy intent of the organization

Foundation principles of PbD

Privacy program roles & responsibilities

Privacy activities across the product life cycle

4. HOW – Privacy Engineering & Assurance simplified

Applying Privacy Engineering

Privacy Engineering steps

Privacy Assurance steps

Privacy impact assessment

Privacy risk management

Assessing privacy maturity

Privacy related business processes

(10)

Information privacy

The right of an individual to control the processing of their personal data such that there is:

No hidden, unwanted, uncontrolled, excessive or insecure collection, processing and disclosure of the consumer’s personal data

(11)

EU GDPR and ISO 29100

The EU data protection regulations will soon be based on the proposed General Data Protection Regulation

Potential harmonizing DP effect across EU businesses

ISO 29100 defines a Privacy Framework that reflects many of the proposed components of the GDPR

The PDF of the standard is freely available here

Privacy Framework includes:

Terminology

Roles and interactions

Recognizing PII

Privacy safeguarding requirements

Privacy policy

Privacy controls

Privacy principles

(12)

Terminology (29100 §2)

Identifiability - condition which results in a PII principal being identified, directly or indirectly, on the basis of a given set of PII

Personally Identifiable Information (PII) - any information that (a) can be used to identify the PII principal to whom such information relates, or (b) is or might be directly or indirectly linked to a PII principal

PII Controller - privacy stakeholder (or privacy stakeholders) that determines the purposes and means for processing personally identifiable information (PII), other than natural persons who use data for personal purposes

PII Principal - natural person to whom the personally identifiable information (PII) relates

PII Processor - privacy stakeholder that processes personally identifiable information (PII) on behalf of and in accordance with the instructions of a PII controller

Privacy Breach - situation where PII is processed in violation of one or more relevant privacy safeguarding requirements

Privacy Safeguarding Requirements - set of requirements an organization has to take into account when processing personally identifiable information (PII) with respect to the privacy protection of PII

(13)

Roles within the privacy framework

The DPA (Data Protection Authority, Information Privacy Commissioner, etc.) is the independent legal authority for administering privacy rules within a country

The consumer is the PII Principal

The PII Controller is the entity that determines the purposes and means of processing the consumer’s personal data and is RESPONSIBLE for the processing of the data subject’s PII

The PII Processor performs information processing on behalf of the Data Controller

[Diagram: interactions between the Data Protection Authority (DPA), the PII Principal, the PII Processor and the PII Controller.]

Sometimes a reference is also made to a Third Party, which can be viewed as outside this privacy framework but remains the responsibility of the Data Controller.

(14)

Privacy Principles (ISO 29100 §5)

1. Consent and choice: the PII Principal has choice on, and has opted in to, PII processing
2. Purpose legitimacy and specification: processing complies with laws, giving notice before processing
3. Collection limitation: within laws and necessary for the specified purposes
4. Data minimization: minimize the processing of PII
5. Use, retention and disclosure limitation: also applies to limitations on cross-border transfers
6. Accuracy and quality: measures to assure the validity and correctness of PII processing
7. Openness, transparency and notice: clear, complete and accessible information on PII processing
8. Individual participation and access: PII Principal access to review their PII and correct inaccuracies
9. Accountability: demonstrate care in duty toward the PII Principal for PII stewardship
10. Information security: protecting PII under its authority with appropriate controls
11. Privacy compliance: verifying and demonstrating adherence to laws with internal or 3rd party audits
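
As an illustration only (not part of ISO 29100 itself), these eleven principles could be encoded so that product requirements and assessment findings can be tagged against them; a minimal Python sketch, with hypothetical names, follows.

```python
from enum import Enum

class PrivacyPrinciple(Enum):
    """The eleven ISO 29100 §5 privacy principles (numbering follows the slide)."""
    CONSENT_AND_CHOICE = 1
    PURPOSE_LEGITIMACY_AND_SPECIFICATION = 2
    COLLECTION_LIMITATION = 3
    DATA_MINIMIZATION = 4
    USE_RETENTION_AND_DISCLOSURE_LIMITATION = 5
    ACCURACY_AND_QUALITY = 6
    OPENNESS_TRANSPARENCY_AND_NOTICE = 7
    INDIVIDUAL_PARTICIPATION_AND_ACCESS = 8
    ACCOUNTABILITY = 9
    INFORMATION_SECURITY = 10
    PRIVACY_COMPLIANCE = 11

# Example: tag a (hypothetical) product requirement with the principles it supports.
requirement = {
    "id": "REQ-042",
    "text": "Let users review and correct their personal data in the product",
    "principles": [PrivacyPrinciple.INDIVIDUAL_PARTICIPATION_AND_ACCESS,
                   PrivacyPrinciple.OPENNESS_TRANSPARENCY_AND_NOTICE],
}
```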

(15)

Essence of privacy

Privacy emerges from personally identifiable data

Personal data or information: any information relating to an identified or identifiable natural person, an individual

+

Identifiability (Nymity): the measure of the degree to which personal data can be associated with an individual
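
To make identifiability concrete, here is a minimal illustrative sketch (not from the slides) of pseudonymization: replacing a direct identifier with a salted hash lowers, but does not eliminate, the degree to which the data can be associated with the individual. The function and parameter names are assumptions.

```python
import hashlib

def pseudonymize(identifier: str, salt: str) -> str:
    """Replace a direct identifier with a salted hash (hypothetical helper).

    This lowers identifiability: the record no longer names the person directly.
    It does not eliminate it: the same input always yields the same token, so
    records stay linkable, and whoever holds the salt or auxiliary data may
    still re-identify the PII Principal.
    """
    return hashlib.sha256((salt + identifier).encode("utf-8")).hexdigest()

record = {
    "user": pseudonymize("alice@example.com", salt="per-dataset-secret"),
    "last_login": "2015-01-22",
}
```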

(16)

Privacy data lifecycle

Also called the Consumer Data Lifecycle, it is a fundamental component of the privacy knowledge base

It defines the actions related to personal data within the privacy framework

When analyzing the data flow in your specifications, you should also consider the complete lifecycle for the associated PII

Within the EU, collection itself is considered to be an act of processing!

[Diagram of the privacy data lifecycle stages: Collection, Processing, Storage, Transfer, Deletion.]
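
As a minimal sketch (assumed names, not from the slides), the lifecycle stages and their allowed transitions can be modelled directly, which makes it easy to check that a specification accounts for every stage of the associated PII, deletion included.

```python
from enum import Enum, auto

class LifecycleStage(Enum):
    COLLECTION = auto()
    PROCESSING = auto()
    STORAGE = auto()
    TRANSFER = auto()
    DELETION = auto()

# A simplified view of which stage may follow which; real systems loop
# between processing, storage and transfer before deletion.
ALLOWED_TRANSITIONS = {
    LifecycleStage.COLLECTION: {LifecycleStage.PROCESSING},
    LifecycleStage.PROCESSING: {LifecycleStage.STORAGE, LifecycleStage.TRANSFER, LifecycleStage.DELETION},
    LifecycleStage.STORAGE: {LifecycleStage.PROCESSING, LifecycleStage.TRANSFER, LifecycleStage.DELETION},
    LifecycleStage.TRANSFER: {LifecycleStage.PROCESSING, LifecycleStage.STORAGE, LifecycleStage.DELETION},
    LifecycleStage.DELETION: set(),  # end of life: nothing may follow deletion
}

def is_allowed(current: LifecycleStage, nxt: LifecycleStage) -> bool:
    """True if a data element may move from `current` to `nxt`."""
    return nxt in ALLOWED_TRANSITIONS[current]
```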

(17)

Personal data/information

Relates to information about a natural person

When the data can be associated with an individual, it is referred to as Personally Identifiable Information (PII)

The criteria for linkability of data to an individual are a hot topic within the privacy community

Sensitive PII must be treated specially

Generally, if PII is of a racial, religious, political, sexual-orientation or medical nature, it is characterized as Sensitive; but other categories should also be considered

Also commonly referred to as:

Basic data (e.g., first name, last name, mobile number)

Address data (e.g., postal code, email address)

Restricted categories of data (e.g., racial or ethnic origin, religion, trade union membership, if allowed by applicable law)

Social networking related data (e.g., metadata of pictures uploaded, site activity information)

Location data (e.g., GPS coordinates or mobile network base station ID)

Identifiers (e.g., IMEI, device identifiers, IP address)

System data: information about how individual users are using the system (e.g., log files)

Monetary data: transactions (e.g., credit card number, account)

These are some of the categories of personal data to consider when identifying the PII in your particular project
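
A minimal sketch (illustrative only; the category names mirror the list above, while the class and field names are assumptions) of how a project could keep a classified inventory of its personal data elements:

```python
from dataclasses import dataclass
from enum import Enum

class PiiCategory(Enum):
    BASIC = "basic"                # e.g. first name, last name, mobile number
    ADDRESS = "address"            # e.g. postal code, email address
    RESTRICTED = "restricted"      # e.g. racial or ethnic origin, religion
    SOCIAL_NETWORKING = "social"   # e.g. photo metadata, site activity
    LOCATION = "location"          # e.g. GPS coordinates, base station ID
    IDENTIFIER = "identifier"      # e.g. IMEI, device identifiers, IP address
    SYSTEM = "system"              # e.g. log files
    MONETARY = "monetary"          # e.g. credit card number

@dataclass
class DataElement:
    name: str
    category: PiiCategory
    sensitive: bool = False  # restricted categories are normally treated as sensitive

inventory = [
    DataElement("email_address", PiiCategory.ADDRESS),
    DataElement("religion", PiiCategory.RESTRICTED, sensitive=True),
    DataElement("device_imei", PiiCategory.IDENTIFIER),
]
```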

(18)

Information Privacy Awareness

1. WHY – The Privacy Imperative

Privacy Triangle of Trust

Privacy incidents

Regulatory impact for businesses

2. WHAT – EU GDPR and ISO 29100

Terminology

Roles within the Privacy Framework

Privacy Principles

Essence of privacy

Privacy data lifecycle

Personally Identifiable Information and Identifiability

3. HOW – Compliance or Accountability

Elements of an ACCOUNTABLE privacy program

Privacy intent of the organization

Foundation principles of PbD

Privacy program roles & responsibilities

Privacy activities across the product life cycle

4. HOW – Privacy Engineering & Assurance simplified

Applying Privacy Engineering

Privacy Engineering steps

Privacy Assurance steps

Privacy impact assessment

Privacy risk management

Assessing privacy maturity

Privacy related business processes

(19)

Compliance or Accountability

• A goal of merely being privacy compliant may not be sufficient to avoid regulatory actions against your company

• Data protection authorities (DPA) now expect organizations to demonstrate their good intentions

• Accountability has roots in 1980 OECD privacy guidelines

• Accountability framework builds trust between DPA and organizations for the handling of personal data

• Accountability means being able to show how your company has holistically integrated privacy best practices

• Centre for Information & Policy Leadership (CIPL) has

defined a global DPA endorsed approach to Accountability Data Protection Accountability: The Essential Elements

(20)

Elements of an Accountable privacy program

1. Executive accountability and oversight

Internal senior executive oversight and responsibility for data privacy and data protection

2. Policies and processes to implement them

Binding and enforceable written policies and procedures that reflect applicable laws, regulations and industry standards, including procedures to put those policies into effect

3. Staffing and delegation

Allocation of resources to ensure that the organization's privacy program is appropriately staffed by adequately trained personnel

4. Education and awareness

Existence of up-to-date education and awareness programs to keep employees and on-site contractors aware of data protection obligations

5. Risk assessment and mitigation

Ongoing risk assessment and mitigation planning for new products, services, technologies and business models.

Periodic Program risk assessment to review the totality of the accountability program

6. Event management and complaint handling

Procedures for responding to inquiries, complaints and data protection breaches

7. Internal enforcement

Internal enforcement of the organization's policies and discipline for non-compliance

8. Redress

Provision of remedies for those whose privacy has been put at risk

Not just compliant but accountable

(21)

Privacy intent of the organization

Vision: Organization articulates the high level aspirations towards protection of the personal data of individuals using their products and services

e.g., ”Consumers trust us to meet their privacy expectations”

Principles: Identify which privacy principles apply to the product

e.g., select from those codified in the ISO, OECD, FIPP and EU frameworks

Objectives and activities: Define concrete objectives and related activities to achieve the objectives

e.g., Industry leading privacy controls built into our software by adopting Privacy by Design,

e.g., Mature privacy aware culture through training and

(22)

Foundation principles of PbD

Privacy by Design

− The 7 Foundation Principles:

1. Proactive not Reactive; Preventative not Remedial
2. Privacy as the Default Setting
3. Privacy Embedded into Design
4. Full Functionality — Positive-Sum, not Zero-Sum
5. End-to-End Security — Full Lifecycle Protection
6. Visibility and Transparency — Keep it Open

7. Respect for User Privacy — Keep it User-Centric

− Concept of baking privacy into products from the beginning, rather than retrofitting it into existing products

−Privacy by Re-Design (PbRD) is inevitable for legacy specifications

− Is now being included in regulations globally

(23)

Privacy program roles & responsibilities

Executive privacy owner

• The senior executive with oversight and responsibility for data privacy and data protection in the organization

Chief privacy officer

• The senior manager with responsibility for the implementation and operation of the privacy program in the organization

Privacy officer

• The privacy professional responsible for implementation and operation of the privacy program within an organizational unit

Privacy champ

• The program or product member with sufficient privacy competence to be responsible for transposing privacy requirements into product requirements

(24)

Privacy activities across the product life cycle

(25)

Information Privacy Awareness

1. WHY – The Privacy Imperative

Privacy Triangle of Trust

Privacy incidents

Regulatory impact for businesses

2. WHAT – EU GDPR and ISO 29100

Terminology

Roles within the Privacy Framework

Privacy Principles

Essence of privacy

Privacy data lifecycle

Personally Identifiable Information and Identifiability

3. HOW – Compliance or Accountability

Elements of an ACCOUNTABLE privacy program

Privacy intent of the organization

Foundation principles of PbD

Privacy program roles & responsibilities

Privacy activities across the product life cycle

4. HOW – Privacy Engineering & Assurance simplified

Applying Privacy Engineering

Privacy Engineering steps

Privacy Assurance steps

Privacy impact assessment

Privacy risk management

Assessing privacy maturity

Privacy related business processes

(26)

Privacy Engineering & Assurance simplified

[Diagram: Privacy Engineering & Assurance flow. The Privacy Knowledge Base (principles, policies, requirements, procedures, guidelines, patterns) feeds the Privacy Engineering activities: planning & concepting, privacy requirements identification, threat assessment and mitigation, mapping privacy requirements into product features, selecting guidelines and patterns, then design, implement, test. The Privacy Assurance activities (review against requirements, which can be standalone; release assessment; sign-off) draw on the evidence produced at each engineering stage.]

(27)

Applying Privacy Engineering

[Diagram: each generic step maps to a privacy artifact. Principles map to Privacy Principles; Requirements to Privacy Requirements & Guidelines; Threats to Privacy & Security Threats & Vulnerabilities; Controls to Privacy & Security Safeguards; Residual Risk to Business Acceptable Risk.]

(28)

Privacy Engineering steps

• Define the product context

− Define product in terms of main functions, assets, stakeholders, business model, sales estimates, deployment target countries, release schedule(s), strategic importance, risk summary

• Document the data flows and classify the data

− Inventory of all the personal data & data clusters

− Classification of each data element

− User story/epic based diagram of the flow of data through product components and interactors

• Analyze the threats and risks

− Identification of applicable privacy principles and underlying requirements

− Definition of inherent threats to key privacy & security principles

− Analysis of the attack surface and its minimization

− Identification of root cause or vulnerability

• Mitigation

− Selection of privacy & security safeguarding controls

− Identification of key test cases and test tools to verify control fidelity

− Identification of residual risk
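
As a sketch of the "document the data flows" and "analyze the threats and risks" steps (illustrative only; the record layout and field names are assumptions), each hop of personal data can be captured together with its purpose, threats, controls and residual risk:

```python
from dataclasses import dataclass, field
from typing import List

@dataclass
class DataFlow:
    """One hop of personal data between product components or interactors."""
    data_element: str                                   # e.g. "location"
    source: str                                         # e.g. "mobile client"
    destination: str                                    # e.g. "recommendation service"
    purpose: str                                        # why the data moves at all
    threats: List[str] = field(default_factory=list)    # identified threats
    controls: List[str] = field(default_factory=list)   # selected safeguards
    residual_risk: str = "not assessed"

flow = DataFlow(
    data_element="location",
    source="mobile client",
    destination="recommendation service",
    purpose="nearby content suggestions",
    threats=["over-collection", "unintended disclosure to 3rd parties"],
    controls=["coarse-grained coordinates only", "retention limited to 30 days"],
    residual_risk="low",
)
```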

(29)

Privacy Assurance steps

The purpose of assurance is to verify that the Privacy Engineering activities have been implemented as agreed and are operational, and that any required staffing is in place

Kick off the assessment process with the Privacy Officer early to understand what will be needed for final sign-off

The privacy & security assessment is based on a thorough review of the Product Team's evidence that the Privacy Engineering activities have been implemented and are operational

The final sign-off recommendation is made by the Privacy Officer, with approval by Product Management & the Chief Privacy Officer

An escalation process may be needed to address disagreements over findings between the Privacy Officer and Product Management

Non-compliance with privacy regulations SHOULD NOT be approved

A final assessment of all product or service that

(30)

Privacy impact assessment

EU GDPR Article 33 promulgates PIAs for public/private organizations

Produces evidence of implementation of Privacy by Design

Conducted by staff when personal data is collected, used or disclosed in a product or service

Re-conducted if material changes made to product or service

ISO 29134 (WD) will standardize methodology

Identify: describe the project, including the aims, whether any personal information will be handled, and the inherent privacy principles

Analyze: identify the personal information flows, classify the data, and identify relevant regulations, privacy requirements and privacy impact

Verify: validate that only essential data is collected and processed for the legitimate purposes required by the product or service

Simplify: change systems and processes to only collect/store/process essential data for the minimum period, with a data deletion plan

Secure: use industry best practices for safeguarding personal data throughout the life cycle, providing consumers control over their data

Remediate: identify remaining risk, the level of harm and a mitigation plan to eliminate or reduce the risk to an acceptable level

record findings, gain sponsor commitment to implement any
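
A minimal sketch (assumed structure; the step names are taken from this slide) of tracking PIA evidence per step, so missing items are visible before sign-off:

```python
# Step names mirror the slide; the evidence-tracking structure is an assumption.
PIA_STEPS = [
    ("Identify",  "Describe the project, its aims and the personal data handled"),
    ("Analyze",   "Map personal data flows, classify data, list regulations and requirements"),
    ("Verify",    "Confirm only essential data is collected for legitimate purposes"),
    ("Simplify",  "Collect/store/process only essential data, with a deletion plan"),
    ("Secure",    "Apply best-practice safeguards and give consumers control over their data"),
    ("Remediate", "Record remaining risk, level of harm and the mitigation plan"),
]

def pia_status(evidence: dict) -> None:
    """Print which PIA steps still lack evidence (evidence maps step name to a link)."""
    for step, description in PIA_STEPS:
        status = "OK" if step in evidence else "MISSING"
        print(f"[{status}] {step}: {description}")

pia_status({"Identify": "wiki/pia/identify", "Analyze": "wiki/pia/analyze"})
```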

(31)

Privacy risk assessment

Produces evidence of minimization of possible privacy risk

Conducted by business team with input from PIA evidence

Re-conducted if material changes made to product or service

ISO 31000 defines an applicable risk management framework

When applicable, should include assessment of 3rd party risk

Product management will need to accept any residual risk

Context: establish the external and internal context for risk, the risk management process and the risk assessment criteria to be used

Identify: identify sources of risk, areas of impact, events and causes, and potential consequences

Analyze: consider causes and sources of risk, and positive & negative consequences, both tangible and intangible

Evaluate: make decisions based on the risk analysis about which risks need treatment and the priority for treatment implementation

Treat: select remediation based on avoiding, taking on, removing, changing the potential for, changing the harm of, or sharing the risk

Monitor & Review: assure controls are effective, learn and improve, detect context changes, identify new risks, measure KPIs
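
One common way to support the Evaluate and Treat steps is a simple likelihood-times-impact score; ISO 31000 does not prescribe a particular scale, so the 5x5 scale, thresholds and example risks below are assumptions for illustration.

```python
def risk_score(likelihood: int, impact: int) -> int:
    """Score a risk on an assumed 1-5 likelihood and 1-5 impact scale."""
    assert 1 <= likelihood <= 5 and 1 <= impact <= 5
    return likelihood * impact

def risk_level(score: int) -> str:
    if score >= 15:
        return "high: must be treated"
    if score >= 8:
        return "medium: treat or formally accept"
    return "low: monitor"

# Hypothetical risks identified during a privacy risk assessment.
risks = {
    "re-identification of pseudonymized analytics data": (3, 4),
    "vendor breach of transferred PII": (2, 5),
}
for name, (likelihood, impact) in sorted(risks.items(), key=lambda kv: -risk_score(*kv[1])):
    score = risk_score(likelihood, impact)
    print(f"{name}: score={score} ({risk_level(score)})")
```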

(32)

Privacy capability assessment

Provides a method for advancement of your privacy program

Conducted to measure baseline and incremental changes

Part of a commitment to accountability, constant improvement

ISO 29190 (new IS) will standardize a methodology

Plan: agree on the privacy capability assessment model (e.g., context or business process based) and the assessment scale to be used

Assess: rate the current capability against the target capability

Review: identify sub-optimal capabilities to be improved and the overall improvement plan

Report: communicate to management the assessment activity, results, improvement actions and the next scheduled assessment

Improve: implement the improvement plan
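
A minimal sketch of the Assess/Review steps (the 1-5 scale, target level and process names are assumptions; ISO 29190 leaves the model and scale to be agreed during the Plan step):

```python
TARGET_LEVEL = 4  # assumed target on an assumed 1-5 capability scale

current_capability = {
    "risk management": 3,
    "incident response": 2,
    "training & awareness": 4,
    "assessment (PIA) process": 3,
}

# Review: list the sub-optimal capabilities, largest gap first.
gaps = {p: TARGET_LEVEL - level for p, level in current_capability.items() if level < TARGET_LEVEL}
for process, gap in sorted(gaps.items(), key=lambda kv: -kv[1]):
    print(f"improve '{process}' by {gap} level(s) to reach target {TARGET_LEVEL}")
```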

(33)

Privacy related business processes

• Quality management process

• Risk management process

• Assessment process

• Security engineering process

• Business continuity process

• Customer care process

• Incident response management process

• External communications process

• Authority request/lawful intercept process

(34)

5. References

(35)

References

OECD Privacy Principles

EU Data Protection Directive 95/46/EC

EU Proposed General Data Protection Regulation

ISO 29100 from ISO (PDF version is freely available)

CIPL Implementing Accountability

CIPL Accountability Self-Assessment Tool

frank.dawson@nokia.com

(36)

What did you learn?

(37)

Q1: The Triangle of Trust is a model to show the three primary forces influencing Privacy. Which one is not one of the three primary forces?

a. Technology / Industry

b. Advocacy / Consumers

c. Legal / Intelligence

d. Policy / Regulatory

(38)

Q2: Sensitive Personal Data or PII needs to be protected with additional privacy safeguards.

Which personal data is not in this category of PII?

a. Sexual orientation

b. Email address

c. Financial account credentials

d. Professional memberships

e. None of these

(39)

Q3: Which factoid about privacy framework roles and responsibilities is not correct?

a. PII Principal is the owner of the personal data being processed

b. PII Processor is not ultimately responsible for privacy breaches in their processing of personal data on behalf of the PII Controller

c. PII Controller is the privacy stakeholder that determines the purposes and means for processing personally identifiable information (PII)

d. Data Protection Authority is the independent legal authority for administering privacy rules within a country

(40)

Q4: What is the essence of privacy?

a. Personal data

b. Privacy data lifecycle

c. Identifiability of personal data

d. a and c

e. Nymity

f. a and c and e

(41)

Q5: Which statement about Privacy Engineering and Privacy Assurance is not correct?

a. Privacy Engineering involves implementation of Privacy by Design

b. Privacy Assurance involves the acceptance of any residual product privacy risk

c. Privacy Engineering includes activities at all stages of the product life cycle and should begin as early as feasible

d. Privacy Assurance should include a final verification that the findings from the Privacy Engineering have been implemented and are operational in the product

e. Privacy Engineering is an emerging discipline

(42)

Q6: Which Privacy Engineering evidence is not always needed to demonstrate Privacy by Design?

a. Product description documented

b. 3rd party risk assessment documented

c. Personal data flows documented

d. Classified personal data inventory documented

e. List of applicable privacy principles, requirements, threats and mitigation documented

(43)

Quiz Answers

1. c 2. b 3. a 4. f 5. b 6. b

(44)

Policy Question for Discussion

(45)

Policy topic for discussion

Question: Which principle “trumps” when there are two seemingly equal, competing privacy principles?

Background: the PbD Foundation Principles include ”Privacy as the Default Setting” as well as ”Respect for User Privacy — Keep it User-Centric”. Within the context of the standardization of a Do Not Track mechanism for web browsers in the W3C Tracking Protection Working Group, the question of the correct default for the consumer Do-Not-Track preference was raised by Microsoft IE setting its default to DNT:YES. But this removes consumer participation, as no informed, explicit, active consent is given at first use (FUE).

Debate (pro/con): What should the W3C TPWG set the default value of the DNT preference to? A) DNT:YES, B) DNT:NO, C) Not defaultable.
