These notes have been completed as part of work at the University of Oxford and towards GCHQ accreditation. All comments are my own.
Cross-Functional & Management Involvement
Unless management have an enterprise-wide understanding of the need for privacy by design, it is difficult to promote its widespread adoption throughout an organisation's day-to-day activity. Management needs to understand the requirement and the benefits PbD can provide an organisation. Each organisation has finite resources, and these resources need to be optimized. Framing PbD as an advantage makes conversations with, and buy-in from, C-suite management easier and strengthens adoption throughout the organisation. Similarities with implementing secure programming practices hold true when privacy by design is implemented throughout an organisation: McGraw  has documented a security maturity model, BSIMM (Building Security In Maturity Model), highlighting that executive leadership is a primary consideration, as "Grassroots approaches to software security sparked and led solely by developers and their direct managers have a poor track record in the real world". Privacy by design will encounter similar challenges, so initiatives from BSIMM should prove useful with respect to implementing PbD. PbD can also be made a differentiator against competitors; being transparent about PbD today, prior to the new GDPR regulation, could prove beneficial. Finally, PbD requires collective cross-functional involvement, i.e. PbD cannot be implemented within one division alone: engineering alone cannot achieve PbD, nor can management alone.
Take a software development company as an example: software engineers are building a software-as-a-service (SaaS) solution. These engineers may practice PbD while designing the solution. Such a SaaS solution will most likely require integration hook points with other software products, vendors and other SaaS solutions, or in most cases a collection of the same. This leads to a business decision about selecting these integration providers. Do those business decision makers implement PbD? Do they even know what PbD is? The point is that PbD requires cross-functional involvement and cannot be practiced by one division alone.
Expanding the above example to involve other divisions: organisations invest significant resources in marketing and brand positioning. These divisions will understand the time and resources required to achieve such branding; this could be seen as an opportunity for buy-in, i.e. a privacy breach would be bad from a branding perspective and could affect the bottom line. Though research by Gray  suggests strategically controlling the news at the time of a privacy breach can have a positive impact on stock prices: Gray shows a 0.27% decrease in stock price, while strategically controlling the news media by publishing a high volume of positive news around the time of a privacy breach can increase a stock price by 0.47%. However, Gray's research is derived from data between 2005 and 2014, and in some recent breaches C-suite management have been forced to resign, e.g. Target's CEO Gregg Steinhafel and the U.S. Office of Personnel Management's director, Katherine Archuleta.
Mandatory PIAs & Application / System Designers
Spiekermann  advises that mandatory PIAs (Privacy Impact Assessments) for system designers would be a positive step towards PbD goals, reasoning that "individuals make irrational privacy decisions and ... underestimate ... privacy risks"; this extends to managers involved in implementing PbD within organisations. Article 25 of the GDPR  mentions an approved certification mechanism in accordance with Article 42 of the same regulation; however, the language would not satisfy Spiekermann's mandatory step for system designers. Firstly, Article 25 indicates such approved certification mechanisms 'may' be used, while Article 42 indicates the certification is 'voluntary' and that member states 'encourage' establishing such certification processes. Secondly, there is no mention of system designers; the articles focus on data controllers and processors. The important challenge here is that organisations purchase applications without any say in the design of those applications, while the regulation focuses on data controllers and processors rather than on the designers of the applications.
Education & Awareness
Lahlou  highlights concerning results associated with engineers working on invisible ubiquitous computing projects, e.g. hidden sensors. When Lahlou  surveyed engineers about PbD, engineers responded that PbD is
"An abstract problem", "Not a problem yet", "not a problem at all...solved by Firewalls and cryptography", "Not their problem", "Not part of the project deliverables"
Spiekermann  also notes that privacy is not a "primary consideration" for engineers, and that education is a challenge, in particular in distinguishing the difference between security and privacy. Clearly, based on Lahlou's  findings, some engineers believe privacy is solved by firewalls and cryptography. Engineers should be educated that privacy is about minimizing the collection of personal data while maximizing a user's control, whereas security should be seen as an enabler of privacy: security mechanisms such as SSL/TLS provide an element of privacy when communicating between two end points. It should be further noted that in this example SSL/TLS between end points can still leak metadata, i.e. a network sniffer can determine that person A communicated between two end points. Furthermore, Landau  highlights that "privacy isn't only about compliance .... its designing them to ensure they protect privacy", and designing such systems requires input from several areas of expertise, including social science, computer science, engineering, the law, etc. As universities overall (with a few exceptions) are not doing well at educating these multiple disciplines about privacy, education is left ad hoc; Landau  highlights that engineers' education about privacy is primarily ad hoc, picked up while building systems and being challenged with privacy-related concerns. With such multidisciplinary requirements, having common terminology to help bridge the communication gap was highlighted by NIST  as one of the key challenges when engaging in privacy communication across multiple disciplines.
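The metadata point above can be illustrated with a minimal sketch (the field names, addresses and observer function are illustrative assumptions, not a real capture API): even with TLS, a passive observer on the network still learns who talked to whom, roughly how much data flowed, and — via the cleartext SNI field of the ClientHello — which site was contacted; only the payload itself is hidden.

```python
from dataclasses import dataclass

@dataclass
class TlsRecord:
    src_ip: str        # visible to any on-path observer
    dst_ip: str        # visible
    sni_hostname: str  # Server Name Indication, sent in cleartext in the ClientHello
    payload_len: int   # ciphertext length roughly reveals message size
    ciphertext: bytes  # the only part TLS actually hides

def observer_view(record: TlsRecord) -> dict:
    """What a passive network sniffer learns despite encryption."""
    return {
        "who_talked_to_whom": (record.src_ip, record.dst_ip),
        "which_site": record.sni_hostname,
        "how_much": record.payload_len,
    }

# Hypothetical captured record: the encrypted bytes stay opaque,
# but the surrounding metadata does not.
leak = observer_view(TlsRecord("10.0.0.5", "93.184.216.34",
                               "example.com", 1460, b"\x17\x03\x03"))
print(leak["which_site"])  # prints "example.com"
```

This is why security mechanisms alone do not equal privacy: the channel is confidential, yet the fact and shape of the communication remain observable.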
Landau  further highlights the differing standards of privacy across the world, i.e. one culture treats personal privacy differently to another; a similar concern was raised by NIST , whose workshop raised the concern that system design approaches should be applicable internationally where possible.
Ubiquitous Invisible Computing: a conflict with PbD
Both Lahlou  and NIST  highlight challenges PbD encounters with ever-growing ubiquitous invisible computing projects. NIST  raises the challenges associated with obtaining individuals' consent when sensors are built into walls and doors, i.e. the sensors are invisible, while Lahlou  raises a similar concern related to individuals' sensory borders. As Lahlou  indicates, deployment of ubiquitous computing leads to more data flowing; how does PbD get implemented? According to Lahlou , individuals' physical interactions with invisible devices reduce their sensory borders, i.e. an individual's privacy awareness reduces when one does not physically see or interact with an invisible sensor.
An example: an individual might allow sensors associated with a certain store while disapproving of sensor detection in other stores. How will this be practical from an implementation point of view, and how would an individual give consent in such scenarios? Does an individual get a notification each time they pass a store, and how practical would that be in a large shopping mall? Does an individual physically have to consent to sensor tracking for each store they enter, and how practically could this be implemented?
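One way to picture the consent problem is a per-store opt-in registry, sketched below (the class name and store identifiers are hypothetical, not a real system): privacy by default means no sensor may track an individual until consent has been explicitly granted, which quickly becomes unwieldy when an individual passes dozens of stores in a mall.

```python
class ConsentRegistry:
    """Hypothetical per-individual registry of stores allowed to use sensors."""

    def __init__(self):
        self._allowed = set()  # store ids the individual has opted in to

    def grant(self, store_id: str) -> None:
        self._allowed.add(store_id)

    def revoke(self, store_id: str) -> None:
        self._allowed.discard(store_id)

    def may_track(self, store_id: str) -> bool:
        # Privacy by default: no tracking unless explicitly granted.
        return store_id in self._allowed

prefs = ConsentRegistry()
prefs.grant("coffee-shop-42")
print(prefs.may_track("coffee-shop-42"))  # True: explicit opt-in
print(prefs.may_track("electronics-7"))   # False: the default is no consent
```

The sketch makes the principle trivial to state in code, but the open question in the text remains: how each invisible sensor would reliably identify the individual, consult such a registry, and obtain the grant in the first place.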
Lahlou  summarizes this challenge by highlighting "privacy and ubiquity seem in constant conflict".
Location or context based privacy by design
Different definitions of privacy across the world and differing regulations make implementing privacy by design location dependent, which adds to the complexity of system designs; adding PbD can raise the complexity of an application and introduce friction for the user, as highlighted by ENISA .
How does PbD cater for scenarios where differing privacy regulations exist based on location? Does an individual have to consent differently depending on the location of system usage, e.g. does a user in the US consent to, or receive, different privacy settings than a user in the EU?
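A minimal sketch of how location-dependent defaults might be modelled (the region codes and setting names are illustrative assumptions, not any regulator's actual requirements): each jurisdiction maps to a default profile, and unknown regions fall back to the most restrictive one, consistent with opt-in consent.

```python
# Hypothetical per-jurisdiction default privacy settings.
# Everything off for the EU profile reflects an opt-in reading of consent;
# the US profile is a stand-in for a more permissive opt-out regime.
DEFAULTS = {
    "EU": {"analytics": False, "ad_tracking": False, "data_sharing": False},
    "US": {"analytics": True,  "ad_tracking": True,  "data_sharing": False},
}

def default_settings(region: str) -> dict:
    # Fall back to the most restrictive defaults when the region is unknown.
    return dict(DEFAULTS.get(region, DEFAULTS["EU"]))

eu = default_settings("EU")
us = default_settings("US")
unknown = default_settings("JP")  # falls back to the restrictive EU profile
```

Even this tiny table hints at the complexity the text describes: every new jurisdiction multiplies the configurations that must be designed, tested and explained to users.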
Privacy by Default & Personal Data at the Heart of an Organisation's Economic and Business Models
The founding principle "privacy by default" appears very aspirational, in particular when viewed in the context of social media and their business models. Cavoukian  further explains: "No action is required on the part of the individual to protect their privacy — it is built into the system, by default."
Taking Facebook as an example, the very essence of these social media services requires a significant element of personal data not to be private; this means data is either public or available to third-party applications. The 'privacy by default' principle goes against the fundamental goals of these social media services. For example:
Facebook's mission "is to make the world more open and connected", as seen in figure 3.
Figure 3 - Facebook Mission
Davies  agrees privacy by default is akin to an opt-in approach and would appear to be "almost genetically opposed by many companies". Spiekermann  also highlights that most PbD advocates do not take into account the economic and business models of organisations where personal data is at the heart of the business model, e.g. social networking sites. Applying a strict PbD approach to Facebook's social networking site would prohibit collection of personal data, adversely affecting the bottom line.
Privacy by default can be examined by reviewing one of Facebook's third-party integrations, the Facebook connector. This feature allows third-party applications to leverage Facebook's authentication mechanism, letting a user authenticate with their Facebook username and password, which has positive benefits. However, significant data is exposed by default to the third-party application. Below is an example: Glassdoor  is a service providing job postings and employee reviews of companies, among other employee-sourced features. Glassdoor leverages the Facebook connector mechanism, as shown in figure 4 below.
Figure 4 - Facebook connector leveraged by Glassdoor
Figure 4 above presents the scenario a user steps through when accepting Glassdoor and Facebook's authentication connector. From the initial interaction at step 1, the user is presented with the "Sign In with Facebook" option; at this point the user is not aware that any data will be made available to the third-party application (i.e. Glassdoor), so the privacy by default principle is not evident in this initial interaction. In step 2 the user is presented with some information indicating that certain information will be available to Glassdoor; selecting "Edit this" presents further detail. Overall, six options are listed (the location option sits off screen in step 3, under education history); by default all six options are selected for the user, so again privacy by default is not applied. Only one option is mandatory, i.e. the user's public profile.
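The gap between the observed behaviour and the principle can be sketched as follows (the scope names are hypothetical stand-ins for the six options described above, not Facebook's actual permission identifiers): privacy by default would leave every optional scope unselected until the user opts in, whereas the flow described pre-selects all six.

```python
# Hypothetical permission scopes modelling the connector dialog.
SCOPES = ["public_profile", "friend_list", "email", "birthday",
          "education_history", "location"]
MANDATORY = {"public_profile"}  # the only option the text says is required

def observed_defaults() -> dict:
    # What the flow above does: every option pre-selected for the user.
    return {scope: True for scope in SCOPES}

def privacy_by_default() -> dict:
    # What the principle would require: only the mandatory scope enabled,
    # everything else off until the user explicitly opts in.
    return {scope: (scope in MANDATORY) for scope in SCOPES}

print(sum(observed_defaults().values()))   # 6: all scopes shared by default
print(sum(privacy_by_default().values()))  # 1: only the mandatory scope
```

The two functions differ by a single line of policy, which underlines that the obstacle is the business model, not the engineering.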
Taking privacy by default at face value, how would it affect Facebook's connector? Glassdoor's terms of service state:
Social Sign In. If you access Glassdoor through a social networking site, such as Facebook or Google+ ("Social Networking Site"), you agree that we may access, make available, and store (if applicable) any information, data, text, messages, tags, and/or other materials accessible through Glassdoor that you have provided to and stored in your Social Networking Site account so that it is available on and through Glassdoor via your account and your profile page. Subject to the privacy settings that you have set with the Social Networking Site account you use to access Glassdoor, personally identifiable information that you post to that Social Networking Site may be displayed on Glassdoor. Please note: your relationship with your Social Networking Sites is governed solely by your agreement with those Social Networking Sites and we disclaim any liability for personally identifiable information that may be provided to us by a Social Networking Site in violation of the privacy settings that you have set with that Social Networking Site account.
Very briefly, in addition to the points highlighted above, further areas will cause challenges for privacy by design: legacy applications would have to be re-designed, which would be a significant undertaking; law enforcement requirements conflict with PbD principles, in particular the UK's Investigatory Powers Bill with respect to users' metadata . Systems could become more complex and place more of a burden on the user, as noted by Schaar . How does an organisation determine the tangible benefits?
 General Data Protection Regulation
 Ann Cavoukian Ph.D., Privacy by Design: The 7 Foundational Principles
 Susan Landau, Educating Engineers: Teaching Privacy in a World of Open Doors
 Cavoukian, Privacy by Design, Workshop, Foreword.
 Sarah Spiekermann, The Challenges of Privacy by Design, http://ec-wu.at/spiekermann/publications/The Challenges of Privacy by Design.pdf
 Peter Schaar, Privacy by design. Identity in the Information
 EU Directive 95/46,
 Bill Gates Memo Trustworthy Computing
 Microsoft SD3 + C, Secure By design, By Default and By Deployment + Communication
 Sebastian Gray, Strategic News Bundling and Privacy Breach Disclosures
 Lahlou, Privacy and trust issues with invisible computers
 Langheinrich, M. and Lahlou, S., A Troubadour Approach to Privacy. Ambient Agoras report 15.3.1, Disappearing Computer Initiative (Nov. 2003).
 Landau, Educating Engineers & Teaching Privacy
 NIST, Summary of the Privacy Engineering Workshop
 ENISA - Privacy and Data Protection by Design
 Narayanan & Felten, No Silver Bullet: De-identification Still Doesn't Work.
 Cavoukian & Castro, Big Data and Innovation, Setting the Record Straight: De-identification Does Work
 Paul Ohm Broken Promises of Privacy,
 Montjoye et al, Unique in the Crowd
 Latanya Sweeney, Uniqueness of Simple Demographics in the U.S. Population
 Phillippe Golle, Revisiting the Uniqueness of Simple Demographics in the US Population
 Bruce Schneier, The Process of Security
 Khaled El Emam et al., “De-identification Methods for Open Health Data"
 Narayanan, An Adversarial Analysis of the Reidentifiability of the Heritage Health Prize Dataset
 Michael Barbaro, Biography.
 Tom Zeller, LinkedIn Profile
 Computer Applicants, 2012 v 2013
 IMDB, Conditions of Use
 Nelson, Practical Implications of Sharing Data
 Chawla1, Dwork et al. Toward Privacy in Public Databases
 Basin et al, Improving the security of Cryptographic Protocol Standards http://www.cs.ox.ac.uk/people/cas.cremers/downloads/papers/BCMRW2013-standards-draft.pdf, 2014.
 Experian, Data Hack for Credit Agency
 Anthem Health, 80 million individual members data accessed
 Community Health Systems, 4.5 Million health records accessed
 Narayanan & Shmatikov, FAQ on the Netflix re-identification
 Netflix Prize FAQ
 Garfinkel, NIST De-identification of personal information
 Google, Anonymization is difficult
 Davies, Why Privacy by Design is the next crucial step for privacy protection.
 Richard Chow, Privacy by Design for the security Practitioner
 McGraw et al., BSIMM 6 (accessed 30th June 2016)
 UK, Investigatory Powers Bill.