Privacy by Design Challenges

These notes were completed as part of work at the University of Oxford and towards GCHQ accreditation. All comments are my own.


Cross-functional & Management Involvement

Unless management has an enterprise-wide understanding of the need for privacy by design, it is difficult to promote its widespread adoption throughout the organisation's day-to-day activity. Management needs to understand the requirement and the benefits PbD can provide to an organisation. Each organisation has finite resources, and those resources need to be optimized; framing PbD as an advantage makes conversations with, and buy-in from, C-suite management easier and strengthens adoption throughout the organisation. Similarities with implementing secure programming practices hold true when privacy by design is implemented throughout an organisation: McGraw [43] has documented a security maturity model, BSIMM (Building Security In Maturity Model), highlighting executive leadership as a primary consideration, because "Grassroots approaches to software security sparked and led solely by developers and their direct managers have a poor track record in the real world". Privacy by design will encounter similar challenges, so initiatives from BSIMM should prove useful when implementing PbD. PbD can also be made a differentiator against competitors: being transparent about PbD today, prior to the new GDPR regulation, could prove beneficial. Finally, PbD requires collective cross-functional involvement; it is not possible to implement within one division alone, e.g. engineering alone cannot achieve PbD, nor can management alone.

Take a software development company as an example: its engineers are building a software-as-a-service (SaaS) solution and may practice PbD while designing it. Such a SaaS solution will most likely require integration hook points with other software products, vendors and other SaaS solutions, or in most cases a collection of these. This leads to a business decision about selecting integration providers. Do these business decision makers implement PbD? Do they even know what PbD is? The point is that PbD requires cross-functional involvement and cannot be practiced by one division alone.

Expanding the above example to further divisions: organisations invest significant resources in marketing and in positioning their brand. These divisions understand the time and resources required to build such branding, which presents an opportunity for buy-in, i.e. a privacy breach would be bad from a branding perspective and could affect the bottom line. Research by [11], however, suggests strategically controlling the news at the time of a privacy breach can have a positive impact on stock prices: [11] shows a 0.27% decrease in stock price around a breach, while strategically publishing a high volume of positive news in the same window can increase the stock price by 0.47%. The data behind [11] covers 2005 to 2014, though, and in some recent breaches C-suite management have been forced to resign, e.g. Target's CEO Gregg Steinhafel and the U.S. Office of Personnel Management's director, Katherine Archuleta.

Mandatory PIAs & Application/System Designers

Spiekermann [5] advises that mandatory PIAs (Privacy Impact Assessments) for system designers would be a positive step towards PbD goals, reasoning that "individuals make irrational privacy decisions and ... underestimate ... privacy risks"; this extends to managers involved in implementing PbD within organisations. Article 25 of [1] mentions an approved certification mechanism in accordance with Article 42 of [1], however the language would not satisfy Spiekermann's mandatory step for system designers. Firstly, Article 25 indicates such approved certification mechanisms 'may' be used, while Article 42 indicates the certification is 'voluntary' and that member states shall 'encourage' establishing such certification processes. Secondly, there is no mention of system designers; the article is focused on data controllers and processors. The important challenge here is that organisations purchase applications without any say in their design, and the regulation focuses on data controllers and processors rather than on the designers of the applications.

Education & Awareness

Lahlou [12] highlights concerning results associated with engineers working on invisible ubiquitous computing projects, e.g. hidden sensors. When Lahlou [13] surveyed engineers about PbD, engineers responded that PbD is:

"An abstract problem", 
"Not a problem yet", 
"not a problem at all...solved by Firewalls and cryptography", 
"Not their problem", 
"Not part of the project deliverables"

Spiekermann [5] likewise notes privacy is not a "primary consideration" for engineers, and that education is a challenge, in particular in distinguishing between security and privacy. Clearly, based on Lahlou's [13] findings, some engineers believe privacy is solved by firewalls and cryptography. Engineers need educating that privacy is about minimizing the collection of personal data while maximizing a user's control, whereas security should be seen as an enabler of privacy: security mechanisms such as SSL/TLS provide an element of privacy when communicating between two end points, but even then meta data can leak, i.e. a network sniff can determine that person A communicated between those two end points (the sketch below illustrates this). Furthermore, Landau [14] highlights "privacy isn’t only about compliance .... its designing them to ensure they protect privacy", and designing such systems requires input from several areas of expertise, including social science, computer science, engineering, the law etc. Universities overall (with a few exceptions) are not doing well at educating these disciplines about privacy, which leads to ad-hoc education; Landau [14] notes that engineers' privacy education happens primarily ad-hoc, when they are building systems and are confronted with privacy-related concerns. With such multidisciplinary requirements, having common terminology to bridge the communication gap was highlighted by NIST [15] as one of the key challenges when engaging in privacy communication across multiple disciplines.
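To make the meta data point concrete, here is a minimal sketch using Python's standard library (example.com is used purely for illustration). The request payload travels encrypted, yet an on-path observer still learns who talked to whom:

    import socket
    import ssl

    context = ssl.create_default_context()
    with socket.create_connection(("example.com", 443)) as raw_sock:
        with context.wrap_socket(raw_sock, server_hostname="example.com") as tls_sock:
            # Encrypted on the wire; an observer cannot read this request...
            tls_sock.sendall(b"GET / HTTP/1.1\r\nHost: example.com\r\n\r\n")
            print(tls_sock.recv(100))

    # ...but the TCP/IP headers (source and destination IPs and ports), the
    # plaintext SNI field in the TLS handshake ("example.com"), and the timing
    # and volume of traffic all remain visible, i.e. the fact that person A
    # communicated with this end point is not hidden by TLS.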

Landau [14] further highlights the differing standards of privacy across the world, i.e. one culture treats personal privacy differently from another; a similar concern was raised at the NIST workshop [15], which noted that system design approaches should be applicable internationally where possible.

Ubiquitous Invisible Computing: a Conflict with PbD

Both Lahlou [12] and NIST [15] highlight challenges PbD encounters with the ever-growing number of ubiquitous invisible computing projects. NIST [15] raises the challenge of obtaining individuals' consent when sensors are built into walls and doors, i.e. the sensors are invisible. Lahlou [12] raises a similar concern relating to individuals' sensory borders: as deployment of ubiquitous computing leads to more data flowing, how does PbD get implemented? According to Lahlou [12], an individual's privacy awareness reduces when they cannot physically see or interact with an invisible sensor, shrinking their sensory borders.

As an example, an individual might allow the sensors associated with a certain store while disapproving of sensory detection in other stores. How would this be practical from an implementation point of view, and how would an individual give consent in such scenarios? Does the individual get a notification each time they pass a store, and how practical would that be in a large shopping mall? Must an individual physically consent to sensor tracking for each store they enter? The sketch below illustrates the bookkeeping such consent would require.
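A hypothetical per-store consent registry, sketched in Python; the class and the store identifiers are invented for illustration, and no real consent platform is implied:

    from dataclasses import dataclass, field

    @dataclass
    class ConsentRegistry:
        # Stores the individual has explicitly opted into; the default is empty,
        # i.e. no sensor may track the individual (privacy by default).
        allowed_stores: set = field(default_factory=set)

        def grant(self, store_id: str) -> None:
            self.allowed_stores.add(store_id)

        def may_track(self, store_id: str) -> bool:
            return store_id in self.allowed_stores

    registry = ConsentRegistry()
    registry.grant("store-42")
    print(registry.may_track("store-42"))  # True: explicitly opted in
    print(registry.may_track("store-99"))  # False: never consented

The open question remains how, in a mall with hundreds of stores, the individual would realistically populate such a registry at all.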

Lahlou [12] summarizes this challenge by highlighting "privacy and ubiquity seem in constant conflict".

Location- or Context-based Privacy by Design

Different definitions of privacy across the world, and differing regulations, make implementing privacy by design location-dependent, which adds to the complexity of system designs; as ENISA [16] highlights, adding PbD can raise the complexity of an application and introduce friction for the user.

How does PbD cater for scenarios where differing privacy regulations exist based on location? Does an individual have to consent differently depending on where the system is used, e.g. do you consent to, or receive, different privacy settings in the US than in the EU? A sketch of what location-dependent defaults might look like follows.
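One way to picture the complexity is a region-keyed table of default settings; the region codes and flags below are illustrative assumptions, not a statement of actual law in either jurisdiction:

    # Hypothetical location-dependent privacy defaults (illustrative only).
    REGIONAL_DEFAULTS = {
        "EU": {"analytics": False, "ad_tracking": False, "data_sharing": False},
        "US": {"analytics": True, "ad_tracking": True, "data_sharing": False},
    }

    def default_settings(region: str) -> dict:
        # Fall back to the most restrictive defaults for unknown regions,
        # the safer choice under a PbD approach.
        return dict(REGIONAL_DEFAULTS.get(region, REGIONAL_DEFAULTS["EU"]))

    print(default_settings("US"))
    print(default_settings("BR"))  # unknown region -> restrictive defaults

Every such regional branch is extra design, test and maintenance effort, which is exactly the complexity and friction ENISA [16] warns about.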

Privacy by Default & Personal Data at the Heart of an Organisation's Economic and Business Models

A founding principle, "privacy by default", appears very aspirational, in particular when viewed in the context of social media and their business models. Cavoukian [2] further explains: "No action is required on the part of the individual to protect their privacy — it is built into the system, by default."
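In engineering terms the principle implies that every data-sharing flag starts disabled and only the user can enable it. A minimal sketch, with field names invented for illustration rather than taken from any real product:

    from dataclasses import dataclass

    @dataclass
    class ProfileVisibility:
        # Privacy by default: a freshly created profile shares nothing.
        public_profile: bool = False
        friend_list: bool = False
        email: bool = False
        birthday: bool = False

    print(ProfileVisibility())  # all flags False until the user opts in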

Taking Facebook as an example, the very essence of such social media services requires a significant element of personal data not to be private, meaning data is either public or available to third-party applications. The 'privacy by default' principle runs against the fundamental goals of these services. For example:

Facebook's mission "is to make the world more open and connected", as seen in Figure 3 below.

Figure 3 - Facebook Mission

Davies [41] agrees privacy by default is akin to an opt-in approach and would appear to be "almost genetically opposed by many companies". Spiekermann [5] also highlights that most PbD advocates do not take into account the economic and business models of organisations where personal data is at the heart of the business, e.g. social networking sites. Applying a strict PbD approach to Facebook's social networking site would prohibit the collection of personal data and adversely affect its bottom line.

Privacy by default can be examined by reviewing one of Facebook's third-party integrations, the Facebook connector. This feature allows third-party applications to leverage Facebook's authentication mechanism, so a user can authenticate with their Facebook username and password, which has positive benefits. However, significant data is exposed to the third-party application by default. As an example, Glassdoor [10] is a service providing job postings and employee reviews of companies, among other employee-sourced features. Glassdoor leverages the Facebook connector mechanism, as shown in Figure 4 below.

Figure 4 - Facebook connector leveraged by Glassdoor

Figure 4 above presents the steps a user walks through when accepting Glassdoor and Facebook's authentication connector. At the initial interaction in step 1, the user is presented with the "Sign In with Facebook" option; at this point the user is not aware that any data will be made available to the third-party application (i.e. Glassdoor), so the privacy by default principle is not evident in this initial interaction. In step 2 the user is presented with some information indicating that certain data will be available to Glassdoor, and selecting "Edit this" presents further detail. Overall six options are listed (the location option sits off screen in step 3, under education history), and by default all six are selected for the user; again, privacy by default is not applied. Only one option is mandatory, i.e. the user's public profile.
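Modelling the dialogue in Figure 4 as an OAuth-style scope request makes the default problem explicit; the scope names and flags below mirror the observed behaviour but are illustrative assumptions, not Facebook's actual API:

    # Hypothetical representation of the six options shown in Figure 4.
    REQUESTED_SCOPES = [
        {"scope": "public_profile",    "mandatory": True,  "default_checked": True},
        {"scope": "friend_list",       "mandatory": False, "default_checked": True},
        {"scope": "email",             "mandatory": False, "default_checked": True},
        {"scope": "birthday",          "mandatory": False, "default_checked": True},
        {"scope": "education_history", "mandatory": False, "default_checked": True},
        {"scope": "location",          "mandatory": False, "default_checked": True},
    ]

    # Privacy by default would require every optional scope to start unchecked:
    pbd_compliant = all(s["default_checked"] == s["mandatory"] for s in REQUESTED_SCOPES)
    print(pbd_compliant)  # False: five optional scopes are pre-selected for the user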

Taking privacy by default at face value, how would this affect Facebook's connector?

Social Sign In. If you access Glassdoor through a social networking site, such as Facebook or Google+ ("Social Networking Site"), you agree that we may access, make available, and store (if applicable) any information, data, text, messages, tags, and/or other materials accessible through Glassdoor that you have provided to and stored in your Social Networking Site account so that it is available on and through Glassdoor via your account and your profile page. Subject to the privacy settings that you have set with the Social Networking Site account you use to access Glassdoor, personally identifiable information that you post to that Social Networking Site may be displayed on Glassdoor. Please note: your relationship with your Social Networking Sites is governed solely by your agreement with those Social Networking Sites and we disclaim any liability for personally identifiable information that may be provided to us by a Social Networking Site in violation of the privacy settings that you have set with that Social Networking Site account.

Reviewing the Glassdoor privacy policy [44], focusing on the "Social Sign In" section above, a number of privacy concerns can be raised. Firstly, Glassdoor 'make available' the data accessed, yet no further information exists as to what 'make available' actually means. Secondly, by default all six options are enabled and the initial transfer makes all six categories of data available to Glassdoor; no provision appears to exist for when a user changes their privacy settings after the transfer, e.g. a user's education details may be stored in Glassdoor with no provision for that information to be purged if the user later restricts it. The implementation appears to follow a 'point in time' principle: I am not aware of any implementation where a user changes their privacy settings and Facebook issues a purge request to providers. The policy wording "subject to the privacy settings that you have set with the Social Networking Site account you use to access Glassdoor" suggests that if a user changes their privacy settings within Facebook these will be applied to Glassdoor; however, they can only take effect on a subsequent data transfer between Facebook and Glassdoor, and as noted no purge requests exist, and even if they existed, would they be adhered to? A further concern is the phrase 'store (if applicable)': to determine whether data is 'applicable', Glassdoor would presumably have to mine, and therefore store, the data, so the statement can be seen as a contradiction. The sketch below illustrates what the missing purge flow might look like.
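This is a hypothetical sketch only: neither Facebook nor Glassdoor is known (to me) to implement such a flow, and the types, identifiers and data below are invented for illustration:

    from dataclasses import dataclass

    @dataclass
    class StoredCategory:
        user_id: str
        category: str  # e.g. "education_history"
        data: dict

    # Data a Glassdoor-like third party would hold after the initial transfer.
    THIRD_PARTY_STORE = [
        StoredCategory("user-1", "education_history", {"school": "..."}),
        StoredCategory("user-1", "email", {"address": "..."}),
    ]

    def handle_purge_request(user_id: str, revoked_category: str) -> None:
        # Called if the platform notified us that a user revoked a category.
        THIRD_PARTY_STORE[:] = [
            rec for rec in THIRD_PARTY_STORE
            if not (rec.user_id == user_id and rec.category == revoked_category)
        ]

    # The user unticks "education history"; the platform would have to call out
    # to every connected application:
    handle_purge_request("user-1", "education_history")
    print([rec.category for rec in THIRD_PARTY_STORE])  # ['email']

Even with such a callback, nothing technically forces the third party to honour it, which is exactly the concern raised above.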

Very briefly, in addition to the points highlighted above, further areas will cause challenges for privacy by design. Legacy applications would have to undergo a re-design, a significant undertaking. Law enforcement requirements conflict with the principles, in particular the UK's Investigatory Powers Bill with respect to users' meta data [45]. Systems could become more complex and place more of a burden on the user, as Schaar [6] notes. And how does an organisation determine the tangible benefits?


[1] General Data Protection Regulation http://eur-lex.europa.eu/legal-content/EN/TXT/PDF/?uri=CONSIL:ST54192016_INIT&from=EN

[2] Ann Cavoukian, Ph.D., Privacy by Design: The 7 Foundational Principles https://www.ipc.on.ca/images/Resources/7foundationalprinciples.pdf

[3] Susan Landau, Educating Engineers: Teaching Privacy in a World of Open Doors http://ieeexplore.ieee.org/stamp/stamp.jsp?tp=&arnumber=6824531

[4] Cavoukian, Privacy by Design, Workshop, Foreword. http://link.springer.com/article/10.1007%2Fs12394-010-0062-y

[5] Sarah Spiekermann, The Challenges of Privacy by design http://ec-wu.at/spiekermann/publications/The%20Challenges%20of%20Privacy%20by%20Design.pdf

[6] Peter Schaar, Privacy by Design. Identity in the Information Society http://www.bfdi.bund.de/SharedDocs/Publikationen/EN/0610EUPrivacyByDesign.pdf

[7] EU Directive 95/46, http://eur-lex.europa.eu/legal-content/EN/TXT/PDF/?uri=CELEX:31995L0046&from=EN

[8] Bill Gates Memo Trustworthy Computing https://news.microsoft.com/2012/01/11/memo-from-bill-gates/#sm.0001p1e5t2157ucy8r5zj4x60e9mw
http://www.wired.com/2002/01/bill-gates-trustworthy-computing/

[9] Microsoft SD3 + C, Secure By design, By Default and By Deployment + Communication https://msdn.microsoft.com/en-us/library/windows/desktop/cc307406.aspx

[10] Glassdoor https://www.glassdoor.com

[11] Sebastien Gay, Strategic News Bundling and Privacy Breach Disclosures http://papers.ssrn.com/sol3/papers.cfm?abstract_id=2643780
https://iapp.org/news/a/do-data-breaches-hurt-the-bottom-line-maybe-not/

[12] Lahlou, Privacy and trust issues with invisible computers http://www.comm.rwth-aachen.de/files/cacm_2005.pdf

[13] Langheinrich, M. and Lahlou, S., A Troubadour Approach to Privacy. Ambient Agoras report 15.3.1, Disappearing Computer Initiative (Nov. 2003).

[14] Susan Landau, Educating Engineers: Teaching Privacy in a World of Open Doors http://ieeexplore.ieee.org/stamp/stamp.jsp?tp=&arnumber=6824531

[15] NIST, Summary of the Privacy Engineering Workshop http://www.nist.gov/cyberframework/upload/privacy-workshop-summary-052114.pdf

[16] ENISA - Privacy and Data Protection by Design https://www.enisa.europa.eu/publications/privacy-and-data-protection-by-design

[17] Narayanan & Felten, No Silver Bullet: De-identification Still Doesn't Work. http://randomwalker.info/publications/no-silver-bullet-de-identification.pdf

[18] Cavoukian & Castro, Big Data and Innovation, Setting the Record Straight: De-identification Does Work
http://www2.itif.org/2014-big-data-deidentification.pdf

[19] Paul Ohm Broken Promises of Privacy, http://www.uclalawreview.org/pdf/57-6-3.pdf

[20] Narayanan & Shmatikov - How to break anonymity of the Netflix prize dataset http://arxiv.org/abs/cs/0610105v1 & http://arxiv.org/pdf/cs/0610105v2.pdf

[21] de Montjoye et al., Unique in the Crowd http://www.nature.com/articles/srep01376

[22] Latanya Sweeney, Uniqueness of Simple Demographics in the U.S. Population http://dataprivacylab.org/projects/identifiability/paper1.pdf

[23] Philippe Golle, Revisiting the Uniqueness of Simple Demographics in the US Population http://crypto.stanford.edu/~pgolle/papers/census.pdf

[24] Bruce Schneier, The Process of Security https://www.schneier.com/essays/archives/2000/04/theprocessof_secur.html

[25] Khaled El Emam et al., “De-identification Methods for Open Health Data" http://www.ncbi.nlm.nih.gov/pubmed/22370452

[26] Narayanan, An Adversarial Analysis of the Reidentifiability of the Heritage Health Prize Dataset http://randomwalker.info/publications/heritage-health-re-identifiability.pdf

[27] Michael Barbaro, Biography. http://topics.nytimes.com/top/reference/timestopics/people/b/michael_barbaro/index.html

[28] Tom Zeller, LinkedIn Profile https://www.linkedin.com/in/tomzellerjr

[29] Computer Applicants, 2012 v 2013 https://www.theguardian.com/news/datablog/2013/jan/30/university-applications-subjects-age-gender-country

[30] IMDB, Conditions of Use http://www.imdb.com/conditions

[31] Nelson, Practical Implications of Sharing Data http://support.sas.com/resources/papers/proceedings15/1884-2015.pdf

[32] Chawla, Dwork et al., Toward Privacy in Public Databases https://www.microsoft.com/en-us/research/wp-content/uploads/2016/02/tcc05-cdmsw.pdf

[33] Basin et al, Improving the security of Cryptographic Protocol Standards http://www.cs.ox.ac.uk/people/cas.cremers/downloads/papers/BCMRW2013-standards-draft.pdf, 2014.

[34] Experian, Data Hack for Credit Agency https://www.theguardian.com/business/2015/oct/01/experian-hack-t-mobile-credit-checks-personal-information

[35] Anthem Health, 80 million individual members data accessed http://www.wsj.com/articles/anthem-hacked-database-included-78-8-million-people-1424807364

[36] Community Health Systems, 4.5 Million health records accessed http://www.reuters.com/article/us-community-health-cybersecurity-idUSKBN0GI16N20140818

[37] Narayanan & Shmatikov, FAQ on the Netflix re-identification https://www.cs.utexas.edu/~shmat/netflix-faq.html

[38] Netflix Prize FAQ http://www.netflixprize.com/faq

[39] Garfinkel, NIST De-identification of personal information http://nvlpubs.nist.gov/nistpubs/ir/2015/NIST.IR.8053.pdf

[40] Google, Anonymization is difficult http://www.cnet.com/news/debunking-googles-log-anonymization-propaganda/

[41] Davies, Why Privacy by Design is the next crucial step for privacy protection. http://i-comp.org/wp-content/uploads/2013/07/privacy-by-design.pdf

[42] Richard Chow, Privacy by Design for the security Practitioner https://www.blackhat.com/docs/asia-14/materials/Chow/WP-Asia-14-Chow-Privacy-By-Design-For-The-Security-Practitioner.pdf

[43] McGraw et al. BSIMM 6 https://www.bsimm.com/

[44] Glassdoor Terms of Use https://www.glassdoor.ie/about/terms.htm (accessed 30th June 2016)

[45] UK, Investigatory Powers Bill. https://www.gov.uk/government/publications/investigatory-powers-bill-overarching-documents