Chapter 2: Foundational Concepts and Frameworks
"From within the secret court of men's hearts, Tom was a dead man the minute Mayella Ewell opened her mouth and screamed."
Harper Lee, To Kill a Mockingbird, 1960

We cannot escape the secret courts within the hearts of men. Opinions, impressions, judgments, and prejudices are formed, often instantly and subconsciously, based upon available data, context, and experience. The availability of greater and greater quantities of multimedia-enriched data makes more acute the imperative to manage and respect the power of information to impact individual lives as well as those of entire races and nation-states.

"There's a terror in knowing what the world is about."
David Bowie

This chapter addresses key definitions and concepts of privacy that anyone involved in engineering writ large (i.e., architecting, designing, developing, managing, and implementing components, products, services, processes, systems, or applications that process personal information) must understand to be successful as we enter a new stage in the information age: that of intelligence and data science. We also will define what privacy engineering is, what a privacy engineer does, and the goals of privacy engineering. In subsequent chapters, we will discuss how to apply these definitions and concepts to a privacy engineer's work, broadly defined as designing, creating, inventing, imagining, and building things that process personal information.

What Is Privacy?

A great majority of the complexity this book addresses arises, in fact, from the imperfections and difficulty of defining this multifaceted thing called privacy. There are different forms of privacy. Data privacy (also known as data protection in Europe), which is the kind of privacy this book addresses, can be discussed at great length, but finding one global, consistent definition can be elusive. This chapter will propose an operational definition of data privacy as it is most often conceived by organizations that consume and process data about people and the governments and institutions that wish to regulate its many aspects and uses. This is not a book about public policy, philosophy, religion, or advocacy other than for privacy engineering.

Data privacy is one form of privacy that is derived from substantive privacy. Substantive privacy describes the right and ability of an individual to define and live his or her life in a self-determined fashion. Other forms of privacy attempt to describe and define this basic human fact. Data privacy is a derivative of the substantive right to privacy in that it is about data that has been created about an individual (1) by him- or herself, (2) by others through observations and analysis, or (3) by the consumption or processing (i.e., use) of that data about an individual by others.

Some of the other forms of privacy, or ways in which substantive privacy may be broken down, are behavioral privacy, decisional privacy, and physical privacy. They all interrelate and overlap in various ways. For simplicity's sake, throughout this book, whenever we refer to privacy or data privacy, we intend them as one and the same (i.e., data privacy); if another form of privacy is intended, it will be identified.

The Different Forms of Privacy

There are different forms of privacy, such as behavioral privacy, decisional privacy, and physical privacy. Decisional privacy is about being able to make decisions and choices without third-party inspection or intrusion.
This may be thought of as self-determination within one's own private life. Not having to explain or justify one's behavior or share personal opinions or thoughts is an example of decisional privacy. Behavioral privacy is about being able to act as one wants, free from unwanted third-party intrusion or observation (assuming no harm to others is incurred or laws broken). In this realm, people may dance in their living rooms or whistle in their cars or don various forms of dress or undress at their own discretion. Physical privacy is privacy about one's body or person. Modesty is another word for it. Some people are more sensitive to physical privacy than others.

Two things about the different forms of privacy should be noted. First, in many instances the examples overlap. Rarely is an example of one kind of privacy exclusive of another. Second, data privacy runs through all types of privacy, because as soon as something about you or someone else is observed or articulated (even just by you), you cantilever into the data privacy space. Data privacy is literally the language of substantive privacy forms whenever an action or behavior or even a stillness occurs. As such, as soon as any third party becomes involved in data that describe another person, data privacy becomes a fiduciary activity where access, sharing, or exchange of personal information is the corpus of the fiduciary trust.

The Substantive Nature of Privacy

By Stewart Room, Partner, Field Fisher Waterhouse LLP

The right to privacy has been described in many different ways. Lawyers often talk about the Fourth Amendment prohibition against unreasonable searches and seizures as protecting private spaces. European human rights law says that the right to privacy protects our home life, family life, and correspondence from unreasonable interference by the state. Legislation that is commonly grouped together as privacy law has focused on the topics of health, financial services, children, electronic communications, and data security breaches. Famous court cases have protected the image rights of celebrities, the chassis of cars,[1] and office computers,[2] all in the name of privacy. Statutory regulators use consumer laws to prevent the mis-selling of home closed-circuit television systems and smartphones as being privacy enhancing.[3]

Two golden threads run through this diverse list of interests, creating a common and uniting bond among them: the concepts of substantive and informational privacy. Within a civilized society, it is the desire to protect substantive and informational privacy that unites the celebrity, the child, the consumer, the smartphone, the camera, the home, the workplace, and the car. All theories of privacy and all privacy laws pay service to one or both of these concepts. The idea at the heart of the concept of substantive privacy is that people should be free to make decisions about how they lead their lives, free from interference by others. The idea at the heart of the concept of informational privacy is that people should be able to control the use of information about themselves. Within a state of privacy, these concepts reinforce and support each other; substantive privacy needs and relies upon informational privacy, and vice versa.
In this day and age it is readily appreciated that the threats to a person's privacy do not flow only from the state (the identity-theft bogeyman is as much an icon for privacy interference as Big Brother), yet the example of the malevolent state provides the easiest way to demonstrate the relationship between the concepts of substantive and informational privacy and their interdependencies. And among the many sickening examples of state-level evil that have plagued mankind and shamed our history, Hitler's Nazi regime in Germany stands among the very worst.

The Jew in Hitler's Germany was required to wear a yellow star. This badge said publicly "I am a Jew." The information it conveyed restricted the Jew to the ghetto and, later, it destined him to the gas chamber. The evil Nazi state controlled the information, and the substantive effects will never be forgotten. Shortly after the end of the war, Europe adopted the Convention on Human Rights, ensuring the right to privacy for all persons, so that these horrors could not be repeated. Yet even in the modern world, states still interfere with informational privacy to substantively maligning effect. The Internet is intentionally tapped in North Korea and China to gain information about dissidents, which creates a general appreciation of the presence of surveillance and creates fear, which causes modifications to substantive actions, decisions, and the way people live their lives.

But why is any of this important to the privacy engineer? Simply put, remembering the very real connections between information and substantive actions and decisions creates a mental knot in the handkerchief of the mind (not to be glib about the use of information and the design of information processing systems). Often the substantive effects of information mishandling are hard to see, fathom, or articulate. The connection between a yellow star and a gas chamber is nonobvious. The harms or distress that may result from a security breach can also be nonobvious, likewise those resulting from data profiling, data aggregation, or data monetization. The privacy engineer will understand, however, that adherence to the principles and disciplines of engineering will provide the best prospects of understanding the substantive risks that can flow from the processing of personal information, and that engineering gives the best prospects for risk mitigation.

A captain of the industry has famously stated that the boundary between lawful data processing and unlawful interference with privacy is a "creepy line," a statement that for good or bad will endure along with "the right to be let alone" within the lexicon of privacy. If the boundary between lawfulness and illegality is allowed to creep and shift, the risk of unwelcome substantive effects becomes embedded within the organization. A risky business may accept this, but the privacy engineer who understands the connections between information and substantive privacy will understand the truth of this fascinating area: the boundary cannot creep and change, but should be fixed. This can only be achieved by coding the boundary into the architecture of the processing system.

[1] US v. Jones, 565 US __, 132 S. Ct. 945 (2012).
[2] See, for example, Copland v. United Kingdom, 62617/00 [2007] ECHR 253 (3 April 2007). See also the UK Information Commissioner's Employment Statutory Code of Practice (2008).
[3] See, for example, US Federal Trade Commission v. HTC, File No. 122 3049 (2013).
Privacy Engineering

Too often the controls and measures needed to protect the personal information handled by a process, application, or system are either ignored or bolted on at the 11th hour of development. When this happens, it usually results in a poor user experience, with subpar protections, unnecessary overhead, and customer dissatisfaction.

This is not a wishful or hopeful book about the management of data centers or leadership. This is a practical and pragmatic book that charts out an approach allowing for innovation from many workbenches: legal, technical, political, artistic, or logical. We can call these disciplines, when they come together to create something that promotes the best of data privacy, the innovative and beneficial uses of personal information, or those that chase out uncertainty and risk to data wherever possible: privacy engineering.

Engineering has been defined by the Engineers' Council for Professional Development as "the creative application of scientific principles to design or develop structures, machines, apparatus, or manufacturing processes, or works utilizing them singly or in combination; or to construct or operate the same with full cognizance of their design; or to forecast their behavior under specific operating conditions; all as respects an intended function, economics of operation, and safety to life and property."[4]

Privacy engineering as a discrete discipline or field of inquiry and innovation may be defined as using engineering principles and processes to build controls and measures into processes, systems, components, and products that enable the authorized, fair, and legitimate processing of personal information. Privacy engineering may also be applied to the creative innovation process to manage increasingly complex data streams and datasets that describe individual humans.

Privacy engineering can be considered the gathering and application of privacy requirements with the same primacy as other traditional feature or process requirements, and then incorporating, prioritizing, and addressing them at each stage of the development lifecycle, whether for a process, project, product, system, application, or other endeavor. The intent of privacy engineering is to close the gap between privacy policy and the reality of systems, technologies, or processes. The greater the mismatch between the two, the greater the opportunity for needless inefficiencies, risk, or both.

The risk of failing to follow a privacy engineering approach will be discussed in greater detail in later chapters. In short, poor system design, poor policy requirement gathering, or poor communication (the hallmarks of design without privacy engineering techniques) may cause risk or harm to the inventors of such systems, the owners of them, the individuals described or implicated by the data, or all of the above. Further, the monetary, reputational, organizational, or even criminal risks or harms will only increase for those who fail to adopt a privacy engineering approach as systems become more complex and personal data more valued.

Privacy engineering is not merely a call for mindful engineering where personal information is involved. The call for privacy engineering use and study is a call for leadership, innovation, and even a good measure of courage to change the status quo for design and information management.

[4] www.britannica.com/ebchecked/topic/187549/engineering
Once every system owner, designer, and user expects and understands privacy engineering principles, we expect that privacy engineering will become so integrated into standard innovation cycles that there will be no need to refer to a discrete practice. Rather, the principles of privacy engineering will be an obvious and necessary part of engineering of any kind when personal information is involved or potentially involved.

When privacy engineering becomes ubiquitous, individuals will not be treated as inventory, and data about them will be viewed as a special asset: important, sometimes profitable, and always one with a fundamental ethical value. When this happens, systems that use personal information will be designed, implemented, and decommissioned accordingly. However, to accelerate the arrival of this day and the ability to safely unlock the rewards of the Internet and the personal information service economy, there is an urgent need for leadership and for stakeholders to act expeditiously in adopting and extending the vision of privacy engineering as articulated throughout this book.

Getting to privacy engineering ubiquity will require many acts of courage and cunning. But, in the spirit of Ford Prefect in Douglas Adams's The Hitchhiker's Guide to the Galaxy: don't panic, and always carry a towel. Please consider this book your towel.

What Are the Real Privacy Risks?

So far, most of the individuals who have gone to jail for data privacy violations have been hackers, spammers, identity thieves, and peeping toms. Unless related to large or multimillion-dollar operations, most of the convictions do not receive wide-scale coverage in the mainstream media. It is the same with data breaches, which, unfortunately, are increasingly commonplace and thus less newsworthy.

But jail isn't the only possible repercussion for misbehaving in the privacy space and getting caught. Increasingly, corporations and organizations are being cited for privacy violations and are being fined, given sanctions, placed under regulatory supervision, or pilloried in the public square of opinion. Some of these fines have been in the multimillion-dollar range, required recoding of software and data deletions, resulted in multiyear sanctions requiring biannual privacy audits to be submitted to regulatory authorities for review, or caused a decline in shareholder value.
We propose that privacy engineers take responsibility for:

- Designing and constructing processes, products, and systems with privacy in mind that appropriately collect or use personal information
- Supporting the development, implementation, and measurement of privacy policies, standards, guidelines, and rules
- Analyzing software and hardware designs and implementations from a privacy and user experience perspective
- Supporting privacy audits
- Working with other stakeholders to ensure privacy requirements are met outside as well as inside the engineering space

We propose that privacy engineers, in addition to better protecting and ensuring the proper use of personal information in the things they design, build, and implement, will provide the following benefits to individuals as well as government and business enterprises:

- Protection for customers, users, or citizens
- A more objective basis for a trusted data platform
- A foundation to drive more thoughtful and higher-quality personal information services, sharing, and engagement

These benefits can lead to better and more information from users, which in turn helps to build and inspire better user experiences, better applications, better services, better products, and greater innovation. Before we get into the toolbox for privacy engineering or the implications privacy engineering has for organizational design, let's explore some key privacy concepts and frameworks.

Personal Information

It is critical for privacy engineers to thoroughly understand how personal information is defined and how its definition evolves and shifts over time. Personal information is the asset protected by privacy rules, processes, and technologies. Traditionally, personal information has been defined as information that directly identifies, or, in combination with other data, allows for the identification of, an individual (basic examples are an individual's name, address, phone number, or national or tax identification number), or any otherwise-anonymous information that in combination can describe only a single person. An example of this would be "the CPO of Sun Microsystems in 2005," because there is only one person who fits this description. An example of anonymous information would be "three of the thousand engineers carry laptops," because the characterization fits more than one person and, therefore, does not identify anyone in particular.

Traditionally, the term for these data elements has been personally identifiable information (PII); alternatively, it can be called personal information (PI). Using different nomenclature can create unnecessary confusion through unnecessary distinctions. The real issue is: does the data, alone or in combination with other data, identify a single individual? The term PII is useful, however, in determining which elements make a collection of information personal, or in identifying which data elements need to be removed to depersonalize or deidentify it. We will use PI as our convention throughout the rest of the book.

Some forms of PI are additionally considered sensitive, either culturally, under the law, or both (e.g., the type of information that can be used to embarrass, harm, or discriminate against someone).
Different cultures consider different categories of PI as sensitive PI, but the following are fairly common:

- Information about an individual's medical or health conditions
- Financial information
- Racial or ethnic origin
- Political opinions
- Religious or philosophical beliefs
- Trade union membership
- Sexual orientation
- Information related to offenses or criminal convictions

Largely due to the explosion of the Internet, mobile computing, and telecommunications technology, the definition of PI is evolving to include unique device and network identifiers such as the universally unique identifier (UUID) and Internet Protocol (IP) addresses. The Federal Trade Commission effectively redefined PI to include certain types of what used to be considered machine data, such as device IDs and IP addresses, when it stated in its 2010 report, Protecting Consumer Privacy in an Era of Rapid Change, that: "the proposed framework is not limited to those who collect personally identifiable information (PII). Rather, it applies to those commercial entities that collect data that can be reasonably linked to a specific consumer, computer, or other device."[5] It should be noted that not all device IDs or IP addresses should be considered PI de facto. Some devices, just as some IP addresses, are not associated with an identifiable person or personal system.

[5] Federal Trade Commission, Protecting Consumer Privacy in an Era of Rapid Change: Recommendations for Businesses and Policymakers, p. 43. www.ftc.gov/os/2012/03/120326privacyreport.pdf

How to Think About Deidentifying or Anonymizing Data

One way to remove risk or potential harm in processing personal information is to use only what is needed. One strategy for this is to deidentify or anonymize the data before using it. Anonymizing or deidentifying data begins when deciding what to collect or use. If personal information is not needed, then it is better not to collect or use it. Always ask: (1) is the information needed to serve the purpose of the processing, and (2) what is the minimum amount of information that is needed?

Example: birth date. Is the day and month of birth needed, or the full birth date (day, month, year)? If the purpose is to automate a birthday salutation, then month and day of birth should be sufficient. If the requirement is to ascertain age as part of authorizing access to content on a web site, ask just for month and year, or age, or better yet, ask for age in ranges of 5 years.

Example: geographic location. If the requirement is geographic, is GPS needed, or will street address, ZIP code, or just city and state meet the need?

The second part of the discussion has to do with uses of the data. Some of the uses of the data may require the elements that make it personal information; others may not. So it becomes important to think about how to anonymize or deidentify data. Does PI − P = I? In other words, if one removes the "personal," is what is left just "information"? Well, technically yes; but this is something you may not want to be right about merely on a technicality. Consider the number of people in the data pool. Although the information may be anonymous (because the personal identifiers have been removed), the data may still be so distinct, and the pool of possibilities so small, that it effectively reflects only three or four people. So, although the information does not truly identify a single person, the group is so small that an educated guess can easily be made as to who is in it.
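This "small pool" risk is what formal anonymization models such as k-anonymity attempt to measure. The following sketch is an illustration we have added (the field names and threshold are hypothetical, not from the original text); it counts how many records share each combination of quasi-identifiers and flags groups small enough to invite an educated guess:

```python
from collections import Counter

# Hypothetical records with direct identifiers already removed.
# Age range, ZIP prefix, and job title remain as quasi-identifiers.
records = [
    {"age_range": "30-34", "zip3": "945", "title": "engineer"},
    {"age_range": "30-34", "zip3": "945", "title": "engineer"},
    {"age_range": "50-54", "zip3": "945", "title": "chief privacy officer"},
]

def smallest_group_size(rows, quasi_identifiers):
    """Return k: the size of the smallest group sharing all quasi-identifiers.

    A dataset is k-anonymous if every record is indistinguishable from at
    least k-1 others on the listed attributes. A small k means a pool
    small enough for an educated guess about who is in it.
    """
    groups = Counter(tuple(row[q] for q in quasi_identifiers) for row in rows)
    return min(groups.values())

K_THRESHOLD = 5  # Illustrative policy choice; real thresholds vary by context.
k = smallest_group_size(records, ["age_range", "zip3", "title"])
if k < K_THRESHOLD:
    print(f"Reidentification risk: smallest group has only {k} record(s).")
```

Here the "chief privacy officer" record forms a group of one, so removing the name did not remove the risk.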
You could say there are different levels of anonymization. One in 10 is different from one in 10,000. Another vector to be considered is the methodology: how was the data anonymized? Were the unique identifiers removed completely from the dataset, or were they merely replaced with a pseudonym? If they were replaced with a pseudonym, does the pseudonym pass a reidentification test? Or can the data still be used to take action or to contact a person? If it doesn't pass the reidentification test, or it can still be used to contact a person or be reasonably linked to a system, then it cannot truly be called anonymized; perhaps deidentified, but not anonymized. (A sketch of this distinction follows at the end of this section.)

A third vector to consider is whether specific data elements are needed or whether ranges or categories suffice. In other words, using an executive income report as an example, one can remove names and titles, but even in large organizations the actual income may be unique enough that it identifies an individual even though all other descriptors have been removed or genericized. Finally, if the decision is to aggregate data, make sure it is anonymized as well. Aggregate data about a single individual is not necessarily anonymized.
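To make the pseudonym discussion concrete, here is a minimal sketch we have added (the key handling and function name are hypothetical). A keyed pseudonym stays stable across records and is reversible by whoever holds the key, which is exactly why pseudonymized data is deidentified rather than anonymized:

```python
import hmac
import hashlib

# Hypothetical secret held by the data custodian. Anyone holding this key
# can re-link pseudonyms to identifiers, so the output is deidentified,
# not anonymized.
PSEUDONYM_KEY = b"example-key-managed-outside-the-dataset"

def pseudonymize(identifier: str) -> str:
    """Replace a direct identifier with a stable keyed pseudonym.

    The same input always yields the same token, so records remain
    linkable for analysis, and the custodian can reidentify on demand.
    """
    return hmac.new(PSEUDONYM_KEY, identifier.encode(), hashlib.sha256).hexdigest()[:16]

row = {"email": "jane@example.com", "dept": "paint", "minutes_in_store": 42}
deidentified = {**row, "email": pseudonymize(row["email"])}
print(deidentified)  # The same pseudonym reappears wherever jane@example.com does.
```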
Privacy

Merriam-Webster's dictionary defines privacy as:

1. a: the quality or state of being apart from company or observation: seclusion
   b: freedom from unauthorized intrusion (one's right to privacy)
2. archaic: a place of seclusion
3. a: secrecy
   b: a private matter: secret

According to Yael Onn et al. in Privacy in the Digital Environment: "The right to privacy is our right to keep a domain around us, which includes all those things that are part of us, such as our body, home, thoughts, feelings, secrets, and identity. The right to privacy gives us the ability to choose which parts in this domain can be accessed by others, and to control the extent, manner, and timing of the use of those parts we choose to disclose."[6]

[6] Yael Onn et al., Privacy in the Digital Environment. Haifa Center of Law & Technology, 2005.

Privacy defined colloquially seems to be subjective rather than systematic or governed by objective or pragmatic requirements; privacy is certainly contextual, including cultural and time-sensitive contexts that introduce variability and complexity. What one person may feel is the appropriate level of privacy can change based on the situation. One person's sense of the appropriate privacy level for a given situation may be different from another's. Further complicating this is the fact that, across the world, cultural values and social norms vary widely. Finally, the same person's notions and sensitivities may change over time and context, which is to say, what one may want to share at one point in his or her life may change as life progresses, just as it changes based on the environment.

Consider, as an example, the act of wearing a bathing suit. An office worker would probably feel that his or her sense of privacy was being violated if a condition of employment was to wear a bathing suit to work; but this is not so for a swimming pool lifeguard. External social and cultural norms would also be violated in the former instance (contextual). However, even for a lifeguard, the type and cut of bathing suit is a factor in acceptability, social normative value, and sense of well-being (subjective). The challenge of privacy engineering is to architect and design products, processes, or systems that are sufficiently configurable to allow this sort of control.

An Operational Definition of Privacy

Data privacy may be defined as the authorized, fair, and legitimate processing of personal information. Much of the activity resulting from this functional definition will appear to focus on organizations and management philosophies and policies from that perspective, but it must always be remembered that the individual data subject (literally the subject matter of the information, i.e., the individual to whom the data applies) remains the ultimate requirement-setting entity. To the extent feasible, flexibility built into privacy-engineered solutions will always be critical to properly govern that very human variability. Note, too, that it is not always possible to make everyone happy.

Although this operational definition may seem deceptively simple, we can break it down into its components to start to see it as the beginnings of a pragmatic framework: one that not only defines data privacy but also lets us begin to build from these foundations (Figure 2-1).

Figure 2-1. What is privacy?

We have already discussed and defined personal information, so now let's turn to what is meant by processing, authorized, and fair and legitimate.

Processing of Personal Information

Data is processed upon any action or inaction that can be performed in relation to that data or dataset. Processing personal information includes, but is not limited to, collection, storage, use, sharing, organization, display, recording, alignment, combination, disclosure by transmission, copying, consultation, erasure, destruction, and alteration of personally identifiable information and any data related to it.

Does the Use of Data Fit Within a Cultural Context?

By Martin Abrams, Executive Director and Chief Strategist for the Information Accountability Foundation

The slogan "Keep Austin Weird" works really well for that swinging Texas city, but the culture in Hebron, Texas, would likely not be associated with weird, at least not in the same way. Local cultures are reflected in the way people interact with people, and privacy is one of those areas where culture is reflected. Privacy scholars such as Alan Westin, who established the basis for modern privacy management, understood that privacy culture is a function of how a society balances the autonomy of the individual against the interests of the group, and then factors in the way a society defines a space reserved for the individual, free from observation by others. Although residents of both Hebron and Austin might have similar views on concepts of space, the balance between individual expression and community cohesiveness would be very different. Understanding cultural diversity and applying it to privacy is difficult enough when making decisions about what is an appropriate use in Texas; now think about looking at a global program that needs to work in Germany, Japan, weird Austin, and stern Hebron.

How does an engineer begin building application requirements that fit the cultural context of diverse populations? Let's use an example. Millions of smartphones are sold each year in places as diverse as Galesburg, Illinois; Bangalore, India; and Frankfurt, Germany. Each smartphone has a unique signature, just as each of us has distinct fingerprints. All smartphones are designed to run on Wi-Fi networks. This design factor saves consumers money on their monthly mobile bills.
It is no surprise that most consumers want to save money, so they set their phones to look for available Wi-Fi networks. An innovative engineer quickly figured out that one can track a device through a physical space, like a store, by equipping the space with Wi-Fi. Furthermore, the engineer can see how much time the individual spends within a physical quadrant and can then link that information to the activities that take place in that quadrant. If it is a store, the activity is most likely shopping. For example, if the mobile device is in a home improvement store, the engineer now knows how long the device spends in the paint department and when it moves from paint to window treatments. Maybe he or she can even link the shopping activity to the items purchased and track what the device buys over time. It's not the device that buys the item; it is actually the individual holding the device. And while the device might not have a cultural perspective, the individual does. It really doesn't make any difference whether we know the name of the individual; the actions we take based on tracking the device are particular to that individual. So the privacy question becomes: is it appropriate to take actions based on the predicted behavior of the individual holding the device?

The answer is: it depends. In the United States we have many conflicting values. First and foremost, we believe that we are free to observe what we are free to see and hear within the public commons. In the physical world, we, as a society, have defined the public commons: pretty much, it is anything outside one's home. It is the public street, the shopping mall, the front yard, and the courtyard, if one is flying over in an airplane. Furthermore, we are free to express ourselves based on how we process what we have observed. Making a sales offer is a form of expression. This value is captured by the First Amendment to the Constitution.

The American people also cherish seclusion. That means, in our private space, we are free to do what we will do and think what we will think without fear of others observing and using what they hear and see. Our home is our castle, and it is not part of the public commons. You may watch me in my front yard, but you may not look in my window and invade my seclusion.

In the United States, the Wi-Fi-enabled store is the public commons. The observation of a device in a public space is probably okay, even if some might consider it obnoxious. Furthermore, we are free to think about what we have learned and apply that knowledge for practical ends such as increasing sales. The preeminent nature of observation based on free expression doesn't have the same deference in other cultures. In those cultures, the sense that privacy is a fundamental right trumps the recording of what we observe and the making use of that information. This is particularly so for most other Western cultures. In Germany or France, the collection of the device signature, if it is easily linkable to an identifiable individual, is probably subject to data protection law. Such a collection would be a processing of personal information that requires permission from either the individual or the law. Furthermore, any additional processing of that information, even storage, would also require permission from the law or the individual. We are talking about the same activity in different locations and having two different takes on whether the use is appropriate.
U.S. culture puts a premium on free observation in the public commons, while societies with traditional data protection have no such deference for free observation. So, if an engineering team were to develop an observation model for a client that depends on observing devices in a physical space, the application would probably work in U.S. stores but would violate both societal norms and laws in stores in Western Europe. The analysis might be entirely different in Asia, where rights to seclusion are limited but where such observation might be seen as violating norms necessary for a crowded society where physical space is limited. The laws are different because the cultures are different.

These differences in privacy culture have impacted digital public policy for more than 30 years. Justice Michael Kirby, former chief justice of the Australia High Court, led the experts who developed the Organisation for Economic Co-operation and Development (OECD) privacy guidelines between 1978 and 1980. He said the most difficult issue he had to overcome in leading that group was the huge deference Americans give to free expression. Even though these differences are understood, we tend to default to what feels comfortable to each of us. Business concepts based on monetizing the fruits of observation have been developed in the United States, but when the same applications are applied outside the United States, we tend to see friction.

Ninety percent of the privacy issues that concern both individuals and regulators are the same no matter where the activity takes place. These include ensuring security, accommodating transparency, and not facilitating illegal behavior by others. If one deals with these issues, one can have a fairly high level of certainty that an application is okay. Moving beyond what is the same, one can anticipate key cultural markers. One such marker is the age at which an individual reaches maturity; this influences the consent children and adults are able to grant. Next, one needs to be truly sensitive to cultural differences related to observation. You know when the technology tracks behavior, so tracking is an indicator that a cultural review is necessary when a technology is taken from one geographic market to another. Such applications probably require a privacy impact assessment (discussed in Chapter 10) with experts who understand the cultural frame. Lastly, there are cultural aspects to automated decision making. If applications make decisions without human intervention that impact the ability of an individual to gain employment, get credit or insurance, or travel, one should check cultural norms related to such decision making. Just be sensitive to the fact that what is appropriate where you are doesn't mean it will be appropriate somewhere else; if you keep this in mind, you should be successful in your data-use initiatives.

Authorized

Authorized processing of personal information happens only where the person or organization processing it has the appropriate privilege for that processing. Additionally, there is a chain of custody and a sense of fiduciary responsibility that must follow the PI throughout the lifecycle of its processing. For example, those who can access a system containing PI must be authenticated as the people they claim to be, and each individual must also be acting within a role that would allow him or her to process the data within that system. A minimal sketch of such a check follows.
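The sketch below is our illustration (the roles, permissions, and function name are hypothetical, not a prescribed design). It encodes the two questions just described: is this the authenticated person we think it is, and does his or her role permit this processing operation?

```python
# Hypothetical role-to-operation mapping: which processing operations
# each role may perform on personal information.
ROLE_PERMISSIONS = {
    "support_agent": {"read"},
    "billing_clerk": {"read", "update"},
    "retention_job": {"erase"},
}

def is_authorized(session: dict, operation: str) -> bool:
    """Authorized processing requires both authentication and a permitted role.

    `session` is assumed to carry the outcome of an earlier authentication
    step (e.g., a verified login) plus the role the user is acting within.
    """
    if not session.get("authenticated"):
        return False  # Unauthenticated actors never process PI.
    allowed = ROLE_PERMISSIONS.get(session.get("role"), set())
    return operation in allowed

session = {"authenticated": True, "role": "support_agent"}
print(is_authorized(session, "read"))   # True
print(is_authorized(session, "erase"))  # False: outside the acting role.
```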
The type of data, the nature of the processing, and local laws and regulations will determine the nature and level of permission that may be required. The four primary protocols for permission gathering are:

- Opt out/opt in
- Implied consent
- Informed consent
- Express consent

Opt out allows processing of PI unless or until an individual rejects data processing according to the context at hand. Opt in (the logical twin of opt out) is where no processing is allowed unless and until permission is granted. These concepts are relatively new in the comparative areas under the law, as discussed below, particularly in common-law jurisdictions. Context, narrowness of purpose, and transparency practices can make opt out or opt in relatively effective mechanisms.

Implied consent is a relatively straightforward concept where the context of collection and other processing is deemed so routine, obvious, and expected that permission for processing within this context may be implied by an individual's participation in the contextual relationship at all. An example of implied consent would be when PI is used for necessary processes (business or otherwise). When you give your name and telephone number for a reservation, the permission to use it to hold your table and for the maître d' to use it to call you is implied, because it is necessary and within the scope of the function for which it is being used. However, if the maître d' chose to send text messages to the reservation number to solicit charitable donations to his favorite charity, he would be violating the implied consent to use contact information.

Informed consent relates to a very well-established and understood area of contract and tort law where a data subject has all relevant and timely facts to enable a reasonable choice of whether, how, how much, and for what purpose data will be processed. A good example of well-informed consent in a nondata context is the difference between accepting the risks of skiing vs. receiving medical treatment from a trained doctor. In the former example, an individual is physically aware of his condition, standing on a snowy mountain on two small skis. Yet there may be unexpected risks, and thus a disclaimer may be written on his ticket, but that disclaimer may be in small type and come with no individualized explanations. In the latter example, however, the doctor and patient have very different levels of expertise, the procedures and risks may be unfathomable to the reasonable layperson, and the side effects may be unknowable without specific clarity. The type and depth of disclaimer and exposition of risks and rewards are much different and far more extensive in this case. Informed consent requires some responsibility and action on the part of the data subject and so may never become universally accepted as the standard for gaining or maintaining authorization, but its longevity in other fields of risk management and conflict resolution, and the various aspects that allow breaking informed consent into measurable components, make this form of consent particularly attractive to the budding privacy engineer.

Express consent is simply where a person takes a specific observable action to indicate and confirm that they give permission for their information to be processed. An example of this is checking a box that says "yes" on an online form.
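As a way to see how these permission protocols might surface in a design, here is a minimal data-model sketch we have added (the enum values and fields are hypothetical). Recording which form of permission was obtained, for what purpose, and when is what later allows an engineer to demonstrate that processing was authorized:

```python
from dataclasses import dataclass
from datetime import datetime, timezone
from enum import Enum

class ConsentType(Enum):
    OPT_OUT = "opt_out"   # Processing allowed until the individual objects.
    IMPLIED = "implied"   # Inferred from an obvious, expected context.
    INFORMED = "informed" # Given after relevant and timely disclosure.
    EXPRESS = "express"   # A specific observable action, e.g., a checked box.

@dataclass
class ConsentRecord:
    subject_id: str
    consent_type: ConsentType
    purpose: str                # The stated purpose the permission covers.
    granted_at: datetime        # Must predate any collection or processing.
    revoked_at: datetime | None = None

    def covers(self, purpose: str, at: datetime) -> bool:
        """True if this record authorizes the given purpose at the given time."""
        if self.revoked_at and at >= self.revoked_at:
            return False
        return self.purpose == purpose and at >= self.granted_at

record = ConsentRecord("user-17", ConsentType.EXPRESS, "reservation_contact",
                       granted_at=datetime(2024, 1, 5, tzinfo=timezone.utc))
print(record.covers("reservation_contact", datetime.now(timezone.utc)))   # True
print(record.covers("charity_solicitation", datetime.now(timezone.utc)))  # False
```

The maître d' example maps directly onto the last line: the reservation purpose is covered, the charity solicitation is not.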
So that it does not go unrecognized: express consent and informed consent are both subspecies of opt in. The strength and validity of any of these permission forms and types depend on the clarity, conspicuousness, and proximity of the data processing intended to be governed by the authorization. It must be clear that the user knew what was being accepted for the permission to be valid when it was granted. Similarly, permission must be freely given and not under duress for data processing to be authorized to the appropriate degree. The other key ingredient, for all these different forms of permission, is that they must be presented before personal information is collected and before it is processed. For example, there has been much debate about the ability of web site operators to use cookies on the first page of a web site, where notice is presented about the possibility of data collection through electronic means. In fact, the difficulty of ensuring that data subjects know and understand the potential and actual processing of their data in a clear, conspicuous, and proximate fashion is one of the many reasons that those processing the data, governing bodies, and users are skeptical that a governance and enforcement regime focused on notice and consent is effective in today's data-enriched environment.

Permission is only one component of ensuring that PI is processed with authorization. In addition to ensuring that one has permission to use the data, one also has to be able to manage and prevent unauthorized use of, or access to, the data. This requires using controls and measures to ensure PI and related data are processed in an authorized and legitimate manner. These controls and measures can take the form of administrative, logical, technical, and physical routines, or a combination of all of these, which will be discussed later in this chapter and in Chapter 6.

The Evolution of Consent

By Eduardo Ustaran, Data Protection Lawyer and author of The Future of Privacy

Is individual choice still the essence of data privacy law? In the early days of data protection as a regulated activity, putting people in control of their information was thought to be what mattered the most. From the 1980 OECD guidelines to the latest version of the EU e-Privacy Directive, consent has been a cornerstone across legal regimes and jurisdictions. European data protection law is based on the principle that an individual's consent is the most legitimate of all legitimate grounds to use information about people. But does this approach still hold true? Can we, as individuals, attempt to have a meaningful degree of control over the vast amount of information we generate as we go about our lives? Information about who we are, what we do, what we are like, and how we behave is captured every single second of the day. From the moment we turn on the light (or the BlackBerry or our smartphone) in the morning to the moment we turn it off in the evening, every action that involves using technology is recorded somewhere.
The Internet has maximized this in such an unprecedented way that the value of the information we generate by simply using it makes other, more traditional identifying factors look trivial. From a legal perspective, this phenomenon has entirely distorted the meaning and scope of personal data, but the point is that information about us is constantly flowing around the world without our knowledge, let alone our consent.

Let's face it: attempting to put people in control of their own information by giving them the power to consent to the uses made by others is simply unachievable. The concept of consent should not be underestimated. The ability to make choices is what makes people free. However, pretending that we can take a view in any meaningful way as to how information about us is gathered, shared, and used by others is wishful thinking. We cannot even attempt to recognize what personal information is being made available by us in our daily comings and goings, so how could we possibly decide whether or not to consent to every possible use of that information? Consent might have been a valid mechanism to control data-handling activities in the past, but not anymore.

So what now? Is data privacy dead? I hope not. But in the same way that our ability to control our own information is moving away from us, our responsibility to decide what others can know about us is also receding. Our privacy is less than ever in our own hands, because the decision-making power is not really ours. Any legal regime that puts the onus on the individuals who are meant to be protected by that regime is bound to be wrong. The onus should not be on us to decide whether a cookie may reside in our computer when hardly anyone in the real world knows what a cookie does. What the law should really do is put the onus on those who want to exploit our information by assigning different conditions to different degrees of usage, leaving consent to the very few situations where it can be truly meaningful. The law should regulate data users, not data subjects.

Like it or not, individuals have a limited role in the data-handling decision-making process. That is a fact, and regulation should face up to that fact. Technology is more and more complex, while our human ability to decide remains static. Feeding us more detailed and complex privacy policies does not change that. In the crucial task of protecting our personal information and our privacy, consent can have only a residual role. Continuing to give consent a central role in the protection of our privacy is not only unrealistic but also dangerous, because it becomes an unhelpful distraction for individuals, organizations, and regulators. The emphasis must simply be put elsewhere.

Fair and Legitimate

Of all the concepts that underpin the notion of data privacy, the ability to provide information handling that is fair and legitimate is probably the most complex and difficult to reduce to a scientific rule or even an approximate measurable metric. The concept of fair and legitimate processing is not limited to the organizational view of fair as necessary (or, more often, desired) processing. However, a series of principles called the Fair Information Practice Principles (FIPPs), as embraced by the OECD in the OECD privacy guidelines, is a useful prism through which to look at the notion of fairness and legitimacy.

Fair Information Practice Principles and the OECD Guidelines

The original FIPPs were developed by the US Department of Health, Education, and Welfare in the early 1970s, in reaction to concerns over the implementation of large government databases containing information on US citizens.
As mentioned earlier, the principles were then extended by the OECD in 1980 in a document titled The OECD Guidelines on the Protection of Privacy and Transborder Flows of Personal Data.[7] These principles, commonly known as the OECD principles, have since become the foundation for much of the existing privacy legislation and thinking throughout the world. More important, they continue to be a cornerstone in grounding governments, businesses, and consumer advocates in their approach and dialogues on privacy and the use of personal information. In other words, they form the common vocabulary in which privacy is discussed. As we detail later in this chapter and elsewhere in Part 2, most privacy laws and regulations (and thus privacy policies and privacy rules) are derived from the FIPPs and the OECD guidelines.

Collection Limitation Principle

The OECD guidelines, published in 1980, state that "there should be limits to the collection of personal data and any such data should be obtained by lawful and fair means and, where appropriate, with the knowledge or consent of the data subject."[8] This means that before PI is collected or otherwise processed, the processor must obtain permission to process the data. There are rare exceptions to this requirement, including certain types of law enforcement practices and national security purposes.[9] Given the increasing reality of law enforcement requests and requirements from around the world, it is imperative that privacy engineers contemplate such uses and their potential conflict with the collection limitation principle for their processing.

[7] An outgrowth of the Organisation for European Economic Co-operation (OEEC), which was formed in 1948 and chartered to run the Marshall Plan, the OECD, established in 1961, consists of 34 countries who work collaboratively to help governments foster prosperity and fight poverty through economic growth and financial stability.
[8] The OECD Guidelines on the Protection of Privacy and Transborder Flows of Personal Data. http://www.oecd.org/internet/ieconomy/oecdguidelinesontheprotectionofprivacyandtransborderflowsofpersonaldata.htm#part2. All quotes from the OECD guidelines come from this source.
[9] Even those cases are not consistent from jurisdiction to jurisdiction, and in those cases there must be other control processes in place to ensure that individual rights are not being violated and that the data is collected in a manner that allows law enforcement to use it for policing or security.

Data Quality Principle

From the OECD guidelines: "Personal data should be relevant to the purposes for which they are to be used, and, to the extent necessary for those purposes, should be accurate, complete and kept up-to-date."

There are two key ideas in this principle. The first is relevancy (i.e., the data collected and used must be genuinely pertinent to the purpose and proportional; that is, only the appropriate amount and type of data to suit the purpose of its collection or processing). The second idea is accuracy. This is important because it creates obligations on the part of the entity that controls the data to ensure data integrity. This requirement has evolved to also require giving data owners the ability to access their data and correct or update any errors. It should be noted that data integrity is one of the core principles and goals for the security practitioner as well.
For security, confidentiality, integrity, and availability are the key markers for success and for planning security requirements. Throughout this book we will note where synergies and common goals exist, as in the case of data integrity. In doing so, the building and maintenance requirements for privacy engineers should be viewed as additive to other requirements rather than competing with, or negating, compliance requirements after the fact.

Purpose Specification Principle

From the OECD guidelines: "The purposes for which personal data are collected should be specified not later than at the time of data collection and the subsequent use limited to the fulfilment of those purposes or such others as are not incompatible with those purposes and as are specified on each occasion of change of purpose."

This principle provides guidance regarding the type and quality of transparency or notice. From an innovator's perspective, creators of systems or services should carefully consider how PI will be used throughout the lifecycle of the current situation and should plan ahead as carefully and fully as possible to ensure that enough flexibility for data processing is built into the system, along with any contextual cues, including notice leading to transparency and understanding of data use.

Use Limitation Principle

From the OECD guidelines: "Personal data should not be disclosed, made available or otherwise used for purposes other than those specified in accordance with paragraph 9 [purpose specification principle] except: a) with the consent of the data subject; or b) by the authority of law."

This principle qualifies both the limits for data processing and the expectations of the data subject, and it also suggests conditions for potentially adding to the type, kind, and timing of data processing when that processing was not included in the initial authorization. As discussed previously, some legal enforcement should be contemplated and presented in the original purpose specification of the notice. A sketch of how purpose specification and use limitation can be checked in code appears at the end of this section.

Security Safeguards Principle

From the OECD guidelines: "Personal data should be protected by reasonable security safeguards against such risks as loss or unauthorised access, destruction, use, modification or disclosure of data."

Any entity controlling PI must protect it from unauthorized access or processing. This principle clearly invokes the wide and complicated discipline of security for all types of data, but it focuses the requirement on specifically protecting personal data. This is one of the overlaps between privacy and security that will be discussed later in this chapter.

Openness Principle

From the OECD guidelines: "There should be a general policy of openness about developments, practices and policies with respect to personal data. Means should be readily available of establishing the existence and nature of personal data, and the main purposes of their use, as well as the identity and usual residence of the data controller."

Publication of privacy policies and statements is one means to achieve a level of openness in and about an organization.
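As promised above, here is a minimal sketch we have added (the registry layout and purpose names are hypothetical) of how the purpose specification and use limitation principles can be enforced mechanically: every proposed use of a data element is checked against the purposes specified at collection, and anything else requires fresh consent or legal authority:

```python
# Hypothetical registry of purposes specified at collection time,
# keyed by data element. In a real system this would be populated
# from the notice presented to the data subject.
SPECIFIED_PURPOSES = {
    "phone_number": {"reservation_contact"},
    "birth_month_year": {"age_verification"},
}

def use_is_permitted(element: str, proposed_purpose: str,
                     has_consent: bool = False, legal_authority: bool = False) -> bool:
    """Apply the use limitation principle to one proposed use.

    Uses within the specified purposes pass; anything else passes only
    with the data subject's consent or by the authority of law.
    """
    if proposed_purpose in SPECIFIED_PURPOSES.get(element, set()):
        return True
    return has_consent or legal_authority

print(use_is_permitted("phone_number", "reservation_contact"))   # True
print(use_is_permitted("phone_number", "charity_solicitation"))  # False: new consent needed.
```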
Individual Participation Principle

From the OECD guidelines: "An individual should have the right: a) to obtain from a data controller,[10] or otherwise, confirmation of whether or not the data controller has data relating to him; b) to have communicated to him, data relating to him within a reasonable time; at a charge, if any, that is not excessive; in a reasonable manner; and in a form that is readily intelligible to him; c) to be given reasons if a request made under subparagraphs (a) and (b) is denied, and to be able to challenge such denial; and d) to challenge data relating to him and, if the challenge is successful, to have the data erased, rectified, completed, or amended."

This principle describes an individual's right to update, correct, and know which data has been collected about him or her by a given entity. It is closely related to the accuracy principle. Much innovation is required for this principle, particularly in a world of vastly dispersed and complex data sharing and processing, even to achieve relatively simple goals.

[10] Author note: A data controller is the entity that is responsible for determining how data is processed. The data controller gives direction to the data processor. Sometimes the data controller and data processor are one and the same; sometimes not, such as in outsourcing. In such a situation, the service provider is the data processor.

An example of some of this complexity may be the fulfillment of an online contact lens service, where an individual may be described by a common carrier, an ophthalmologist, a fulfillment center, a manufacturer, and more. For any one individual to glean where and when his data changes hands among all of these specialized and related steps is a daunting task indeed.

A Practical Approach to Determining If Data Collection and Use Is Fair and Legitimate

Here is a two-tiered process to determine if data is needed. The first tier is to ask the question: is this data needed? Not wanted, needed. If the answer is yes, and all other design and architectural reviews and options (such as not collecting at all, or truncating or deidentifying the data) have been exhausted, then run each data element through the following tests:

- I need X to do Y.
- Without X, I cannot do Y.

If the answer to the first two statements is true, proceed to the third:

- Y is a subset of the uses for the data for which Z has given permission (Y ⊂ Z).

If this is true, then ask: does it pass the smell test (does it fit the spirit of the permission, as well as the letter)? If the answer is yes, then proceed. If the answer is no, then based on the data and the use (i.e., the risk), explore what level and type of notice and consent are required and consider how best to expand the existing permission to cover the contemplated use. (A sketch of this test in code follows.)

If there is reluctance to go back to an individual for permission, then someone has to ask what the locus of that discomfort is. It usually is because the benefit is not so much for the person but for the organization, or because there is a lack of proportionality between the risk to the privacy of the individual vs. the benefit to him or her. Knowledge of this will help the real goals and purpose of the processing to surface, which will then lead to a more productive discussion of how to address and manage the risks.
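The two-tiered test above lends itself to a checklist that can be embedded in a design review. The sketch below is our illustration of that flow (the field names are hypothetical); it simply encodes the questions in order and reports the first one that fails:

```python
def fair_and_legitimate(element: dict) -> tuple[bool, str]:
    """Walk one data element through the two-tiered 'needed, not wanted' test.

    `element` is a hypothetical review record, e.g.:
    {"needed_for_purpose": True, "purpose_fails_without_it": True,
     "within_permitted_uses": True, "passes_smell_test": True}
    """
    checks = [
        ("needed_for_purpose", "I need X to do Y"),
        ("purpose_fails_without_it", "Without X, I cannot do Y"),
        ("within_permitted_uses", "Y is a subset of permitted uses (Y \u2282 Z)"),
        ("passes_smell_test", "Fits the spirit, not just the letter, of permission"),
    ]
    for key, description in checks:
        if not element.get(key, False):
            return False, f"Failed: {description}. Revisit notice and consent."
    return True, "Proceed."

ok, verdict = fair_and_legitimate({
    "needed_for_purpose": True,
    "purpose_fails_without_it": True,
    "within_permitted_uses": False,  # The contemplated use exceeds permission Z.
})
print(ok, verdict)
```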
accountability principle

from the oecd guidelines: a data controller should be accountable for complying with measures which give effect to the principles stated above.

this principle means whoever is controlling the data, that is, in charge of determining how it will be used and processed, is the party who will be held responsible for ensuring the data is processed in an authorized and fair and legitimate manner and will bear the consequences if it is not.

the intersection of privacy, unique identifiers, and collecting telemetry

telemetry is the collection of information about machines and systems. it is often collected remotely to monitor how a system is functioning so that issues can be detected and resolved in advance or in order to provide services. sometimes it contains unique identifiers. the most obvious of these is the ip address, but there are also things like machine name, media access control (mac) address, and so on. although collection of telemetry was not considered in the past to be the same as collecting personal information, there have always been privacy concerns with it. these concerns were mainly whether the collection of it was authorized or not and thus whether it was a form of spyware or not (think industrial espionage). however, with the widespread adoption of smartphones, pdas, and other devices, the quantum leap in the ability to collect, parse, and understand patterns (i.e., big data or data science) and the ability to act on those patterns and push communications to devices (or take other actions) based on what was once just considered machine data has changed all of this. now unique identifiers, such as those collected as part of collecting telemetry, need to be examined and considered.

the important thing to remember in evaluating whether a unique identifier falls under the definition of pi is that not all unique identifiers are equal. below is a list of characteristics to consider when evaluating unique identifiers to see if any one of them is something that can reasonably be linked to a person or a person's device (vs. a system that front ends a network); a scoring sketch follows the list:

uniqueness

reidentification (correlating an identifier with other data that leads to the ability to identify the user)

using as an anchor to aggregate and analyze information from one or more sources

permanence

frequency of change

ease of change

reachability (can it be used to contact or track)
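here is a minimal sketch scoring an identifier against the characteristics above. the profile fields map directly to the list, but the weights, threshold, and example values are illustrative assumptions; real evaluations are qualitative and context-dependent.

```python
# a minimal sketch scoring a unique identifier against the characteristics
# listed above. the threshold is an illustrative assumption.

from dataclasses import dataclass

@dataclass
class IdentifierProfile:
    unique: bool              # uniqueness
    reidentifiable: bool      # correlatable with other data to identify a user
    aggregation_anchor: bool  # usable to aggregate data across sources
    permanent: bool           # permanence / low frequency of change
    hard_to_change: bool      # ease of change (inverted)
    reachable: bool           # can it be used to contact or track

def likely_pi(profile: IdentifierProfile, threshold: int = 3) -> bool:
    """treat the identifier as pi when enough risk characteristics are present."""
    score = sum([profile.unique, profile.reidentifiable,
                 profile.aggregation_anchor, profile.permanent,
                 profile.hard_to_change, profile.reachable])
    return score >= threshold

# usage: a device mac address is unique, permanent, hard to change,
# and a strong aggregation anchor.
mac = IdentifierProfile(True, True, True, True, True, False)
print(likely_pi(mac))  # True -> handle as personal information
```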
other governance standards of which to be aware

in addition to the oecd guidelines, there are other frameworks, such as the generally accepted privacy principles (gapp), the 1995 eu data directive (also known as directive eu 95/46/ec), the federal trade commission's version of the fipps, the asia-pacific economic cooperation (apec) privacy principles, and international organization for standardization (iso) standards, that will inform how personal information and privacy issues are managed and governed. in the previous section, the oecd guidelines were highlighted to explain the notion of fair and legitimate processing of personal information. these other frameworks help one get to a more granular and comprehensive view of data governance, which will be discussed in chapter 3.

privacy is not confidentiality and security is not privacy

confidentiality is about protecting designated nonpublic information (often information that is either a trade secret or proprietary) (figure 2-2).

figure 2-2. confidentiality is not privacy

confidentiality rules only apply to what is designated by agreement as confidential. sometimes confidential information is also personal information. for example, some information relating to the private lives of individuals may be confidential, such as medical records or family secrets. sometimes, actually often, confidential information contains no pi. this is the first difference between confidentiality and privacy. confidential is an imposed label that signifies access control. pi is an organic label; it speaks to the substance of the information. just as with that famous line in shakespeare's immortal play romeo and juliet, "a rose by any other name would smell as sweet," so it goes with pi. pi is always going to be personal information when it identifies an individual.

another difference is that the rules that govern or protect pi apply whether the personal information is public or not. just because pi is public does not mean it can be used or processed for one's own purposes. one example of this is e-marketing lists. many of our e-mail addresses are publicly available, but that does not mean they can be wantonly maintained on e-marketing lists without our permission.

a third difference, and perhaps the most important, is that when the pi is nonpublic personal information, keeping it confidential only addresses the access requirement and not the use or any of the other requirements of the oecd guidelines. so, although there is overlap between the safeguards used to protect personal information and the safeguards used to protect confidential information (most of the overlap is in terms of access control), protecting one is not the same as protecting the other.

just as privacy and confidentiality overlap but are not the same, privacy and security overlap in that each is about data protection, but they are not the same (figure 2-3).

figure 2-3. security does not equal privacy

information security has three areas of focus, known as cia:

confidentiality (i.e., preventing unauthorized access)

integrity (i.e., ensuring the data is not altered without approval)

availability (i.e., ensuring the data is accessible)

it uses logical, administrative, and physical safeguards to ensure the cia of the data is maintained. aspects of security that do not overlap privacy include:

defense in depth: a sophisticated firewall structure can protect personal information.

data loss prevention (dlp): discovering and monitoring the location and flow of sensitive data such as customer credit card data, employee pi, or corporate intellectual property.

security information and event management (siem)

the overlaps

the safeguards enable the "authorized" in the authorized access and use element that is a cornerstone of the operational definition of privacy. this is the first overlap between privacy and information security. in addition to the fact that both information security and privacy are data protection regimens, other areas of overlap are:

integrity (information security) and accuracy (privacy)

availability (information security) and access (privacy)

accountability (both)

confidentiality (when the data is both personal information and nonpublic)

information security's focus on data integrity overlaps with privacy's accuracy requirement in that both target ensuring the data is not altered without authorization. information security's availability requirement supports privacy's access requirement because if the data is not available, it cannot be accessed. both information security and privacy doctrines require data owners and custodians to be responsible for protecting the data in accordance with the respective protection regimen, which is a form of accountability. and when the information is both nonpublic and personal information, confidentiality supports privacy because nonpublic data needs to be kept nonpublic.
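to make the overlaps traceable in a design review, here is a minimal sketch of the security-to-privacy mapping just described. the mapping restates the text above; the dictionary form is our own illustration.

```python
# a minimal sketch of the security-to-privacy overlaps described above,
# usable when tracing which security goal supports which privacy requirement.

OVERLAPS = {
    "integrity": "accuracy",          # unaltered data supports accurate data
    "availability": "access",         # data must be available to be accessed
    "accountability": "accountability",
    "confidentiality": "authorized access (nonpublic pi only)",
}

for security_goal, privacy_goal in OVERLAPS.items():
    print(f"{security_goal} (security) -> {privacy_goal} (privacy)")
```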
the disconnects

the reason there is not a complete overlap between privacy and information security is threefold. first, privacy has a wider set of obligations and responsibilities than information security does, such as:

collection limitation

openness

relevancy

use limitation

this means there are things privacy addresses that information security does not. the second disconnect is confidentiality. because pi is not always nonpublic (consider the phonebook), the notion of confidentiality does not always apply. also, in a resource-constrained world, if the data is not considered confidential, it is not always valued, and the necessary measures to ensure authorized access and use will be overlooked. third, and perhaps most important, while information security techniques can be privacy-enabling technologies (pets) (which means they are tools that enable privacy) and are often necessary, these pets can also become feral if applied incorrectly (i.e., in an invasive manner). this is why you can have security without privacy, but you cannot have privacy without security. this will be discussed further in part 2.

conclusion

the purpose of this chapter is to enable you to understand the nature of privacy and privacy engineering. this is the foundation and context for the guidance (the explanation of tools and techniques) that makes up the remainder of this book. if you follow the guidance in this book, you will be poised for success and you will have a set of tools you can use and configure to enable privacy, but actual success will ultimately depend on how you tailor the guidance that follows to specific situations (i.e., the data, the processing, whose data it is, and the specific jurisdiction, regulations, or best practices that apply) and how you configure the tools we are providing. chapter 3 will discuss privacy and data governance concepts.
<Chapter>: chapter 3
<Section Text>: data and privacy governance concepts

computers are magnificent tools for the realization of our dreams, but no machine can replace the human spark of spirit, compassion, love, and understanding. louis gerstner

this chapter will look at the relationship among privacy frameworks and data management, data governance, and data stewardship, highlighting how frameworks such as the oecd guidelines and gapp are used for personal information management. included in this discussion will be a look at privacy by design (pbd), which supports and complements privacy engineering (figure 3-1).

figure 3-1. good privacy engineering is built on a foundation of data management and governance

data management: the management of stuff

the raison d'être of any organization, whether a corporation, a nonprofit, or a governmental entity, is to do stuff; doing stuff requires managing stuff. data represents this stuff. examples include:

customers

suppliers

money

resources

products

customer orders

customer order line items

inventory

policies

business rules

privacy rules

roles

intellectual property1

the administration of the data that represents the stuff of an organization is the science and art of data management, or as it is defined in the dama data management body of knowledge: data management is the development, execution, and supervision of plans, policies, programs and practices that control, protect, deliver and enhance the value of data and information assets.2

in a structured data management program, data stewards, who are domain or subject matter experts for each of these classes of data, work with data management experts to ensure that procedures, processes, standards, guidelines, and business rules for using such information support the goals and objectives of the enterprise. this is called data governance.

data governance

data governance is a strategic, top-down program for data management in which an organization's leadership communicates the core value of data quality and integrity to stakeholders. it includes the development and enforcement of standards and procedures. it requires broad understanding of the data entrusted to the organization, the value and use of that data, upstream and downstream stakeholders, systems, and processes for all decisions and issue resolution. to be effective, data governance requires data stewardship and data stewards. it also requires executive sponsors and support. stewardship is not ownership. a steward is a custodian who is responsible for managing something that belongs to someone else. data stewardship is the managing of information on behalf of the owners of the data. the data steward is in effect the feet on the ground, ensuring the data governance standards are adhered to and evolve as necessary.

1 for any enterprise, we would expect to find over 20 different data models containing at least five unique classes or data entities and the relationships between these classes or data entities. we have built these types of enterprise data models for a number of pharmaceutical companies, communications companies, oil companies, hospitality companies, and government agencies, among others.

2 dama-dmbok guide (data management body of knowledge) introduction & project status. www.dama.org/files/public/di_dama_dmbok_guide_presentation_2007.pdf.
an effective data governance program requires that:

data is created, recorded, and distributed in compliance with standards

an established metadata gathering process clearly describes the requirements and characteristics of the data to be maintained (discussed in part 2 of this book; appendix a contains a variety of metadata)

there is metric-driven adherence to all data definition standards

there is a feedback or notification system to identify inadequacies in the data

there is a data quality assurance process that monitors the integrity of information within the system

there is a data management structure that includes data stewardship, a data governance panel, and an executive layer

there are two data steward roles: data producer stewards and data usage stewards.

data producer stewards are responsible for:

appropriate data content creation and maintenance of quality

appropriate business rules related to all data elements and attributes for which the data steward has responsibility. a data attribute is a fact or characteristic about a data element or entity.

data usage stewards are responsible for:

appropriate data usage quality, including screens and reports

appropriate business rules, including privacy

appropriate presentation: method, design, architecture, and aesthetics (ugly user interfaces are avoided)

in addition to the roles of the data producer and data usage steward, there is the role of data administrator. data administrators are responsible for:

data analysis

data acquisition design

data organizing or classifying

data storage and distribution design

data archiving

ensuring the implementation of business rules

data management (metadata) tool administration (such as a data dictionary)

depending on the size and volume of the data being managed, these roles may be combined or staffed by more than one person.

benefits of data governance

data management programs that have implemented data governance have benefited from features such as:

common names and definitions: if existing data is not well named, it cannot be found and therefore cannot be shared.3 in order to determine whether a data object already exists, common names, based on a standard naming convention, speed the analysis. common names imply that there is a readily understandable business name and an abbreviated short physical name, based in part on a standard abbreviation list.

consistent data: a consistent business definition of the data is important so that the knowledge worker can determine whether a data object with a name similar to his or her data requirement is in fact the same data object.

consistent reports: if data attributes are well named and well defined, then the reports resulting from the analysis or use of those elements are apt to be more consistent because the underlying data is consistent.

less duplication of data: consistent names and definitions will facilitate the discovery of redundant data. data modeling normalization is a process for eliminating duplication.

trust by the business users: well-executed data governance and data stewardship should improve quality and reliability, which, in turn, should increase accuracy and trust in the data analysis process.

less data correction: better managed data should be more accurate and require less correction.

however, the most important feature and benefit of data governance is that the data is being governed and that there are structured, mindful controls and measures in place to manage the data and ensure that its use is in alignment with the organization's overall goals and requirements. in short, the data is being viewed as an asset and is appropriately and meaningfully curated.

3 b. van halle and c. fleming, handbook of relational database design, addison-wesley, 1989, p. 16.
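to tie the steward roles above to the classes of data they govern, here is a minimal sketch of a steward registry. the team names and data classes are invented for illustration; the point is that every class of data has an accountable producer steward, usage steward, and administrator.

```python
# a minimal sketch of recording who is accountable for each class of data.
# names and classes are invented for illustration.

stewards = {
    "customer": {"producer": "crm team lead", "usage": "marketing analyst",
                 "administrator": "data services"},
    "employee": {"producer": "hr operations", "usage": "payroll analyst",
                 "administrator": "data services"},
}

def steward_for(data_class: str, role: str) -> str:
    """look up the accountable steward; fail loudly, not silently."""
    try:
        return stewards[data_class][role]
    except KeyError:
        raise LookupError(f"no {role} steward assigned for {data_class!r} data")

print(steward_for("customer", "usage"))  # marketing analyst
```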
the privacy and data governance/stewardship connection

although it is not often articulated this way, data privacy is a key part of data governance for personal information. in this context, privacy engineering is engineering data governance for personal information into the design and implementation of routines, systems, and products that process personal information. an enterprise's privacy policy (including rules, standards, guidelines, etc.) governs the processing of personal information by an enterprise (and in chapter 4, the privacy policy is viewed not only as a governance concept but also as the meta-set of personal information data protection use-case requirements for privacy engineering). understanding how data management frameworks (such as data governance and data stewardship) fit with privacy frameworks (such as gapp and the oecd guidelines) is key to organizational development. such frameworks and guidelines help to create the necessary roles and responsibilities to build and maintain a privacy-aware and ready enterprise. such understanding will also help to recognize and understand privacy policies as meta-use-case requirements for privacy engineering.

although the connection between data governance and privacy frameworks should be very close, the closeness is not often recognized or leveraged by either domain. too often data privacy teams sit outside enterprise-wide data governance and stewardship initiatives. this is unfortunate. file this under the opportunity not realized category. ultimately, both groups should have a shared goal of ensuring data is curated and cared for as an asset whose value is recognized and cultivated within defined parameters.

data privacy governance frameworks

the oecd guidelines, which were discussed in chapter 2, are one of the better-known privacy governance frameworks. in addition to them, there are other global and regional frameworks, such as the 1995 eu data protection directive (also known as directive eu 95/46/ec), the federal trade commission's version of the fair information privacy principles (fipps), the iso 2700x series of security standards, and the generally accepted privacy principles (gapp), which were created by the american institute of certified public accountants (aicpa) and the canadian institute of chartered accountants (cica) privacy task force. all these and others are worth knowing and learning about to perfect a privacy engineering tradecraft.

how the frameworks align

you can see from table 3-1 how the various frameworks cited align. one of the most comprehensive is gapp, which was designed to create a set of principles that would encompass the key points of the existing frameworks.
table 3-1. how key privacy frameworks align

gapp | oecd guidelines | ftc fipps | eu directive | iso 27002 | apec
management | — | — | — | operations management | preventing harm
collection | collection limitation | — | proportionality | information acquisition | collection limitations
quality | data quality | — | — | — | integrity of personal information
notice | specification of purpose | notice/awareness | transparency | — | notice
use, retention, disposal | use limitation | — | legitimate purpose | asset management | uses of personal information
security for privacy | security safeguards | integrity/security | security | — | security safeguards
access | openness | access/participation | — | access control | access and correction
choice/consent | individual participation | choice/consent | — | asset management | choice
monitoring and enforcement | accountability | enforcement/redress | supervisory authority | compliance | accountability
disclosure to third parties | — | — | transfer of personal data to third parties | — | —
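the alignment in table 3-1 can also be made machine-usable. below is a minimal sketch of a crosswalk lookup built from a few of the table's rows; the dictionary form and function are our own illustration, not part of gapp or any framework, and only three rows are shown.

```python
# a minimal sketch of a framework crosswalk built from table 3-1.
# only three rows are included; a dash means no direct analogue in the table.

ALIGNMENT = {
    "notice": {"oecd": "specification of purpose",
               "ftc fipps": "notice/awareness",
               "eu directive": "transparency", "apec": "notice"},
    "security for privacy": {"oecd": "security safeguards",
                             "ftc fipps": "integrity/security",
                             "eu directive": "security",
                             "apec": "security safeguards"},
    "monitoring and enforcement": {"oecd": "accountability",
                                   "ftc fipps": "enforcement/redress",
                                   "eu directive": "supervisory authority",
                                   "iso 27002": "compliance",
                                   "apec": "accountability"},
}

def counterpart(gapp_principle: str, framework: str) -> str:
    """return the aligned principle, or a dash when no analogue exists."""
    return ALIGNMENT.get(gapp_principle, {}).get(framework, "—")

print(counterpart("notice", "oecd"))       # specification of purpose
print(counterpart("notice", "iso 27002"))  # — (no direct analogue in the table)
```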
generally accepted privacy principles (gapp)

according to the american institute of certified public accountants (aicpa), which developed the generally accepted privacy principles: generally accepted privacy principles (gapp) have been developed from a business perspective, referencing some, but by no means all, significant local, national and international privacy regulations. gapp operationalizes complex privacy requirements into a single privacy objective that is supported by 10 privacy principles. each principle is supported by objective, measurable criteria that form the basis for effective management of privacy risk and compliance in an organization.4

the following are the 10 gapp:

1. management: the entity defines, documents, communicates, and assigns accountability for its privacy policies and procedures.

2. notice: the entity provides notice about its privacy policies and procedures and identifies the purposes for which personal information is collected, used, retained, and disclosed.

3. choice and consent: the entity describes the choices available to the individual and obtains implicit or explicit consent with respect to the collection, use, and disclosure of personal information.

4. collection: the entity collects personal information only for the purposes identified in the notice.

5. use, retention, and disposal: the entity limits the use of personal information to the purposes identified in the notice and for which the individual has provided implicit or explicit consent. the entity retains personal information only as long as necessary to fulfill the stated purposes or as required by law or regulation and thereafter appropriately disposes of such information.

6. access: the entity provides individuals with access to their personal information for review and update.

7. disclosure to third parties: the entity discloses personal information to third parties only for the purposes identified in the notice and with the implicit or explicit consent of the individual.

8. security for privacy: the entity protects personal information against unauthorized access (both physical and logical).

9. quality: the entity maintains accurate, complete, and relevant personal information for the purposes identified in the notice.

10. monitoring and enforcement: the entity monitors compliance with its privacy policies and procedures and has procedures to address privacy-related complaints and disputes.

4 see www.aicpa.org/interestareas/informationtechnology/resources/privacy/generallyacceptedprivacyprinciples/downloadabledocuments/10261378execoverviewgapp.pdf

we will show in later chapters how frameworks like the oecd guidelines and gapp are used as a basis for developing the enterprise's privacy policies, processes, procedures, standards, guidelines, and mechanisms.

iso 2700x: how security standards support privacy

by joel weise, director of security and compliance, hootsuite

the iso 27001:2005 (information technology, security techniques, information security management systems, requirements) and the complementary iso 27002:2005 (information technology, security techniques, code of practice for information security management) standards provide a very good framework for defining, creating, and managing a comprehensive security architecture and governance framework that supports not only security but also privacy. some of the primary advantages are that these are mature standards, internationally recognized and well harmonized with other local and national standards such as the us nist special publication 800-53, recommended security controls for federal information systems and organizations. further, when utilized, the standards can enable compliance with privacy laws, demonstrate an organization's commitment to privacy, and minimize or limit the opportunity for breaches that could affect the security and privacy of data and people, as well as the supporting technology and governance.

the overall value of the standards is to elaborate an information security management system (isms), as noted in iso 27001:2005, based on the security control objectives noted in iso 27002:2005. the isms uses a continuous improvement approach so that it is flexible and can change as new laws, technologies, and threats emerge. the standards further allow for the foundation of a framework that can be audited so that its effectiveness can be measured. such a foundation is critical to supporting security and privacy efforts in an organization. according to the standards, the isms is designed to ensure the selection of adequate and proportionate security controls that protect information assets and give confidence to interested parties. this goal is fundamental to how the isms functions and addresses both security and privacy. the overall benefit of the standards is that they are used to enable the design, configuration, implementation, and use of controls that reflect best practices, and, most important, they allow for interoperability and a lingua franca so that different organizations, security and privacy professionals, auditors, and legal authorities can analyze the use of those controls.

when considering security and privacy controls, one must always consider the costs of such controls. it is important that controls be balanced against their actual and intangible costs. for example, it would not be reasonable to implement a $100 control to address a risk that is only worth $10. a security practitioner must always evaluate controls within the business context of the environment in which they will be implemented. in addition to the actual value, one must consider the intangible costs of controls. for example, even if a $100 control is used to address a risk valued at $1,000, the security practitioner must consider intangible costs such as the impact on morale, productivity, and the general perception of security. if a control negatively impacts the organization, even in such intangible ways, those impacts should be taken into consideration.
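the cost test in the sidebar can be written down directly. here is a minimal sketch, with illustrative figures, of weighing a control's actual and intangible costs against the value of the risk it addresses; the function and threshold are our own illustration.

```python
# a minimal sketch of the control cost test described above: a control is
# only reasonable when its total cost (actual plus intangible) does not
# exceed the value of the risk it addresses. figures are illustrative.

def control_is_reasonable(control_cost: float, intangible_cost: float,
                          risk_value: float) -> bool:
    """reject a $100 control for a $10 risk; weigh intangibles too."""
    return (control_cost + intangible_cost) <= risk_value

print(control_is_reasonable(100, 0, 10))     # False: control costs more than the risk
print(control_is_reasonable(100, 50, 1000))  # True: worthwhile even with intangibles
```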
the 27002:2005 standard has 11 different sections. table 3-2 outlines each of these areas as they apply to privacy.

table 3-2. standards that apply to privacy

policy
overview: the policy is a high-level statement about information security and privacy. it lays down the key information security and privacy directives for an organization.
privacy objective: the policy should reflect the privacy compliance objectives of the organization and reference applicable standards, legal and regulatory mandates, and relevant industry-best practices.

organizing information security
overview: an information security governance structure should span the entire business and technical components of the organization.
privacy objective: the organizational governance structure should include specific individuals and functions that have privacy as their primary mandate.

asset management
overview: asset management is a means for an organization to identify, organize, and manage its information resources.
privacy objective: the maintenance of privacy for data assets is an organizational imperative because many assets include a privacy component.

human resources security
overview: the organization should manage user access rights as well as undertake suitable security awareness training and educational activities. these are all necessary to ensure the human element actively participates in the overall security effort.
privacy objective: in order to ensure employee personal information is secure, protected, and used appropriately, privacy needs to be instilled in an organization's culture through training and awareness activities.

physical and environmental security
overview: valuable it equipment should be physically protected against malicious or accidental damage or loss, including damage or loss due to environmental factors such as an inadvertent loss of power or overheating.
privacy objective: maintaining privacy in an organization's physical space is as important as the security and privacy of data assets.

communications and operations management
overview: controls for systems and network management include a broad range of capabilities, from network management to operational procedures.
privacy objective: in the it world, privacy can only be enabled when appropriate system and network controls are utilized to ensure the security, availability, and reliability of operational resources.

access control
overview: access control includes user access controls for it systems, including operating systems, networks, applications, and data.
privacy objective: access control is critical for the support of privacy in any environment where data and processing resources may contain personal information.

information systems acquisition, development, and maintenance
overview: this section details the policies covering everything from cryptography to processes for specifying, building or acquiring, testing, implementing, and maintaining it systems.
privacy objective: maintaining the privacy of data is predicated upon implementing and supporting an it infrastructure that works as advertised. without that assurance, it is not possible to state that an organization is capable of maintaining the privacy of data.

information security incident management
overview: incident management covers procedures required to manage incidents consistently and effectively.
privacy objective: knowing that intrusions can exacerbate vulnerabilities, maintaining the privacy of data relies upon a comprehensive incident management function. it also alerts you to breaches so you can remedy them as quickly as possible.

business continuity management
overview: this section describes the relationship between it disaster recovery planning, business continuity management, and contingency planning.
privacy objective: to the extent that personal information is retained in backups, disaster recovery and business resumption processes must ensure continued control over those assets.

compliance
overview: compliance includes not only compliance with legal requirements but also with security and privacy policies and standards.
privacy objective: compliance with relevant security and privacy policies is integral to ensuring privacy, as it enables users to validate adherence to those policies.

impact of frameworks on the privacy engineer

privacy engineers must know the oecd guidelines, gapp, and the other frameworks, as well as their organization's own privacy policies, standards, and guidelines, well enough to understand their purpose and limitations. in doing so, any creative innovation should tie into a rationalized set of existing requirements. this will, in turn, make it easier to implement such an innovation or manage change effectively as a logical leap forward in achieving the ultimate goal of efficiently, effectively, and ethically protecting information about people.

if data is processed in a way that honors or adheres to the oecd guidelines, gapp, or one of the other frameworks, then, under most data privacy regimes, it will likely be considered fair and legitimate processing, as most privacy laws are based on the fipps in some fashion (and these other frameworks essentially follow the fipps). however, as noted later, each specific case or legal regime can and often does interpret the fipps, adherence to them, and the required level of competency differently. in part 2 of this book, we will discuss how privacy rules are developed based on privacy policies, processes, procedures, standards, guidelines, and best practices that are derived in part from these frameworks. these privacy rules will be used to implement mechanisms within systems to satisfy privacy requirements.

frameworks are not the same as laws

how each enterprise addresses privacy requirements at a deeper, more granular level is a decision based on many factors, such as size, jurisdiction, risk profile, internal policies and public positions, and, most important, what kind of personal information is involved (i.e., how much and how sensitive) and whose data it is. to get to this level of granularity in understanding requirements, you should work with legal resources with privacy domain expertise and look at the specific laws and regulations that govern the space in which you are working, as well as applicable internal policies and requirements. for this reason, the techniques for privacy engineering discussed in this book, and the issues they address, are characterized at a framework level, not at the level of a specific statute or regulation.
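to make the earlier point about privacy rules concrete, here is a minimal sketch of turning a framework principle (gapp's use, retention, and disposal) into an enforceable rule. the 30-day window, purpose name, and record layout are illustrative assumptions, not requirements from any framework or law.

```python
# a minimal sketch of a retention rule derived from a policy-level principle:
# data held past its purpose's retention window is flagged for disposal.
# the window and purpose names are illustrative assumptions.

from datetime import datetime, timedelta, timezone

RETENTION = {"order_fulfillment": timedelta(days=30)}  # per stated purpose

def overdue_for_disposal(collected_at: datetime, purpose: str) -> bool:
    """true when the record has outlived the retention window for its purpose."""
    limit = RETENTION.get(purpose)
    if limit is None:
        return True  # no documented purpose -> no basis to retain
    return datetime.now(timezone.utc) - collected_at > limit

record_time = datetime.now(timezone.utc) - timedelta(days=45)
print(overdue_for_disposal(record_time, "order_fulfillment"))  # True -> dispose
```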
ubiquitous computing requires global privacy law awareness

by francoise gilbert, founder and managing director of it law group and author and editor of global privacy and security laws

as citizens, we might feel allegiance to a particular region where our ancestors were born and our family roots were formed, but these boundaries are artificial. when looking at the earth from the 10,000-foot level, states merge into one another seamlessly. clouds that fly over country borders ignore the passport control booths.

like their geophysical cousins, the clouds in which our electronic files are stored and processed know no borders. our smartphones, tablets, laptop computers, smart watches or glasses, and the underlying technology into which we plug our equipment allow us to be connected at all times, from anywhere, to anyone. data, like the genie, have jumped out of their bottle. they are taking a path of their own that does not stop at the edge of the device that was used to collect them or at the political border of the country in which that device is operated.

with interconnectivity and ubiquitous computing available to us, we can, while seated on a bench in the middle of golden gate park in san francisco, access or modify files that are processed in argentina by a payroll service established in france. these files may be simultaneously backed up in singapore and replicated for disaster recovery purposes in new zealand. they may pertain to the employees of an australian company who telecommute to work from south africa. this might look like a law school exam hypothetical. it happens increasingly in the 21st-century world of virtual companies or virtual employees, where intangible intellectual property is frequently the most valuable asset of a business.

which privacy or data protection law applies to this hypothetical? which state or country has jurisdiction over a particular dataset? ask five different judges, and you are likely to receive five different answers. the laws of several countries might apply, and more than one court could assert jurisdiction: that of the country where the data controller is located; that of the countries where the servers that process or store the data are located; that of the country where the data subject is physically located, or where his employer is established to do business, or where his payroll is generated.

countries are very protective of their citizens and want to apply their laws (or are asked by plaintiffs to apply their laws) to matters that may take place within their boundaries or affect their citizens. see, for instance, the current article 3 (territorial scope) of the draft eu data protection regulation, which is expected to supersede the 1995 eu data protection directive. this provision might allow the application of the eu data protection laws to the hypothetical above, due to the fact that the payroll company is established in the eu, even though the data subjects are located in south africa and their employer in australia. article 3 provides in part (emphasis added): this regulation applies to the processing of personal data in the context of the activities of an establishment of a controller or a processor in the union, whether the processing takes place in the union or not.
this regulation applies to the processing of personal data of data subjects residing in the [european] union by a controller or processor not established in the union, where the processing activities are related to: (a) the offering of goods or services, irrespective of whether a payment of the data subject is required, to such data subjects in the union; or (b) the monitoring of such data subjects. this regulation applies to the processing of personal data by a controller not established in the union, but in a place where the national law of a member state applies by virtue of public international law.

we cannot rely on the law of a single country as the framework in which to develop policies, practices, and procedures or evaluate the risk to which data might be exposed. ubiquitous computing, business process outsourcing, and cloud computing are available to all companies. size no longer matters. the proverbial flower shop around the corner may have its accounting or payroll data processed or stored on another continent, in the same manner as a fortune 10 company can.

privacy professionals must be aware, and keep abreast, of the legal developments regarding information privacy and security laws in all the countries in which the personal data in their clients' custody are or might be located. it is only with this global knowledge and legal awareness that they will be able to properly evaluate and anticipate the legal constraints to which these data might be subject.

although most of the world's data protection laws take an approach to the protection of personal information, personal space, and intimacy that is loosely based on similar fair information privacy principles (whether they are expressed in the oecd guidelines, the apec privacy framework, or another document), the devil is in the details. each country's legal framework is different. when these principles are implemented, each country has its own view and its own sensitivity to a particular topic. keeping abreast of these developments is difficult and time consuming. it is not that simple to know and appreciate a country's vision of privacy and what is necessary to achieve compliance in that particular country. it is a major mistake to take a one-size-fits-all approach or to ignore the legal and cultural nuances among countries, even neighboring ones, or the historical foundations that have resulted in a certain legal system or certain local customs or behaviors. a formality that does not exist here may be required there and may be attached to prison terms elsewhere in cases of delinquency.

privacy is a cross-functional and complex concept. unlike tax, real property, or corporate law, privacy laws do not have hundreds of years of history in the making. nevertheless, all over the world, there is more to privacy than what judges or legal scholars have designed. the social aspects and the individual, cultural, or ethnic sensitivities are also part of the foundation. before becoming regulated, privacy evolved in great part outside courts, being shaped slowly by reactions to significant or traumatic events.
privacy concepts and privacy laws may result from societal pressures, changes in mores and habits, or reactions to government abuses, or may respond to technology advances. in each country, they are a reflection of the country's culture, history, and sensitivity. at times, the religious and philosophical beliefs of its citizens may have also influenced the way in which a country designed and implemented (or not) data protection principles and protected (or not) the privacy rights of its citizens. developing a global privacy program requires an appreciation and understanding of these nuances and sensitivities.

the world of privacy and data protection is uniquely complex. as the field evolves and, concurrently, ubiquitous computing becomes the norm, it is indispensable to take a global approach to privacy and data protection while remaining aware of the significant discrepancies between the laws, regulations, guidelines, and sensitivities that exist and will remain at the micro level in each country or state.

privacy by design

privacy by design (pbd) is a concept popularized by ann cavoukian, the commissioner for information and privacy for the province of ontario, canada. it was developed to ensure that privacy was protected and that people gained control over their information and the information of their enterprises. in 2010, at their 32nd annual conference, the international data protection and privacy commissioners recognized pbd as an essential component of fundamental privacy protection.5 it teaches the following seven foundational principles:6

1. proactive not reactive; preventative not remedial

2. privacy as the default setting

3. privacy embedded into design

4. full functionality: positive-sum, not zero-sum

5. end-to-end security: full lifecycle protection

6. visibility and transparency: keep it open

7. respect for user privacy: keep it user-centric

5 resolution on privacy by design, 32nd international conference of data protection and privacy commissioners, jerusalem, israel. www.justice.gov.il/nr/rdonlyres/f8a79347-170c-4eef-a0ad-155554558a5f/26502/resolutiononprivacybydesign.pdf

6 foundational principles, privacy by design. www.privacybydesign.ca/index.php/about-pbd/7-foundational-principles/

next-generation privacy for a next-generation world: privacy by design resolution

by ann cavoukian, phd, information and privacy commissioner, ontario, canada

in october 2010, a landmark resolution was unanimously passed by the international privacy commissioners and data protection authorities at their annual conference, recognizing privacy by design (pbd) as an essential component of fundamental privacy protection. the resolution also:

encouraged the adoption of the principles of privacy by design as part of an organization's default mode of operation; and

invited data protection and privacy commissioners to promote privacy by design, foster the incorporation of its foundational principles in privacy policy and legislation in their respective jurisdictions, and encourage research into privacy by design.

since then, pbd has become a global operation, having been translated into 35 languages. public policymakers in the united states, europe, and australia have issued proposals to express pbd in reformed information privacy governance and oversight regimes. more than a concept, pbd has become a legal and regulatory requirement in major jurisdictions around the world. with the world evolving so rapidly, privacy protections must also evolve in equal measure.
evolving privacy contexts

privacy is often said to be in crisis today as a result of numerous developments: leapfrogging information and communications technology developments; the advent of social, cloud, mobile, and ambient computing; evolving cultural norms; and a global patchwork of outdated privacy laws. the information privacy solution requires a combination of data minimization techniques, credible safeguards, meaningful individual participation, and robust accountability measures, informed by an enhanced and enforceable set of universal privacy principles adapted to modern realities.

pbd evolved from early efforts to express fair information practice principles directly in the design and operation of information and communications technologies, resulting in privacy-enhancing technologies (pets). over time, the broader systems and processes in which pets were embedded and operated were also considered. these include organizational practices and networked information ecosystems. pbd principles emphasize proactive leadership, systematic methods, and demonstrable results.

proactive not reactive; preventative not remedial

pbd principles have changed the global privacy conversation by shifting emphasis away from reactively detecting and punishing privacy offenses after they occur to minimizing risks and preventing harms before they occur. build it in early is now a common message from data protection authorities around the world. pbd principles aspire to the highest global standards of practical privacy possible: to go beyond compliance and achieve visible evidence of leadership, regardless of jurisdiction. good privacy doesn't happen by itself; it requires proactive leadership and continuous goal setting at the earliest stages. global leadership begins with explicit recognition of the benefits and value of adopting strong privacy practices, early and consistently (e.g., preventing data breaches and harms from arising). this implies:

a clear commitment, at the highest levels, to prescribe and enforce high standards of privacy, generally higher than the standards set out by global laws and regulation;

a demonstrable privacy commitment that is shared by organization members, user communities, and stakeholders in a culture of continuous improvement;

establishing methods to recognize poor privacy designs, to anticipate poor practices and outcomes, and to correct any unintended or negative impacts, well before they occur, in proactive, systematic, and innovative ways; and

continuous commitment and iterative processes to identify and mitigate privacy risks.

the preventative and systematic approach to engineering privacy is often associated with privacy-enhancing technologies, particularly in europe. although pbd is often best illustrated through specific technologies (the more user-centric the better), it is the organization that has become a more central and effective focus for applying pbd principles, especially in view of the requirement to comply with privacy and data protection laws. being proactive and preventative requires a clear understanding of the strategic risks, challenges, and rewards of applying strong privacy throughout an organization and across information systems, in a comprehensive manner.

privacy embedded into design

privacy promises are not enough; they must be implemented in systematic and verifiable ways.
information and communications technologies, systems, and networks are highly complex and dynamic in nature. data processing is interdependent and tends to be opaque in nature, requiring more trust than ever from stakeholders and users for sustainability. these are not ideal conditions for ensuring that accountability, data protection, and individual privacy will thrive. privacy commitments and controls must be embedded into technologies, operations, and information architectures in holistic, integrative, and creative ways:

holistic, because broader contexts must be considered to properly assess privacy risks and remedies;

integrative, because all stakeholders should be consulted in the development dialogue; and

creative, because embedding privacy rights and controls at times means reinventing the choices offered, because existing alternatives are unacceptable.

a systematic, principled approach to operationalizing privacy should be adopted, one that relies on accepted standards and process frameworks amenable to external reviews and audits. all fair information practices should be applied with equal rigor, at every design step. wherever possible, detailed privacy impact and risk assessments should be carried out, documenting the privacy risks and the measures taken to mitigate those risks, including consideration of alternatives and the selection of metrics. the privacy impacts of the resulting technologies, processes, and information architectures should be demonstrably minimized and not easily degraded through use, misconfiguration, or error. in the united states, the federal trade commission (ftc) has begun to require some organizations to put in place comprehensive, auditable privacy programs. in the european union, prior checking and other due diligence requirements are becoming mandatory for organizations to demonstrate compliance with privacy laws.

full functionality: positive-sum not zero-sum

privacy is not an absolute value. to design practical, yet effective, privacy controls into information technologies, organizational processes, or networked architectures, privacy architects need to acknowledge many legitimate (and, yes, sometimes competing) goals, requirements, and interests and accommodate them in optimized, innovative ways. the pbd principle of full functionality requires going beyond privacy declarations and best efforts to demonstrate how data processing and other objectives have been, and are being, satisfied in a doubly-enabling, win-win model. external accountability and leadership are enhanced by applying this principle, which emphasizes transparency and measurable outcomes of multiple functionalities:

when embedding privacy into a given information technology, process, system, or architecture, it should be done in such a way that full functionality is not impaired and all legitimate interests are accommodated and requirements optimized;

privacy is often positioned in a zero-sum manner, that is, having to compete with other legitimate interests, design objectives, and technical capabilities in a given domain. pbd rejects this approach; it embraces legitimate non-privacy objectives and accommodates them in an innovative, positive-sum manner; and

all interests and objectives must be clearly documented, desired functions articulated, metrics agreed upon and applied, and unnecessary trade-offs rejected, in favor of finding a solution that enables multifunctionality.
additional recognition is deserved for creativity and innovation in achieving all objectives and functionalities in an integrative, positive-sum manner. organizations that succeed in overcoming outmoded zero-sum choices demonstrate global privacy leadership. this principle challenges policymakers, technologists, and designers, among others, to find ways to achieve better privacy in a given technology, system, or domain than is currently the case and to document and demonstrate achievements that become best practices.

there are many examples of positive-sum transformative technologies that achieve multiple objectives in tandem in a privacy-enhancing manner. for example, biometric encryption (be) achieves positive identification without the need for centrally stored templates. be has been successfully deployed across ontario gaming facilities to identify gamblers requesting to be barred from entering the premises. the positive-sum pbd principle has also been successfully applied in a wide range of areas: road toll pricing, smart meters, whole-body image scanners, rfid-enabled systems, geolocation-enabled services, and many other technologies and services.

the creation, recognition, and adoption of pets as a means to achieve pbd operational goals is being actively promoted by the european commission, not only as a major ongoing research funding initiative under the framework programme but notably in the context of the eu review of, and proposed amendments to, the data protection regulation. current work by international data protection authorities to define accountability is also establishing common definitions and best practices that help advance organizational pbd practices. similar work is also under way in international standards groups to define privacy implementation, assessment, and documentation methods. the preparation, use, and publication, whether mandatory, contractual, or voluntary, of privacy impact assessments and privacy management frameworks are also on the rise. we are seeing the growth of standardized privacy evaluation, audit, and assurance systems, innovative co-regulatory initiatives, certification seals and trust marks, and other criteria. enhanced diligence and accountability measures are consistent with the pbd emphasis on demonstrating results. the publication of successful case studies adds illustrative and educational value for others to emulate. perhaps the most exciting chapters on achieving pbd results have yet to be written, as public policymakers on both sides of the atlantic ocean actively propose weaving the pbd framework and principles into the fabric of revised privacy laws and into strengthened systems of regulatory oversight. the best is yet to come.

like privacy engineering, pbd teaches that privacy is also a business issue. the building of consumer trust provides a competitive advantage; just one data breach interferes with this trust. pbd, like privacy engineering, recognizes that both physical design and information technology design are crucial to developing an effective privacy program. the privacy designer needs to carefully construct physical security to protect the privacy of both data facilities and paper records. information technology design can enhance privacy by the use of pets (discussed in detail in chapter 6), such as a unique identifier with no specific meaning, and by utilizing encryption correctly. security and privacy work together and do not work at cross purposes.
it is important that privacy be embedded into the it system as part of the design process, baked in so it will not interfere with the business purpose of the system but will actually enhance the business objectives.

how privacy engineering and privacy by design work together

privacy engineering is a concept for which pbd is a facilitator. pbd provides valuable design guidelines that privacy engineers should follow. in turn, privacy engineering adds to and extends pbd. it provides a methodology and technical tools based on industry guidelines and best practices, including the unified modeling language. in the rest of this book, we will discuss the methodologies and the various modeling processes used to develop privacy mechanisms that can be used independently or plugged into new and existing enterprise systems to enhance their ability to implement enterprise privacy policies.

conclusion

this chapter explained how privacy and other data management frameworks overlap and can be leveraged as an overall governance framework for personal information. data management teams and privacy functions have common goals: the health, hygiene, and well-being of the data under their respective custodianship. while there may be different approaches to data management and different privacy frameworks, there are strong points of similarity that can be harmonized to arrive at a functional set of policies and requirements for an enterprise. chapter 4 will discuss how these privacy policies are developed and how an organization's privacy policy can be coordinated as the meta document for use-case requirements.

part 2: the privacy engineering process
<Chapter>: chapter 4
<Section Text>: developing privacy policies

if at first the idea is not absurd then there is no hope for it. albert einstein

don't skip this chapter because the information presented seems obvious or is something you might feel you want to pass off to your legal team. the search for solid engineering requirements starts with solid policy. by policy, we mean the rules that govern, not the privacy policy we associate with the web site that is never read. this is not a chapter about traditional policy creation. the privacy policy is the silk road (in the classic sense of the ancient asian silk road, not the contemporary online black market web site). it leads the organization to this new world of innovation and privacy engineering. it brings multidisciplinary actors and actions together and combines the best of legal, technical, and process-oriented teams for fair and legitimate processing of personal information (or privacy). this privacy policy becomes the basic map or blueprint for the build out. it ultimately should be viewed as the meta set of use-case requirements.

this chapter covers the development of policies that will be used as the basis for the development of the controls and measures that protect personal information (i.e., privacy standards, guidelines, business rules, and mechanisms). when we discuss policy creation in this context, we are talking about starting with business requirements (a task or series of tasks needed to serve a goal) and functionality goals. once goals and basic functions are defined, we add requirements driven by applicable law. we then fit and bend our requirements to view the policies we must create through a lens of functionality (i.e., each action taken or demanded may be viewed as a requirement specification that must be included in a system). that system may be an enterprise, a subunit, an end-to-end processing cycle, an application, an element of functionality, or a person-managed governance activity, among others. there is no exclusive list of what constitutes a system. every discussion in this chapter must be considered in this operational, requirement-driven context; otherwise, it will be easy to slip into traditional policy mode. this is not a discussion chief privacy officers (cpos, or whoever is leading the privacy function) will have with every privacy engineer; however, every cpo must consider the output of his or her labor in terms of the concrete and measurable requirements and outcomes discussed here.

following chapters will show unified modeling language (uml) and systems creation techniques for metadata as a methodology for taking the requirements derived from privacy policies and other technical sources and creating solutions that reflect those requirements. where neither systems nor features nor privacy-enhancing technologies can meet the requirements set forth, governance, training, and leadership systems involving the human players in the privacy engineering drama are discussed.

elements of privacy engineering development

privacy engineering is the discipline of developing privacy solutions that consist of procedures, standards, guidelines, and mechanisms. part 2 covers the process of developing privacy solutions, as depicted in figure 4-1.
the elements of the process of developing a privacy solution, based on a set of privacy policies, are:

enterprise goals: these must be reflected in and aligned with privacy engineering solutions, including their privacy policies, standards, and guidelines. to make this happen, a privacy development team (a team consisting of members from a formal privacy function, business-oriented data stewards, privacy engineers, security analysts, and it data analysts; data governance was discussed in chapter 2, and the organizational aspects of privacy engineering will be addressed in chapter 11) must first understand the goals and objectives of the enterprise in which the solution will operate. for the purposes of this book, enterprise includes organizations large and small that manage or otherwise process data. this definition would, of course, include government entities that may be governed by specific or additional rules and regulations; the organizing principles will still apply.

user/individual goals: these must be incorporated to develop effective and flexible privacy policies that will be accepted by end users and individuals. the team members must understand the goals and objectives (and privacy sensibilities) of the end users and individuals who will participate in the system or become the data subjects for pi managed by the system.

privacy policy: development of a privacy policy is discussed in chapter 4. the policy plays a key role in guiding how privacy engineering is applied.

privacy requirements: requirement gathering is critical for effective policy creation and solution development. chapter 5 describes the application of use cases for requirement collection and introduces a unique use-case metadata model.

privacy procedures and processes: these are the overall privacy activities (procedures) and their human or automated tasks (processes). chapters 5 and 6 cover developing and using these as part of the privacy engineering discipline. mandated standards and recommended guidelines factor into the creation of procedures and processes. it is procedures, processes, standards, and guidelines that translate policy into reality.

privacy mechanisms: these are the automated solutions built with software and hardware to enforce privacy policies. examples are created for illustration in chapters 7, 8, and 9 using the development process presented in chapter 6, including a privacy engineering component and how it can fit within an application system environment.

privacy awareness and readiness preparation: as part of developing a privacy-engineered solution, the team will engage with various stakeholders so they are aware of what the privacy policy is and what it does. the privacy team works together with these stakeholders to address how the privacy-engineered solution could affect their roles and responsibilities. this subject is addressed in chapter 10.

quality assurance: this is required to ensure that the privacy engineering solution functions properly, as well as satisfies enterprise goals, user goals, and accepted privacy standards within the context in which they are to operate. quality assurance for privacy solutions is discussed in chapter 10.

feedback loop: this will ensure that the privacy engineering solution is improved continuously, as the solution will periodically be quality assessed or audited, with the ability to do so built in as a technical and procedural requirement.
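to make the traceability among these elements concrete, the following is a minimal sketch, assuming a simple in-memory model (the class names, fields, and reference scheme are our hypothetical choices, not a prescribed implementation). it links policy statements to requirements and mechanisms so the feedback loop can flag compulsory requirements that nothing enforces:

```python
# a minimal sketch (our illustration; all names are hypothetical) of how the
# elements above can be linked so every mechanism traces back to policy.
from dataclasses import dataclass, field

@dataclass
class PolicyStatement:
    ref: str   # e.g., "P-03"
    text: str  # e.g., "data collected must be relevant to the service provided"

@dataclass
class Requirement:
    ref: str                 # e.g., "R-03.1"
    source: PolicyStatement  # the policy statement this requirement derives from
    description: str
    compulsory: bool         # True for "must" language, False for "should"

@dataclass
class Mechanism:
    name: str  # an automated control built with software or hardware
    satisfies: list[Requirement] = field(default_factory=list)

def feedback_report(mechanisms: list[Mechanism],
                    requirements: list[Requirement]) -> list[str]:
    """the feedback loop as a periodic audit: list compulsory requirements
    that no mechanism claims to satisfy."""
    covered = {r.ref for m in mechanisms for r in m.satisfies}
    return [r.ref for r in requirements if r.compulsory and r.ref not in covered]
```

the design point is the links themselves: if a mechanism cannot name the requirement it satisfies, or a requirement cannot name its policy statement, something is missing from the solution.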
after reading part 2, whether you are a privacy professional or an engineer without a privacy background, you should have an understanding of how privacy is engineered into systems.

privacy policy development

balanced with the enterprise requirements (where the data value of the solution should always exceed its risks when used in context), individual or user goals must be considered as part of the final articulation of the enterprise goals. the mission, goals, and objectives of the enterprise must be recognized, understood, and analyzed to determine a privacy-engineered solution's requirements. from these, the privacy policies that will govern the privacy engineering solution can be determined. privacy policy development should be done at two levels: a general level, relevant to all parts of the enterprise, and an enterprise-specific level, which will often be more specific and detailed than an enterprise-wide policy. although drafting privacy policies can be the subject of entire legal or organizational tomes, this chapter will go into enough depth so that the principles that comprise privacy policies are sufficiently understandable as the foundational layer of privacy engineering and use-case requirements. these policies enable the management of the principles as a framework, which in turn can also lead to:

the development and deployment of privacy-engineered systems

the exciting missing beast: the framework to build and innovate the privacy-engineered data-centric networks, tools, and solutions of the future

what is a good policy?

a policy is considered good based on the manner in which it functions as well as its contextual fit (i.e., how well it balances the needs and objectives of the enterprise with the objectives of the users or customers or employees whose data ultimately flows through that organization). a good policy:

arises from well-articulated enterprise goals, which are based on a clear statement of belief or purpose

describes what is wanted or intended by the various parties of interest impacted by the enterprise

explains why these things are wanted

provides positive direction for enterprise employees and contractors

provides transparency to the users of systems or individuals interacting with the enterprise

is flexible enough so there can be adjustments to changing conditions without changing the basic policy itself

is evaluated regularly

can be readily understood by all

policy statements should be written in clear, concise language. a privacy policy should contain everyday words and short sentences and avoid the use of acronyms. if actions are compulsory, "must" should be used. if actions are recommended, "should" should be used. the policy must be practical and easy to implement.
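because parts of this drafting guidance are mechanical, a draft statement can even be screened automatically. the following toy lint is a sketch of that idea only; the word-count threshold and the rules it checks are our assumptions, not a published standard:

```python
# a toy "policy lint" (our sketch) that checks draft policy statements
# against the drafting guidance above: short sentences, explicit
# must/should obligation language, and no acronyms.
import re

MAX_WORDS = 25  # assumed threshold for a "short sentence"

def lint_policy_statement(text: str) -> list[str]:
    findings = []
    for sentence in re.split(r"(?<=[.!?])\s+", text.strip()):
        words = sentence.split()
        if len(words) > MAX_WORDS:
            findings.append(f"long sentence ({len(words)} words): {sentence[:40]}...")
        if words and not re.search(r"\b(must|should)\b", sentence, re.I):
            findings.append(f"no explicit must/should obligation: {sentence[:40]}...")
    if re.search(r"\b[A-Z]{2,}\b", text):  # assumes conventionally cased text
        findings.append("contains acronyms; prefer everyday words")
    return findings

# example: the second sentence is flagged for lacking an obligation level.
print(lint_policy_statement(
    "Personal data must be collected only for stated purposes. "
    "Records are archived yearly."))
```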
designing a privacy policy

some organizations begin taking action on mitigating business risks before an official privacy policy is published, but defining the policy should be a high priority. sadly, many enterprises copy policies they find on other companies' web sites and post what amounts to an ad hoc policy of their own before any due diligence has been exercised with regard to knowing their personnel's, processes', or technology's requirements. it's a sad fact, but a vast majority of enterprises own what we call complianceware: stuff that they purchase, license, or otherwise acquire just in case there is a data breach or a regulatory inquiry at a later date but that they never actually completely deploy. an example of this is where an enterprise purchases an identity management suite of products and sets the roles to employee or nonemployee without regard to a good policy that would illuminate why individuals require access to process data or how the roles or the employees themselves should be protected and governed. a good privacy policy should be linked closely to this type of deployment. it will set its requirements before deployment or, better yet, before purchase or development if the identity solution is homegrown. the next section describes the key considerations for crafting an effective privacy policy as well as how to maintain it.

what should be included in a privacy policy?

policies must be designed to meet a complex set of competing needs:

local and international legal, jurisdictional, and regulatory necessities, depending on the scope of the enterprise

organization or business requirements

permission for the marketing-customer relationship for management or business intelligence

brand identity

industry standards

usability, access, and availability for end users of information systems

economic pressure to create value through efficient sharing or relationship building

enforceability and compliance

ethical obligations

realistic technology capabilities and limitations

everything with a digital heartbeat is connected through dynamically formed relationships governed by privacy, security, and trust policies. this means there may be multiple interactive or cascading privacy policies based on the role of the various parties of interest:

customers

employees or contractors

third parties impacted by the enterprise

intellectual property owners

data types

each privacy policy should start with the data type and its anticipated lifecycle and be aligned with the enterprise brand and the enterprise standards of conduct. the policy should add value by managing data:

respecting and managing regulatory and industrial standards compliance

using personal information and confidential data related to it safely and ethically

reconciling differences and leveraging synergies between overlapping or competing enterprise policies and goals for other areas, such as audit or litigation data preservation, records management, and physical and it security

establishing a basis for objective respect and trust between an enterprise and its customers, employees, and other impacted groups

as discussed in chapters 2 and 3, there are several sets of external standards and guidelines defining privacy requirements, including the oecd guidelines for the protection of privacy and transborder flows of personal data, gapp, pbd, sectorial and competition laws in the united states, apec privacy accountability frameworks, and the european union (eu) data protection directive (and member states' implementation of its requirements). (the oecd guidelines on the protection of privacy and transborder flows of personal data are available at www.oecd.org/document/18/0,3343,en_2649_201185_1815186_1_1_1_1,00.html. a downloadable version of the generally accepted privacy principles (gapp), along with additional information about their development and additional privacy resources, can be found at www.aicpa.org/privacy. information about the european union's directive on data protection is available at http://ec.europa.eu/justice_home/fsj/privacy/index_en.htm.) these external guidelines and principles can provide a framework for ensuring that the privacy policy will offer compliance within the related jurisdictional area. it should, of course, be noted in the privacy requirements that:

not all laws are granular enough to provide one objective interpretation that must be instantiated

not all rules and regulations can always be harmonized to be free of directly conflicting standards and so-called best practices

what is possible is an objective working framework that will become the policy for the enterprise and, ultimately, the basis for process and technology policies, as described in the sidebar.
internationalization: developing a global privacy policy

by dr. mark watts, head of information technology law, bristows

europe is not a country. it isn't. and while this will be blindingly obvious to most people reading this book, it's surprising how often i hear it assumed that europe is essentially a country, with a single, homogenous data privacy law that sets out the rules applicable across the entire region (50 or so countries). if only life were that simple. if only european privacy rules were that simple. sadly, they're not. and the point here is not to ridicule anyone's understanding of european geography or laws, but rather to make the point that, although when working internationally in privacy we all make assumptions (we have to, to rationalize the almost overwhelming legal complexity involved), making the wrong assumptions can quickly cause a project to go astray.

perhaps the most common working assumption i see crossing my desk is that the data privacy laws of a particular country are either (i) completely and utterly different from those that apply at home (usually the country of the parent company), so none of our existing data privacy policy can possibly apply, or (ii) absolutely identical to those that apply at home, and so we don't need any special consideration or handling in the privacy policy; in other words, the international privacy policy can simply be the same as a domestic one. unfortunately, most of the time, neither working assumption works particularly well. a sensible, well-drafted data privacy policy written to meet, say, north american legal requirements will contain much of relevance and application to europe and beyond, because good information handling practices, such as transparency, data quality, and security, are just that: good practices that should transcend country borders. but equally, to assume that that's all there is to it and that, say, north american laws can be exported globally would be complacent and would ignore significant cultural differences and priorities, not to mention historical sensitivities. many an international company has come unstuck making this assumption. for example, assuming the laws that relate to monitoring employee communications in, say, finland are the same (and so just as permissive) as those in the united states (an assumption we see a lot) could easily land a company in hot water. equally, for a european-headquartered company to assume that there are no security breach notification laws in the united states simply because there are so few at home in europe at the moment can be just as problematic. a privacy policy built on shaky, overly broad assumptions can put a company, even a company that is trying very hard to do the right thing, in breach of applicable law, despite it following its privacy policy to the letter. perhaps more worryingly, sometimes a breach can occur precisely because a company followed a privacy policy (admittedly, a poor privacy policy) to the letter.
shaky assumptions can lead to another, more subtle but equally problematic risk: the risk of unnecessary overcompliance. now, this isn't to suggest that companies should develop policies requiring only the minimum amount of compliance required by local law (essentially as little as the company can get away with), but would a company really want to apply the highest common denominator, the strictest standard anywhere, to all of its operations worldwide? surely not. for example, would it really be wise to export the highly restrictive finnish laws on monitoring employee communications to every country where a company does business? most unlikely, because although this approach would ensure compliance with the communication monitoring laws of almost all other countries where the company has employees, it could seriously hamper its business operations in countries with more permissive regimes. this isn't a risk of noncompliance; it isn't a risk of breach. it's a risk of overcompliance that can fetter existing business processes, potentially inhibit sales, and, just as importantly for the privacy professional and privacy engineer, can damage their internal credibility within the company. all in all, overcompliance can be as much of a problem for the company as undercompliance.

the problem here is not that broad international assumptions are being made. they have to be. a global company with operations subject to the data privacy laws of hundreds of different countries cannot realistically be expected to identify every last detailed requirement of every last applicable law because, at least from a regulatory point of view, the world is still a very big place. so developing an international privacy policy (including all procedures, consent statements, contracts, and other supporting documents that go with it) has to involve making certain assumptions. it's just that they have to be the right assumptions. you have to know when it's safe to assume (or indeed, force) conformity between countries at a privacy policy level and when to leave enough room to accommodate important local differences in countries' laws.

where does one start? as good a place as any for most companies is to think carefully about what it actually wants its international privacy policy to do. is it meant to be some all-singing, all-dancing document that seeks to set out the various compliance requirements for each of the countries where the company does business? or is it intended to be something with less lofty ambitions, merely a common set of requirements that will improve compliance everywhere while accepting that in certain countries there will be a delta between the requirements of the policy and those of applicable law? well-advised companies adopt the second approach, prioritizing the simplicity of a common, global policy that leads to a good (and hopefully even very good) level of compliance everywhere over the more comprehensive and unwieldy, not to mention expensive, approach directed at full compliance everywhere, at least on paper and most likely only on paper. by adopting the second approach, companies are recognizing that there will inevitably be some specific (but hopefully minor) country legal requirements that are not covered by the policy in detail and which may be complied with not to the letter but only in spirit. in an attempt to plug the most significant of any known gaps like this, companies often develop country-specific annexes or sections in their privacy policy.
an example of this would be a section specific to data collected in switzerland that extends the privacy policy's requirements to information about legal entities (e.g., companies) as well as individuals (i.e., human beings). to include such an onerous requirement in the main body of the data privacy policy would be to export the swiss requirement globally unnecessarily, requiring all companies to apply the policy in full to information about legal entities even though it is not legally required where they operate. including the obligation in an additional annex to the policy and restricting it to data collected in switzerland enables compliance with the local requirement while limiting its impact geographically (a sketch of this annex approach appears after this sidebar). but, tweaking the facts slightly, what if the parent company developing the privacy policy is, say, a swiss bank? in this case it may be desirable or even essential to require its global operations to handle data about legal entities as if they were all subject to swiss data privacy law. this would suggest that the swiss provision should be included in the body of the privacy policy rather than being buried in an annex limited to data collected in switzerland. and this is how international privacy works; there are few if any invariably true assumptions that can be built into any global privacy policy. they always have to be considered and reconsidered on the particular facts for the company developing the policy. done well, the result can be a robust privacy policy with a good degree of conformity from country to country, capable of generating clear technical requirements that give the privacy engineers a chance of coding privacy. done poorly, the result can be a policy that's unnecessarily strict, or with too many exceptions, or which is simply too vague to be useful, any one of which can require last-minute changes to the privacy policy (and consequently any technical requirements based on it), something which, in my experience, coders really don't seem to like.
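the following is a minimal sketch of the "common global policy plus country annexes" approach described in the sidebar (our illustration, not the author's; the country codes, fields, and the encoding of the swiss legal-entities rule are assumptions for illustration):

```python
# base policy applies everywhere; narrow country-specific annexes extend it
# only where local law demands more, avoiding global overcompliance.
BASE_POLICY = {
    "applies_to": {"individuals"},  # personal data about human beings
    "breach_notification_required": True,
}

COUNTRY_ANNEXES = {
    # swiss law (as described above) also covers data about legal entities,
    # so an annex widens the policy's scope for data collected there only
    "CH": {"applies_to": {"individuals", "legal_entities"}},
}

def effective_policy(country_of_collection: str) -> dict:
    """merge the base policy with any annex for the country of collection."""
    policy = dict(BASE_POLICY)
    policy.update(COUNTRY_ANNEXES.get(country_of_collection, {}))
    return policy

# data collected in switzerland gets the extended scope; elsewhere the
# leaner base policy applies.
assert "legal_entities" in effective_policy("CH")["applies_to"]
assert "legal_entities" not in effective_policy("US")["applies_to"]
```

the swiss-bank variant from the sidebar would simply move the wider scope into the base policy itself, which is exactly the design decision the sidebar says must be made on the particular facts.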
general-level privacy policy development

one of the first things to be determined when drawing up privacy policies is which geopolitical regions or jurisdictions impact the enterprise. privacy policies for a global enterprise, for example, can start the foundational development process by basing a strategy on the oecd guidelines and gapp. in some cases, other localized articulations of fair information processing may be the foundational basis for policy creation. for whatever framework is chosen, the policy creators will need to be able to translate how the various principles are managed if the policy is going to be an effective tool for process and privacy-enhanced systems and features in a privacy engineering context. for example, a policy statement might require that data be collected only as relevant to services provided by the current enterprise. the general policy would require a well-defined privacy notice to provide for transparency between the collector of data and the data subject, as well as to build an enforceable governance structure where the data asset is known as it enters and moves through its predicted lifecycle. an enterprise must be able to articulate and document how much personal information will be collected for specific purposes according to the proportionality principle. a policy statement should cover proportionality requirements: the benefit derived from the processing of the data should be proportional to its impact on the privacy of the individual whose data is being processed. to achieve data proportionality at the time of collection, the data subject's perspective and needs must be balanced with the enterprise's objectives.

the privacy policy should require a storage and archiving strategy. encryption, obfuscation, or other tactical security requirements should be covered in the privacy policy and have associated standards and guidelines for operational implementation. allowances for revisions and exceptions should be included in privacy policies to address the fact that policy needs will change. there are occasions when a customer's, employee's, supplier's, or other party of interest's feedback or requirements may lead to the need to modify privacy policies or grant exceptions.

when an enterprise operates internationally, privacy policies should address the transfer of data among the various jurisdictions. the underlying strategies should be people-, process-, and technology-oriented and include governance mechanisms that must be designed and executed to follow the data wherever they travel. this is the point at which many initiatives often fail due to a lack of coordination and integration of effort. the lawyers head off to draft elaborate legal documents neatly tucked away behind a small link that says privacy notice at the bottom of a web page or buried in the terms and conditions statement of an application. the technical teams rush off to buy products that obscure or encrypt just enough data to satisfy the annual return of the audit team, and so on among the teams. an institutional anthropologist could build an entire career analyzing the fascinating and often divergent goals of these now forever-parted teams. anthropologic observations aside, the course of behavior that should be charted is an ongoing dialogue between the key stakeholders so that a privacy policy (i.e., requirements for processing personal information) can evolve, continue to meet the needs of individuals and the organization, and keep pace to aid and not hinder innovation.

enterprise-specific privacy development

the nature and culture of an enterprise business impact privacy policies and the creation process. for instance, in the united states, the legal approach is often sectorial. an example of this is health care in the united states, where the policies and privacy rules of the health insurance portability and accountability act of 1996 (hipaa) should be incorporated. this type of enterprise will always be extremely open, with many third parties, operating in a nonstop, high-stakes context (in some cases, life and death). getting the balance among use, sharing, access, and accuracy will be a supreme consideration. the rights and sensitivities of the data subjects within this context are highly subjective while also the subject of extensive regulation. although other jurisdictions may not have standalone health data protection statutes, this type of context, and health data specifically, is governed worldwide as a protected class of data (or even an enhanced protected class, as in the european union, where it is a sensitive class of data). a health care, financial, or politically sensitive type of context is actually the proving ground for many other types of businesses. these enterprises require personalization and intimate knowledge of personal information but also value a certain level of autonomous innovation with data and financial models based on data.
innovating for high-risk data is a bit like the lyrics from the song new york, new york: if i can make it there, i'll make it anywhere. a similar illustration can be drawn for financial data in the united states, where the gramm-leach-bliley act requires financial institutions (companies that offer consumers financial products or services like loans, financial or investment advice, or insurance) to explain their information-sharing practices to their customers and to safeguard sensitive data. these types of data are covered by other comprehensive global laws, such as the personal information protection and electronic documents act (pipeda) in canada or the argentine data protection laws, but may not be called out under a specific law or called out as sensitive data requiring enhanced protections beyond the comprehensive requirements. the point here is that although not all data is created equal (nor does it call for exactly the same type of privacy policy treatment), personal information should be considered a controlled substance, and close partnerships and legal considerations are certainly necessary before we innovate on top of the foundational policy.

internal vs. external policies

data protection standards such as the oecd guidelines and gapp, among others, require that privacy policies be published both internally in enterprises and externally (actually, externally, it is usually a statement or notice of an enterprise's practices that is posted, not the actual policy) to give notice to users of systems, customers, or other data subjects interacting with the enterprise. failure to comply with the enterprise's public notices can lead to:

dissatisfied customers: customers and other users will expect compliance with the privacy protection actions indicated within the notice. it may be considered an implied contract. if there is a breach, users will tend to look to safer sites. if a user discovers identity theft that seems to have come from personal information collected by an enterprise, that user will take it out on the enterprise maintaining the site that failed them.

regulatory investigations: where an enterprise has not lived up to its notice commitments, regulators from one or more jurisdictions will likely investigate the problems and may take criminal or civil actions, or both, against both the enterprise and, conceivably, employees within the enterprise.

bad publicity: forty-six us states, the district of columbia, plus other us territories have security breach notification laws that involve personal information. there are comparable laws throughout the world. the media keep a lookout for such notifications and determine when breaches are significant. any breach scares people, and serious breaches equal bad publicity.

litigation: potential liability in privacy-related lawsuits has been increasing steadily in recent years. this expanding legal exposure has been fueled by plaintiffs' class action lawyers targeting privacy litigation as a growth area. moreover, federal and state government agencies, as well as data protection agencies throughout europe and asia, are becoming increasingly aggressive in their efforts to investigate and respond to privacy and data security concerns and incidents.
the federal trade commission (ftc) is imposing stricter standards on businesses, while state attorneys general are pursuing enforcement actions and conducting high-profile investigations in response to data breaches and other perceived privacy violations.

harm to brand: for most enterprises, the equity invested in their brands is an invaluable but fragile asset. when privacy protection problems occur, the reaction of the enterprise is crucial to the maintenance of a positive brand.

weak innovation: effective innovation comes from making improved products that deliver what people want. finding what customers and potential customers want requires the collection of data. an enterprise that does not protect the privacy of data will weaken its ability to collect the data needed to determine where innovation is required.

employee distrust: just as customers can be turned off when privacy notice failures occur, employees can begin to distrust their enterprise when their data is not protected as the privacy notice promised.

an enterprise should consider creating training based on internal privacy rules that are more granular, specific, and restrictive than externally posted notices. these internal policies should be coordinated with a human resources policy team to ensure that staff and business partners know exactly what to do, how to get help when they need it, and how and when these policies may be enforced and encouraged. these policies must all be reflected and instantiated in product and systems development, as discussed further in chapters 5 and 6.

engineers and lawyers in privacy protection: can we all just get along?

by dr. annie i. antón, professor in and chair of the school of interactive computing at the georgia institute of technology, and peter swire, nancy j. and lawrence p. huang professor, scheller college of business, georgia institute of technology

in march 2013 we participated in a panel titled re-engineering privacy law at the international association of privacy professionals privacy summit. the topic of the panel closely matches the topic of this book: how to bring together and leverage the skill sets of engineers, lawyers, and others to create effective privacy policy with correspondingly compliant implementations. as a software engineering professor (antón) and a law professor (swire), we consider four points: (1) how lawyers make simple things complicated; (2) how engineers make simple things complicated; (3) why it may be reasonable to use the term reasonable in privacy rules but not in software specifications; and (4) how to achieve consensus when both lawyers and engineers are in the room.

1. how lawyers make simple things complicated. a first-year law student takes torts, the study of accident law. a major question in that course is whether the defendant showed reasonable care. if not, the defendant is likely to be found liable. sometimes a defendant has violated a statute or a custom, such as a standard safety precaution. more often, the answer in a lawsuit is whether the jury thinks the defendant acted as a reasonable person. the outcome of the lawsuit is whether the defendant has to pay money or not. we all hope that truth triumphs, but the operational question hinges on who can prove what in court. the legal style is illustrated by the famous palsgraf case (palsgraf v. long island railroad co., 248 n.y. 339 (n.y. 1928)). a man climbs on a train pulling out of the station. the railroad conductor assists the man into the car. in the process, the man drops a package tucked under his arm.
it turns out the package contains fireworks, which explode, knocking over some scales at the far end of the platform. the scales topple onto a woman, causing her injury. from teaching the case, here is the outline of a good law student answer, which would take several pages. the answer would address at least four issues. for each issue, the student would follow irac (issue, rule, analysis, conclusion) form, discussing the issue, the legal rule, the analysis, and the conclusion: (1) was the man negligent when he climbed on the moving train? (2) when the railroad conductor helped the man up, was the conductor violating a safety statute, thus making his employer, the railroad, liable? (3) when the man dropped the fireworks, was it foreseeable that harm would result? (4) was the dropping of the package the proximate cause of knocking over the scales? in sum, we seek to determine whether the railroad is liable. the law student would explain why it is a close case; indeed, the actual judges in the case split their decision 4-3.

engineers design and build things. as such, they seek practical and precise answers. instead of an irac form, engineers seek to apply scientific analytic principles to determine the properties or state of the system. the mechanisms of failure in the palsgraf case would be analyzed in isolation: (1) the train was moving; therefore, the policy of only allowing boarding while the train is stopped was not properly enforced, thereby introducing significant safety risk into the system. (2) the scales were apparently not properly secured; thus a vibration or simple force would have dislodged the scales, introducing safety risk into the system. is the railroad liable? an engineer would conclude that the compliance violation and unsecured scales mean that it would be liable. the engineering professor would congratulate the engineering student for the simple, yet elegant, conclusion based on analysis of isolated components in the system. in engineering, simplicity is the key to elegance.

the lawyer may agree in theory that simplicity is the key to elegance, but law students and lawyers have strong reasons to go into far more detail. the highest score on a law school exam usually spots the greatest number of issues; it analyzes the one or two key issues, but also creates a research plan for the lawyers litigating the case. for example, the railroad has a safety rule that says the conductor shouldn't help a passenger board when the train is moving, but surely there are exceptions? in the actual case (or the law school exam), the lawyer would likely analyze what those exceptions might be, especially because finding an applicable exception will free the railroad from liability. the good exam answer may also compare the strange chain of events in palsgraf to other leading cases, in order to assess whether the plaintiff can meet her burden for satisfying the difficult-to-define standard for showing proximate cause. in short, lawyers are trained to take the relatively simple set of facts in palsgraf and write a complex, issue-by-issue analysis of all the considerations that may be relevant to deciding the case. the complexity becomes even greater because the lawyer is not seeking to find the correct answer based on scientific principles; instead, the lawyer needs to prepare for the jury or judge and find ways, if possible, to convince even skeptical decision-makers that the client's position should win.
2. how engineers make simple things complicated. a typical compliance task is that our company has to comply with a new privacy rule. for lawyers, this basically means applying the fair information privacy principles (fipps), such as notice, choice, access, security, and accountability. the law is pretty simple. the engineer's response is: how do we specify these rules so that they can be implemented in code? stage one: specify the basic privacy principles (fipps). stage two: specify commitments expressed in the company privacy notice. stage three: specify functional and nonfunctional requirements to support business processes, user interactions, data transforms and transfers, and security and privacy requirements, as well as corresponding system tests. as an example, some privacy laws have a data minimization requirement. giving operational meaning to data minimization, however, is a challenging engineering task, requiring system-by-system and field-by-field knowledge of which data are or are not needed for the organization's purposes. stuart shapiro, principal information privacy & security engineer, the mitre corporation, notes that an implementation of data minimization in a system may have 50 requirements and 100 associated tests. input to the system is permitted only for predetermined data elements. when the system queries an external database, the queries are permitted only to the approved data fields. there must be executable tests: apply them to test data first and then confirm that data minimization is achieved under various scenarios. for the lawyer, it is simple to say data minimization. for the engineer, those two words are the beginning of a very complex process (a small sketch of this idea appears after this sidebar).

3. why it may be reasonable to use the term reasonable in privacy rules. swire was involved in the drafting of the hipaa medical privacy rule in 1999-2000. antón, the engineer, has long chastised swire for letting the word reasonable appear over 30 times in the regulation. words such as promptly and reasonable are far too ambiguous for engineers to implement. for example, consider hipaa 164.530(i)(3): the covered entity must promptly document and implement the revised policy or procedure. engineers can't test for promptly. they can, however, test for 24 hours, 1 second, or 5 milliseconds. as for reasonable, the rule requires reasonable and appropriate security measures; reasonable and appropriate policies and procedures for documentation; reasonable efforts to limit collection and use to the minimum necessary; a reasonable belief before releasing records relating to domestic violence; and reasonable steps to cure the breach by a business associate. the engineer's critique is: how do you code for promptly and reasonable? the lawyer's answer is that the hipaa rule went more than a decade before being updated for the first time, so the rule has to apply to changing circumstances. the rule is supposed to be technology neutral, so drafting detailed technical specs is a bad idea, even though that's exactly what engineers are expected to do to develop hipaa-compliant systems. there are many use cases and business models in a rule that covers almost 20% of the us economy. over time, the department of health and human services can issue faqs and guidance, as needed. if the rule is more specific, then the results will be wrong. in short, lawyers believe there is no better alternative in the privacy rule to saying reasonable.
the engineer remains frustrated by the term reasonable, yet accepts that the term is intentionally ambiguous because it is for the courts to decide what is deemed reasonable. if the rule is too ambiguous, however, it will be inconsistently applied, and engineers risk legal sanctions on the organization for developing systems not deemed to be hipaa compliant. in addition, promptly is an unintentional ambiguity that was preventable in the crafting of the law. by allowing engineers in the room with the lawyers as they decide the rules that will govern the systems the engineers must develop, we can avoid a lot of headaches down the road.

4. how to achieve happiness when both lawyers and engineers are in the same room. organizations today need to have both lawyers and engineers involved in privacy compliance efforts. an increasing number of laws, regulations, and cases, often coming from numerous states and countries, place requirements on companies. lawyers are needed to interpret these requirements. engineers are needed to build the systems. despite their differences, lawyers and engineers share important similarities. they both are very analytic. they both can drill down and get enormously detailed in order to get the product just right. and each is glad when the other gets to do those details. most engineers would hate to write a 50-page brief. most lawyers can't even imagine specifying 50 engineering requirements and running 100 associated tests.

the output of engineering and legal work turns out to be different. engineers build things. they build systems that work. they seek the right answer. their results are testable. most of all, it works if it runs according to spec. by contrast, lawyers build arguments. they use a lot of words; brief is a one-word oxymoron. lawyers are trained in the adversary system, where other lawyers are trying to defeat them in court or get a different legislative or regulatory outcome. for lawyers, it works if our lawyers beat their lawyers. given these differences, companies and agencies typically need a team. to comply, you need lawyers and engineers, and it helps to become aware of how to create answers that count for both the lawyers and the engineers. to strike an optimistic note, in privacy compliance the legal and engineering systems come together. your own work improves if you become bilingual, if you can understand what counts as an answer for the different professions. we look forward to trying to find an answer about how to achieve happiness when both lawyers and engineers are in the room. antón presumably is seeking a testable result. swire presumably will settle for simply persuading those involved. however, we both agree that the best results come from collaboration because of the value, knowledge, and expertise that both stakeholder groups bring to the table.
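to ground the sidebar's engineering points, here is a small sketch, assuming a hypothetical system with three approved fields (shapiro's actual mitre work is far larger than this). it shows data minimization enforced as a whitelist of predetermined data elements with an executable test, and an ambiguous word like "promptly" pinned to a concrete, testable deadline; the field names and the 24-hour reading are our assumptions:

```python
from datetime import timedelta

APPROVED_FIELDS = {"user_id", "order_id", "shipping_city"}  # hypothetical

def minimize(record: dict) -> dict:
    """permit input only for predetermined data elements; drop the rest."""
    return {k: v for k, v in record.items() if k in APPROVED_FIELDS}

PROMPTLY = timedelta(hours=24)  # one concrete engineering reading of "promptly"

def test_minimization():
    raw = {"user_id": 7, "order_id": 42, "email": "x@example.com"}
    assert minimize(raw) == {"user_id": 7, "order_id": 42}

def test_promptness():
    # engineers cannot test "promptly," but they can test 24 hours:
    # e.g., a policy revision documented 3 hours after the decision passes
    assert timedelta(hours=3) <= PROMPTLY

test_minimization()
test_promptness()
```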
policies, present, and future

policies have to be living documents that can be readily changed as a business changes or as the regulatory environment changes; however, they should not be changed lightly or at whim. there is overhead associated with policy changes, especially in the privacy space. for instance, a change in policy may indicate a change in the use of data, which then may require an enterprise to provide notice of the change to whomever's data is affected and get permission for the new uses of the data. even without a pressing need for change, it is important to review policies on a regular basis, perhaps annually, to determine if change is necessary.

a good policy needs to be forward looking and, at the same time, accurate to the current state. it should be sufficiently detailed as to give direction and set parameters, but not so detailed as to be overly specific or to require excessive change. each enterprise will need to find the balance between what is communicated as policy and what is communicated as an underlying standard or guideline for meeting the requirements of the policy. key stakeholders should review policies and practices at least annually to see if revisions are warranted. engineered privacy mechanisms can ease the change and improvement of the policies, especially the specific procedures, standards, guidelines, and privacy rules that need to change if there are policy revisions. the privacy component discussed in chapters 6, 7, 8, and 9 addresses this crucial need.

conclusion

privacy policies are powerful tools in the overall privacy engineering process. privacy professionals, lawyers, and compliance teams can use them to communicate expected behaviors and leverage them to create accountability measures. in the process of policy creation, internal and external requirements and expectations (including those of systems users and regulators) must be gathered. these same requirements and expectations, in the traditional lexicon, can also be leveraged as engineering requirements in the privacy engineering model and execution sense. we will explore how such requirements fit into a systems model in chapters 5 and 6. in the remaining chapters of part 2, we will continue to call on these policy requirements in the context of discrete tools and features that rest in the privacy engineering toolkit.
<Chapter>: chapter 5
"<Section Text>: developing privacy engineering requirements the expectations of life depend upon d(...TRUNCATED)
<Chapter>: chapter 6
"<Section Text>: a privacy engineering lifecycle methodology they always say time changes, but you (...TRUNCATED)
