Harboring Data: Information Security, Law, and the Corporation

by Andrea M. Matwyshyn (Editor)

eBook

$41.49 (list price $55.00; save 25%)

Available on Compatible NOOK devices, the free NOOK App and in My Digital Library.


Overview

As identity theft and corporate data vulnerability continue to escalate, corporations must protect both the valuable consumer data they collect and their own intangible assets. Both Congress and the states have passed laws to improve practices, but the rate of data loss persists unabated and companies remain slow to invest in information security. Engaged in a bottom-up investigation, Harboring Data reveals the emergent nature of data leakage and vulnerability, as well as some of the areas where our current regulatory frameworks fall short.

With insights from leading academics, information security professionals, and other area experts, this original work explores the business, legal, and social dynamics behind corporate information leakage and data breaches. The authors reveal common mistakes companies make, which breaches go unreported despite notification statutes, and surprising weaknesses in the federal laws that regulate financial data privacy, children's data collection, and health data privacy. This forward-looking book will be vital to meeting the growing information security challenges that new data-intensive business models create.


Product Details

ISBN-13: 9780804772594
Publisher: Stanford Law Books
Publication date: 10/06/2009
Sold by: Barnes & Noble
Format: eBook
Pages: 368
File size: 2 MB

About the Author

Andrea M. Matwyshyn is an Assistant Professor of Legal Studies and Business Ethics at The Wharton School at the University of Pennsylvania. Her research and consulting focus on U.S. and international issues of information policy, corporate best practices, data privacy, and technology regulation.

Read an Excerpt

HARBORING DATA

Information Security, Law, and the Corporation

STANFORD LAW BOOKS

Copyright © 2009 Board of Trustees of the Leland Stanford Junior University
All rights reserved.

ISBN: 978-0-8047-6008-9


Chapter One

Looking at Information Security through an Interdisciplinary Lens

Computer Science as a Social Science: Applications to Computer Security

Jonathan Pincus, Sarah Blankinship, and Tomasz Ostwald

Computer scientists have historically identified either as mathematicians (ah, the purity) or physicists (pretty good purity and much better government funding) ... my response to yet another outbreak of the "math vs. physics" debate was "we don't want to admit it, but we should really be debating whether we're more like sociologists or economists."

Jonathan Pincus, "Computer Science is really a Social Science"

ALTHOUGH COMPUTER SCIENCE is not traditionally viewed as a social science, problems in its domain are inherently social in nature, relating to people, their interactions, and the relationships between them and their organizational contexts. Applying social science perspectives to the field of computer science not only helps explain current limitations and highlight emerging trends, but also points the way toward a radical rethinking of how to make progress on information security. The social aspects of computing are becoming particularly visible in the discipline of computer security.

Computer security has historically been regarded primarily as a technical problem: if systems are correctly architected, designed, and implemented, and rely on provably strong foundations such as cryptography, they will be "secure" in the face of various attackers. In this view, today's endemic security problems can be reduced to limitations in the underlying theory and to failures of those who construct and use computer systems to choose (or follow) appropriate methods. The deluge of patches, viruses, trojans, rootkits, spyware, spam, and phishing that computer users today have to deal with, as well as societal issues such as identity theft, online espionage, and potential threats to national security such as cyberterrorism and cyberwar, illustrates the limitations of this approach.

In response, the focus of much computer security research and practice has shifted to include steadily more aspects of economics, education, psychology, and risk analysis-areas that are traditionally classified as social sciences. The relatively recent definition of the field of "security engineering" explicitly includes fields such as psychology and economics. "Usable security" includes perspectives such as design, human-computer interaction, and usability. The commonality here is a view that computer security today can be better addressed by defining the field more broadly.

The early work to date, however, only scratches the surface of these disciplines-or the greater potential for a redefinition of computer security. Anthropology, cultural studies, political science, history of technology and science, journalism, law, and many other disciplines and subdisciplines all provide important perspectives on the problems, and in many cases have useful techniques to contribute as well. Continued progress on computer security is likely to require all of these perspectives, as well as traditional computer science.

Social Science Perspectives on Computer Security

There are many different definitions for computer science. Wikipedia, for example, defines it as "the study of the theoretical foundations of information and computation and their implementation and application in computer systems," but devotes a parallel page on "Diversity of computer science" to alternative definitions. For the purposes of this paper, we will use Peter Denning's definition from the Encyclopedia of Computer Science.

The computing profession is the people and institutions that have been created to take care of other people's concerns in information processing and coordination through worldwide communication systems. The profession contains various specialties such as computer science, computer engineering, software engineering, information systems, domain-specific applications, and computer systems. The discipline of computer science is the body of knowledge and practices used by computing professionals in their work.

Denning's definition is broader than most, but it accurately captures the role that "computing" fills in the world of the twenty-first century. The list of subdisciplines associated with computer science gives a very different picture. The Encyclopedia of Computer Science, for example, groups its articles into nine main themes: Hardware, Software, Computer Systems, Information and Data, Mathematics of Computing, Theory of Computation, Methodologies, Applications, and Computing Milieux. Where are the people?

The social sciences, conversely, are fundamentally about the people; viewing computer science as a social science thus fills this huge gap in the discipline of computer science. A billion people use computers directly; even more interact with banking, credit cards, telephones, and the computer-based systems used by government, hospitals, and businesses. It is not just that computer systems today are invariably used in a human, organizational, and societal context; it is that, increasingly, the human context dominates. In 2005, Pincus called out "network and software security" as one of the areas in computer science where social science perspectives have a major foothold:

In the security space, the now-obvious economic aspects of the problem, "social engineering" attacks, and what is often mistakenly referred to as "the stupid user problem" make it hard to avoid. Many people point to the relatively new field of "usable security" (starting with Alma Whitten's seminal "Why Johnny Can't Encrypt") as another example of considering broader perspectives. Work by people like Ross Anderson at Cambridge, Hal Varian at UC Berkeley, Shawn Butler at CMU, and Eric Rescorla at RTFM starts from an economic perspective and asks some very interesting questions here; it seems to me that traditional computer science techniques aren't really able to address these problems. There are now workshops devoted to Economics and Information Security.

Second, social sciences, like security threats, are by their nature evolutionary, while hard sciences are less so. Studying human norms of behavior and interaction trains sensitivity toward the types of evolutionary situations that pervade information security. As software security evolves, for example, so does the nature of threats. The focus of attackers moves away from the core operating system toward other, less-secure components such as third-party drivers, proprietary applications deployed on closed networks, user applications, and finally users themselves. Attackers use constantly evolving methods for tricking users and refine these methods based on knowledge gained from failed attacks.

Indeed, perspectives from diverse social science disciplines are relevant to the field of computer security and are beginning to yield important insights. For example, in economics, the costs and benefits of various information security responses are weighed in terms of both microeconomic and macroeconomic efficiency outcomes. Game theorists model security to gain insight into optimal strategies, frequently modeling security as a two-player game between an attacker and a network administrator. In psychology, computer criminal behavior is being studied empirically, as well as the psychological reactions to threats and countermeasures. Ethnographers are studying information technology professionals' responses to security incidents using traditional ethnographic techniques, as well as examining the culture of security researchers. Design increasingly affects the usability of privacy and security features. Epidemiology provides useful frameworks for examining the spread of computer viruses and malware. Sociologists and anthropologists examine aspects of hacker culture(s) and observe the different choices of mainstream and marginalized teens in panoptic social networks. International relations theorists are considering issues of cyberwar, including concerted attacks on an entire country's infrastructure. The discipline of science and technology studies is debating the evolution of information security as a field. Finally, the legal academy is engaged in active debate over legislation requiring disclosure of security incidents and legal remedies against spammers, for example. The movement toward an interdisciplinary lens in the discipline of information security is beginning to take hold.
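The two-player attacker/defender game mentioned above can be sketched concretely. This is a minimal illustration, not a model from the book: the attack and defense options, and every payoff number, are invented for the example. The defender picks the allocation that minimizes the attacker's best-response gain.

```python
# A toy two-player security game: an attacker picks an attack, a network
# administrator picks a defense; each cell gives the attacker's expected
# gain (equivalently, the defender's loss). All numbers are illustrative.

ATTACKS = ["phishing", "malware"]
DEFENSES = ["train_users", "harden_hosts"]

# PAYOFF[attack][defense]: training users blunts phishing,
# hardening hosts blunts malware.
PAYOFF = {
    "phishing": {"train_users": 1, "harden_hosts": 8},
    "malware":  {"train_users": 7, "harden_hosts": 2},
}

def best_attack(defense):
    """Attacker's best response to a fixed defense."""
    return max(ATTACKS, key=lambda a: PAYOFF[a][defense])

def best_defense():
    """Defender minimizes the worst-case (best-response) attacker gain."""
    return min(DEFENSES, key=lambda d: PAYOFF[best_attack(d)][d])

d = best_defense()
print(d, "->", best_attack(d), PAYOFF[best_attack(d)][d])
```

Even this tiny matrix shows the game-theoretic insight the text alludes to: the defender's optimal choice depends on anticipating the attacker's adaptation, not on defending against yesterday's attack.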

Three Information Security Topics for Further Interdisciplinary Study

Three information security topics in particular are ripe for further interdisciplinary study: user error, measurement, and vulnerability disclosure.

User Error-and Human Error

User errors cause or contribute to most computer security failures.

People are an integral part of all computer-related social systems, and so issues related to fallibility or "human error" are an integral part of system engineering. The security of systems depends on user decisions, actions, and reactions to events in the system. Actions or decisions by a user that result in an unintended decrease in a system's security level are typically classified as "user error." Focusing on this term ignores the role of the rest of the system in causing user confusion. User errors frequently have more to do with the system's failure to reduce the likelihood of user mistakes, or to limit the damage those mistakes cause.

Consider the case in which a system is broken into because the administrator set up an account with a password that was guessed via a "dictionary attack." While the natural initial response is to blame the administrator's "user error" for the vulnerability, there are other questions worth asking as well, for example:

Why did the system fail to check proposed passwords and reject those likely to be vulnerable to such an attack?

Why did the system allow multiple repeated failed login attempts, without which a dictionary attack against the login interface cannot proceed?

Why was the system's entire security compromised by the guessing of a single password?
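The first two questions point at defenses a system can implement itself. The sketch below, assuming a hypothetical login system (all names, the word list, and the lockout threshold are invented for illustration), shows both: rejecting weak passwords at the moment they are set, and cutting off the repeated attempts a dictionary attack depends on.

```python
# Two system-side defenses against dictionary attacks, illustrated:
# (1) reject candidate passwords an attacker's wordlist would guess;
# (2) lock an account after repeated consecutive failed logins.

COMMON_WORDS = {"password", "letmein", "dragon", "admin", "welcome"}
MAX_FAILURES = 5

failed_attempts = {}  # username -> count of consecutive failures

def acceptable_password(candidate):
    """Reject short passwords and common dictionary words at set time."""
    return len(candidate) >= 10 and candidate.lower() not in COMMON_WORDS

def record_login(username, success):
    """Track consecutive failures; return False once the account locks."""
    if success:
        failed_attempts[username] = 0
        return True
    failed_attempts[username] = failed_attempts.get(username, 0) + 1
    return failed_attempts[username] < MAX_FAILURES
```

The point of the sketch is the chapter's argument in miniature: neither defense asks the administrator to be less fallible; both change the system so that a single human lapse no longer compromises it.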

The social science aspects of this problem have long been recognized; Jerome Saltzer and Michael Schroeder specifically refer to user psychology in two of their ten design principles, "fail-safe defaults" and "psychological acceptability." Until relatively recently, however, these user-related issues were treated as separate from the "hard" aspects of computer security such as cryptography and secure system architecture. Alma Whitten and J. D. Tygar's 1999 paper "Why Johnny Can't Encrypt" examined students' failed efforts to encrypt their email using PGP, and is generally regarded as a catalyst for the relatively new subfield of "usable security." "Usable security" has had a significant impact, and the papers from many workshops and publications show how pervasively social science perspectives currently inform this aspect of computer security. However, little, if any, work has focused on the more general issues around "user error."

Meanwhile, the changing domains of computer security only serve to highlight the need for more work adopting the user perspective. Given richer user interfaces and interactions, we expect to see attacks aimed at affecting users' perceptual and cognitive capabilities. These are practical problems that cannot be solved by technologies and methodologies that are known today. For example, phishing has become a common and very real threat, but the effectiveness of currently available technical mitigations is still very limited. In particular, the rise of social networking applications helps not only users but also attackers understand where trust relationships exist within social networks and between individuals. Attacker awareness of trust relationships may allow attacks to become very precise and effective, even when conducted in a highly automated way, such as in context-aware phishing.

Several ideas from the philosophy of technology and science help explain the lack of earlier and faster progress on this important issue, as well as pointing to possible remaining obstacles. For example, standpoint theories suggest that it is the exclusion of the user from the software design process that leads to the failure of the system to meet the user's needs. Similarly, it is the marginalization of the user in the evaluation and analysis process that leads to the "blame" of user error. Combining these concepts with the perspective of user-centered design may respond to Whitten and Tygar's insight that security requires different design methodologies than those currently in use.

Other avenues point to additional possibilities. "Human error" has been studied for years in fields such as safety engineering and patient safety, and starting in the 1980s researchers began taking new perspectives, arguing, for example, that the right approach was to take a more general view of human action, including the role of users in compensating for weaknesses in the system caused by failures of the system designers. This work has strongly influenced the relatively new field of "resilience engineering." This field's paradigm for safety management focusing on helping people cope with complexity under pressure to achieve success "strongly contrasts with what is typical today, a paradigm of tabulating error as if it were a thing, followed by interventions to reduce this count."

Finally, from a different perspective, "human errors" also occur in the creation of software systems. A software security vulnerability results from one or more human errors made during the software development process, introduced at different phases of development by different actors: inaccurate management decisions, flawed design, and coding bugs. One or more human errors typically also contribute to situations where the vulnerability was not identified and removed before the software shipped, for example, insufficient testing or incomplete communication with end users.

Measurement

Although essential, empirical validation of design methods is not easy ... I seek help in the research methods of social scientists, organizational behaviorists, and design researchers and I believe that their research methods will improve the empirical validation techniques of other software engineering researchers.

... and sensitivity analyses, this approach explicitly takes into account the multiple (and potentially conflicting) objectives of providing security for a computer-based system. SAEM (security attribute evaluation method) takes a similar multiple-attribute approach to cost-benefit analyses of alternative security designs. Paul Li has also applied actuarial techniques from the insurance industry to topics such as defect prediction.
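A multiple-attribute comparison of security designs can be illustrated with a simple weighted-sum utility. This is not the published SAEM procedure; the attributes, weights, scores, and design names below are invented purely to show the shape of such an analysis.

```python
# Illustrative weighted-sum comparison of alternative security designs.
# Attributes are weighted by stakeholder priority; each design is scored
# 0-10 per attribute ("cost" scored so that higher means cheaper).

WEIGHTS = {"confidentiality": 0.4, "availability": 0.3, "cost": 0.3}

DESIGNS = {
    "firewall_only":     {"confidentiality": 4, "availability": 8, "cost": 9},
    "firewall_plus_ids": {"confidentiality": 8, "availability": 7, "cost": 5},
}

def utility(design):
    """Weighted sum of a design's attribute scores."""
    scores = DESIGNS[design]
    return sum(WEIGHTS[a] * scores[a] for a in WEIGHTS)

ranked = sorted(DESIGNS, key=utility, reverse=True)
```

The value of the exercise is less the final ranking than the forced explicitness: stakeholders must state which security objectives they are trading against which, and a sensitivity analysis can then vary the weights to see when the ranking flips.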

(Continues...)



Excerpted from HARBORING DATA Copyright © 2009 by Board of Trustees of the Leland Stanford Junior University. Excerpted by permission.
All rights reserved. No part of this excerpt may be reproduced or reprinted without permission in writing from the publisher.

Table of Contents

Contents

Acknowledgments....................ix
Author Biographies....................xi
Introduction Andrea M. Matwyshyn....................3
1 Looking at Information Security through an Interdisciplinary Lens Computer Science as a Social Science: Applications to Computer Security Jonathan Pincus Sarah Blankinship Tomasz Ostwald....................19
2 The Information Vulnerability Landscape Compromising Positions: Organizational and Hacker Responsibility for Exposed Digital Records Kris Erickson Philip N. Howard....................33
3 Reporting of Information Security Breaches A Reporter's View: Corporate Information Security and the Impact of Data Breach Notification Laws Kim Zetter....................50
4 Information Security and Patents Embedding Thickets in Information Security? Cryptography Patenting and Strategic Implications for Information Technology Greg R. Vetter....................64
5 Information Security and Trade Secrets Dangers from the Inside: Employees as Threats to Trade Secrets Elizabeth A. Rowe....................92
6 Information Security of Health Data Electronic Health Information Security and Privacy Sharona Hoffman Andy Podgurski....................103
7 Information Security of Financial Data Quasi-Secrets: The Nature of Financial Information and Its Implications for Data Security Cem Paya....................121
8 Information Security of Children's Data From "Ego" to "Social Comparison"-Cultural Transmission and Child Data Protection Policies and Laws in a Digital Age Diana T. Slaughter-Defoe Zhenlin Wang....................145
9 Information Security and Contracts Contracting Insecurity: Software Licensing Terms That Undermine Information Security Jennifer A. Chandler....................159
10 Information Security, Law, and Data-Intensive Business Models Data Control and Social Networking: Irreconcilable Ideas? Lilian Edwards Ian Brown....................202
Conclusion Andrea M. Matwyshyn....................228
Notes....................235
Bibliography....................295
Index....................333