This page is no longer maintained. Please go to current publications, research, or overview.

Design for Values, Design for Trust

How is it that values such as privacy and autonomy become embedded in technical designs? How do cultural concepts of privacy, property, and propriety become assumptions about trust embedded in the coded infrastructure? Design for values theory, method, bibliography, and practitioners are described at

Design for values is a methodological approach grounded in a soft technological determinism, relying on iterative evaluation of technology using the tools of the social sciences together with detailed technical examination.

Above all design for values is design of technology in its social, economic, and political context.
An understanding of design for values begins with the major strands of theoretical work on the interactions of technological development and social values, and must include methodological approaches.

First is technological determinism: what is technologically possible will inevitably be developed, and the characteristics of the newly developed technologies will alter society as the technology is adopted. The second view is social construction: technologies are constructed by their stakeholders, including inventors and governments, on the basis of social values. Some proponents of this view hold that users are the only critical stakeholders, that adoption is innovation, and thus that technology is defined by its users. The third view is that values emerge in a dynamic fashion: while technologies have biases, the way in which technologies are adopted alters the values in the technology, and thus the future design of the technology, in an interactive, almost evolutionary, manner. All three theoretical frameworks support the argument that values can be embedded at any stage in the development process: invention, adoption, diffusion, and iterative improvement.
In addition to the general work on science, technology, and values, to which science studies is a major contributor, there is also computer-specific work. For example, the technologically determinist, socially constructed, and dynamically iterative models of technological development have clear parallels in, respectively, the technical, preexisting, and emergent models of bias proposed for computing systems by Friedman & Nissenbaum. Technical bias is that which is inherent in or determined by the technology. Preexisting bias is bias which has previously been socially constructed and is integrated into the technology. Emergent bias develops as a system is used in a specific social context.

My current theoretical foundation for the examination of values in communications technology is built upon past research in the classification of specific values, in particular security and privacy. Security failures are defined as either coding errors or emergent errors, where coding errors are further delineated into logical flaws in the high-level code or simple buffer overruns. These logical, human, and environmental errors correspond loosely to the technically determined, (accidentally) socially determined, and iterative embedding of security values in the code. In privacy there are definitions of the private based on system design (as in the American Code of Fair Information Practice) and based on data use (as in the EU Directive), which have the same aims. In both cases the privacy values can be argued to result from inherent characteristics of the technology, elements of specific product design, or the implementation environment, as well as the interaction of these three. In the case of communications protocols, even the least technologically deterministic arguments do not contend that most biases are explicit choices. Biases in communications technologies often result from the omission of variables at the design stage (e.g., packet-based networks are survivable, and incidentally censorship is difficult). Complicating this lack of examination of inherent and emergent (but economically predictable) biases is the reality that, once adopted, technical standards are difficult to replace.
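The distinction between a logical flaw in high-level code and a lower-level coding error such as a buffer overrun can be made concrete with a minimal sketch. The access-control check below, and all of its names, are invented for illustration and are not drawn from any system discussed here; it shows how a program that compiles and runs cleanly can still fail on its intended security policy.

```python
def can_read(user, resource):
    # Intended policy: grant access only when the user owns the resource
    # AND the resource is not locked.
    # Logical flaw: `or` should be `and`, so an owner can read a locked file.
    return user == resource["owner"] or not resource["locked"]

def can_read_fixed(user, resource):
    # The intended policy, stated correctly.
    return user == resource["owner"] and not resource["locked"]

doc = {"owner": "alice", "locked": True}
print(can_read("alice", doc))        # → True  (flaw grants access)
print(can_read_fixed("alice", doc))  # → False (intended policy denies)
```

Unlike a buffer overrun, which is a property of the low-level implementation, this kind of error lives entirely in the stated logic and survives any memory-safe language.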

Some decisions are in reality values-determinant: for example, backward compatibility for nth-generation wireless systems, which is a determinant of cost and therefore of accessibility. Conversely, backward compatibility enables the adoption of otherwise obsolete technology in regions with less capital. In this case a choice made on the basis of expense and compatibility can reasonably be said to build a system which weighs the marginal first-world consumer more or less against the infrastructure needs of the third-world consumer. Such economic biases are inherent in engineering and design decisions.

Sometimes values are embedded through the assumptions of the designers, a case well made in a study of universal service and QoS. Similarly, much of the discussion of privacy and commerce arguably rests on the assumptions of middle-class designers about the existence of identity-linked accounts. Such assumptions can be avoided. In contrast, design for computer security requires developing mechanisms to allow and refuse trust, and thus it may be necessary to embed particular assumptions in the designs of security mechanisms. The issue of human-computer interaction further complicates design for values. Any interface must make some assumption about the nature of simplification and the suitable metaphors for the interface (e.g., why does a button make sense rather than a switch or a path?). Yet choosing against a simplifying interface is itself a values-laden choice, limiting access to the technology.

Even when technologists set out to design for a specific value, the social consensus is sometimes that the resulting technology does not embody that value. For example, the Platform for Privacy Preferences (P3P) has been described by CPSR as a mechanism for ensuring that customer data are freely available to merchants, while its designers were clearly seeking customer empowerment. Similarly, PICS has been described both as a technology for human autonomy and as "the devil" for its ability to enhance the capabilities of censorship regimes worldwide. In both of these cases (both developed by the World Wide Web Consortium) the disagreement about values results from assumptions about the relative power of the participants in a transaction or other interaction. If the designers' assumptions of fundamental bargaining equality are correct, then these are indeed technologies of freedom (to quote the famed book by Ithiel de Sola Pool). The critics, on the other hand, are evaluating the implementation of these technologies in a world marked by differences in autonomy ranging from the clients of Amnesty International to the clients of the Savoy.
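The core mechanism a P3P user agent performs, comparing a site's declared data practices against a user's stated preferences, can be reduced to a short sketch. The field names and the preference format below are invented for illustration; they are not the actual P3P vocabulary or API.

```python
# Sketch of P3P-style matching: a user agent checks whether every
# practice a site declares is one the user permits.
# Field names are illustrative, not the real P3P vocabulary.

user_prefs = {
    "share_with_third_parties": False,  # user disallows this practice
    "retain_indefinitely": False,
}

site_policy = {
    "share_with_third_parties": True,   # site declares this practice
    "retain_indefinitely": False,
}

def acceptable(policy, prefs):
    # Acceptable only if each declared practice is either permitted
    # by the user or not actually performed by the site.
    return all(prefs.get(practice, True) or not declared
               for practice, declared in policy.items())

print(acceptable(site_policy, user_prefs))  # → False: site shares data
```

Note what the sketch quietly assumes: that the user agent can act on a mismatch by refusing the transaction on equal terms. That bargaining-equality assumption, not the matching logic itself, is precisely what the critics dispute.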

In what cases can the assumption of values be prevented by the application of superior engineering, and in what cases are assumptions about values and humans an integral part of the problem-solving process? Can these two cases be systematically distinguished, so that the engineer can know when the guidance of philosophers or social scientists is most needed?

Thus design for values includes the evaluation of past designs with a critical eye on the initial design, the improvement of specific designs, and the development of design guidelines. It has a specific design focus, distinct from methods centered on critique rather than design. As opposed to traditional technical approaches to socially responsible design, it emphasizes iteration and the use of legal and social scholarship to refine or correct designs, building upon computer-supported cooperative work.

Design for values is technological design with explicit recognition of the economic and political context. It is inherently interdisciplinary.

For future design for values events, and events which can encompass design for values elements, please see the comprehensive listing of interdisciplinary events, projects, calls, and researchers in the area.

Current Researchers

Warigia Bowman
Allan Friedman
Carlos Osorio
Serena Chan
Taiyu Chen
Alla Genkina
Gayathri Athreya
Tony Moore


Sara Wilford on privacy
Sabine Schaffer on trust in the Internet
Carlos Osorio on open code
Serena Symne
Serena Chan
Tony Vila
Brian Anderson

Paper and Reports

By type
By topic