Witnessing of systems: systems do not witness people, people witness systems

For most people, there is a great discrepancy between perceiving and understanding computers. Many people look at a computer, but do not really see or understand what they are dealing with. When people tune their presence to systems, they train themselves to communicate in the way systems require. Often people will blindly do what the computer tells them to do and suspend their own judgment. A classic example is drivers who follow their GPS units onto narrow footpaths simply because the GPS said to go there. In crisis situations, trust in systems only increases and the chances for proper judgment diminish, as research into crisis management shows (interview Quillinan 2009).

When designing systems, the mapping between human values (what behaviour people would like) and technical values (what the technical constraints are) is very complex. Dynamics of self-organization and adaptation deeply influence how technology is used. However, systems remain dependent on hardware infrastructure, which is located in specific places that are subject to specific systems of law. In the global network environment, which surpasses national boundaries, clashes arise over issues of privacy, data retention, control and monitoring. Values like autonomy, transparency, traceability, security and privacy define the design and social impact of distributed systems. The values of systems are understood locally and are defined by the political and economic cultures within which they function (interview van Splunter 2009).
