Although interaction designers can strive for perfection and engineers endeavor to achieve complete reliability, there will always be situations where interactive systems produce incorrect outputs. In both human factors engineering and human-computer interaction research, there is consensus that it is not feasible to completely test all aspects of these increasingly complex systems. This is particularly true for finding user interaction issues that harm the overall user experience.
Our trust in interactive systems is not unlike the complexities we encounter in human relationships, particularly with new acquaintances: we can mistakenly place too much or too little trust in them before we get to know them better and learn their faults. Users need to calibrate their trust in similarly imperfect interactive systems, neither over-trusting an imperfect function nor mistrusting an accurate one; in other words, they need to trust an inaccurate system appropriately.
A person catching a bus may use a mobile phone application that provides schedule information; they can over-trust delay information and miss a bus that arrives early, or they can mistrust delay information and end up stuck in the rain waiting for the delayed bus. As interactive systems become more complex, users may accept system imperfections much as they accept imperfections in their relationships with people.
Much like in their human relationships, users expect their systems to show their vulnerabilities so that they can trust them appropriately. A system that shows its uncertainties up front will not breach that trust when it fails. Conversely, hiding these faults from the user to present an image of infallibility will surely destroy a trusting relationship when the system does fail. This is, without doubt, different from the design of earlier technologies, which aimed to promote trust by projecting strength and perfection. A toaster needed to show that it would work every time; there was no room for uncertainty. How different from a GPS system that we know will be uncertain and at times completely wrong.
An Inaccurate System that Works
It is not hard to find a GPS horror story; the device is so notorious for its mistakes that it has become easy fodder for cheap laughs on sitcoms. Whether it is wrong directions, an out-of-date map, or simply a lost signal, these inaccuracies can weigh heavily on the experiences of frustrated users. GPS was one of the inaccurate technologies we studied in a larger project on trust in technology, and its issues quickly became very apparent.
GPS can present the user with an inaccurate location, often the result of poor signal strength caused by bad weather or tall buildings. Although the position shown on the device is computed from satellite signals and cell towers, this information is not presented to users. The inaccuracy of the signal is a source of frustration to all users, but those with a technical understanding of GPS are less likely to over-trust inaccurate readings or to distrust accurate ones. All users have a better chance of appropriately trusting the GPS if the workings of the device are made transparent in the design of the interface. Transparency supports the user’s understanding of how the system works, thereby supporting their trust in it.
As a step in the right direction, some map applications do take the inaccuracy of the GPS signal into account. Instead of pinpointing an exact position, they show a larger circular area indicating where the user is likely to be. Although it is not exact, this circle admits the system’s possible inaccuracy, rather than showing a falsely precise pinpoint that appears to jump around. Users neither over-trust an inaccurate position nor mistrust a moving point. This use of ambiguity shows the system’s vulnerability up front, allowing the user to trust its capabilities appropriately.
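To make the pattern concrete, here is a minimal sketch in TypeScript using the standard browser Geolocation API, which reports an accuracy radius in meters alongside each position fix. The `MapLayer` interface and its `drawCircle` method are hypothetical stand-ins for whatever drawing primitive a particular mapping library provides.

```typescript
// Render the position as an uncertainty circle rather than an exact
// pinpoint. The accuracy radius comes straight from the platform.

interface MapLayer {
  // Draw a circle centered at (lat, lon) with a radius in meters.
  // Hypothetical: substitute your mapping library's primitive.
  drawCircle(lat: number, lon: number, radiusMeters: number): void;
}

function showPositionWithUncertainty(map: MapLayer): void {
  navigator.geolocation.watchPosition(
    (position) => {
      const { latitude, longitude, accuracy } = position.coords;
      // `accuracy` is the radius, in meters, within which the true
      // position is likely to lie. Drawing it admits the system's
      // possible inaccuracy instead of implying false precision.
      map.drawCircle(latitude, longitude, accuracy);
    },
    (err) => {
      // Surfacing the failure, rather than freezing the last pin,
      // is itself a form of transparency.
      console.warn(`Position unavailable: ${err.message}`);
    },
    { enableHighAccuracy: true }
  );
}
```

The key design choice is that the radius is never hidden or rounded away; the circle grows and shrinks with the platform’s own estimate of its uncertainty.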
Another aspect of GPS that can promote trust is the capability for users to use the system in their own way. The maps used by the system can be out of date or simply unaware of real traffic conditions, and the directions it gives cannot account for contextual information or the user’s own knowledge of the route. Well-designed GPS systems therefore change the route when a user takes a turn different from the suggested one, perhaps because a road is closed or a bridge is under construction. “GPS recalculating” is a way of allowing the device to be used in conjunction with the user’s knowledge and the immediate context of use. This function leaves the use of GPS open to interpretation, letting the user trust it as a fallible companion rather than as an authority. Designing a device for open interpretation allows users the freedom to use it as they see fit, promoting trust in its capabilities.
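A sketch of that recalculating behavior, again in TypeScript, might look like the following. The threshold value and the `requestRoute` function are hypothetical placeholders for a real routing service, and the distance calculation uses a simple equirectangular approximation, which is adequate for short deviation checks.

```typescript
// Treat the planned route as advice, not authority: if the user drifts
// too far from it, silently request a new route from where they are.

interface Point { lat: number; lon: number; }

// Approximate distance in meters between two nearby points.
function distanceMeters(a: Point, b: Point): number {
  const R = 6371000; // Earth radius in meters
  const toRad = (d: number) => (d * Math.PI) / 180;
  const x = toRad(b.lon - a.lon) * Math.cos(toRad((a.lat + b.lat) / 2));
  const y = toRad(b.lat - a.lat);
  return Math.sqrt(x * x + y * y) * R;
}

const DEVIATION_THRESHOLD_M = 50; // assumed tolerance before rerouting

function checkAndRecalculate(
  position: Point,
  route: Point[],
  requestRoute: (from: Point) => Point[] // hypothetical routing call
): Point[] {
  // How far is the user from the nearest point on the planned route?
  const offRouteBy = Math.min(
    ...route.map((p) => distanceMeters(position, p))
  );
  if (offRouteBy > DEVIATION_THRESHOLD_M) {
    // The user may know something the system does not (a closed road,
    // a better turn): defer to them and recalculate from their actual
    // position rather than insisting on the original plan.
    return requestRoute(position);
  }
  return route; // still on route; keep the current plan
}
```

Notice that the deviation is never treated as an error on the user’s part; the system simply adapts, which is what lets it be read as a fallible companion rather than an authority.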
The designer of any interactive system can take lessons from the unorthodox role model of GPS. This technology has many faults and inaccuracies, yet users still adopt and rely on it. The designers of the more successful GPS applications acknowledge the faults and, instead of hiding them, put them in full view of users through design qualities such as transparency, ambiguity, and open interpretation. These experiential qualities, used carefully, can promote user trust in imperfect systems.
The Human Example
If people don’t share their vulnerabilities, it is either because they can’t see their weaknesses or because they do not want to share them; either case breeds doubt in another person. If someone is to trust you, you need to be willing to show yourself as you are, without a façade to cover your weaknesses and imperfections. The same can be true of complex technology. Since it cannot be perfect, it should be designed to present its weaknesses and imperfections to users. A system that keeps its weaknesses and uncertainties in full view allows the user to place appropriate trust in it, leading to a trusting human-system relationship. If the system does fail, the failure will not cause a damaging breach of trust, much as in a trusting human-human relationship.
As trusting relationships with technologies become more human-like, new responsibilities arise. Designers should develop these systems to respect users in the same way that human relationships involve respect. By showing their vulnerabilities, neither systems nor people hide behind a façade, which is one component of a respectful relationship. When designers acknowledge the vulnerable aspects of an interactive system, users may come to respect it in turn. Through design qualities such as transparency, ambiguity, and open interpretation, system designers can show vulnerabilities, promote understanding, and give users the freedom to interpret the system themselves and use it as they see fit.