Author(s)
Source
Penn State Law Review, Vol. 114, No. 3, pg. 809, 2010
Summary
Anthropomorphic interfaces present both challenges to users' privacy and opportunities to enhance privacy protection.
Policy Relevance
Regulators should be wary of the chilling effects that anthropomorphic technology can have on speech and expression, and could consider warning consumers about the non-obvious discomfort and social inhibition such technology may trigger. Anthropomorphic interfaces might also be used to improve privacy notices by making people more careful about how they disclose personal information, a point the author elaborates at length in subsequent work.
Main Points
- Machines and interfaces are increasingly human-like in appearance and interaction, and they are being introduced into more contexts, including houses, cars, offices, mobile devices, and computers.
- In the near future, searches will be conducted through voice commands and will take the form of a conversation rather than a text query.
- Online marketing tools will be highly interactive and will mimic human interaction.
- Anthropomorphic machines will be more present in cars, in the home, and accompanying a person as a mobile device.
- People react to human-like machines as if they were other humans:
  - They feel observed and evaluated.
  - They present themselves in a more positive light.
  - They reveal less about themselves.
  - They give more.
  - They cheat less.
  - They are more anxious about their performance of tasks.
- Technology and design can harm privacy not only by collecting information about the user, but also by manipulating the user's experience.
- As anthropomorphic machines increasingly enter traditionally private spaces, people will suffer more privacy harms, chilling speech, self-expression, and curiosity, and the legal zone of privacy may shrink.
- Anthropomorphic interfaces could also be used to improve the privacy practices of websites by increasing users' sensitivity to disclosing information.