When design fails, the consequences can be enormous, including physical and emotional harm to people. And when technology design defies the legal system, organizational policies, or even informal cultural norms, it can also generate negative press, erode public trust, damage a company's reputation with consumers, and disappoint shareholders and clients. In other words, designers hold a lot of power, and design is a domain of great responsibility.
Robots occupy a culturally unique place in human-technology systems. For example, it is still relatively uncommon for someone to have real-life experience interacting with a robot, yet many people believe they have a common understanding of what a robot interaction might be like, basing their expectations of a robot's design, behavior, and abilities not on real experience but on science fiction cultural touchstones. These misaligned expectations can lead people to attribute intelligence and abilities to a robot beyond the scope of its true functionality, causing overreliance on, or disappointment in, a system. Additionally, robots, as mechanical tools, can be dangerous in their physicality, their functions, or the context in which they are used. Yet research also shows that in some situations people tend to treat robots in social ways: like a pet, another person, or an extension of Self. Furthermore, people sometimes develop affection for or emotional attachment to robots, which raises special considerations developers must contemplate throughout the design process.
In this talk, Ethics and Emerging Sciences Group Research Fellow Dr. Julie Carpenter will discuss specific ethical issues and challenges in robot design, which human factors are unique to robots, and how we can approach design with values and morals in mind.
Access the slide deck here: slideshare.net/secret/1DU7iLWtQuXEUb