How automated agents treat humans and other automated agents in situations of inequity: An experimental study

Ron Katz, Sarit Kraus

Research output: Contribution to journal › Article › peer-review

Abstract

This paper explores the question of how agent designers perceive and treat their agents' opponents. In particular, it examines the influence of the opponent's identity (human vs. automated agent) in negotiations. We empirically demonstrate that when people interact spontaneously, they treat human opponents differently than automated agents with respect to equity and fairness considerations. However, these differences vanish when people design and implement agents to interact on their behalf. Nevertheless, the agents' commitment to honoring agreements with people is higher than their commitment to other agents. In the experiments, which comprised 147 computer science students, we used the Colored Trails game as the negotiation environment. We suggest possible explanations for the relationships among online players, agent designers, human opponents, and automated opponents. Copyright © 2008, International Foundation for Autonomous Agents and Multiagent Systems (www.ifaamas.org). All rights reserved.
