How automated agents treat humans and other automated agents in situations of inequity: an experimental study

Ron Katz, S. Kraus

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution › peer-review

Abstract

This paper explores the question of how agent designers perceive and treat their agent's opponents. In particular, it examines the influence of the opponent's identity (human vs. automated agent) in negotiations. We empirically demonstrate that when people interact spontaneously, they treat human opponents differently than automated agents with respect to equity and fairness considerations. However, these differences vanish when people design and implement agents that will interact on their behalf. Nevertheless, the agents' commitment to honoring agreements with people is higher than their commitment to other agents. In the experiments, which involved 147 computer science students, we used the Colored Trails game as the negotiation environment. We suggest possible explanations for the relationships among online players, agent designers, human opponents and automated opponents.
Original language: American English
Title of host publication: 7th International Joint Conference on Autonomous Agents and Multiagent Systems
Publisher: International Foundation for Autonomous Agents and Multiagent Systems
State: Published - 2008

Bibliographical note

Place of conference: Portugal
