We consider settings where owners of electric vehicles (EVs) participate in a market mechanism to charge their vehicles. Existing work on such mechanisms has typically assumed that participants are fully rational and can accurately report their preferences, via some interface, to the mechanism or to a software agent participating on their behalf. However, this may not be reasonable in settings with non-expert human end-users. Thus, our overarching aim in this paper is to determine experimentally whether a fully expressive market interface that enables accurate preference reports is suitable for the EV charging domain, or whether a simpler, restricted interface that reduces the space of possible options is preferable. In doing this, we measure the performance of an interface both in terms of how it helps participants maximise their utility and how it affects their deliberation time. Our secondary objective is to contrast two types of restricted interfaces that differ in how they restrict the space of preferences that can be reported. To enable this analysis, we develop a novel game that replicates key features of an abstract EV charging scenario. In two experiments with over 300 users, we show that restricting the users' preferences significantly reduces the time they spend deliberating (by up to half in some cases). An extensive usability survey confirms that this restriction is also associated with a lower perceived cognitive burden on the users. More surprisingly, using restricted interfaces at the same time increases the users' performance relative to the fully expressive interface (by up to 70%). We also show that some restricted interfaces have the desirable side effect of reducing their users' energy consumption by up to 20% while achieving the same utility as other interfaces.
Finally, we find that a reinforcement learning agent displays similar performance trends to human users, enabling a novel methodology for evaluating market interfaces.
Publisher Copyright: © 2017 AI Access Foundation. All rights reserved.