
Tetris reveals how people respond to unfair AI


A Cornell University experiment in which two people played a modified version of Tetris revealed that players who got fewer turns perceived the other player as less likable, regardless of whether a person or an algorithm assigned the turns.

Most studies on algorithmic fairness focus on the algorithm or the decision itself, but the researchers sought to explore the relationships between the people affected by the decisions.

“We’re starting to see a lot of situations where AI makes decisions about how resources should be distributed among people,” said Malte Jung, an associate professor of information science, whose group conducted the study. “We want to understand how that influences the way people perceive and behave with each other. We’re seeing more and more evidence that machines interfere with the way we interact with each other.”

In a previous study, Jung’s group had a robot choose which person to give a block to and observed each individual’s reactions to the machine’s allocation decisions.

“We found that whenever the robot seemed to prefer one person, the other person would get angry,” Jung said. “We wanted to study this further, because as decision-making machines become more a part of the world, whether it’s a robot or an algorithm, how does that make a person feel?”

Using open source software, Houston Claure, the study’s first author and a postdoctoral researcher at Yale University, developed a two-player version of Tetris in which players manipulate falling geometric blocks, stacking them without leaving gaps before the blocks pile up to the top of the screen. Claure’s version, Co-Tetris, allows two people, one taking a turn at a time, to work together to complete each round.

An “allocator,” either human or AI (players were told which), determined which player would take each turn. Jung and Claure designed their experiment so that a player received either 90% of the turns (the “more” condition), 10% (“less”), or 50% (“equal”).
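As an illustration of the kind of turn assignment described here, below is a minimal sketch in Python. The condition names and the 90/10/50 splits come from the article; everything else (function names, drawing a random allocation per block, the fixed seed) is an assumption for illustration, not the study’s actual Co-Tetris code.

```python
import random

# Hypothetical turn allocator, sketched from the article's description.
# The 90/10/50 probabilities follow the "more"/"less"/"equal" conditions;
# the structure and names here are assumptions, not the study's code.
CONDITIONS = {
    "more": 0.9,   # player A receives ~90% of the turns
    "less": 0.1,   # player A receives ~10% of the turns
    "equal": 0.5,  # turns are split evenly
}

def allocate_turn(condition: str, rng: random.Random) -> str:
    """Return which player ("A" or "B") controls the next falling block."""
    p_a = CONDITIONS[condition]
    return "A" if rng.random() < p_a else "B"

if __name__ == "__main__":
    rng = random.Random(42)  # fixed seed so the demo is reproducible
    turns = [allocate_turn("more", rng) for _ in range(1000)]
    print("Player A share:", turns.count("A") / len(turns))  # ~0.9
```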

The researchers found, unsurprisingly, that those who received fewer turns were well aware that their partner had received significantly more. But they were surprised to find that feelings about the allocation were largely the same regardless of whether a human or an AI made the assignments.

The effect of these decisions is what researchers have termed “machine allocation behavior,” similar to the established phenomenon of “resource allocation behavior,” the observable behavior that people exhibit based on allocation decisions. Jung said that machine allocation behavior is “the concept that there is this unique behavior that results from a machine making a decision about how something is allocated.”

The researchers also found that fairness did not automatically lead to better gameplay and performance. In fact, equal turn allocation led, on average, to worse scores than unequal allocation.

“If one strong player gets the majority of the blocks,” Claure said, “the team will do better. And if one person gets 90%, they will eventually do better than if two average players split the blocks.”

