AI manipulation will force us to rethink privacy

The gaming industry confronts an AI designed specifically to turn players into big spenders.


This is a classic example of modern AI: take easily available toolsets developed by one of the platform companies (Amazon, Google, Microsoft), apply them to a problem with abundant training data, and deploy them in an application that generates even more granular data, fast, vast, and in real time. Use the AI to predict user behavior, then use the AI to guide users into behaviors that make them even more predictable.

Gaming is a perfect fit, because the objective is monetization of users: a defined, measurable goal affected by multiple subtle behavioral factors that only an AI can capture, aggregate, understand, and respond to.

Henry Fong, the CEO of Yodo1, which owns the game, describes himself as “lazy,” which is why he likes to have AI do the work. In 2018 he decided to teach an AI to moderate a community of millions of users, find the potential whales, and then figure out how to get them to stay and spend even more. The AI looks for patterns in spending velocity, how much time players spend in the game, how many sessions they play, what guilds they're in, and what they are likely to buy. It then predicts what a player will do if offered certain paths or in-game bundles. The AI's accuracy after two weeks of training was 87%, and Fong thinks he can get it up to 95%.
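The core idea can be sketched as a simple logistic scoring model over engagement features. This is a toy illustration, not Yodo1's actual system: every feature name and weight below is invented for the example.

```python
import math

# Hypothetical feature weights -- illustrative only. The real model's
# features and parameters are not public; these are invented.
WEIGHTS = {
    "spend_velocity": 1.4,    # normalized dollars spent per day
    "hours_per_week": 0.6,    # normalized time in game
    "sessions_per_day": 0.5,  # session frequency
    "in_active_guild": 0.8,   # 1 if the player is in an active guild
}
BIAS = -3.0

def whale_score(player: dict) -> float:
    """Return a 0-1 score estimating how likely a player is to become
    a big spender, via a logistic function over weighted features."""
    z = BIAS + sum(WEIGHTS[k] * player.get(k, 0.0) for k in WEIGHTS)
    return 1.0 / (1.0 + math.exp(-z))

casual = {"spend_velocity": 0.1, "hours_per_week": 0.2,
          "sessions_per_day": 0.3, "in_active_guild": 0}
heavy = {"spend_velocity": 2.0, "hours_per_week": 1.5,
         "sessions_per_day": 2.0, "in_active_guild": 1}

print(f"casual: {whale_score(casual):.2f}")
print(f"heavy:  {whale_score(heavy):.2f}")
```

In practice the weights would be learned from behavioral data rather than hand-set, and the score would feed decisions about which offers to surface to which players, which is exactly where the ethical questions below begin.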

AI is great at finding things that are non-intuitive to humans, so, perhaps unsurprisingly, the AI found behavioral patterns that ran counter to expert intuition.

"The funny thing is, I always used to think that if you monetise your audience too hard, they'll leave the game. But it's actually the other way around. Once they start spending, they don't leave. They want to stay in the game and preserve their investment, and when they stay in the game, they spend more." - Fong

Also unsurprisingly, the revelation kicked off a debate about whether this is an ethical use of AI. On one side, people are free to spend whatever they want. On the other side, designers have a responsibility to users, from understanding whether players are in a position to spend that much money, to deciding whether it's acceptable to prey on addicts who are unable to control their impulses. As Handrahan pointed out in his op-ed, the designers completely neglected a third option: have people stay in the game without spending beyond what they can afford.

All of this raises a modern privacy question, one we should ask in the Age of AI. In a situation where a powerful intelligence knows more about a user’s future behavior than they do, what is the user’s right to have their autonomy preserved?
