Sony Says Games Will Read Emotions in 10 Years

Sony is talking crazy, suggesting that games may be able to tell if you’re lying or depressed just ten years down the road. We’ll stick with growing crops, thanks.

Seriously, when do games stop being games and cross over into virtual reality? I asked Nvidia this very question months ago at ECGC 2011, and was told there will always be a market for the high-end PC gamer with a rig nearly the size of a bookcase. But putting visual realism aside, what will happen when games suddenly stop acting like games and become more like a self-aware super AI that could one day sing you happy birthday, or annihilate the human race?

According to Sony Worldwide Studios chief Shuhei Yoshida, platform holders will be able to offer “almost dangerous kinds of interactivity” with the player within the next ten years. Games will know more about the player as a whole, sensing how they could be feeling by reading more than just their movements. Titles will be so “immersive” that players will serve as actors, true participants within the virtual realm.

“As far as I’m concerned, the motion control of today is like the 8-bit phase of video games,” Yoshida said last week at a behind-closed-doors Gamescom panel debate. “There are so many limitations. Talking about sensors, the game will eventually know more about the player. Not just movement, but where you are looking and how you could be feeling. It’s really difficult to judge this, but I’d like to think that in ten years game developers will have access to player information in real-time. We can create some really… almost dangerous kinds of interactivity.”

Mick Hocking, a senior director at Sony Worldwide Studios, chimed in when asked if Sony was currently testing technologies relying on biometric data. Naturally he dodged answering the question directly by stating that Sony does lots of R&D in these areas.

“Having a camera being able to study a player’s biometrics and movements [is possible],” Hocking said. “So perhaps you can play a detective game that decides whether you’re lying due to what it reads from your face.”

“In ten years’ time I’d like to think we’ll be able to form a map of the player, combining other sorts of sensory data together, from facial expressions to heart rate,” he continued. “You can see how, over a period of time, you can form a map of the player and their emotional state, whether they’re sad or happy. Maybe people in their social network can comment on it. The more accurate that map can become, the more we can tailor it to the experience.”
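To make Hocking’s idea concrete, here is a toy sketch of what such an “emotional map” might look like: fusing two hypothetical sensor streams (a facial-expression valence score and heart rate) into a smoothed running estimate of the player’s state. Every signal name, range, and weight here is an illustrative assumption on my part, not anything Sony has described or any real API.

```python
from dataclasses import dataclass

@dataclass
class EmotionMap:
    """Toy running estimate of a player's emotional state."""
    valence: float = 0.0   # sad (-1.0) .. happy (+1.0)
    arousal: float = 0.0   # calm (0.0) .. excited (1.0)

    def update(self, face_valence: float, heart_rate_bpm: float,
               smoothing: float = 0.8) -> None:
        """Blend one frame of sensor readings into the map.

        Exponential smoothing keeps the map stable over time, so a
        single noisy reading doesn't flip the estimated state.
        """
        # Assumed mapping: roughly 60-120 bpm onto a 0..1 arousal scale.
        arousal_sample = min(max((heart_rate_bpm - 60) / 60, 0.0), 1.0)
        self.valence = smoothing * self.valence + (1 - smoothing) * face_valence
        self.arousal = smoothing * self.arousal + (1 - smoothing) * arousal_sample

# Feed in a few fake sensor frames: the player smiles more as their
# heart rate climbs, so both estimates drift upward.
emotion = EmotionMap()
for face, bpm in [(0.6, 95), (0.7, 100), (0.8, 110)]:
    emotion.update(face, bpm)
print(f"valence={emotion.valence:.2f}, arousal={emotion.arousal:.2f}")
```

The smoothing factor is the knob Hocking’s “over a period of time” phrasing hints at: the more history the map keeps, the more confidently a game could tailor the experience to it.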

Hocking seems to hope that AI in ten years’ time won’t still feel like “acting,” but will react more naturally, independent of scripts and pre-determined movements. “In Uncharted you can see games are getting closer to lifelike actor performances, but [no matter] how accurate they become as an acting performance, it’s still acting. Will we have AI that allows us to talk to and truly interact with a character? Will we be able to show the character objects it can recognize?”

Do gamers really need that kind of interaction? Again, when do games stop serving as games, and become more like virtual reality experiences? As long as the AI doesn’t start popping off family members in fear of being disconnected from the (home or space station) network, we should be good to go.

Written by: Kevin Parrish, Tom’s Guide
Posted by: Situated Research
