“Gaming has always been on our radar,” says Affectiva’s co-founder.
Affectiva, a Waltham-based startup born out of the MIT Media Lab, is about to release software powerful enough to deduce a range of emotions by scanning your face, yet light enough to run off a regular webcam or inside a mobile app.
By analyzing your facial expressions in real time, this emotion-sensing tech can pinpoint your feelings (from simple smiles and frowns to subtler clues of anxiety or sadness), track them over time, and feed them into a game’s logic.
In practice, this software could enable you to build, among many other things:
- Video game AI that proactively reacts to the player’s feelings for maximum impact. This of course benefits horror/action/shooter games the most (remember the creepy alien from Alien: Isolation? Imagine if it could smell your fear for real).
- Games that smooth out their gameplay or warn you when they detect frustration or fatigue (EarthBound reminded players to turn off the console after a certain amount of play, and mechanics like New Super Mario Bros.’ player bubble and auto-play show that anti-stress features are welcome).
- Gameplay that directly integrates your emotions (think the emotion buttons in Super Princess Peach, or, for a more refined approach, imagine if Geometry Dash and other rhythm-game stages and music could change to better fit – or confront – your emotions).
- Stories that change outright according to how you feel about the decisions you are making (imagine how much story-heavy games like The Walking Dead would change if NPCs could tell you are lying and act accordingly).
- Difficulty levels that adjust to YOU rather than to pre-set rules and numbers the devs have to figure out. “Intense” modes and the like could simply ramp up until they detect you are at your limit, then settle over time thanks to the statistics they keep about you.
- Multiplayer experiences where you receive feedback on the other players’ emotions. This one pushes things a bit far – competitive players take their emotion reads very seriously – but it would be a nice substitute for face-to-face interaction when playing online (for competitive gaming, mind you), and your “poker face” techniques would still shine. It could put online gaming on par with tabletop gaming in the bluffing/reading department.
- Emotion scanning in playtesting sessions or eSports streams. The former would help developers greatly (verbal reports can get a bit skewed, but your face won’t lie unless you try – and are good at it), while the latter could both gauge the involvement of the audience and reveal the mental state of the players in high-stakes matches.
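The difficulty-adjustment idea above can be sketched in a few lines. This is a purely hypothetical illustration – the per-frame `stress` score stands in for whatever value an emotion-sensing SDK would report, and none of these names come from Affectiva’s actual API:

```python
# Hypothetical sketch: an emotion-driven difficulty controller.
# A stress reading in [0.0, 1.0] arrives every frame; the controller
# ramps difficulty up while the player is comfortable and eases off
# once the smoothed reading passes their limit.

class EmotionDifficulty:
    def __init__(self, target_stress=0.6, step=0.05):
        self.target = target_stress   # the "at your limit" sweet spot
        self.step = step              # how fast difficulty shifts
        self.difficulty = 0.5         # 0.0 = easiest, 1.0 = hardest
        self.smoothed = 0.0           # moving average of stress readings

    def update(self, stress_reading):
        # Smooth noisy per-frame readings before acting on them.
        self.smoothed = 0.9 * self.smoothed + 0.1 * stress_reading
        # Below target: push harder. Above target: back off.
        if self.smoothed < self.target:
            self.difficulty = min(1.0, self.difficulty + self.step)
        else:
            self.difficulty = max(0.0, self.difficulty - self.step)
        return self.difficulty
```

The smoothing step is the important design choice: raw frame-by-frame readings are jittery, so acting on an average keeps the difficulty from whiplashing every time the player blinks.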
I’m just scratching the surface here, of course.
When Affectiva started organizing its upcoming hackathon for local coders to test its facial-reading technology, the campaign was designed to attract a gender-balanced audience (and it succeeded: half of the attendees will be women). What no one saw coming, though, was that partnering with video game developers was part of the plan all along.
According to Rana el Kaliouby, Affectiva’s co-founder, “games tend to bring out very strong emotions, so we decided that as a company we would enter that in a big way”.
Nevermind, a first-person psychological thriller developed by Flying Mollusk, will be the first game to showcase this technology – the more frightened you are, the harder the game becomes. This works in tandem with the game’s already-implemented heart rate monitor, another way tech is incorporating emotions into gaming.
The Nevermind update will only need your machine’s webcam to track your facial expressions, presenting harder challenges the more stressed out you are.
The game’s goal, though, is to force you to keep calm as you work through its seemingly draconian tasks – after all, the only way to complete the game is to control yourself. “It’s a stress management tool disguised as a game,” says Erin Reynolds, Flying Mollusk’s founder. By reflecting and rewarding your ability to control negative emotions, Reynolds hopes mastering the game will carry over to the real world – say, when you are stuck in heavy traffic or heading into a stressful meeting.
Of course, this facial recognition technology is already being licensed for all kinds of purposes, from marketing and advertising to helping robotic partners and AIs interact with people (e.g., nurse robots taking better care of patients, or smart homes dynamically adjusting lighting, music volume and the like).
But gaming is a very important aspect for Affectiva, and as such, they have released a free API and a Unity SDK so developers can start working with the technology ASAP. The good thing is, since it requires no hardware beyond a regular webcam or phone camera, the only limit is the developer’s imagination in how to use the data, and the company encourages pairing it with other “biofeedback” tech like Nest’s thermostat, Amazon’s Echo speaker and Pavlok’s shock wristband (currently used to curb bad habits).
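To give a feel for what pairing signals might look like, here is a minimal sketch of fusing a webcam-derived stress score with a heart-rate reading into one arousal estimate. Every name, weight and threshold here is an illustrative assumption, not any vendor’s API:

```python
# Hypothetical sketch: blending two biofeedback signals into a single
# 0.0-1.0 arousal score a game could act on. Weights are illustrative.

def arousal(face_stress, heart_rate_bpm, resting_bpm=65.0, max_bpm=180.0):
    """Blend a facial stress score with a normalized heart rate."""
    # Normalize heart rate against the player's resting baseline.
    hr_norm = (heart_rate_bpm - resting_bpm) / (max_bpm - resting_bpm)
    hr_norm = min(1.0, max(0.0, hr_norm))
    # Weight the face reading higher: expressions react faster than pulse.
    return 0.6 * face_stress + 0.4 * hr_norm
```

A game could feed this combined score into any of the ideas above – difficulty, story branching, or AI behavior – instead of relying on a single sensor.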
Throw Kinect and VR into the mix (with all their unusual gadgets), and those hyper-realistic virtual universes from Tron or The Matrix stop looking like “could happen” and start looking like “the cards are already dealt.”
By the way, sign-ups for the hackathon are still open – the event is March 4th, and Affectiva really wants you to take emotion gaming to its fullest.
Courtesy of BetaBoston.

Tags: Affectiva, AI, Artificial Intelligence, Biofeedback, Emotion gaming, Unity