TAMED is a University of Malta project developing ‘affective computing’ for video games: technology that would enable games to interpret human emotions. Jonathan Firbank goes into detail with Dr Konstantinos Makantasis.
2004 was a landmark year for the gaming industry. The most revolutionary release was Half-Life 2, Valve’s flagship title. A great deal of its technological innovation was in service of making characters that players could empathise with. Valve wanted to ‘broaden the emotional palette in games’, allowing ‘players to see the characters as real people’. In-game characters were given accurate facial muscles for making expressions, informed by the work of psychologist Dr Paul Ekman. Hundreds of pages of dialogue were honed for years, then given to an experienced cast of Hollywood actors.
The work paid off, and Half-Life 2 set a standard that has been improved upon since. The universally acclaimed Last of Us and Red Dead Redemption series applied the same principles to far more powerful game engines and hardware, cementing themselves as classics largely thanks to casts of characters that players empathised with. Over and over, new standards have been set for visual assets, virtual physics, and storytelling as the industry raced to compete with live-action media. But merely improving existing methods has diminishing returns; the gaming industry is overdue for a technological leap of Half-Life 2’s magnitude. And the next leap might not make you empathise with characters. It might make characters empathise with you.
TAMED stands for ‘Tensor-based Machine learning towards genEral moDels of affect’. At the University of Malta’s (UM) Institute of Digital Games, the project is developing ‘affective computing’: in essence, artificial psychology technology that reads a user’s emotions. Video games are uniquely suited for researching artificial psychology and uniquely positioned to benefit from it.
Dr Konstantinos Makantasis, Post-Doctoral Researcher at UM and key researcher at TAMED, elaborates: ‘Models of affect refers to machine learning models capable of predicting humans’ emotional states when they are performing a task, such as playing a game, watching a movie, listening to music.’ As Makantasis explains, research began in the 1990s, but the models developed then weren’t powerful enough. ‘Emotions are subjective, while computers are deterministic and operate in a fully objective manner.’ Prior models suffered from a common issue in machine learning: the ‘generalisation problem’. ‘Models can accurately predict the emotional state of users whose data has been processed and used for building the models but fail to predict the emotional state of other users.’ This issue has plagued the wider AI space, with limited datasets resulting in biased systems.
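To make the generalisation problem concrete, here is a minimal sketch (in Python, with invented data and names; this is not TAMED’s actual pipeline) of the evaluation discipline it demands: a model of affect must be tested on users it has never seen, so each user’s data is held out in turn.

```python
# Hypothetical sketch of leave-one-user-out evaluation: the held-out
# user's data never appears in training, so the score reflects how well
# a model generalises to new people, not how well it memorised old ones.

def leave_one_user_out(samples):
    """Yield (held_out_user, train, test) splits, one per user."""
    users = sorted({s["user"] for s in samples})
    for held_out in users:
        train = [s for s in samples if s["user"] != held_out]
        test = [s for s in samples if s["user"] == held_out]
        yield held_out, train, test

# Toy dataset: per-sample heart rate and a binary "frustrated" label.
samples = [
    {"user": "a", "heart_rate": 88, "frustrated": 1},
    {"user": "a", "heart_rate": 70, "frustrated": 0},
    {"user": "b", "heart_rate": 95, "frustrated": 1},
    {"user": "b", "heart_rate": 72, "frustrated": 0},
]

for user, train, test in leave_one_user_out(samples):
    # no leakage: the test user's data is absent from the training set
    assert all(s["user"] != user for s in train)
```

A model that scores well on a random shuffle but poorly under this split is exactly the failure mode Makantasis describes.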
‘The TAMED project tries to address this problem by introducing novel methodologies that cover the whole chain of affect modelling: models’ input, models’ output, and models’ form.’ Here, the ‘input’ data includes but is not limited to heart rate, facial expressions, and utterances, which can be captured with sensors as well as a device’s camera and microphone. Another source of data comes from how a player interacts with a game. Gamers might be familiar with ‘going on tilt’, where frustration impacts how they play. This, like any other in-game emotional response, is an important source of data. Instead of simply cataloguing these seemingly endless variables, TAMED takes a holistic approach. ‘We believe that all these modalities are not independent. On the contrary, they should be combined together into unified, complex but compact data representations revealing all potentially useful information for predicting users’ emotions.’
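As a rough illustration of what a ‘unified, complex but compact’ representation might look like, the sketch below (Python with NumPy; the signals, shapes, and normalisation are invented for illustration, not TAMED’s method) stacks three time-aligned modalities into a single tensor:

```python
import numpy as np

# Hypothetical sketch: three modalities recorded over the same time
# windows, fused into one tensor instead of being modelled separately.
rng = np.random.default_rng(0)
n_windows, n_steps = 4, 10

heart_rate = rng.normal(75, 5, (n_windows, n_steps))     # beats per minute
face_valence = rng.uniform(-1, 1, (n_windows, n_steps))  # from facial expression
inputs_per_s = rng.uniform(0, 6, (n_windows, n_steps))   # controller activity

def zscore(x):
    # put every modality on a shared, comparable scale
    return (x - x.mean()) / x.std()

# One unified representation: (windows, time-steps, modalities).
fused = np.stack(
    [zscore(heart_rate), zscore(face_valence), zscore(inputs_per_s)],
    axis=-1,
)
```

A downstream model then sees all modalities at once, so correlations between, say, a spike in heart rate and a burst of controller input are preserved rather than discarded.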
TAMED’s solution to emotions’ subjectivity is an elegant one. ‘Instead of predicting the intensity of emotions, our models predict changes in emotions’ intensity.’ Rather than attempting what Makantasis calls ‘absolute predictions’, TAMED seeks to make ‘relative predictions’. If a user responded positively to one part of the game and negatively to another, TAMED could compare their responses and use machine learning to build a profile of that user: ‘for example, the heart rate of different users may vary, but the temporal changes in their heart rate may follow similar patterns.’ That data could then be used to personalise the user’s experience.
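The shift from absolute to relative predictions can be sketched in a few lines. The function below (hypothetical, not TAMED’s code) converts absolute intensity annotations into ordinal labels that only record whether an emotion is rising, stable, or falling:

```python
# Hypothetical sketch: turning absolute emotion-intensity annotations
# into relative "change" labels, so models learn patterns rather than
# user-specific levels.

def relative_labels(intensities, eps=0.0):
    """Label each consecutive pair: +1 rising, -1 falling, 0 stable."""
    labels = []
    for prev, cur in zip(intensities, intensities[1:]):
        delta = cur - prev
        labels.append(1 if delta > eps else -1 if delta < -eps else 0)
    return labels

relative_labels([0.2, 0.5, 0.5, 0.3])  # → [1, 0, -1]
```

Two users whose heart rates sit at very different baselines can still produce the same sequence of change labels, which is exactly the pattern-over-level comparison Makantasis describes.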
Video games are uniquely placed to collect emotional data. They can be extremely fluid in genre and tone. The aforementioned Half-Life 2 takes players through bombastic action sequences, dystopian tension, outright horror, and moments of calm and levity with relatable characters. Makantasis describes video games as ‘eliciting a plethora of emotions’. TAMED is positioned to take full advantage of this because the project is partnered with Massive Entertainment. This developer, under Ubisoft’s umbrella, has been releasing progressively more ambitious titles for years and is currently developing games under the Star Wars and Avatar licences, a decade after successful releases in the Far Cry and Assassin’s Creed franchises. This allows TAMED’s technology to be paired with complex commercial games in addition to simpler, in-house testbeds. It also hints at how leading developers see the future of gaming.
The most vital (and unique) aspect of video games for TAMED is their interactivity. Interpreting user interactions is a novel source of data. Makantasis explains, ‘the content of an interactive task has some very important properties.’ Unlike other data capture methods that TAMED utilises, ‘capturing this kind of information does not require specialised sensors and is not an invasive task. This information leads to very accurate models of affect, since the emotional state of a user is embedded in the content of interaction. And most importantly, this kind of information does not consist of users’ direct measurements, such as facial or physiology data.’ Interpreting interactivity does not require users to sacrifice privacy, a virtue that seems increasingly rare in machine learning spaces.
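A toy example of sensor-free interaction data: the sketch below (Python; the event names and feature set are invented for illustration) turns a gameplay event log into behavioural signals without touching a camera, microphone, or wearable:

```python
# Hypothetical sketch: summarising a play session's event log into
# behavioural features a model of affect could consume, with no
# physiological or facial measurements involved.

def interaction_features(events, duration_s):
    """Reduce an event log to a few per-session behavioural signals."""
    deaths = sum(1 for e in events if e["type"] == "death")
    pauses = sum(1 for e in events if e["type"] == "pause")
    return {
        "actions_per_sec": len(events) / duration_s,
        "deaths_per_min": 60.0 * deaths / duration_s,
        "pause_count": pauses,
    }

log = [
    {"t": 2.0, "type": "jump"},
    {"t": 5.5, "type": "death"},
    {"t": 6.0, "type": "pause"},
    {"t": 9.0, "type": "jump"},
]
features = interaction_features(log, duration_s=60.0)
```

A sudden rise in deaths per minute followed by a pause might, for instance, be one behavioural trace of ‘going on tilt’.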
‘While modelling players’ behaviour is a well-studied problem, building players’ emotional profiles has not been given much attention.’ Out of all the data capture methods that TAMED incorporates, this is the most useful and the most revolutionary. Makantasis believes this approach could be crucial for the sake of making the best user experience possible, which in turn generates consumer loyalty. TAMED’s vision is ‘games that, in a dynamic and automatic manner, adapt to players’ moods.’
It’s clear the applications of this technology extend far beyond gaming. The term ‘Artificial Psychology’ actually predates the first commercial video game: it was coined by Dan Curtis in 1963 for what was then a theoretical academic discipline. As more and more AI systems process our data, the theoretical is becoming actual. TAMED accelerates this by proposing a novel way of acquiring subjective emotional data that doesn’t depend on vast datasets, which means less computational power is needed and fewer ‘generalisation problems’ are created. Curtis’s vision of an AI that works autonomously with new, abstract, or incomplete data is manifest in this concept. As Makantasis states, ‘Artificial intelligence is ubiquitous in our everyday life. Mobile applications suggest movies to see, songs to listen to, they help us to find the best route to go to our work, and so on. Equipping these applications with emotional intelligence is the next big thing. Recommendations based not only on users’ historical data but also on their current mood will be more accurate and more to the point.’
The most glaring use for emotionally intelligent AI is in the field of mental health, which closes the loop that Half Life 2 began. Dr Paul Ekman’s facial expression expertise, which was so crucial to the game’s development, was originally intended for treating mental disorders. TAMED’s empathic AI, on the other hand, is intended for video games. It could one day be a major tool in the field of mental health. ‘Healthcare solutions can benefit from emotion modelling to automatically assess the mental health of patients,’ continues Makantasis. ‘Understanding emotional states is the first crucial step for assessing and ensuring mental well-being. This can be done through games, by developing games that adapt to users’ mood and “stir” users’ mood in a specific direction. But this can also be done almost everywhere else as well — in workplaces, in medical centres, hospitals, university, and school classes.’
The use cases for empathic AI extend further still. It could herald a new age for the marketing industry. It could realise science fiction’s visions of empathic operating systems. It could also reinforce the abuses of modern surveillance states (which Makantasis believes can be prevented with robust regulation and public awareness). It’s fitting that a technology with so much utility was born from a multi-disciplinary space. Makantasis explains, ‘affective computing lies at the intersection of computer science, artificial intelligence, and psychology. Computer science skills aren’t enough to develop artificial intelligence agents for predicting emotions. I have to understand and investigate the emotion manifestation mechanisms from a psychology point of view. This multidisciplinarity is what excites me most.’ In the same vein, TAMED’s results are ‘not the effort of an individual, but the final outcome of a synergy. I would like to thank Prof. Georgios Yannakakis and all the people from the Institute of Digital Games and Massive Entertainment who were involved in one way or another with the TAMED project.’