The Conversation

  • Written by Ekaterina Muravevskaia, Assistant Professor of Human-Centered Computing, Indiana University
Image: Technology can be isolating, but it can also help kids learn emotional connection. Dusan Stankovic/E+ via Getty Images

Empathy is not just a “nice-to-have” soft skill – it is a foundation of how children and adults regulate emotions, build friendships and learn from one another.

Between the ages of 6 and 9, children begin shifting from being self-centered to noticing the emotions and perspectives of others. This makes early childhood one of the most important periods for developing empathy and other social-emotional skills.

Traditionally, pretend play has been a natural way to practice empathy. Many adults can remember acting out scenes as doctor and patient, or using sticks and leaves as imaginary currency. Those playful moments were not just entertainment – they were early lessons in empathy and taking someone else’s perspective.

But as children spend more time with technology and less in pretend play, these opportunities are shrinking. Some educators worry that technology is hindering social-emotional learning. Yet research in affective computing – digital systems that recognize emotions, simulate them or both – suggests that technology can also become part of the solution.

Virtual reality, in particular, can create immersive environments where children interact with characters who display emotions as vividly as real humans. I’m a human-computer interaction scientist who studies social-emotional learning in the context of how people use technology. Used thoughtfully, the combination of VR and artificial intelligence could help reshape social-emotional learning practices and serve as a new kind of “empathy classroom” or “emotional regulation simulator.”

Game of emotions

As part of my doctoral studies at the University of Florida, in 2017 I began developing a VR Empathy Game framework that combines insights from developmental psychology, affective computing and participatory design with children. At the Human-Computer Interaction Lab at the University of Maryland, I worked with their KidsTeam program, in which children ages 7 to 11 served as design partners, helping us imagine what an empathy-focused VR game should feel like.

In 2018, 15 master’s students at the Florida Interactive Entertainment Academy at the University of Central Florida and I created the first game prototype, Why Did Baba Yaga Take My Brother? This game is based on a Russian folktale and introduces four characters, each representing a core emotion: Baba Yaga embodies anger, Goose represents fear, the Older Sister shows happiness and the Younger Sister expresses sadness.

The VR game Why Did Baba Yaga Take My Brother? is designed to help kids develop empathy.

Unlike most games, it does not reward players with points or badges. Instead, children can progress in the game only by getting to know the characters, listening to their stories and practicing empathic actions. For example, they can look at the game’s world through a character’s glasses, revisit their memories or even hug Baba Yaga to comfort her. This design choice reflects a core idea of social-emotional learning: Empathy is not about external rewards but about pausing, reflecting and responding to the needs of others.

My colleagues and I have been refining the game since then and using it to study children and empathy.

Different paths to empathy

We tested the game with elementary school children individually. After asking general questions and giving an empathy survey, we invited children to play the game. We observed their behavior while they were playing and discussed their experience afterward.

Our most important discovery was that children interacted with the VR characters following the same empathic patterns people typically show when interacting with one another. Some children displayed cognitive empathy, meaning they understood the characters’ emotional states. They listened thoughtfully to the characters, tapped their shoulders to get their attention and attempted to help them. At the same time, they were not completely absorbed in the VR characters’ feelings.

Image: Characters in the researchers’ VR game express a range of emotions. Ekaterina Muravevskaia

Others expressed emotional contagion, directly mirroring the characters’ emotions, sometimes becoming so distressed by fear or sadness that they stopped the game. A few children did not connect with the characters at all, focusing mainly on exploring the virtual environment. All three behaviors occur in real life as well when children interact with their peers.

These findings highlight both the promise and the challenge. VR can indeed evoke powerful empathic responses, but it also raises questions about how to design experiences that support children with different temperaments – some need more stimulation, and others need gentler pacing.

AI eye on emotions

The current big question for us is how to effectively incorporate this type of empathy game into everyday life. In classrooms, VR will not replace real conversations or traditional role-play, but it can enrich them. A teacher might use a short VR scenario to spark discussion, encouraging students to reflect on what they felt and how it connects to their real friendships. In this way, VR becomes a springboard for dialogue, not a stand-alone tool.

We are also exploring adaptive VR systems that respond to a child’s emotional state in real time. A headset might detect if a child is anxious or scared – through facial expressions, heart rate or gaze – and adjust the experience by scaling down the characters’ expressiveness or offering supportive prompts. Such a responsive “empathy classroom” could give children safe opportunities to gradually strengthen their emotional regulation skills.

This is where AI becomes essential. AI systems can make sense of the data VR headsets collect, such as eye gaze, facial expressions, heart rate and body movement, and use it to adjust the experience in real time. For example, if a child looks anxious or avoids eye contact with a sad character, the AI could gently slow the story, provide encouraging prompts or reduce the emotional intensity of the scene. If the child instead appears calm and engaged, the AI might introduce a more complex scenario to deepen their learning.
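One way to picture this kind of real-time adjustment is as a simple rule-based loop: the system reads inferred signals and nudges the scene’s emotional intensity up or down. The sketch below is purely illustrative; the signal names, thresholds and prompts are assumptions for the sake of the example, not a description of the researchers’ actual system.

```python
from dataclasses import dataclass

@dataclass
class ChildState:
    anxiety: float     # 0-1, e.g., inferred from facial expression and heart rate
    engagement: float  # 0-1, e.g., inferred from gaze directed at the character

def adapt_scene(state: ChildState, intensity: float) -> tuple[float, str]:
    """Return an adjusted emotional-intensity level and an optional prompt."""
    if state.anxiety > 0.7:
        # Child seems distressed: scale down expressiveness and reassure them.
        return max(0.2, intensity - 0.3), "It's okay. She just feels upset right now."
    if state.engagement > 0.8 and state.anxiety < 0.3:
        # Child is calm and engaged: deepen the scenario slightly.
        return min(1.0, intensity + 0.2), "What do you think she needs right now?"
    # Otherwise, leave the scene unchanged.
    return intensity, ""
```

A real system would replace these hand-tuned thresholds with learned models of each child’s responses, but the control structure, sense a state, then adjust intensity and prompts, is the core idea.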

In our current research, we are investigating how AI can measure empathy itself – tracking moment-to-moment emotional responses during gameplay to provide educators with better insight into how empathy develops.

Future work and collaboration

As promising as I believe this work is, it raises big questions. Should VR characters express emotions at full intensity, or should we tone them down for sensitive children? If children treat VR characters as real, how do we make sure those lessons carry to the playground or dinner table? And with headsets still costly, how do we ensure empathy technology doesn’t widen digital divides?

These are not just research puzzles but ethical responsibilities. This vision requires collaboration among educators, researchers, designers, parents and children themselves. Computer scientists design the technology, psychologists ensure the experiences are emotionally healthy, teachers adapt them for curriculum, and children co-create the games to make them engaging and meaningful.

Together, we can shape technologies that not only entertain but also nurture empathy, emotional regulation and deeper connection in the next generation.

Ekaterina Muravevskaia does not work for, consult, own shares in or receive funding from any company or organization that would benefit from this article, and has disclosed no relevant affiliations beyond their academic appointment.

Author: Ekaterina Muravevskaia, Assistant Professor of Human-Centered Computing, Indiana University

Read more https://theconversation.com/how-vr-and-ai-could-help-the-next-generation-grow-kinder-and-more-connected-263181