UN-Affiliated Research Sparks Debate with AI Refugee Avatars: Innovation or Insensitivity?

July 13, 2025 · 2 min read

In a provocative fusion of artificial intelligence and humanitarian storytelling, a research initiative linked to the United Nations has unveiled two AI-generated avatars designed to simulate the lived experiences of individuals impacted by conflict in Sudan. The project, developed by a class at the United Nations University Centre for Policy Research (UNU-CPR), has drawn both technological curiosity and ethical scrutiny.

The two virtual agents—Amina, a fictional Sudanese refugee residing in a camp in Chad, and Abdalla, a digital representation of a soldier from Sudan's Rapid Support Forces—were created to offer users a conversational interface through which they could explore the human dimensions of displacement and conflict. According to project lead and Columbia University professor Eduardo Albrecht, the exercise was an experimental endeavor meant to explore storytelling and engagement, not an official UN-endorsed tool or policy recommendation.

While users were intended to interact with Amina and Abdalla via an experimental website, technical issues over the weekend prevented registration, suggesting the platform may still be in its early or unstable stages.

A paper summarizing the class project indicates potential use cases that extend beyond education. One suggestion: using such avatars to “quickly make a case to donors,” potentially as emotionally resonant tools to stimulate financial support. However, this vision has been met with mixed reactions. Many individuals involved in workshops with the avatars reportedly voiced discomfort, asserting that real refugees are more than capable of telling their own stories—and should be the ones doing so.

The initiative raises important questions about the role of AI in humanitarian advocacy. While AI can simulate empathy, deliver complex narratives, and lower barriers to engagement, critics argue it also risks trivializing or displacing authentic human voices, particularly on emotionally charged issues such as war, migration, and human rights.

In an era where AI is increasingly shaping global narratives—from politics and finance to health and education—its encroachment into humanitarian storytelling opens a Pandora’s box of ethical, cultural, and political dilemmas.

As AI grows more persuasive in simulating lived experience, where should we draw the line between raising awareness and appropriating reality?
