Mo Gawdat Introduces Emma At SEF 2026, Urging Founders To Take Responsibility As AI Enters Emotional Life
As artificial intelligence moves beyond productivity tools and into emotionally sensitive domains, founders must take full responsibility for how these systems are designed, governed, and deployed. That was the message delivered by Mo Gawdat on the closing day of the ninth Sharjah Entrepreneurship Festival (SEF 2026), where he offered the first public look behind Emma, an AI platform designed to support human relationships, ahead of its launch on February 14.
A four-time bestselling author and Co-Founder of emma.love, Unstressable, and One Billion Happy, Gawdat used the platform to pull back the curtain on how Emma was built, why clear limits were embedded into its design, and what accountability looks like when technology engages with people at an emotional level.
Speaking on the festival’s Impact Platform during a session titled “The Future with Artificial Intelligence and Its Impact on Human Connection,” moderated by Sanad Yaghi, Co-Founder of DTEK.ai and emma.love, Gawdat delivered a measured assessment of both the opportunity and the risk involved in building emotionally responsive AI.
“Most people are worried about the future of humanity in the age of artificial intelligence. There is nothing inherently good or evil about AI itself. The danger comes from allowing powerful intelligence to serve greed and the pursuit of power. That will not serve humanity,” Gawdat said.
He explained that Emma was created to help users develop greater self-awareness and healthier relationships with others, while deliberately avoiding the role of a human replacement. Acknowledging the sensitivity of the space, Gawdat warned against systems that invite dependency or offer therapeutic guidance beyond what technology should provide.
Privacy, Gawdat emphasized, is foundational rather than optional. He noted that users often disclose more to Emma than they do to people close to them, precisely because the system feels non-judgmental.
“That makes protecting those conversations an ethical obligation,” he said. “No human has access to user conversations. Instead, improvements rely on internal monitoring tools where AI reviews AI, allowing flaws to be detected without exposing personal content.”
Challenging common assumptions about product design, Gawdat addressed how personality in AI systems is shaped. Rather than emerging organically, he said, it is the result of deliberate behavioral and linguistic constraints.
“Personality in AI comes from carefully designed behavioral and linguistic instructions that define how the system responds and stays within its limits,” he said. “Emma is intentionally designed to feel supportive without being mistaken for a human coach or therapist. The focus of the conversation must always remain on the user, not the machine.”
He contrasted the speed of today’s AI experimentation with earlier software eras, when testing and deployment took years and large engineering teams. That acceleration, he warned, raises the stakes for governance and ethics.
“When iteration becomes this cheap and this fast, governance and ethics matter more,” he said.
Turning to entrepreneurship, Gawdat urged founders not to delay AI adoption, but to approach it with clarity and restraint.
“If you are building a business today and not using AI, your competition will be faster, more efficient, and will eventually overtake you,” he said. “Don’t start with a polished interface. Start with something useful. Learn through iteration. And understand that making powerful tools widely available comes with responsibility.”
Gawdat closed by linking the discussion back to SEF 2026’s theme, “Where We Belong,” framing AI not as a substitute for human connection but as a tool that should strengthen it.
SEF 2026, organized by the Sharjah Entrepreneurship Center (Sheraa), was held from January 31 to February 1 at the Sharjah Research, Technology and Innovation Park (SPARK).