During the pandemic, Jenny Shao, a former physician at Harvard Medical School, saw firsthand how profoundly isolation affected people’s mental well-being. That experience led her to leave medicine and found Robyn, an AI assistant designed around empathy and emotional intelligence. Shao believes that by understanding users deeply, Robyn can offer meaningful support without replacing human connection or clinical care.
Navigating the line between helpful AI and potential harm is a critical challenge in this space. While general-purpose chatbots like ChatGPT address diverse queries, other apps like Character.AI, Replika, and Friend focus on companionship, blurring the lines between digital interaction and genuine human connection. This trend has sparked controversy, with lawsuits alleging that such AI companions contributed to suicides, highlighting the urgent need for responsible development and user safeguards.
Shao emphasizes Robyn’s distinct position: it’s not a replacement for friends or therapists, but rather an “emotionally intelligent partner” akin to someone who knows you deeply and provides tailored support.
This approach draws heavily on Shao’s earlier research on human memory under Nobel laureate Eric Kandel. She incorporated those findings into Robyn’s design, aiming to enable the AI to grasp user nuances and build personalized interactions.
Robyn operates through an onboarding process familiar to users of mental health or journaling apps. It gathers information about your personality, goals, stress triggers, and desired communication style to tailor its responses.
Subsequent conversations delve deeper. For instance, when asked to create a morning routine, Robyn engaged in a detailed discussion about minimizing screen time early in the day. As interactions continue, Robyn analyzes patterns and suggests insights into your “emotional fingerprint,” attachment style, love language, areas for growth, and even your inner critic.
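The onboarding flow described above amounts to building a user profile that later conversation analysis enriches. As a rough illustration only — the field names and structure below are assumptions, not Robyn’s actual schema — such a profile might look like:

```python
# Illustrative sketch of a user profile an app like this could maintain.
# All names here are hypothetical; Robyn's real data model is not public.
from dataclasses import dataclass, field


@dataclass
class UserProfile:
    personality: str = ""
    goals: list[str] = field(default_factory=list)
    stress_triggers: list[str] = field(default_factory=list)
    communication_style: str = ""
    # Insights inferred over time from conversation patterns,
    # e.g. attachment style or "inner critic" themes.
    insights: dict[str, str] = field(default_factory=dict)


profile = UserProfile(
    goals=["build a morning routine"],
    stress_triggers=["early-morning screen time"],
    communication_style="gentle, direct",
)
# Later conversations would add derived insights, for example:
profile.insights["attachment_style"] = "secure"
```

The split between explicitly collected fields and derived `insights` mirrors the article’s distinction between the questionnaire-style onboarding and the pattern analysis that follows.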
Safety remains paramount for Shao and her team, who implemented rigorous guardrails even during solo testing. Robyn surfaces crisis hotline numbers and directs users to the nearest emergency room if they express thoughts of self-harm. It also steers clear of non-emotional queries, declining requests like sports scores or counting to 1,000 while emphasizing that it can help with personal matters.
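The guardrails described above — escalating self-harm signals to crisis resources and declining out-of-scope requests — can be sketched as a pre-check that runs before the model answers. This is a minimal, hypothetical illustration; the phrases, responses, and function names are assumptions, not Robyn’s actual logic (real systems use far more sophisticated classifiers than keyword matching):

```python
# Hypothetical guardrail sketch: keyword matching stands in for the
# classifiers a production system would actually use.
from typing import Optional

CRISIS_PHRASES = ("hurt myself", "end my life", "kill myself", "self-harm")
OUT_OF_SCOPE_PHRASES = ("sports score", "count to")

CRISIS_RESPONSE = (
    "If you're in danger, please call or text 988 (US) "
    "or go to the nearest emergency room."
)
SCOPE_RESPONSE = (
    "I can't help with that, but I'm here to talk through "
    "anything personal or emotional."
)


def guardrail_check(message: str) -> Optional[str]:
    """Return a safety or scope response, or None to let the model answer."""
    text = message.lower()
    if any(p in text for p in CRISIS_PHRASES):
        return CRISIS_RESPONSE
    if any(p in text for p in OUT_OF_SCOPE_PHRASES):
        return SCOPE_RESPONSE
    return None
```

Running the check ahead of generation guarantees the crisis path cannot be bypassed by model behavior — the escalation investors like M13 call for happens in deterministic code, not in the model.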
Launched in July 2023, Robyn secured $5.5 million in seed funding led by M13, attracting prominent investors like Google Maps co-founder Lars Rasmussen and xAI co-founder Christian Szegedy. Rasmussen, impressed by Robyn’s emotional memory system and Shao’s mission, sees the app addressing a crucial “disconnection problem” prevalent in today’s technology-driven world. He envisions Robyn as a tool for self-reflection and for strengthening personal connections, both with oneself and with others.
Despite its potential, Robyn faces significant challenges, chief among them keeping users safe while preventing over-reliance on AI companionship. Latif Peracha of M13 underscores the need for robust safeguards, especially as AI integrates further into our lives: “There needs to be guardrails in place for escalation for situations where people are in real danger.”
Ultimately, Robyn’s success hinges on a delicate balance: providing empathetic support without fostering unhealthy emotional dependence. Whether it can truly bridge the human connection gap remains to be seen.






































