Are We Replacing Friends with AI? An Honest Conversation
With the rise of AI companion apps like Character.AI, Replika, and Friend.com, millions of people are forming deep emotional bonds with artificial intelligence—some spending 6-8 hours daily in conversation. While these technologies promise connection, they also raise critical questions about dependency, loneliness, and what it means to design responsibly.
The stories emerging are concerning:
A 14-year-old's suicide allegedly linked to dependency on a Character.AI chatbot
Users reporting parasocial relationships with AI companions
People choosing AI conversations over human connection
Therapists warning about AI replacing professional mental health care
Lack of safety mechanisms in tools designed for vulnerable users
But here's what's missing from the conversation:
Most discourse has been dominated by Western perspectives and Silicon Valley framings. How do these questions play out in Singapore and Asia, where collectivist values, different mental health stigmas, and multilingual contexts shape how we relate to technology differently?
The core question: Are we building tools that support human connection, or are we replacing it entirely?
What This Event Is
A 90-minute interactive roundtable in four parts:
Lightning Provocations (15 min)
Three 5-minute reflections from builders and researchers:
Arif Woozeer (Kura Collective) - From copilots to companions: designing for connection vs. dependency
Kellie Sim (SUTD PhD) - When AI mediates peer support: what do we lose in translation?
Arin Pantja (Designer, ocsn) - Designing for connection in an age of synthetic intimacy
Interactive Discussion (35 min)
Scales exercise: Where do you stand on key questions? (Jubilee-style positioning)
Whiteboard brainstorming: What surprised you? What concerns you?
Facilitated dialogue on ethics, design choices, and lived experiences
Open Q&A & Reflection (10 min)
Synthesis, final thoughts, lingering questions
Continued Conversation (30 min)
Optional: Stay for informal discussion and networking
What This Event Is NOT
A panel with experts talking at you
A product pitch (though we'll share our work if relevant)
A hot-takes competition (nuanced discussion > Twitter dunks)
Dominated by Western perspectives on AI ethics
Who Should Come
Founders building in AI, mental health, or social tech
Students interested in HCI, psychology, ethics, or design
Researchers exploring human-AI interaction
Designers and product folks thinking about responsible tech
Anyone with lived experience using AI for emotional support
Skeptics, optimists, and everyone in between
If you care about thoughtful tech and want to actually talk about this stuff IRL, this is for you.
🐢 About This Series
This is the inaugural Kura Collective event—a series of intimate conversations exploring how technology can support human connection instead of replacing it.
We believe reflection comes before connection. You can't genuinely connect with others until you understand yourself. These gatherings create space for both.
Future topics: loneliness and technology, building ethical AI, what it means to use tools thoughtfully.
🗣️ Facilitators & Contributors
Arif Woozeer is the co-founder of Kura Collective (BIG'21). Previously, he built AI copilot systems at Over The Rainbow, a local nonprofit, and has been exploring the intersection of technology, mental health, and policy through work with Reach Alliance on AI tools for migrant workers.
Kellie Sim is a PhD student at SUTD researching Human-AI Interaction and Mental Health. Her work focuses on leveraging AI to support mental health through empathetic interactions, with research on peer support systems and LLM-mediated conversations. She's involved in initiatives like Youth Corps Singapore's Project Re:ground Community Peer Supporters, and was previously a Youth Fellow under MOHT's mindline.sg.
Arin Pantja is currently a founding designer on OC Social Network, a social media roleplaying platform. He previously led design initiatives for Kura Kura, and explores our intimate relationships with interfaces and websites through his net art practice.
📚 Context & Reading
If you want to dive deeper before the event:
Character.AI & Teen Suicide:
NBC News: https://www.nbcnews.com/tech/characterai-lawsuit-florida-teen-death-rcna176791
CNN Coverage: https://www.cnn.com/2024/10/30/tech/teen-suicide-character-ai-lawsuit
Friend.com Controversy:
Fortune (CEO interview): https://fortune.com/2024/08/12/avi-schiffmann-ai-necklace-friend-interview/
Fast Company (ad campaign): https://www.fastcompany.com/91413814/friend-ai-ad-campaign-founder-qa
What we're building:
Kura: https://kurakura.io
📍 Venue
The Jay and Marilyn Ng Greenhouse
SMU Connexion, Level 4
Vibe: Casual, come as you are. No slides, no formality; just honest discussion.
What's Included: Drinks and light snacks provided.
🐢 Optional: Kura merch available if you want to support future gatherings.
Capacity: Limited to 20 people to keep the conversation intimate.
🔗 Links
Kura: https://kurakura.io
Instagram: https://www.instagram.com/kurakura.io
❓ Questions?
Instagram: @kurakura.io
See you there! 🐢