Designing a Digital Twin You Might Regret

What if AI could optimize your life—but slowly erase your sense of self?


🎥 Scene Setting:
This project started as a simple question: How far is too far when it comes to optimizing our lives?

To explore that, I co-created Anima, a fictional AI product that tracks your behavior and suggests how to act, reply, and even feel. It looks helpful. It feels plausible. But the more users engaged, the more it made them uncomfortable.

And that was the point.

🔍 What Shifted:

What began as a critique of convenience became a framework for value reflection. When participants encountered Anima, they didn't dismiss it as sci-fi; they recognized themselves in it.

Anima helped them notice how current technologies already shape their memory, autonomy, and communication... often without them realizing it.

What emerged was not a singular insight, but a method: ambiguity + believability = reflection.


🧭 Research Question

To what extent can a fictional, plausible artifact provoke real users to reflect on their technology habits, values, and assumptions?

🧑‍🤝‍🧑 My Role:

  • Co-designed the speculative prototype and service website

  • Led generational user interviews and emotional response workshops

  • Created value-mapping tools to analyze behavioral tension

  • Co-authored a 6000-word academic paper and conference presentation

“I entered the site thinking this was a new product, but as soon as I started reading, I realized it was meant to provoke.” - User

“What's the value of your ‘self’ and all the effort you've put into life so far? What's worth trying hard for? Anima can do it better.” - User

What We Found:
🧠 People became more aware of what they value
Privacy, autonomy, and spontaneity surfaced fast—especially among older users who had lived through multiple tech shifts.

⚡ They wanted the product... but didn’t like what it implied
That tension made them question their current tools more than any article or TED Talk had.

🧪 They used Anima to reflect on their real lives
From ChatGPT to email tools to memory apps, the prototype made them rethink what they’ve already outsourced.

💡 It sparked design ideas
Participants proposed real-world features like AI that asks for consent, or allows debate—not just automation.

Why It Matters:

We’re building tools that can help us act—but what do we lose when we let them decide?

This project wasn’t about predicting the future. It was about surfacing what we already accept—and asking if that’s still okay.

It shows how critical design can drive user reflection, behavior change, and even new design principles—without preaching, just provoking.
