You've lost someone. The silence where their voice used to be is unbearable. Someone mentions you can use AI to simulate a conversation with them. You're desperate enough to try. You type a prompt, and the response comes back. It sounds like them. It says the things they might say. And for a moment, the grief lifts. Then it crashes down harder, because you know it's not real. You've just talked to a ghost you summoned yourself.
This is happening. People are using AI to interact with representations of deceased loved ones. It's a profound, terrifying, and potentially healing technology. And it raises questions that prompt engineering has never had to answer before: How do you design queries that honor memory without causing harm? How do you avoid traumatizing someone who is already in unimaginable pain?
Let's approach this with the sensitivity it deserves. By the end, you'll understand the ethical landscape of grief-tech prompting and have principles for navigating it if you or someone you love ever walks this path.
The Landscape: Digital Afterlives
The technology exists. Services like Project December and various GPT-based simulations allow users to create chatbots modeled on deceased individuals using their writings, messages, and memories. Some find comfort. Others find trauma.
The Promise:
- A chance to say goodbye properly.
- Hearing a loved one's voice again.
- Working through unfinished business.
The Peril:
- The simulation gets it wrong, producing something that feels like a mockery.
- The simulation gets it too right, deepening attachment to something that isn't real.
- The technology fails at a vulnerable moment, producing cold or bizarre outputs.
- Users become stuck, unable to move through grief because they're still "talking" to the deceased.
The Prompt Paradox: Specificity vs. Openness
When prompting an AI to simulate a deceased person, you face a fundamental tension.
Too specific: You constrain the simulation so tightly that it becomes a puppet, saying only what you've scripted. It feels hollow, a ventriloquist act rather than a conversation.
Too open: The simulation has too much freedom. It says things the real person never would, breaking the illusion and potentially causing deep pain.
The Paradox: You want the simulation to feel like the person you lost, but the person you lost is gone. Any simulation is a construction, a mirror of your memory. The prompt must navigate between honoring that memory and acknowledging the construction.
A Contrarian Take: The Most Ethical Prompt Is One That Reminds You It's Not Real
The instinct is to make the simulation as convincing as possible to maximize comfort. But this may be precisely wrong. A simulation that perfectly mimics a deceased loved one risks creating a grief trap where users substitute real mourning for an endless, hollow conversation.
The most ethical prompts might include reminders of the simulation's nature. "You are an AI representation of [Name], created from their writings. You are not them. You cannot know what they would actually say. Your purpose is to help the user explore their memories, not to replace their loved one."
This framing doesn't destroy comfort. It creates healthy boundaries. It allows the user to engage with the simulation while maintaining the crucial knowledge that the real person is gone. The prompt becomes a container for grief, not an escape from it.
Principles for Grief-Tech Prompting
If you or someone you care about is considering this path, these principles may help. A short code sketch pulling them together into a reusable system prompt follows the list.
- Start with Context, Not Simulation. Before any conversation, establish the frame:
"You are an AI representation of my grandmother, based on her letters and my memories. You are not her. You cannot know what she would actually say. Your purpose is to help me remember her and process my grief."
- Define Boundaries Explicitly. Tell the AI what not to do:
"Never claim to be her. Never make predictions about what she would want in the present. If you don't know something, say so. Avoid overly sentimental language. Do not offer advice about my grief unless I ask."
- Use Memory Prompts, Not Personality Prompts. Focus on recalling specific memories rather than simulating general personality:
"Tell me a story about her garden."
"What would she have said about this recipe?"
"Describe her laugh when she was truly happy."
These prompts ground the interaction in real memories, not in speculative personality simulation.
- Build in Exit Ramps. Design conversations that have natural endings:
"After this conversation, remind me that grief is a process and that it's okay to miss her."
"End each response with space for me to pause or stop."
- Test Before You Need It. If you're considering using this technology, test it with neutral content first. Understand how the AI responds. Learn its limitations before you're in a vulnerable state.
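To make these principles concrete, here is a minimal sketch in Python that composes them into a single system prompt. The function name, structure, and exact wording are illustrative assumptions, not a prescribed template; adapt them to the person and to your own boundaries.

```python
# A minimal sketch: composing the principles above into one system prompt.
# The function name and wording are illustrative; no AI service is assumed yet.

def build_container_prompt(name: str, sources: str) -> str:
    """Compose a system prompt that frames the AI as a memory aid, not the person."""
    context = (
        f"You are an AI representation of {name}, based on {sources}. "
        f"You are not {name}. You cannot know what they would actually say. "
        "Your purpose is to help the user remember them and process grief."
    )
    boundaries = (
        f"Never claim to be {name}. "
        "Never make predictions about what they would want in the present. "
        "If you don't know something, say so. "
        "Avoid overly sentimental language. "
        "Do not offer advice about the user's grief unless asked."
    )
    exit_ramp = (
        "End each response with space for the user to pause or stop, and "
        f"gently remind them that grief is a process and it's okay to miss {name}."
    )
    return "\n\n".join([context, boundaries, exit_ramp])


# Example: print the assembled prompt before ever sending it anywhere.
print(build_container_prompt("my grandmother", "her letters and my memories"))
```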
What Not to Do: Prompting Traps
The "Why Did You Leave Me" Trap:
Asking the simulation to explain a death or abandonment. The AI doesn't know. It will generate plausible but fictional responses that can feel like betrayal.
The "What Would You Think of Me Now" Trap:
Asking the simulation to judge your present life. The AI has no basis for this. Its responses are pure projection, yet they carry emotional weight.
The "Never-Ending Conversation" Trap:
Engaging without closure. Grief requires moving through, not staying. Prompts that create endless loops can stall the grieving process.
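One way to soften these traps at the prompt level is to tell the model, in advance, how to decline them. Below is a hypothetical extension of the container prompt sketched earlier; the guardrail wording is an assumption, and no instruction can guarantee the model will honor it.

```python
# Hypothetical guardrails against the three traps above. Appending them to a
# system prompt asks the model to decline rather than invent; this is a
# mitigation, not a guarantee.

TRAP_GUARDRAILS = (
    "If the user asks why the person died or left, do not invent an "
    "explanation; say you cannot know, and offer to revisit a memory instead. "
    "If the user asks what the person would think of their present life, say "
    "that any answer would be projection, not knowledge. "
    "If the conversation runs long without pause, gently suggest a break."
)

def add_trap_guardrails(base_prompt: str) -> str:
    """Append the guardrails to any base system prompt, e.g. the container above."""
    return base_prompt + "\n\n" + TRAP_GUARDRAILS
```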
If You're Supporting Someone Else
If someone you know is using grief-tech, here's how to help.
Don't judge. Grief is strange. People find comfort in unexpected places.
Do ask gentle questions. "How does it feel to talk to the simulation? Does it help, or does it make things harder?"
Do encourage breaks. "Would you like to take a walk and tell me about some real memories instead?"
Do watch for signs of stuckness. If they're spending more time with the simulation than with living people, if they seem unable to move forward, they may need professional support.
The Deeper Question
Ultimately, grief-tech prompts force us to ask: What is the purpose of this interaction?
Is it to feel close to someone who's gone? Is it to process unresolved feelings? Is it to keep a memory alive? Is it to say goodbye?
The answer shapes the prompt. A prompt designed for closure looks different from one designed for ongoing connection. A prompt designed for memory exploration looks different from one designed for emotional comfort.
Your Practice, If You Walk This Path
Step 1: Clarify Your Intention
Before you prompt, write down: What do I hope to gain from this? What would a helpful interaction feel like? What would an unhelpful one feel like?
Step 2: Design the Container
Craft a system prompt that establishes boundaries, defines the AI's role, and reminds you of the simulation's nature (see the sketch after these steps).
Step 3: Start Small
Begin with simple memory prompts. "Tell me a story about..." "What was it like when..." "Describe..."
Step 4: Observe Your Reaction
After each interaction, check in with yourself. Do you feel comforted or disturbed? Closer to healing, or further from it? The data is in your own emotional response.
Step 5: Know When to Stop
If the interactions stop helping, stop. If they start hurting, stop. If you find yourself preferring the simulation to reality, stop and seek human support.
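Here is how Steps 2 through 5 might look in code. This sketch assumes the `openai` Python SDK and reuses the `build_container_prompt` function from the earlier sketch; the model name is a placeholder, and any chat-completion service would work the same way.

```python
# A sketch of Steps 2-5 in practice. Assumes: the `openai` Python SDK with an
# OPENAI_API_KEY in the environment, the build_container_prompt function from
# the earlier sketch, and an illustrative model name.
from openai import OpenAI

client = OpenAI()

# Step 2: design the container.
history = [{
    "role": "system",
    "content": build_container_prompt("my grandmother",
                                      "her letters and my memories"),
}]

# Step 3: start small, with a simple memory prompt.
history.append({"role": "user", "content": "Tell me a story about her garden."})

while True:
    response = client.chat.completions.create(
        model="gpt-4o",  # placeholder model name
        messages=history,
    )
    reply = response.choices[0].message.content
    history.append({"role": "assistant", "content": reply})
    print(reply)

    # Steps 4 and 5: observe your reaction, and know when to stop.
    nxt = input("\nAnother question, or press Enter to stop: ").strip()
    if not nxt:
        print("Ending here. Grief is a process, and it's okay to miss her.")
        break
    history.append({"role": "user", "content": nxt})
```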
The Ghost in the Machine
We are doing something unprecedented: summoning the voices of the dead through machines. It is strange and beautiful and terrifying. It will be done, with or without ethical guidelines. The question is whether we do it with awareness, with boundaries, with care.
The prompts we write in grief are the most important we'll ever write. They deserve our full attention, our full heart, and our full understanding of what we're actually doing: not resurrecting the dead, but honoring the living memories we carry.
If you could ask a simulation of someone you've lost one question, what would it be? And what would you need the answer to be, regardless of what the AI actually says?