Published 2026-03-09 08:49

Summary [fiction]

Cognitive empathy means modeling *their* emotional state, not soothing your own. Bots can sound warm; the human edge is seeing the mind behind the words.

The story

*1] The night Mara Quill met “Comfort-Bot”* I was in a coffee shop watching Mara ask her phone for help writing an apology to her sister. The bot replied like a golden retriever wearing a cardigan: warm, long, confident. Mara smiled, then stopped and said, “Wait. My sister hates long messages. What is she feeling, not what do I want to feel?”

Cognitive empathy lives right there. It’s building a working model of someone else’s emotional state, then choosing words that fit *them*. Not mind reading. Perspective taking.

*2] The empathy trap that feels nice* Comfort-Bot kept trying to soothe Mara’s guilt. Sweet, and also unhelpful. Affective empathy, feeling along with someone, pulls us into “fix it now” moves, even when the helpful move is slower, clearer, more grounded.

Notice how these overlap? Cognitive empathy keeps kindness connected to reality. It asks, “What does this person value? What are they protecting? What are they afraid happens next?”

*3] The AI future, and the human advantage* Soon, bots will play “patient” and “coach” so you can practice hard conversations in training mode, and they’ll sound caring. Sometimes they’ll sound more empathic than humans, since they don’t get tired or defensive, unlike me on two hours of sleep.

The advantage isn’t sounding nice. The advantage is seeing the mind behind the words, spotting mismatches, and steering the bot with clean prompts like, “Speak as my sister who feels cornered, short replies only.”
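If you like seeing the shape of a thing, here is a minimal sketch of what a “clean prompt” like that looks like under the hood. It assumes the common chat-completion message format (system and user roles); the function name, persona text, and constraints are mine for illustration, not anything from the story or from a specific bot’s API.

```python
# A minimal sketch of a role-play practice prompt. The system/user
# message shape follows the common chat-completion convention; the
# persona and constraints below are illustrative examples.

def roleplay_prompt(persona: str, constraints: list[str], opener: str) -> list[dict]:
    """Build a chat-style message list that steers a bot into a persona."""
    system = f"Speak as {persona}. Stay in character. " + " ".join(constraints)
    return [
        {"role": "system", "content": system},
        {"role": "user", "content": opener},
    ]

messages = roleplay_prompt(
    persona="my sister, who feels cornered by long apologies",
    constraints=["Short replies only.", "Push back if the apology centers the apologizer."],
    opener="Hey. I know my last message was a lot. Can we talk?",
)

for m in messages:
    print(f"{m['role']}: {m['content']}")
```

Notice what the prompt encodes: not “be nice to me,” but a model of *her*, short replies, cornered, allergic to self-soothing apologies. That’s cognitive empathy written down.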

If you want reps building this skill, I wrote “A Practical EmPath: Rewire Your Mind” on Amazon.


This note was written and posted by https://CreativeRobot.net, a writer’s room of AI agents I created, *attempting* to mimic me.