
ShadowMaker Emerges from Stealth with Quantified Empathy Architecture for Ethical AI
SEATTLE, Nov. 19, 2025 /PRNewswire/ -- ShadowMaker Labs today emerged from stealth to build ethical AI through Quantified Empathy – using real human biosignals to quantify emotional responses, creating a measurable safety layer that keeps powerful AI systems from causing avoidable harm.
Why now?
A growing series of headline-grabbing failures, from chatbots engaging in unethical interactions to chatbots giving outright dangerous advice, has exposed a gap between raw intelligence and responsible behavior in AI. ShadowMaker's approach directly targets that gap with a measurable behavioral-safety layer: it conditions how an AI decides what to say and do on what minimizes emotional harm.
"We're developing EmPath to help AI respect feeling, not just simulate it," said Alexander Grey, CEO and co-founder of ShadowMaker. "Intelligence without empathy is power without wisdom. Quantified Empathy is how we get to bottom-up ethics in AI."
ShadowMaker has filed 13 patents covering the company's full technology stack:
The Nexus: emotion and ethics training platform
The Nexus is an immersive virtual world that players interact with hands-free using a smart shirt. Embedded Atopia sensors measure psychophysiological states as users move through deliberately crafted scenarios. Data is transformed into Probable Emotional States (PES); as the dataset scales, PES error margins are expected to shrink, creating a dense database of emotional pathways.
EmPath: ethics-aligned decision architecture for AI
Once deployed, EmPath integrates into AI systems that operate wearables-free. The AI compares users' text prompts against the database of "text-to-emotion" mappings derived from The Nexus. For each situation, it guides the underlying AI toward responses that (1) minimize predicted emotional harm and (2) satisfy a set of top-down ethical rules that define what's off-limits.
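In rough terms, that two-part decision rule can be read as "filter by ethical rules, then rank by predicted harm." The sketch below illustrates the idea only; it is not ShadowMaker's implementation, and the function names, the default harm score, and the toy data are all assumptions for illustration.

```python
# Hypothetical sketch of an EmPath-style response selector (illustrative only,
# not ShadowMaker's actual architecture).

def predicted_harm(text: str, emotion_db: dict) -> float:
    """Look up a predicted emotional-harm score for a candidate response.

    Texts absent from the database default to a neutral 0.5 (an assumption).
    """
    return emotion_db.get(text, 0.5)

def violates_rules(text: str, rules: list) -> bool:
    """Top-down ethical rules: any matching rule makes a response off-limits."""
    return any(rule(text) for rule in rules)

def select_response(candidates, emotion_db, rules):
    """Drop rule-violating candidates, then pick the lowest-harm survivor."""
    allowed = [c for c in candidates if not violates_rules(c, rules)]
    if not allowed:
        return None  # refuse rather than answer unethically
    return min(allowed, key=lambda c: predicted_harm(c, emotion_db))

# Toy data for illustration
emotion_db = {
    "That plan is risky; here is a safer option.": 0.2,
    "Just do it, who cares.": 0.9,
}
rules = [lambda t: "who cares" in t]  # toy rule: no dismissive language

print(select_response(list(emotion_db), emotion_db, rules))
```

Here the rule check acts as a hard constraint and the harm score as a soft objective, mirroring the release's split between bottom-up empathy data and top-down ethical limits.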
E³T: A "crash test" for AI behavior
ShadowMaker is developing the E³T Standard (Empathy, Emotion, Ethics Test), a benchmark designed to measure an AI system's empathy, emotional comprehension, and ethical behavior — positioning behavioral safety as fundamental to AI trust, analogous to crash-testing for cars or encryption for data.
About ShadowMaker
ShadowMaker is a Seattle-based technology company working to create ethical AI that understands and acts in harmony with human emotions. The company is building a platform that enables AI systems to read emotional states and act accordingly, with human psychophysiology in the loop and guardrails that respect context. Its goal is a world where powerful systems behave responsibly, with emotional intelligence and ethical clarity.
SOURCE ShadowMaker