An architecture-level shift. Not an AI feature. Not a wearable. Not autonomy.
Physical systems today are blind to the human operating them. They respond to mechanical inputs — not to the person behind them. Presence Mobility™ is the architecture designed to close that loop — embedding real-time human-state inference into deterministic control systems, with the human always in authority.
Wearables surface human-state data to dashboards, apps, and clinicians. Excellent at observation. Disconnected from the physical system the human is operating.
Real-time physiological and biomechanical state inference, integrated directly into the physical control loop. Human variability becomes a live input — not noise to be ignored.
Vehicles are sophisticated mechanical responders. They optimise for torque, cadence, efficiency curves. The human behind the inputs is invisible to the system.
"The objective is not autonomy. The objective is bounded co-regulation between human and machine. Every physical system that works alongside a human should know its human."
— Stevie Dymond, Founder · NeoSoulTech Ltd
The AI inference layer is architecturally separated from the safety control layer. No inference output can breach the safety boundary under any condition.
Grip pressure, vibration signatures, saddle load, heart-rate (HR) and heart-rate-variability (HRV) proxies, cadence, kinematic data, environmental and terrain inputs. Timestamped, confidence-scored, asynchronous.
Fuses multimodal inputs into a structured, continuously updated state representation — fatigue, stress proxies, stability, control consistency. Confidence-weighted. Degrades gracefully on signal loss.
Translates inferred state into how control is executed — not what is commanded. Smooths torque delivery under fatigue. Cannot exceed safety envelopes under any inference condition.
Regulatory limits enforced in dedicated RTOS firmware — speed, power, thermal, battery. This layer is architecturally unreachable by any AI software path.
Haptic, ambient, voice. Silence is the default state. Non-urgent advisories deferred to post-session. One-tap override returns to fixed-profile mode. Always.
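To make the layering concrete, here is a minimal hypothetical sketch (all names, thresholds, and the fatigue coefficient are illustrative, not the production design): the inference layer may only propose a shaping factor for how torque is delivered, while an independent clamp enforces the envelope no matter what inference returns.

```python
from dataclasses import dataclass
from typing import Optional

# Illustrative limits only; in Presence Mobility the real envelope lives in
# dedicated RTOS firmware, unreachable from any AI software path.
TORQUE_MIN_NM = 0.0
TORQUE_MAX_NM = 40.0
BASELINE_SHAPING = 1.0  # fixed-profile behaviour

@dataclass
class InferredState:
    fatigue: float      # 0..1 fatigue proxy from the fusion layer (hypothetical)
    confidence: float   # 0..1, drops on signal loss

def shaping_factor(state: Optional[InferredState]) -> float:
    """Inference layer: proposes HOW torque is delivered, never the limits."""
    if state is None or state.confidence < 0.5:
        return BASELINE_SHAPING           # graceful degradation to baseline
    return 1.0 - 0.3 * state.fatigue      # smooth torque under fatigue

def safe_torque(requested_nm: float, factor: float) -> float:
    """Safety layer: clamps the result regardless of any inference output."""
    return max(TORQUE_MIN_NM, min(TORQUE_MAX_NM, requested_nm * factor))
```

In the real architecture the clamp would execute on a separate firmware path rather than in the same process; the sketch shows only the direction of authority, with inference upstream of a bound it cannot move.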
NeoSoul One is a modular electric mobility platform and the planned first physical embodiment of the Presence Mobility™ architecture. It is where we intend to prove the thesis — under the conditions that matter most.
Mobility is the right environment precisely because it is the hardest. The system must infer human state in real time, under motion, under physical consequence, within regulatory constraints. If the architecture holds here, the foundation for every subsequent domain is earned.
This is the work ahead. Not the work behind.
Multimodal state inference must operate while the human moves — under vibration, environmental variation, and changing physical load. The lab is not enough.
The separation between AI inference and safety-critical control must hold when failure has physical cost. A boundary untested under consequence is not a validated boundary.
Bounded adaptation within regulatory limits — demonstrating that the architecture can shape behaviour without overriding it. The core claim of Presence Mobility™.
The Presence Mobility™ architecture is domain-agnostic. The roadmap moves from the most constrained environment outward — each step harder, each step building on the validation before it.
NeoSoul One. Inference under motion, safety under consequence, adaptation within regulation.
Adaptive mobility devices. Higher safety stakes, tighter operating envelopes.
Recovery platforms where state-awareness must align with clinical safety constraints.
Human-operated machinery — fatigue, overload, degraded performance — within strict industrial boundaries.
Human-machine co-regulation at production scale, integrated into existing control architectures.
Ambient presence intelligence. The least constrained environment — approached last, once the hard cases are proven.
These are not product features. They are structural constraints on what the system is allowed to do and how it must behave.
All adaptive behaviour is bounded — it cannot exceed predefined limits, runs architecturally separated from safety systems, and degrades gracefully to baseline. Shapes. Never overrides.
One-tap override to fixed-profile mode at all times. Regulatory limits enforced in RTOS firmware — no AI inference output can breach them under any condition.
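A hypothetical sketch of the override guarantee (class and method names are illustrative): the transition to fixed-profile mode is unconditional, so no inference state can block, defer, or negotiate it.

```python
from enum import Enum, auto

class Mode(Enum):
    ADAPTIVE = auto()       # bounded adaptation active
    FIXED_PROFILE = auto()  # deterministic baseline, no inference input

class RideController:
    """Illustrative controller: the one-tap override always wins."""

    def __init__(self) -> None:
        self.mode = Mode.ADAPTIVE

    def one_tap_override(self) -> None:
        # Unconditional: no guard clause, no inference check, no delay.
        self.mode = Mode.FIXED_PROFILE

    def shaping_enabled(self) -> bool:
        return self.mode is Mode.ADAPTIVE
```

The design point is the absence of any condition on the override path: authority stays with the human by construction, not by policy.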
All safety-critical intelligence executes on-device. Cloud connectivity is optional and consent-gated. The system must function fully without a network.
Biometric data is user-owned, encrypted at rest, and never leaves the device without explicit per-category consent. Privacy is the default — not a setting.
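The per-category consent rule can be sketched as a default-deny gate (the category names are hypothetical, not the product's actual data taxonomy): a category transmits only on an explicit, affirmative consent entry, and anything unknown or unset stays on the device.

```python
from typing import Dict

# Hypothetical data categories; every category is off by default.
DEFAULT_CONSENTS: Dict[str, bool] = {
    "biometrics": False,
    "kinematics": False,
    "diagnostics": False,
}

def may_leave_device(category: str, consents: Dict[str, bool]) -> bool:
    """Privacy as the default: unknown or unset categories never transmit."""
    return consents.get(category, False)
```

Default-deny means a missing entry behaves identically to an explicit refusal — new data categories added later cannot silently start transmitting.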
The system's highest ambition is its own invisibility. Feedback only when it genuinely serves the human. When it is working correctly, you don't notice it.
Compute, battery, and sensor modules are independently upgradeable. Architecture separability is structural. Platform life target: 10+ years.
Presence Mobility™ is the primary platform. Alongside it, NeoSoulTech is developing physical AI presence at two additional scales — this is not a single-application thesis.
Human physiological state as a live control variable inside safety-bounded physical systems. NeoSoul One is the planned first embodiment — chosen deliberately as the hardest environment in which to validate the architecture.
A keyboard with an integrated Physical Presence Controller — hardware-level cognitive mode selection that operates AI as a silent cooperative layer during knowledge work. State is broadcast to the Auren Orb via a shared state layer. UK provisional patent filed.
A 90–120mm ambient presence object — frosted glass and ceramic, no visible seams or buttons — building a continuous model of its environment and cooperating with AurenKeys via a shared state layer. Calm technology: presence over performance.
Stevie Dymond brings 30+ years across product development, high-precision manufacturing, automation, IoT, and AIoT — primarily on factory floors in Taiwan and China. Prior roles include Product Development Manager at Schneider Electric and Global Head of Quality at Sunsynk UK Ltd.
The Presence Mobility™ thesis comes directly from that experience. Decades spent watching AI deployed away from people — faster, more efficient, more remote — led to a single founding question: what would it look like to build the inverse? A physical system genuinely aware of its human, responding to that awareness in real time, never removing the human from control.
R&D and supply chain operations are based in Taipei — at the centre of the world's premier hardware and semiconductor manufacturing ecosystem, with direct access to Tier-1 manufacturing partners.
We are looking for partners who understand what we are building and why it matters — investors, technology partners, manufacturing collaborators, and fellow builders.
NeoSoulTech Ltd · Level One, Basecamp · 49 Jamaica Street, Liverpool L1 0AH, United Kingdom
R&D Operations: Wenshan District, Taipei, Taiwan · www.neosoultech.com