Version history and release notes for NForge.
NForge v1.1.0 introduces a complete robotics integration layer designed for the Galbot G1 humanoid robot. This release bridges the gap between computational neuroscience and physical robotic systems, enabling humanoid robots to understand and respond to human emotional states by decoding predicted brain activity patterns in real time.
The Galbot G1 is a 173 cm general-purpose humanoid robot with a 4-microphone array, RGB-D cameras, tactile sensors, and 8-core AI processing. Its 95%+ grasping accuracy and 10-hour runtime make it an ideal platform for sustained therapeutic interaction.
The centrepiece of this release is the Emotional Therapy pipeline — a closed-loop system where the Galbot robot observes a patient via its cameras and microphones, NForge predicts the cortical response patterns those stimuli would evoke, an emotion engine decodes valence/arousal/dominance from the predicted brain state, and the robot adapts its behaviour to provide optimal therapeutic interaction.
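The closed loop above can be sketched as a single sense-predict-decode-act step. All function and field names here are illustrative stand-ins, not the actual NForge API:

```python
# Hypothetical sketch of one iteration of the closed-loop therapy cycle.
# Component names are illustrative; the real NForge interfaces may differ.

def therapy_step(observe, predict, decode, act):
    """One cycle: sense -> predict cortex -> decode emotion -> robot action."""
    stimulus = observe()               # camera/microphone frame
    brain_state = predict(stimulus)    # predicted cortical activation
    emotion = decode(brain_state)      # valence/arousal/dominance
    return act(emotion)                # adapted robot behaviour

# Toy callables standing in for the real components:
action = therapy_step(
    observe=lambda: {"video": "frame", "audio": "clip"},
    predict=lambda s: {"OFC": 0.6, "TempPole": 0.2},
    decode=lambda b: {"valence": b["OFC"], "arousal": b["TempPole"]},
    act=lambda e: "warm_tone" if e["valence"] > 0.5 else "calm_tone",
)
```

In a real session this step would run once per streaming frame, with the predictor and emotion engine replacing the toy lambdas.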
The emotion engine maps NForge cortical surface predictions to three emotional dimensions — valence (positive/negative), arousal (calm/excited), and dominance (submissive/dominant) — using neuroscience-grounded ROI mappings. The engine averages activation across HCP MMP1.0 brain regions associated with each dimension: orbitofrontal cortex for valence, temporal pole and anterior insula (amygdala proxies) for arousal, and frontal operculum for dominance. It outputs a confidence-weighted EmotionState at each timestep.
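The ROI-averaging scheme can be sketched in a few lines. The ROI labels and `EmotionState` fields follow the text, but the exact region lists, weighting, and NForge API are assumptions:

```python
# Illustrative decoder following the ROI-averaging scheme described above.
# Region abbreviations are stand-ins for the HCP MMP1.0 labels NForge uses.
from dataclasses import dataclass
from statistics import mean

ROI_MAP = {
    "valence":   ["OFC"],           # orbitofrontal cortex
    "arousal":   ["TGd", "AAIC"],   # temporal pole, anterior insula
    "dominance": ["FOP"],           # frontal operculum
}

@dataclass
class EmotionState:
    valence: float
    arousal: float
    dominance: float
    confidence: float

def decode_emotion(roi_activations: dict, confidence: float = 1.0) -> EmotionState:
    """Average predicted activation over each dimension's ROIs,
    then weight by prediction confidence."""
    dims = {
        dim: confidence * mean(roi_activations[r] for r in rois)
        for dim, rois in ROI_MAP.items()
    }
    return EmotionState(**dims, confidence=confidence)

state = decode_emotion(
    {"OFC": 0.8, "TGd": 0.4, "AAIC": 0.6, "FOP": 0.2},
    confidence=0.5,
)
```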
The robot bridge translates decoded emotion states into Galbot G1 action commands across four channels: speech_tone (calm, warm, energetic, concerned), facial_expression (neutral, smile, empathy, concern), gesture (nod, open_palms, lean_forward, thumbs_up), and approach_distance (0.8–1.8 m adaptive proximity). It is built on an abstract RobotBridge base class: swap in any robot platform by implementing a single translate() method.
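A minimal sketch of the RobotBridge pattern, assuming the four channel names above; the concrete valence-to-command mapping in `GalbotG1Bridge` is illustrative, not NForge's actual policy:

```python
# Platforms plug in by subclassing RobotBridge and implementing translate().
from abc import ABC, abstractmethod

class RobotBridge(ABC):
    @abstractmethod
    def translate(self, emotion: dict) -> dict:
        """Map an emotion state to platform-specific action commands."""

class GalbotG1Bridge(RobotBridge):
    def translate(self, emotion: dict) -> dict:
        low_valence = emotion["valence"] < 0.0
        return {
            "speech_tone": "concerned" if low_valence else "warm",
            "facial_expression": "empathy" if low_valence else "smile",
            "gesture": "open_palms" if low_valence else "nod",
            # clamp adaptive proximity to the 0.8-1.8 m range
            "approach_distance": min(1.8, max(0.8, 1.3 - 0.5 * emotion["valence"])),
        }

cmd = GalbotG1Bridge().translate(
    {"valence": -0.4, "arousal": 0.7, "dominance": 0.5}
)
```

Supporting a different robot means writing one new subclass; the rest of the pipeline is unchanged.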
The biometric fusion module accepts wearable EEG (10-20 system, up to 17 channels) and fNIRS (8 source-detector pairs) sensor data and projects it into NForge's feature space via a learned linear transform. The fused "biometric" modality tensor plugs directly into StreamingPredictor.push_frame(), enabling real-time therapy sessions where the robot receives both environmental perception (camera/microphone) and direct physiological brain signals simultaneously.
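The learned-linear-transform idea amounts to concatenating the two sensor streams and multiplying by a weight matrix. The output feature dimension and the training of the weights are not specified in the text, so both are stand-ins here:

```python
# Sketch of biometric fusion: one EEG sample (17 channels) and one fNIRS
# sample (8 source-detector pairs) are concatenated and projected through
# a weight matrix standing in for the learned transform.

def fuse_biometrics(eeg, fnirs, weights):
    """Project concatenated [eeg | fnirs] (length 25) through a
    (feature_dim x 25) weight matrix -> fused feature vector."""
    x = list(eeg) + list(fnirs)
    assert len(x) == 25, "expected 17 EEG + 8 fNIRS channels"
    return [sum(w * xi for w, xi in zip(row, x)) for row in weights]

eeg = [0.1] * 17              # one EEG frame (10-20 montage)
fnirs = [0.2] * 8             # one fNIRS frame
W = [[1.0] * 25, [0.0] * 25]  # toy 2-dimensional "learned" transform
features = fuse_biometrics(eeg, fnirs, W)
```

In the real pipeline the resulting vector would be batched into the "biometric" modality tensor passed to StreamingPredictor.push_frame().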
The therapy session manager orchestrates full sessions by logging timestamped emotion states and computing engagement metrics: emotional stability (valence consistency), engagement score (arousal magnitude), and progress (valence trend slope via linear regression). The session manager tracks the emotional trajectory and provides trajectory-aware action suggestions — if a patient's valence is declining, the robot automatically shifts to gentler tones and open-palm gestures. Full session reports export as JSON for clinical review.
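The three metrics can be sketched directly from their descriptions. NForge's exact formulas are not given, so these are plausible stand-ins: stability as inverse valence spread, engagement as mean arousal magnitude, and progress as the least-squares slope of valence over time:

```python
# Illustrative versions of the session metrics described above.
from statistics import mean, pstdev

def session_metrics(timestamps, valence, arousal):
    t_bar, v_bar = mean(timestamps), mean(valence)
    # least-squares slope of valence vs. time (valence trend)
    cov = sum((t - t_bar) * (v - v_bar) for t, v in zip(timestamps, valence))
    var = sum((t - t_bar) ** 2 for t in timestamps)
    return {
        "stability": 1.0 / (1.0 + pstdev(valence)),   # high when valence is steady
        "engagement": mean(abs(a) for a in arousal),  # arousal magnitude
        "progress": cov / var,                        # positive = improving valence
    }

m = session_metrics(
    timestamps=[0, 1, 2, 3],
    valence=[0.0, 0.1, 0.2, 0.3],
    arousal=[0.5, 0.6, 0.4, 0.5],
)
```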
Current therapy robots rely on explicit signals — facial expressions, voice tone, body language — to gauge patient emotion. These surface-level cues miss the deeper neural reality: a patient may smile while their brain's threat-processing regions remain hyperactive, or appear calm while reward circuits are disengaging. NForge changes this by predicting implicit brain responses — the actual cortical activation patterns a stimulus would evoke — giving robots access to a fundamentally richer understanding of human emotional state.
By pairing Galbot's physical presence and natural interaction capabilities with NForge's brain-encoding predictions, we create a closed-loop therapeutic system: the robot acts, NForge predicts the neural impact, the emotion engine decodes the patient's internal state, and the session manager adapts the robot's next action to maximise therapeutic benefit. This is not replacing human therapists — it's augmenting therapy with continuous, objective neural monitoring that no human can perform in real time.
Predict which robot interactions best engage the temporoparietal junction and superior temporal sulcus — regions critical for social cognition. The robot adjusts its facial expressions and speech patterns based on real-time social processing activation, providing personalised social skills training that adapts moment-by-moment.
Monitor reward system activation (ventromedial prefrontal cortex, striatum) in response to robot encouragement. When the emotion engine detects declining valence or disengaging reward circuits, the robot shifts to higher-warmth interactions — slower speech, open-palm gestures, closer proximity — to re-engage the patient.
Cross-subject adaptation enables the robot to calibrate to each patient using minimal data. NForge tracks which memory-related regions activate during reminiscence therapy, identifying the most effective conversation topics and stimuli for each individual. Session reports give caregivers objective engagement metrics over time.
During graduated exposure therapy, the robot monitors amygdala-proxy regions (temporal pole, anterior insula) for threat response activation. If predicted arousal exceeds safe thresholds, the robot automatically reduces stimulus intensity — pausing, switching to calming gestures, or increasing physical distance — preventing re-traumatisation while maintaining therapeutic progress.
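The safety logic above reduces to a threshold check per step. The threshold value and action names here are assumptions for illustration:

```python
# Hypothetical safety check for the graduated-exposure loop: if predicted
# arousal from the amygdala-proxy ROIs crosses a threshold, step the
# stimulus intensity down instead of up.

SAFE_AROUSAL = 0.7  # illustrative threshold, not a clinical value

def next_exposure_level(arousal, level):
    """Return (new_level, robot_action) for the next exposure step."""
    if arousal > SAFE_AROUSAL:
        return max(0, level - 1), "calming_gesture"  # de-escalate
    return level + 1, "continue"                     # safe to progress
```

Usage: `next_exposure_level(0.9, 3)` de-escalates to level 2 with a calming gesture, while `next_exposure_level(0.3, 3)` advances to level 4.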
Children often struggle to articulate emotions verbally. NForge's brain-state predictions bypass this barrier entirely, giving the Galbot robot insight into the child's internal experience. The robot becomes a non-judgmental companion that truly understands how the child feels, adapting play-based therapeutic activities in real time.
For isolated elderly individuals, the Galbot robot provides consistent emotional engagement. The therapy session manager tracks engagement and emotional stability across days and weeks, alerting caregivers to declining trends. Biometric fusion with wearable EEG enables passive monitoring without requiring active patient participation.
Planned features for upcoming NForge releases.
First public release of NForge. Restructured from Meta's TRIBE v2 with professional src/ package layout, four new features (ROI Attention Maps, Real-time Streaming, Modality Attribution, Cross-Subject Adaptation), torch.compile support, and comprehensive test coverage.