Real-Time EEG - Attention Drift Detection

Your brain drifts. We catch it first.

Not to judge attention - but to watch it, understand it, and speak up early enough to matter.

// 01 The Problem
The Silent Performance Killer
Every day, students lose focus without realising it, professionals make errors from undetected fatigue, and critical decisions get made by cognitively drained minds, all of it invisible to the person experiencing it.
STATISTIC

1 in 3

Road accidents are linked to driver fatigue and reduced sustained attention

STATISTIC

23%

Average productivity drop from undetected cognitive fatigue in working professionals

STATISTIC

$3000+

Cost of existing clinical EEG systems, making them completely impractical outside laboratory settings

// 02 How It Works
Forehead to Alert in 500ms
Every half-second, a complete cycle of capture, clean, analyse, and output runs end to end.
// STEP 01 · HARDWARE

Capture the Signal

Two dry electrodes placed on the forehead (Fp1-Fp2) pick up the brain's electrical activity. An Arduino reads this signal and streams it to a Python backend over USB. No gel, no lab, no technician. Fully wearable and portable.

// STEP 02 · SIGNAL PROCESSING

Clean the Signal

Every 2 seconds, 512 raw samples are filtered using a stateful bandpass filter (0.5-40 Hz) to remove DC drift and noise, plus a 50 Hz notch filter for power line interference. A Signal Quality Index is computed per window to measure how trustworthy the data is.
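A minimal Python sketch of this cleaning step, assuming a 256 Hz sampling rate (512 samples per 2-second window). The real filter is stateful across windows; this sketch filters each window from rest for simplicity:

```python
import numpy as np
from scipy.signal import butter, iirnotch, sosfilt, tf2sos

FS = 256  # assumed sampling rate: 512 samples per 2-second window

# 0.5-40 Hz Butterworth bandpass as second-order sections
BANDPASS = butter(4, [0.5, 40.0], btype="bandpass", fs=FS, output="sos")
# 50 Hz notch for power-line interference
NOTCH = tf2sos(*iirnotch(50.0, Q=30.0, fs=FS))

def clean_window(raw):
    """Bandpass then notch one 512-sample window. The production system
    carries filter state (zi) from window to window; this sketch does not."""
    return sosfilt(NOTCH, sosfilt(BANDPASS, raw))

# Synthetic window: DC offset + 10 Hz alpha rhythm + 50 Hz mains hum
t = np.arange(512) / FS
raw = 50.0 + np.sin(2 * np.pi * 10 * t) + 0.5 * np.sin(2 * np.pi * 50 * t)
cleaned = clean_window(raw)
```

After the startup transient settles, the 50 Hz hum is gone while the 10 Hz rhythm passes through essentially untouched.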

// STEP 03 · FEATURE EXTRACTION

Extract 9 Features

The cleaned window is converted into 9 numbers capturing the brain state: Theta power, Alpha power, Beta power, Theta/Alpha ratio, SEF95, Line Length, Hjorth Mobility, Hjorth Complexity, and an Artifact Flag.
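A sketch of the extraction step. The real pipeline uses multitaper spectral estimation; this sketch substitutes SciPy's Welch method as a simpler stand-in, and the artifact amplitude threshold is an illustrative assumption:

```python
import numpy as np
from scipy.signal import welch

FS = 256  # assumed sampling rate: 512 samples per 2-second window

def band_power(f, psd, lo, hi):
    """Sum PSD over one frequency band."""
    return float(psd[(f >= lo) & (f < hi)].sum())

def extract_features(window):
    """Turn one cleaned 2-second window into the 9-number feature vector."""
    f, psd = welch(window, fs=FS, nperseg=256)
    theta = band_power(f, psd, 4, 8)
    alpha = band_power(f, psd, 8, 13)
    beta = band_power(f, psd, 13, 30)
    # SEF95: frequency below which 95% of spectral power lies
    cum = np.cumsum(psd) / np.sum(psd)
    sef95 = float(f[np.searchsorted(cum, 0.95)])
    # Line length: total point-to-point variation of the waveform
    line_length = float(np.sum(np.abs(np.diff(window))))
    # Hjorth parameters from first and second differences
    d1, d2 = np.diff(window), np.diff(window, n=2)
    mobility = float(np.sqrt(np.var(d1) / np.var(window)))
    complexity = float(np.sqrt(np.var(d2) / np.var(d1)) / mobility)
    # Artifact flag: the amplitude threshold here is a placeholder
    artifact = float(np.max(np.abs(window)) > 100.0)
    return {"theta": theta, "alpha": alpha, "beta": beta,
            "theta_alpha_ratio": theta / alpha, "sef95": sef95,
            "line_length": line_length, "mobility": mobility,
            "complexity": complexity, "artifact": artifact}
```

Feeding it a pure 6 Hz tone, for example, yields a feature vector dominated by theta power with a Theta/Alpha ratio well above 1.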

// STEP 04 · AI PIPELINE

Score and Detect Drift

A neural network scores the 9 features from 0 to 1. The last 30 scores, 60 seconds of history, are then fed into an LSTM which detects whether the sequence is gradually declining. This two-stage approach catches drift that any single window score would miss entirely.

// STEP 05 · OUTPUT

Zone and Alert

The Zone Engine classifies the current state as IN_ZONE, UNSTABLE, DRIFTING, or DEGRADED using a fast EMA band and a slow soft baseline simultaneously. Alerts fire progressively. The dashboard updates via WebSocket, all within 500ms.

// 03 The Science
What Your Brain Is Actually Telling Us
The brain generates electrical oscillations at different frequencies. Each band maps to a distinct mental state.
4 - 8 Hz

Theta Waves

Rise with fatigue and mind-wandering. The primary biological marker of attention drift, strongest in the frontal lobe directly behind the forehead.

8 - 13 Hz

Alpha Waves

Rise when the brain is idle and disengaged. Combined with theta, the Theta/Alpha Ratio is the most peer-validated EEG marker of cognitive fatigue.

13 - 30 Hz

Beta Waves

Present during active thinking and alertness. Drop during drowsiness, confirming what theta and alpha are already suggesting about cognitive state.

Key insight: When theta rises and alpha rises simultaneously, the Theta/Alpha Ratio climbs sharply. The brain is slipping from active engagement into fatigue. This ratio is the most reliable single EEG marker of attention drift, and it is the primary signal this system watches.
// 04 Attention Zones
Real Time. Four States. Zero Guesswork.
Every 500ms, the Zone Engine classifies attention state using two references simultaneously: a fast EMA band and a slow long-term soft baseline.
IN_ZONE
Inside band - Above baseline
Focus is healthy and sustained within your personal normal range. System monitors silently.
UNSTABLE
Outside band - Above baseline
Focus is flickering beyond normal variance but has not fallen below capability floor yet.
DRIFTING
Outside band - Below baseline
Attention has genuinely declined below personal capability. Warning alert after 10 sustained windows.
DEGRADED
Inside band - Below baseline
Attention has collapsed and stabilised low for an extended period. Critical alert: take a break.
Zones require sustained evidence before firing. Persistence windows and hysteresis eliminate false alarms. Recovery from DEGRADED must pass through DRIFTING. No direct jumps back to IN_ZONE.
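The zone logic above can be sketched as follows. The band width, EMA smoothing factor, and persistence count are illustrative assumptions, not the production values:

```python
class ZoneEngine:
    """Minimal sketch: a fast EMA band plus a slow soft baseline,
    sustained evidence before any zone change, and gated recovery."""

    def __init__(self, baseline, band_width=0.1, alpha=0.3, persistence=10):
        self.baseline = baseline        # slow soft baseline from calibration
        self.band_width = band_width    # half-width of the fast EMA band
        self.alpha = alpha              # EMA smoothing factor
        self.persistence = persistence  # windows of sustained evidence
        self.ema = baseline
        self.zone = "IN_ZONE"
        self._pending, self._count = None, 0

    def _classify(self, score):
        inside = abs(score - self.ema) <= self.band_width
        above = score >= self.baseline
        if inside and above:
            return "IN_ZONE"        # healthy, sustained focus
        if not inside and above:
            return "UNSTABLE"       # flickering, but above capability floor
        if not inside:
            return "DRIFTING"       # outside band and below baseline
        return "DEGRADED"           # stabilised low: inside band, below baseline

    def update(self, score):
        self.ema = self.alpha * score + (1 - self.alpha) * self.ema
        candidate = self._classify(score)
        # Recovery from DEGRADED must pass through DRIFTING
        if self.zone == "DEGRADED" and candidate in ("IN_ZONE", "UNSTABLE"):
            candidate = "DRIFTING"
        if candidate == self.zone:
            self._pending, self._count = None, 0
        elif candidate == self._pending:
            self._count += 1
            if self._count >= self.persistence:   # hysteresis: no instant flips
                self.zone = candidate
                self._pending, self._count = None, 0
        else:
            self._pending, self._count = candidate, 1
        return self.zone
```

Feeding it a sustained low score eventually lands in DEGRADED, and feeding healthy scores afterwards recovers through DRIFTING before returning to IN_ZONE.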
// 05 AI Pipeline
Two Stage Intelligence
Neither model was trained on labelled data. Both learn what normal attention looks like and flag anything that deviates from it.
STAGE 01

Attention Scorer

  • Dense neural network takes 9 features from one 2-second window
  • Outputs a 0 to 1 score: how closely does this resemble a focused brain?
  • Monte Carlo Dropout: 20 passes per window for stability and uncertainty
  • Score blended with the Signal Quality Index; noisy windows fall back to a safe prior
  • Trained unsupervised on 5 public EEG datasets across diverse cognitive tasks
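Monte Carlo Dropout itself is simple to illustrate: keep dropout active at inference and run repeated stochastic passes, so the spread across passes estimates uncertainty. The tiny network below uses random placeholder weights, not the trained scorer:

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative 9 -> 16 -> 1 dense scorer with random placeholder weights
W1 = rng.normal(0, 0.3, (9, 16)); b1 = np.zeros(16)
W2 = rng.normal(0, 0.3, (16, 1)); b2 = np.zeros(1)

def mc_dropout_score(features, passes=20, p_drop=0.2):
    """Run the scorer `passes` times with dropout left ON at inference.
    Mean = stable score, std = the model's uncertainty for this window."""
    scores = []
    for _ in range(passes):
        h = np.maximum(features @ W1 + b1, 0)     # ReLU hidden layer
        mask = rng.random(h.shape) >= p_drop      # Bernoulli dropout mask
        h = h * mask / (1 - p_drop)               # inverted dropout scaling
        logit = float((h @ W2 + b2)[0])
        scores.append(1 / (1 + np.exp(-logit)))   # sigmoid -> [0, 1]
    scores = np.array(scores)
    return float(scores.mean()), float(scores.std())

mean, std = mc_dropout_score(rng.normal(size=9))
```

In the full pipeline the mean score would then be blended with the Signal Quality Index, so a noisy window pulls the result toward a safe prior rather than a confident guess.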
STAGE 02

LSTM Drift Detector

  • Receives the last 30 consecutive scores: 60 seconds of brain history
  • Trained as an autoencoder on normal, healthy attention sequences only
  • Attempts to reconstruct the incoming sequence from memory at runtime
  • High reconstruction error means an unfamiliar, declining pattern, so drift is flagged
  • Catches gradual temporal drift that any single window score completely misses
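The reconstruction-error idea can be demonstrated with a linear (PCA) autoencoder standing in for the LSTM: trained on stable attention sequences only, it reconstructs them well and fails loudly on a gradual decline. All numbers here are illustrative:

```python
import numpy as np

rng = np.random.default_rng(1)
SEQ = 30  # 30 scores = 60 seconds of history

# "Training": normal sequences only -- stable attention around a high score
normal = 0.8 + 0.05 * rng.standard_normal((200, SEQ))
mu = normal.mean(axis=0)
# A linear autoencoder is equivalent to PCA: keep the top 3 components
_, _, Vt = np.linalg.svd(normal - mu, full_matrices=False)
V = Vt[:3].T                       # 30-dim sequence -> 3-dim bottleneck

def reconstruction_error(seq):
    z = (seq - mu) @ V             # encode through the bottleneck
    recon = mu + z @ V.T           # decode back to a 30-score sequence
    return float(np.mean((seq - recon) ** 2))

# Anomaly threshold from the training error distribution
errors = [reconstruction_error(s) for s in normal]
threshold = float(np.mean(errors) + 3 * np.std(errors))

stable = 0.8 + 0.05 * rng.standard_normal(SEQ)     # familiar pattern
drifting = np.linspace(0.85, 0.4, SEQ)             # gradual decline
```

The declining sequence reconstructs poorly and crosses the threshold, even though its individual window scores would each look unremarkable in isolation.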
Why two models? The scorer handles the snapshot. The LSTM handles the story. One model cannot do both jobs simultaneously.
// 06 Personalisation
Every Brain Is Different. The System Knows That.
EEG is deeply personal. Two people focusing on the same task can have theta/alpha ratios differing by a factor of two. Fixed thresholds simply do not work.

Session Calibration

During the first 60 seconds of every session, the system learns your personal brain baseline: your mean, variance, and normal range. Everything is measured relative to you, never a population average.

Cross-Session Memory

After each session your baseline statistics are saved. Next session, the system loads your profile and starts smarter. The more you use it, the more accurate it gets over time.

Dynamic Adaptation

A fast EMA band tracks where your attention is right now. A slow soft baseline remembers your peak capability from calibration. Together they ensure the system never accepts poor attention as your new normal.

// 07 Key Features
Built Different
Every design choice was made to solve a real, specific limitation of existing EEG and attention monitoring systems.
HARDWARE

Single dry-electrode channel

No gel, no lab, no technician. Just your forehead.

SYSTEM

500ms real-time updates

WebSocket dashboard reflects brain state every half-second.

CALIBRATION

Fully personalised

No fixed population thresholds. Your brain, your baseline.

SIGNAL

9-feature extraction

Multitaper spectral estimation for reliable short-window analysis.

QUALITY

Signal Quality Index

Blinks and movement artifacts automatically detected and handled.

ARTIFICIAL INTELLIGENCE

Unsupervised AI

No labelled data required. Models learn what normal looks like.

MODELS

LSTM temporal analysis

60 seconds of score history to detect gradual drift patterns.

TRAINING

5 public dataset training

Diverse tasks and environments help the models generalise across people.

// 08 Use Cases
Attention Matters Everywhere
The system is domain-agnostic by design. Wherever sustained human focus matters and its degradation has real consequences, this system belongs.
USE CASE

Students

Detect study session fatigue before performance is affected. Know exactly when to take a break.

USE CASE

Drivers

Alert before microsleep sets in on long journeys. Real-time fatigue detection on the road.

USE CASE

Surgeons

Monitor sustained focus during long procedures where a cognitive lapse has real consequences.

USE CASE

Pilots and ATC

Continuous attention monitoring across long-haul flights and extended control room shifts.

USE CASE

Remote Workers

Personal productivity and wellbeing monitoring without surveillance or intrusion.

USE CASE

Military

Operator fatigue detection in high-stakes command, control, and intelligence environments.

// 09 Tech Stack
What Powers It
From raw electrode voltage to a live dashboard alert, a complete end-to-end real-time pipeline.
STAGE // 01

EEG Electrodes

Two dry electrodes at Fp1-Fp2 capture raw electrical activity from the forehead.
STAGE // 02

Arduino

Reads the electrode signal and streams samples to the backend over USB.
STAGE // 03

Python Backend

Runs the full capture, clean, analyse, and output cycle every 500ms.
STAGE // 04

Bandpass Filter

Stateful 0.5-40 Hz bandpass plus a 50 Hz notch removes drift, noise, and mains interference.
STAGE // 05

Multitaper PSD

Spectral estimation reliable enough for short 2-second windows.
STAGE // 06

9 Features + SQI

Band powers, Theta/Alpha ratio, SEF95, line length, Hjorth parameters, and a per-window quality index.
STAGE // 07

Neural Network

Dense scorer maps the 9 features to a 0-to-1 attention score per window.
STAGE // 08

Monte Carlo Dropout

20 stochastic passes per window give a stable score with an uncertainty estimate.
STAGE // 09

LSTM Autoencoder

Reconstructs the last 30 scores; high reconstruction error flags gradual drift.
STAGE // 10

Zone Engine

Classifies IN_ZONE, UNSTABLE, DRIFTING, or DEGRADED with persistence and hysteresis.
STAGE // 11

WebSocket

Pushes every state update to the dashboard in real time.
STAGE // 12

Live Dashboard

Displays the current zone, score history, and alerts as they happen.
// 10 The Team
Who Built This
A research-driven team building real-world brain-computer interfaces.

Aditya

Software Engineer


Satvik

Software Engineer


Aayushman

Software Engineer


Deepanjana

Software Engineer


Jaanvi

Software Engineer