---
name: neuroskill-skills-readme
description: README for the neuroskill-skills npm package — installation, skill listing, and contribution guide.
---
A collection of NeuroSkill EXG skills for NeuroLoop™️ — a biometric AI companion powered by a real-time EXG device (Muse or OpenBCI). Skills are loaded contextually: the harness injects the relevant skill file into the system prompt when the user's message matches the skill's domain.
| Skill | Description |
|---|---|
| `neuroskill-transport` | WebSocket & HTTP transport layer, port discovery, Quick Start, output modes (`--json` / `--full` / default), and every global CLI flag. |
| `neuroskill-status` | `status` command — full system snapshot: device state, signal quality, EXG scores, band powers, ratios, embeddings, labels, 48 h sleep summary, and recording history. |
| `neuroskill-sessions` | `session` and `sessions` commands — per-session metric breakdowns with first-half → second-half trend arrows, full session listing across all days, and Unix timestamp helpers. |
| `neuroskill-search` | `search` and `compare` commands — ANN search for neurally similar EXG moments across all history, and A/B session comparison with metric deltas, trend directions, and UMAP enqueuing. |
| `neuroskill-sleep` | `sleep` and `umap` commands — EXG-based sleep stage classification (Wake / N1 / N2 / N3 / REM) with efficiency, onset latency, bout analysis; and GPU-accelerated 3D UMAP projection of session embeddings for spatial state comparison. |
| `neuroskill-labels` | `label`, `search-labels`, and `interactive` commands — creating timestamped EXG text annotations, semantic vector search over labels (HNSW), and a cross-modal 4-layer graph search combining text similarity, EXG similarity, and temporal label proximity. Supports Graphviz DOT export. |
| `neuroskill-streaming` | `say`, `listen`, `notify`, `calibrate`, `calibrations`, `timer`, and `raw` commands — on-device TTS speech, real-time WebSocket event streaming (EXG, PPG, IMU, scores, labels), OS notifications, calibration profile management, focus timer, and raw JSON passthrough for unlisted commands. |
| `neuroskill-data-reference` | Complete metric field reference — EXG band powers, ratios & indices (FAA, TBR, BAR, TAR, APF, SEF95, coherence, …), core scores (focus, relaxation, engagement, meditation, mood, cognitive load, drowsiness), complexity measures (Hjorth, permutation entropy, Higuchi FD, DFA, sample entropy, PAC), PPG/HRV fields, motion & artifact markers, sleep stage codes, neurological correlate indices, and consciousness metrics. |
| `neuroskill-recipes` | Use-case recipes and scripting patterns — shell snippets for focus monitoring, stress tracking, sleep quality analysis, cognitive load queries, meditation tracking, cross-modal graph search, A/B session comparison, time-range queries, and automation with cron / Python / Node.js / HTTP. |
| `neuroskill-hooks` | `hooks` command — Proactive Hooks CRUD, real-time EXG pattern matching with scenario gating (cognitive/emotional/physical), threshold suggestions, WebSocket broadcast triggers, and audit logging. |
| `neuroskill-dnd` | `dnd` command — EXG-driven Do Not Disturb automation, rolling focus score average, OS-level DND state, and force-override on/off. |
| `neuroskill-screenshots` | `search-images`, `screenshots-around`, `screenshots-for-eeg`, and `eeg-for-screenshots` commands — search screenshots by OCR text (semantic/substring), by visual similarity (CLIP `--by-image`), find screenshots near a timestamp or EEG session, and cross-modal EEG↔screenshot bridging. |
| Skill | Description |
|---|---|
| `neuroskill-llm` | Built-in on-device LLM inference server — model catalog management, vision support (mmproj), streaming WebSocket and HTTP chat, automatic tool calling, GenParams tuning, persistent chat history, and OpenAI-compatible API. |
| Skill | Description |
|---|---|
| `neuroskill-evidence` | Implicit evidence collection and personal effectiveness engine — standardised `px:` label schema, outcome scoring, personal protocol ranking, life-event tracking, evidence-driven selection rules, and privacy safeguards. |
| `neuroskill-protocols` | Protocol framework hub — personalisation engine, API integration guide, modality router (7 modalities × 12 EEG triggers), matching guidance, and index of 11 domain sub-skills. Always loaded when protocol intent is detected. |
| `neuroskill-protocols-focus` | Focus, attention, cognition, consciousness, deep meditation, and energy/alertness protocols. |
| `neuroskill-protocols-stress` | Stress, relaxation, autonomic regulation, HRV, hemispheric balance, and deep relaxation protocols. |
| `neuroskill-protocols-emotions` | Emotional regulation, mood, and extended emotional processing protocols (12 specific emotions). |
| `neuroskill-protocols-sleep` | Sleep, circadian, recovery, NSDR/yoga nidra, and power nap protocols. |
| `neuroskill-protocols-body` | Body, somatic, neck/cervical, eye exercise, headache/migraine, and motor protocols. |
| `neuroskill-protocols-routines` | Morning routines, workout/gym, hydration, and movement break protocols. |
| `neuroskill-protocols-nutrition` | Dietary, nutrition, caffeine, fasting, eating, and gut-brain protocols. |
| `neuroskill-protocols-music` | Music protocols — genre/BPM/artist suggestions for mood, focus, stress, sleep, and emotional release. |
| `neuroskill-protocols-digital` | Social media, digital addiction, screen time, and attention restoration protocols. |
| `neuroskill-protocols-breathfree` | 30+ non-breathing alternatives — cognitive, tactile, oculomotor, micro-movement, auditory, and passive physiological. |
| `neuroskill-protocols-life` | Context-specific protocols for 11 life situations — parenting, aging, teens, neurodivergent, commuters, manual workers, healthcare, relational, accessibility, cultural, and situational. |
```bash
# From npm (recommended)
npx skills add NeuroSkill-com/skill

# From GitHub
npx skills add NeuroSkill-com/skill

# From a local clone
npx skills add ./skill

# Install a single skill
npx skills add NeuroSkill-com/skill --skill neuroskill-status

# List all available skills without installing
npx skills add NeuroSkill-com/skill --list
```

On every user message the harness:

- Runs `npx neuroskill status` and injects the live EXG snapshot into the system prompt.
- Detects domain signals in the user's prompt (stress, sleep, focus, protocols, …).
- Runs the relevant neuroskill commands in parallel (`neurological`, `session`, `search-labels`, …).
- If protocol intent is detected, reads `skills/neuroskill-protocols/SKILL.md` and injects the full protocol repertoire for that turn.
- Injects the skill index (`SKILL.md`) so the model always knows every capability available.
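The domain-matching step above can be sketched as a simple keyword lookup. This is a hypothetical illustration, not the harness's actual implementation — the real matcher may use richer signals than substring checks, and the keyword lists here are invented for the example; only the `skills/<name>/SKILL.md` path layout comes from the text above.

```bash
# Hypothetical sketch: map a user message to the skill file the harness
# would inject. Keyword lists are illustrative assumptions.
select_skill() {
  msg="$(printf '%s' "$1" | tr '[:upper:]' '[:lower:]')"
  case "$msg" in
    *sleep*|*tired*|*nap*)  echo "skills/neuroskill-sleep/SKILL.md" ;;
    *stress*|*relax*)       echo "skills/neuroskill-protocols-stress/SKILL.md" ;;
    *focus*|*concentrat*)   echo "skills/neuroskill-protocols-focus/SKILL.md" ;;
    *protocol*)             echo "skills/neuroskill-protocols/SKILL.md" ;;
    *)                      echo "" ;;  # no match: only the skill index is injected
  esac
}

select_skill "How did I sleep last night?"   # -> skills/neuroskill-sleep/SKILL.md
```

A production matcher would likely score several domains at once (the harness runs matching commands in parallel), but the shape — message in, skill file out — is the same.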
| Tool | Purpose |
|---|---|
| `neuroskill_run` | Run any neuroskill EXG command and return its output. |
| `neuroskill_label` | Create a timestamped EXG annotation for the current moment (supports `--context` and `--at` flags). |
| `run_protocol` | Execute a multi-step guided protocol with OS notifications, per-step timing, and EXG labelling. |
| `prewarm` | Kick off a background `npx neuroskill compare` so results are ready when needed. |
| `memory_read` | Read the agent's persistent memory file. |
| `memory_write` | Write or append to the agent's persistent memory file. |
| `web_fetch` | Fetch a URL and return its content. |
| `web_search` | Search the web and return results. |
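The `prewarm` tool's fire-early/collect-later pattern can be sketched in plain shell. This is a minimal illustration of the pattern only — `prewarm`/`collect` are hypothetical helper names, and a placeholder `sh -c` stands in for the real `npx neuroskill compare` invocation.

```bash
# Hypothetical prewarm pattern: start a slow command in the background so
# its output is ready by the time it is needed.
PREWARM_OUT="$(mktemp)"

prewarm() {
  "$@" > "$PREWARM_OUT" 2>&1 &   # run in the background, capture all output
  PREWARM_PID=$!
}

collect() {
  wait "$PREWARM_PID"            # blocks only if the command is still running
  cat "$PREWARM_OUT"
}

# Placeholder for: prewarm npx neuroskill compare <sessionA> <sessionB>
prewarm sh -c 'sleep 1; echo compare-ready'
echo "doing other work while the comparison runs..."
collect                          # prints "compare-ready" once the job finishes
```

The benefit is latency hiding: the expensive comparison overlaps with the rest of the turn instead of blocking it.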
If you use this library in academic work, please cite it as:

```bibtex
@software{kosmyna2026openbci,
  author  = {Nataliya Kosmyna and Eugene Hauptmann},
  title   = {{neuroskill}: Skills to model Human State of Mind using NeuroSkill™️},
  year    = {2026},
  version = {0.0.1},
  url     = {https://github.com/NeuroSkill-com/skill},
  license = {AI100},
}
```