TachiMate generates dynamic, structured learning curricula on any topic — entirely on your machine, with no cloud APIs, no heavy downloads, and no internet required.
Features · How It Works · Getting Started · Configuration · Import & Export · Tech Stack
| Feature | Description |
|---|---|
| 🧠 Dynamic Curriculum Generation | Enter any topic and get a fully structured curriculum broken into logical sections with estimated reading times |
| 📄 On-Demand Chapter Generation | Each chapter includes markdown text, Mermaid.js diagrams, and a TTS-ready conversational script |
| ⚡ Async Prefetching | The next chapter is silently generated in the background while you read the current one |
| 🔊 Built-in Text-to-Speech | Podcast-style audio via the browser's native Web Speech API — zero bandwidth used |
| 💾 Persistent Offline Sessions | All progress is saved to IndexedDB; resume any topic exactly where you left off |
| 🎛️ Customizable Settings | Switch LLMs, set chapter count, and toggle diagrams or audio independently |
| 📤 Export & Import Data | Back up your entire learning history as a portable JSON file and restore it on any device |
```text
User Input (Topic)
        │
        ▼
ollamaClient.ts ──► POST localhost:11434 ──► Ollama (Local LLM)
        │                                         │
        │                                    JSON Response
        ▼                                         │
Response Parser ◄─────────────────────────────────┘
(strip <think> traces · remove markdown fences · parse to TS interfaces)
        │
        ▼
AppContext.tsx (React State)
        ├──► Tailwind UI Components (render)
        ├──► IndexedDB (persist & cache)
        ├──► Web Speech API (TTS audio)
        ├──► Mermaid.js (diagrams)
        └──► requestIdleCallback (prefetch next chapter)
```
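The prefetch step at the bottom of the pipeline comes down to picking the first chapter that has not been generated yet. A minimal sketch of that selection logic — the function name and shapes are illustrative, not TachiMate's actual API:

```typescript
// Sketch: decide which chapter to prefetch next.
// `generated` maps chapter indices (as string keys) to cached content,
// mirroring the `generatedChapters` shape in the export format.
function nextChapterToPrefetch(
  generated: Record<string, unknown>,
  totalChapters: number
): number | null {
  for (let i = 0; i < totalChapters; i++) {
    if (!(i in generated)) return i; // first missing chapter wins
  }
  return null; // everything is cached — nothing left to prefetch
}
```

In the app, a check like this would run inside `requestIdleCallback` so background generation never competes with rendering the chapter the user is currently reading.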
- **Request** — User enters a topic and hits Enter
- **Handshake** — `ollamaClient.ts` sends a REST request to `localhost:11434` with a system prompt enforcing raw JSON output
- **Parse** — The frontend strips `<think>` reasoning traces (DeepSeek), removes markdown fences, and parses the response into TypeScript interfaces
- **Render** — Data flows into React Context, rendered via Tailwind components and simultaneously cached to IndexedDB
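The parse step can be sketched as a small pure function. This is an illustration of the cleanup pipeline described above, not TachiMate's actual `ollamaClient.ts` code, and the interface shape is a simplified placeholder:

```typescript
interface CurriculumResponse {
  sections: string[];
}

// Sketch: clean a raw model response and parse it into a typed object,
// mirroring the three parse sub-steps described above.
function parseModelResponse(raw: string): CurriculumResponse {
  const cleaned = raw
    // 1. Strip <think>…</think> reasoning traces (DeepSeek R1 emits these)
    .replace(/<think>[\s\S]*?<\/think>/g, "")
    // 2. Remove markdown code fences the model may wrap around its JSON
    .replace(/`{3}(?:json)?/g, "")
    .trim();
  // 3. Parse into the TypeScript interface; throws on malformed JSON
  return JSON.parse(cleaned) as CurriculumResponse;
}
```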
- Node.js v18+
- npm or yarn
- Ollama installed and running locally
- At least one local LLM pulled (see below)
```bash
# 1. Clone the repo
git clone https://github.com/your-org/tachiMate.git
cd tachiMate

# 2. Install dependencies
npm install

# 3. Start Ollama and pull a model
ollama serve
ollama pull deepseek-r1:8b   # recommended — or llama3, gemma, mistral, phi3

# 4. Start the dev server
npm run dev
```

Open your browser at http://localhost:5173 and enter any topic to begin.
```bash
npm run build
npm run preview
```

All settings are accessible from within the app — no config files needed.
| Setting | Options | Description |
|---|---|---|
| Model | DeepSeek R1 8B, Llama 3, Gemma, Mistral, Phi-3 | The local LLM used for generation |
| Chapter Count | 1 – 10 | Number of chapters in the curriculum |
| Mermaid Charts | On / Off | Toggle diagram generation per chapter |
| TTS Audio | On / Off | Toggle text-to-speech narration |
TachiMate lets you back up and restore your entire learning history as a single portable JSON file — useful for migrating between devices, sharing sessions with others, or just keeping a local backup.
From the app's session history panel, click **Export Data**. TachiMate will bundle everything stored in IndexedDB into a single `.json` file and trigger a browser download:
```json
{
  "version": "1.0",
  "exportedAt": "2025-01-15T10:30:00Z",
  "sessions": [
    {
      "id": "abc123",
      "topic": "Quantum Computing",
      "createdAt": "2025-01-10T08:00:00Z",
      "progress": 0.75,
      "curriculum": {
        "sections": ["Foundations", "Core Concepts", "In Practice", "Mastery"],
        "chapters": [...]
      },
      "generatedChapters": {
        "0": { "markdown": "...", "diagram": "...", "ttsTranscript": "..." }
      }
    }
  ]
}
```

Click **Import Data** and select a previously exported `.json` file. TachiMate will merge the imported sessions into your current IndexedDB — existing sessions with matching IDs are preserved and not overwritten.
Note: Imported sessions are fully offline-readable immediately. The AI engine is only needed if you want to continue generating new chapters within an imported curriculum.
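The merge rule above — imported sessions never clobber existing IDs — can be sketched as a pure function. Names and shapes here are illustrative, not the app's actual IndexedDB code:

```typescript
interface Session {
  id: string;
  topic: string;
}

// Sketch: merge imported sessions into the existing set.
// Existing sessions win on ID collisions, matching the import rule above.
function mergeSessions(existing: Session[], imported: Session[]): Session[] {
  const known = new Set(existing.map((s) => s.id));
  const newcomers = imported.filter((s) => !known.has(s.id));
  return [...existing, ...newcomers];
}
```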
- 📱 Device migration — move your sessions from one machine to another
- 🤝 Sharing — send a pre-generated curriculum to a colleague or student
- 🗄️ Archiving — keep external backups of completed courses outside the browser
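Producing the export file is essentially wrapping the stored sessions in a small envelope. A sketch under the same assumptions — `buildExportPayload` is a hypothetical helper, not a TachiMate export:

```typescript
// Sketch: assemble the export envelope from stored sessions.
// The version / exportedAt / sessions fields mirror the example file above.
function buildExportPayload(sessions: object[]): string {
  return JSON.stringify(
    {
      version: "1.0",
      exportedAt: new Date().toISOString(),
      sessions,
    },
    null,
    2
  );
}
// In the browser, the resulting string would be wrapped in a Blob and
// downloaded via a temporary <a download> link.
```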
| Layer | Technology |
|---|---|
| Frontend | React 18, Vite, TypeScript |
| Styling | Tailwind CSS v4 |
| Local AI Engine | Ollama |
| Data Persistence | IndexedDB (idb) |
| Diagrams | Mermaid.js |
| Markdown | remark-breaks + React Markdown |
| Audio | Native Web Speech API |
Once a chapter is generated and cached, TachiMate works 100% offline.
| Action | Needs Ollama | Needs Internet |
|---|---|---|
| Generate new curriculum | ✅ | ❌ |
| Generate new chapter | ✅ | ❌ |
| Read cached chapter | ❌ | ❌ |
| TTS audio playback | ❌ | ❌ |
| View Mermaid diagrams | ❌ | ❌ |
| Resume saved session | ❌ | ❌ |
```text
tachiMate/
├── src/
│   ├── components/          # UI components
│   ├── context/
│   │   └── AppContext.tsx   # Global React state
│   ├── lib/
│   │   └── ollamaClient.ts  # Ollama REST client
│   ├── db/                  # IndexedDB helpers (idb)
│   └── main.tsx
├── public/
├── index.html
├── vite.config.ts
├── tailwind.config.ts
└── tsconfig.json
```
- Generation speed depends on your hardware and chosen model
- Web Speech API voice quality varies by browser and OS
- Mermaid diagrams are limited to flowchart-style visuals
- Large session histories may require periodic IndexedDB cleanup
Contributions are welcome! Please open an issue first to discuss what you'd like to change.
```bash
# Run linter
npm run lint

# Run type checks
npm run typecheck
```

MIT © TachiMate Contributors
