hpcGPT is a shared CLI foundation built on top of the Opencode agent for HPC centers. Each center keeps its own deployment code, configuration, prompts, and integrations in its own top-level directory.
- `NCSA/` - NCSA-specific deployment (Delta-focused prototype and tooling)
Pick the center deployment you want to run by setting `OPENCODE_CONFIG` to that center's config file:

```sh
curl -fsSL https://opencode.ai/install | bash
export OPENCODE_CONFIG=/absolute/path/to/this/repo/NCSA/opencode.jsonc
opencode
```
```
hpcgpt-cli/
  README.md
  LICENSE
  favicon.png
  NCSA/
    README.md
    opencode.jsonc
    example.env
    example.env.atlassian
    prompts/
    mcp_servers/
    doc-scraping/
```
When adding support for another university or supercomputing center, create a new top-level directory (for example CenterName/) and keep center-specific content scoped there.
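The step above can be sketched with a few shell commands; `CenterName` is a placeholder, and the files and directories created simply mirror the NCSA layout rather than a required scaffold:

```shell
# Placeholder name for the new center's top-level directory
CENTER=CenterName

# Create the center directory with the subdirectories used by the NCSA layout
mkdir -p "$CENTER/prompts" "$CENTER/mcp_servers"

# Stub out the center-scoped files; fill these in with the center's own
# config, env examples, and documentation
touch "$CENTER/opencode.jsonc" "$CENTER/example.env" "$CENTER/README.md"
```

In practice it is likely easier to copy an existing center directory (such as `NCSA/`) and adapt it, keeping shared patterns where they fit.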
A center directory should include:

- `opencode.jsonc` with that center's providers, models, and MCP wiring
- `prompts/` for center-specific assistant behavior
- `mcp_servers/` for local MCP servers owned by that center
- `example.env` and optional additional env examples (e.g., Atlassian)
- `README.md` describing architecture, tools, setup, and operations
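As a rough illustration, a center's `opencode.jsonc` might wire a default model and a local MCP server along these lines. The `$schema`, `model`, and `mcp` keys follow Opencode's published config schema as understood at the time of writing, but the model identifier, server name, and command are placeholders — verify everything against the current Opencode documentation:

```jsonc
{
  // Opencode config schema (check the URL against current Opencode docs)
  "$schema": "https://opencode.ai/config.json",

  // Default model as "provider/model" (placeholder value)
  "model": "myprovider/my-model",

  // Local MCP servers owned by this center (name and command are placeholders)
  "mcp": {
    "center-docs": {
      "type": "local",
      "command": ["python", "mcp_servers/center_docs.py"]
    }
  }
}
```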
- Keep secrets out of git; commit only example env files.
- Keep center-specific naming and endpoints inside that center folder.
- Keep root-level docs and files generic and reusable across centers.
- Update this root `README.md` when adding a new center directory.
- Prefer shared patterns, but allow center-specific implementation details.
- Deployments can use OpenAI-compatible providers; exact provider configuration is center-specific.
- MCP server commands and permissions should be documented in each center's `README.md`.
- Operational runbooks, support flows, and escalation contacts belong in each center folder.
For details on the current NCSA prototype, see `NCSA/README.md`.
MIT - see LICENSE.