Add exec-server stub server and protocol docs #15089
Conversation
This adds the standalone exec-server stdio JSON-RPC crate and its smoke tests without wiring it into the CLI or unified-exec yet. Co-authored-by: Codex <noreply@openai.com>
Document the standalone exec-server crate, its stdio JSON-RPC transport, and the current request/response and notification payloads. Co-authored-by: Codex <noreply@openai.com>
Keep only the standalone handshake transport in the first PR. Leave exec RPC behavior unimplemented here so the working process implementation can land in a separate follow-up.

Co-authored-by: Codex <noreply@openai.com>
Move process-oriented client helpers and exports into the exec follow-up. Keep the first PR focused on spawning the stub server and running the initialize handshake.

Co-authored-by: Codex <noreply@openai.com>
All contributors have signed the CLA ✍️ ✅
💡 Codex Review
Here are some automated review suggestions for this pull request.
Reviewed commit: 76071974bb
codex-rs/exec-server/src/server.rs (outdated)

            continue;
        }

        let message = serde_json::from_str::<JSONRPCMessage>(&line)?;
Handle malformed JSON without exiting server
run_main deserializes each input line with serde_json::from_str(...)? in the main loop. A single malformed line propagates an error out of run_main, so the binary exits and drops the stdio session instead of replying with a JSON-RPC error and continuing. This lets one bad client message take down the whole server process.
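One way to keep the session alive is to answer a malformed line with a JSON-RPC parse error (-32700) and keep looping, rather than propagating the error with `?`. A minimal std-only sketch of that control flow (the `parse_message` stand-in and reply strings are hypothetical, not the crate's actual API):

```rust
// Sketch: on a malformed line, emit a JSON-RPC parse error and keep looping
// instead of propagating the error out of the read loop and exiting.
fn handle_lines(lines: &[&str]) -> Vec<String> {
    let mut replies = Vec::new();
    for line in lines {
        match parse_message(line) {
            Ok(msg) => replies.push(format!("handled: {msg}")),
            Err(()) => {
                // The request id cannot be recovered from a malformed payload,
                // so JSON-RPC 2.0 says to respond with a null id.
                replies.push(
                    r#"{"id":null,"error":{"code":-32700,"message":"Parse error"}}"#.to_string(),
                );
                // crucially: fall through to the next line, no `return Err(...)`
            }
        }
    }
    replies
}

// Hypothetical stand-in for `serde_json::from_str::<JSONRPCMessage>`: treats a
// line as well-formed iff it is wrapped in braces.
fn parse_message(line: &str) -> Result<&str, ()> {
    if line.starts_with('{') && line.ends_with('}') {
        Ok(line)
    } else {
        Err(())
    }
}

fn main() {
    for reply in handle_lines(&[r#"{"method":"initialize"}"#, "not json"]) {
        println!("{reply}");
    }
}
```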
Introduce API-agnostic server envelope parsing/encoding and a tiny method-registration router in the initialize-only exec-server slice. Co-authored-by: Codex <noreply@openai.com>
    #[derive(Debug, Parser)]
    struct ExecServerArgs {
        /// Transport endpoint URL. Supported values: `stdio://` (default),
we should test over the same protocol that we use in prod (I assume ws?)
not blocking
codex-rs/exec-server/src/protocol.rs (outdated)

    #[derive(Debug, Clone, PartialEq, Eq, Serialize, Deserialize)]
    #[serde(rename_all = "camelCase")]
    pub struct InitializeResponse {
        pub protocol_version: String,
I'd nix versioning until we know what we are actually doing with it.
    while let Some(event) = incoming_rx.recv().await {
        match event {
            JsonRpcConnectionEvent::Message(message) => match message {
JsonRpcConnectionEvent doesn't exist in app server so I'm surprised to see this here.
    use crate::rpc::RpcRouter;
    use crate::server::ExecServerHandler;

    pub(crate) fn build_router() -> RpcRouter<ExecServerHandler> {
I tend to favor a match statement in a loop?
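For comparison, the match-in-a-loop shape could look something like this (method names and reply types here are illustrative, not the crate's real API):

```rust
// Sketch: dispatch on the method name with a match arm per handler instead of
// a registration router. Unknown methods map to JSON-RPC -32601.
#[derive(Debug, PartialEq)]
enum Reply {
    Initialized,
    MethodNotFound(i32),
}

fn dispatch(method: &str) -> Reply {
    match method {
        // each arm would call the corresponding handler inline
        "initialize" => Reply::Initialized,
        _ => Reply::MethodNotFound(-32601),
    }
}

fn main() {
    // in the real server this match would sit inside the message-receive loop
    for method in ["initialize", "exec/run"] {
        println!("{method} -> {:?}", dispatch(method));
    }
}
```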
    #[allow(clippy::print_stderr)]
    fn print_websocket_startup_banner(addr: SocketAddr) {
        eprintln!("codex-exec-server listening on ws://{addr}");
We try to ban println and eprintln (there's a lint to block them). Favor the tracing crate.
        eprintln!("codex-exec-server listening on ws://{addr}");
    }

    #[cfg(test)]
Look at core as an example: we do something slightly non-standard and create separate _tests.rs files.
codex-rs/exec-server/README.md (outdated)

    ## Transport

    The server speaks newline-delimited JSON-RPC 2.0 over stdio.
codex-rs/exec-server/README.md (outdated)

    - `stdout`: one JSON-RPC message per line
    - `stderr`: reserved for logs / process errors

    Like the app-server transport, messages on the wire omit the `"jsonrpc":"2.0"`
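Under those conventions, a hypothetical initialize exchange on the wire might look like the following, one message per line with no `"jsonrpc":"2.0"` key (the exact payload fields are illustrative, not taken from the crate):

```json
{"id":1,"method":"initialize","params":{}}
{"id":1,"result":{}}
```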
I have read the CLA Document and I hereby sign the CLA

recheck
Handle malformed JSON-RPC input by replying with an invalid-request error and keeping the connection open, and trim stale transport wording in the README. Co-authored-by: Codex <noreply@openai.com>
Remove dead stub-only dependencies and disable doctests for the exec-server library target so cargo-shear stays green. Co-authored-by: Codex <noreply@openai.com>
Stacked PR 1/3.
This is the initialize-only exec-server stub slice: binary/client scaffolding and protocol docs, without exec/filesystem implementation.