From 13dfaced22a27f55368f4671fe4147306b8703fe Mon Sep 17 00:00:00 2001
From: ygd58
Date: Fri, 10 Apr 2026 22:23:28 +0200
Subject: [PATCH 1/2] docs: add Quick Summary for Developers section to README

Closes #253

Adds a developer-oriented summary table at the top of the README,
before the detailed Overview section. Answers the most common
onboarding questions in plain language:

- What OpenGradient is and what problem it solves
- How developers interact with the SDK
- Where Model Hub and MemSync fit in
---
 README.md | 14 ++++++++++++++
 1 file changed, 14 insertions(+)

diff --git a/README.md b/README.md
index f278d67..b9a56ed 100644
--- a/README.md
+++ b/README.md
@@ -6,6 +6,20 @@
 
 A Python SDK for decentralized model management and inference services on the OpenGradient platform. The SDK provides programmatic access to distributed AI infrastructure with cryptographic verification capabilities.
 
+## Quick Summary for Developers
+
+> **New to OpenGradient?** Start here.
+
+| Question | Answer |
+|---|---|
+| **What is it?** | A decentralized network that runs AI inference inside TEEs and settles every request on-chain |
+| **What problem does it solve?** | Centralized AI is a black box; OpenGradient attaches cryptographic proof to every inference |
+| **How do I use it?** | Install the SDK, get a private key, and call `llm.chat()` much like the OpenAI API, with `transaction_hash` and `tee_signature` in every response |
+| **What is Model Hub?** | A decentralized registry to upload, discover, and run custom ONNX models on-chain |
+| **What is MemSync?** | A long-term memory layer for AI agents with persistent context across sessions |
+
+---
+
 ## Overview
 
 OpenGradient enables developers to build AI applications with verifiable execution guarantees through Trusted Execution Environments (TEE) and blockchain-based settlement. The SDK supports standard LLM inference patterns while adding cryptographic attestation for applications requiring auditability and tamper-proof AI execution.
From 63442ef0f5aa00b29480e71facc3d013f22d07e2 Mon Sep 17 00:00:00 2001
From: ygd58
Date: Fri, 10 Apr 2026 22:25:11 +0200
Subject: [PATCH 2/2] docs: add 30-second quickstart example to README

Closes #253

Adds a runnable 30-second quickstart example after the developer
summary table, before the detailed Overview section. Shows a minimal
`llm.chat()` call with on-chain proof output.
---
 README.md | 18 ++++++++++++++++++
 1 file changed, 18 insertions(+)

diff --git a/README.md b/README.md
index b9a56ed..19cfac9 100644
--- a/README.md
+++ b/README.md
@@ -18,6 +18,24 @@ A Python SDK for decentralized model management and inference services on the Op
 | **What is Model Hub?** | A decentralized registry to upload, discover, and run custom ONNX models on-chain |
 | **What is MemSync?** | A long-term memory layer for AI agents with persistent context across sessions |
 
+### 30-Second Quickstart
+
+Install the SDK, grab a private key from the [faucet](https://faucet.opengradient.ai), and run:
+
+    import asyncio, os, opengradient as og
+
+    async def main():
+        llm = og.LLM(private_key=os.environ["OG_PRIVATE_KEY"])
+        llm.ensure_opg_approval(min_allowance=0.1)
+        result = await llm.chat(
+            model=og.TEE_LLM.GEMINI_2_5_FLASH,
+            messages=[{"role": "user", "content": "Hello!"}],
+        )
+        print(result.chat_output["content"])  # AI response
+        print(result.transaction_hash)        # on-chain proof
+
+    asyncio.run(main())
+
 ---
 
 ## Overview