Physical Execution Interface for science labs — connect AI brains to robot bodies.
Part of the LabClaw ecosystem.
Science labs need to run experiments physically, not just plan them in software. lab-robot provides a unified driver layer that lets AI agents execute real-world lab operations — pipetting, plate handling, incubation — across different robot hardware through a single protocol. No more writing throwaway scripts for each new machine.
- Unified PEI Protocol — abstract robot actions into hardware-agnostic primitives (motion, lab-ops, perception, system)
- Rich ActionResult — every action returns structured results with measurements, state changes, and error details
- Safety-First — chain-of-responsibility safety guards (force limits, workspace bounds, collision detection) at CRITICAL level by default
- Async-Native — built on Python async for concurrent robot orchestration
- Typed & Validated — full Pydantic 2.x schemas and PEP 561 type markers
- Extensible — add new robots by implementing the `RobotDriver` protocol
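The chain-of-responsibility safety guards mentioned above can be sketched as follows. This is an illustrative sketch only: `SafetyGuard`, `ForceLimitGuard`, `WorkspaceBoundsGuard`, and the action field names are assumptions for this example, not the actual lab-robot API.

```python
from dataclasses import dataclass


@dataclass
class GuardResult:
    allowed: bool
    reason: str = ""


class SafetyGuard:
    """Chain-of-responsibility link: each guard checks, then delegates."""

    def __init__(self, successor: "SafetyGuard | None" = None):
        self.successor = successor

    def check(self, action: dict) -> GuardResult:
        result = self._evaluate(action)
        if not result.allowed:
            return result  # first failure short-circuits the chain
        if self.successor is not None:
            return self.successor.check(action)
        return GuardResult(allowed=True)

    def _evaluate(self, action: dict) -> GuardResult:
        return GuardResult(allowed=True)


class ForceLimitGuard(SafetyGuard):
    MAX_FORCE_N = 5.0  # illustrative limit

    def _evaluate(self, action: dict) -> GuardResult:
        if action.get("force_n", 0.0) > self.MAX_FORCE_N:
            return GuardResult(False, f"force exceeds {self.MAX_FORCE_N} N")
        return GuardResult(True)


class WorkspaceBoundsGuard(SafetyGuard):
    def _evaluate(self, action: dict) -> GuardResult:
        x, y, z = action.get("target_mm", (0, 0, 0))
        if not (0 <= x <= 300 and 0 <= y <= 200 and 0 <= z <= 150):
            return GuardResult(False, "target outside workspace bounds")
        return GuardResult(True)


# Guards are chained; an action must pass every link to be allowed.
chain = ForceLimitGuard(successor=WorkspaceBoundsGuard())
print(chain.check({"force_n": 2.0, "target_mm": (50, 50, 10)}).allowed)  # True
print(chain.check({"force_n": 9.0, "target_mm": (50, 50, 10)}).reason)
```

The chain shape makes it easy to prepend stricter guards (e.g. a collision check) without touching driver code.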
| Robot | Status | Mode |
|---|---|---|
| Opentrons OT-2 | Active development | Simulate |
```bash
pip install lab-robot             # core only
pip install lab-robot[opentrons]  # + Opentrons OT-2 driver
```

```python
import asyncio

from lab_robot.types import PipetteAction
from robots.opentrons_ot2 import OT2Driver
from robots.opentrons_ot2.models import (
    OT2DeckConfig,
    PipetteConfig,
    PipetteMount,
    LabwareConfig,
)

# Configure the OT-2 deck layout
deck = OT2DeckConfig(
    slots={"1": LabwareConfig(labware_type="nest_96_wellplate_200ul_flat")},
    pipette_left=PipetteConfig(
        name="p300_single",
        mount=PipetteMount.LEFT,
        max_volume_ul=300,
        tip_rack_slots=["2"],
    ),
)

async def main():
    driver = OT2Driver(deck_config=deck, simulate=True)
    await driver.connect()
    result = await driver.execute(PipetteAction(
        volume_ul=30.0,
        source_well="A1",
        dest_well="B1",
    ))
    print(result)  # ActionResult(success=True, ...)
    await driver.disconnect()

asyncio.run(main())
```

- Copy `robots/_robot_template/` to `robots/<your_robot>/`
- Implement the `RobotDriver` protocol (`connect`, `disconnect`, `execute`, `stop`, `capabilities`)
- Add your package to `pyproject.toml` optional dependencies
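A new driver built from the template might look like the toy example below. The `Protocol` signatures are inferred from the method names listed above and may differ from the real `RobotDriver` definition; `EchoDriver` is purely hypothetical.

```python
import asyncio
from typing import Any, Protocol


class RobotDriver(Protocol):
    """Assumed shape of the driver protocol (connect/disconnect/execute/stop/capabilities)."""

    async def connect(self) -> None: ...
    async def disconnect(self) -> None: ...
    async def execute(self, action: Any) -> dict: ...
    async def stop(self) -> None: ...
    def capabilities(self) -> list[str]: ...


class EchoDriver:
    """Toy driver that echoes actions back; useful as a smoke test for the template."""

    def __init__(self) -> None:
        self.connected = False

    async def connect(self) -> None:
        self.connected = True

    async def disconnect(self) -> None:
        self.connected = False

    async def execute(self, action: Any) -> dict:
        if not self.connected:
            raise RuntimeError("driver not connected")
        return {"success": True, "echo": action}

    async def stop(self) -> None:
        pass  # nothing in flight for the toy driver

    def capabilities(self) -> list[str]:
        return ["echo"]


async def demo() -> dict:
    driver: RobotDriver = EchoDriver()
    await driver.connect()
    result = await driver.execute({"op": "noop"})
    await driver.disconnect()
    return result


print(asyncio.run(demo()))  # {'success': True, 'echo': {'op': 'noop'}}
```

Because `RobotDriver` is a structural `Protocol`, the driver needs no base class — any object with matching async methods satisfies it.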
See CONTRIBUTING.md for the full guide.
```
labclaw (orchestration)
├── lab-robot       ← you are here (physical execution)
├── device-use      (GUI & visual interaction)
└── device-skills   (device drivers & skills)
```
lab-robot is the Physical Execution Interface (PEI) — Layer 1 of the LabClaw stack. It translates high-level lab operations into hardware-specific commands.
See PEI Specification for the full protocol design.
- Phase 0 — Opentrons OT-2 simulate mode (current)
- Phase 1 — OT-2 real hardware + safety guards
- Phase 2 — Multi-robot orchestration + perception layer
If you use lab-robot in your research, please cite:
```bibtex
@software{labclaw_lab_robot,
  author  = {LabClaw Team},
  title   = {lab-robot: Physical Execution Interface for Science Labs},
  year    = {2026},
  url     = {https://github.com/labclaw/lab-robot},
  license = {Apache-2.0}
}
```

Apache 2.0