**content/learning-paths/embedded-and-microcontrollers/device-connect-strands/background.md**
Two packages make this work.
**`device-connect-agent-tools`** is the agent-side runtime. It exposes `discover_devices()` and `invoke_device()` as plain Python functions. The `robot_mesh` tool in Strands Robots wraps the same interface as a Strands tool, so an LLM can call it too. Both use the same underlying transport, so the same calls work whether the caller is a script or an agent.
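The discover-then-invoke pattern can be sketched with an in-memory stand-in (illustrative only: the registry, `register_device`, and the function table below are hypothetical, and the real `discover_devices()`/`invoke_device()` route calls over Zenoh rather than a local dict):

```python
# In-memory stand-in for the agent-side interface: a registry maps peer IDs
# to callable device functions. The real implementation routes over Zenoh.
registry = {}

def register_device(peer_id, functions):
    """Stand-in for a device announcing itself on the mesh."""
    registry[peer_id] = functions

def discover_devices():
    """Stand-in for mesh discovery: list known peer IDs."""
    return list(registry.keys())

def invoke_device(peer_id, function, payload):
    """Stand-in for an RPC: run the named function on the target device."""
    return registry[peer_id][function](payload)

# A fake device exposing an execute-style function.
register_device("so100-abc123", {
    "execute": lambda p: {"status": "accepted", "task": p["instruction"]},
})

print(discover_devices())
print(invoke_device("so100-abc123", "execute", {"instruction": "wave"}))
```

The same two calls work whether the caller is a plain script or an LLM going through the `robot_mesh` tool, which is the point of wrapping one interface both ways.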
## What a device exposes
This Learning Path uses the SO-100 arm, an open-source robot arm. When `Robot('so100').run()` starts, it registers on the mesh and exposes three callable functions. These are what `invoke_device()` on the agent side targets — calling `invoke_device("so100-abc123", "execute", {...})` routes a request over Zenoh to the robot process and executes the function there, returning the result back to the caller:
- `execute` — send a natural language instruction and a policy provider to the robot
- `getStatus` — query what the robot is currently doing
- `stop` — halt the current task, or `emergency_stop` to halt every device on the mesh at once
A motion policy is the component that translates a high-level instruction like "pick up the cube" into a sequence of joint movements. Different policy providers connect to different backends — from local model inference to remote policy servers. For this Learning Path, `policy_provider='mock'` is used, so `execute` accepts the task and returns immediately without running real motion. Replacing `'mock'` with a real provider like `'lerobot_local'` or `'groot'` is a one-line change once you have the connectivity working.
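The mock provider's behavior can be sketched as follows (a hypothetical provider interface; the class and method names are assumptions, not the SDK's actual API):

```python
# Hypothetical sketch of a policy provider. A real provider translates an
# instruction into joint movements; the mock accepts the task and returns
# immediately without computing any motion.
class MockPolicyProvider:
    def execute(self, instruction):
        # No inference, no motion: the task is acknowledged right away.
        return {"status": "accepted", "instruction": instruction, "motion": None}

provider = MockPolicyProvider()
result = provider.execute("pick up the cube")
print(result["status"])
```

Because the provider is the only piece that touches motion, swapping the mock for a real backend leaves the surrounding connectivity code untouched.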
Beyond discrete RPC calls, devices can also publish a continuous stream of sensor data over the same mesh. A camera publishes image frames, a depth sensor publishes point clouds, and an IMU reports pose updates — all as Device Connect events that any subscriber on the network receives in real time. The simulated robot in this Learning Path runs a 10 Hz background loop that emits `stateUpdate` and `observationUpdate` events to any listener on the mesh.
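The publish loop can be sketched with an in-memory stand-in for mesh listeners (illustrative only; real events travel over the Device Connect mesh, and the payload fields are assumptions):

```python
import time

subscribers = []  # stand-in for listeners on the mesh
events = []       # record of delivered events

def publish(event_type, payload):
    # Fan the event out to every subscriber, as the mesh would.
    for callback in subscribers:
        callback(event_type, payload)

subscribers.append(lambda t, p: events.append((t, p)))

# Emit stateUpdate and observationUpdate at 10 Hz (0.1 s period); three ticks.
for tick in range(3):
    publish("stateUpdate", {"joints": [0.0] * 6, "tick": tick})
    publish("observationUpdate", {"tick": tick})
    time.sleep(0.1)

print(len(events))
```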
This section involves two machines. Keep track of which commands run where:
| Machine | Terminal | Purpose |
|---------|----------|---------|
| Host | 1 | Docker Compose infrastructure |
## Step 3 - prepare the Raspberry Pi
On the Raspberry Pi, follow the same repository and environment setup from the setup section: clone the `robots` repository and run `setup.sh` to install all dependencies.
Once the environment is ready, export the three variables that tell the SDK to route traffic through the Device Connect router on your host rather than using local network discovery:
```python
python <<'PY'
import logging
logging.basicConfig(level=logging.INFO)
from strands_robots import Robot
r = Robot('so100', peer_id='so100-abc123')
r.run()
PY
```
```
device_connect_sdk.device.so100-abc123 - INFO - Connected to ZENOH broker: ['tcp
device_connect_sdk.device.so100-abc123 - INFO - Device registered: registration_id=ecfff6a7-...
```
Leave this process running on the Pi.
## Discover and invoke using the robot_mesh Strands tool
Send an instruction to the robot using the peer ID set in Step 4:
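A sketch of the shape of that instruction call, with a local stand-in for `invoke_device` (the payload keys are assumptions; the real function from the agent tools routes the RPC over Zenoh to the robot process):

```python
# Stand-in for invoke_device from device-connect-agent-tools; the real
# function routes this call over the mesh. Payload keys are illustrative.
def invoke_device(peer_id, function, payload):
    return {"peer": peer_id, "function": function, "accepted": True}

result = invoke_device("so100-abc123", "execute", {
    "instruction": "wave the arm",
    "policy_provider": "mock",
})
print(result["accepted"])
```

With `policy_provider='mock'` the robot acknowledges the task immediately, so a fast response here confirms connectivity rather than motion.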
In this section you've:
- Connected a Raspberry Pi as a remote device by pointing its SDK at the router's TCP address.
- Discovered the Pi's robot from your host by querying the persistent registry and sent commands to it across the network.
This two-device setup demonstrates the foundation for larger deployments. Once devices register through a shared infrastructure, adding more is a matter of pointing them at the same router — whether that's another Raspberry Pi in the same lab or a device on a different network.
**content/learning-paths/embedded-and-microcontrollers/device-connect-strands/setup.md**
## Verify the required tools
Before cloning the repository, confirm that Git is available:
```bash
git --version
```
## Clone the repository
Clone the `robots` repository, which contains the robot runtime and the `robot_mesh` Strands tool.
The repository includes a `setup.sh` script that installs `uv`, creates a Python 3.12 virtual environment, and installs all required packages:
```bash
cd ~/strands-device-connect/robots
./strands_robots/device_connect/setup.sh
source .venv/bin/activate
```
## How discovery works - no configuration needed
The `strands-robots` SDK uses Device Connect's built-in device-to-device discovery: every `Robot()` instance announces itself on the local network at startup, and any process running `discover_devices()` or `robot_mesh(action='peers')` on the same network segment will find it automatically.
At this point you've:
64
46
65
-
- Cloned `robots` with the `dev` branch checked out.
47
+
- Cloned the `robots` repository.
- Created a Python 3.12 virtual environment with the Device Connect SDK, agent tools, and robot simulation runtime all installed.
The next section walks you through starting a simulated robot and invoking it from both the agent tools and the `robot_mesh` Strands tool.