A senior design capstone project that fuses an Intel RealSense RGB-D camera with a TI mmWave radar for real-time pedestrian detection and autonomous vehicle safety. The system detects, classifies, and tracks objects, then issues SAFE, CAUTION, or STOP commands based on distance, velocity, and confidence from both sensors.
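The decision step can be pictured with a minimal sketch. All thresholds, the equal-weight confidence fusion, and the time-to-collision guard below are illustrative assumptions; the project's actual logic and tuning live in `brain_node.py`:

```python
def decide(distance_m: float, closing_speed_mps: float,
           cam_conf: float, radar_conf: float) -> str:
    """Map fused sensor readings to a SAFE / CAUTION / STOP command.

    Hypothetical sketch: thresholds and the fused-confidence formula
    are assumptions, not the project's real tuning.
    """
    fused_conf = 0.5 * (cam_conf + radar_conf)  # assumed equal weighting
    if fused_conf < 0.3:                        # too uncertain to act on
        return "SAFE"
    # Time-to-collision: distance over closing speed, if approaching.
    ttc = distance_m / closing_speed_mps if closing_speed_mps > 0 else float("inf")
    if distance_m < 2.0 or ttc < 1.0:
        return "STOP"
    if distance_m < 5.0 or ttc < 3.0:
        return "CAUTION"
    return "SAFE"
```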
```shell
# 1. Start the brain node + rosbridge in WSL (two terminals)
# 2. Then, in Windows/Anaconda terminals, from the src/ folder:
python camera_to_rosbridge.py   # Terminal 3
python radar_to_rosbridge.py    # Terminal 4
python gui_app.py               # Terminal 5
```

See Setup and Running for full instructions.
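The `*_to_rosbridge.py` scripts forward sensor readings from Windows to the WSL-side brain node using the rosbridge protocol, which is JSON over a WebSocket. The topic name and message fields below are assumptions for illustration, not the project's actual schema:

```python
import json

def make_publish_op(topic: str, msg: dict) -> str:
    """Serialize a rosbridge 'publish' operation (rosbridge v2 protocol)."""
    return json.dumps({"op": "publish", "topic": topic, "msg": msg})

# Example: a hypothetical radar detection message.
frame = make_publish_op(
    "/radar/points",                  # topic name is an assumption
    {"x": 1.2, "y": -0.4, "v": 0.8},  # fields are an assumption
)
# A real bridge would send `frame` over a WebSocket to rosbridge,
# e.g. with the websocket-client package: ws.send(frame)
```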
Full documentation lives in src/documentation/:
- Project Overview -- Goals, sensors, and what the system does
- System Architecture -- 4-process design and data flow
- Math and Science -- Radar processing, camera depth, 3D projection, YOLO, fusion theory
- Code Guide -- File-by-file breakdown with every function and constant
- ROS Topics Reference -- All 14 topics with types and message examples
- Setup and Running -- Prerequisites, dependencies, startup order
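The Math and Science doc covers projecting radar points into the camera image. As a minimal pinhole-camera sketch, with placeholder intrinsics (the project's real values come from `calibration.json` and the RealSense factory calibration):

```python
# Placeholder intrinsics: focal lengths (px) and principal point.
fx, fy = 615.0, 615.0
cx, cy = 320.0, 240.0

def project(x: float, y: float, z: float) -> tuple[float, float]:
    """Project a 3D point in the camera frame (meters) to a pixel (u, v)."""
    return fx * x / z + cx, fy * y / z + cy

# A point 2 m straight ahead lands at the principal point.
u, v = project(0.0, 0.0, 2.0)
```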
```
Capstone_MD030/
├── README.md                 ← you are here
├── src/                      ← production code
│   ├── brain_node.py
│   ├── camera_module.py
│   ├── camera_to_rosbridge.py
│   ├── gui_app.py
│   ├── radar_module.py
│   ├── radar_to_rosbridge.py
│   ├── calibration.json
│   ├── profile.cfg
│   ├── yolo26n.pt
│   └── documentation/
└── experiments/              ← archived iterations and tools
```
| Component | Technology |
|---|---|
| Object detection | Ultralytics YOLO (YOLOv8-nano) |
| Camera | Intel RealSense D400 (pyrealsense2) |
| Radar | TI mmWave IWR-series (pyserial) |
| Middleware | ROS 2 Humble + rosbridge |
| GUI | PySide6 + OpenCV |