Self-Reconfigurable Quadruped From Scratch
OVERVIEW
For my winter project, I built a self-reconfigurable quadruped that can switch between walking and rolling modes. I also integrated ROS2 and YOLO for an autonomous bowling task driven by computer vision.
DEMO
HARDWARE
- Raspberry Pi 5
- Intel RealSense Depth Camera D435i
- DYNAMIXEL U2D2 Power Hub
- DYNAMIXEL U2D2
- DYNAMIXEL XC430-T240BB-T Motor (x 12)
- Adafruit ISM330DHCX - 6 DoF IMU - Accelerometer and Gyroscope
- 3S 11.1V 5200mAh 50C LiPo Battery
DESIGN
The research paper "Scorpio: A biomimetic reconfigurable rolling-crawling robot" inspired the design of my robot, which I fully modeled in Onshape.
I chose DYNAMIXEL motors for their high torque output and well-documented ROS2 integration. For the onboard computer, I used the Raspberry Pi 5, which supports Ubuntu and runs ROS2 efficiently.

PROCESSING IMU DATA FOR ROLLING
To determine how to propel the quadruped robot in rolling mode, I process accelerometer and gyroscope data from the ISM330DHCX sensor via I²C on a Raspberry Pi 5.
The key goal is to compute the tilt angle of the robot using acceleration in the Y and Z axes:
\[\theta = \operatorname{atan2}\left(a_y, a_z\right)\]
where:
- \( a_y \) : linear acceleration along the Y-axis
- \( a_z \) : linear acceleration along the Z-axis
The two-argument arctangent covers the full ±180° range, which is needed to distinguish the robot's two sides. The angle is converted from radians to degrees to determine the current orientation and decide which side (blue or yellow) is on the ground:
\[\theta_{\text{deg}} = \theta \cdot \frac{180}{\pi}\]
After averaging a short window of tilt measurements, the robot checks whether the orientation falls within set thresholds:
BLUE side is underneath if:
\[\theta_{\text{deg}} \in [-179^\circ, -162^\circ] \cup [167^\circ, 179^\circ]\]
YELLOW side is underneath if:
\[\theta_{\text{deg}} \in [-23^\circ, 15^\circ]\]
Depending on which side is down, the system issues the corresponding movement command: roll forward with YELLOW-side propulsion or roll forward with BLUE-side propulsion.
The system runs in real time and adjusts its propulsion mode dynamically based on IMU feedback.
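A minimal sketch of this loop, assuming the Adafruit CircuitPython driver for the ISM330DHCX (the adafruit_lsm6ds package, via Blinka on the Pi); the window size and loop rate are illustrative, not the exact values I used:

```python
import math
import time

import board
from adafruit_lsm6ds.ism330dhcx import ISM330DHCX

WINDOW = 10  # samples per average (illustrative)

i2c = board.I2C()         # Raspberry Pi I2C bus (via Blinka)
sensor = ISM330DHCX(i2c)

def tilt_deg() -> float:
    """Tilt about the X-axis from Y/Z acceleration, in degrees."""
    _, a_y, a_z = sensor.acceleration            # m/s^2
    return math.degrees(math.atan2(a_y, a_z))    # full +/-180 degree range

def side_down() -> str | None:
    """Average a short window of tilt readings and classify the down side."""
    avg = sum(tilt_deg() for _ in range(WINDOW)) / WINDOW
    if -179 <= avg <= -162 or 167 <= avg <= 179:
        return "BLUE"
    if -23 <= avg <= 15:
        return "YELLOW"
    return None  # outside both bands: hold the current propulsion mode

while True:
    side = side_down()
    if side is not None:
        print(f"{side} side down -> roll forward with {side} propulsion")
    time.sleep(0.05)
```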
COMPUTER VISION
I used YOLOv11 from Ultralytics, trained on a bowling-pin dataset to improve robustness. Since the model performs worse on blue bowling pins, I added a second red pin so the robot transforms correctly when it faces the midpoint between the two pins.
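As a sketch of the detection step (the weights path and confidence cutoff are placeholders):

```python
import cv2
from ultralytics import YOLO

model = YOLO("best.pt")  # fine-tuned bowling-pin weights (placeholder path)

def count_pins(frame, conf: float = 0.5) -> int:
    """Run one YOLO inference pass and count pin detections above `conf`."""
    results = model(frame, conf=conf, verbose=False)
    return len(results[0].boxes)

# Quick check against a webcam / RealSense color frame:
cap = cv2.VideoCapture(0)
ok, frame = cap.read()
if ok:
    print(f"pins detected: {count_pins(frame)}")
cap.release()
```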
AUTONOMOUS BOWLING SEQUENCE
- The robot begins turning while simultaneously scanning for bowling pins.
- There is a 7-second pause between turns.
- If the robot detects at least two pins continuously for 3 seconds, it transitions to rolling mode (see the detection-gate sketch after this list).
- Once in rolling mode, it reads IMU data to determine tilt angles around the X-axis before rolling forward.
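A minimal sketch of the 3-second detection gate; the per-frame pin counts below are simulated stand-ins for live YOLO output:

```python
import time

class DetectionHold:
    """Fires once a condition has held continuously for `hold_s` seconds."""

    def __init__(self, hold_s: float = 3.0):
        self.hold_s = hold_s
        self.since = None  # time the condition first became true

    def update(self, condition: bool) -> bool:
        now = time.monotonic()
        if not condition:
            self.since = None  # any gap in detection resets the timer
            return False
        if self.since is None:
            self.since = now
        return now - self.since >= self.hold_s

# Transition to rolling once >= 2 pins stay in view for 3 s straight.
gate = DetectionHold(3.0)
frames = [0, 2, 1] + [2] * 40  # simulated per-frame pin counts
for pin_count in frames:
    time.sleep(0.1)  # ~10 Hz detection loop (illustrative)
    if gate.update(pin_count >= 2):
        print("transform to rolling mode")
        break
```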

NODES
The bowling task is divided into several subsystems:
Control (Action Server and Client Nodes)
These nodes update the robot's state (TURNING, STOPPING, TRANSFORM TO ROLLING, ROLLING, etc.) based on the number of detected bowling pins and publish the corresponding robot configuration for the Motor Control Node to apply to the DXL motors.
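The real nodes use the ROS2 action interface, but the state-update-and-publish step reduces to roughly this sketch (topic names and the transition rule shown are illustrative):

```python
import rclpy
from rclpy.node import Node
from std_msgs.msg import Int32, String

class ControlNode(Node):
    """Updates the robot state from the pin count and publishes it."""

    def __init__(self):
        super().__init__("control_node")
        self.state = "TURNING"
        self.state_pub = self.create_publisher(String, "/robot_state", 10)
        self.create_subscription(Int32, "/pin_count", self.on_pins, 10)

    def on_pins(self, msg: Int32):
        # Simplified transition: enough pins in view -> start transforming.
        if self.state == "TURNING" and msg.data >= 2:
            self.state = "TRANSFORM_TO_ROLLING"
        self.state_pub.publish(String(data=self.state))

def main():
    rclpy.init()
    rclpy.spin(ControlNode())

if __name__ == "__main__":
    main()
```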
Computer Vision (YOLO Node)
This node processes data from the camera and counts the number of bowling pins, publishing the result to a topic.
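A sketch of the node's shape, assuming the RealSense color stream arrives on /camera/color/image_raw (a typical realsense2_camera topic) and the count goes out on /pin_count (illustrative name):

```python
import rclpy
from rclpy.node import Node
from sensor_msgs.msg import Image
from std_msgs.msg import Int32
from cv_bridge import CvBridge
from ultralytics import YOLO

class YoloNode(Node):
    """Counts bowling pins in each camera frame and publishes the count."""

    def __init__(self):
        super().__init__("yolo_node")
        self.bridge = CvBridge()
        self.model = YOLO("best.pt")  # fine-tuned weights (placeholder path)
        self.pub = self.create_publisher(Int32, "/pin_count", 10)
        self.create_subscription(
            Image, "/camera/color/image_raw", self.on_image, 10)

    def on_image(self, msg: Image):
        frame = self.bridge.imgmsg_to_cv2(msg, desired_encoding="bgr8")
        results = self.model(frame, verbose=False)
        self.pub.publish(Int32(data=len(results[0].boxes)))

def main():
    rclpy.init()
    rclpy.spin(YoloNode())

if __name__ == "__main__":
    main()
```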
Dynamixel Motor Control (Quad Motor Control Node)
This node controls the motors based on the robot’s state and processes IMU data to determine tilt angles during ROLLING mode.
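A minimal sketch of commanding one XC430 through the U2D2 with the Dynamixel SDK's Python bindings; the serial port, baud rate, and goal position are assumptions:

```python
from dynamixel_sdk import PortHandler, PacketHandler

# X-series (XC430) control table addresses, protocol 2.0
ADDR_TORQUE_ENABLE = 64
ADDR_GOAL_POSITION = 116

port = PortHandler("/dev/ttyUSB0")   # U2D2 serial device (assumed)
packet = PacketHandler(2.0)
port.openPort()
port.setBaudRate(57600)              # assumed baud rate

def set_position(dxl_id: int, ticks: int) -> None:
    """Enable torque and move one joint to a goal position (0-4095 ticks)."""
    packet.write1ByteTxRx(port, dxl_id, ADDR_TORQUE_ENABLE, 1)
    packet.write4ByteTxRx(port, dxl_id, ADDR_GOAL_POSITION, ticks)

# Example: center joint ID 1 (2048 ticks = 180 degrees on the XC430).
set_position(1, 2048)
port.closePort()
```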
