SO-101 Setup Guide
From parts to first data collection. Estimated time: ~3–4 hours (not counting 3D print time).
Assembly
~60 min + print time

The SO-101 is a fully open-source arm. All parts are either 3D-printed or available as off-the-shelf hardware listed in the LeRobot BOM on HuggingFace.
Parts you need
- 6× Feetech STS3215 servo motors
- 3D-printed structural parts (STL files in the SO-101 GitHub repo)
- USB-to-serial adapter cable (CH340 or CP2102 chip)
- 12V power supply (3A minimum)
- Servo cables and connector hardware (per BOM)
Assembly checklist
- Print all structural components (base, links, end-effector)
- Install STS3215 servos into their respective link housings
- Route servo cables through the printed cable channels
- Daisy-chain servos in the correct order (IDs 1–6 from base to tip)
- Secure the base to a stable surface before powering on
- Read the safety page before applying power
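The daisy-chain order matters because LeRobot addresses each servo by its bus ID. As a rough self-check after wiring, you can ping each expected ID over the serial bus. The sketch below builds a ping packet using SCS/Dynamixel-1.0-style framing, which the STS3215 is generally understood to use; the port path and baud rate are illustrative assumptions, and the real LeRobot tooling handles this for you.

```python
def scs_ping_packet(servo_id: int) -> bytes:
    """Build a ping instruction packet (SCS / Dynamixel-1.0-style framing)."""
    length, instruction = 2, 0x01          # length = n_params + 2; PING has no params
    checksum = (~(servo_id + length + instruction)) & 0xFF
    return bytes([0xFF, 0xFF, servo_id, length, instruction, checksum])

def ping_chain(port: str = "/dev/ttyACM0", ids=range(1, 7)) -> list[int]:
    """Return the IDs that answered; a missing ID hints at a wiring or ID problem."""
    import serial                          # pyserial
    found = []
    with serial.Serial(port, baudrate=1_000_000, timeout=0.05) as bus:
        for sid in ids:
            bus.reset_input_buffer()
            bus.write(scs_ping_packet(sid))
            reply = bus.read(6)            # a well-formed status packet is 6 bytes
            if len(reply) == 6 and reply[2] == sid:
                found.append(sid)
    return found
```

If `ping_chain()` reports fewer than six IDs, re-check the daisy-chain cabling and each servo's configured ID before proceeding.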
3D Printing the Parts
~8–16 hrs print time

All structural components of the SO-101 are FDM-printable on standard desktop printers. STL files are organized into single-file prints for each arm, making slicing straightforward.
Recommended Slicer Settings
STL Files — Which to Print
Pre-arranged single-file prints are available for common bed sizes:
- 220×220 mm bed (Ender 3):
  - Follower: STL/SO101/Follower/Ender_Follower_SO101.stl
  - Leader: STL/SO101/Leader/Ender_Leader_SO101.stl
- 205×250 mm bed (Prusa / UP):
  - Follower: STL/SO101/Follower/Prusa_Follower_SO101.stl
  - Leader: STL/SO101/Leader/Prusa_Leader_SO101.stl
Before committing to the full print, print the fit gauges in STL/Gauges/ and test them against a Lego brick or an STS3215 servo. A correct fit on the gauge confirms your printer calibration is accurate; adjust scaling if needed.
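If the gauge comes out slightly over- or under-sized, the correction is just the ratio of nominal to measured dimension, applied as the slicer's XY scale percentage:

```python
def xy_scale_percent(nominal_mm: float, measured_mm: float) -> float:
    """Slicer scale percentage that corrects a measured print dimension."""
    return round(100.0 * nominal_mm / measured_mm, 2)
```

For example, a 20.0 mm feature that prints at 20.2 mm calls for scaling the model to about 99 % before reslicing.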
Software Install
~15 min

The SO-101 is natively supported by HuggingFace LeRobot. No additional plugin is needed — just install LeRobot.
Install LeRobot
# Using pip
pip install lerobot
# Or with uv (recommended)
uv pip install lerobot
Linux serial port permissions
On Linux, serial ports under /dev/ttyACM* require the user to be in the dialout group. Run this once and log out and back in:
sudo usermod -aG dialout $USER
# Then log out and back in, or run:
newgrp dialout
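Because the group change only applies to new login sessions, it is easy to think it took effect when it didn't. A quick Python check of the current process's supplementary groups:

```python
import grp
import os

def in_group(name: str = "dialout") -> bool:
    """True if the current process's supplementary groups include `name`."""
    try:
        gid = grp.getgrnam(name).gr_gid
    except KeyError:
        return False                     # group doesn't exist on this system
    return gid in os.getgroups()

if __name__ == "__main__":
    print("dialout ok" if in_group() else "log out and back in first")
```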
Prerequisites
- Python 3.10+
- Linux (Ubuntu 22.04 recommended) or macOS
- USB-to-serial driver installed (CH340 driver on macOS; usually pre-installed on Linux)
Port Detection & Calibration
~20 min

Find the correct USB serial port for the arm, then run the LeRobot calibration script to set servo zero positions.
Find the serial port
python lerobot/scripts/find_motors_bus_port.py
Plug and unplug the USB cable when prompted. The script identifies which port the arm is connected to. Typical values:
# Linux: /dev/ttyACM0 (or ttyUSB0 for CH340 adapters)
# macOS: /dev/tty.usbmodem* or /dev/tty.usbserial-*
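If several serial devices are plugged in, you can narrow the candidates by name before running the detection script. This sketch filters device paths by the patterns above; the live list comes from pyserial's `serial.tools.list_ports`.

```python
def arm_port_candidates(devices: list[str]) -> list[str]:
    """Keep device paths that match the usual SO-101 adapter names."""
    patterns = ("ttyACM", "ttyUSB", "usbmodem", "usbserial")
    return [d for d in devices if any(p in d for p in patterns)]

def list_candidates() -> list[str]:
    from serial.tools import list_ports   # pyserial
    return arm_port_candidates([p.device for p in list_ports.comports()])
```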
Run calibration
Move the arm through its full range of motion when prompted:
python lerobot/scripts/calibrate.py \
--robot.type=so101 \
--robot.port=/dev/ttyACM0
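Conceptually, calibration records each servo's raw tick reading at a reference pose and stores it as an offset; later readings are reported relative to that zero. A minimal sketch of the idea, assuming a 4096-tick encoder with wrap-around (the exact scheme LeRobot uses may differ):

```python
def normalize(raw: int, offset: int, resolution: int = 4096) -> int:
    """Map a raw servo tick to a signed position centered on the calibrated zero."""
    delta = (raw - offset) % resolution
    if delta > resolution // 2:          # wrap to the shorter direction
        delta -= resolution
    return delta
```

The wrap handling is why calibration matters: without a stored offset, a joint that happens to sit near the encoder's rollover point would jump by a full revolution in the reported position.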
First Motion Test
~15 min

Run the teleoperate script in single-arm mode to verify all joints respond correctly before connecting a leader arm.
python lerobot/scripts/teleoperate.py \
--robot.type=so101 \
--robot.port=/dev/ttyACM0
What to verify
- All 6 joints respond to commands without skipping
- No servo stall or overload warnings in the terminal
- Gripper opens and closes through full range
- No cable snagging at any joint position
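If you log joint positions during the test, a small helper can confirm each joint actually swept most of its range rather than sticking. The joint names and limits you pass in are your own; the ones in the test below are illustrative, not the SO-101's published limits.

```python
def check_sweep(log: dict[str, list[float]],
                limits: dict[str, tuple[float, float]],
                coverage: float = 0.8) -> list[str]:
    """Return joints whose logged motion covered < `coverage` of their limit span."""
    weak = []
    for joint, (lo, hi) in limits.items():
        samples = log.get(joint, [])
        span = (max(samples) - min(samples)) if samples else 0.0
        if span < coverage * (hi - lo):
            weak.append(joint)
    return weak
```

An empty return list means every joint covered at least the requested fraction of its range.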
Teleoperation
~30 min

The SO-101 works as a standalone arm or as a follower arm with a leader arm for teleoperation. Using a second arm as leader produces higher-quality demonstrations for imitation learning.
Standalone mode (keyboard / programmatic)
python lerobot/scripts/teleoperate.py \
--robot.type=so101 \
--robot.port=/dev/ttyACM0
With a leader arm (e.g. DK1 leader)
python lerobot/scripts/teleoperate.py \
--robot.type=so101 \
--robot.port=/dev/ttyACM0 \
--teleop.type=so101 \
--teleop.port=/dev/ttyACM1
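Under the hood, leader–follower teleoperation is a simple loop: read the leader's joint positions and command the follower to match, at a fixed rate. A stripped-down sketch with stand-in bus objects — the real LeRobot classes and method names differ:

```python
import time

class FakeBus:
    """Stand-in for a 6-servo motor bus; the real LeRobot interface differs."""
    def __init__(self):
        self.positions = [0] * 6
    def read_positions(self) -> list[int]:
        return list(self.positions)
    def write_positions(self, targets: list[int]) -> None:
        self.positions = list(targets)

def teleop_step(leader: FakeBus, follower: FakeBus) -> None:
    """Mirror the leader's current joint positions onto the follower."""
    follower.write_positions(leader.read_positions())

def teleop_loop(leader: FakeBus, follower: FakeBus,
                hz: float = 50.0, steps: int = 500) -> None:
    for _ in range(steps):
        teleop_step(leader, follower)
        time.sleep(1.0 / hz)
```

The loop rate is a trade-off: too slow and the follower lags visibly; too fast and the serial bus saturates.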
Data Collection
Ongoing

Record demonstrations using record.py. Data is saved in LeRobot format and can be pushed directly to HuggingFace Hub for training.
Basic recording
python lerobot/scripts/record.py \
--robot.type=so101 \
--robot.port=/dev/ttyACM0 \
--dataset.repo_id=your-org/so101-dataset \
--dataset.task="pick cube"
With a USB camera
python lerobot/scripts/record.py \
--robot.type=so101 \
--robot.port=/dev/ttyACM0 \
--robot.cameras.top.type=opencv \
--robot.cameras.top.index=0 \
--dataset.repo_id=your-org/so101-dataset \
--dataset.task="pick cube"
Recording best practices
- Record at least 50 demonstrations per task before training
- Vary object positions and orientations across episodes
- Use descriptive --dataset.task names for later filtering
- OAK-D or Intel RealSense cameras work well for depth-enabled data collection
- Verify the dataset uploads to HuggingFace Hub after each session
Next steps
Once you have collected data, train an ACT or Diffusion Policy model using LeRobot's training scripts. Read the full SO-101 learning path for a structured progression from setup to model deployment.