Forge Implementation Guide
SLAM Selector Platforms

From Parts on Desk to Autonomous Hover

You've built the hardware. Now what? This guide walks you through every software step — from flashing an OS to your first FAST-LIO autonomous flight. No prior SLAM experience needed.

0 Jargon Buster
1 Flash OS
2 ROS Install
3 Drivers
4 SLAM Stack
5 FC Bridge
6 First Hover
What All These Words Mean
Before you touch a terminal, here's every term you'll encounter. No shame — this stuff is dense and nobody explains it.

Yocto

A build system for creating custom embedded Linux images. It's not an OS — it's the tool that makes an OS. The Orqa APB ships with a Yocto-built Linux. Think of it like a kitchen that bakes a custom Linux cake with only the ingredients your drone needs. You don't need to rebuild it unless you're adding kernel drivers.

BSP (Board Support Package)

The specific Linux config + drivers for your exact hardware. "Orqa Yocto BSP" = Linux that knows how to talk to the APB's i.MX8M, cameras, IMU, etc. Every board vendor provides one.

ROS / ROS2

Robot Operating System. Not really an OS — it's middleware that lets your programs talk to each other via "topics." Your LiDAR driver publishes point clouds on a topic, FAST-LIO subscribes to it. ROS1 (Noetic) is stable legacy. ROS2 (Humble/Iron) is the future.

catkin / colcon

Build tools. catkin_make builds ROS1 packages. colcon build builds ROS2 packages. They compile C++ SLAM code into runnable programs. That's it.

Docker

A container that packages your entire software stack (OS libs + ROS + SLAM) into one portable box. Avoids "works on my machine" problems. On Yocto/embedded boards, Docker lets you run full Ubuntu+ROS inside the minimal Yocto host OS.

MAVROS / mavlink-router

The bridge between your companion computer (running SLAM) and your flight controller (running ArduPilot/PX4). MAVROS is a ROS node. mavlink-router is a lightweight alternative that just forwards MAVLink packets.

FAST-LIO / LIO

LiDAR-Inertial Odometry. Takes LiDAR point clouds + IMU data and outputs "where am I?" 100+ times per second. FAST-LIO2 is the gold standard. It does NOT control the drone — it just provides position to the FC.

EKF / ESIKF

Extended Kalman Filter. The math that fuses noisy sensor data into a clean position estimate. Your FC has one (ArduPilot's EKF3). FAST-LIO has its own (ESIKF). The companion's ESIKF replaces the FC's GPS input.
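The predict/update loop behind both EKF3 and FAST-LIO's ESIKF can be seen in a toy 1-D version: fuse "motion since last step" with a noisy position fix. This is a sketch for intuition only — the real filters are nonlinear and estimate dozens of states at once.

```python
# Toy 1-D Kalman filter: fuse a motion prediction with a noisy measurement.
def kf_step(x, p, u, z, q=0.1, r=0.5):
    """One predict+update cycle.
    x, p : current state estimate and its variance
    u    : motion since last step (e.g. integrated IMU)
    z    : noisy position measurement (e.g. LiDAR-derived)
    q, r : process and measurement noise variances
    """
    # Predict: apply motion, uncertainty grows
    x_pred = x + u
    p_pred = p + q
    # Update: blend prediction with measurement, weighted by confidence
    k = p_pred / (p_pred + r)   # Kalman gain: 0 = trust prediction, 1 = trust measurement
    x_new = x_pred + k * (z - x_pred)
    p_new = (1 - k) * p_pred
    return x_new, p_new

x, p = 0.0, 1.0
for u, z in [(1.0, 1.2), (1.0, 1.9), (1.0, 3.1)]:
    x, p = kf_step(x, p, u, z)
print(x, p)  # fused position, with variance lower than either source alone
```

Note how the variance p shrinks after every update — that's the "fusing noisy data into a clean estimate" part.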

ikd-Tree

Incremental KD-Tree. FAST-LIO's secret weapon — a data structure that efficiently stores and queries 3D point cloud maps. You don't configure it, but it's why FAST-LIO runs so fast.
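For intuition, here's a toy static KD-tree nearest-neighbor search in Python — the core query behind scan-to-map matching. The real ikd-Tree adds incremental insert, delete, and re-balancing, which this sketch omits entirely.

```python
# Toy static KD-tree: build once, then answer "closest map point to q?" queries
# without scanning the whole cloud.
def dist2(a, b):
    return sum((x - y) ** 2 for x, y in zip(a, b))

def build(points, depth=0):
    if not points:
        return None
    axis = depth % 3                                  # cycle x, y, z
    points = sorted(points, key=lambda p: p[axis])
    mid = len(points) // 2
    return {"point": points[mid], "axis": axis,
            "left": build(points[:mid], depth + 1),
            "right": build(points[mid + 1:], depth + 1)}

def nearest(node, q, best=None):
    if node is None:
        return best
    p, axis = node["point"], node["axis"]
    if best is None or dist2(q, p) < dist2(q, best):
        best = p
    near, far = (node["left"], node["right"]) if q[axis] < p[axis] else (node["right"], node["left"])
    best = nearest(near, q, best)
    # Only descend the far side if the splitting plane could hide a closer point
    if (q[axis] - p[axis]) ** 2 < dist2(q, best):
        best = nearest(far, q, best)
    return best

cloud = [(0, 0, 0), (1, 0, 0), (0, 2, 0), (5, 5, 1), (1, 1, 1)]
tree = build(cloud)
print(nearest(tree, (0.9, 0.1, 0.2)))  # → (1, 0, 0)
```

The pruning step is the speed-up: most of the map is never visited for any single query.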

Launch file

A config file (.launch for ROS1, .launch.py for ROS2) that starts your SLAM stack with the right parameters. It specifies which LiDAR driver to use, which IMU topic to subscribe to, etc.

Topic / Publisher / Subscriber

ROS communication pattern. LiDAR driver publishes to /livox/lidar. FAST-LIO subscribes to it. FAST-LIO publishes pose to /Odometry. MAVROS subscribes and sends to FC. It's a pipeline.
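The pattern can be sketched as an in-process toy bus — this is not rospy, just the shape of publish/subscribe with made-up message dicts:

```python
# Toy publish/subscribe bus illustrating the ROS topic pattern.
class Bus:
    def __init__(self):
        self.subs = {}                              # topic name -> callbacks
    def subscribe(self, topic, callback):
        self.subs.setdefault(topic, []).append(callback)
    def publish(self, topic, msg):
        for cb in self.subs.get(topic, []):         # deliver to every subscriber
            cb(msg)

bus = Bus()
log = []

# "FAST-LIO": consumes scans, publishes odometry
bus.subscribe("/livox/lidar",
              lambda scan: bus.publish("/Odometry", {"pos": scan["centroid"]}))
# "MAVROS": consumes odometry, forwards to the FC
bus.subscribe("/Odometry", lambda odom: log.append(("to_fc", odom["pos"])))

# "LiDAR driver" publishes one scan; it flows down the whole pipeline
bus.publish("/livox/lidar", {"centroid": (1.0, 2.0, 0.5)})
print(log)  # [('to_fc', (1.0, 2.0, 0.5))]
```

Publishers never know who's listening — that decoupling is why you can swap LiDAR drivers without touching FAST-LIO.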

UART / Serial

The physical wire between your companion computer and flight controller. Usually a 3-wire connection (TX, RX, GND) running MAVLink at 921600 baud. On APB, it's internal (UART3 → FC).

The big picture: LiDAR → ROS Driver → FAST-LIO → MAVROS → FC → Motors. That's the entire data pipeline. Everything in this guide is about getting those links connected.
Flash Your Companion Computer
Get a working Linux environment on your companion computer. The approach differs by hardware.
Jetson: Use NVIDIA's JetPack SDK. It includes Ubuntu 20.04/22.04 + CUDA + cuDNN + TensorRT. This is the easiest path — JetPack is a full Ubuntu desktop with GPU acceleration out of the box.
# Download JetPack SDK Manager on your host PC (Ubuntu x86)
# https://developer.nvidia.com/sdk-manager
# Connect Jetson via USB, put in recovery mode (hold REC button + power)
sdkmanager
# Select: JetPack 5.x (for Orin) or 4.x (for TX2/Xavier)
# Flash → wait 15-20 min → reboot → you have full Ubuntu + CUDA

# After first boot, verify GPU:
nvidia-smi
# Should show GPU info (Orin won't show this — use jtop instead)
sudo pip3 install jetson-stats && jtop
Orqa APB: Ships with Yocto Linux BSP pre-flashed. Yocto is a minimal embedded Linux — it does NOT have apt-get, Ubuntu packages, or a desktop. You have two paths: (A) use Docker on top of Yocto, or (B) use the native Yocto SDK.
# Path A (RECOMMENDED): Run Ubuntu+ROS in Docker on top of Yocto
# This gives you apt-get, familiar Ubuntu, and easy ROS install

# 1. Connect to APB via USB-C (it appears as an ethernet device)
ssh root@192.168.75.1

# 2. Check Yocto version
cat /etc/os-release

# 3. Install Docker (if not pre-installed)
# Yocto may need Docker added via a Yocto layer — check Orqa SDK docs
# Or use the pre-built container from the Orqa Developer Program

# 4. Pull an ARM64 Ubuntu+ROS container
docker pull arm64v8/ros:noetic-ros-base
docker run -it --privileged --network host \
    -v /dev:/dev \
    -v /sdcard:/sdcard \
    arm64v8/ros:noetic-ros-base bash

# You're now inside Ubuntu 20.04 with ROS Noetic on the APB!
# The --privileged flag gives access to LiDAR USB/Ethernet + UART
# Path B: Native Yocto SDK (advanced — for production deployments)
# Cross-compile ROS packages on your x86 host, deploy to APB
# See: Orqa Developer Program → SDK documentation
# The Yocto BSP includes: GStreamer, V4L2, i2c-tools, CAN utils
# You'll need to add ROS via the meta-ros Yocto layer

# APB-specific: FC is on internal UART3
# Route UART3 to FC:
gpioset -c gpiochip3 -z 13=1
gpioset -c gpiochip3 -z 0=0
# Now /dev/ttymxc2 talks to the STM32H743 flight controller
Why Docker on Yocto? Yocto is optimized for boot speed and minimal footprint — great for production. But for development, you want apt-get, pip, easy ROS install. Docker gives you both: fast Yocto host + full Ubuntu userland. Most embedded drone companies ship this way.
Raspberry Pi: Use the standard Raspberry Pi Imager to flash Ubuntu 22.04 Server (64-bit). Desktop not needed — headless is fine.
# Flash Ubuntu 22.04 Server 64-bit via Raspberry Pi Imager
# Enable SSH in imager settings
# Boot, connect via SSH
ssh ubuntu@raspberrypi.local
VOXL 2: Ships with ModalAI's custom Linux (based on Yocto/Qualcomm Linux). Use voxl-configure-mpa for initial setup. PX4 runs natively on the DSP. Docker is supported for ROS workloads.
# Connect via USB-C or WiFi
adb shell
# Or SSH after WiFi setup:
ssh root@192.168.8.1
# Configure for PX4 + external LiDAR:
voxl-configure-mpa
Intel NUC / x86: Flash Ubuntu 22.04 Desktop or Server. This is the easiest platform — standard Ubuntu, no embedded quirks.
Qualcomm RB5: Use Qualcomm's Robotics Linux SDK. Docker recommended for ROS on top of the Qualcomm BSP.
Install ROS
FAST-LIO2 uses ROS1 Noetic. If you're using SUPER (the full nav stack), you need ROS2 Humble. Here's both.
Which ROS? For just SLAM odometry (FAST-LIO2, Point-LIO, S-FAST_LIO) → ROS1 Noetic. For full autonomous navigation (SUPER) → ROS2 Humble. For LIO-SAM → either works.
# ═══ ROS1 NOETIC (Ubuntu 20.04) ═══
sudo sh -c 'echo "deb http://packages.ros.org/ros/ubuntu focal main" > /etc/apt/sources.list.d/ros-latest.list'
sudo apt install curl
curl -s https://raw.githubusercontent.com/ros/rosdistro/master/ros.asc | sudo apt-key add -
sudo apt update
sudo apt install ros-noetic-ros-base   # Base only (no GUI — we're on a drone)

# Add to your shell:
echo "source /opt/ros/noetic/setup.bash" >> ~/.bashrc
source ~/.bashrc

# Install build tools:
sudo apt install python3-catkin-tools python3-rosdep
sudo rosdep init
rosdep update

# Create workspace:
mkdir -p ~/catkin_ws/src
cd ~/catkin_ws
catkin_make
source devel/setup.bash
# ═══ ROS2 HUMBLE (Ubuntu 22.04) ═══
sudo apt install software-properties-common
sudo add-apt-repository universe
sudo curl -sSL https://raw.githubusercontent.com/ros/rosdistro/master/ros.key -o /usr/share/keyrings/ros-archive-keyring.gpg
echo "deb [arch=$(dpkg --print-architecture) signed-by=/usr/share/keyrings/ros-archive-keyring.gpg] http://packages.ros.org/ros2/ubuntu $(. /etc/os-release && echo $UBUNTU_CODENAME) main" | sudo tee /etc/apt/sources.list.d/ros2.list > /dev/null
sudo apt update
sudo apt install ros-humble-ros-base

echo "source /opt/ros/humble/setup.bash" >> ~/.bashrc
source ~/.bashrc

# Create workspace:
mkdir -p ~/ros2_ws/src
cd ~/ros2_ws
colcon build
Docker shortcut: Skip all the above. docker pull ros:noetic-ros-base gives you a working ROS1 environment in seconds. Mount /dev for hardware access.
# ═══ DOCKER SHORTCUT (works on ANY platform including Yocto) ═══
docker run -it --privileged --network host \
    --name slam \
    -v /dev:/dev \
    -v ~/bags:/bags \
    ros:noetic-ros-base bash

# Inside container, install build essentials:
apt update && apt install -y git cmake build-essential libeigen3-dev libpcl-dev
Install LiDAR & IMU Drivers
Your SLAM stack needs ROS topics publishing point clouds and IMU data. Each LiDAR has its own driver.
# ═══ LIVOX ROS DRIVERS ═══
# Mid-360 / HAP → livox_ros_driver2; Avia / Horizon → the original livox_ros_driver
cd ~/catkin_ws/src
git clone https://github.com/Livox-SDK/livox_ros_driver2.git   # Mid-360 (has its own build script — see its README if catkin_make fails)
git clone https://github.com/Livox-SDK/livox_ros_driver.git    # Avia / Horizon
cd ..
catkin_make
source devel/setup.bash

# Connect Livox via Ethernet (static IP: 192.168.1.1xx)
# Set your PC/companion to 192.168.1.50

# Test — you should see point clouds in the terminal:
roslaunch livox_ros_driver2 msg_MID360.launch       # Mid-360
roslaunch livox_ros_driver livox_lidar_msg.launch   # Avia (publishes the CustomMsg FAST-LIO expects)

# Verify topics exist:
rostopic list | grep livox
# Should see: /livox/lidar and /livox/imu
# ═══ VELODYNE VLP-16 DRIVER ═══
sudo apt install ros-noetic-velodyne
# Connect via Ethernet (factory IP: 192.168.1.201)
roslaunch velodyne_pointcloud VLP16_points.launch
# Topics: /velodyne_points (PointCloud2)
# NOTE: VLP-16 has NO built-in IMU — you need an external one
# ═══ OUSTER OS1/OS0 DRIVER ═══
cd ~/catkin_ws/src
git clone --recurse-submodules https://github.com/ouster-lidar/ouster-ros.git
cd .. && catkin_make
# Connect via Ethernet
roslaunch ouster_ros driver.launch sensor_hostname:=os-SERIAL.local
# Topics: /ouster/points, /ouster/imu (built-in IMU at 100Hz)
# ═══ ROBOSENSE RS-HELIOS DRIVER ═══
cd ~/catkin_ws/src
git clone https://github.com/RoboSense-LiDAR/rslidar_sdk.git
cd rslidar_sdk && git submodule update --init
cd ../.. && catkin_make
# Edit config/config.yaml for RS-Helios-32
roslaunch rslidar_sdk start.launch
# Topics: /rslidar_points (PointCloud2)
# NOTE: No built-in IMU — use external (e.g., Xsens MTi)
External IMU: If your LiDAR has no built-in IMU (Velodyne, RoboSense), you need one. Connect via USB or UART. Common choices: Xsens MTi, Vectornav VN-100, or your FC's IMU via MAVLink. The IMU topic must publish at ≥200Hz for good SLAM.
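What "publishes at ≥200Hz" means concretely: the average gap between message timestamps must be ≤5ms. This is essentially what rostopic hz reports, sketched here with synthetic timestamps:

```python
# Sanity-check an IMU's publish rate from its message timestamps (seconds).
def mean_rate_hz(stamps):
    """Average rate over a list of timestamps; what rostopic hz approximates."""
    if len(stamps) < 2:
        return 0.0
    return (len(stamps) - 1) / (stamps[-1] - stamps[0])

# A 200 Hz IMU emits one message every 5 ms (synthetic 2-second trace)
stamps = [i * 0.005 for i in range(401)]
rate = mean_rate_hz(stamps)
print(round(rate))  # 200
assert rate >= 200, "IMU too slow for reliable LIO"
```

In practice, just run rostopic hz /imu/data on the live topic and check the reported rate.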
Build & Configure FAST-LIO2
This is the core. FAST-LIO2 takes your LiDAR + IMU data and outputs a 6-DoF pose (position + orientation) at 100Hz+.
# ═══ BUILD FAST-LIO2 ═══
cd ~/catkin_ws/src
git clone https://github.com/hku-mars/FAST_LIO.git
cd FAST_LIO
git submodule update --init   # pulls ikd-Tree
cd ../..
catkin_make
source devel/setup.bash

# Dependencies it needs (install if catkin_make fails):
sudo apt install libeigen3-dev libpcl-dev ros-noetic-pcl-ros
# ═══ CONFIGURE FOR LIVOX AVIA ═══
roslaunch fast_lio mapping_avia.launch

# Key config in config/avia.yaml:
#   lid_topic: "/livox/lidar"    ← must match your driver topic
#   imu_topic: "/livox/imu"      ← Avia has built-in IMU
#   extrinsic_est_en: true       ← auto-calibrate LiDAR↔IMU
#   filter_size_map: 0.5         ← map resolution (meters)

# Verify it's working — should see odometry output:
rostopic echo /Odometry
# ═══ CONFIGURE FOR LIVOX MID-360 ═══
roslaunch fast_lio mapping_mid360.launch
# 360° FoV = best for indoor GPS-denied flight

# Key config in config/mid360.yaml:
#   lid_topic: "/livox/lidar"
#   imu_topic: "/livox/imu"
# ═══ CONFIGURE FOR VELODYNE VLP-16 ═══
roslaunch fast_lio mapping_velodyne.launch

# config/velodyne.yaml:
#   lid_topic: "/velodyne_points"
#   imu_topic: "/imu/data"       ← YOUR external IMU topic
#   extrinsic_T: [x, y, z]       ← LiDAR→IMU translation (measure!)
#   extrinsic_R: [r11..r33]      ← LiDAR→IMU rotation
# ═══ CONFIGURE FOR OUSTER OS1 ═══
roslaunch fast_lio mapping_ouster64.launch

# config/ouster64.yaml:
#   lid_topic: "/ouster/points"
#   imu_topic: "/ouster/imu"     ← built-in at 100Hz
# Note: Ouster IMU is 100Hz (low). May struggle in fast maneuvers.
# ═══ CONFIGURE FOR ROBOSENSE (use S-FAST_LIO) ═══
# S-FAST_LIO has native RS-Helios support
cd ~/catkin_ws/src
git clone https://github.com/zlwang7/S-FAST_LIO.git
cd ../.. && catkin_make
roslaunch sfast_lio mapping_rs.launch
# Needs external IMU (e.g., Xsens MTi on /imu/data)
Test before flying! Record a rosbag by walking around with your LiDAR: rosbag record /livox/lidar /livox/imu -O test_walk.bag. Then replay it: rosbag play test_walk.bag while FAST-LIO runs. If the map looks good, you're ready for Phase 5.
Bridge SLAM → Flight Controller
FAST-LIO outputs position. Now you need to feed that to your FC so it can hold position without GPS.
# ═══ ARDUPILOT: MAVROS + VISION_POSITION_ESTIMATE ═══
# 1. Install MAVROS
sudo apt install ros-noetic-mavros ros-noetic-mavros-extras
sudo /opt/ros/noetic/lib/mavros/install_geographiclib_datasets.sh

# 2. Launch MAVROS (connect to FC via UART)
roslaunch mavros apm.launch fcu_url:=/dev/ttyUSB0:921600
# For APB (internal UART3 → FC):
roslaunch mavros apm.launch fcu_url:=/dev/ttymxc2:921600

# 3. Feed FAST-LIO odometry to MAVROS as a vision pose.
# NOTE: topic_tools relay canNOT change message types. FAST-LIO publishes
# nav_msgs/Odometry on /Odometry, while /mavros/vision_pose/pose expects
# geometry_msgs/PoseStamped (and pose_cov expects PoseWithCovarianceStamped).
# Write a small converter node (a few lines of rospy) that republishes the pose.

# 4. ArduPilot parameters (set via Mission Planner or mavproxy):
#   AHRS_EKF_TYPE = 3      (use EKF3)
#   EK3_SRC1_POSXY = 6     (ExternalNav)
#   EK3_SRC1_POSZ = 1      (Baro — or 6 for full external)
#   EK3_SRC1_VELXY = 6     (ExternalNav)
#   EK3_SRC1_YAW = 6       (ExternalNav)
#   VISO_TYPE = 1          (MAVLink)
#   SERIAL2_PROTOCOL = 2   (MAVLink2 on the UART to companion)
#   SERIAL2_BAUD = 921     (921600 baud)
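A minimal converter-node sketch for step 3, assuming ROS1 Noetic and the topic names used in this guide. The node name, script structure, and the class-passing trick in the helper are illustrative, not from any official package; MAVROS handles the frame conversion once the pose arrives on its topic.

```python
#!/usr/bin/env python3
# Republish FAST-LIO's nav_msgs/Odometry as geometry_msgs/PoseStamped,
# which /mavros/vision_pose/pose expects. (topic_tools relay can't do this
# because it never changes message types.)

def odom_to_pose_stamped(odom, pose_stamped_cls):
    """Copy header + pose out of an Odometry message.
    pose_stamped_cls is passed in so this helper stays ROS-import-free."""
    ps = pose_stamped_cls()
    ps.header = odom.header
    ps.pose = odom.pose.pose   # Odometry nests Pose inside PoseWithCovariance
    return ps

def main():
    import rospy
    from nav_msgs.msg import Odometry
    from geometry_msgs.msg import PoseStamped

    rospy.init_node("lio_to_mavros")
    pub = rospy.Publisher("/mavros/vision_pose/pose", PoseStamped, queue_size=10)
    rospy.Subscriber("/Odometry", Odometry,
                     lambda odom: pub.publish(odom_to_pose_stamped(odom, PoseStamped)))
    rospy.spin()

# On the drone: call main() (needs a running ROS master and rospy installed).
```

Drop this in a catkin package, make it executable, and launch it alongside MAVROS.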
# ═══ PX4: MAVROS + VISION_POSITION_ESTIMATE ═══
sudo apt install ros-noetic-mavros ros-noetic-mavros-extras
sudo /opt/ros/noetic/lib/mavros/install_geographiclib_datasets.sh
roslaunch mavros px4.launch fcu_url:=/dev/ttyUSB0:921600

# PX4 parameters (set via QGC):
#   EKF2_EV_CTRL = 15      (enable vision pos+vel+yaw)
#   EKF2_HGT_REF = 3       (vision for height)
#   MAV_COMP_ID = 197      (companion computer ID)
#   SER_TEL2_BAUD = 921600 (companion UART baud)
iNav / Betaflight: These firmwares do NOT natively support external position input (VISION_POSITION_ESTIMATE). You cannot directly feed SLAM odometry to them. Options: (1) Switch to ArduPilot/PX4 for GPS-denied flight, (2) Use the FC only for stabilization and control the drone via offboard commands from the companion computer, (3) Use iNav's GPS emulation (experimental, unreliable).
Critical: Coordinate frames MUST match. FAST-LIO outputs pose in its own body-aligned frame (FLU — Forward-Left-Up, the ROS convention). The FC's EKF works in NED (North-East-Down), while MAVROS's ROS-side topics use ENU (East-North-Up); MAVROS converts ENU→NED for you automatically as long as you publish to the right topic. Do NOT rotate frames by hand — let MAVROS do it.
First Autonomous Hover
Everything's wired up. Time to fly. Follow this checklist exactly.
SAFETY FIRST: Tether the drone or fly in a netted area for your first test. SLAM-based flight can fail suddenly if the LiDAR loses tracking (featureless walls, direct sunlight, excessive vibration). Always have a manual override ready.
Props OFF — verify FAST-LIO odometry is stable with rostopic echo /Odometry
Props OFF — verify MAVROS is connected: rostopic echo /mavros/state shows "connected: True"
Props OFF — verify vision position is reaching the FC: check EKF status in Mission Planner / QGC
Move the drone by hand — position in QGC/Mission Planner should update smoothly
Rotate the drone by hand — heading should update correctly (no 90° offset)
Tether attached or inside safety net
RC transmitter in hand, STABILIZE/MANUAL mode as fallback on a switch
Switch to LOITER (ArduPilot) or POSITION (PX4) mode — drone should hold
Gently throttle up — it should hover in place using SLAM position
If it hovers — congratulations. You've built a GPS-denied autonomous drone. From here: tune the EKF noise parameters, add obstacle avoidance (look at SUPER from HKU-MARS), or add waypoint missions via MAVROS.
# ═══ FULL LAUNCH SEQUENCE (one command per terminal) ═══
# Terminal 1: LiDAR driver (Avia — original livox_ros_driver)
roslaunch livox_ros_driver livox_lidar_msg.launch

# Terminal 2: FAST-LIO
roslaunch fast_lio mapping_avia.launch

# Terminal 3: MAVROS
roslaunch mavros apm.launch fcu_url:=/dev/ttyUSB0:921600

# Terminal 4: pose converter (FAST-LIO /Odometry → /mavros/vision_pose/pose)
# topic_tools relay won't work here — the message types differ
rosrun <your_pkg> lio_to_mavros.py   # placeholder package/script names

# Or wrap it all in one launch file (recommended for production)