Real-Time Vision. ROS-Native Power.
Two upstream ROS 2 packages from DronaAviation that turn the Pluto drone into a programmable robotics platform — pluto_cam_ros2 for real-time H.264 video, and pluto_ros2_package for flight control over MSP.
Two upstream DronaAviation ROS 2 packages that expose everything the Pluto drone can do — camera frames in, flight commands out, all as standard ROS 2 messages. No wrappers, no lock-in, no proprietary middleware.
$ ros2 topic list
/parameter_events
/plutocamera/image_raw
/rosout
$ ros2 topic hz /plutocamera/image_raw
average rate: 29.97
min: 0.032s max: 0.036s
std dev: 0.00112s
$ ros2 topic info /plutocamera/image_raw
Type: sensor_msgs/msg/Image
Publisher count: 1
Subscription count: 0

Tested on Ubuntu 22.04 LTS + ROS 2 Humble for pluto_cam_ros2. The upstream pluto_ros2_package was originally authored for ROS 2 Foxy (Ubuntu 20.04) — it builds under Humble as well, but you may need minor message-package tweaks. Windows and macOS users run the same commands inside a lightweight Ubuntu layer.
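The numbers `ros2 topic hz` prints are just statistics over frame inter-arrival times. A quick sketch of that math (the interval values below are illustrative, not measured from the drone):

```python
# What `ros2 topic hz` reports: statistics over inter-arrival intervals.
# These sample intervals are illustrative only, not real measurements.
import statistics

intervals = [0.033, 0.034, 0.032, 0.033, 0.035]  # seconds between frames

avg_rate = len(intervals) / sum(intervals)  # frames per second
print(f"average rate: {avg_rate:.2f}")
print(f"min: {min(intervals):.3f}s max: {max(intervals):.3f}s")
print(f"std dev: {statistics.pstdev(intervals):.5f}s")
```

A ~30 fps stream shows intervals clustered around 0.033 s, which is why the standard deviation in the transcript is so small.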
Already on Ubuntu 22.04? You're done. Skip straight to Step 1.
Best performance: install Ubuntu 22.04 via WSL 2. Open PowerShell as admin and run:
wsl --install -d Ubuntu-22.04
GUI apps work via WSLg.
On macOS, run Ubuntu 22.04 via Multipass (free, Canonical-maintained):
brew install --cask multipass
multipass launch 22.04 --name pluto --cpus 4 --memory 4G
multipass shell pluto

Set a UTF-8 locale and enable the universe repository — ROS 2 requires both.
sudo apt update
sudo apt install -y locales software-properties-common curl
sudo locale-gen en_US en_US.UTF-8
sudo update-locale LC_ALL=en_US.UTF-8 LANG=en_US.UTF-8
sudo add-apt-repository universe -y

Use the official ros2-apt-source deb package — the canonical install method per the ROS 2 Humble docs.
# Install the ROS 2 apt source deb (manages key + sources.list)
ROS_APT_SOURCE_VERSION=$(curl -s \
https://api.github.com/repos/ros-infrastructure/ros-apt-source/releases/latest \
| grep -F "tag_name" | awk -F'"' '{print $4}')
CODENAME=$(. /etc/os-release && echo $UBUNTU_CODENAME)
curl -L -o /tmp/ros2-apt-source.deb \
"https://github.com/ros-infrastructure/ros-apt-source/releases/download/${ROS_APT_SOURCE_VERSION}/ros2-apt-source_${ROS_APT_SOURCE_VERSION}.${CODENAME}_all.deb"
sudo dpkg -i /tmp/ros2-apt-source.deb
# Install ROS 2 Humble Desktop + dev tools (colcon, rosdep, etc.)
sudo apt update
sudo apt install -y ros-humble-desktop ros-dev-tools
# Source ROS 2 in every new shell
echo "source /opt/ros/humble/setup.bash" >> ~/.bashrc
source /opt/ros/humble/setup.bash

FFmpeg (CLI), OpenCV, and cv_bridge — the pieces pluto_cam_ros2 needs to decode H.264 and convert frames.
sudo apt install -y \
ffmpeg python3-opencv python3-numpy python3-pip \
ros-humble-cv-bridge \
ros-humble-image-transport

Create a standard ROS 2 colcon workspace and pull both DronaAviation repos (default main branch).
mkdir -p ~/pluto_ws/src && cd ~/pluto_ws/src
git clone https://github.com/DronaAviation/pluto_ros2_package.git
git clone https://github.com/DronaAviation/pluto_cam_ros2.git

One colcon build, then source the overlay. Repeat the source step in every new terminal (or add it to ~/.bashrc).
cd ~/pluto_ws
colcon build --symlink-install
source install/setup.bash

Connect to the drone's Wi-Fi (Pluto… or WIFI-1080p-…), then launch the camera publisher and view the stream.
# Terminal 1 — publish camera frames
ros2 run pluto_camera_sense plutocam_publisher --ip 192.168.0.1
# Terminal 2 — view the frames in an OpenCV window
ros2 run pluto_image_sub image_subscriber

On Windows, use usbipd-win if you need to bind a USB gamepad into Ubuntu. Wi-Fi access to the drone works out of the box.

Any ROS 2 node can subscribe to /plutocamera/image_raw, convert the message to an OpenCV frame with cv_bridge, and run your CV pipeline. That's the whole recipe.
ros2 run pluto_image_sub image_subscriber
ros2 bag record /plutocamera/image_raw
ros2 topic hz /plutocamera/image_raw

import rclpy
from rclpy.node import Node
from sensor_msgs.msg import Image
from cv_bridge import CvBridge
import cv2

class PlutoViewer(Node):
    def __init__(self):
        super().__init__('pluto_viewer')
        self.bridge = CvBridge()
        self.create_subscription(
            Image, '/plutocamera/image_raw',
            self.on_frame, 10)

    def on_frame(self, msg):
        frame = self.bridge.imgmsg_to_cv2(msg, 'bgr8')
        # 👉 your OpenCV / ML code here
        cv2.imshow('PlutoCam', frame)
        cv2.waitKey(1)

rclpy.init()
rclpy.spin(PlutoViewer())

Clone the packages, build the workspace, and see your first camera frame in under 10 minutes.
The main branch exposes a small, focused API. Standard ROS 2 message types mean tools like rviz, rqt, and rosbag2 work out of the box.
| Topic | Type | Direction | Notes |
|---|---|---|---|
| /plutocamera/image_raw | sensor_msgs/Image | pub | BGR frames, H.264 decoded via FFmpeg |
| /drone_command | custom_msgs/PlutoMsg | sub | RC channels + AUX (1000–2000 range) |
int16 rc_roll # 1000–2000 (1500 = neutral)
int16 rc_pitch
int16 rc_yaw
int16 rc_throttle
int16 rc_aux1
int16 rc_aux2
int16 rc_aux3
int16 rc_aux4 # 1500 = armed, 1200 = disarmed
int16 command_type # 0 = none, 1 = takeoff, 2 = land

Data flows from the drone's hardware into your ROS 2 workspace, where any node can subscribe, process, or record.
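The RC-channel convention in the message definition can be sketched without any ROS dependencies. This is a hedged illustration only: `clamp_rc` and `arm_command` are hypothetical helper names (not DronaAviation code), and the low-throttle-while-arming choice is an assumption; in a real node you would fill a `custom_msgs/msg/PlutoMsg` with these values and publish it to `/drone_command`.

```python
# Hypothetical helpers illustrating the PlutoMsg RC-channel convention:
# every channel lives in 1000-2000, with 1500 as the neutral stick position.
LOW, NEUTRAL, HIGH = 1000, 1500, 2000

def clamp_rc(value: int) -> int:
    """Clamp an RC channel value into the valid 1000-2000 range."""
    return max(LOW, min(HIGH, value))

def arm_command() -> dict:
    """Field values for an 'armed, sticks neutral' command, as a plain dict.

    In a real node these would populate custom_msgs/msg/PlutoMsg
    before publishing to /drone_command.
    """
    return {
        'rc_roll': NEUTRAL, 'rc_pitch': NEUTRAL, 'rc_yaw': NEUTRAL,
        'rc_throttle': LOW,   # assumption: keep throttle low while arming
        'rc_aux1': NEUTRAL, 'rc_aux2': NEUTRAL, 'rc_aux3': NEUTRAL,
        'rc_aux4': 1500,      # 1500 = armed per the definition above
        'command_type': 0,    # 0 = none
    }

print(clamp_rc(2400), arm_command()['rc_aux4'])  # → 2000 1500
```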
A gentle, 5-minute tour of the jargon, the workflow, and the first experiments you should try — written for people who are brand new to ROS and/or drones.
You run many small programs (called nodes) that talk to each other by sending messages on named channels called topics.
In this project, pluto_camera_sense is a node that publishes camera frames to the topic /plutocamera/image_raw. Any other node that subscribes to that topic will receive every frame — no coupling, no sockets, no glue code.
The mental model:
- Topics are named channels (e.g. /plutocamera/image_raw).
- Every message on a topic has a fixed type (e.g. sensor_msgs/Image).
- You build and install your own nodes with colcon.

   Publisher                   Subscriber(s)
┌────────────┐              ┌──────────────────┐
│ camera_    │   /pluto…    │ your OpenCV node │
│ sense node │────image────▶│ (runs a filter)  │
└────────────┘              └──────────────────┘
      │                     ┌──────────────────┐
      └────────────────────▶│ rviz / rqt       │
                            │ (just to watch)  │
                            └──────────────────┘
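The decoupling in the diagram can be demonstrated in a few lines of plain Python. This toy `TopicBus` is purely illustrative (real ROS 2 adds discovery, QoS, and DDS serialization on top), but it captures the core idea: the publisher only names the channel and never knows who is listening.

```python
# A toy in-process topic bus: the pub/sub mental model without ROS.
from collections import defaultdict

class TopicBus:
    def __init__(self):
        self._subs = defaultdict(list)  # topic name -> list of callbacks

    def subscribe(self, topic, callback):
        self._subs[topic].append(callback)

    def publish(self, topic, msg):
        # The publisher never sees its subscribers; the bus fans out.
        for callback in self._subs[topic]:
            callback(msg)

bus = TopicBus()
frames = []
bus.subscribe('/plutocamera/image_raw', frames.append)    # "your OpenCV node"
bus.subscribe('/plutocamera/image_raw', lambda m: None)   # "rviz, just watching"
bus.publish('/plutocamera/image_raw', 'frame-0')
print(frames)  # → ['frame-0']
```

Adding a third subscriber, like `ros2 bag record`, changes nothing for the publisher; that is the property ROS 2 gives you for free.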