# Cameras

Adamo supports USB cameras (V4L2), custom GStreamer pipelines, and ROS camera topics. Each camera is a `video_track` entry in your config file. You can run multiple tracks simultaneously.

## V4L2

Use `source_type: "v4l2"` for USB webcams, Intel RealSense cameras, and any other V4L2-compatible device. RealSense cameras generally work best with V4L2.

```yaml
video_tracks:
  - name: "main"
    source_type: "v4l2"
    v4l2_device: "/dev/video0"
    source_format: "yuy2"
    encoder: "auto"
    bitrate: 4000
    fps: 30
```

Most USB webcams and RealSense cameras output YUY2. See Source Format below for why this matters.

To find available devices:

```sh
ls /dev/video*
# or for more detail:
v4l2-ctl --list-devices
```

Change `v4l2_device` to match the device path for your camera (e.g. `/dev/video2`).

To add a second camera, append another entry to the list:

```yaml
video_tracks:
  - name: "front"
    source_type: "v4l2"
    v4l2_device: "/dev/video0"
    source_format: "yuy2"
    encoder: "auto"
    bitrate: 4000
    fps: 30
  - name: "rear"
    source_type: "v4l2"
    v4l2_device: "/dev/video2"
    source_format: "yuy2"
    encoder: "auto"
    bitrate: 2000
    fps: 30
```

## GStreamer

Use `source_type: "gstreamer"` for cameras with their own GStreamer plugins, or when you need a custom capture pipeline. Provide a `gstreamer_pipeline` string that produces raw video frames; Adamo handles the H.264 encoding.

Important: Use the full `capsfilter caps=...` syntax, not GStreamer's shorthand `! caps` syntax. The shorthand does not work.

```yaml
# Correct: use capsfilter explicitly
video_tracks:
  - name: "main"
    source_type: "gstreamer"
    gstreamer_pipeline: "videotestsrc ! capsfilter caps=video/x-raw,framerate=30/1"
    # Wrong: "videotestsrc ! video/x-raw,framerate=30/1" (shorthand caps, not parsed)
    encoder: "auto"
    bitrate: 4000
    fps: 30
```

### ZED cameras

ZED cameras work best with the ZED GStreamer plugin (`zedsrc`). Install that plugin first.

```yaml
video_tracks:
  - name: "main"
    source_type: "gstreamer"
    gstreamer_pipeline: "zedsrc stream-type=2 camera-resolution=3 camera-fps=60"
    encoder: "nvv4l2h264enc"
    bitrate: 8000
    fps: 60
    source_format: "BGRA"
    keyframe_distance: -1.0
    stereo: true
```

Set `stereo: true` for top/bottom stereo output. `source_format: "BGRA"` tells the encoder the pixel format coming out of `zedsrc`. `keyframe_distance: -1.0` disables periodic keyframes; the viewer requests them on demand, which is better for latency.

## ROS topics

Use the ROS source types for cameras that publish to ROS 2 topics. Enable the ROS bridge and point the track at a topic.

```yaml
ros:
  enabled: true
  auto_start_bridge: true

video_tracks:
  - name: "ros-cam"
    source_type: "ros_auto"
    ros_topic: "/camera/image_raw"
    encoder: "auto"
    bitrate: 3000
    fps: 30
```

`source_type: "ros_auto"` auto-detects the message type from the topic name; it handles `sensor_msgs/Image`, `sensor_msgs/CompressedImage`, and FFMPEGPacket/H.264 topics. `ros.enabled: true` is required.

You can also specify the source type explicitly if auto-detection doesn’t pick the right one:

| Source type | Message type | Description |
| --- | --- | --- |
| `ros_auto` | auto-detect | Infers from topic name |
| `ros_image` | `sensor_msgs/Image` | Raw pixels, re-encoded to H.264 |
| `ros_compressed_image` | `sensor_msgs/CompressedImage` | JPEG/PNG, decoded then re-encoded |
| `ros_h264` | H.264 stream | Passthrough, no re-encoding |
| `ros_h264_transcode` | H.264 stream | Decode and re-encode (to change bitrate/keyframes) |
| `ros_ffmpeg_packet` | FFMPEGPacket | Passthrough, no re-encoding |
| `ros_ffmpeg_packet_transcode` | FFMPEGPacket | Decode and re-encode |
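For instance, if auto-detection mistakes a compressed topic for a raw one, you can pin the type explicitly. A sketch, where the topic name and track name are assumptions to replace with your driver's actual output:

```yaml
ros:
  enabled: true
  auto_start_bridge: true

video_tracks:
  - name: "compressed-cam"                     # hypothetical track name
    source_type: "ros_compressed_image"        # pinned instead of ros_auto
    ros_topic: "/camera/image_raw/compressed"  # assumed topic; match your driver
    encoder: "auto"
    bitrate: 3000
    fps: 30
```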

### OAK cameras

OAK cameras work best over ROS, using the depthai-ros driver, which publishes H.264-compressed video. Use `ros_h264_transcode` to re-encode the stream at your preferred bitrate:

```yaml
ros:
  enabled: true
  auto_start_bridge: true

video_tracks:
  - name: "oak"
    source_type: "ros_h264_transcode"
    ros_topic: "/oak/rgb/h264"
    encoder: "auto"
    bitrate: 4000
    fps: 30
```

## Source format

`source_format` tells Adamo what pixel format your camera outputs. When it is set, the encoder can skip CPU-based colorspace conversion and use the GPU directly, saving roughly 15 ms of latency per frame. Without it, Adamo falls back to a slow CPU conversion path.

Currently this cannot be auto-detected — you need to specify it in your config.

| Format | Cameras | Notes |
| --- | --- | --- |
| `yuy2` | Most USB webcams, Intel RealSense | The most common V4L2 format |
| `bgra` | ZED cameras | The ZED GStreamer plugin outputs BGRA |
| `gray8` | Mono/IR cameras, RealSense IR stream | Grayscale, requires CPU conversion |
| `rgba` | Some ROS image sources | Direct GPU path |
| `i420` | Some GStreamer pipelines | Direct GPU path |
| `nv12` | Hardware decoders | Direct GPU path, most efficient |
| `uyvy` | Some industrial cameras | Direct GPU path |
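As an example of a non-default format, a track for a RealSense IR stream might look like the sketch below. The device path and track name are assumptions; check `v4l2-ctl --list-devices` on your machine, since IR streams enumerate as separate video devices:

```yaml
video_tracks:
  - name: "ir"                   # hypothetical track name
    source_type: "v4l2"
    v4l2_device: "/dev/video2"   # assumed path; verify with v4l2-ctl
    source_format: "gray8"       # grayscale, so expect the CPU conversion path
    encoder: "auto"
    bitrate: 2000
    fps: 30
```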

To check what format your V4L2 camera supports:

```sh
v4l2-ctl -d /dev/video0 --list-formats
```

## Encoder

| Encoder | Hardware | Notes |
| --- | --- | --- |
| `auto` | Auto-detect | Picks the best available encoder at startup |
| `nvv4l2h264enc` | Jetson (HW) | Best on Jetson, uses the dedicated NVENC block |
| `nvh264enc` | NVIDIA desktop GPU | For x86 machines with NVIDIA GPUs (uses nvcodec) |
| `x264enc` | CPU | Software fallback, works on any Linux system |

Use `auto` unless you have a reason to pin a specific encoder.
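One such reason: on a machine where the NVIDIA encoder is detected but fails at runtime, you can pin the software encoder. A sketch:

```yaml
video_tracks:
  - name: "main"
    source_type: "v4l2"
    v4l2_device: "/dev/video0"
    source_format: "yuy2"
    encoder: "x264enc"   # force the CPU software encoder instead of auto
    bitrate: 2000        # software encoding is slower; a lower bitrate helps
    fps: 30
```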

## Bitrate and fps

Two knobs: `bitrate` and `fps`.

- `bitrate`: in kbps. Higher means better image quality and more bandwidth. Start at 2000-4000 kbps.
- `fps`: frames per second. Use 60 if your camera supports it, otherwise 30. Drop to 15 on constrained links.

If video looks blocky or laggy, raise the bitrate before anything else. If bandwidth is the bottleneck, lower fps first.
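Putting the two knobs together, a track tuned for a constrained link might look like this sketch (the exact numbers depend on your link):

```yaml
video_tracks:
  - name: "main"
    source_type: "v4l2"
    v4l2_device: "/dev/video0"
    source_format: "yuy2"
    encoder: "auto"
    bitrate: 1500   # kbps; below the suggested 2000-4000 starting range
    fps: 15         # drop frame rate first when bandwidth is the bottleneck
```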