How to stream camera video to external devices.
The udpsink
element sends UDP packets to a specific host (the target) or to a multicast address.
The host and port are specified as parameters to the launch.sh command or to the executable itself.
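For reference, a comparable send pipeline can be reproduced with gst-launch-1.0. This is only a sketch: videotestsrc stands in for the actual camera source, and the encoder settings are assumptions chosen to match the viewer examples below (payload 127, port 1234).
gst-launch-1.0 videotestsrc ! x264enc tune=zerolatency ! rtph264pay pt=127 config-interval=1 ! udpsink host=10.13.0.106 port=1234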
Building the streamer
The GStreamerCameraAppSource is built with
jetson-scripts/build-gstreamer-camera-appsource
Running the app
Start the stream after you have set up the viewer.
The launch.sh script is useful because it adds the runtime library path to the environment before execution:
export LD_LIBRARY_PATH=$LD_LIBRARY_PATH:/home/ubuntu/gstreamer-build/out/lib/
./launch.sh -udp 1234 10.13.0.106 100
- 1234 is the port the target machine is listening on
- 10.13.0.106 is the IP address of the viewer
- 100 is the number of frames to capture (use 0 for an infinite stream; the default is 60)
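Since udpsink also accepts multicast addresses, the same command should work with a multicast group in place of the viewer's unicast address (a sketch; the group address is an arbitrary example, and 0 requests an infinite stream):
./launch.sh -udp 1234 239.255.12.42 0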
Viewing the stream
The simplest way to view the stream on your target device is to use gst-launch-1.0. The following pipeline unwraps the stream into an autovideosink:
gst-launch-1.0 udpsrc port=1234 ! "application/x-rtp, payload=127" ! rtph264depay ! decodebin ! autovideosink sync=false
This is essentially the reverse of the packaging performed before udpsink.
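If the depayloader fails to negotiate, it can help to spell out the RTP caps in full (a sketch; clock-rate and encoding-name are assumptions matching a standard H.264 payload):
gst-launch-1.0 udpsrc port=1234 ! "application/x-rtp, media=video, clock-rate=90000, encoding-name=H264, payload=127" ! rtph264depay ! decodebin ! autovideosink sync=false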
Another option is to use glimagesink:
gst-launch-1.0 udpsrc port=1234 ! "application/x-rtp, payload=127" ! rtph264depay ! decodebin ! glimagesink
On my machine, however, this reports dropped buffers.
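Since sync=false helped with autovideosink above, the same property can be tried on glimagesink (untested here; sync is a property every GStreamer sink inherits):
gst-launch-1.0 udpsrc port=1234 ! "application/x-rtp, payload=127" ! rtph264depay ! decodebin ! glimagesink sync=false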
To consume the stream with ffmpeg:
ffmpeg -protocol_whitelist rtp,file,http,https,tcp,udp,tls -i sample_sdp_file.sdp -f opengl "3DLab room"
This is the method used by the output clients (with a custom output).
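ffmpeg needs the SDP file to describe the incoming RTP stream. A minimal sketch of what sample_sdp_file.sdp could contain, assuming the H.264 stream on payload 127 and port 1234 used above:
v=0
o=- 0 0 IN IP4 127.0.0.1
s=Camera stream
c=IN IP4 0.0.0.0
t=0 0
m=video 1234 RTP/AVP 127
a=rtpmap:127 H264/90000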