Hardware video encoding on the Logitech C920 camera and streaming it to ROS over Wi-Fi with less than 0.2 seconds of delay


In this guide, we will stream video from a Logitech C920 camera attached to a BeagleBone Blue to a laptop over Wi-Fi, feed it into the ROS gscam node, and then detect and recognize images of tarot cards and ketchup bottles along the path of the EduMIP robot.

This is a continuation of my series of articles; in the previous one we stopped at sending the video to a laptop via gstreamer.

ROS has a package called gscam that can receive video from gstreamer; its documentation is on the ROS wiki and its source code is on GitHub.

We need the latest version of gscam with gstreamer-1.0 support, so we will build it from the latest source code.

 cd catkin_ws/src
 git clone https://github.com/ros-drivers/gscam
 cd ..
 catkin_make -DGSTREAMER_VERSION_1_x=On
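
If the build succeeds, source the workspace so that ROS can see the freshly built package (a quick check, assuming the default ~/catkin_ws layout):

 source ~/catkin_ws/devel/setup.bash
 rospack find gscam    # should print the path of the gscam package we just built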

Now we need to create a launch file and put into it the gstreamer pipeline that receives the RTP stream, decodes the H.264 video and converts the colorspace for gscam.

 -v udpsrc port=6666 ! application/x-rtp, encoding-name=H264 ! rtph264depay ! h264parse ! avdec_h264 ! videoconvert ! ffmpegcolorspace 
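
Before wiring this into ROS, the receiving side can be sanity-checked with a plain gst-launch-1.0 command (my own sketch based on the pipeline above, with autovideosink added for display; note that on gstreamer 1.0 the videoconvert element does the colorspace conversion that ffmpegcolorspace did on gstreamer 0.10):

 gst-launch-1.0 -v udpsrc port=6666 ! application/x-rtp, encoding-name=H264 ! rtph264depay ! h264parse ! avdec_h264 ! videoconvert ! autovideosink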

The resulting file ~/catkin_ws/src/gscam/examples/streamc920.launch will look like this:

 <launch>
   <!-- Set this to your camera's name -->
   <arg name="cam_name" value="creative_cam" />

   <!-- Start the GSCAM node -->
   <env name="GSCAM_CONFIG" value="-v udpsrc port=6666 ! application/x-rtp, encoding-name=H264 ! rtph264depay ! h264parse ! avdec_h264 ! videoconvert ! ffmpegcolorspace" />
   <node pkg="gscam" type="gscam" name="$(arg cam_name)">
     <param name="camera_name" value="$(arg cam_name)" />
     <param name="camera_info_url" value="package://localcam/calibrations/${NAME}.yaml" />
     <remap from="camera/image_raw" to="$(arg cam_name)/image_raw" />
   </node>

   <!-- Provide rectification -->
   <node pkg="image_proc" type="image_proc" name="creative_image_proc" ns="$(arg cam_name)" />

   <!-- View the raw and rectified output -->
   <node pkg="image_view" type="image_view" name="creative_view" >
     <remap from="image" to="/$(arg cam_name)/image_raw" />
   </node>
 </launch>
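
Note that camera_info_url is carried over unchanged from the stock gscam example and points to a localcam package that probably does not exist in your workspace; in that case gscam should still publish images and only warn about the missing calibration. A quick way to check (my assumption, not part of the original setup):

 rospack find localcam    # if this fails, point camera_info_url to your own calibration YAML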

Now if we run it:

 roslaunch gscam streamc920.launch 

A window with the video stream from the camera will appear, and in ROS this video is now available as the topic /creative_cam/image_raw.
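
To confirm that frames are actually arriving on this topic, the standard ROS tools can be used (an optional check; the topic name comes from the launch file above):

 rostopic hz /creative_cam/image_raw    # prints the frame rate of the incoming images
 rostopic echo -n 1 /creative_cam/image_raw/header    # shows the timestamp of a single frame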

As in one of the previous lessons, we will start the detection and recognition of the card images, changing only the topic:

 rosrun find_object_2d find_object_2d image:=/creative_cam/image_raw 
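
Besides the GUI, find_object_2d also publishes its detections as messages, so the recognized cards can be read programmatically (a hedged example; as far as I know the default output topic is /objects, a Float32MultiArray containing the object id, its size and its homography):

 rostopic echo /objects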

The result of testing the EduMIP robot on an improvised track can be seen in the video at the beginning of the article. Despite the shaking caused by the robot's balancing, the images are still recognized; on a three-wheeled or four-wheeled robot I think the results would be much better.

I also connected the laptop to the Wi-Fi router with a twisted-pair Ethernet cable and lowered the bitrate to 1 Mbps with a keyframe once per second, which reduced the video transmission delay to 0.2 seconds.

 gst-launch-1.0 uvch264src initial-bitrate=1000000 average-bitrate=1000000 iframe-period=1000 name=src auto-start=true src.vidsrc ! video/x-h264,width=160,height=120,framerate=30/1 ! h264parse ! rtph264pay ! udpsink host=192.168.1.196 port=6666 



For those who want to see the robot live: on July 7 I will be presenting the EduMIP project at the DIYorDIE Meetup in Moscow.

Source: https://habr.com/ru/post/415735/

