
Adaptive Video Configuration and Bitrate Allocation for Teleoperated Vehicles

arXiv:2102.10898
Published Feb 22, 2021 in eess.IV and cs.MM

Abstract

Vehicles with autonomous driving capabilities are present on public streets. However, edge cases remain that still require a human in-vehicle driver. Assuming the vehicle manages to come to a safe state in an automated fashion, teleoperated driving technology enables a human to resolve the situation remotely through a control interface connected via a mobile network. While this is a promising solution, it also introduces technical challenges, one of them being the necessity to transmit video data from the vehicle's multiple cameras to the human operator. In this paper, an adaptive video streaming framework specifically designed for teleoperated vehicles is proposed and demonstrated. The framework enables automatic reconfiguration of the multi-camera system's video streams at runtime, taking predictions of variable transmission service quality into account. With the objective of improving visual quality, the framework uses so-called rate-quality models to dynamically allocate bitrates and select resolution scaling factors. Results from deploying the proposed framework on an actual teleoperated driving system are presented.
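The abstract describes rate-quality models being used to split a predicted transmission budget across camera streams and to choose resolution scaling factors. Since only the abstract is available here, the following is a minimal sketch of how such an allocator could look, not the authors' implementation: the logarithmic rate-quality fit, the placeholder coefficients, the greedy increment strategy, and all function names are assumptions made purely for illustration.

```python
"""Illustrative sketch (not the paper's code): allocate a predicted total
bitrate across multiple camera streams using simple rate-quality models."""

import math
from dataclasses import dataclass


@dataclass
class RateQualityModel:
    """Assumed model: quality (0-100) as a logarithmic function of bitrate."""
    a: float      # slope of the logarithmic fit (placeholder value)
    b: float      # offset of the logarithmic fit (placeholder value)
    scale: float  # resolution scaling factor this model was fitted for

    def quality(self, bitrate_kbps: float) -> float:
        if bitrate_kbps <= 0:
            return 0.0
        return self.a * math.log(bitrate_kbps) + self.b


def allocate(models_per_camera, total_kbps, step_kbps=100.0):
    """Greedily hand out bitrate increments to whichever camera gains the
    most predicted quality, then report the best scaling factor per camera.

    models_per_camera: one list of RateQualityModel per camera, one model
    per candidate resolution scaling factor.
    Returns a list of (chosen_model, allocated_bitrate_kbps) per camera.
    """
    # Start every camera at one minimum increment.
    alloc = [step_kbps] * len(models_per_camera)
    budget = total_kbps - sum(alloc)

    def best_for(cam_idx, bitrate):
        # For a given bitrate, pick the scaling factor whose model
        # predicts the highest quality.
        return max(models_per_camera[cam_idx], key=lambda m: m.quality(bitrate))

    while budget >= step_kbps:
        # Marginal quality gain of one more increment for each camera.
        gains = []
        for i in range(len(alloc)):
            now = best_for(i, alloc[i]).quality(alloc[i])
            then = best_for(i, alloc[i] + step_kbps).quality(alloc[i] + step_kbps)
            gains.append(then - now)
        winner = max(range(len(gains)), key=lambda i: gains[i])
        alloc[winner] += step_kbps
        budget -= step_kbps

    return [(best_for(i, alloc[i]), alloc[i]) for i in range(len(alloc))]


if __name__ == "__main__":
    # Two cameras, each with models for full and half resolution (made-up fits).
    cams = [
        [RateQualityModel(a=9.0, b=10.0, scale=1.0),
         RateQualityModel(a=7.5, b=22.0, scale=0.5)],
        [RateQualityModel(a=8.0, b=12.0, scale=1.0),
         RateQualityModel(a=6.5, b=25.0, scale=0.5)],
    ]
    predicted_capacity_kbps = 6000.0  # e.g., from a transmission-quality prediction
    for idx, (model, rate) in enumerate(allocate(cams, predicted_capacity_kbps)):
        print(f"camera {idx}: scale={model.scale}, bitrate={rate:.0f} kbit/s, "
              f"predicted quality={model.quality(rate):.1f}")
```

The greedy increment rule is only one of several reasonable ways to realize the idea; the paper itself may solve the allocation differently, for example via an explicit optimization over the fitted rate-quality curves.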
