CineMPC: Controlling Camera Intrinsics and Extrinsics for Autonomous Cinematography

(arXiv:2104.03634)
Published Apr 8, 2021 in cs.RO, cs.SY, and eess.SY

Abstract

We present CineMPC, an algorithm to autonomously control a UAV-borne video camera in a nonlinear Model Predictive Control (MPC) loop. CineMPC controls both the position and orientation of the camera -- the camera extrinsics -- as well as the lens focal length, focal distance, and aperture -- the camera intrinsics. While some existing solutions autonomously control the position and orientation of the camera, no existing solutions also control the intrinsic parameters, which are essential tools for rich cinematographic expression. The intrinsic parameters control the parts of the scene that are focused or blurred, the viewers' perception of depth in the scene, and the position of the targets in the image. CineMPC closes the loop from camera images to UAV trajectory and lens parameters in order to follow the desired relative trajectory and image composition as the targets move through the scene. Experiments using a photo-realistic environment demonstrate the capabilities of the proposed control framework to successfully achieve a full array of cinematographic effects not possible without full camera control.
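To make the idea of jointly optimizing extrinsics and intrinsics concrete, the sketch below shows one way such an MPC cost could be assembled: a velocity-driven camera position model, a pinhole projection for image composition, and thin-lens depth-of-field limits driven by focal length, focus distance, and aperture. This is not the authors' formulation; every function, weight, and constant here (the horizon length, `CIRCLE_OF_CONFUSION`, the `depth_of_field` helper, the cost weights) is an assumption for illustration, and camera orientation is omitted for brevity.

```python
# Minimal sketch (not the authors' implementation): a receding-horizon MPC that
# jointly optimizes camera extrinsics (position only, for brevity) and intrinsics
# (focal length, focus distance, aperture). Models, weights, and constants are
# illustrative assumptions, not values from the paper.
import numpy as np
from scipy.optimize import minimize

DT, HORIZON = 0.2, 5            # control period [s] and horizon length (assumed)
CIRCLE_OF_CONFUSION = 30e-6     # acceptable blur-spot diameter [m] (assumed)

def project(target_xyz, cam_xyz, focal_len):
    """Pinhole projection of a world point onto the image plane (camera looks along +x)."""
    rel = target_xyz - cam_xyz
    depth = max(rel[0], 1e-3)
    return focal_len * rel[1:] / depth, depth

def depth_of_field(focal_len, focus_dist, f_number):
    """Thin-lens approximation of the near/far limits of acceptable sharpness."""
    hyperfocal = focal_len ** 2 / (f_number * CIRCLE_OF_CONFUSION) + focal_len
    near = hyperfocal * focus_dist / (hyperfocal + focus_dist - focal_len)
    far = (hyperfocal * focus_dist / (hyperfocal - focus_dist + focal_len)
           if focus_dist < hyperfocal else np.inf)
    return near, far

def mpc_cost(u_flat, cam_xyz, intrinsics, target_traj, desired_px, desired_dof):
    """Accumulate composition, focus, depth-of-field, and effort costs over the horizon."""
    u = u_flat.reshape(HORIZON, 6)   # per step: [vx, vy, vz, d_focal, d_focus, d_fnumber]
    x, (f, s, N) = cam_xyz.copy(), intrinsics
    cost = 0.0
    for k in range(HORIZON):
        x = x + u[k, :3] * DT                               # velocity-driven extrinsics model
        f = np.clip(f + u[k, 3] * DT, 0.01, 0.4)            # focal length 10-400 mm
        s = np.clip(s + u[k, 4] * DT, 0.3, 100.0)           # focus distance [m]
        N = np.clip(N + u[k, 5] * DT, 1.4, 22.0)            # aperture f-number
        px, depth = project(target_traj[k], x, f)
        near, far = depth_of_field(f, s, N)
        dof_width = (far - near) if np.isfinite(far) else 1e3
        cost += 1e6 * np.sum((px - desired_px) ** 2)        # image-composition error
        cost += (s - depth) ** 2                            # keep the target in focus
        cost += 0.1 * (dof_width - desired_dof) ** 2        # desired depth of field (bokeh)
        cost += 1e-2 * np.sum(u[k] ** 2)                    # control effort
    return cost

# Receding horizon: solve over the horizon, apply only the first input, then repeat
# with the target state re-estimated from the next camera image.
cam_xyz = np.array([0.0, 0.0, 2.0])
intrinsics = (0.035, 8.0, 4.0)          # focal length [m], focus distance [m], f-number
target_traj = np.array([[10.0 + 0.5 * k, 0.0, 1.7] for k in range(HORIZON)])
res = minimize(mpc_cost, np.zeros(HORIZON * 6),
               args=(cam_xyz, intrinsics, target_traj, np.zeros(2), 2.0),
               method="L-BFGS-B")
print("first control input:", res.x[:6])
```

The receding-horizon pattern at the end mirrors the closed loop described in the abstract: at each step only the first optimized input is applied, the targets' states are re-estimated from the next image, and the optimization is repeated so the camera keeps the desired composition and focus as the targets move.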
