Title: Automatic Cinematography using Pans and Cuts in 360-Degree Video
Abstract: Recent strides in virtual-reality technology have enabled 360-degree videos, in which the viewer is free to look anywhere in the spherical space around the camera's point of view throughout the video. With this project, we seek to develop an algorithm that removes the human element of viewing 360-degree videos; that is, an algorithm that automates panning through the 360-degree video so that the viewer receives as output a normal-field-of-view video that still features the most interesting content and respects basic cinematographic rules. We begin with an intentionally simple algorithm and add complexity until we arrive at an algorithm that we feel is satisfactorily flexible and effective. At each step of our implementation process, we evaluate the algorithm quantitatively by its heat coverage in the input heatmap video, and qualitatively by its adherence to basic cinematographic rules and the aesthetic quality of its output. Our final version of the algorithm uses an attenuation map to impose diminishing returns on static fields of view, encouraging dynamic, human-like motion through the video. Our final algorithm has applications both in the academic space of computer vision and in the commercial space of virtual reality and film-making tools.
Type of Material: Princeton University Senior Theses
Appears in Collections: Computer Science, 1988-2020
Files in This Item:
HAMBURGER-MITCHELL-THESIS.pdf — 1.02 MB — Adobe PDF — Request a copy
Items in DataSpace are protected by copyright, with all rights reserved, unless otherwise indicated.