Parallax compensation on 360° panoramic video

Involved trainers:
Name: Frédéric Devernay
Mail:
Laboratory: INRIA
Summary:

Panoramic stitching from photographs taken by a standard camera has been around for more than a decade, and many commercial or free stitching tools are now available which implement state-of-the-art stitching methods (for example Autopano, Panorama Tools, Hugin, or Autodesk Stitcher).

For panoramic video, all images must be taken simultaneously, so the hardware system usually consists of several high-resolution cameras, and each frame of the resulting video has to be stitched from the corresponding frames of all cameras. Sample applications of panoramic video include movies for planetariums or Omnimax (Totavision) and interactive movies (Loop'in).

Image stitching consists of several phases: image alignment, image warping, and blending. For panoramic video, the alignment and warping phases can be treated with the same methods used for photographs, since the relative position of the cameras does not change over time. The blending phase, however, is very different, because a panoramic video cannot tolerate temporal artifacts, where an object suddenly jumps from one place to another. Most artifacts present in panoramas are due either to images that were not taken simultaneously (which does not occur in panoramic video) or to residual parallax caused by the cameras' optical centers not being at the exact same position in 3D (which unfortunately does happen in panoramic video, because several cameras cannot physically occupy the same place). We are thus essentially interested in methods that reduce artifacts due to residual parallax (see example below).
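
To make this phase structure concrete, the following minimal sketch (Python with OpenCV) shows how a two-camera panoramic video could be assembled frame by frame: the alignment homography is estimated once offline, while warping and blending run on every frame. The file names, output size, and the simple blend() placeholder are illustrative assumptions, not part of the project.

```python
# Minimal per-frame pipeline sketch, assuming two synchronized camera streams
# and a precomputed 3x3 homography H (alignment is done once, since the rig
# geometry is fixed; warping and blending run on every frame).
import cv2
import numpy as np

PANO_SIZE = (2000, 1000)          # assumed output panorama size (width, height)
H = np.load("alignment_H.npy")    # homography estimated offline (illustrative file)

def blend(warped_a, warped_b):
    # Placeholder: plain averaging in the overlap; a real system would use
    # feathering or Laplacian pyramid blending (see the sketch further below).
    mask_a = (warped_a.sum(axis=2) > 0).astype(np.float32)
    mask_b = (warped_b.sum(axis=2) > 0).astype(np.float32)
    weight = mask_a + mask_b
    weight[weight == 0] = 1.0
    out = (warped_a.astype(np.float32) * mask_a[..., None] +
           warped_b.astype(np.float32) * mask_b[..., None]) / weight[..., None]
    return out.astype(np.uint8)

cap_a, cap_b = cv2.VideoCapture("cam_a.mp4"), cv2.VideoCapture("cam_b.mp4")
writer = cv2.VideoWriter("pano.mp4", cv2.VideoWriter_fourcc(*"mp4v"), 30, PANO_SIZE)
while True:
    ok_a, frame_a = cap_a.read()
    ok_b, frame_b = cap_b.read()
    if not (ok_a and ok_b):
        break
    # Warping: the same alignment is reused for every frame of the video.
    warped_a = cv2.warpPerspective(frame_a, np.eye(3), PANO_SIZE)
    warped_b = cv2.warpPerspective(frame_b, H, PANO_SIZE)
    # Blending: the only stage that must be revisited for video.
    writer.write(blend(warped_a, warped_b))
writer.release()
```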

Traditional panorama blending techniques include feathering or alpha blending [1], optimal seam [2], Laplacian pyramid blending [3] or its simplification, two-band blending [4], and gradient domain blending [5,6]. Feathering and Laplacian pyramid blending are linear operations, so when the input images change slowly, the output also changes slowly. In contrast, optimal seam and gradient domain blending are the result of an optimization, and the optimum may jump abruptly from one local minimum to another even when the images change slowly. This second category should thus not be considered for panoramic video blending. However, even with the first category, a detail may slowly fade out at one place and reappear at another place in the images, so the parallax effects have to be dealt with separately.
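
As an illustration of the first (linear) category, here is a minimal sketch of Laplacian pyramid blending in the spirit of [3], assuming two already-aligned images of equal size and a soft single-channel transition mask; the number of levels and the function names are illustrative choices.

```python
# Laplacian pyramid blending sketch: blend each frequency band separately,
# using a Gaussian pyramid of the transition mask. Being linear, slowly
# varying input frames give a slowly varying blend.
import cv2
import numpy as np

def gaussian_pyramid(img, levels):
    pyr = [img.astype(np.float32)]
    for _ in range(levels):
        pyr.append(cv2.pyrDown(pyr[-1]))
    return pyr

def laplacian_pyramid(img, levels):
    gp = gaussian_pyramid(img, levels)
    lp = []
    for i in range(levels):
        up = cv2.pyrUp(gp[i + 1], dstsize=(gp[i].shape[1], gp[i].shape[0]))
        lp.append(gp[i] - up)       # band-pass detail at this level
    lp.append(gp[-1])               # coarsest low-pass residual
    return lp

def pyramid_blend(img_a, img_b, mask, levels=5):
    # mask: single-channel float image in [0, 1], 1 where img_a should dominate.
    la = laplacian_pyramid(img_a, levels)
    lb = laplacian_pyramid(img_b, levels)
    gm = gaussian_pyramid(mask, levels)
    blended = [a * m[..., None] + b * (1 - m[..., None])
               for a, b, m in zip(la, lb, gm)]
    out = blended[-1]
    for lev in blended[-2::-1]:     # collapse the pyramid, coarse to fine
        out = cv2.pyrUp(out, dstsize=(lev.shape[1], lev.shape[0])) + lev
    return np.clip(out, 0, 255).astype(np.uint8)
```

Two-band blending [4] essentially keeps only two such bands: low frequencies blended over a wide transition and high frequencies blended over a narrow one.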

Several methods have been proposed to compensate for the effects of parallax [7,8,9,10]; within this project, we propose to compensate for parallax before the blending stage. Using the stereo disparity or the optical flow between cameras in the overlap regions, we will apply view-interpolation techniques, combined with ideas borrowed from Laplacian pyramid blending, to produce panoramic videos that exhibit no temporal or spatial artifacts.
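
As a rough sketch of this idea, the overlap region of the two warped views could be brought to a common intermediate viewpoint with dense optical flow before blending. The flow method (Farnebäck), its parameters, and the backward-warping approximation below are illustrative assumptions; the project may equally rely on stereo disparity.

```python
# View-interpolation sketch: estimate dense flow between the two views of the
# overlap region and remap each view halfway toward the other, so that a
# scene detail no longer sits at two different positions before blending.
import cv2
import numpy as np

def halfway_warp(img_a, img_b):
    """Warp img_a and img_b toward an approximate midpoint view."""
    gray_a = cv2.cvtColor(img_a, cv2.COLOR_BGR2GRAY)
    gray_b = cv2.cvtColor(img_b, cv2.COLOR_BGR2GRAY)
    # Dense flow from a to b; any dense flow or disparity estimator works here.
    flow = cv2.calcOpticalFlowFarneback(gray_a, gray_b, None,
                                        0.5, 3, 15, 3, 5, 1.2, 0)
    h, w = gray_a.shape
    xs, ys = np.meshgrid(np.arange(w, dtype=np.float32),
                         np.arange(h, dtype=np.float32))
    # Backward-warping approximation: sample a half a flow vector backward and
    # b half a vector forward (assumes the flow field is locally smooth).
    mid_a = cv2.remap(img_a, xs - 0.5 * flow[..., 0], ys - 0.5 * flow[..., 1],
                      cv2.INTER_LINEAR)
    mid_b = cv2.remap(img_b, xs + 0.5 * flow[..., 0], ys + 0.5 * flow[..., 1],
                      cv2.INTER_LINEAR)
    return mid_a, mid_b
```

The two halfway-warped images can then be blended, for instance with the Laplacian pyramid scheme sketched above; since both now show each detail at approximately the same position, the blend no longer ghosts where parallax is present.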