Realtime Novel View Synthesis with Eigen-Texture Regression


Yuta Nakashima Fumio Okura Norihiko Kawai
Osaka University Osaka University NAIST
Hiroshi Kawasaki Ambrosio Blanco Katsushi Ikeuchi
Kyushu University Microsoft Research Asia Microsoft Research Asia


Realtime novel view synthesis, which generates a novel view of a real object or scene in realtime, has a wide range of applications, including augmented reality, telepresence, and immersive telecommunication. Image-based rendering (IBR) with rough geometry requires only an off-the-shelf camera and is thus accessible to many users. However, IBR from images in the wild (e.g., when the lighting conditions change or the scene contains objects with specular surfaces) has been a hard problem because of color discontinuity: IBR with rough geometry picks appropriate input images for a given viewpoint, but the image used for a rendering unit (a face or a pixel) switches as the viewpoint moves, which can cause noticeable color changes. We adopt the eigen-texture technique, which represents the images observed for a certain face as points in an eigenspace. We propose to regress, for a given viewpoint, a point in this space that moves smoothly, so that the rendered color also changes smoothly with the viewpoint. Our regressor is a neural network with a single hidden layer and hyperbolic tangent nonlinearity. We demonstrate the advantages of our IBR approach on our own datasets as well as on publicly available datasets for comparison.
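The pipeline described above — PCA over the textures observed for one face (eigen-textures), followed by a single-hidden-layer tanh network that regresses the point in the eigenspace from the viewpoint — can be sketched as follows. This is a minimal illustration on synthetic stand-in data: the array sizes, learning rate, training loop, coefficient normalization, and the `render` helper are all assumptions, and the paper's regularizer and real per-face texture extraction are omitted.

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic stand-in data (assumption: real inputs would be per-face texture
# patches extracted from calibrated input views).
n_views, tex_dim, n_coeffs = 60, 16 * 16, 4
viewpoints = rng.uniform(-1.0, 1.0, size=(n_views, 3))   # e.g. viewing directions
basis_true = rng.normal(size=(n_coeffs, tex_dim))        # hidden generative basis
textures = np.tanh(viewpoints @ rng.normal(size=(3, n_coeffs))) @ basis_true
textures += 0.01 * rng.normal(size=textures.shape)       # observation noise

# Eigen-texture decomposition: PCA over the textures of a single face.
mean_tex = textures.mean(axis=0)
U, S, Vt = np.linalg.svd(textures - mean_tex, full_matrices=False)
eigen_textures = Vt[:n_coeffs]                           # basis "eigen-textures"
coeffs = (textures - mean_tex) @ eigen_textures.T        # points in the eigenspace

# Normalize the eigenspace coordinates so one learning rate works for all of
# them (an illustrative preprocessing choice, not from the paper).
coeff_scale = coeffs.std(axis=0) + 1e-8
targets = coeffs / coeff_scale

# Regressor: one hidden layer with tanh, mapping viewpoint -> eigenspace point.
h, lr = 32, 0.05
W1 = 0.5 * rng.normal(size=(3, h)); b1 = np.zeros(h)
W2 = 0.5 * rng.normal(size=(h, n_coeffs)); b2 = np.zeros(n_coeffs)
for step in range(2000):
    z = np.tanh(viewpoints @ W1 + b1)
    err = (z @ W2 + b2) - targets
    # Backpropagation of the mean squared error, then a gradient step.
    gW2 = z.T @ err / n_views; gb2 = err.mean(axis=0)
    dz = (err @ W2.T) * (1.0 - z ** 2)
    gW1 = viewpoints.T @ dz / n_views; gb1 = dz.mean(axis=0)
    W2 -= lr * gW2; b2 -= lr * gb2; W1 -= lr * gW1; b1 -= lr * gb1

def render(viewpoint):
    """Regress a smoothly moving eigenspace point and decode it to a texture."""
    z = np.tanh(viewpoint @ W1 + b1)
    return mean_tex + ((z @ W2 + b2) * coeff_scale) @ eigen_textures
```

Because the regressed point is a smooth function of the viewpoint, the decoded texture varies continuously as the camera moves, avoiding the color pops caused by switching between input images.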




Paper (1.7 MB)

Supplementary material (2.0 MB)

Input data and our results (136 MB, 156 MB)

Evolution of Coefficients

Evolution of the point in the eigenspace for the Nonlinear (top), Nonlinear without regularizer (middle), and Linear (bottom) regressors. The vertical axes represent time, and the horizontal axes the value of each coordinate of the eigenspace. Note that the vertical axis for the Linear regressor differs from the others. Corresponding image patches are shown below the plots.

Experiments for extrapolation

Viewpoint paths (black lines) for the triangle pyramid (left) and Buddha (right) datasets.

Example novel views rendered from viewpoints far from the input images for the triangle pyramid and Buddha datasets. The viewpoint from which each novel view is rendered is indicated by the red pyramid on the viewpoint path.


@inproceedings{nakashima2017realtime,
    author    = {Yuta Nakashima and Fumio Okura and Norihiko Kawai and Hiroshi Kawasaki and Ambrosio Blanco and Katsushi Ikeuchi},
    title     = {Realtime Novel View Synthesis with Eigen-Texture Regression},
    booktitle = {Proceedings of the British Machine Vision Conference (BMVC)},
    year      = {2017},
}