Real-Time View Planning for Unstructured Lumigraph Modeling

Okan Erat, Markus Hoell, Karl Haubenwallner, Christian Pirchheim, Dieter Schmalstieg

Research output: Contribution to journal › Article › Research › peer-review

Abstract

We propose an algorithm for generating an unstructured lumigraph in real time from an image stream. This problem has important applications in mixed reality, such as telepresence, interior design, or as-built documentation. Unlike conventional texture optimization in structure from motion, our method must choose views from the input stream in a strictly incremental manner, since only a small number of views can be stored or transmitted. This requires formulating an online variant of the well-known view-planning problem, which must take into account which parts of the scene have already been seen and how the lumigraph sample distribution could improve in the future. We address this highly unconstrained problem by regularizing the scene structure with a regular grid. On this grid, we define a coverage metric describing how well the lumigraph samples cover the grid in terms of spatial and angular resolution, and we greedily keep incoming views if they improve the coverage. We evaluate the performance of our algorithm quantitatively and qualitatively on a variety of synthetic and real scenes, and demonstrate visually appealing results obtained at real-time frame rates (3 Hz to 100 Hz per incoming image, depending on configuration).
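To make the greedy selection step concrete, the following sketch (in Python, not the authors' implementation) illustrates an online keyframe selector of the kind described above: an incoming view is kept only if it adds at least one new (grid cell, viewing-direction bin) sample to the coverage record. The grid resolution, number of direction bins, view budget, and the externally supplied visibility lists are illustrative assumptions.

    import numpy as np

    class GreedyViewPlanner:
        """Online, greedy keyframe selection over a regular scene grid (illustrative sketch)."""

        def __init__(self, grid_shape=(8, 8, 8), n_dir_bins=6, max_views=50):
            # coverage[i, j, k, d] is True once a kept view has sampled grid
            # cell (i, j, k) from quantized viewing-direction bin d.
            self.coverage = np.zeros(grid_shape + (n_dir_bins,), dtype=bool)
            self.max_views = max_views
            self.kept_views = []

        def _footprint(self, visible_cells, dir_bins):
            # Boolean mask of the (cell, direction-bin) samples this view offers.
            fp = np.zeros_like(self.coverage)
            for (i, j, k), d in zip(visible_cells, dir_bins):
                fp[i, j, k, d] = True
            return fp

        def offer(self, view, visible_cells, dir_bins):
            # Keep the view only if it contributes at least one sample that no
            # previously kept view has covered yet (i.e., coverage improves).
            if len(self.kept_views) >= self.max_views:
                return False
            fp = self._footprint(visible_cells, dir_bins)
            gain = np.count_nonzero(fp & ~self.coverage)
            if gain == 0:
                return False
            self.coverage |= fp
            self.kept_views.append(view)
            return True

    # Toy usage: a view seeing two cells from direction bins 1 and 4 is kept;
    # offering the identical view again adds no coverage and is rejected.
    planner = GreedyViewPlanner()
    print(planner.offer("frame_0", [(0, 0, 0), (0, 0, 1)], [1, 4]))  # True
    print(planner.offer("frame_1", [(0, 0, 0), (0, 0, 1)], [1, 4]))  # False

In the actual method, the per-view visibility and direction binning would be derived from the tracked camera pose and the grid geometry rather than passed in by hand.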
Original language: English
Pages (from-to): 3063-3072
Number of pages: 10
Journal: IEEE Transactions on Visualization and Computer Graphics
Volume: 25
Issue number: 11
ISSN: 1077-2626
Publisher: IEEE Computer Society
DOI: 10.1109/TVCG.2019.2932237
Publication status: Published - 2019

Keywords

  • Lumigraph
  • multi-view
  • real-time
  • view planning
  • virtual reality
  • rendering
  • keyframe selection

Fields of Expertise

  • Information, Communication & Computing

Cite this

Real-Time View Planning for Unstructured Lumigraph Modeling. / Erat, Okan; Hoell, Markus; Haubenwallner, Karl; Pirchheim, Christian; Schmalstieg, Dieter.

In: IEEE Transactions on Visualization and Computer Graphics, Vol. 25, No. 11, 2019, p. 3063-3072.

