Abstract

We propose an algorithm for generating an unstructured lumigraph in real-time from an image stream. This problem has important applications in mixed reality, such as telepresence, interior design or as-built documentation. Unlike conventional texture optimization in structure from motion, our method must choose views from the input stream in a strictly incremental manner, since only a small number of views can be stored or transmitted. This requires formulating an online variant of the well-known view-planning problem, which must take into account what parts of the scene have already been seen and how the lumigraph sample distribution could improve in the future. We address this highly unconstrained problem by regularizing the scene structure using a regular grid structure. Upon the grid structure, we define a coverage metric describing how well the lumigraph samples cover the grid in terms of spatial and angular resolution, and we greedily keep incoming views if they improve the coverage. We evaluate the performance of our algorithm quantitatively and qualitatively on a variety of synthetic and real scenes, and demonstrate visually appealing results obtained at real-time frame rates (in the range of 3Hz-100Hz per incoming image, depending on configuration).
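The greedy, coverage-driven keyframe selection described in the paper's abstract can be sketched as follows. This is a minimal illustrative sketch, not the authors' implementation: the `View` type, the azimuth-sector quantization, and the `(cell, angular bin)` coverage metric are simplified assumptions standing in for the paper's spatial/angular grid coverage.

```python
import math
from dataclasses import dataclass

# Hypothetical sketch: keep an incoming view only if it improves a
# coverage metric defined over a regular grid, in the spirit of the
# abstract. All names and the metric itself are simplified assumptions.

@dataclass(frozen=True)
class View:
    position: tuple      # camera position (x, y, z)
    cells: frozenset     # grid cells (i, j) visible from this view

def angular_bin(view, cell, bins=8):
    """Quantize the viewing direction toward a cell center into one of
    `bins` azimuth sectors (a crude stand-in for angular resolution)."""
    cx, cy = cell[0] + 0.5, cell[1] + 0.5
    dx, dy = cx - view.position[0], cy - view.position[1]
    return int((math.atan2(dy, dx) + math.pi) / (2 * math.pi) * bins) % bins

def coverage_gain(view, covered):
    """Count (cell, angular-bin) pairs this view would newly cover."""
    return sum(1 for c in view.cells if (c, angular_bin(view, c)) not in covered)

def select_keyframes(stream, min_gain=1):
    """Greedily keep incoming views that improve grid coverage;
    views adding fewer than `min_gain` new pairs are discarded."""
    covered, kept = set(), []
    for view in stream:
        if coverage_gain(view, covered) >= min_gain:
            kept.append(view)
            covered.update((c, angular_bin(view, c)) for c in view.cells)
    return kept
```

The strictly incremental structure mirrors the online constraint stated in the abstract: each view is accepted or dropped when it arrives, without revisiting past decisions. A duplicate view adds no new (cell, bin) pairs and is skipped, while a view observing an already-covered cell from a sufficiently different direction still improves angular coverage and is kept.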
Original language | English
---|---
Pages (from-to) | 3063-3072
Number of pages | 10
Journal | IEEE Transactions on Visualization and Computer Graphics
Volume | 25
Issue number | 11
DOIs | 10.1109/TVCG.2019.2932237
Publication status | Published - 2019
Keywords
- Lumigraph
- multi-view
- real-time
- view planning
- virtual reality
- rendering
- keyframe selection
Fields of Expertise
- Information, Communication & Computing
Cite this
Real-Time View Planning for Unstructured Lumigraph Modeling. / Erat, Okan; Hoell, Markus; Haubenwallner, Karl; Pirchheim, Christian; Schmalstieg, Dieter.
In: IEEE Transactions on Visualization and Computer Graphics, Vol. 25, No. 11, 2019, p. 3063-3072.

Research output: Contribution to journal › Article › Research › peer-review
TY - JOUR
T1 - Real-Time View Planning for Unstructured Lumigraph Modeling
AU - Erat, Okan
AU - Hoell, Markus
AU - Haubenwallner, Karl
AU - Pirchheim, Christian
AU - Schmalstieg, Dieter
PY - 2019
Y1 - 2019
N2 - We propose an algorithm for generating an unstructured lumigraph in real-time from an image stream. This problem has important applications in mixed reality, such as telepresence, interior design or as-built documentation. Unlike conventional texture optimization in structure from motion, our method must choose views from the input stream in a strictly incremental manner, since only a small number of views can be stored or transmitted. This requires formulating an online variant of the well-known view-planning problem, which must take into account what parts of the scene have already been seen and how the lumigraph sample distribution could improve in the future. We address this highly unconstrained problem by regularizing the scene structure using a regular grid structure. Upon the grid structure, we define a coverage metric describing how well the lumigraph samples cover the grid in terms of spatial and angular resolution, and we greedily keep incoming views if they improve the coverage. We evaluate the performance of our algorithm quantitatively and qualitatively on a variety of synthetic and real scenes, and demonstrate visually appealing results obtained at real-time frame rates (in the range of 3Hz-100Hz per incoming image, depending on configuration).
KW - Lumigraph
KW - multi-view
KW - real-time
KW - view planning
KW - virtual reality
KW - rendering
KW - keyframe selection
U2 - 10.1109/TVCG.2019.2932237
DO - 10.1109/TVCG.2019.2932237
M3 - Article
VL - 25
SP - 3063
EP - 3072
JO - IEEE Transactions on Visualization and Computer Graphics
JF - IEEE Transactions on Visualization and Computer Graphics
SN - 1077-2626
IS - 11
ER -