MASSIVE TIME-LAPSE POINT CLOUD RENDERING with VR
Innfarn Yoo, Markus Schuetz


SLIDE 1

April 4-7, 2016 | Silicon Valley

Innfarn Yoo, OpenGL Chips and Core | Markus Schuetz, Professional Visualization

MASSIVE TIME-LAPSE POINT CLOUD RENDERING with VR

SLIDE 2

AGENDA

  • Introduction
  • Previous Work
  • Methods
    • Progressive Blue-Noise Point Cloud
    • High-Quality VR Point Cloud
  • Conclusion
  • Demos

SLIDE 3

INTRODUCTION

Point Cloud
  • A set of points that represents the external surface of an object

Our Dataset: Project Endeavor
  • New NVIDIA building under construction
  • Time-Lapse Point Clouds

SLIDE 4

INTRODUCTION

Point Cloud Representation

Advantages
  • Simplicity
  • Scalability
  • Ease of capturing
  • Ease of data handling

Disadvantages
  • Visually incomplete
  • Easy to pick up noise
  • Increasing data size

SLIDE 5

INTRODUCTION

Our Focus
  • Point Cloud
    • Massive Scale (more than 1 TB)
    • Time-Lapse
    • VR Rendering
  • Real-Time Rendering
  • Instant Scalability
  • Efficient Out-of-Core Design
  • Plausible Visualization
  • High-Quality VR Experience

SLIDE 6

INTRODUCTION

Our Contributions
  • A Novel Approach for Massive Time-Lapse Point Cloud Rendering
  • Adapting Progressive Blue-Noise Point Cloud (PBNPC) Resampling
  • High-Quality Point Cloud VR Experience
  • Our method provides several important features: performance, quality, & navigation

SLIDE 7

AGENDA

  • Introduction
  • Previous Work
  • Methods
    • Progressive Blue-Noise Point Cloud
    • High-Quality VR Point Cloud
  • Conclusion
  • Demos

SLIDE 8

PREVIOUS WORK

Li et al., Analyzing Growing Plants from 4D Point Cloud Data, 2013, Transactions on Graphics

Image source: Li et al., Analyzing growing plants from 4D point cloud data, 2013, ToG

Plant growth analysis using time-lapse point cloud

SLIDE 9

PREVIOUS WORK

First Person Hyper-lapse Video, Kopf et al., 2014, Transactions on Graphics

Image source: Kopf et al., First Person Hyper-lapse Video, 2014, ToG

Hyper-lapse from video (sequence of images)

SLIDE 10

AGENDA

  • Introduction
  • Previous Work
  • Methods
    • Progressive Blue-Noise Point Cloud
    • High-Quality VR Point Cloud
  • Conclusion
  • Demos

SLIDE 11

PROGRESSIVE BLUE-NOISE POINT CLOUD (PBNPC)

SLIDE 12

PROBLEMS: DATA SIZE, COLOR MISMATCH, REGISTRATION

Data Size
  • Per-Day PC: 1.5 GB
  • 2 Years: 1 TB
  • Current: 120 GB
  • GPU Memory: up to 24 GB (NVIDIA Quadro M6000)
  • Out-of-Core Loading

Color Mismatch
  • Daily weather changes
  • Capturing time changes
  • Sun position changes
  • Shadows

Registration
  • Drone captured, capturing is not perfect
  • Actual site changes by daily construction

SLIDE 13

  • 1. Data Size: 1.5 GB Per Day
SLIDE 14

  • 2. Color Mismatch

July 27, 2015 July 30, 2015

SLIDE 15

  • 3. Registration Problem
SLIDE 16

PIPELINE OVERVIEW

Preprocessing: Input Point Cloud (LAS files) -> PBNPC Creation -> Registration -> Color Correction

Real-Time Processing: Estimating Time-Lapse Speed -> Async Loading & Removing -> Sparse Buffer Filling -> Sparse Vertex Buffer -> Adjusting # of Points & Rendering

SLIDE 17

INPUT FILES

  • Drone captures a point cloud every 2-3 days
  • Input file format: LAS (libLAS)
  • Sampling density varies per day
  • Boundary varies per day (noise)
  • 1.5 GB per capture × 86 captures ≈ 120 GB
  • Reduced to 50 GB

Input Point Cloud (LAS files) -> PBNPC Creation -> Registration -> Color Correction

SLIDE 18

MAJOR PROBLEMS

Massive-Scale Point Cloud Data Handling: we need “Immediate Scalability” while preserving visual quality

Input Point Cloud (LAS files) -> PBNPC Creation -> Registration -> Color Correction

SLIDE 19

BLUE-NOISE POINT CLOUD

Types of noise, depending on frequency distribution: White Noise, Pink Noise, and Blue Noise

Blue-Noise Point Cloud
  • Nearly Poisson Distribution
  • Approximation of Human Retina Cells’ Distribution
  • Visually Plausible

Image source: Recursive Wang Tiles for Real-Time Blue Noise, Kopf et al., 2006, ACM SIGGRAPH

SLIDE 20

PROGRESSIVE BLUE-NOISE POINT CLOUD

Progressive Blue-Noise Point Cloud (PBNPC)
  • Adding or removing any number of points preserves Blue-Noise Characteristics
  • Using “Recursive Wang Tiles for Real-Time Blue Noise”, Kopf et al., 2006, ACM SIGGRAPH

Image source: Recursive Wang Tiles for Real-Time Blue Noise, Kopf et al., 2006, ACM SIGGRAPH
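As a rough illustration of why PBNPC gives instant LOD (a hypothetical C sketch, not the presenters' code): if points are stored in a precomputed progressive blue-noise order, any prefix of the array is itself a blue-noise subset, so selecting a level of detail reduces to choosing a prefix length.

```c
#include <assert.h>
#include <stddef.h>

/* Hypothetical sketch: points are assumed to be stored in a precomputed
 * progressive blue-noise order (e.g., via recursive Wang tiles), so any
 * prefix of the array is itself blue-noise distributed. Adding or removing
 * points then means growing or shrinking the rendered prefix. */
typedef struct { float x, y, z; unsigned char r, g, b; } Point;

/* Number of points to draw for a given LOD percentage (clamped to 0..100). */
static size_t pbnpc_prefix_count(size_t total_points, float lod_percent)
{
    if (lod_percent <= 0.0f)   return 0;
    if (lod_percent >= 100.0f) return total_points;
    return (size_t)((double)total_points * ((double)lod_percent / 100.0));
}
```

Rendering the subset is then a single draw call over the first `pbnpc_prefix_count(...)` vertices, with no runtime re-sampling.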

SLIDE 21

VIDEO: PBNPC

SLIDE 22

REGISTRATION

Input Point Cloud (LAS files) -> PBNPC Creation -> Registration -> Color Correction

We tried to use Approximate Nearest Neighbors (ANN) to align the Point Clouds

  • Several problems: too slow and low accuracy

Our approach instead:
  • Render depth maps from several different camera positions
  • Generate gradient maps from the depth maps
  • Octree-based Search + Hill Climbing Algorithm
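A minimal 1D stand-in for the hill-climbing part (hypothetical: the real pipeline climbs over an alignment pose and scores it by comparing gradient maps rendered from several cameras, which is replaced here by a toy cost function):

```c
#include <assert.h>

/* Hill climbing over a single offset: probe both neighbors, move toward
 * lower cost, and halve the step when neither neighbor improves. */
typedef double (*cost_fn)(double offset);

static double hill_climb(cost_fn cost, double start, double step, int max_iters)
{
    double x = start;
    for (int i = 0; i < max_iters && step > 1e-9; ++i) {
        double c = cost(x);
        if      (cost(x + step) < c) x += step;
        else if (cost(x - step) < c) x -= step;
        else                         step *= 0.5;  /* refine around current best */
    }
    return x;
}

/* Toy registration cost with its minimum at offset = 2.0. */
static double toy_cost(double x) { return (x - 2.0) * (x - 2.0); }
```

The octree-based search would supply coarse starting offsets; the climb then refines each candidate locally.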

SLIDE 23

TIME-LAPSE COLOR CORRECTION

  • We are not correcting colors YET
  • Instead: Time-Lapse Blending between Days
  • Blending alleviates the color mismatching problem
  • Blending cannot solve shadow (sun position) and color distribution problems

Input Point Cloud (LAS files) -> PBNPC Creation -> Registration -> Color Correction
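The day-to-day blend can be sketched as a linear interpolation of point colors by the fractional time-lapse position (a hypothetical illustration; the deck does not spell out the exact blend function):

```c
#include <assert.h>

/* During a transition both the outgoing and incoming day's point clouds
 * are drawn, and their colors are weighted by the fractional time-lapse
 * position t in [0, 1]: t = 0 shows day A, t = 1 shows day B. */
typedef struct { float r, g, b; } Color;

static Color blend_days(Color day_a, Color day_b, float t)
{
    Color out;
    out.r = (1.0f - t) * day_a.r + t * day_b.r;
    out.g = (1.0f - t) * day_a.g + t * day_b.g;
    out.b = (1.0f - t) * day_a.b + t * day_b.b;
    return out;
}
```

This smooths abrupt color jumps between captures but, as the slide notes, cannot remove systematic shadow or color-distribution differences.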

SLIDE 24

TIME-LAPSE COLOR CORRECTION

No Blending

SLIDE 25

TIME-LAPSE COLOR CORRECTION

Blending

SLIDE 26

PIPELINE OVERVIEW

Preprocessing: Input Point Cloud (LAS files) -> PBNPC Creation -> Registration -> Color Correction

Real-Time Processing: Estimating Time-Lapse Speed -> Async Loading & Removing -> Sparse Buffer Filling -> Sparse Vertex Buffer -> Adjusting # of Points & Rendering

SLIDE 27

OPENGL 4.5, SPARSE BUFFER

  • Newly introduced OpenGL 4.5 extension (https://www.opengl.org/registry/specs/ARB/sparse_buffer.txt)
  • Decouples the GPU’s virtual and physical memory
  • Similar to the ARB_sparse_texture extension
  • We use the Sparse Buffer as a stack (prepare the entire virtual memory per daily PC)

ARB_sparse_buffer

Estimating Time-Lapse Speed -> Async Loading & Removing -> Sparse Buffer Filling -> Sparse Vertex Buffer -> Adjusting # of Points & Rendering

SLIDE 28

OPENGL 4.5, SPARSE BUFFER

ARB_sparse_buffer
  • We allocate virtual memory for the entire time-lapse point clouds: glBufferStorage(target, size, data_ptr, GL_SPARSE_STORAGE_BIT_ARB)
    • data_ptr is ignored when the sparse storage bit is set
  • Physical memory can be committed using glBufferPageCommitmentARB(target, offset, size, commit)
    • commit is GL_TRUE to commit or GL_FALSE to decommit memory
  • Greatly eases our effort to manage GPU memory
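glBufferPageCommitmentARB requires offset and size to be multiples of the sparse page size (queried as GL_SPARSE_BUFFER_PAGE_SIZE_ARB), so commit requests have to be aligned first. A hedged sketch of that bookkeeping (the 65536-byte page size in the test is only illustrative, and the GL call itself needs a live context, so it appears as a comment):

```c
#include <assert.h>

typedef unsigned long long u64;

/* Align a byte range to sparse-buffer pages: round the start down and the
 * end up, so the committed region fully covers the requested range. */
static void align_commit_range(u64 offset, u64 size, u64 page_size,
                               u64 *out_offset, u64 *out_size)
{
    u64 begin = (offset / page_size) * page_size;
    u64 end   = ((offset + size + page_size - 1) / page_size) * page_size;
    *out_offset = begin;
    *out_size   = end - begin;
    /* With a live GL context one would then call:
     * glBufferPageCommitmentARB(GL_ARRAY_BUFFER, begin, end - begin, GL_TRUE);
     * and later GL_FALSE to decommit the pages of an unloaded day. */
}
```

Treating the sparse buffer as a stack then means committing pages at the top when a day's points stream in, and decommitting them when the time-lapse moves past that day.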

SLIDE 29

ESTIMATING TIME-LAPSE SPEED

  • Calculating Time-Lapse Direction & Speed
  • The amount of Async Loading Requests is based on probability
  • Probability is adjusted by Time-Lapse Direction & Speed

Based on Loading & Unloading Probability

Estimating Time-Lapse Speed -> Async Loading & Removing -> Sparse Buffer Filling -> Sparse Vertex Buffer -> Adjusting # of Points & Rendering

SLIDE 30

ADJUSTING # OF POINTS

  • For Real-Time Rendering (Normal > 60 Hz & VR > 90 Hz)
  • Instant Level of Detail (LOD) by using the Progressive Blue-Noise Point Cloud
  • Simply adjust the LOD percentage until the target FPS is reached

Estimating Time-Lapse Speed -> Async Loading & Removing -> Sparse Buffer Filling -> Sparse Vertex Buffer -> Adjusting # of Points & Rendering
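The "adjust until target FPS" loop might look like the following hypothetical sketch (the gain of 5.0 and the clamping bounds are assumed tuning choices, not values from the talk):

```c
#include <assert.h>

/* Nudge the LOD percentage each frame based on the measured frame time.
 * Positive error (frame faster than target) raises the LOD; negative
 * error lowers it. The gain and bounds are illustrative assumptions. */
static float adjust_lod(float lod_percent, float frame_ms, float target_ms)
{
    float next = lod_percent + 5.0f * (target_ms - frame_ms) / target_ms;
    if (next < 1.0f)   next = 1.0f;   /* keep at least some points visible */
    if (next > 100.0f) next = 100.0f;
    return next;
}
```

Because PBNPC makes any prefix a valid blue-noise subset, this per-frame percentage maps directly to a draw-call vertex count, so the control loop is cheap.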

SLIDE 31

HIGH-QUALITY VR POINT CLOUD

SLIDE 32

VR POINT CLOUDS

Performance, Quality, User Interaction

SLIDE 33

VR PERFORMANCE

  • Each point cloud has at least 30 million points
  • Rendering 2 point clouds during time-slice transitions
  • Rendering twice (once for each eye)
  • At 90 Frames per Second
  • Plus some more to account for additional VR overhead
SLIDE 34

VR PERFORMANCE

  • Out-Of-Core data structures necessary
  • Multi-Resolution Octree used for Point Cloud VR demo

(“Interactions with Gigantic Point Clouds”, Claus Scheiblauer)

  • Load and render only visible parts up to desired Level of Detail

source: “Potree: Rendering Large Point Clouds in Web Browsers”, Markus Schuetz

SLIDE 35

VR PERFORMANCE

  • High Level-Of-Detail near the camera
  • Render only ~3 million points out of billions
  • 1.3 billion points at 500-700 FPS on a Quadro M6000 24GB

source: “Potree: Rendering Large Point Clouds in Web Browsers”, Markus Schuetz

SLIDE 36

VR POINT CLOUDS

Quality

  • Strong aliasing inherent to Point Cloud Rendering
  • Surfaces made up of overlapping points that occlude each other; the point closest to the camera wins
  • Aliasing more noticeable in VR due to constant motion and low resolution
  • Perceived as “sparkling”
SLIDE 37

SOURCES OF ALIASING: OCCLUSIONS, SILHOUETTES, LEVEL OF DETAIL

Occlusions
  • Surface patches made up of overlapping points
  • Points fighting for visibility

Silhouettes
  • Model silhouettes
  • Point sprite silhouettes

Level of Detail
  • Building the Multi-Resolution Octree considers only point coordinates
  • Like nearest-neighbor sampling

source: “Potree: Rendering Large Point Clouds in Web Browsers”, Markus Schuetz

SLIDE 38

SOURCES OF ALIASING

Occlusions

  • Surface Patches made up of overlapping points that constantly fight for visibility
  • Blend fragments together instead of rendering only the closest

(“High-quality surface splatting on today's GPUs”, Botsch et al., 2005)

  • Using screen-aligned circles instead of oriented splats

source: “Potree: Rendering Large Point Clouds in Web Browsers”, Markus Schuetz

SLIDE 39

SOURCES OF ALIASING

Level-of-Detail

  • Additionally store interpolated colors in lower Levels-Of-Detail
  • Like Mip-Mapping for point clouds
  • Interpolated colors partially reduce occlusion-aliasing
SLIDE 40

SOURCES OF ALIASING

Silhouettes
  • Large amount of silhouettes due to holes from incompletely or sparsely captured 3D data, and noise
  • Plus: each point sprite has its own small silhouette
  • The smallest HMD movements and noise constantly change silhouettes
  • Using MSAA to reduce these aliasing artifacts
  • MSAA partially reduces occlusion-aliasing, too

SLIDE 41

HIGH-QUALITY POINT CLOUDS IN VR

  • MSAA and High-Quality (HQ) Splatting currently not compatible
  • HQ-Splatting solves occlusion
  • MSAA and averaged color LODs partially reduce occlusions
  • MSAA faster and eliminates most “sparkling” effects
  • MSAA, together with averaged color LODs, better suited for VR
SLIDE 42

HIGH-QUALITY POINT CLOUDS IN VR

Transitions
  • Render into different framebuffers, then combine
  • Reduce performance impact by rendering one time-slice:
    • at a lower resolution
    • or without MSAA
  • Quality reduction is less noticeable during the transition

SLIDE 43

HIGH-QUALITY POINT CLOUDS IN VR

SLIDE 44

VR POINT CLOUDS

  • Point clouds rarely dense enough for world-scale first-person navigation
  • Table-Scale navigation -> Treat like a small model on a table or floating in space
  • Grab gesture to move model
  • Pinch-To-Zoom gestures (scale, rotate and move)
  • Measurements

User Interaction

SLIDE 45

VR POINT CLOUDS

  • Grab a point in space to drag and drop the model

Navigation

SLIDE 46

VR POINT CLOUDS

Navigation

  • Grab a region-of-interest to enlarge, shrink and rotate it
SLIDE 47

VR POINT CLOUDS

Controller Assignments

  • Measurements
  • Previous / Next Day

SLIDE 48

AGENDA

  • Introduction
  • Previous Work
  • Methods
    • Progressive Blue-Noise Point Cloud
    • High-Quality VR Point Cloud
  • Conclusion
  • Demos

SLIDE 49

CONCLUSION

  • Methods to render large point cloud data with hundreds of time-slices
  • High-Quality subsampling and progressive LOD using PBNPC
  • High-Quality Point Cloud Rendering for VR
  • Fast and precise navigation
SLIDE 50

LIMITATIONS AND FUTURE WORK

  • PBNPC sampled in 2D -> extend to 3D
  • Correcting the color mismatching problem
  • Integrating NVIDIA’s VR SLI extension
  • Eliminate pitfalls in VR navigation
  • Multi-Res PBNPC for arbitrarily large Point Clouds
SLIDE 51

NVIDIA’S NEW BUILDING VISUALIZATION

SLIDE 52

AGENDA

  • Introduction
  • Previous Work
  • Methods
    • Progressive Blue-Noise Point Cloud
    • High-Quality VR Point Cloud
  • Conclusion
  • Demos

SLIDE 53

April 4-7, 2016 | Silicon Valley

THANK YOU