DOI: 10.1007/978-3-031-20713-6_9
Article

Interactive Virtual Reality Exploration of Large-Scale Datasets Using Omnidirectional Stereo Images

Published: 03 October 2022

Abstract

Virtual reality offers unique affordances that can benefit the scientific discovery process. However, virtual reality applications must maintain very high frame rates to provide immersion and prevent adverse effects such as visual fatigue and motion sickness. Maintaining high frame rates can be challenging when visualizing large-scale scientific data. One successful technique for enabling interactive exploration of large-scale datasets is to create a large image collection from a structured sampling of camera positions, time steps, and visualization operators. This paper highlights our work to adapt this technique for virtual reality, using two authentic scientific datasets: a) a large-scale simulation of cancer cell transport and capture in a microfluidic device, and b) a large-scale molecular dynamics simulation of graphene for creating extremely low-friction interactions. We create a collection of omnidirectional stereoscopic images (three-dimensional surround-view panoramas), each of which captures all possible view angles from a given location. Therefore, virtual reality devices can always render local movements at full frame rates without loading a new image from the collection.
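As an illustration of the technique described above, the TypeScript sketch below indexes pre-rendered omnidirectional stereo panoramas by camera position and time step and selects the nearest one when the viewer teleports or changes time step. The PanoramaEntry type, file names, and nearest-neighbor lookup are assumptions made for illustration, not the authors' implementation; the property it demonstrates is that head rotation never requires fetching a new image, because each panorama already contains every view angle from its capture location.

// Minimal sketch (not the authors' code): selecting a pre-rendered
// omnidirectional stereo panorama from a structured image collection.
// The collection is assumed to be sampled over camera positions and
// simulation time steps; names and file paths here are hypothetical.

interface PanoramaEntry {
  position: [number, number, number]; // camera location used when rendering the panorama
  timeStep: number;                   // simulation time step captured by the panorama
  url: string;                        // equirectangular stereo image (e.g. top/bottom layout)
}

function squaredDistance(a: [number, number, number], b: [number, number, number]): number {
  const dx = a[0] - b[0];
  const dy = a[1] - b[1];
  const dz = a[2] - b[2];
  return dx * dx + dy * dy + dz * dz;
}

// Nearest-neighbor lookup: only teleporting to a new location or scrubbing to a
// different time step loads a new image; head rotation is rendered locally from
// the current panorama at full frame rate.
function selectPanorama(
  collection: PanoramaEntry[],
  viewerPosition: [number, number, number],
  timeStep: number
): PanoramaEntry | undefined {
  let best: PanoramaEntry | undefined;
  let bestDist = Infinity;
  for (const entry of collection) {
    if (entry.timeStep !== timeStep) continue;
    const d = squaredDistance(entry.position, viewerPosition);
    if (d < bestDist) {
      bestDist = d;
      best = entry;
    }
  }
  return best;
}

// Usage example with a hypothetical two-image collection at time step 42.
const collection: PanoramaEntry[] = [
  { position: [0, 0, 0], timeStep: 42, url: "pano_t42_p0.png" },
  { position: [5, 0, 0], timeStep: 42, url: "pano_t42_p1.png" },
];
console.log(selectPanorama(collection, [4, 0, 1], 42)?.url); // -> "pano_t42_p1.png"

In a viewer built this way, the selected image would typically be mapped onto a sphere surrounding the user, with the two halves of the stereo layout routed to the left and right eyes.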



Published in
Advances in Visual Computing: 17th International Symposium, ISVC 2022, San Diego, CA, USA, October 3–5, 2022, Proceedings, Part I
October 2022, 485 pages
ISBN: 978-3-031-20712-9
DOI: 10.1007/978-3-031-20713-6
Editors: George Bebis, Bo Li, Angela Yao, Yang Liu, Ye Duan, Manfred Lau, Rajiv Khadka, Ana Crisan, Remco Chang

© The Author(s), under exclusive license to Springer Nature Switzerland AG 2022

Publisher
Springer-Verlag, Berlin, Heidelberg
