Enter OceanLens, a software tool that fuses topographic and bathymetric (underwater topography) data into an interactive 3D environment. It’s a perfect blend of immersive technology and data science. We’re able to pull sound propagation data, water temperature, terrain type, video, sonar data, and a lot more. OceanLens can layer and toggle these data sources, displaying them on a virtual globe at multiple levels of image resolution.
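To give a rough sense of what layering and toggling data sources on a virtual globe can look like in code, here is a minimal Python sketch. The class names, fields, and resolution values are illustrative assumptions for this article, not OceanLens’s actual API.

```python
# Hypothetical sketch of layered data toggling on a virtual globe.
# DataLayer and GlobeScene are illustrative names, not OceanLens's real classes.
from dataclasses import dataclass, field

@dataclass
class DataLayer:
    name: str              # e.g. "bathymetry", "water_temperature", "sonar"
    resolution_m: float    # grid resolution in meters per cell
    visible: bool = True

@dataclass
class GlobeScene:
    layers: dict[str, DataLayer] = field(default_factory=dict)

    def add_layer(self, layer: DataLayer) -> None:
        self.layers[layer.name] = layer

    def toggle(self, name: str) -> None:
        self.layers[name].visible = not self.layers[name].visible

    def visible_layers(self, max_resolution_m: float) -> list[str]:
        """Layers worth rendering at the camera's current zoom level."""
        return [layer.name for layer in self.layers.values()
                if layer.visible and layer.resolution_m <= max_resolution_m]

scene = GlobeScene()
scene.add_layer(DataLayer("bathymetry", resolution_m=100))
scene.add_layer(DataLayer("water_temperature", resolution_m=1000))
scene.toggle("water_temperature")
print(scene.visible_layers(max_resolution_m=500))  # -> ['bathymetry']
```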
With OceanLens, you can view the undersea domain from any vantage point on a flat-screen TV or a VR headset. As you move the virtual camera, you see an immersive and interactive virtual rendering of the ocean floor.
OceanLens lets planners and operators import and merge data sets, whether fictional, historical, or live, to provide a base layer for undersea sensor placement, intelligence data processing, common operational picture development, mission planning, and more. It’s built for everything from ship navigation to planning the construction of bridges, dams, and oil rigs.
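As a simple illustration of what merging historical and live data sets into one base layer might involve, here is a short Python sketch. The function, the last-source-wins precedence rule, and the sample coordinates and depths are all notional and are not OceanLens’s actual merge logic.

```python
# Illustrative sketch: merge several (lat, lon) -> depth grids into one base layer.
# Later sources (e.g., a live survey) override earlier ones (e.g., a historical chart).
from typing import Iterable

def merge_depth_grids(
    sources: Iterable[dict[tuple[float, float], float]]
) -> dict[tuple[float, float], float]:
    base: dict[tuple[float, float], float] = {}
    for grid in sources:
        base.update(grid)
    return base

historical = {(21.35, -157.97): 14.0, (21.36, -157.97): 12.5}   # depths in meters (notional)
live_survey = {(21.36, -157.97): 12.1}                          # fresher sounding for one cell
base_layer = merge_depth_grids([historical, live_survey])
print(base_layer[(21.36, -157.97)])  # -> 12.1
```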
OceanLens is the brainchild of our experts in naval navigation, VR, and data science. Nirav introduced former submariner Eric R. Jones to Lead Developer Ian Byrnes, who showed off what Eric describes as a “global, zoomable Google Earth-like tool.” Eric thought, “If you can pull that data, can you pull bathymetric data?”
Ian soon came back with a new model, and this one went underwater. With the power of data analytics, the ability to view what was once invisible is limitless.
We’re developing software enhancements to model underwater sound propagation patterns. This means users can create their own scenarios by visualizing standard nautical chart formats in 3D, and can track vessel movements and paths.
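OceanLens’s actual acoustic model isn’t described here, but as a rough flavor of the kind of computation sound propagation modeling rests on, the sketch below evaluates a sound-speed profile using Medwin’s (1975) empirical approximation for the speed of sound in seawater. The function name and the temperature/depth samples are illustrative assumptions.

```python
# Illustrative only: not OceanLens's propagation model.
# Medwin's (1975) empirical approximation for the speed of sound in seawater,
# valid roughly for 0-35 deg C, 0-45 ppt salinity, and depths to ~1,000 m.
def sound_speed(T: float, S: float, z: float) -> float:
    """T: temperature (deg C), S: salinity (ppt), z: depth (m). Returns speed in m/s."""
    return (1449.2 + 4.6 * T - 0.055 * T**2 + 0.00029 * T**3
            + (1.34 - 0.010 * T) * (S - 35.0) + 0.016 * z)

# A notional profile: warm surface water cooling with depth at 35 ppt salinity.
for depth, temp in [(0, 25.0), (100, 18.0), (500, 8.0), (1000, 4.5)]:
    print(f"{depth:5d} m: {sound_speed(temp, 35.0, depth):7.1f} m/s")
```

The resulting speed profile (roughly 1,530 m/s near a warm surface, dropping with colder water at depth) is what determines how sound bends and channels underwater, which is the behavior a propagation model visualizes.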
It doesn’t stop there. Booz Allen is also working with Commander Submarine Forces Pacific (COMSUBPAC) to develop custom submarine exercise scenarios and dataset visualizations of the ocean floor, like what lies beneath Pearl Harbor. You can see OceanLens in action at the COMSUBPAC Innovation Lab there.