Exploring beneath the waves, presented by Science at NASA.

Oceans cover over 70% of Earth's surface and profoundly influence our planet's atmosphere, weather, and climate. However, uncovering the many secrets hidden beneath the ocean's waves presents unique challenges for researchers and requires specialized technology to observe what humans can't see. NASA technologists are developing sensors that can improve measurements of Earth's oceans, creating new instruments to study aspects of our home planet we haven't been able to research before.

Imaging what's below the ocean's surface requires a new instrument capable of improving the information available to scientists. Ved Chirayath, a scientist at NASA's Ames Research Center, says images of objects under the surface are distorted in several ways, making it difficult to gather reliable data about them. Chirayath has a technology solution, called fluid lensing. He says refraction of light by waves distorts the appearance of undersea objects in a number of ways. When a wave crest passes over, an object seems bigger due to the magnifying effect of the wave; when a trough passes over, it looks smaller. Fluid lensing is the first technique to correct for these effects. Without correcting for refraction, it's impossible to determine the exact size or extent of objects under the water's surface, how they're changing over time, or even precisely where they are.

Chirayath developed a special camera called FluidCam that uses fluid lensing to see beneath the waves and capture terabytes' worth of 3D images at 1.5-centimeter resolution, snapping imagery from aboard an unmanned aerial vehicle. The key to fluid lensing lies in the unique software Chirayath developed to analyze the imagery collected by FluidCam.
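The crest-and-trough distortion described above comes from refraction at the air-water interface. As a rough illustration only (not Chirayath's fluid-lensing algorithm, which is far more sophisticated), the sketch below applies two textbook optics results: the apparent-depth effect under a flat surface, and Snell's law for a ray leaving the water. The constant `N_WATER` and the function names are assumptions of this sketch.

```python
import math

N_WATER = 1.33  # approximate refractive index of seawater (assumption)

def apparent_depth(true_depth, n=N_WATER):
    """Flat-water case: viewed from directly above, refraction makes a
    submerged object appear shallower by a factor of 1/n."""
    return true_depth / n

def refract_angle(theta_water, n=N_WATER):
    """Snell's law at the water-to-air boundary:
    n * sin(theta_water) = sin(theta_air).
    Returns the in-air angle (radians) for a ray leaving the water,
    or None if the ray is totally internally reflected and never
    reaches a camera above the surface."""
    s = n * math.sin(theta_water)
    if s >= 1.0:
        return None  # beyond the critical angle (~48.8 deg for water)
    return math.asin(s)
```

A passing wave continuously changes the local surface slope and curvature, so the refraction angle at each point changes frame to frame; a crest acts as a converging element (magnifying the object below) and a trough as a diverging one, which is the effect fluid lensing models and inverts.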
He explains that this software turns what would otherwise be a big problem into an advantage, not only eliminating distortions caused by waves but using their magnification to improve image resolution.

He is focusing FluidCam on coral reefs, whose health has been significantly degraded by pollution, over-harvesting, rising ocean temperatures, and acidification, among other stressors. To understand how the reefs are affected by environmental and human pressures, and to work with resource managers to help identify how to sustain reef ecosystems, researchers need to determine how much healthy reef area exists now. Fluid lensing could help researchers establish a high-resolution baseline of global reef area by augmenting data sets from multiple NASA satellites and airborne instruments. This effort will help identify the effects of environmental changes on these intricate, life-filled ecosystems.

Chirayath and his team designed special software to teach supercomputers how different conditions, such as different sizes of waves, affect the images captured. The computers combine multiple airborne and satellite data sets and identify objects in the images accordingly, distinguishing between what is and is not coral and mapping it with 95% greater accuracy than any previous effort. He says, "We created an observation and training network called NeMO-Net, through which scientists and members of the public can analyze imagery captured by FluidCam and other instruments to help classify and map coral in 3D. This is the database we use to train our supercomputer to perform global classifications."

Chirayath is working toward a space-based FluidCam; from orbit, the camera could map coral reef ecosystems globally and help researchers better understand the overall health of coral reefs. To learn more about the amazing technologies NASA uses to explore our planet, visit science.nasa.gov.