Tuesday, March 2, 2021

Exploring the Ocean Using AI

I'm excited to share with you "Exploring Our Ocean Using Artificial Intelligence," a project that involves a number of individuals from NOAA, the National Geographic Society, CVision AI, the MIT Media Lab, and MBARI. Funding has been graciously provided by the National Geographic Society, NOAA, and the National Science Foundation.

This project really centers around one question: How do we explore a realm as vast and ever-changing as the ocean? With the advent of modern robotics, persistent and distributed observations of processes and life in our ocean are now on the horizon. Artificial intelligence, or AI, has been touted as a crucial pathway to rapid processing of ocean data, which is what we need to scale our observations to the entire ocean.

Adoption of AI in the ocean is limited, though, by the availability of curated data, particularly the underwater imagery and video needed to train these algorithms. Our Ocean Shot is to use AI to automate the processing of underwater imagery and video so that we can fully explore and discover our ocean. We want to realize a vision of ocean exploration and discovery in which distributed observation platforms conduct measurements at unprecedented spatiotemporal scales.

Now, artificial intelligence can help, but in order to train algorithms that can automate the detection and classification of concepts in underwater imagery, we need labeled data. Labeled data requires localization (bounding boxes) and identification (annotation) of the concepts in every training image.
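
To make that concrete, here is a minimal sketch of what a single labeled training example might look like. The field names and values are illustrative assumptions for this post, not FathomNet's or VARS's actual schema.

```python
# Minimal sketch of one labeled training example. The field names and
# values are illustrative assumptions, not an actual FathomNet/VARS schema.
labeled_image = {
    "image_url": "https://example.org/dive_042/frame_1183.png",  # hypothetical
    "width": 1920,
    "height": 1080,
    "localizations": [
        # Every concept in the frame gets a bounding box (localization)
        # plus a concept name (identification).
        {"concept": "Aegina", "x": 412, "y": 230, "width": 96, "height": 88},
        {"concept": "Bathochordaeus", "x": 1010, "y": 612, "width": 210, "height": 180},
    ],
}
```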

But the bottleneck for deploying AI in the ocean is the availability of labeled data, which this project seeks to address. We're able to do this by first leveraging existing data like MBARI's Video Annotation and Reference System (VARS), a database spanning 30 years of annotated, ROV-collected video.

If you search the VARS database for a concept like Aegina, you can return a number of images like the two you see here: the top image shows Aegina clearly as the only object or concept in the frame, whereas the bottom image shows Aegina among a number of other objects. So while some underwater image data may already carry annotations with concept names, in more cluttered scenes like the bottom image, the locations of all concepts within the image are critical for training machine learning algorithms.
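
In code, that distinction looks something like the sketch below: a concept name alone is not enough, and only records that also carry bounding boxes can feed a detector directly. The record structure here is an assumption for illustration, not the actual VARS schema.

```python
# Sketch: separating fully localized records from concept-only ones.
# The record structure is assumed for illustration, not VARS's schema.
annotations = [
    {"concept": "Aegina", "image": "frame_001.png",
     "bounding_boxes": [{"x": 412, "y": 230, "w": 96, "h": 88}]},
    {"concept": "Aegina", "image": "frame_002.png",
     "bounding_boxes": []},  # concept known, but nothing is localized yet
]

# Only records with bounding boxes can train a detector directly; the rest
# still need localizations drawn before they become usable training data.
trainable = [a for a in annotations if a["bounding_boxes"]]
needs_boxes = [a for a in annotations if not a["bounding_boxes"]]
```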

Our solution is FathomNet, a publicly available database for training machine learning algorithms on underwater imagery. By leveraging existing data and providing a repository for new data, we will construct a global dataset that can be used to train algorithms for rapid and widespread exploration and discovery of our ocean.

To date, FathomNet contains 771 concepts and more than 117,000 localizations from MBARI's Video Annotation and Reference System, with additional contributions from NOAA, the National Geographic Society, and other partners planned in 2021. Training AI algorithms on FathomNet data has yielded promising results toward achieving our Ocean Shot.

FathomNet can be found at www.fathomnet.org, and the website will be released in the next few months. Users can search for concepts either directly from the website or through an API. Data can be explored by concept tree, geographic location, or a number of other filters, and contributions can be verified by experts. There are also annotation tools that let users augment existing data or modify the data they contribute to the database.
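
As a rough illustration of the API-based workflow, a concept search might look like the following sketch. The endpoint path, parameters, and response shape are placeholders I'm assuming for illustration; consult www.fathomnet.org for the actual API once it is released.

```python
# Hedged sketch of a concept search over HTTP. The endpoint path and
# response shape are placeholders, not the published FathomNet API.
import requests

def find_images(concept):
    resp = requests.get(
        "https://www.fathomnet.org/api/images",  # placeholder endpoint
        params={"concept": concept},
        timeout=30,
    )
    resp.raise_for_status()
    return resp.json()  # assumed: a list of image records with localizations

for record in find_images("Aegina"):
    print(record)
```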

Algorithms trained on FathomNet have been used to detect and classify animals observed by numerous deep-sea robotic systems at MBARI, NOAA, and the National Geographic Society. The videos on the bottom are examples of footage collected by these platforms with machine learning algorithms applied: NOAA's ROV Deep Discoverer, the National Geographic Society's Deep Sea Camera system, MBARI's MiniROV, and MBARI's i2MAP autonomous underwater vehicle, which is specifically designed for midwater transects.
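
The basic shape of that frame-by-frame pipeline, with a placeholder standing in for whichever trained model is used, might look like this sketch (the model call, file name, and output format are my assumptions, not the teams' actual pipelines):

```python
# Sketch: applying a trained detector to dive footage frame by frame.
# `detect` is a placeholder for a model trained on FathomNet-style data.
import cv2  # pip install opencv-python

def detect(frame):
    # Placeholder inference call; a real model would return
    # (concept, confidence, x, y, w, h) tuples for each detection.
    return []

cap = cv2.VideoCapture("dive_footage.mp4")  # hypothetical file name
while True:
    ok, frame = cap.read()
    if not ok:
        break  # end of video (or file not found)
    for concept, conf, x, y, w, h in detect(frame):
        cv2.rectangle(frame, (x, y), (x + w, y + h), (0, 255, 0), 2)
        cv2.putText(frame, f"{concept} {conf:.2f}", (x, y - 5),
                    cv2.FONT_HERSHEY_SIMPLEX, 0.5, (0, 255, 0), 1)
    cv2.imshow("detections", frame)
    if cv2.waitKey(1) & 0xFF == ord("q"):
        break
cap.release()
cv2.destroyAllWindows()
```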
