Seawatch: AI aids NOAA in fish counting

While artificial intelligence is raising alarms in many sectors of the international community, it is helping the National Oceanic and Atmospheric Administration with one of the most tedious of tasks: counting fish.

Gray Television, which has stations in Anchorage and Fairbanks, reports that with advances in underwater camera technology and machine-learning-based image processing, biologists with NOAA Fisheries have been able to complete some fish surveys in a fraction of the time previously needed.

The survey data is incorporated into stock assessments, which help determine changes in the abundance of fishery stocks and are fundamental to management decisions, including setting quotas.

Traditionally, NOAA has used survey methods including bottom trawling and acoustic surveys. Acoustic surveys give biologists an idea of the amount of fish in the middle of the water column but cannot identify the species present.

Now, camera-based surveys are being tested in a number of situations, NOAA fisheries biologist Kresimir Williams said.

“The one that I’m working with most closely at the moment involves having an actual camera in a trawl net and having the trawl net just sort of aggregate the fish,” Williams said. “Then they can just be let go after that. We’re just collecting images as they go by.”

Williams and a team of researchers working on NOAA’s Automated Image Analysis Strategic Initiative developed a program that uses artificial intelligence and machine learning to rapidly analyze large datasets for marine ecosystems.

“During a summer season we collected somewhere in the neighborhood of 2 to 3 million of the still images. Then if somebody’s going through that, we’re looking at multiple weeks, like four weeks of somebody going through each image,” Williams said. “For that particular application where we have this camera in the net that we’re calling CamTrawl, we have this automated routine that goes through and identifies fish to species and gets an estimate of the fish size, and it can grind through the whole summer of data in less than a day.”
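To make the scale of that processing concrete, here is a minimal sketch of how an automated routine might batch through a season of still frames, assigning each a species label and tallying the results. This is illustrative only, not NOAA's actual CamTrawl software; the model file, frame directory, and species list are hypothetical placeholders.

```python
# Hypothetical sketch of a batch species-classification pass over still images.
# Not NOAA's CamTrawl code; model, paths, and labels are placeholders.
from pathlib import Path
from collections import Counter

import torch
from PIL import Image
from torchvision import transforms

# Assumed: a classifier previously trained on labeled trawl-camera frames.
model = torch.load("camtrawl_species_classifier.pt", map_location="cpu")
model.eval()

SPECIES = ["walleye pollock", "Pacific cod", "rockfish", "other"]  # example labels

preprocess = transforms.Compose([
    transforms.Resize((224, 224)),
    transforms.ToTensor(),
])

species_counts = Counter()

with torch.no_grad():
    for frame_path in Path("camtrawl_frames").glob("*.jpg"):
        image = Image.open(frame_path).convert("RGB")
        logits = model(preprocess(image).unsqueeze(0))
        label = SPECIES[int(logits.argmax(dim=1))]
        species_counts[label] += 1

print(species_counts)
```

Because each frame is processed independently, a routine like this can be run in parallel across a summer's worth of images, which is how millions of stills can be reduced to species counts in under a day rather than weeks of manual review.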

The technology is also being tested to develop surveys in areas that are untrawlable and therefore have seen less research on fish abundance.

Bottom trawl surveys can only be conducted in areas where a net can be dragged across the bottom. Untrawlable areas could be too steep or rocky for the method to work.

According to NOAA, untrawlable areas make up 17 percent of the Gulf of Alaska and 54 percent of federally managed areas around the Aleutian Islands. Williams says those areas are popular habitat for certain species, including rockfish.

“For them, getting a good assessment can be tricky because all you’re getting is only part of the information from the bottom trawl survey and then you’re expected to sort of fill in the gap as to the part that you can’t survey,” Williams said. “So by coming up with a way to sample those areas effectively using camera methodology we can sort of make the picture complete. Bring the bottom trawl survey data together with the camera and have a more robust assessment that captures the whole population, not only parts of it.”

Williams says the advances in machine learning and image analysis are primarily being driven by industries outside of fisheries science.

“A lot of the latest and greatest stuff that’s being developed is about trying to be as fast as you can with image analysis. Think of self-driving cars where you really need real-time information on detecting objects that are in the field of vision. With fisheries work, we kind of have a slightly different angle on it. We’re more interested in precision than we are necessarily in the speed of the algorithm,” he said.

He added that the target accuracy range for the automated fish species and size classification is 80 to 90 percent.
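That target would typically be checked against frames labeled by human experts. The snippet below is a minimal, hypothetical sketch of such a comparison, not NOAA's evaluation method: it simply reports the fraction of frames where the automated species label matches the expert label.

```python
# Hypothetical accuracy check: compare automated labels to expert labels.
def classification_accuracy(predicted_labels, expert_labels):
    """Fraction of frames where the automated label matches the expert label."""
    assert len(predicted_labels) == len(expert_labels)
    matches = sum(p == e for p, e in zip(predicted_labels, expert_labels))
    return matches / len(expert_labels)

# Made-up example labels: 4 of 5 frames agree, i.e. 80 percent accuracy.
predicted = ["pollock", "pollock", "rockfish", "cod", "pollock"]
expert    = ["pollock", "rockfish", "rockfish", "cod", "pollock"]
print(f"Accuracy: {classification_accuracy(predicted, expert):.0%}")
```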

Correction: Last week Seawatch incorrectly reported the dates of the Lower Cook Inlet Board of Fish meeting in Seward. The meeting will be held Dec. 10-13.

Cristy Fry can be reached at realist468@gmail.com.
