This abstract describes an innovative method for monitoring the growth of European seabass (Dicentrarchus labrax) using stereo imaging over recirculating aquaculture system (RAS) tanks. The method begins with rectification and contrast enhancement of the images captured by a binocularly arranged camera pair. Instance segmentation is then performed with a YOLO-v8 deep learning model trained on seabass images to recognize fish silhouettes in the captured images. The centerline of each silhouette contour is computed with a Voronoi algorithm, segmenting the fish into five parts and marking a skeleton of ten points of interest (PoI) along the silhouette boundary. Stereo matching then identifies the same silhouettes in both images of each pair, and the real-world coordinates (x, y, z) of these points are determined, taking the refraction effects of the water into account.
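A minimal Python sketch of the segmentation and centerline steps described above is given below. It assumes the ultralytics YOLO-v8 segmentation API together with OpenCV and SciPy; the checkpoint name "seabass_seg.pt" is hypothetical, and the Voronoi-based medial axis shown here is one common way to realize the described centerline extraction, not necessarily the exact implementation used in this work.

```python
import cv2
import numpy as np
from scipy.spatial import Voronoi
from ultralytics import YOLO


def fish_masks(image_path, model_path="seabass_seg.pt"):
    """Run YOLO-v8 instance segmentation and return one binary mask per detected fish."""
    model = YOLO(model_path)                      # hypothetical seabass-trained checkpoint
    result = model(image_path)[0]
    if result.masks is None:
        return []
    return [m.astype(np.uint8) for m in result.masks.data.cpu().numpy()]


def centerline_poi(mask, n_points=10):
    """Approximate the silhouette centerline with a Voronoi diagram of contour
    points and sample n_points points of interest (PoI) along it."""
    contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_NONE)
    contour = max(contours, key=cv2.contourArea)
    pts = contour.reshape(-1, 2)[::5].astype(float)   # subsample boundary pixels
    vor = Voronoi(pts)
    # Voronoi vertices lying inside the silhouette trace its medial axis (centerline).
    inside = np.array([v for v in vor.vertices
                       if cv2.pointPolygonTest(contour, (float(v[0]), float(v[1])), False) > 0])
    # Order centerline vertices along the fish's main axis and sample PoI evenly.
    centered = inside - inside.mean(axis=0)
    direction = np.linalg.svd(centered, full_matrices=False)[2][0]
    order = np.argsort(centered @ direction)
    idx = np.linspace(0, len(order) - 1, n_points).astype(int)
    return inside[order][idx]
```

In the full pipeline, the PoI extracted from the left and right rectified images would then be stereo-matched and converted to metric (x, y, z) coordinates with a refraction-aware camera model, which is omitted here for brevity.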
Fish images whose PoI do not meet specific consistency criteria are discarded. An allometric regression model, trained on distances between these points measured in top-view images of European seabass, estimates the weight of each fish (a minimal example of such a fit is sketched below). Preliminary results from trials in three experimental RAS tanks at the CIIMAR facilities indicated the effectiveness of the method, with a Mean Absolute Percentage Error (MAPE) below 10% for weight estimation, underlining the potential of this cost-effective and accurate method to improve fish farming management. European seabass has significant value in the European food market, and the success of its aquaculture depends strongly on optimal breeding and rearing conditions, water quality and animal welfare. Conventional growth monitoring methods that rely on physical sampling cause fish stress, increase disease susceptibility and incur high labor costs. In contrast, the proposed non-invasive imaging technique allows continuous, stress-free monitoring that reduces operational costs and enables early detection of diseases through visual inspection and behavioral analysis.
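As a rough illustration of the allometric weight model mentioned above, the sketch below fits a log-log (power-law) regression of weight against a length-like distance derived from the PoI. The single-feature layout, the ordinary least-squares fit and the synthetic numbers are assumptions made for illustration only; they are not the model or the data reported here.

```python
import numpy as np
from sklearn.linear_model import LinearRegression


def fit_allometric_model(poi_distances, weights):
    """Fit log(weight) against log(PoI distances), the classic allometric form W = a * L^b.
    poi_distances: (n_fish, n_features) real-world distances in cm; weights: (n_fish,) in grams."""
    model = LinearRegression()
    model.fit(np.log(poi_distances), np.log(weights))
    return model


def predict_weight(model, poi_distances):
    """Back-transform the log-weight prediction to grams."""
    return np.exp(model.predict(np.log(poi_distances)))


# Illustrative usage with synthetic data (not the experimental measurements).
rng = np.random.default_rng(0)
lengths = rng.uniform(15.0, 35.0, size=(200, 1))          # body length in cm
weights = 0.012 * lengths[:, 0] ** 3.0                     # synthetic cubic allometry
weights *= rng.normal(1.0, 0.05, size=200)                 # measurement noise
model = fit_allometric_model(lengths, weights)
print(predict_weight(model, np.array([[25.0]])))           # estimated weight at 25 cm
mape = np.mean(np.abs(predict_weight(model, lengths) - weights) / weights) * 100
print(f"MAPE on synthetic data: {mape:.1f}%")              # evaluation metric used in the abstract
```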
Despite challenges such as fish movement, water refraction and variable image quality (luminosity, reflections, turbidity), the proposed method integrates advanced deep learning, instance segmentation, filtering and stereo computer vision techniques that overcome these problems. It enables aquaculture producers to monitor fish stock growth efficiently, predict biomass and weight distribution at harvest, improve planning and sales negotiations, increase economic returns and support animal welfare, leading to higher product quality standards.