
OLI Grant: Quantification of Zooplankton Viability Using Image Analysis of Nonrigid Motion: An Approach for Automation of Critical Assays Used For Assessment of Ballast Water Treatment Technology Effectiveness

Grant Funded: 2006

The identification of ballast water transfer as a major pathway for the introduction of invasive aquatic species into coastal oceans and major freshwater systems worldwide has created the technological challenge of neutralizing that pathway. To encourage ship owners to participate in the development of Ballast Water Treatment (BWT) systems, and to provide a vehicle for acquiring data that demonstrate the efficacy of various treatment strategies, the United States Coast Guard (USCG) has initiated the Shipboard Technology Evaluation Program (STEP) for onboard ballast water treatment. Entry into STEP requires that the proposed BWT system be likely to meet Navigation and Vessel Inspection Circular (NVIC) and International Maritime Organization (IMO) standards for removal or inactivation of ballast water organisms, and that a sound experimental program be developed that unequivocally and quantitatively demonstrates the effectiveness of the installed system. Of the roughly four BWT systems assessed to date, none has been admitted into STEP. Part of the problem is that assessment of zooplankton viability, a critical assay, is conducted with labor-intensive manual microscope techniques that preclude the statistical rigor needed to prove that the BWT system under study will meet the required standards. It is essential that the zooplankton viability assay be automated.

We propose to develop a video recording system coupled with image analysis that will permit quantification of zooplankton viability in approximately one-tenth the time required for manual assays. As in the manual assays, zooplankton motility serves as the descriptor of viability. The project entails development of a dark-field illuminated stage that optimally presents zooplankton samples for video recording of their movement responses to global electro- or piezoelectric stimulation, and development of the image analysis routines for enumeration and viability scoring of the samples.

For detection of motion we will explore two approaches: 1) Model each zooplankter as a single deformable object, in which the changing properties of periphery-hugging "snakes" (active contours) are quantified as a feature vector that is both scale- and rotation-invariant. If the movement of live zooplankton yields feature vectors markedly different from those of dead zooplankton, the two classes can be distinguished by a pattern classifier. 2) Where the differences between the two classes are more subtle, model each zooplankter as an articulated object consisting of limbs and joint angles. The feature set then comprises the feature vectors of the deformable limbs together with all joint angles, providing a finer-grained description of the articulated motion and, consequently, a more effective pattern classifier.
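As an illustration only, and not part of the proposed system, the sketch below shows one way the first approach might be realized in Python: each animal's contour in each video frame is reduced to a scale- and rotation-invariant shape descriptor, the frame-to-frame variation of that descriptor becomes the feature vector, and a generic pattern classifier separates motile (live) from immotile (dead) animals. All function names, parameters, and data referenced here are hypothetical assumptions, not project specifications.

import numpy as np
from sklearn.svm import SVC  # stand-in for "a pattern classifier"


def contour_descriptor(contour, n_harmonics=8):
    """Scale- and rotation-invariant descriptor of one closed contour.

    contour: (N, 2) array of boundary points, e.g. a fitted "snake",
    with N >= 2 * n_harmonics points sampled around the periphery.
    Fourier magnitudes of the centroid-distance signature are used:
    discarding phase removes rotation and starting-point effects,
    and normalizing by the mean radius removes scale.
    """
    centroid = contour.mean(axis=0)
    radii = np.linalg.norm(contour - centroid, axis=1)
    radii = radii / (radii.mean() + 1e-12)        # scale invariance
    spectrum = np.abs(np.fft.rfft(radii))         # rotation invariance
    return spectrum[1:n_harmonics + 1]


def motion_feature(contours_over_time):
    """Feature vector for one animal: mean frame-to-frame shape change.

    contours_over_time: list of (N, 2) contours, one per video frame
    recorded after the global stimulus.
    """
    descriptors = np.array([contour_descriptor(c) for c in contours_over_time])
    return np.abs(np.diff(descriptors, axis=0)).mean(axis=0)


# Hypothetical training and scoring with animals of known state:
# X = np.array([motion_feature(track) for track in labelled_tracks])
# y = np.array(labels)                 # 1 = live/motile, 0 = dead/immotile
# classifier = SVC(kernel="rbf").fit(X, y)
# viability = classifier.predict([motion_feature(new_track)])

In a real pipeline, the contours would come from the active-contour ("snake") fitting step described above; any segmentation routine that returns an ordered boundary could be substituted, and the support vector machine shown here is only one of many possible pattern classifiers.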

Originally published: February 1, 2006