Multispectral Submersion Classification of Water Soldier in Drone Imagery

Status: In development (started January 2026)

In 2026, Saiwa continues its collaboration with Ducks Unlimited Canada (DUC) to further enhance drone-based monitoring of invasive aquatic vegetation. Building on last year’s high-accuracy RGB-based Water Soldier detection pipeline, this new phase focuses specifically on determining whether each detected plant is located above or below the water surface.

About Ducks Unlimited Canada (DUC)

Ducks Unlimited Canada is a leading conservation organization dedicated to protecting and restoring wetlands and associated ecosystems across Canada. Through science-driven habitat management and strategic partnerships, DUC advances innovative monitoring solutions for sustainable environmental stewardship.

Overview

In the previous phase of the project, Saiwa developed a high-resolution RGB-based detection model capable of identifying Stratiotes aloides (Water Soldier) in drone orthophotos. However, while RGB imagery provides sufficient spatial detail for detection, it does not reliably indicate whether a detected plant is positioned above or below the water surface. Spectral differences related to water absorption and reflectance—particularly in the Near-Infrared (NIR) range—provide additional discriminatory power for this classification task.
The 2026 project therefore introduces multispectral analysis not for detection, but specifically for submersion status classification.

Problem Statement

Although high-resolution RGB orthophotos enable accurate detection of Water Soldier regardless of its position relative to the water surface, determining vertical status (above vs. below water) remains challenging using RGB data alone.
Key challenges include:

  • Limited spectral contrast in RGB bands between emergent and shallowly submerged vegetation
  • Subtle visual differences under varying illumination conditions
  • The need for pixel-level correspondence between RGB detections and NIR measurements

A robust framework is therefore required to combine precise RGB-based detection with spectral cues from NIR imagery.

Project Goals

The primary objectives of this phase are:

  1. Maintain high-precision RGB-based detection of all visible Water Soldier specimens
  2. Accurately align RGB orthophotos with NIR imagery through geometric registration
  3. Extract spectral indicators from NIR data to distinguish emergent from submerged vegetation
  4. Develop a classification model that assigns a submersion label (above-water / below-water) to each detected instance

Solution Approach

1. RGB-Based Detection (Baseline)

All plant detection continues to rely exclusively on high-resolution RGB orthophotos. The trained deep learning model identifies Water Soldier instances regardless of submersion depth, leveraging clear-water conditions under which even submerged plants remain visible.
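Because an orthophoto is usually far larger than a detector's input size, inference is typically run over overlapping tiles whose detections are then mapped back into orthophoto coordinates. A minimal sketch of that tiling loop, assuming a hypothetical `detect_water_soldier` placeholder in lieu of the trained model (which is not shown in this document):

```python
import numpy as np

def iter_tiles(image, tile=512, overlap=64):
    """Yield (x0, y0, window) tiles over a large orthophoto.

    Overlapping tiles reduce the chance of cutting a plant at a tile
    border; detections are later shifted back to orthophoto coordinates.
    """
    h, w = image.shape[:2]
    step = tile - overlap
    for y0 in range(0, max(h - overlap, 1), step):
        for x0 in range(0, max(w - overlap, 1), step):
            yield x0, y0, image[y0:y0 + tile, x0:x0 + tile]

def detect_water_soldier(tile_array):
    """Placeholder for the trained detector (not public); would return
    bounding boxes [(x, y, w, h, score), ...] in tile coordinates."""
    return []

ortho = np.zeros((1200, 1600, 3), dtype=np.uint8)  # synthetic RGB orthophoto

detections = []
for x0, y0, t in iter_tiles(ortho):
    for (x, y, bw, bh, s) in detect_water_soldier(t):
        # Shift tile-local boxes back into orthophoto coordinates.
        detections.append((x0 + x, y0 + y, bw, bh, s))

n_tiles = sum(1 for _ in iter_tiles(ortho))
```

The tile size, overlap, and detector interface above are illustrative assumptions, not project specifications.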

2. Cross-Sensor Image Registration

To incorporate spectral information, RGB orthophotos are geometrically aligned with NIR imagery using feature-matching and homography-based transformation techniques. This ensures pixel-level correspondence between detection outputs and multispectral data.
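As a self-contained sketch of this step, assuming matched keypoints between the two layers are already available: the homography can be estimated from point correspondences with the Direct Linear Transform (the same estimation problem that library routines such as OpenCV's `findHomography` solve, typically with RANSAC for outlier rejection), and RGB detection coordinates then mapped into the NIR frame. All coordinates below are synthetic:

```python
import numpy as np

def estimate_homography(src_pts, dst_pts):
    """Estimate a 3x3 homography from >= 4 correspondences via DLT."""
    A = []
    for (x, y), (u, v) in zip(src_pts, dst_pts):
        A.append([-x, -y, -1, 0, 0, 0, u * x, u * y, u])
        A.append([0, 0, 0, -x, -y, -1, v * x, v * y, v])
    # The homography is the null vector of A (last row of V^T in the SVD).
    _, _, Vt = np.linalg.svd(np.asarray(A, dtype=float))
    H = Vt[-1].reshape(3, 3)
    return H / H[2, 2]

def warp_points(H, pts):
    """Map (N, 2) points through H using homogeneous coordinates."""
    pts_h = np.hstack([pts, np.ones((len(pts), 1))])
    mapped = pts_h @ H.T
    return mapped[:, :2] / mapped[:, 2:3]

# Synthetic example: the NIR frame is a slightly scaled, shifted RGB frame.
H_true = np.array([[1.02, 0.0, 12.0],
                   [0.0, 1.02, -8.0],
                   [0.0, 0.0, 1.0]])
rgb_pts = np.array([[0, 0], [500, 0], [500, 400], [0, 400], [250, 200]], float)
nir_pts = warp_points(H_true, rgb_pts)  # "matched keypoints" in the NIR layer

H_est = estimate_homography(rgb_pts, nir_pts)

# An RGB detection centroid mapped into NIR pixel coordinates:
detection_rgb = np.array([[120.0, 340.0]])
detection_nir = warp_points(H_est, detection_rgb)
```

In practice the correspondences come from a feature matcher and are noisy, so a robust estimator is used instead of this exact-fit DLT; the sketch shows only the geometric core.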

3. Spectral Feature Extraction for Submersion Classification

For each RGB-detected instance, spectral features are extracted from the aligned NIR layer. Because water strongly absorbs NIR radiation, emergent vegetation exhibits significantly higher reflectance than submerged vegetation. These differences form the basis of classification.
Relevant features may include:

  • NIR reflectance intensity
  • RGB–NIR band relationships
  • Vegetation indices incorporating NIR
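A minimal sketch of per-detection feature extraction on synthetic aligned bands; the band values, masks, and the use of NDVI, (NIR - Red) / (NIR + Red), as the example vegetation index are illustrative assumptions, not project code:

```python
import numpy as np

# Synthetic aligned reflectance bands in [0, 1]; in practice these come from
# the registered orthophotos. Water absorbs NIR strongly, so a submerged
# patch shows depressed NIR reflectance relative to an emergent one.
red = np.full((100, 100), 0.08)
nir = np.full((100, 100), 0.05)   # open water: very low NIR
nir[10:30, 10:30] = 0.55          # emergent Water Soldier patch
nir[60:80, 60:80] = 0.12          # submerged patch: NIR attenuated by water

def patch_features(nir_band, red_band, mask):
    """Mean NIR reflectance and mean NDVI inside one detection mask."""
    nir_vals = nir_band[mask]
    red_vals = red_band[mask]
    ndvi = (nir_vals - red_vals) / (nir_vals + red_vals + 1e-9)
    return nir_vals.mean(), ndvi.mean()

emergent_mask = np.zeros((100, 100), bool)
emergent_mask[10:30, 10:30] = True
submerged_mask = np.zeros((100, 100), bool)
submerged_mask[60:80, 60:80] = True

nir_e, ndvi_e = patch_features(nir, red, emergent_mask)
nir_s, ndvi_s = patch_features(nir, red, submerged_mask)
```

Both the raw NIR mean and the index separate the two cases here, which is the contrast the classifier in the next step relies on.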

4. Submersion Status Classification

A supervised machine learning classifier is trained using labeled data (field validation and expert annotations) to determine whether each detected specimen is above or below the water surface.
The final output preserves the original RGB detection geometry while adding a submersion status attribute.
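As an illustration under assumed features and labels (the project's actual classifier and training data are not specified here), a small logistic-regression model can be fit to per-detection spectral features by gradient descent:

```python
import numpy as np

def train_logistic(X, y, lr=0.5, steps=2000):
    """Fit a logistic-regression submersion classifier by gradient descent.

    X: (N, d) per-detection spectral features; y: 1 = above water, 0 = below.
    """
    w = np.zeros(X.shape[1])
    b = 0.0
    for _ in range(steps):
        p = 1.0 / (1.0 + np.exp(-(X @ w + b)))  # predicted probabilities
        w -= lr * (X.T @ (p - y)) / len(y)       # gradient of the log loss
        b -= lr * (p - y).mean()
    return w, b

def predict(w, b, X):
    return (1.0 / (1.0 + np.exp(-(X @ w + b))) >= 0.5).astype(int)

# Hypothetical training set: [mean NIR reflectance, mean NDVI] per detection,
# labelled 1 (emergent) / 0 (submerged) as field validation would provide.
rng = np.random.default_rng(0)
emergent = np.column_stack([rng.normal(0.50, 0.05, 40),
                            rng.normal(0.70, 0.05, 40)])
submerged = np.column_stack([rng.normal(0.12, 0.04, 40),
                             rng.normal(0.20, 0.05, 40)])
X = np.vstack([emergent, submerged])
y = np.concatenate([np.ones(40), np.zeros(40)])

w, b = train_logistic(X, y)
accuracy = (predict(w, b, X) == y).mean()
```

The predicted label would then be attached to each detection polygon as the submersion-status attribute, leaving the RGB geometry untouched.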

Expected Impact

By separating detection (RGB-based) from submersion classification (NIR-assisted), this approach maintains high spatial accuracy while introducing reliable vertical status estimation. The enhanced workflow enables more informed management decisions, supports targeted interventions, and strengthens long-term monitoring of invasive aquatic vegetation.

Contact

For technical details, implementation inquiries, or collaboration opportunities, please contact us at info@saiwa.ai.