Multi-Sensor Smoke and Fire Detection and Tracking using Self-Supervised Deep Learning

Segmentation, Instance Tracking, and data Fusion Using multi-SEnsor imagery (SIT-FUSE) uses self-supervised machine learning (ML) to segment instances of objects in single- and multi-sensor scenes with minimal human intervention, even in low- and no-label environments. It can be applied to both image-like and non-image-like data. Currently, this technology is being used with remotely sensed Earth data to identify objects including:

  • Wildfires and smoke plumes
  • Harmful algal blooms and their severity
  • Palm oil farms
  • Dust and volcanic ash plumes
  • Inland water bodies
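To make the self-supervised, low-label workflow concrete, here is a minimal sketch of the general pattern: cluster pixels without labels, then separate the resulting class mask into distinct object instances. This is an illustrative toy (tiny k-means plus connected-component labeling on synthetic data), not SIT-FUSE's actual models or API; all names are hypothetical.

```python
import numpy as np
from collections import deque

rng = np.random.default_rng(0)

# Synthetic 2-band "scene": dim background plus two bright blobs
# (stand-ins for, e.g., thermal-anomaly pixels).
scene = rng.normal(0.1, 0.02, size=(16, 16, 2))
scene[2:5, 2:5] += 1.0     # blob 1
scene[9:12, 10:14] += 1.0  # blob 2

# Step 1: unsupervised pixel clustering (tiny k-means, k=2) --
# a crude stand-in for learning pixel groupings without labels.
feats = scene.reshape(-1, 2)
centroids = feats[[feats.sum(1).argmin(), feats.sum(1).argmax()]]
for _ in range(10):
    d = np.linalg.norm(feats[:, None] - centroids[None], axis=2)
    assign = d.argmin(1)
    centroids = np.array([feats[assign == k].mean(0) for k in (0, 1)])
labels = assign.reshape(16, 16)
fg = labels == centroids.sum(1).argmax()  # brighter cluster = "detection"

# Step 2: connected-component labeling (4-connectivity BFS) turns the
# class mask into separate object instances.
instances = np.zeros_like(labels)
next_id = 0
for i in range(16):
    for j in range(16):
        if fg[i, j] and instances[i, j] == 0:
            next_id += 1
            q = deque([(i, j)])
            instances[i, j] = next_id
            while q:
                y, x = q.popleft()
                for dy, dx in ((1, 0), (-1, 0), (0, 1), (0, -1)):
                    ny, nx = y + dy, x + dx
                    if 0 <= ny < 16 and 0 <= nx < 16 and fg[ny, nx] \
                            and instances[ny, nx] == 0:
                        instances[ny, nx] = next_id
                        q.append((ny, nx))

print("instances found:", next_id)  # -> 2
```

The same two-stage pattern (pixel-level grouping, then instance extraction) applies regardless of how many sensors contribute bands to the feature vector.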

SIT-FUSE’s innovative multi-sensor fire and smoke segmentation precisely detects anomalous observations from instruments with varying spatial, spectral, and temporal resolutions. This capability creates a sensor web by incorporating observations from multiple satellite-based and suborbital missions. The ML framework’s output also facilitates smoke plume and fire front tracking, a capability currently under development by the SIT-FUSE team.
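One common way to build tracking on top of per-scene instance masks is to link instances across consecutive frames by mask overlap. The sketch below is an assumed illustration of that idea (greedy intersection-over-union matching), not the SIT-FUSE team's actual tracking method; `match_instances` and the IoU threshold are hypothetical.

```python
import numpy as np

def iou(a, b):
    """Intersection-over-union of two boolean masks."""
    inter = np.logical_and(a, b).sum()
    union = np.logical_or(a, b).sum()
    return inter / union if union else 0.0

def match_instances(prev_masks, curr_masks, thresh=0.3):
    """Greedily link current-frame instances to previous-frame
    instances whose masks overlap by at least `thresh` IoU."""
    links, used = {}, set()
    for ci, cm in enumerate(curr_masks):
        scores = [(iou(pm, cm), pi) for pi, pm in enumerate(prev_masks)
                  if pi not in used]
        if scores:
            best, pi = max(scores)
            if best >= thresh:
                links[ci] = pi
                used.add(pi)
    return links  # curr index -> prev index; unmatched = new track

# Toy example: one plume drifts one pixel right between frames,
# while a second, unrelated detection appears elsewhere.
f0 = np.zeros((8, 8), bool); f0[2:4, 2:4] = True
f1 = np.zeros((8, 8), bool); f1[2:4, 3:5] = True
f2 = np.zeros((8, 8), bool); f2[6:8, 0:2] = True
print(match_instances([f0], [f1, f2]))  # -> {0: 0}
```

Here the drifted plume (IoU = 1/3) links back to its previous-frame track, while the new detection starts a fresh track; a real tracker would also handle splits, merges, and differing sensor resolutions.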

The project is currently funded by NASA's CSDA, TEMPO, and Disasters programs.

Organization
Spatial Informatics Group, LLC.
Contact

Nick LaHaye

Members


Nick LaHaye

Research Data Scientist, Spatial Informatics Group, LLC (Pasadena, California)

AI/ML, Remote Sensing, Smoke Modeling, Software Development