Training data is required to develop machine learning techniques for rapid detection of flood boundaries and measurement of water depth. The training data for this project was acquired using sensors mounted on Unmanned Aerial Vehicles (UAVs, also known as drones) and satellites; specifically, optical and Synthetic Aperture Radar (SAR) sensors, which can be carried on both platforms. The project has generated the following outputs:
- UAV optical training data
- Raw UAV and satellite optical and SAR data used for the project (note that Sentinel optical and SAR satellite data are freely available from the European Space Agency)
- Processed UAV and satellite optical and SAR data (i.e., classified images for flood extent)
- 2D flood extent shapefile
- A digital flood surface model, i.e., a raster model whose pixel values represent floodwater depth (see the depth-raster sketch after this list)
- Supporting GIS data used for the project, such as wetland maps, topography data, and land use maps
- A trained deep learning model for UAV and satellite optical and SAR data, together with model code and algorithms for rapid 3D extraction of flood extent in inland and coastal areas and for floodwater depth measurement at different scales (a minimal segmentation sketch also follows this list).
- Training datasets and geospatial data (shapefiles and rasters) generated during the project will be transitioned to NOAA partners along with the machine learning code.
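The digital flood surface model listed above can be illustrated with a short sketch: differencing a flood water-surface-elevation raster against a terrain DEM yields a raster whose pixel values are floodwater depth. This is not the project's delivered code; the file names, band layout, and grid-alignment assumption below are illustrative only.

```python
"""Illustrative sketch: derive a floodwater-depth raster from a (hypothetical)
water-surface-elevation raster and a terrain DEM on the same grid."""
import numpy as np
import rasterio

# Hypothetical inputs, assumed to share the same grid, CRS, and extent.
with rasterio.open("flood_surface_elevation.tif") as wse_src, \
     rasterio.open("terrain_dem.tif") as dem_src:
    wse = wse_src.read(1).astype("float32")   # water-surface elevation (m)
    dem = dem_src.read(1).astype("float32")   # ground elevation (m)
    profile = dem_src.profile.copy()

# Floodwater depth = water surface minus ground; negative values (dry cells) clipped to 0.
depth = np.clip(wse - dem, 0, None)

# Write the depth raster; pixel values now represent floodwater depth in metres.
profile.update(dtype="float32", count=1)
with rasterio.open("floodwater_depth.tif", "w", **profile) as dst:
    dst.write(depth, 1)
```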
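For the deep learning deliverable, the sketch below shows only the general shape of a binary flood-extent segmentation network for multi-band (optical plus SAR) input. It is not the project's trained model or architecture; the channel count, layer sizes, and tile size are assumptions made for the example.

```python
"""Minimal, illustrative encoder-decoder for flood-extent segmentation (not the project model)."""
import torch
import torch.nn as nn

class TinyFloodSegNet(nn.Module):
    def __init__(self, in_channels: int = 5):   # e.g., RGB + two SAR polarisations (assumed)
        super().__init__()
        self.encoder = nn.Sequential(
            nn.Conv2d(in_channels, 32, 3, padding=1), nn.ReLU(),
            nn.MaxPool2d(2),
            nn.Conv2d(32, 64, 3, padding=1), nn.ReLU(),
        )
        self.decoder = nn.Sequential(
            nn.ConvTranspose2d(64, 32, 2, stride=2), nn.ReLU(),
            nn.Conv2d(32, 1, 1),                 # one-channel logit: flood / non-flood
        )

    def forward(self, x):
        return self.decoder(self.encoder(x))

# Example forward pass on a random 256 x 256 tile with 5 input bands.
model = TinyFloodSegNet(in_channels=5)
logits = model(torch.randn(1, 5, 256, 256))      # -> (1, 1, 256, 256)
flood_mask = torch.sigmoid(logits) > 0.5         # boolean flood-extent mask
```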
Peer-Reviewed Publications List based on funding provided by award NOAA-OAR-WPO-2021-2006592
Fawakherji, M., and Hashemi-Beni, L. Flood Detection and Mapping through Multi-Resolution Sensor Fusion: Integrating UAV Optical Imagery and Satellite SAR Data. IEEE Access (under review).
Fawakherji, M., Blay, J., Anokye, M., Hashemi-Beni, L., and Dorton, J. DeepFlood: High-Resolution Dataset for Accurate Flood Mapping and Segmentation. Scientific Data. (under review).
Fawakherji, M., and Hashemi-Beni, L. Multi-Head Encoder-Decoder Deep Learning Architecture for Flood Segmentation and Mapping through Multi-Sensor Data Fusion. IGARSS 2024. (accepted, in press).
Blay, J., Fawakherji, M., and Hashemi-Beni, L. Flood Impact Risk Mapping in Settlement Areas from a 3D Perspective: A Case Study of Hurricane Matthew. IGARSS 2024. (accepted, in press).
Anokye, M., Fawakherji, M., and Hashemi-Beni, L. Flood Resilience through Advanced Wetland Prediction. IGARSS 2024. (accepted, in press).
Salem, A., and Hashemi-Beni, L. Inundated Vegetation Mapping Using SAR Data: A Comparison of Polarization Configurations of UAVSAR L-Band and Sentinel C-Band. Remote Sensing. 2022; 14(24):6374. https://doi.org/10.3390/rs14246374
Gebrehiwot, A., and Hashemi-Beni, L. 3D Inundation Mapping: A Comparison Between Deep Learning Image Classification and Geomorphic Flood Index Approaches. Frontiers in Remote Sensing, Sec. Data Fusion and Assimilation, June 2022.
Peer-Reviewed Presentations and Posters List based on funding provided by award NOAA-OAR-WPO-2021-2006592
Fawakherji, M., & Hashemi-Beni, L. Flood Detection, and Mapping Through Sensor Fusion: A Comparative Study of Multi-Sensor Data Integration with UAV Optical Imagery. AMS 2024.
Anokye, M., Fawakherji, M., & Hashemi-Beni, L. Advanced Wetland Prediction for Flood Resilience in Southern North Carolina. ASPRS 2024.
Blay, J., Fawakherji, M., and Hashemi-Beni, L. A 3D Perspective of Analyzing the Effect of Hurricane Matthew in Small Communities: A Case Study of Grifton, NC. ASPRS 2024.
Fawakherji, M., and Hashemi-Beni, L. Multichannel Deep Learning for Flood Detection and Mapping. AGU 2023.
Salem, A., Fawakherji, M., and Hashemi-Beni, L. Multimodal Data Fusion for Flood Monitoring: A Case Study of Hurricane Florence. AGU 2023.
GitHub Repository for Training Datasets
Add Info