Authors: Zachary Bortolot*, James Madison University
Topics: Remote Sensing, Land Use and Land Cover Change
Keywords: Radar, SAR, panfusion, visualization, change
Session Type: Paper
Radar imagery can provide important information for decision makers, particularly because radar can collect meaningful data at night and under cloud cover. Unfortunately, radar imagery is difficult for non-experts to interpret, since radar returns are governed by very different factors than reflectance at optical wavelengths. Compounding the problem, radar data presented as panchromatic images strike many users as uncompelling, while color presentations (e.g., assigning different polarization combinations to the red, green, and blue channels of a display) can lead to interpretation mistakes because objects often appear in colors different from those in natural color imagery. One solution is to fuse color information from optical imagery with the panchromatic radar imagery, and multiple techniques exist for doing this. However, these techniques assume that the landscape is identical in the optical and radar imagery, which is often unrealistic; violating this assumption generally produces inaccurate and potentially misleading colors. This research presents a panfusion approach for radar imagery that identifies areas of land cover change and uses a texture-based approach to assign realistic colors to those areas from unchanged parts of the image. Post-fusion filtering is performed to reduce artifacts. Tests conducted in the Mississippi Delta and central South Carolina, fusing recent UAVSAR radar data (collected in 2016 and 2018, respectively) with National Agriculture Imagery Program orthophotographs (collected in 2007 and 2009, respectively), show promising results.
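The abstract does not specify the fusion algorithm, so the following is only a minimal sketch of the general idea: an IHS-style fusion that keeps the optical image's chromaticity while substituting radar brightness for intensity, plus a stand-in for the texture-based color assignment in changed areas (here, borrowing color offsets from the unchanged pixel whose local radar texture, measured as windowed standard deviation, is most similar). All function names, the texture measure, and the change mask input are illustrative assumptions, not the authors' method.

```python
import numpy as np

def local_std(img, patch=3):
    """Local texture proxy: standard deviation over a patch x patch window."""
    pad = patch // 2
    padded = np.pad(img, pad, mode="edge")
    windows = np.lib.stride_tricks.sliding_window_view(padded, (patch, patch))
    return windows.std(axis=(-2, -1))

def panfuse_sketch(optical_rgb, radar_pan, change_mask, patch=3):
    """Hypothetical color/radar fusion sketch (NOT the paper's algorithm).

    optical_rgb: (H, W, 3) floats in [0, 1]
    radar_pan:   (H, W) panchromatic radar brightness, floats in [0, 1]
    change_mask: (H, W) bools, True where land cover changed between images
    """
    # IHS-style step: keep optical color offsets, swap in radar intensity.
    intensity = optical_rgb.mean(axis=2, keepdims=True)
    chroma = optical_rgb - intensity
    fused = np.clip(radar_pan[..., None] + chroma, 0.0, 1.0)

    # For changed pixels the optical colors are stale, so borrow chroma
    # from the unchanged pixel with the most similar radar texture --
    # an assumed stand-in for the texture-based color assignment.
    texture = local_std(radar_pan, patch)
    unchanged = ~change_mask
    src_texture = texture[unchanged]
    src_chroma = chroma[unchanged]
    for i, j in zip(*np.nonzero(change_mask)):
        k = np.argmin(np.abs(src_texture - texture[i, j]))
        fused[i, j] = np.clip(radar_pan[i, j] + src_chroma[k], 0.0, 1.0)
    return fused
```

In practice the change detection, texture descriptor, and post-fusion filtering would each be far more elaborate than this single-feature nearest-neighbor lookup, but the sketch shows how color can come from the optical scene while brightness and texture come from the radar scene.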