Monitoring of Urban Changes with multi-modal Sentinel 1 and 2 Data in Mariupol, Ukraine, in 2022/23

11 Aug 2023 · Georg Zitzlsberger, Michal Podhoranyi

The ability to continuously monitor urban changes is of significant socio-economic interest, for example to detect trends in urban expansion or to track the vitality of urban areas. Especially in active conflict zones or disaster areas, such insights provide valuable information on the current situation. However, these areas are often subject to limited data availability in space and time. We built on our previous work, which used a transferred Deep Neural Network (DNN) operating on multi-modal Sentinel 1 and 2 data. In the current study, we demonstrated and discussed its applicability to monitoring the conflict zone of Mariupol, Ukraine, with high-temporal-resolution Sentinel time series for the years 2022/23. Transferring to that conflict zone was challenging due to the limited availability of recent Very High Resolution (VHR) data. The current work had two objectives. First, we showed that transfer learning with older, publicly available VHR data is sufficient; relaxing the time constraints guarantees that more, and less expensive, data are available. Second, in an ablation study, we analyzed the effects of a loss of observations to demonstrate the resilience of our method, which was of particular interest because Sentinel 1B malfunctioned shortly before the studied conflict. Our study demonstrated that urban change monitoring is possible for active conflict zones after transferring with older VHR data. It also indicated that, despite the multi-modal input, our method depends more on optical multispectral than on Synthetic Aperture Radar (SAR) observations, but is resilient to a loss of observations.
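
To make the setup concrete, the following is a minimal sketch, assuming PyTorch and synthetic data, of how co-registered Sentinel 1 (SAR) and Sentinel 2 (multispectral) time series could be fused by a small multi-modal network, and of how an ablation on the loss of observations could be simulated by dropping whole acquisitions (e.g., a thinned-out SAR series after the Sentinel 1B failure). The toy network, tensor shapes, and helper names are illustrative assumptions, not the authors' architecture.

```python
# Minimal, illustrative sketch (not the authors' code): fusing multi-modal
# Sentinel 1/2 time series and simulating an observation-loss ablation.
# All names, shapes, and the toy network below are assumptions.
import numpy as np
import torch
import torch.nn as nn

T, H, W = 30, 32, 32           # time steps and tile size (illustrative)
S2_BANDS, S1_BANDS = 10, 2     # multispectral bands vs. SAR polarisations (VV/VH)

# Synthetic stand-ins for co-registered Sentinel 2 and Sentinel 1 time series.
s2 = torch.rand(1, T, S2_BANDS, H, W)   # optical multispectral stack
s1 = torch.rand(1, T, S1_BANDS, H, W)   # SAR backscatter stack

def drop_observations(x: torch.Tensor, keep_ratio: float) -> torch.Tensor:
    """Ablation helper: randomly remove whole acquisitions from a time series,
    e.g. to mimic a reduced SAR revisit rate after the Sentinel 1B failure."""
    t = x.shape[1]
    keep = np.sort(np.random.choice(t, size=max(1, int(t * keep_ratio)), replace=False))
    return x[:, torch.as_tensor(keep)]

class TinyChangeNet(nn.Module):
    """Toy multi-modal model: one encoder per modality, fused change-map head."""
    def __init__(self):
        super().__init__()
        self.enc_s2 = nn.Conv3d(S2_BANDS, 8, kernel_size=3, padding=1)
        self.enc_s1 = nn.Conv3d(S1_BANDS, 8, kernel_size=3, padding=1)
        self.head = nn.Conv2d(16, 1, kernel_size=1)

    def forward(self, s2, s1):
        # (B, T, C, H, W) -> (B, C, T, H, W) so Conv3d runs over time and space,
        # then average over time so variable-length series yield fixed features.
        f2 = self.enc_s2(s2.permute(0, 2, 1, 3, 4)).mean(dim=2)
        f1 = self.enc_s1(s1.permute(0, 2, 1, 3, 4)).mean(dim=2)
        return torch.sigmoid(self.head(torch.cat([f2, f1], dim=1)))

model = TinyChangeNet()
full = model(s2, s1)                              # all observations available
ablated = model(s2, drop_observations(s1, 0.5))   # half the SAR acquisitions lost
print(full.shape, ablated.shape)                  # torch.Size([1, 1, 32, 32]) each
```

Averaging over the time dimension is only a placeholder for whatever temporal aggregation the real model uses; it is chosen here so that series with missing acquisitions still produce comparable change maps, which is what an observation-loss ablation needs.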


Datasets


Introduced in the Paper:

urban_change_monitoring_mariupol_ua

