Current satellite remote sensing systems have to trade off spatial, temporal, and spectral resolution. Fusing the complementary information contained in multi-source remote sensing observations is an effective way to broaden the applications of remote sensing data. The SENDIMAGE laboratory has carried out a series of studies on the theories and methods of multi-source information fusion, and took the lead in proposing the concept and framework of spatio-temporal-spectral integrated fusion (2012). In addition, the laboratory has conducted research into point-surface fusion of remote sensing observations and station data. One of these papers won the ERDAS Award for Best Scientific Paper in Remote Sensing, presented by the American Society for Photogrammetry and Remote Sensing (ASPRS).
- Multi-view Fusion (Super-resolution)
The SENDIMAGE laboratory has developed video super-resolution methods for complex, changing scenes, focusing on reconstruction accuracy, computational efficiency, and self-adaptation. We have also studied multi-temporal and multi-angle super-resolution of remote sensing images.
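As a rough illustration of the multi-frame principle behind super-resolution (not the laboratory's published algorithms), the following sketch combines several low-resolution frames of the same scene onto a finer grid, assuming the sub-pixel shifts between frames are already known integers on the high-resolution grid; the function name and inputs are hypothetical.

```python
# Minimal "shift-and-add" multi-frame super-resolution sketch (illustrative only).
import numpy as np

def shift_and_add_sr(frames, shifts, scale):
    """frames: list of (h, w) low-resolution arrays of the same scene;
    shifts: per-frame (dy, dx) offsets in high-resolution pixels, 0 <= dy, dx < scale;
    scale: integer upscaling factor."""
    h, w = frames[0].shape
    hr_sum = np.zeros((h * scale, w * scale))
    hr_count = np.zeros_like(hr_sum)
    for frame, (dy, dx) in zip(frames, shifts):
        # Drop each low-resolution sample onto its shifted position in the fine grid.
        hr_sum[dy::scale, dx::scale] += frame
        hr_count[dy::scale, dx::scale] += 1
    hr_count[hr_count == 0] = 1      # leave unobserved cells at zero instead of dividing by zero
    return hr_sum / hr_count         # average where several frames overlap
```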
- Spatio-spectral Fusion
Spatio-spectral fusion integrates two or more remote sensing images with different spatial and spectral resolutions to produce a fused image with more detailed information than any of the individual sources. A particular case is pansharpening, which blends a high-spatial-resolution panchromatic image (a single spectral band) with a lower-spatial-resolution multispectral image (higher spectral resolution) to generate a fused multispectral image with high spatial resolution.
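To make the idea concrete, here is a minimal sketch of the classical Brovey-transform pansharpening scheme, included only as an illustration of component substitution; it is not one of the laboratory's methods, and the input names are assumptions.

```python
# Brovey-transform pansharpening sketch (illustrative, not the lab's algorithm).
import numpy as np

def brovey_pansharpen(ms_upsampled, pan, eps=1e-6):
    """ms_upsampled: multispectral bands resampled to the panchromatic grid, shape (bands, H, W);
    pan: panchromatic image, shape (H, W)."""
    intensity = ms_upsampled.mean(axis=0)      # crude intensity component from the MS bands
    gain = pan / (intensity + eps)             # ratio carrying the panchromatic spatial detail
    return ms_upsampled * gain                 # inject the detail into every band
```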
- Spatio-temporal Fusion
Remote sensing images with both fine spatial resolution and high temporal frequency are obtained by merging the complementary information of remote sensing images with different spatial and temporal characteristics.
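A very simple way to see the principle (in the spirit of STARFM-like blending, not the laboratory's specific algorithm) is to add the temporal change observed by the coarse, frequent sensor to a fine-resolution base image; the inputs below are assumed to be co-registered and resampled to the fine grid.

```python
# Minimal spatio-temporal fusion sketch: predict a fine image at time t2 from a
# fine image at t1 and coarse images at t1 and t2 (all on the fine grid).
def simple_spatiotemporal_fusion(fine_t1, coarse_t1, coarse_t2):
    temporal_change = coarse_t2 - coarse_t1    # change captured by the frequent, coarse sensor
    return fine_t1 + temporal_change           # fine-resolution prediction at time t2
```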
- Spatio-temporal-spectral Integrated Fusion
The traditional image fusion methods were developed independently and can each fuse only a single type of remote sensing image, due to the lack of a rigorous and unified framework. The SENDIMAGE laboratory has therefore taken the lead in proposing the concept and framework of spatio-temporal-spectral integrated fusion. This method provides a common description and unified modeling of multi-source remote sensing observations, and can be applied to the unified fusion of multi-temporal, multi-band, multi-angle, and multi-scale optical remote sensing observations.
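One way to express such unified modeling, shown here only as a schematic least-squares formulation and not the laboratory's exact model, is to treat every observation y_k as a degraded version A_k(x) of one latent high-resolution image x (blurring, downsampling, spectral or temporal sampling) and estimate x jointly from all of them; the operator callables below are placeholders.

```python
# Schematic integrated fusion under a common observation model y_k = A_k(x) + noise.
import numpy as np

def integrated_fusion(observations, operators, adjoints, x0, step=0.1, n_iter=200):
    """observations: list of arrays y_k; operators/adjoints: callables implementing
    each A_k and its transpose; x0: initial estimate of the latent image."""
    x = x0.copy()
    for _ in range(n_iter):
        grad = np.zeros_like(x)
        for y, A, At in zip(observations, operators, adjoints):
            grad += At(A(x) - y)     # gradient of 0.5 * ||A_k(x) - y_k||^2
        x -= step * grad             # gradient-descent step toward the fused estimate
    return x
```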
- Point-surface Fusion of Spatial Observations
Based on point-surface fusion methods, a pixel-level PM2.5 retrieval model was established using ground-observed PM2.5 concentrations and satellite-observed aerosol optical depth (AOD). The result is a spatially continuous PM2.5 concentration distribution.
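As a stripped-down illustration of such point-surface fusion (real retrieval models include meteorological covariates and spatially varying coefficients), the sketch below fits a simple linear AOD-PM2.5 relationship at the stations and applies it to every pixel; all names are hypothetical.

```python
# Minimal PM2.5 retrieval sketch: station PM2.5 vs. collocated satellite AOD.
import numpy as np

def fit_pm25_model(aod_at_stations, pm25_at_stations):
    # Ordinary least-squares fit of pm25 ≈ slope * aod + intercept.
    slope, intercept = np.polyfit(aod_at_stations, pm25_at_stations, deg=1)
    return slope, intercept

def map_pm25(aod_image, slope, intercept):
    # Apply the station-derived relation to the full AOD grid
    # to obtain a spatially continuous PM2.5 map.
    return slope * aod_image + intercept
```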
- Data-model Merging (Assimilation)
In order to obtain continuous and dynamic information, we have studied data assimilation combining observations and dynamic models. Based on the CoLM (Common Land Model) land surface model and the SWAT (Soil and Water Assessment Tool) semi-distributed hydrological model, important improvements were made in the simultaneous assimilation of soil moisture and soil temperature observations and in the enhanced application of the ensemble Kalman filter and ensemble Kalman smoother algorithms.
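For reference, the following is a generic, textbook-style ensemble Kalman filter analysis step with perturbed observations and a linear observation operator; it is a minimal sketch, not the CoLM/SWAT assimilation system itself, and all variable names are illustrative.

```python
# Minimal perturbed-observations EnKF analysis step (generic sketch).
import numpy as np

def enkf_update(ensemble, H, obs, obs_err_std, rng=None):
    """ensemble: (n_state, n_members) forecast states; H: (n_obs, n_state) linear
    observation operator; obs: (n_obs,) observations; obs_err_std: observation error std."""
    if rng is None:
        rng = np.random.default_rng(0)
    n_members = ensemble.shape[1]
    R = np.eye(len(obs)) * obs_err_std ** 2
    anomalies = ensemble - ensemble.mean(axis=1, keepdims=True)
    H_anom = H @ anomalies                               # anomalies in observation space
    P_xy = anomalies @ H_anom.T / (n_members - 1)        # state-observation cross covariance
    P_yy = H_anom @ H_anom.T / (n_members - 1) + R       # innovation covariance
    K = P_xy @ np.linalg.inv(P_yy)                       # Kalman gain
    # Each member is updated with its own perturbed copy of the observations.
    obs_pert = obs[:, None] + rng.normal(0.0, obs_err_std, (len(obs), n_members))
    return ensemble + K @ (obs_pert - H @ ensemble)
```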