This paper proposes a novel framework for the fusion of hyperspectral and light detection and ranging (LiDAR)-derived rasterized data using extinction profiles (EPs) and deep learning. In order to extract spatial and elevation information from both sources, EPs that include different attributes (e.g., height, area, volume, diagonal of the bounding box, and standard deviation) are taken into account. The derived features are then fused via either feature stacking or graph-based feature fusion. Finally, the fused features are fed to a deep learning-based classifier (a convolutional neural network with logistic regression) to produce the final classification map. The proposed approach is applied to two datasets acquired in Houston, TX, USA, and Trento, Italy. Results indicate that the proposed approach achieves more accurate classification results than the other approaches considered. It should be noted that, in this paper, the concept of deep learning is used for the first time to fuse LiDAR and hyperspectral features, which provides new opportunities for further research.
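The pipeline described above (EP feature extraction, feature-level fusion, CNN with a logistic-regression output layer) can be illustrated with a minimal sketch. The shapes, layer sizes, and names used here (e.g., `hsi_eps`, `lidar_eps`, `FusionCNN`) are illustrative assumptions and do not reflect the paper's exact configuration; only the stacking-based fusion variant is shown.

```python
# Minimal sketch of the fusion-and-classification pipeline (assumed setup).
import torch
import torch.nn as nn

# Hypothetical EP feature cubes: (bands, height, width) rasters derived from
# the hyperspectral image and from the LiDAR-derived raster, respectively.
hsi_eps = torch.randn(32, 64, 64)    # EPs computed on hyperspectral components
lidar_eps = torch.randn(16, 64, 64)  # EPs computed on the LiDAR raster

# Step 1: feature-level fusion by simple stacking along the band axis.
fused = torch.cat([hsi_eps, lidar_eps], dim=0)   # shape (48, 64, 64)

# Step 2: a small CNN whose final fully connected layer acts as a
# multinomial logistic regression (softmax) over the land-cover classes.
class FusionCNN(nn.Module):
    def __init__(self, in_bands: int, n_classes: int):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(in_bands, 32, kernel_size=3, padding=1),
            nn.ReLU(),
            nn.MaxPool2d(2),
            nn.Conv2d(32, 64, kernel_size=3, padding=1),
            nn.ReLU(),
            nn.AdaptiveAvgPool2d(1),
        )
        self.classifier = nn.Linear(64, n_classes)  # logistic-regression layer

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        h = self.features(x).flatten(1)
        return self.classifier(h)                    # class scores (pre-softmax)

# Classify one fused patch (15 classes chosen arbitrarily for illustration).
model = FusionCNN(in_bands=fused.shape[0], n_classes=15)
logits = model(fused.unsqueeze(0))
print(logits.shape)   # torch.Size([1, 15])
```

In practice the network would be trained on labeled patches, and the graph-based fusion variant would replace the simple concatenation step with a learned low-dimensional embedding of the stacked features.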