Optimal assignment of point clouds using deep learning
Led by: Brenner, Politz
Team: Stephan Niehaus
Year: 2019
Finished: yes
The main goal of this master thesis was to register airborne 3D point clouds from different sensor systems: point clouds derived from airborne laser scanning (ALS) and from dense image matching (DIM) of aerial images. These point clouds may cover the same surface, but they differ in attributes and characteristics. One major problem when dealing with these two point cloud types is vegetation. The laser beam in ALS is able to penetrate vegetation, so the final point cloud contains both ground and vegetation points. DIM point clouds, being derived from aerial images, only contain the visible surface and thus describe the treetops. This fundamental difference between ALS and DIM causes problems for registration: established algorithms such as the iterative closest point (ICP) algorithm struggle when dealing with both point cloud types at the same time, because they assume that corresponding points are close to each other.
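To make the closest-point assumption concrete, here is a minimal sketch (NumPy only; the toy point data are hypothetical, not from the thesis) of the correspondence step at the heart of ICP. Each source point is simply paired with its nearest target point, which goes wrong when, say, an ALS ground point under vegetation has no counterpart in a DIM cloud that only captured the canopy:

```python
import numpy as np

def nearest_neighbor_pairs(source, target):
    """For each source point, return the index of the closest target point.

    This is the assignment step of ICP: it assumes corresponding points
    are spatially close, which fails when one cloud contains ground
    points under vegetation and the other only the canopy surface.
    """
    # Pairwise squared distances, shape (len(source), len(target))
    d2 = ((source[:, None, :] - target[None, :, :]) ** 2).sum(axis=2)
    return d2.argmin(axis=1)

# Hypothetical toy clouds: an ALS-like cloud with a ground point (z = 0)
# where the DIM-like cloud only has a treetop point (z = 10).
als = np.array([[0.0, 0.0, 0.0],    # ground under vegetation
                [5.0, 0.0, 0.2]])   # open terrain
dim = np.array([[0.0, 0.0, 10.0],   # treetop surface at the same x, y
                [5.0, 0.0, 0.3]])   # open terrain

idx = nearest_neighbor_pairs(als, dim)
# The ALS ground point has no true DIM counterpart, yet it is still
# assigned to the nearest DIM point, biasing the estimated transformation.
print(idx)  # → [1 1]: both ALS points are paired with the terrain point
```

Here the vegetation-affected ground point drags the assignment toward an unrelated terrain point, which is exactly the behavior that motivates removing such points before registration.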
This master thesis examined whether the registration results could be improved if vegetation points, as well as other points whose characteristics differ between ALS and DIM point clouds, were removed from the registration process. First, the points within each point cloud were manually split into ‘good’ and ‘bad’ points. Only the ‘good’ points were used for calculating the transformation parameters, which were then applied to all points. In a second step, this manual splitting process was to be replaced by an automatic classification into ‘good’ and ‘bad’ points. The classification was done using a convolutional neural network (CNN), which was trained and tested in several setups.
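The two-step idea above — estimate the transformation parameters from the ‘good’ points only, then apply them to all points — can be sketched as follows. This is not the thesis's pipeline: it uses the SVD-based (Kabsch) solution for a rigid transform on simulated, already-corresponding points, and the clouds and the good/bad mask are hypothetical stand-ins:

```python
import numpy as np

def estimate_rigid_transform(src, dst):
    """Least-squares rotation R and translation t with dst ≈ src @ R.T + t,
    computed from corresponding points via the SVD-based Kabsch solution."""
    src_c, dst_c = src.mean(axis=0), dst.mean(axis=0)
    H = (src - src_c).T @ (dst - dst_c)          # cross-covariance matrix
    U, _, Vt = np.linalg.svd(H)
    R = Vt.T @ U.T
    if np.linalg.det(R) < 0:                     # guard against reflections
        Vt[-1] *= -1
        R = Vt.T @ U.T
    t = dst_c - R @ src_c
    return R, t

rng = np.random.default_rng(0)
als = rng.uniform(0, 50, size=(100, 3))          # hypothetical ALS cloud
good = np.ones(100, dtype=bool)
good[:30] = False                                # pretend these are 'bad' (vegetation) points

# Ground-truth transform used only to simulate the matching DIM cloud
angle = np.deg2rad(5.0)
R_true = np.array([[np.cos(angle), -np.sin(angle), 0.0],
                   [np.sin(angle),  np.cos(angle), 0.0],
                   [0.0,            0.0,           1.0]])
t_true = np.array([2.0, -1.0, 0.5])
dim = als @ R_true.T + t_true

# Step 1: estimate the parameters from the 'good' points only ...
R, t = estimate_rigid_transform(als[good], dim[good])
# Step 2: ... then apply them to ALL points, 'bad' ones included.
als_registered = als @ R.T + t
print(np.abs(als_registered - dim).max())        # residual ≈ 0
```

In the thesis the correspondences are not given but found iteratively (ICP), and the ‘bad’ points would corrupt them; the sketch only illustrates that the parameters estimated from a clean subset transform the entire cloud.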
The registration results improved when the transformation parameters were calculated using only the ‘good’ points. The classification of these ‘good’ points, however, did not achieve satisfying results and thus leaves room for future improvement.