Point Clouds and Digital Surface Models

As discussed in the lidar section, point clouds are collections of numerous registered points that can represent the shape, size, position, and orientation of objects in space [1]. Although both mobile laser scanners (scanning from the ground) and airborne laser scanners can produce dense point clouds, airborne laser scanners are more prevalent in agricultural remote sensing due to their practicality, their consistency with other aerial sensors, and their ability to deliver top-of-canopy data. Regardless of their source, processing dense point clouds faces three fundamental challenges: 1) the need for powerful computing resources to handle hundreds of millions or even billions of points with geometric, colorimetric, and radiometric attributes, 2) the lack of any semantic information or linkage among the points (points are discrete data entries), and 3) the presence of noise [2]. Point cloud processing can be divided into two steps: segmentation and classification. The former refers to clustering points into subsets (typically called segments) based on one or more common characteristics (geometric, radiometric, etc.), whereas classification is the assignment of points to specific classes (labels) according to predefined criteria [1]. However, the type and nature of the scene, e.g., agricultural vs. municipal, can significantly influence the requirements of a data processing algorithm [2]. A comprehensive discussion and classification of the steps of point cloud processing is presented in [1].
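The distinction between segmentation and classification can be illustrated with a minimal sketch on synthetic data, where a simple height threshold stands in for the far more robust ground filters surveyed in the literature; the array layout and thresholds below are illustrative assumptions, not part of any cited method:

```python
import numpy as np

# Hypothetical point cloud: N x 4 array of (x, y, z, intensity) records.
rng = np.random.default_rng(0)
points = np.column_stack([
    rng.uniform(0, 10, 1000),   # x (m)
    rng.uniform(0, 10, 1000),   # y (m)
    rng.uniform(0, 3, 1000),    # z (m)
    rng.uniform(0, 1, 1000),    # intensity
])

# Segmentation: cluster points into ground / above-ground subsets using a
# naive height threshold (a stand-in for robust ground-filtering algorithms).
ground_mask = points[:, 2] < 0.2

# Classification: assign each above-ground point a label according to a
# predefined criterion, here canopy (z >= 1 m) vs. understory.
labels = np.full(len(points), "ground", dtype=object)
labels[~ground_mask & (points[:, 2] >= 1.0)] = "canopy"
labels[~ground_mask & (points[:, 2] < 1.0)] = "understory"

print({c: int((labels == c).sum()) for c in ("ground", "understory", "canopy")})
```

In practice the per-point attributes (geometric, colorimetric, radiometric) would feed a learned or rule-based classifier rather than a single threshold, but the two-step structure is the same.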

A lower-precision alternative to LiDAR for 3D reconstruction and point cloud generation is photogrammetry. Photogrammetry is the science of making accurate measurements from photographs, using optical principles, the camera's interior geometry, and its orientation to reconstruct the dimensions and positions of objects from overlapping images [3]. Usually, this method is applied to data from stereo cameras or cameras equipped with GPS and IMU, constructing a point cloud through mathematical methods and computer vision techniques such as Structure from Motion (SfM) [4]. Although photogrammetry can be carried out using a single spectral band, multi-band images such as RGB help improve the accuracy of the process and its results. Additionally, higher image resolution and radiometric calibration of the input images improve the accuracy of the resulting point cloud [5]. Other factors that determine the accuracy of the results include i) the algorithm with which the software extracts data, ii) the overlap of the images, iii) the quality and resolution of the images, and iv) the precision of the positioning and IMU devices used during the flight.
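One way the image-resolution factor is usually quantified is the ground sample distance (GSD), which follows from the standard pinhole-camera relation. The sketch below uses illustrative camera parameters that are not taken from the text:

```python
def ground_sample_distance(sensor_width_mm, focal_length_mm,
                           flight_height_m, image_width_px):
    """Ground sample distance in m/pixel from the pinhole relation:
    GSD = (sensor width * flight height) / (focal length * image width).
    """
    return (sensor_width_mm * flight_height_m) / (focal_length_mm * image_width_px)

# Illustrative (assumed) values: a 13.2 mm-wide sensor, 8.8 mm lens,
# 5472 px-wide images, flown at 100 m above ground.
gsd = ground_sample_distance(13.2, 8.8, 100.0, 5472)
print(f"{gsd * 100:.2f} cm/pixel")
```

A smaller GSD (higher resolution) generally allows the SfM matcher to recover finer surface detail, which is why flight height and sensor choice directly affect point cloud accuracy.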

DSMs and Digital Terrain Models (DTMs) are two of the most valuable point cloud processing products used to extract canopy structure characteristics. A DSM is a land model that includes all trees, shrubs, ridges, and furrows in 2.5D form, in which each point has X, Y, and Z (elevation above sea level) values. A DTM, or bare-ground model, is derived from a DSM by excluding all objects above the ground. For flat or fixed-slope orchards, DTM generation is straightforward; however, it can be challenging for orchards on uneven terrain, or where canopies are so dense that the ground is occluded in the aerial view. Once a DTM has been calculated, a Canopy Height Model (CHM) can be derived by subtracting the DTM from the DSM. By compensating for variations in the field surface, the CHM displays all objects and trees on a flat surface (the zero line), so tree heights become absolute values that can be compared easily [6]. Figure 7 shows these terms graphically.
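The CHM derivation is a per-cell subtraction of the two rasters; a minimal sketch with toy arrays (real DSM/DTM grids would be read from GeoTIFFs with a library such as rasterio):

```python
import numpy as np

# Toy 2.5D rasters in metres above sea level (values are illustrative).
dsm = np.array([[105.0, 107.5, 106.0],
                [104.5, 109.0, 105.5]])
dtm = np.array([[104.0, 104.5, 105.0],
                [104.0, 104.5, 105.0]])

# Canopy Height Model: subtract the terrain from the surface so every
# height is expressed relative to a flat zero line.
chm = dsm - dtm
chm[chm < 0] = 0.0  # clamp small negative interpolation artifacts to ground

print(chm)
```

The clamp step reflects a common practical detail: interpolation noise can leave a DSM cell slightly below the DTM, yielding small negative "heights" that are usually set to zero.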

Figure 7- Graphical representation of DSM, DTM, CHM, and zero line. DSM and DTM are measured as sea level elevation, while CHM represents the height of each element from the ground level.

Several commercial software packages, such as Pix4D and Agisoft, are available for photogrammetry processing in agricultural settings, and 3D models of many orchards, e.g., almond [7] and apple [8], have been reconstructed by this method. However, a noticeable shortcoming of photogrammetric point clouds is the weak penetration of optical cameras compared to laser scanners, which limits the information obtained from the lower layers of canopies.

References

[1]       E. Grilli, F. Menna, and F. Remondino, “A review of point clouds segmentation and classification algorithms,” The International Archives of Photogrammetry, Remote Sensing and Spatial Information Sciences, vol. 42, p. 339, 2017.

[2]       E. Che, J. Jung, and M. J. Olsen, “Object Recognition, Segmentation, and Classification of Mobile Laser Scanning Point Clouds: A State of the Art Review,” Sensors, vol. 19, no. 4, Art. no. 4, Jan. 2019, doi: 10.3390/s19040810.

[3]       J. B. Campbell and R. H. Wynne, Introduction to remote sensing. Guilford Press, 2011.

[4]       M. Weiss et al., “Using 3D Point Clouds Derived from UAV RGB Imagery to Describe Vineyard 3D Macro-Structure,” Remote Sensing, vol. 9, no. 2, p. 111, 2017, doi: 10.3390/rs9020111.

[5]       M. X. Tagle Casapia, “Study of radiometric variations in Unmanned Aerial Vehicle remote sensing imagery for vegetation mapping,” Lund University GEM thesis series, 2017.

[6]       R. Perko, H. Raggam, J. Deutscher, K. Gutjahr, and M. Schardt, “Forest assessment using high resolution SAR data in X-band,” Remote Sensing, vol. 3, no. 4, pp. 792–815, 2011.

[7]       J. Torres-Sánchez et al., “Mapping the 3D structure of almond trees using UAV acquired photogrammetric point clouds and object-based image analysis,” Biosystems Engineering, vol. 176, pp. 172–184, 2018.

[8]       G. Sun, X. Wang, Y. Ding, W. Lu, and Y. Sun, “Remote Measurement of Apple Orchard Canopy Information Using Unmanned Aerial Vehicle Photogrammetry,” Agronomy, vol. 9, no. 11, p. 774, 2019.