Regression has been widely applied in Light Detection And Ranging (LiDAR) remote sensing to spatially
extend predictions of total aboveground biomass (TAGB) and other biophysical properties over large forested
areas. Sample (field) plot size has long been considered a key sampling design parameter and focal point for
optimization in forest surveys, because of its impact on sampling effort and the estimation accuracy of forest
inventory attributes. In this study, we demonstrate how plot size and co-registration error interact to
influence the estimation of LiDAR canopy height and density metrics, regression model coefficients, and the
prediction accuracy of least-squares estimators of TAGB. We used simulated forest canopies and
synthetic LiDAR point clouds, which allowed us to maintain strict control over the spatial scale and complexity of
the forest scenes, as well as over the magnitude and type of planimetric error inherent in ground-reference and LiDAR
datasets. Our results showed that predictions of TAGB improved markedly as plot size increased from 314