To improve the simulation of near-range atmospheric dispersion of radionuclides, computational fluid dynamics is becoming increasingly popular. In the current study, Large-Eddy Simulation is used to examine the time evolution of the turbulent dispersion of radioactive gases in the atmospheric boundary layer, coupled to a gamma dose rate model based on the point-kernel method with buildup factors. In this way, the variability of the radiological dose rate from cloud shine caused by instantaneous turbulent mixing can be evaluated. The steady release of 41Ar and 133Xe in an open field is studied for four release heights, thus covering radionuclides that decay with a high-energy gamma and a low-energy gamma, respectively. Based on these simulations, the variability of ground-level dose rates is analyzed for different averaging times in the dose measurements. It is observed that turbulent variability in the wind field can cause the dose from short-term exposures to be underestimated by up to a factor of four when it is inferred from conventional long-term measurements.
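The point-kernel method with buildup factors mentioned above can be sketched for a single point source as follows. This is a minimal illustration, not the study's implementation: the attenuation coefficient, Berger buildup parameters, and fluence-to-dose conversion factor used here are placeholder values, and the function name is hypothetical.

```python
import math

def point_kernel_dose_rate(activity_bq, r_m, energy_mev,
                           mu_per_m=0.007, a=1.0, b=0.05,
                           fluence_to_dose=1.0e-12):
    """Gamma dose rate at distance r_m from a point source,
    using the point-kernel method with a Berger-form buildup
    factor B = 1 + a*mu*r*exp(b*mu*r).

    All coefficients (mu_per_m, a, b, fluence_to_dose) are
    illustrative placeholders, not values from the study.
    """
    mu_r = mu_per_m * r_m
    buildup = 1.0 + a * mu_r * math.exp(b * mu_r)  # scattered-photon correction
    # Uncollided photon fluence rate with exponential attenuation
    # and geometric (inverse-square) spreading
    fluence = activity_bq * math.exp(-mu_r) / (4.0 * math.pi * r_m ** 2)
    # Fold in photon energy and a fluence-to-dose conversion factor
    return fluence * buildup * energy_mev * fluence_to_dose
```

For a cloud-shine calculation, a kernel of this form would be summed over the discretized concentration field, with each cell's activity given by its instantaneous concentration times its volume.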