In this figure the 171 floats are sorted according to the standard deviation
of their associated WOA-based time series. While the
WOA-derived time series show a slow increase of the standard
deviations from near-zero to about 0.005 for the first
140 floats followed by a rapid increase to 0.03, the Argo-derived
standard deviations are more uniform, closer to
0.006, before following the same steep rise, and include several particularly
large values. It should be noted that values below
0.003 are below the resolution of the salinity algorithm. The
low values for the WOA-derived time series are the result of
floats passing through a highly smoothed salinity field. Most
Argo-derived values, on the other hand, are clearly larger
than the sensor resolution and do not represent mere instrumental
noise. The Argo team guarantees an accuracy of 0.01
for delayed-mode quality-controlled data, although the
instrument stability is often closer to 0.003 (Wijffels, personal
communication); Oka (2005), however, estimates sensor drift
as low as 0.004 per year, based on recalibration of recovered
floats. About half of the floats used here have standard
deviations less than or close to the accepted accuracy if the
larger Argo figure of 0.01 is used as a yardstick.
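As a minimal sketch of the comparison described above, the following Python code computes per-float standard deviations for the Argo- and WOA-derived series, sorts the floats by the WOA-derived value as in the figure, and counts the fraction of floats falling within the 0.01 accuracy and below the 0.003 resolution thresholds quoted in the text. The function and variable names (summarize_float_stds, argo_series, woa_series) are hypothetical, and the construction of the per-float time series themselves is assumed to have been done beforehand.

```python
import numpy as np

def summarize_float_stds(argo_series, woa_series,
                         resolution=0.003, accuracy=0.01):
    """Sort floats by WOA-derived standard deviation and compare
    Argo-derived spreads against the nominal resolution/accuracy.

    argo_series, woa_series: lists of 1-D arrays of practical salinity,
    one array per float (hypothetical input structure).
    """
    woa_std = np.array([np.std(s, ddof=1) for s in woa_series])
    argo_std = np.array([np.std(s, ddof=1) for s in argo_series])

    # Sort floats by the WOA-derived standard deviation, as in the figure
    order = np.argsort(woa_std)
    woa_sorted = woa_std[order]
    argo_sorted = argo_std[order]

    # Fraction of floats whose Argo-derived spread is within the
    # nominal delayed-mode accuracy (0.01 in the text)
    frac_within_accuracy = float(np.mean(argo_std <= accuracy))
    # Fraction indistinguishable from the stated sensor resolution (0.003)
    frac_below_resolution = float(np.mean(argo_std < resolution))

    return woa_sorted, argo_sorted, frac_within_accuracy, frac_below_resolution
```

Under the figures quoted in the text, frac_within_accuracy would come out near 0.5 and frac_below_resolution would identify the floats whose variability cannot be distinguished from instrumental noise.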