

EXECUTIVE SUMMARY

The Alliance for Coastal Technology (ACT) conducted a sensor verification study of in situ nutrient analyzers during 2016 to characterize performance measures of accuracy, precision, and reliability. The verification included a week of laboratory testing along with three moored field deployments in freshwater, estuarine, and oceanic coastal environments. Laboratory tests of accuracy, precision, and range were conducted at the University of Maryland’s Chesapeake Biological Laboratory (CBL) in Solomons, MD. A series of five tests was conducted to evaluate performance under controlled challenge conditions, including concentration range, temperature, salinity, turbidity, and dissolved organic carbon. All laboratory tests were conducted in 250 L polypropylene tanks using RO water as the initial matrix, within a temperature-controlled room. Instruments sampled from a common, well-mixed test tank maintained at a documented level of the known challenge condition. Instruments were set up by the manufacturer daily prior to the start of each individual laboratory test, exposed to each test condition for a period of three hours, and programmed to sample at a minimum frequency of once every 30 minutes. Reference samples were collected every 30 minutes at five timepoints coinciding with instrument sampling times for each test.

For the laboratory concentration range challenge, the absolute difference between the Real Tech Real-NO3 and the reference measurement across all timepoints for trials C0 – C5 ranged from -0.217 to 0.490 mgN/L, with a mean of 0.185 ±0.168 mgN/L. A linear regression of the measurement difference versus concentration was significant (p = 0.0192; r² = 0.193), but with a low coefficient of determination due to a reversal in direction for the C4 trial. In general, the Real-NO3 increasingly over-predicted concentrations as they increased in the test. An assessment of precision was performed by computing the standard deviations and coefficients of variation of the five replicate measurements for the C1 – C5 concentration trials. The standard deviation of the mean ranged from 0.010 to 0.022 mgN/L across the five trials, and the coefficient of variation ranged from 0.20 to 6.47 percent.
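As a point of reference for how these summary statistics are defined, the short sketch below illustrates one way to reproduce the accuracy offset (mean instrument-minus-reference difference ± standard deviation) and the precision metrics (standard deviation and coefficient of variation of replicate readings) for a single trial. The numeric arrays are hypothetical placeholders for illustration only, not data from this verification.

import numpy as np

# Hypothetical example values (mgN/L); substitute the paired instrument
# and reference measurements recorded for a given concentration trial.
instrument = np.array([2.51, 2.53, 2.49, 2.52, 2.50])
reference  = np.array([2.30, 2.31, 2.29, 2.30, 2.31])

# Accuracy: instrument-minus-reference difference at each timepoint,
# summarized as a mean offset with its standard deviation.
diff = instrument - reference
print(f"mean difference = {diff.mean():.3f} +/- {diff.std(ddof=1):.3f} mgN/L")

# Precision: standard deviation and coefficient of variation (percent)
# of the replicate instrument readings within the trial.
sd = instrument.std(ddof=1)
cv = 100.0 * sd / instrument.mean()
print(f"SD = {sd:.3f} mgN/L, CV = {cv:.2f} %")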

For the laboratory temperature challenge at 5 °C, the absolute difference between instrument and reference measurement across all timepoints for trials C2 – C4 ranged from -0.0880 to 0.4381 mgN/L, with a mean of 0.056 ±0.115 mgN/L. Measurement differences at both C2 and C3 were significantly lower at 5 °C (0.017 and 0.058) versus 20 °C (0.020 and 0.237) (p<0.01). Differences were not statistically significant across temperatures at the C4 level. Similar to the test results at 20 °C, the measurement offset increased in a positive direction as concentration increased. For the laboratory salinity challenge, performed at the C3 concentration level, the absolute difference between instrument and reference measurement across all timepoints for the three added salinity levels ranged from 0.146 to 0.483 mgN/L, with a mean of 0.272 ±0.095 mgN/L. A linear regression of the measurement differences versus salinity was significant (p = 0.004; r² = 0.38) with a slope of 0.005 and an intercept of 0.184. The average offset at salinity 30 was 0.16 mgN/L higher than the average for the 10 and 20 salinity trials.
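The slope, intercept, p-value, and r² quoted for these regressions correspond to an ordinary least-squares fit of the measurement differences against the challenge level. A minimal sketch of that calculation is shown below; the salinity and difference values are hypothetical placeholders, not the report’s data.

import numpy as np
from scipy.stats import linregress

# Hypothetical placeholder data: salinity at each timepoint and the
# corresponding instrument-minus-reference difference (mgN/L).
salinity = np.array([10, 10, 10, 20, 20, 20, 30, 30, 30])
difference = np.array([0.20, 0.23, 0.21, 0.25, 0.27, 0.24, 0.36, 0.40, 0.38])

# Ordinary least-squares regression of measurement difference versus salinity.
fit = linregress(salinity, difference)
print(f"slope = {fit.slope:.4f}, intercept = {fit.intercept:.3f}")
print(f"r^2 = {fit.rvalue**2:.3f}, p = {fit.pvalue:.4f}")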

For the laboratory turbidity challenge, performed at the C3 concentration level, the absolute difference between instrument and reference measurement across all timepoints for the two added turbidity levels ranged from 0.028 to 0.135 mgN/L, with a mean of 0.096 ±0.036 mgN/L. The effect of turbidity on measurement accuracy was mixed when compared against the RO water results; however, the magnitude of over-prediction approximately doubled between the 10 and 100 NTU trials. For the laboratory DOC challenge, performed at the C3 concentration level, the absolute difference between instrument and reference measurement across all timepoints for the two added DOC levels ranged from 0.099 to 0.482 mgN/L, with a mean of 0.292 ±0.193 mgN/L. The measurement difference increased roughly fourfold between the 1 and 10 DOC trials. A linear