
instrument

1. General: A device that communicates, denotes, detects, indicates, measures, observes, records, or signals a quantity or phenomenon, or controls or manipulates another device.

2. Law: A formally executed document that evidences a legally enforceable agreement between two or more parties and expresses a contractual duty, obligation, or right. Practically all documents used in borrowing, lending, investing, and sale and purchase on credit are legal instruments.

Definition: Measurement is the collection of quantitative data. A measurement is made by comparing a quantity with a standard unit. Since this comparison cannot be perfect, measurements inherently include error.

accuracy: The degree of conformity of a measured or calculated value to its actual or specified value

precision: 1. The degree of mutual agreement among a series of individual measurements, values,
or results; often, but not necessarily, expressed by the standard deviation. 2. With respect to a set of
independent devices of the same design, the ability of these devices to produce the same value or
result, given the same input conditions and operating in the same environment. 3. With respect to a
single device, put into operation repeatedly without adjustments, the ability to produce the same
value or result, given the same input conditions and operating in the same environment. Synonym (for
defs. 1, 2, and 3) reproducibility. 4. In computer science, a measure of the ability to distinguish
between nearly equal values. 5. The degree of discrimination with which a quantity is stated; for
example, a three-digit numeral to the base 10 discriminates among 1000 possibilities.

Expected value

1. For a discrete random variable, the sum of all possible values, each value multiplied by its
probability of occurrence.
2. For a continuous random variable, the integral of the product of the variable and its
probability density function over its range of values.
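In symbols, with p(x) the probability mass function of a discrete random variable X and f(x) the probability density of a continuous one, the two definitions above read:

```latex
E[X] = \sum_{x} x \, p(x) \qquad \text{(discrete)}
\qquad\qquad
E[X] = \int_{-\infty}^{\infty} x \, f(x) \, dx \qquad \text{(continuous)}
```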

error

1. An act, assertion, or belief that unintentionally deviates from what is correct, right, or true.
2. The condition of having incorrect or false knowledge.
3. The act or an instance of deviating from an accepted code of behavior.
4. A mistake.
5. Mathematics: The difference between a computed or measured value and a true or
theoretically correct value.
6. (Abbr. E) Baseball: A defensive fielding or throwing misplay by a player when a play normally
should have resulted in an out or prevented an advance by a base runner.
Resolution: the smallest change in the input signal that the instrument can reliably detect.
It is determined by the instrument noise (either circuit noise or quantization noise). For
example, a noiseless voltmeter with 5 1/2 digits displayed, set to the 20 V input range, has
a resolution of 100 µV. This can be determined by looking at the change associated with the
least significant digit.

Now, if this same voltmeter had 10 counts of peak-to-peak noise, the effective resolution
would be decreased by the presence of the noise. Because of the Gaussian distribution of the
noise, the effective resolution in this case would be 0.52 × 1 mV. In general, when a
measurement system has X counts of peak-to-peak Gaussian noise, the effective resolution of
the system is given by

effective resolution = (X/6.6) × 3.46 × (value of one count) ≈ 0.52 × X × (value of one count)
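A small sketch of this arithmetic, assuming the 6.6 (peak-to-peak to rms) and 3.46 (Gaussian) conversion factors used in the error tables in this note; the function name is our own:

```python
def effective_resolution(noise_counts_pp, count_value):
    """Effective resolution of a measurement system with Gaussian noise.

    noise_counts_pp: peak-to-peak noise, expressed in counts
    count_value: value of one count (the noiseless resolution), in volts
    """
    rms_counts = noise_counts_pp / 6.6          # peak-to-peak -> rms for Gaussian noise
    return rms_counts * 3.46 * count_value      # rms -> effective peak-to-peak spread

# 5 1/2 digit voltmeter on the 20 V range: one count = 100 uV.
# With 10 counts of peak-to-peak noise the effective resolution is about 0.52 mV.
eff = effective_resolution(10, 100e-6)
```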

Error Calculation or Accuracy Determination

DC Measurement

Error Type          | Reading-Dependent Errors                  | Noise and Range-Dependent Errors
Specified Accuracy  | % of reading × reading/100                | offset
Nonlinearity        |                                           | % nonlinearity × range/100
System Noise        |                                           | rms noise × 6.6 (to find the peak-to-peak value)
Settling Time       | % settled × step change/100               |
NM Noise            |                                           | normal mode noise × 10^(-NMRR/20)
CMV                 |                                           | common mode voltage × 10^(-CMRR/20)
Temperature Drift   | (% of reading/°C) × X °C × reading/100    | (offset/°C) × X °C

Settling time must be added for scanning systems unless it is included in the accuracy
specifications. Temperature drift must be added if your operating temperature is outside the
range covered by the given accuracy tables; X is the temperature difference between the
specified temperature range and the actual operating temperature.
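The 10^(-NMRR/20) and 10^(-CMRR/20) terms convert a rejection ratio specified in decibels into a linear attenuation factor. A minimal sketch (the function name and the 100 dB / 2.5 V example values are illustrative):

```python
def rejection_factor(rejection_db):
    # Convert a rejection ratio in dB (NMRR or CMRR) to a linear factor.
    return 10 ** (-rejection_db / 20)

# 100 dB of common-mode rejection attenuates a 2.5 V common-mode
# voltage down to 25 uV of measurement error.
cmv_error = 2.5 * rejection_factor(100)
```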

Let us consider the National Instruments NI 4350 Series 5 1/2 digit Temperature and Voltage
Data Logger. We will calculate the total accuracy of a 1 V reading. Let us also assume that
the instrument is at an ambient temperature between 15 and 35 °C, and it has been less than a
year since the last calibration was performed but more than 90 days. The total accuracy based
on the error budget determined above is:

Error Type                                        | Percent of Reading Errors                 | Range-Dependent Errors
Specified Accuracy                                | 1 V × 0.0131% = 131 µV                    | 3 µV
Nonlinearity                                      | 0 (included in the accuracy specification)|
System Noise                                      |                                           | 0 (included in the offset)
Settling Time                                     | not needed; the specification table includes any errors due to scanning |
NM Noise (assume 1 mVrms of environmental noise)  |                                           | 1 mV × 1.4 × 10^(-100/20) = 0.01 µV
CMV (assume a maximum CMV of 2.5 V)               |                                           | 2.5 V × 10^(-100/20) = 25 µV
Temperature Drift                                 | N/A (the NI 4350 specification tables cover 15 to 35 °C) | N/A (the NI 4350 specification tables cover 15 to 35 °C)
Subtotal                                          | 131 µV                                    | 28.01 µV
Total Maximum Error                               | 159.01 µV, or 0.016% of reading
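As a cross-check, this DC budget can be tallied in a few lines of Python (the numbers come straight from the table; the variable names are our own):

```python
# Error budget for a 1 V DC reading on the NI 4350 (values from the table)
reading_errors = 1.0 * 0.0131 / 100        # specified accuracy: 131 uV

range_errors = (
    3e-6                                   # offset
    + 1e-3 * 1.4 * 10 ** (-100 / 20)       # normal-mode noise: ~0.01 uV
    + 2.5 * 10 ** (-100 / 20)              # common-mode voltage: 25 uV
)

total = reading_errors + range_errors      # ~159.01 uV
percent_of_reading = total / 1.0 * 100     # ~0.016 %
```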

AC Measurement

Error Type                                           | Reading-Dependent Errors                  | Range-Dependent Errors
Specified Accuracy at a Given Signal Frequency Range | % of reading × reading/100                | offset
Nonlinearity                                         |                                           | % nonlinearity × range/100
System Noise                                         |                                           | rms noise × 3.46 (to find the peak-to-peak value, assuming Gaussian noise)
Settling Time                                        | % settled × step change/100               |
CMV                                                  |                                           | common mode voltage × 10^(-CMRR/20)
Temperature Drift                                    | (% of reading/°C) × X °C × reading/100    | (offset/°C) × X °C
Crest Factor Error                                   | X × reading/100                           |

Settling time must be added for scanning systems unless it is included in the accuracy
specifications. Temperature drift must be added if your operating temperature is outside the
range covered by the given accuracy tables; X is the amount of temperature drift from the
specified temperature range. For crest factor error, add X% additional error based on the
type of waveform.

Let us consider the NI 4050 5 1/2 digit multimeter and calculate the total accuracy of a 1
Vrms reading. Let us also assume that the instrument is at an ambient temperature between
15 and 35 °C, and it has been one year since the last calibration was performed. Because this
is an AC measurement, we need to specify the frequency of the signal measured and the crest
factor. Let us assume a frequency of 1 kHz and a crest factor of 2. The total accuracy based
on the error budget determined above is:

Error Type                                           | Reading-Dependent Errors                  | Range-Dependent Errors
Specified Accuracy at a Given Signal Frequency Range | 0.42% × 1 V = 4.2 mV                      | 1.2 mV
Nonlinearity                                         | included in the specification table       | included in the specification table
System Noise                                         |                                           | included in the specification table
Settling Time                                        | not needed; this is not a scanning DMM, and we assume the signal is not changing |
CMV (assume the maximum allowable CMV of 250 V)      |                                           | 250 V × 10^(-100/20) = 2.5 mV
Temperature Drift                                    | not applicable; the temperature range is covered by the specification table | not applicable; the temperature range is covered by the specification table
Crest Factor Error                                   | 0% × 1 V/100 = 0 mV                       |
Subtotal                                             | 4.2 mV                                    | 3.7 mV
Total Maximum Error                                  | 7.9 mV, or 0.79% of reading
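The AC budget can be tallied the same way (values taken from the table; variable names are our own, and the crest factor term is 0% for the assumed waveform):

```python
# Error budget for a 1 Vrms AC reading on the NI 4050 (values from the table)
reading_errors = 1.0 * 0.42 / 100          # specified accuracy at 1 kHz: 4.2 mV
reading_errors += 0.0 * 1.0 / 100          # crest factor error: 0% for this waveform

range_errors = (
    1.2e-3                                 # offset
    + 250 * 10 ** (-100 / 20)              # common-mode voltage: 2.5 mV
)

total = reading_errors + range_errors      # ~7.9 mV
percent_of_reading = total / 1.0 * 100     # ~0.79 %
```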
