What is accuracy class?


Thread Starter

srinivas duddu

Can anyone tell me the difference between accuracy class and accuracy?


 

Vladimir E. Zyubin

Hello,

In short:
"accuracy" is a characteristic of a measurement;
"accuracy class" is a characteristic of a measuring instrument.

In the strict sense, the word "accuracy" needs supplementary qualifiers: "basic", "absolute", "relative", "static", "dynamic", and so on. Here in Russia the main term is "inaccuracy" (it is more convenient to work with), and "accuracy" is the complementary term.

"accuracy class" can caracterize either "absolute inaccuracy"(the per cent of measured value) or "reduced inaccuracy"(the per cent of the superior limit of the range of the measuring instrument). In the first case there must be a circle around the digit that shows that the "accuracy class" is normalized by the "absolute inaccuracy".

and so on.

It is a "big science"... :) so, if you need more strict and detailed knowledge you ought to read the special books.. It seems to me there is a special standard (ISO?) about the terms... Here, in Russia, the standard dealt with inaccuracy exists.

--
Best regards,
Vladimir mailto:[email protected]
 

Salai Kuberan E S, Manager/C&I

NIST defines a piece of equipment as accurate when its performance or value (that is, its indications, its deliveries, its recorded representations, or its capacity or actual value, as determined by tests made with suitable standards) conforms to the standard within the applicable tolerances and other performance requirements. Equipment that fails to conform is inaccurate.

An accuracy class denotes the limits of tolerance with respect to the application. For example, NIST stipulates mass flow measurement with accuracy classes 0.3, 0.3A, 0.5, 1.0, 2.0, 2.5, etc., depending on the fluid fuel.

Salai

Q: Can anyone tell me the difference between accuracy class and accuracy?
 
Hi Vladimir:

You are absolutely right in using the term "inaccuracy". I happen to prefer the term "uncertainty", since accuracy or inaccuracy usually needs to be explained as well as defined.

It is essential that everyone who deals in measurement know, when they accept a reading, the accuracy, inaccuracy, or uncertainty of that reading. These uncertainty factors are in play even with the best calibration of the measuring device, and in some cases can be very substantial indeed. Luckily for the industrial world, device repeatability uncertainty is very low.

Alas, most appear to accept measurement figures given to them as being absolute.

Bob Pawley
 