Hi all,
Is it necessary that an instrument have its accuracy stated in its specs? Can linearity act as a substitute for the accuracy of an instrument?
Suppose I have an instrument, and I calibrate it against a standard reference between Zero and Span. If the instrument has a linearity of 0.1%, then all intermediate readings should fall within 0.1% of the straight line joining Zero and Span. Isn't that, then, enough reason NOT to worry about accuracy? How would a separate accuracy figure differ from linearity in such a case?
I am sorry to ask such a basic question, but I am stuck with an instrument whose accuracy I need to know. The spec sheet mentions everything BUT accuracy: it lists zero drift, span drift, minimum detectable limit, noise, linearity, etc. So I am trying to understand whether accuracy needs to be known for an instrument at all, or whether the linearity information should suffice.
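To make my reasoning concrete, here is a rough sketch of how I imagine one might combine the individual spec terms into an overall accuracy estimate. All the numbers below are assumed for illustration, not taken from any real datasheet, and root-sum-square is just one common way of combining independent error sources:

```python
import math

# Hypothetical spec values, all expressed as % of span -- assumed numbers.
linearity = 0.10   # max deviation from the straight line joining Zero and Span
zero_drift = 0.05  # zero drift over the calibration interval
span_drift = 0.05  # span drift over the calibration interval
noise = 0.02       # output noise

# Root-sum-square (RSS): treats the error sources as independent.
total_rss = math.sqrt(linearity**2 + zero_drift**2 + span_drift**2 + noise**2)

# Worst case: assumes all errors add in the same direction at once.
total_worst = linearity + zero_drift + span_drift + noise

print(f"RSS estimate:        {total_rss:.3f} % of span")
print(f"Worst-case estimate: {total_worst:.3f} % of span")
```

If something like this is valid, then linearity alone clearly understates the total error whenever the drift and noise terms are non-negligible, which is part of what I am trying to confirm.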
Regards