I am not sure what the choices were for the question, but given the information you can probably assume it is talking about an analog meter, where accuracy is specified as a percentage of full scale. If the meter is 7.5% off at full scale, it is much less accurate (as a percentage of the reading) lower down the scale. At full scale, 150V, a 7.5% error means a 150V target could read as high as 161.25V or as low as 138.75V. That 11.25V delta is constant across the whole scale, so reading a 100V target you could see anywhere from 88.75V to 111.25V and still be within the accuracy of the meter. And if you are trying to read something that is only 20V, your deviation could be greater than 50% of the reading and still be within the accuracy of the meter.
The rule of thumb is that to get the most accurate reading you should stay in the upper 2/3 of the scale. I am not sure if that helps, but that is the way it was explained to me.
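If it helps, here is a quick Python sketch of that arithmetic, using the same hypothetical 150V / 7.5% meter from the question (the function name and layout are just mine for illustration):

```python
FULL_SCALE_V = 150.0   # full-scale reading of the hypothetical meter
ACCURACY_PCT = 7.5     # accuracy spec, as a percent of full scale

def error_band(target_v, full_scale_v=FULL_SCALE_V, accuracy_pct=ACCURACY_PCT):
    """Return (low, high, percent_of_reading) for a given target voltage.

    The absolute error is fixed at accuracy_pct% of full scale, so the
    error expressed as a percentage of the actual reading grows as the
    reading moves down the scale.
    """
    delta = full_scale_v * accuracy_pct / 100.0   # 11.25 V for this meter
    return target_v - delta, target_v + delta, 100.0 * delta / target_v

for v in (150, 100, 20):
    low, high, pct_of_reading = error_band(v)
    print(f"{v:5.0f} V target: {low:7.2f} .. {high:7.2f} V "
          f"({pct_of_reading:.1f}% of the reading)")
```

Running that prints 138.75..161.25 V (7.5% of the reading) at 150V, 88.75..111.25 V (11.2%) at 100V, and 8.75..31.25 V (56.2%) at 20V, which is why you want to stay in the upper part of the scale.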
Originally Posted by rasilva
A given voltmeter has an accuracy of 7.5% and can read 150V at full scale. What could be said about such a meter?
So wouldn't it still be the same % at any range?
Warren Garber
Have a great day!