RGarcia
February 2, 2021, 12:00 PM
A given voltmeter has an accuracy of 7.5% and can read 150 V at full scale. What could be said about such a meter?
I'm having difficulty solving this question; can someone explain how to calculate it? My answer was that at full scale the meter could be off by 7.5%, and by 3.7% at half scale. Not going to lie, I guessed on this one.
Kalbi_Rob
February 2, 2021, 08:13 PM
A 7.5% accuracy at a full scale of 150 V means that when the true value is 150 V, the meter can read anywhere from 138.75 V to 161.25 V.
ΔV = 161.25 V − 150 V = 11.25 V. So at any point on the voltmeter's scale, the reading can be off by up to 11.25 V.
%error = ((Expected − Actual) / Expected) × 100%
At 100 V, this means it could read anywhere from 88.75 V to 111.25 V, a %error of 11.25%.
At 75 V, this means it could read anywhere from 63.75 V to 86.25 V, a %error of 15%.
At 50 V, this means it could read anywhere from 38.75 V to 61.25 V, a %error of 22.5%.
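If it helps, here's a minimal Python sketch of the same arithmetic: the error band is a fixed 7.5% of full scale (11.25 V), and the percent error is that band divided by the actual reading. The variable names are just illustrative, not anything from the question itself.

```python
# Worst-case error for a meter whose accuracy is specified as a
# percentage of full scale (illustrative names, assumed for this sketch).

FULL_SCALE_V = 150.0   # full-scale reading of the voltmeter
ACCURACY = 0.075       # 7.5% of full scale

# The absolute error band is constant across the scale:
# 7.5% of 150 V = 11.25 V at every reading.
abs_error_v = ACCURACY * FULL_SCALE_V

for reading_v in (150.0, 100.0, 75.0, 50.0):
    pct_error = abs_error_v / reading_v * 100.0
    print(f"{reading_v:6.1f} V: reads {reading_v - abs_error_v:.2f} to "
          f"{reading_v + abs_error_v:.2f} V -> worst case ±{pct_error:.2f}%")
```

Running this reproduces the numbers above: ±7.5% at 150 V, ±11.25% at 100 V, ±15% at 75 V, and ±22.5% at 50 V.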
Sorry for the numerous examples, but I wanted to make sure it made sense and to show how that small fixed error can be devastating at lower readings.