If a current of 30 amps is measured with a 100 amp 3% ammeter, the maximum possible error in terms of actual error is:
I'm having problems calculating these types of questions. Could someone help?
This may help you. http://www.mathsisfun.com/measure/er...asurement.html
For percent error on that device:
100 A × 3% = ±3 A from 100, or a range of 97 A to 103 A when measuring 100 A.
If that ±3 A error applies across the whole range (3% of full scale), then your 30 A reading may really be anywhere from 27 A to 33 A.
If instead the device is listed as 3% of the reading over all measurement ranges, then use 30 A × 3% = ±0.9 A, a range of 29.1 A to 30.9 A.
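If it helps, here is a rough Python sketch of the two interpretations above. The function names (error_of_full_scale, error_of_reading) are just placeholders I made up for illustration, not anything standard:

```python
# Minimal sketch comparing the two common readings of a "3%" ammeter spec.

def error_of_full_scale(full_scale_amps, accuracy_pct):
    """Absolute error when accuracy is quoted as a percentage of full scale."""
    return full_scale_amps * accuracy_pct / 100.0

def error_of_reading(reading_amps, accuracy_pct):
    """Absolute error when accuracy is quoted as a percentage of the reading."""
    return reading_amps * accuracy_pct / 100.0

reading = 30.0      # measured current, amps
full_scale = 100.0  # meter range, amps
accuracy = 3.0      # quoted accuracy, percent

fs_err = error_of_full_scale(full_scale, accuracy)  # +/- 3.0 A
rd_err = error_of_reading(reading, accuracy)        # +/- 0.9 A

print(f"% of full scale: {reading} A could be {reading - fs_err:.1f} to {reading + fs_err:.1f} A")
print(f"% of reading:    {reading} A could be {reading - rd_err:.1f} to {reading + rd_err:.1f} A")
```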
What am I missing if I keep coming up with 3.9%?
If 4000 was the specified value, and 3850 was the measured value, what is the percent error?
Please see below for the percentage error calculation.
Step 1: Calculate the error (subtract the measured value from the specified value).
Step 2: Divide the error by the specified value (this gives a decimal number).
Step 3: Convert that to a percentage (multiply by 100 and add a % sign).
% error = ((Specified value - Measured value) / Specified value) × 100%
        = ((4000 - 3850) / 4000) × 100%
        = 0.0375 × 100%
        = 3.75%
If you divide the 150 error by the measured value instead (150 / 3850 ≈ 3.9%), you get the 3.9% figure asked about above, so the key point is that the denominator is the specified value.
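Here is a small Python sketch of the same three steps (percent_error is just a name I picked for illustration):

```python
# Rough sketch of the three steps above.

def percent_error(specified, measured):
    """Step 1: error = specified - measured
    Step 2: divide the error by the specified value
    Step 3: convert to a percentage."""
    return (specified - measured) / specified * 100.0

print(percent_error(4000, 3850))     # 3.75
# Dividing by the measured value instead gives roughly 3.9, which is the
# common slip that produces the 3.9% figure:
print((4000 - 3850) / 3850 * 100.0)  # ~3.896
```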