Originally Posted by anonymous
This is my first post here; I'm sorry if this is a stupid question or if it's been asked before. I did some searching and couldn't find anything.
NETA 7.10.2 B.5 requires that voltage circuit burdens be measured at the voltage transformer (VT) terminals. This test is listed as optional in the MTS but not in the ATS. I've asked a number of experienced testers and none of them could give me a straight answer on how to do this test. Should the test result be in ohms or VA? If it's ohms, is it enough to just put an ohmmeter on the secondary circuit? That seems wrong to me, since I expect most of the impedance to be reactive (inductive, as in meter coils), and an ohmmeter only gives DC resistance. If it's VA, how do you calculate it? Could I measure the DC resistance and then report the result as a range, e.g. so many VA at PF = 0.1, so many VA at PF = 1.0?
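To make that arithmetic concrete, here's the kind of range calculation I have in mind (just a sketch: the 120 V secondary, the PF endpoints, and the 500-ohm reading are all my assumptions):

```python
# Rough sketch: turn a measured DC resistance into a burden (VA) range
# across assumed power factors. Assumes a 120 V secondary; the PF
# endpoints (0.1 and 1.0) are just the bounds I'm guessing at.

V_SEC = 120.0  # assumed secondary voltage, volts

def burden_va(r_dc_ohms: float, power_factor: float) -> float:
    """Apparent power if the measured DC resistance is the resistive
    part of the burden impedance: |Z| = R / PF, so S = V^2 * PF / R."""
    z_mag = r_dc_ohms / power_factor
    return V_SEC ** 2 / z_mag

r_dc = 500.0  # example: 500 ohms measured with an ohmmeter
for pf in (0.1, 0.5, 1.0):
    print(f"PF={pf:>4}: {burden_va(r_dc, pf):6.1f} VA")
# PF= 0.1:    2.9 VA
# PF= 0.5:   14.4 VA
# PF= 1.0:   28.8 VA
```

In other words, the same ohmmeter reading could correspond to anything from ~3 VA to ~29 VA depending on the power factor you assume, which is why reporting ohms alone feels incomplete to me.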
Is there a voltmeter-ammeter method for this? For example (assuming a 14400-120 VT): disconnect the VT from the circuit, feed 120 VAC into the secondary circuit, put an ammeter in series on one leg and a voltmeter between the two legs. Volts x amps = VA. Would this work? Is it overkill? (Either way, NETA says "at the transformer terminals," and feeding 120 V into the secondary with the VT still connected would backfeed the primary, which would be unsafe.)
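Here's the arithmetic I'm picturing for that test, with made-up meter readings (so this is only a sketch of the method, not real data):

```python
# Sketch of the voltmeter-ammeter arithmetic for the test described
# above: VT disconnected, 120 VAC fed into the secondary circuit,
# ammeter in series on one leg, voltmeter across the two legs.
# Both readings below are invented for illustration.

v_meas = 119.4   # voltmeter reading across the two legs, volts
i_meas = 0.021   # ammeter reading in series with one leg, amps

s_va = v_meas * i_meas    # apparent power drawn by the burden
z_ohms = v_meas / i_meas  # magnitude of the burden impedance

print(f"Burden: {s_va:.2f} VA, |Z| = {z_ohms:.0f} ohms")
# Burden: 2.51 VA, |Z| = 5686 ohms
```

This gives both a VA figure and an impedance magnitude from the same two readings, which is part of why I'm not sure which one NETA wants reported.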
Most of the VTs we've worked on have had high accuracy/burden ratings, like 0.3W-Z (12.5-200 VA), in substations with only one or two relays on the VT circuit, where I'd guess the actual burden is more like 1 VA or less. The last thing I want to do is inject any kind of current into the voltage input of a protective relay. So how important is this test, really? Is the purpose just to make sure there are no shorts?
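For scale, a quick back-of-envelope check on those numbers (again assuming a 120 V secondary):

```python
# Back-of-envelope check: at 120 V, what current corresponds to the
# rated burden range (W = 12.5 VA, Z = 200 VA) vs. my ~1 VA guess?
V_SEC = 120.0
for s_va in (200.0, 12.5, 1.0):  # Z rating, W rating, my guess
    print(f"{s_va:6.1f} VA -> {s_va / V_SEC * 1000:6.1f} mA")
# 200.0 VA -> 1666.7 mA
#  12.5 VA ->  104.2 mA
#   1.0 VA ->    8.3 mA
```

If the real burden is down around 8 mA, it's hard to see what the measurement proves beyond continuity and the absence of a short.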