Originally Posted by KLLU26
I've been testing for about 4 months now, and one question I keep asking (and haven't gotten a definitive answer to) pertains to the recorded times of the long-time delay during primary injection testing of a molded-case Square D circuit breaker.
While testing A phase at X current, we would record roughly 60 seconds.
B phase would drop to around 35-40 seconds,
and C phase would be similar.
Now, I'm familiar with the fact that the trip time decreases as the breaker heats up, so my question is about the value that we record. In my mind, under a normal load the breaker will obviously be warmer, so during an overload it would most likely trip around the 35-40 second mark. So why do we record, and expect to see, values around 60 seconds from a cold test? I think it would be more accurate to record and expect the 35-40 second reading, since that reflects the real-world situation.
Is this just part of the manufacturer's specification, or of the NETA standards?
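
To make the cold-versus-warm reasoning concrete, here is a minimal sketch assuming a simple first-order thermal element that integrates heating toward a trip threshold. The time constant and current multiples are made-up illustration values chosen only to land near the 60 s and 35-40 s readings above; they are not Square D or NETA data.

```python
import math

# A first-order thermal element: tau * d(theta)/dt = (I / I_pickup)^2 - theta,
# tripping when theta reaches 1. This is a simplification of a real bimetal
# long-time element; TAU and the multiples below are illustration values only.
TAU = 500.0  # assumed thermal time constant, seconds

def trip_time(multiple, preload_multiple=0.0, tau=TAU):
    """Seconds to trip at `multiple` x pickup current, starting from the
    steady-state heat produced by `preload_multiple` x pickup current."""
    theta_final = multiple ** 2           # steady-state thermal level at the test current
    theta_start = preload_multiple ** 2   # heat already stored from the pre-load
    if theta_final <= 1.0:
        return math.inf                   # below pickup: the element never trips
    # Solve theta(t) = theta_final + (theta_start - theta_final)*exp(-t/tau) = 1 for t
    return tau * math.log((theta_final - theta_start) / (theta_final - 1.0))

cold = trip_time(multiple=3.0)                        # cold breaker, like the first (A-phase) shot
warm = trip_time(multiple=3.0, preload_multiple=0.6)  # element pre-heated by ~60% of pickup
print(f"cold trip time: {cold:.1f} s")  # ~59 s with these assumed values
print(f"warm trip time: {warm:.1f} s")  # ~38 s, the same pattern as the B/C phase readings
```

With these assumed values, the same element trips in roughly 59 seconds from cold but around 38 seconds when it starts out pre-heated, which mirrors the drop seen between the A-phase shot and the later B/C-phase shots.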