PDP-11/70 Boards

wrcooke at wrcooke.net
Wed Dec 8 19:28:38 CST 2021


> On 12/08/2021 5:14 PM Rob Jarratt <robert.jarratt at ntlworld.com> wrote:


> The problem is that it isn't marked with a wattage, just a current, which left me wondering at what voltage. Although Will Cooke's response seems to be that the voltage doesn't matter, so at 240VAC it would be 600W. Can that be right?


My previous response was a bit misleading and not completely accurate. Sorry. I was trying to rush out the door.


Variacs are rated for a maximum current; in the stated case, 2.5 amps. They can provide that current at any output voltage, so the maximum wattage (actually volt-amps, which is a bit different) is 2.5 amps * the highest output voltage. (Some may be rated for that current only when the output voltage does not exceed the input voltage; most can go about 20% higher than the input.) They can supply that same current at any voltage UP TO that maximum output voltage.

In theory, they could provide higher current at lower output voltages. The magnetic flux is proportional to current times turns (of wire on the core), and a lower output voltage implies fewer turns. BUT the rest of the transformer is not designed for that higher current, so it is a bad idea to try to pull more than the rated current. Pulling the rated current at any output voltage up to the input voltage, however, is fine.
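
As a rough back-of-envelope check, here is a minimal Python sketch of that rating logic. The 2.5 A figure and the voltages are just the example numbers from above, not anything specific to a particular variac:

    # A variac is current-limited, so the volt-amps it can deliver
    # scale with whatever output voltage you set.
    RATED_CURRENT_A = 2.5   # nameplate rating from the example above

    def max_va(output_voltage_v, rated_current_a=RATED_CURRENT_A):
        """Maximum apparent power (VA) available at a given output voltage.

        The current limit is fixed; don't exceed it even at low output
        voltages, since the rest of the winding isn't sized for more.
        """
        return output_voltage_v * rated_current_a

    for v in (24, 120, 240):
        print(f"{v:5.0f} V out -> {max_va(v):6.1f} VA max")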


You can get a pretty good idea of how much power a variac (or any transformer) can handle from its weight and size. The 3 amp variable transformer I have on my bench weighs about 25 pounds. It has a 120V input, so that is about 360 volt-amps. One that can handle 600 volt-amps will weigh around twice as much (the relationship isn't linear, though).


Why do I keep saying volt-amps instead of watts? Watts are real power; volt-amps are apparent power. If there are capacitors or inductors in the circuit, the current and voltage will be out of phase. That phase shift means the current and voltage won't peak at the same time, so the real power consumed is less than it "appears" to be from multiplying volts * amps. But the core and windings of the transformer still have to handle the apparent power, even though the real power used is less. The real power is the apparent power times the cosine of the phase angle between voltage and current (the power factor).
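
A quick illustration of that relationship; the voltage, current, and 53-degree phase angle here are assumed example values, nothing measured:

    import math

    # Apparent power is what the transformer windings must carry;
    # real power is what the load actually dissipates.
    volts = 120.0       # RMS voltage (example value)
    amps = 2.5          # RMS current (example value)
    phase_deg = 53.0    # assumed phase angle between V and I

    apparent_va = volts * amps                                 # 300 VA
    power_factor = math.cos(math.radians(phase_deg))           # ~0.60
    real_w = apparent_va * power_factor                        # ~180 W

    print(f"apparent: {apparent_va:.0f} VA, real: {real_w:.0f} W, "
          f"power factor: {power_factor:.2f}")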


I'm not familiar with the DEC power supplies, but they almost certainly have capacitor-input front ends. That means a phase shift between voltage and current, which means your input source will need to provide more apparent power than the real power consumed. A rough approximation is to double the input rating relative to what the supply outputs, assuming reasonably efficient supplies. So if one of those DEC supplies provides, say, 5 volts at 10 amps, that is 50 watts output. I would start with a variac (or whatever) that can supply 100 volt-amps at the input voltage. So, for a 20V input it would need 5 amps.
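
That sizing rule as a sketch; the 2x fudge factor is the rough approximation above (covering both power factor and supply inefficiency), and the 5 V / 10 A / 20 V figures are just the example numbers:

    # Rule-of-thumb input sizing for a supply with a
    # capacitor-input rectifier, per the 2x approximation above.
    FUDGE = 2.0   # covers power factor + supply inefficiency

    def input_va(out_volts, out_amps, fudge=FUDGE):
        """Apparent power the variac should be able to source."""
        return out_volts * out_amps * fudge

    def input_current(out_volts, out_amps, in_volts, fudge=FUDGE):
        """Current drawn from the variac at a given input voltage."""
        return input_va(out_volts, out_amps, fudge) / in_volts

    # Example from above: 5 V at 10 A out, fed from a 20 V input.
    print(input_va(5, 10))           # 100.0 VA
    print(input_current(5, 10, 20))  # 5.0 A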


Will

