What does 40VA rating for transformer mean?
I am working on a circuit for a homemade home security system. I bought the following wall transformer to power my circuit. I was going to use it with a 5 VDC linear voltage regulator to get 5 volts for my digital circuitry.
I couldn't find any documentation on this transformer. When I plug it into the wall and measure the voltage at the output I get zero volts. I tried attaching it to a couple different loads (with fairly small resistance) but I still couldn't measure any voltage at the output. I realize this is probably because the transformer is not regulated and thus requires a load for the output to be 12V but I am still not sure if I can use this transformer with my design. Does this thing only output 12VDC if it has the exact right load? Or since I will be connecting this to a linear voltage regulator will that handle it all for me? The total current draw of my circuit will typically be less than 200mA unless the alarm is sounding, and then it will go up some.
Also, I was planning on using the 12V from the transformer to power the horn(siren) directly and switch it on with a power Mosfet. The 12 Volts will not be regulated so do I need to add a big resistor in series with the horn so that it will have the proper 12 Volts across it?
I guess I should have just spent a little more and got a regulated supply!
Open-circuit (no load) transformers usually don't show 0V when the primary has power; they usually ride *higher* than the rated voltage.
That transformer is 12 VAC. Were you planning on connecting the linear regulator directly to the AC, or rectifying it first?
Those 12V are the effective AC voltage. To get the peak voltage you need to multiply by the square root of 2, which results in about 17V.
40VA means that at an effective voltage of 12V it can handle an effective current of 40VA / 12V = 3.3A, which should be enough (though you neglected to tell us how much current the horn needs).
You need a rectifier and a smoothing capacitor in front of the linear regulator. The rectifier will reduce the voltage by about 1.4V, so at the input of the linear regulator you will have a little more than 15V, with some ripple depending on the capacitor and the current drawn.
At 200mA and 10V difference between input and output your linear regulator will need to get rid of 2W of power, so it will need some cooling.
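To make the arithmetic above easy to check, here's a quick back-of-the-envelope sketch in Python. The 1.4V bridge drop and 200mA load are the ballpark figures from this answer, not measured values:

```python
import math

V_RMS = 12.0        # transformer secondary voltage, RMS
VA = 40.0           # transformer rating
V_OUT = 5.0         # regulator output
I_LOAD = 0.2        # typical circuit draw, amps
V_BRIDGE = 1.4      # two diode drops in a full-wave bridge

v_peak = V_RMS * math.sqrt(2)       # ~17 V peak
i_max = VA / V_RMS                  # ~3.3 A available
v_dc = v_peak - V_BRIDGE            # ~15.6 V after the rectifier (ignoring ripple)
p_reg = (v_dc - V_OUT) * I_LOAD     # ~2.1 W dissipated in the regulator

print(f"peak: {v_peak:.1f} V, max current: {i_max:.1f} A")
print(f"DC input: {v_dc:.1f} V, regulator dissipation: {p_reg:.1f} W")
```

Under full load the smoothing capacitor won't quite hold the peak, so treat these as upper-bound estimates.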
As for the horn, it depends on the horn's specification, i.e. how precisely it needs those 12V. If it can tolerate some ripple you could use five 1N4004 diodes in series (5 * 0.7V = 3.5V, 15.5V - 3.5V = 12V) to lower the voltage. This is better than a resistor, because the voltage across a diode depends much less on the load current than the voltage across a resistor does.
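The diode-string sizing above can be sketched numerically. The 0.7V per 1N4004 is a nominal figure; the real drop depends on the horn's current:

```python
# Dropping the ~15.5 V rectified supply to ~12 V for the horn
# using series diodes at ~0.7 V each (nominal 1N4004 drop).
V_SUPPLY = 15.5
V_DIODE = 0.7
V_TARGET = 12.0

n_diodes = round((V_SUPPLY - V_TARGET) / V_DIODE)  # 5 diodes
v_horn = V_SUPPLY - n_diodes * V_DIODE             # ~12.0 V at the horn
print(f"{n_diodes} diodes -> {v_horn:.1f} V at the horn")
```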
40 VA is "Volts times Amps," which is typically close enough to 40 Watts for hobbyist purposes. The reason they say "VA" instead of "W" is that if your voltage lags (or leads) current, you get (VA) times (cosine of the angle between the V and the A), instead of just VA. If V and A are in phase (pure resistive load), then VA = Watts.
To amplify that point: if voltage and current are out of phase, then during part of each cycle the transformer will supply excess power, and during another part it will receive power back from the load and send it back to the power company. A measurement in "watts" would subtract the returned power from the supplied power, but the transformer sees power transfer in both directions as a burden.
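The VA-versus-watts relationship is just the cosine factor described above. A small sketch, using a few arbitrary phase angles for illustration:

```python
import math

VA_RATING = 40.0  # apparent power the transformer can handle

for phi_deg in (0, 30, 60):                 # phase angle between V and I
    pf = math.cos(math.radians(phi_deg))    # power factor
    watts = VA_RATING * pf                  # real power delivered
    print(f"phase {phi_deg:2d} deg: power factor {pf:.2f}, {watts:.1f} W real")
```

At 0 degrees (pure resistive load) the full 40 VA is 40 W of real power; as the phase angle grows, the real power shrinks even though the transformer is just as loaded.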
Transformers inherently handle only AC. Unless your transformer is just plain broken (probably not), it is putting out somewhere around 12V AC as specified. If you're reading this with a DC voltmeter, it will read 0V since that is the DC average. Change the voltmeter to the AC Volts scale, and you should see somewhere around 12V, probably a little higher.
The AC voltage needs to be "rectified" to make it DC. A quick and dirty method uses a single diode and output capacitor. That basically passes the positive parts of the AC cycle and blocks the negative. It will have some ripple, but will work since you're going to regulate it down to a much lower voltage. A better method is something called a "full wave bridge". This uses 4 diodes but harnesses both halves of the AC cycle to contribute to your DC power voltage. Basically, a full wave bridge takes the absolute value of the AC voltage.
As Starblue mentioned, either way the DC voltage will come mostly from the peaks of the AC voltage. These are sqrt(2) higher than the RMS voltage. The 12V spec will also be under maximum load, so the actual voltage will be a bit higher than 12V * sqrt(2).
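The half-wave versus full-wave difference is easy to see numerically. A quick sketch that samples one AC cycle and rectifies it both ways (ideal diodes, no forward drop, no smoothing capacitor):

```python
import math

V_RMS = 12.0
v_peak = V_RMS * math.sqrt(2)

# Sample one AC cycle at 100 points and rectify it both ways.
samples = [v_peak * math.sin(2 * math.pi * t / 100) for t in range(100)]
half_wave = [max(v, 0.0) for v in samples]   # single diode: keep the positive half
full_wave = [abs(v) for v in samples]        # bridge: absolute value

# Average (unsmoothed) DC level each method delivers
avg_half = sum(half_wave) / len(half_wave)
avg_full = sum(full_wave) / len(full_wave)
print(f"half-wave average: {avg_half:.1f} V, full-wave average: {avg_full:.1f} V")
```

The bridge delivers twice the average DC level, which is why the smoothing capacitor for a full-wave design can be much smaller for the same ripple.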
2W is a lot of heat to get rid of in the regulator. That's too much for a TO-220 package in free air, and will require a heat sink. Alternatively, you could use a switching regulator if you feel comfortable with such things. It's more advanced, but is a worthwhile thing to learn.
Oh my. I can't believe I didn't notice the output was AC! Probably because I've never seen a transformer with a 12VAC output before so I wasn't looking that closely. Thanks for the tips.
I've never seen a transformer with DC output - I'm surprised that you've never seen one with AC output. Transformers work because the induced voltage is a function of the *change* in current. When driven with DC, the magnetic field stops changing after the initial impulse, and the winding behaves like a short circuit. *All* transformers are AC!
@Kevin Vermeer - I think he probably means *Wall-Wart style* transformer (the kind which generally has rectification built in). There are AC versions available, but they are not that common.