
Differences between Watts and Volt Amps
by Editorial Staff
As you would know, electrical products generally indicate both watts and volt amps to show how much power and current they draw.

So herewith a quick summary of the differences, with some interesting alternative calculations to work out total VA (I am waiting for the deluge of critiques from my electrical brotherhood).

Real Power – Watts

Real power is measured in watts (or W). One watt is the consumption or generation of energy at the rate of one joule per second. This is what you as a consumer generally pay your electrical utility for, in kilowatt-hours (a 60 W light bulb left on for 10 hours consumes 0.6 kWh).

We have a surprisingly wide range of kilowatt-hour charges throughout the world. For example (for 2011, in US cents), in India it is 8c/kWh against Denmark where it is 41c/kWh (perhaps – and I am speculating here – they are paying for the enormous investment in wind energy). Other interesting ones are Australia (29c/kWh), the UK (20c/kWh), and South Africa and Canada (10c/kWh). People building aluminium smelters would consider these numbers, as they would make a big difference to a business case.
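
As a rough sketch of what those tariffs mean in practice, the Python snippet below (using the 2011 rates quoted above, in US cents per kWh) works out what the 60 W bulb example would cost to run in each country. It is illustrative only; real bills include fixed charges, taxes and tiered rates.

# Rough cost comparison using the 2011 tariffs quoted above (US cents per kWh).
tariffs_c_per_kwh = {
    "India": 8,
    "South Africa": 10,
    "Canada": 10,
    "UK": 20,
    "Australia": 29,
    "Denmark": 41,
}

power_w = 60   # 60 W light bulb
hours = 10     # left on for 10 hours
energy_kwh = power_w * hours / 1000   # = 0.6 kWh

for country, rate in tariffs_c_per_kwh.items():
    cost_cents = energy_kwh * rate
    print(f"{country}: {cost_cents:.1f} US cents for {energy_kwh} kWh")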

Watts are calculated by W = volts (rms) x amps (rms) x cos (phi), where phi is the angle between the current and the voltage for ac circuits (cos (phi) is often referred to as the power factor). Rms volts refers to root mean square voltage (for a sine wave, the peak voltage divided by the square root of 2).

For dc circuits, this simply becomes W = V (dc) x I (dc).

When calculating the real power for multiple devices, you simply add the watts for each appliance.
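
As a minimal sketch of those formulas (the appliance voltages, currents and phase angles below are made up purely for illustration), in Python:

import math

def real_power_w(v_rms, i_rms, phi_rad=0.0):
    # Real power in watts: W = Vrms x Irms x cos(phi).
    # For dc, phi is zero and this reduces to W = V x I.
    return v_rms * i_rms * math.cos(phi_rad)

# Hypothetical appliances: (Vrms, Irms, phase angle in radians)
appliances = [
    (230, 0.5, 0.0),               # purely resistive load
    (230, 2.0, math.radians(30)),  # motor with a lagging current
    (230, 0.1, math.radians(60)),  # small power supply
]

# Real power for multiple devices is simply the sum of the individual watts.
total_w = sum(real_power_w(v, i, phi) for v, i, phi in appliances)
print(f"Total real power: {total_w:.1f} W")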

The measurement of watts does require specialized equipment, since both voltage and current need to be measured over a specific time. A standard multimeter is not much help here.

Apparent Power – Volt-Amperes (VA)

VA = volts (rms) x amps (rms)

or, for dc, VA = V (dc) x I (dc)

Volt amps are very important for calculating current draw (and are essential to know when sizing cables). So to work out the current draw of a device, you simply take the VA and divide by the rms voltage.

For example, say you have a device which is rated at 500 VA (the maximum VA the device will draw). If it is supplied by a 230 Vrms ac line, you would calculate the maximum current as 500 VA / 230 Vrms = 2.2 amps (rms). You must ensure your wires and associated circuits can cope with 2.2 amps (rms).
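
A quick sketch of that cable-sizing calculation, using the 500 VA and 230 Vrms figures from the example above:

def max_current_a(apparent_power_va, v_rms):
    # Maximum rms current drawn by a device: I = VA / Vrms.
    return apparent_power_va / v_rms

# Device rated at 500 VA on a 230 Vrms supply.
print(f"Maximum current: {max_current_a(500, 230):.1f} A rms")   # about 2.2 A rms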

Adding VA

Unfortunately (apart from direct current circuits), you can't simply add the VA ratings of devices to come up with the total VA rating. This is because the currents for each device are not necessarily in phase with each other.

But – importantly – you can add up the individual VA ratings to get a conservative figure, since the actual total (calculated correctly) will always be less than or equal to this value.
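
To make the point concrete, the sketch below compares the conservative figure (a straight sum of VA ratings) with the total worked out from complex power, for a few hypothetical loads whose VA ratings and phase angles are assumed purely for illustration:

import cmath, math

# Hypothetical loads: (VA rating, angle in degrees between current and voltage)
loads = [
    (500, 0),    # resistive load
    (300, 45),   # partly inductive load
    (200, 60),   # more strongly inductive load
]

# Conservative figure: simply add the VA ratings.
conservative_va = sum(va for va, _ in loads)

# Total from complex power: add the phasors, then take the magnitude.
total_complex = sum(cmath.rect(va, math.radians(phi)) for va, phi in loads)
actual_va = abs(total_complex)

print(f"Conservative total: {conservative_va} VA")
print(f"Actual total:       {actual_va:.0f} VA (always <= the conservative figure)")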

Power Factor

Power factor is always between zero and 1 because real power (watts) is always less than or equal to apparent power (volt amperes). As you electrical types know, it is possible to have a voltage across a device (e.g. a capacitor) and to draw a significant current (and thus need to rate the cables correctly for this current) but to consume no energy (zero watts).
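
A small sketch of that idea: power factor is real power divided by apparent power, and for an ideal capacitor (current leading the voltage by 90 degrees) it drops to zero even though a substantial current flows. The 230 Vrms and 2 A figures below are arbitrary.

import math

def power_factor(v_rms, i_rms, phi_rad):
    # Power factor = real power / apparent power = cos(phi).
    real_w = v_rms * i_rms * math.cos(phi_rad)
    apparent_va = v_rms * i_rms
    return real_w / apparent_va

# Ideal capacitor on a 230 Vrms supply drawing 2 A rms (460 VA, essentially zero watts).
pf = power_factor(230, 2.0, math.radians(90))
print(f"Power factor: {pf:.2f}")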

As far as kWh are concerned, Earl Wilson remarked: "Benjamin Franklin may have discovered electricity, but it was the man who invented the meter who made the money."
