I didn't say it was a method of consumption; I know how electricity works. Amperage is a more accurate way of telling whether your PSU is keeping up with the load or not. Wattage = volts × amps. Most of a modern system's power comes from the 12V rail, which is rated for a certain number of amps.
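As a quick sketch of that formula (the 17A figure here is just illustrative, taken from the measured draw discussed below):

```python
def rail_watts(volts: float, amps: float) -> float:
    """Maximum power a rail can deliver: wattage = volts * amps."""
    return volts * amps

# A 12V rail rated for 17A can deliver at most 12 * 17 = 204W,
# regardless of the PSU's total wattage rating on the box.
print(rail_watts(12, 17))  # 204.0
```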
Source from wiki
Q: Hang on, isn't a PSU's wattage all that matters?
A: This was partially true for older computers, but there has recently been a big change: the big power users (video card, CPU) now all draw from the 12V rail, so that rail becomes the limiting factor rather than the wattage rating. Whereas before, the CPU drew from the 5V rail, and older PSUs were designed with that in mind. Try to run such an older PSU on a newer system and it often has issues, since it wasn't designed for an entirely different rail draw (almost everything off the 12V rail, instead of a mix).

Besides, most of today's systems have trivial wattage usage; see SilentPCReview (http://www.silentpcreview.com/article265-page1.html) for measured wattage draws of 6 different computers. The P4 dual core makes this obvious: even a quality 350W unit could power it, provided it had a big enough 12V rail, since the 12V rail draw was 17A but the total draw was only 223W (DC).

So wattage ratings are now generally a poor guide to how good a PSU is. The rough order of importance is: PSU brand, 12V rail capacity, needed connectors (24-pin in particular), group regulation or not (none preferred), then wattage. As a rule of thumb, unless the combined 12V rail capacity is at least 90% of the total wattage rating (e.g. 30A for a 400W PSU), the 12V rail, not the wattage rating, will be the limiting factor for modern systems.
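The 90% rule of thumb above can be sketched as a quick check (a rough illustration only, not a substitute for reading the PSU's actual label):

```python
def twelve_volt_limited(rated_watts: float, rail_amps: float) -> bool:
    """True if the 12V rail, not the wattage rating, is the likely
    limiting factor: combined 12V capacity (12 * amps) is below
    90% of the PSU's total wattage rating."""
    return 12 * rail_amps < 0.9 * rated_watts

# 400W PSU with a 30A 12V rail: 12 * 30 = 360W, exactly 90% of 400W,
# so it just meets the rule of thumb.
print(twelve_volt_limited(400, 30))  # False

# 400W PSU with only a 20A 12V rail: 240W < 360W, so the 12V rail
# runs out long before the wattage rating does.
print(twelve_volt_limited(400, 20))  # True
```

This matches the P4 example above: a 350W unit with a 17A 12V rail (204W) would fall short of the rule, which is why the rail capacity matters more than the sticker wattage.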