T Nation

Any Electrical Engineers on Here?


Are there any electrical engineers on here? I need opinions on something please, guys...


Just ask your question, there's plenty of smart people here. I do controls engineering so I might be able to help.


In the UK the supply voltage is typically 240V, but it can be as low as roughly 210-215V or as high as roughly 257V. It has since been declared as 230V, but the National Grid made no actual changes, so it still varies and is typically based around 240V.

I have seen a device which reduces the voltage by up to 30V in 5V increments, so for the sake of argument let's say it will put the typical household (we're talking a domestic/residential installation or light commercial) on 220V at the supply. The idea of this device is to reduce power consumption by reducing the voltage, and therefore the power consumed by appliances in the home/office.

I have heard conflicting opinions from various people 'in the know' (experienced and qualified in electrical engineering) on whether or not this will in fact save power.
It claims to save up to 17% on motorized equipment and 3-5% on IT equipment. Some people have said outright that it doesn't work: appliances like fridges and freezers will just have their motors run slower and stay on longer, consequently consuming the same amount of energy.
Others have said it will save energy, but the motors will run ever so slightly less efficiently, so things won't get quite as cool/cold, though this will hardly be noticeable. The manufacturers state clearly that it won't save money/energy on things like electric heaters, cylinders or appliances with heating elements in them.
Someone said the math all adds up but you can't be sure it will work until it's field tested. Someone else said it won't save a thing on computers or IT equipment.

I thought electrical engineering was a science and that the answer would be a pretty straightforward 'Yes, it works' or 'No, it doesn't'.
I appreciate that certain appliances must operate slightly differently, so you can't just say 'you save X on fridges, Y on freezers and Q on computers', but is the principle sound or not?


The answer is, it depends on HOW they lower the voltage. Remember that for a load drawing a constant amount of power, amperage and voltage are inversely proportional, so simply lowering the voltage will just raise the amperage. If they use an in-line resistor then it really won't do shit other than generate a little heat (which would lower the voltage on the line, but do NOTHING to save energy). However, if they modulate the FREQUENCY, then it may provide some advantages, particularly on motor loads. So this begs the question: are they actually lowering VOLTAGE or are they modulating FREQUENCY? I would bet on the latter.
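To make that concrete, here's a minimal Python sketch (assumed numbers, two idealised load types) contrasting a constant-resistance load, where current and power both fall with the voltage, with a constant-power load, where the current rises as the voltage falls:

    # Assumed figures for illustration only.
    V_NOMINAL = 240.0   # volts, typical UK supply
    V_REDUCED = 220.0   # volts after the step-down device

    # Constant-resistance load (e.g. a filament lamp or heating element).
    R = 57.6            # ohms, chosen so the load draws 1000 W at 240 V
    for v in (V_NOMINAL, V_REDUCED):
        i = v / R                 # Ohm's law: current falls with voltage
        p = v * i                 # P = V^2 / R, so power falls too
        print(f"resistive      {v:5.0f} V -> {i:5.2f} A, {p:7.1f} W")

    # Constant-power load (e.g. a regulated supply that must deliver 1000 W).
    P_FIXED = 1000.0
    for v in (V_NOMINAL, V_REDUCED):
        i = P_FIXED / v           # I = P / V: current rises as voltage falls
        print(f"constant-power {v:5.0f} V -> {i:5.2f} A, {P_FIXED:7.1f} W")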

Many commercial and industrial buildings use Variable Frequency Drives to help run the motors in the HVAC system and other BAS components more efficiently. The thing is that these VFDs need to be programmed and calibrated fairly specifically in order to work optimally. I'm not sure that a device you just "plug in" would be smart enough to do that.

As for motors running more or less efficiently, remember the purpose of a motor is to achieve a certain RPM - that's it. HOW much power it draws (in watts or kilowatts) while doing so will determine whether the device works.

Perhaps you could give a little more detail on the product and I could get into it a little deeper.

And no, I'm not an electrical engineer, but before I got into the mortgage business I was a very well qualified commercial electrician (5 years of school plus a bunch of other courses). So I am qualified to answer your question.


One more thing - if it IS actually lowering voltage on a motor load, that motor WILL struggle to maintain the RPM it was designed to run at, thus consuming more amperage to do so. It is a bad idea to increase the amperage on motor windings, as it will generate heat and cause the motor to burn out prematurely.


The idea of lowering voltage to save energy has been around for a long time. The thought behind it is that power = voltage x current (P = V x I), so if you lower your V, P will decrease with it. However, for a constant load (power), you will increase your current if you lower your voltage. The idea works for things like incandescent lighting (at low voltage the bulb will still light, just not as brightly, thus using less energy), but not so much for motors and other devices. Operating some motors at or near the low end of acceptable voltage can also decrease their efficiency, depending on the design.

Also, by lowering your voltage you will increase the line losses, since losses = current^2 x resistance, and by lowering the voltage you are increasing the current. This does not apply as much to commercial/residential wiring but more to large-scale industrial/utility systems. By decreasing the source voltage (thus increasing line current) you will also increase your voltage drop on the line between the source and the load (voltage drop increases with increased line current).
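A quick Python sketch of that effect, with an assumed cable resistance and an assumed constant-power load, just to show which way the numbers move:

    R_LINE = 0.5       # ohms of cable between source and load (assumed)
    P_LOAD = 2000.0    # watts delivered to a constant-power load (assumed)

    for v_source in (240.0, 220.0):
        i = P_LOAD / v_source      # current rises as the source voltage falls
        loss = i ** 2 * R_LINE     # line losses = I^2 * R
        v_drop = i * R_LINE        # voltage drop along the line = I * R
        print(f"{v_source:.0f} V source: {i:5.2f} A, "
              f"{loss:5.1f} W lost in the cable, {v_drop:4.2f} V dropped")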

Engineering is not pure science but rather applied science and most solutions/designs are give and take.


Thanks guys, this is basically a step-down transformer (the secondary side being 30V lower) which you connect to the supply and run selected circuits through. The manufacturers claim that losses are minimal, like a few watts, as long as it is not overloaded.

And yes, this is purely designed to be used in a light residential setting. Its max loading is 20A (with an 80A capacity when in bypass, so at anything over 4.5kW it stops operating as a transformer), and so it is only designed to have a kitchen or utility circuit run through it, or a bank of PCs in an office, etc. When they claim 17% on motors I take that to mean on things like washing machines or fridge freezers. I appreciate the limitations of such a device but am wondering if it will in fact deliver any returns when used in its limited capacity.
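For what it's worth, the quoted figures are at least self-consistent; a quick sanity check in Python using the numbers above (with 230V assumed as the nominal supply):

    V_NOMINAL = 230.0            # volts, assumed nominal supply
    I_MAX = 20.0                 # amps through the step-down winding (quoted)
    P_BYPASS = 4500.0            # watts; above this the unit goes into bypass (quoted)

    p_max = V_NOMINAL * I_MAX    # about 4600 W
    print(f"20 A at {V_NOMINAL:.0f} V is roughly {p_max / 1000:.1f} kW, "
          f"so the {P_BYPASS / 1000:.1f} kW bypass threshold lines up")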


Thanks for your reply.

Does operating domestic appliances like fridges, freezers and washing machines at the low end of acceptable voltage really reduce efficiency to an unacceptable degree? Say the fridge will happily operate at 230V, the 'declared' national voltage, and a typical home has 245V. By reducing the voltage to 220V or even 215V, and assuming the volt drop is acceptable, i) will there be an issue with the appliance operating properly, and ii) will there be energy savings?

Also, assuming the volt drop to be acceptable, will operating a bank of PCs at 215/220V be acceptable, and will energy be saved in comparison to operating them on 240-255V?

Thanks Waldo


A lot depends on the design of the appliance - there is an acceptable range that the utility has to provide (95%-105% of nominal is typical), so if 230 V is the nominal voltage and your devices are designed/rated for 230 V, then they can probably accept voltage from 215 V to 250 V without issue (again, it really depends on the design).

As far as the energy savings go, it really depends on the appliance. I am not too familiar with the types of motors used in most appliances, so I'm not sure if they are more of a fixed load (which would not provide any real energy savings operating at a lower voltage) or fixed current (which would offer energy savings by operating at a lower voltage).

I think most computers are fixed-power devices, so I don't think any energy savings could be realized by lowering the operating voltage on a bank of PCs (the computer PSUs are AC-DC converters and are rated at XXX watts - they will use those XXX watts regardless of the source voltage).


Generally, switching power supplies for PCs today have a wide input voltage range, and seem to have slightly higher efficiency at higher voltages compared to lower. It's not much, but it's measurable. (:


Thanks guys. I have actually ordered one, so I am going to run an experiment on my parents' utility circuit, which has two fridge freezers on it and nothing else. I am going to carry out a seven-day load study with a little gizmo from eBay, then repeat the study with the voltage regulator installed. If they claim a 17% reduction there should be a noticeable drop; I'd expect at least 10%. If it works I'll repeat the experiment in my friend's office to see how it affects his consumption. If it doesn't work then it's refund time LOL.
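Roughly, the before/after comparison I have in mind looks like this (the kWh figures below are made up purely to show the sum; the real ones will come from the logger):

    baseline_kwh = [1.42, 1.38, 1.55, 1.47, 1.40, 1.51, 1.44]  # week 1, no device
    reduced_kwh  = [1.35, 1.33, 1.49, 1.41, 1.36, 1.45, 1.39]  # week 2, device fitted

    baseline_avg = sum(baseline_kwh) / len(baseline_kwh)
    reduced_avg = sum(reduced_kwh) / len(reduced_kwh)
    saving_pct = 100.0 * (baseline_avg - reduced_avg) / baseline_avg

    print(f"baseline {baseline_avg:.2f} kWh/day, with device {reduced_avg:.2f} kWh/day")
    print(f"apparent saving: {saving_pct:.1f}%")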

I'll let you know how I get on...


I doubt you will save much. You may save some because things like light bulbs will be a little dimmer. However, regulated supplies on things like PCs will just demand more current at the lower voltage; your computer won't just "run a little dimmer".

In fact, most appliances like your dishwasher and refrigerator are going to be optimized for a specific voltage. For example, the compressor on your refrigerator should be set up to run efficiently at a very specific speed and duty cycle. Dropping the voltage may move the appliance into a less efficient duty cycle range (it could also slightly affect the life of the appliance). High and low pressure lines, heat exchangers and fans are all sized for a specific flow rate of refrigerant. Slowing down the compressor will probably consume more energy, because now everything in the system is not correctly spec'd (the same goes for your AC unit).

There are too many variables to really say whether you'll save power, but factoring in some power loss in your voltage converter as well, I'm betting you either won't see a difference or it will waste more energy than it saves.


Thanks DD. I'm more interested in savings on IT equipment, which the manufacturer claims to be 3-5%, but it's easier for me to test it on my parents' utility outlets. The thing is, the secondary side of the transformer won't lower the voltage to a level that is unacceptable by supply standards; 220V at the supply, or even 215V, is acceptable. I also read that a supply voltage exceeding 240V will reduce the life of any equipment designed for 240V, which is most of it over here, although some is 230V and some 250V.

The manufacturer has specified that there will be no savings on anything with a heating element.

The manufacturers claim the transformer losses are absolutely minimal, but we'll soon find out for sure...

Thanks again,



It will be interesting to see your results, but I'd guess you are looking at an increase in energy use for the very reasons that others have stated. If you stay within the 210-240 range you are unlikely to damage any motors, but they will typically be slightly more efficient at the higher voltage. As ac said, these motors will run at their nameplate RPM; they will just consume more current to do so. The only way to change the output RPM is by changing the frequency, but any newer residential appliance is likely already at its optimal RPM or is geared to be there.

Your computers are actually somewhat similar in this case. Their power supplies constantly monitor the secondary voltage to make sure it stays exactly the same regardless of the input voltage. So if you give them a lower primary voltage, they will simply consume more current to maintain the output, and typically with less efficiency.

My guess is that lighting is the only place you will see any energy savings, as the resistance will be much more constant, so the current will fall proportionately as you lower the voltage. But of course you could get the same savings much more easily with a lower wattage bulb.


There are a lot of variations in the appliance industry. Some use inverters to control variable-speed motors. Some inverters will let the RPM droop if there isn't enough power, but those motors aren't directly affected even by a frequency change in the supply power. Like I was saying, there are just way too many variables.


This should be an interesting test! So how are you going to measure the results? Do you have a power quality meter or something with CTs and voltage attachments? Or an Emon Dmon meter or something?

Since it is a freezer you are testing, my hypothesis is that you won't see much of a difference. The thermostat in the freezer will call for a certain temperature. With the motor driving the compressor at a reduced efficiency (due to the lower voltage) I think the unit will A) run higher amperage in an attempt to maintain the RPM necessary to drive the compressor and B) run longer to achieve the target temperature because it is operating "off-spec". Remember, with a T-stat involved, a certain amount of "work" HAS to be done...
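A back-of-the-envelope version of that "fixed amount of work" point, with assumed figures (not measurements): if the compressor's efficiency were unchanged, a lower power draw just stretches the run time and the daily energy comes out the same.

    REQUIRED_WH = 500.0    # electrical Wh/day the freezer needs at nominal efficiency (assumed)
    POWER_240V = 120.0     # watts drawn by the compressor at 240 V (assumed)
    POWER_220V = 110.0     # watts drawn at 220 V (assumed)

    for label, p in (("240 V", POWER_240V), ("220 V", POWER_220V)):
        hours = REQUIRED_WH / p            # run time stretches as power drops
        print(f"{label}: {hours:.1f} h/day, {hours * p:.0f} Wh/day")
    # Any real saving (or loss) comes from the compressor's efficiency changing
    # at the lower voltage, which is exactly what the test should show.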

My .02

Keep us posted, this is a fun question.


Thanks guys, I will keep you posted. I appreciate all the replies. I have also joined an electrical engineering forum but have yet to receive any replies; if I do, I will copy and paste them here for you guys to read.

I feel the best way to settle this is simply to conduct my own experiment, and I do have my reasons for doing so.

I've heard a fair bit of conflicting information now, though the general consensus from impartial people is pretty much along the lines of what you guys are saying. I spoke to a guy earlier with an engineering degree, and he was with a friend who is an electrical engineer (coincidentally - how handy was that? LOL), and they pretty much said the same as you guys.

I'm going for the fridge freezers partly for convenience, though also because there should be a larger, and therefore noticeable, saving - if there is one LOL.

I've ordered one of these from eBay:


I also have Fluke test equipment and a current clamp meter, so I'll record everything as comprehensively as possible.

My parents' utility room has a little sub-board on it with an individual circuit supplying only these two appliances. I'll record everything before and after and keep you guys posted. I'm more than happy to be proved wrong, and I respect your opinions, but just for my own satisfaction I actually want to be PROVED wrong in this case; nothing ventured, nothing gained LOL.

Let's see what happens...


Retired electronics engineer here. It is all about a thing called "power factor"; look it up on Wikipedia. There is a good explanation of the switched-mode power supplies that all IT equipment uses. Transformers are not very efficient due to reactive components and heat loss, as well as the magnetostrictive effect. With a purely resistive load (theoretically), e.g. a filament lamp, some saving could be achieved, although luminosity will be less, using either a triac circuit or a step-down trafo. Hope this helps a bit.
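As a small illustration of the power factor point (assumed figures): the same real power at a poorer power factor means more apparent power and more line current.

    V = 230.0          # supply voltage
    P_REAL = 150.0     # watts of real power drawn by a switch-mode PSU (assumed)

    for pf in (1.0, 0.95, 0.6):      # unity, PFC-corrected, poor power factor
        s = P_REAL / pf              # apparent power in VA: S = P / pf
        i = s / V                    # line current the supply has to deliver
        print(f"pf={pf:.2f}: {s:6.1f} VA, {i:5.2f} A for {P_REAL:.0f} W real")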


If you are going to run this energy test on something like a freezer, you need to take some things into account.

-You are going to have to log consumption electronically; immediate readings aren't going to mean much. Consumption needs to be measured over a number of cycles of the compressor.

-Use (opening and closing the door(s)) will dramatically impact energy use

-Ambient temperature

-Defrost cycles are huge energy drains. This may be hard to deal with. Different units use different methods for cycling the defrost: some will use sensors, on-time, the compressor's duty cycle, set temperature, etc., in algorithms to figure out when to defrost. If you capture a defrost cycle in one test and not the other, your results could be WAY off. The best way would be to average it out (which would probably take months per test); there's a rough sketch of that kind of averaging below. If you are going to run a shorter-term test, I'd recommend unplugging the freezer for a few minutes, plugging it in, and immediately starting the test. That should reset the defrost cycle (and the compressor cycle, which is also good). Bear in mind there may be a start-up protocol, which means your measurements won't be true for what the machine uses at steady state, but it should be consistent between tests.
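Here's that sketch - a rough Python example (the log format and numbers are assumed, not from your actual logger) of integrating raw samples into a kWh-per-day figure so the two runs can be compared over whole compressor cycles:

    def kwh_per_day(samples):
        """samples: list of (timestamp_in_hours, instantaneous_watts) tuples."""
        energy_wh = 0.0
        for (t0, w0), (t1, _) in zip(samples, samples[1:]):
            energy_wh += w0 * (t1 - t0)        # rectangle rule: watts x hours
        duration_hours = samples[-1][0] - samples[0][0]
        return (energy_wh / 1000.0) * (24.0 / duration_hours)

    # Hypothetical snippets of two week-long logs (hours, watts); a real log
    # would have a sample every minute or so and capture defrost spikes too.
    week_without = [(0.0, 95.0), (0.4, 0.0), (1.1, 95.0), (1.5, 0.0), (168.0, 0.0)]
    week_with    = [(0.0, 88.0), (0.4, 0.0), (1.2, 88.0), (1.6, 0.0), (168.0, 0.0)]

    print(f"without device: {kwh_per_day(week_without):.2f} kWh/day")
    print(f"with device:    {kwh_per_day(week_with):.2f} kWh/day")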

Good luck.


On that monitor you posted, an instantaneous kilowatt reading isn't going to tell you much for a freezer.