
Voltage Optimisation - worth it?

We currently use about 120,000 kWh/year (commercial site).
Does anyone have any experience of, or opinions about, Voltage Optimisation systems?

The basic idea, as I understand it, is that the National Grid supplies sites at anywhere between 230V and 242V, whereas devices made after 1993 are rated to run on 210-220V. The supplier comes and installs a VO unit which stabilises the voltage at 220V, saving you anywhere between 10% and 16% off your bill.

The payback period is supposedly 2-3 years, which sounds almost too good to be true. Almost as good as the switch to LEDs.

Thanks for your input!

Comments

  • The short answer is: it depends on what kind of equipment you're using, but if it's standard office equipment (computers, photocopiers, lighting, etc) then probably not. You pay for your electricity in kilowatt-hours, not volt-hours or ampere-hours.

    To explain this, imagine a machine that needs water instead of electricity, so instead of drawing, say, 1 kilowatt of electricity it draws 1 cubic meter of water per second. If you've got a pipe with a cross-section of 2 square meters then the water would need to flow at 0.5 meters per second to meet your needs. If the pipe's cross section was 0.5 square meters the flow would need to be 2 meters per second and so on.

    Cross Section × Flow Rate = Water Use

    In this analogy, the cross-section of the pipe is like the voltage and the flow rate is the current (measured in amperes). Multiply the volts by the amperes to get the total rate of energy use (measured in watts).

    Voltage × Current = Rate of Energy Use (Power)

    Since you're paying for the rate of use multiplied by how long it was used for (one kilowatt of use for one hour is one kilowatt-hour), the voltage on its own doesn't make any real difference. If you drop the voltage, the current increases to make up the difference. If a machine needs one kilowatt to run, it'll just pull whatever current it needs, at whatever voltage it's given, to get that (there's a small worked sketch at the end of this comment).


    Now, with all that said, there is another consideration: some very specialised or industrial equipment might need a very specific voltage range to work optimally (certain types of AC motor can be very tightly calibrated, for example). If your business has a particular type of equipment that makes up a large chunk of your energy use I'd recommend speaking to the manufacturer who would be able to give you a better answer.
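
    As a quick illustration of the point above, here's a minimal Python sketch (mine, not from the thread - the 1 kW load, 8-hour day and the voltages are made-up illustrative figures): a constant-power load draws whatever current it needs at the voltage it's given, so trimming the voltage changes the amps but not the kWh that the meter records.

        # Illustrative only: a constant-power load (e.g. a PC power supply) pulls
        # whatever current it needs to hit its rated power, so the metered kWh
        # don't change when the supply voltage changes.

        def current_amps(power_w: float, voltage_v: float) -> float:
            """Current drawn by a constant-power load: I = P / V."""
            return power_w / voltage_v

        def energy_kwh(power_w: float, hours: float) -> float:
            """Billed energy depends only on power and time: E = P * t."""
            return power_w * hours / 1000.0

        LOAD_W = 1000.0   # a hypothetical 1 kW machine
        HOURS = 8.0       # one working day

        for volts in (242.0, 230.0, 220.0):
            amps = current_amps(LOAD_W, volts)
            print(f"{volts:5.0f} V -> {amps:4.2f} A, {energy_kwh(LOAD_W, HOURS):.1f} kWh")

        # The current column changes (4.13 A, 4.35 A, 4.55 A) but the kWh column
        # stays at 8.0 - and the kWh column is what the bill is based on.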
  • VNS
    edited July 9
    @rmuk, thank you for your reply!
    We're using computers and all that, but we do have two large walk-in fridges running 24/7.

    Our lighting is in the process of being switched over to LEDs on sensors, so hopefully there will be a significant saving there too.

    I have a few meetings coming up to explore Voltage Optimisation some more - let's see if it makes any financial sense!

    Thanks, VNS
  • I always thought inductive loads such as fridge motors could benefit from voltage optimisation? Depending on the size of the set-up here, it's still worth investigating. The analogy previously suggested really applies to loads that have to deliver a set amount of energy - heaters, for example: it takes a set amount of energy to heat a given amount of material, so if you dropped the voltage the element would simply run at lower power and take longer to get there. But other appliances perform exactly as intended with significantly lower voltages than the grid supplies... it's a complicated subject and you really need expert advice. Try to get a guarantee of some kind out of the equipment provider in terms of power usage reduction - obviously they have a vested interest in telling you what you want to hear and are likely to give you best-case scenarios to get you to sign on the dotted line!
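
    To put rough numbers on the difference between those two classes of load, here's a small Python sketch (my own illustration, with made-up ratings - a 3000W heater and a 500W server - and a hypothetical optimiser trimming 240V down to 220V): a purely resistive element loses power as the square of the voltage, while a constant-power load simply draws more current.

        # Illustrative comparison: how a purely resistive load and a constant-power
        # load respond when the supply is trimmed from 240 V to 220 V.

        NOMINAL_V = 240.0
        OPTIMISED_V = 220.0

        # Resistive load: fix its resistance from its 240 V rating,
        # then its power scales as V^2 / R.
        HEATER_RATED_W = 3000.0                     # hypothetical heater
        heater_r = NOMINAL_V ** 2 / HEATER_RATED_W  # R = V^2 / P

        # Constant-power load: the power supply regulates itself to its demand.
        SERVER_W = 500.0                            # hypothetical server

        for volts in (NOMINAL_V, OPTIMISED_V):
            heater_w = volts ** 2 / heater_r        # falls with the voltage
            server_a = SERVER_W / volts             # current rises instead
            print(f"{volts:.0f} V: heater {heater_w:.0f} W, "
                  f"server {SERVER_W:.0f} W at {server_a:.2f} A")

        # The heater drops from 3000 W to about 2521 W (~16% less instantaneous
        # power) - but if a thermostat or a fixed heating job sets how much energy
        # is needed, it mostly just runs for longer. The server's wattage doesn't
        # move at all; only its current does.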
  • edited July 14
    Things like computers are slightly more efficient at a lower voltage, though it's probably not much in the grand scheme of things. At work we generally use UPS equipment that steps the voltage down to 220V rather than the 238-252V we get from the grid. It saves a couple of watts on each server, but it's not about saving money so much as ensuring the power supply is always perfectly clean.
  • Thank you both for your replies! Much appreciated. We're still looking at it: Zenussi have provided their solar quote, throwing a VO unit in with it too. They have a vested interest, but my concern is my current equipment being damaged by any VO; we run walk-in fridges and also AC units all year round.
  • edited August 31
    I'd suggest a deeper investigation into the philosophy behind such devices, comparing it to the equipment you use and its patterns of operation. As far as I'm aware, the main way you save energy by reducing supply voltage is through any simple, largely resistive loads you have connected to the mains that run either continuously or on a set pattern - e.g. incandescent or halogen lighting, ventilation fans, vacuum cleaner motors, non-thermostat-controlled heating / cooling / AC and so on. Those all have power consumption that varies as the square of the input voltage: the current flowing through them is the voltage divided by their internal resistance, and power is voltage times current, so P = V²/R. There's no change to the duty cycle, or to that resistance, based on any influence the device has on its environment.

    Essentially, it's the same as replacing those lights, motors or heating elements with lower-wattage (higher resistance) types, but doing so on a bulk basis across your entire site. If you don't want to go to the trouble and expense of changing them all directly, and maybe want the ability to vary the line voltage if needs be (boosting the lights or ventilation back to original levels when the site is in heavy use, but cutting it back to a minimal level when essentially shut down), installing such a regulator could be a considerable money saver not just in terms of electrical units used but also the materials and work involved in actively downrating everything.

    If you're largely using anything that has a more sophisticated power supply or transformer/regulator between it and the mains, though, it'll be of little use. That covers any computer or similar electronic device you could name made in the last 25-30 years, just for starters, as well as most CFLs and LEDs (dimmables are a mixed bag - some won't show any change until the input voltage drops considerably, others will dim noticeably just in response to regular mains fluctuations - but in either case the relationship is under electronic control, not a simple linear change of current with voltage). The same goes for anything expected to do a particular amount of work regardless of supply voltage: a lift that has to move a certain weight through a certain vertical distance, electrically heated water boilers / immersion heaters, ovens / furnaces / heating systems or fridges / dehumidifiers / AC systems that maintain a particular internal temperature or humidity using a thermostat, arc welders, electromagnets, battery chargers, etc.

    All a voltage optimiser will really do for that kind of equipment is make some of it work harder, and possibly less efficiently or effectively if that pushes it into a 100% duty cycle or otherwise outside its normal operating parameters. Some of it will do the same job, and use the same power, but do it more slowly and create the temptation to replace it with higher-powered models anyway. Quite likely, for a lot of it you just won't see any functional difference, because the power supplies will adapt to provide exactly the same DC voltages and currents on their business side, just drawing a higher current from the mains to compensate for the lower voltage on that side, multiplying to the same power in the end.

    I have to qualify that with a warning: if the connected equipment already runs quite close to the rated limits of its supply cable or circuit, you may end up blowing fuses as the current ends up exceeding that limit. For cable and circuit safety it's usually current that's the issue rather than voltage - though it's (extremely high) voltage that causes arcing, it's current that causes potentially catastrophic heat build-up.

    The rule-of-thumb question is: when the local grid undergoes an obvious switch, with an up or down jump in voltage, can you detect a difference in the operation of the device, AND does it moderate its operation with any kind of feedback sensor affected by the machine itself? (Other than maybe a photodiode - you can't really influence sunlight the way you might an internal temperature, and the borderline effects don't change much between a 25W bulb and a 150W one because both are very dim next to the sun.) If the answer to the first part is yes, AND the second part is no, then it's worth doing. Otherwise, maybe not.

    E.g. do you notice a continually-lit (or PIR-triggered) lightbulb changing its brightness in response to the switch, or a continually running motor subtly changing its speed? Heating and cooling effects are rather harder to judge, but generally if there's a thermostat involved, the total amount of energy consumed won't change much in response to an altered input voltage; you'll just have to wait longer for the element / compressor to switch off each time it turns on (the sketch at the end of this comment runs some example numbers). There might be a slight saving from the heated/cooled enclosure spending more time closer to the temperature of the outside environment and so having a weaker energy gradient on average, but that essentially just points to it *not working as well as it should*.

    In all cases, if you have less power running through your meter, the equipment you power will be doing less work. The choice comes from whether or not all that work is being made use of, or whether a lot of it is essentially wasted, and if you have any other practical way of reducing that waste. If you have something that's running continually when it could actually be turned off half the time, you may be much better off installing some kind of simple switch on its circuit or replacing it with a radically more efficient alternative, instead of installing an expensive voltage chopper that will produce a more fractional saving, may have an untoward effect on other equipment, and has a nonzero line load itself.
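
    To put some made-up but plausible numbers on the thermostat point above, here's a deliberately simplified Python sketch (my own model, not measured data): a room that loses a fixed 1.5 kWh of heat per hour, held at temperature by a 3 kW (at 240 V) element under thermostat control. Lowering the voltage weakens the element, so it runs a higher duty cycle - and the daily kWh barely move until the element can no longer keep up at all.

        # Simplified thermostat model with assumed figures: the energy drawn over a
        # day is set by the room's heat loss, not by the element's instantaneous
        # wattage - a lower voltage just means a longer 'on' time per cycle.

        ROOM_LOSS_KWH_PER_H = 1.5    # assumed steady heat loss from the room
        ELEMENT_RATED_W = 3000.0     # element rating at 240 V
        RATED_V = 240.0
        HOURS = 24.0

        for supply_v in (240.0, 220.0, 200.0):
            element_w = ELEMENT_RATED_W * (supply_v / RATED_V) ** 2  # P = V^2 / R
            duty_cycle = min(1.0, ROOM_LOSS_KWH_PER_H * 1000.0 / element_w)
            energy_kwh = element_w * duty_cycle * HOURS / 1000.0
            print(f"{supply_v:.0f} V: element {element_w:.0f} W, "
                  f"on {duty_cycle:.0%} of the time, {energy_kwh:.1f} kWh/day")

        # 240 V, 220 V and 200 V all come out at 36.0 kWh/day; the element just
        # idles less (50%, 60% and 72% duty cycle). Only if the voltage fell far
        # enough to pin the duty cycle at 100% would the daily figure drop - and
        # at that point the room is simply colder than the setpoint.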
  • (cut from the other post to separate them out and prevent reading fatigue... it's a bit sprawling but it's hard for me to explain it better)

    For a domestic example of the fuse thing: a simple kettle or heater rated for 2950W at 240V (so 12.29A) on a 13A-fused circuit will cope fairly well with a supply in the 220 to 250V range, just producing between 2479 and 3201 watts at 11.27 to 12.80A, with a little leeway for overvoltage conditions to probably as much as 275-280V (a 13A fuse won't generally blow at 13.01A after all, and a heating element is much like a light bulb filament... it'll have a shorter lifespan with a higher energy output, but a fractional overload won't kill it straight away). Drop to 200 volts and it'll just wimp out, producing around 2049 watts of heat (at 10.24A, and probably lasting many years longer than expected). If you're trying to keep a steady temperature or boil water, that won't change your consumption (unless the output is no longer enough to reach the set temperature), because about as much energy will be needed to raise the air or water to the set point. Maybe useful for patio / beer garden / smoking shelter heaters though...

    In contrast, a server-class computer power supply on a cable that's got a marginal 3A fuse installed - or several of them, individually with 10A fuses, plugged into a 13 or 16A power strip that's already just below its rated current at normal mains voltage - could run into problems. It'll be rated to provide a certain amount of current to the actual electronics, at a range of closely controlled DC voltages, and it's generally designed to work with anything from about 90 to 270 volts (covering over- and under-voltage conditions on power systems rated for 100 to 250V), which it converts to the internal voltages using a "switch-mode" system. In simple terms, that means it just increases the duty cycle of a high-frequency transistorised switch connected to the mains side as power demand rises... or if demand stays the same but the input voltage drops. Higher duty cycle, higher current, as the circuit is connected for more of the time (with a capacitor inline to reduce the noise and spikes so produced). They're also often more efficient with higher input voltages, but we'll ignore that for now.

    If you have five of them drawing 2.5A each at a typical 235V (= 588 watts... servers are hungry beasts) connected into a six-socket power strip rated for 13A (and maybe a relatively low-wattage monitor, which can be switched to the output of any of them, plugged into the sixth socket), that's 2938W at 12.5A. If the mains voltage floats up to 250V... that'll still be about 2938W (and about 588 per cable), just now at 11.75A (and 2.35A per plug - current falling in inverse proportion to the voltage, as you'd expect for constant-power loads). But if it sags a similar amount to 220V, we're still drawing those same wattages, but the current is now 2.67A per cable - not too much of an issue, except that also means 13.35A through the power strip. That should be tolerated all right so long as it doesn't persist for too long and eventually returns to normal, but you don't really want to reduce the voltage much further... Take it as far as 200V, which would make for a nearly 30% energy saving with a continually-running resistive load, and (unless the machines are connected to a UPS or similar system that detects a potential brownout and automatically shuts them down - also something you probably want to avoid as much as possible...) the power supplies will still try to draw 588W each, and 2938W total. The load on each cable is nearing the maximum for the somewhat optimistic installed fuses, though not tripping them, but the total current through the power strip is now 14.70A - not including any additional devices that might have been connected to take up the ~60W of leeway that seemed to exist originally (there's a sketch after this comment that re-runs these numbers). Generally you'd kind of hope the strip's fuse would eventually blow at that current, even if it took a few minutes or even hours. Too many of those connected together on the same main circuit, and you could be regularly tripping out your breaker board at a reduced voltage when you wouldn't have done at normal line voltage.

    OK, that's a bit extreme, and you really shouldn't deliberately load any circuit up that close to its limit if you can avoid it (yes, individual devices that draw almost a full 13A exist, and rather more if hardwired, but if you have multiple lower-load devices you can spread them out better if you think things through), so normally lowering the voltage should be safe... But it does happen, and people will at some point end up flirting with whatever safety limits are put in place, either through carelessness, bravado, or just not having any other choice because they've been tasked with installing a greater density of hardware than was originally intended. And if your switching transformer is already operating close to its own limits anyway (some are rated, and internally fused, for different outputs depending on what mains standard they're connected to) you could end up blowing *that* instead.
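
    Here's the server-and-power-strip arithmetic from the comment above as a short Python sketch, so it's easy to re-run with different figures (the wattages and fuse ratings are the ones used in the example, not anything measured on a real site):

        # Five constant-power server supplies (about 588 W each, i.e. 2.5 A at
        # 235 V) sharing a 13 A power strip. The supplies hold their wattage, so
        # the current rises as the voltage falls.

        SERVER_W = 587.5       # 2.5 A * 235 V, as in the example above
        N_SERVERS = 5
        STRIP_FUSE_A = 13.0
        CABLE_FUSE_A = 3.0     # the 'marginal' per-cable fuse mentioned earlier

        for volts in (250.0, 235.0, 220.0, 200.0):
            per_cable_a = SERVER_W / volts
            total_a = per_cable_a * N_SERVERS
            strip_note = " <-- over the strip's rating" if total_a > STRIP_FUSE_A else ""
            cable_note = " (near its 3 A fuse)" if per_cable_a > 0.9 * CABLE_FUSE_A else ""
            print(f"{volts:.0f} V: {per_cable_a:.2f} A per cable{cable_note}, "
                  f"{total_a:.2f} A through the strip{strip_note}")

        # 250 V -> 11.75 A, 235 V -> 12.50 A, 220 V -> 13.35 A, 200 V -> 14.69 A.
        # The kWh through the meter never change, but below roughly 226 V the strip
        # is over its 13 A rating, and at 200 V each cable (2.94 A) is closing in
        # on its 3 A fuse as well.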
  • As for the one supplied with the solar equipment, I'd say that's more intended to keep a steady output from the solar system, or from that blended with the regular mains - particularly if you intend to export any of the power.

    The output of a solar panel varies directly with the intensity of light falling on it - strictly it's the current that scales with the light, with the voltage staying comparatively steady (I mean... maybe... it's a while since I revised any of that stuff... but the power output certainly isn't fixed and the panels don't produce a mains-like voltage). That voltage isn't necessarily anything close to that of the mains (12V nominal is common, so you'd need 19 or 20 panels in series for the peak output to match; a typical 8-panel string would only make 96V at full rated power), and moreover it's inherently direct current (...technically it's half-wave AC with a variable duty cycle and a 1/86400 Hz frequency, but that looks like DC to most hardware, such as transformers...).

    So to produce power from that which typical mains-powered equipment can use, you effectively have to feed it into something equivalent to a strange UPS running on non-rechargeable batteries: a box that can pull energy both from the 230-250-ish volt, 50Hz AC grid and from the maybe-100V-or-less DC panels, and synthesise a standardised output from the two. That output tends to be in the 220-230V range, because it's easiest to base it on the mains template with some allowance for losses, and it lets the unit ride out temporary mains sags and still produce a rock-solid output even when the panels are completely offline.

    I.e. a voltage optimiser, or something of a similar name and function, is pretty much essential for solar power unless you're charging a fairly forgiving set of deep-cycle storage batteries and running DC equipment from them. It has the side benefit of producing a steady, standardised output voltage regardless of the input, so long as sufficient power/current is available.

    As for why the mains generally comes in at a higher voltage than devices are rated for... that rating is only nominal, essentially a midpoint of the acceptable range of inputs. The resistance and cable losses (which cause voltage drops along the line) between the substation transformers and the device plug, including via any long, cheaply made extension leads, can't be entirely guaranteed, only estimated as being within a particular range. So the substation output will be somewhere around 240-250V, which "220V" or "230V" devices are expected to accept gracefully if they get the full whack, so that there's a good chance they'll still be provided with the baseline 220V at the far end of whatever cables are in the way, even if a good 10% of the total voltage (and possibly even more of the total power) goes missing along the way. At the extreme they may be expected to accept anything from 210 or a little less (I think I've seen 208 mentioned as a particular benchmark somewhere) up to as much as 245+, but the actual range will vary by device, and it's generally better to highball it: you can only sap voltage out of a line with flaky wire or insulation, not add it (unless you're actively feeding in), and the simple resistive devices that see increased current and power at a higher voltage tend to be a little more resilient in the face of that than sensitive electronics that see a higher current at a lower voltage.
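
    And a last rough sketch of that cable-loss point (the cable resistances and load currents here are pure assumptions for illustration, not typical measured values): the voltage a device actually sees is the sending voltage minus the I × R drop along whatever wiring is in the way, which is why the sending end is pitched high.

        # Illustrative only: device-end voltage = substation voltage minus the
        # I * R drop along the supply cable and any extension leads in between.

        SUBSTATION_V = 245.0

        # (label, assumed round-trip cable resistance in ohms, load current in amps)
        RUNS = [
            ("short run, light load", 0.2, 5.0),
            ("long run, 10 A load", 0.8, 10.0),
            ("long run plus cheap extension lead", 1.5, 13.0),
        ]

        for label, cable_ohms, load_a in RUNS:
            drop_v = load_a * cable_ohms          # volts lost in the cabling
            at_device = SUBSTATION_V - drop_v
            print(f"{label}: lose {drop_v:.1f} V, device sees {at_device:.1f} V")

        # With these assumed figures the device end sees anywhere from about 244 V
        # down to roughly 225 V - hence sending out 240-250 V so that the far end
        # still lands inside the range the equipment is rated to accept.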