I’d suggest a deeper investigation into the philosophy behind such devices, comparing it against the equipment you use and its patterns of operation. As far as I’m aware, the main way you save energy by reducing supply voltage is through simple, largely resistive loads connected directly to the mains that run either continuously or on a set pattern, e.g. incandescent or halogen lighting, ventilation fans, vacuum cleaner motors, non-thermostat-controlled heating/cooling/AC and so on. Their power consumption varies as the square of the input voltage: the current flowing through them is the voltage divided by their internal resistance, and power is voltage times current, with no change to the duty cycle or to that resistance based on any influence the device has on its environment.
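To put rough numbers on that square-law relationship, here’s a minimal Python sketch (the 100 W / 230 V / 220 V figures are purely illustrative, and it assumes an ideal fixed resistance with no control circuitry):

```python
# Sketch: power drawn by a purely resistive load at two supply voltages.
# Assumes an ideal fixed resistance (no thermostat, no electronic control).

def resistive_power(voltage_v: float, resistance_ohm: float) -> float:
    """P = V * I, with I = V / R, so P = V**2 / R."""
    return voltage_v ** 2 / resistance_ohm

R = 230 ** 2 / 100.0                   # resistance of a nominal 100 W load at 230 V (529 ohm)
p_nominal = resistive_power(230, R)    # 100.0 W
p_reduced = resistive_power(220, R)    # ~91.5 W

print(f"230 V: {p_nominal:.1f} W, 220 V: {p_reduced:.1f} W "
      f"({100 * (1 - p_reduced / p_nominal):.1f}% saving)")
```

So a roughly 4% drop in voltage gives something like an 8-9% drop in power for that class of load, which is where the claimed savings come from.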
Essentially, it’s the same as replacing those lights, motors or heating elements with lower-wattage (higher-resistance) types, but doing so in bulk across your entire site. If you don’t want the trouble and expense of changing them all directly, and maybe want the ability to vary the line voltage if need be (boosting the lights or ventilation back to their original levels when the site is in heavy use, but cutting them back to a minimal level when it’s essentially shut down), installing such a regulator could be a considerable money saver, not just in electrical units used but also in the materials and labour involved in actively downrating everything.
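The same arithmetic shows what that “bulk downrating” equivalence looks like in practice - a sketch with illustrative figures and a made-up equivalent_rating_w helper, assuming a 230 V nominal supply and purely resistive parts:

```python
# Sketch: running a resistive load at reduced voltage behaves the same as
# fitting a proportionally lower-rated part and running it at full voltage.
# Figures are illustrative only.

NOMINAL_V = 230.0

def equivalent_rating_w(rated_power_w: float, reduced_voltage_v: float) -> float:
    """Power the load actually dissipates at the reduced voltage, i.e. the
    rating of the replacement part that would behave the same at 230 V."""
    return rated_power_w * (reduced_voltage_v / NOMINAL_V) ** 2

for rating in (60, 100, 500):
    print(f"{rating} W part at 220 V behaves like a "
          f"{equivalent_rating_w(rating, 220):.0f} W part at 230 V")
# 60 -> ~55 W, 100 -> ~91 W, 500 -> ~457 W
```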
If you’re largely using anything that has a more sophisticated power supply or transformer/regulator between it and the mains (any computer or similar electronic device made in the last 25-30 years, just for starters, as well as most CFLs and LEDs; dimmables are a mixed bag, since some won’t show any change until the input voltage drops considerably while others dim noticeably in response to ordinary mains fluctuations, but either way the relationship is under electronic control, not a simple linear change of current with voltage), or that is expected to do a particular amount of work regardless of supply voltage (a lift that has to move a certain weight through a certain vertical distance, electrically heated water boilers/immersion heaters, ovens/furnaces/heating systems or fridges/dehumidifiers/AC systems that hold a particular internal temperature or humidity with a thermostat, arc welders, electromagnets, battery chargers, etc.), then it’ll be of little use. All it will really do is make some things work harder, and possibly less efficiently or effectively if that pushes them into a 100% duty cycle or otherwise outside their normal operating parameters. Some will do the same job, using the same energy, but do it more slowly and create the temptation to replace them with higher-powered models anyway. For a lot of them you most likely won’t see any functional difference at all, because their power supplies will adapt to provide exactly the same DC voltages and currents on the business side, simply drawing more current from the mains to compensate for the lower voltage, multiplying out to the same power in the end.
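For the regulated-supply case, here’s a minimal sketch of why the energy drawn barely changes - an idealised constant-power load, ignoring the small efficiency variations real PSUs show across their input range:

```python
# Sketch: a regulated (roughly constant-power) load such as a switch-mode
# power supply. Output power is fixed, so mains current rises as the supply
# voltage falls and the energy drawn stays essentially the same.

def mains_current(load_power_w: float, mains_voltage_v: float) -> float:
    """I = P / V for a load that regulates to a fixed output power."""
    return load_power_w / mains_voltage_v

for v in (240, 230, 220, 210):
    print(f"{v} V -> {mains_current(300, v):.2f} A for a 300 W load")
# 240 V -> 1.25 A ... 210 V -> 1.43 A: same power, just higher current.
```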
Which I have to qualify with a warning: if the connected equipment already runs close to the rated limits of its supply cable or circuit, you may end up blowing fuses as the current exceeds that limit. For cable and circuit safety it’s usually current that’s the issue rather than voltage - extremely high voltage is what causes arcing, but it’s current that causes potentially catastrophic heat buildup.
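A rough illustration of that headroom problem, with made-up figures for a circuit of constant-power loads already running near a 13 A fuse:

```python
# Sketch: checking whether constant-power loads on a circuit would push the
# current past the fuse/breaker rating once the supply voltage is reduced.
# Figures are illustrative, not from any real installation.

def circuit_current(total_power_w: float, voltage_v: float) -> float:
    return total_power_w / voltage_v

FUSE_RATING_A = 13.0
loads_w = 2900.0        # e.g. a circuit already running near its limit

for v in (240, 220):
    i = circuit_current(loads_w, v)
    status = "OK" if i <= FUSE_RATING_A else "OVER RATING"
    print(f"{v} V: {i:.1f} A ({status})")
# 240 V: 12.1 A (OK)  /  220 V: 13.2 A (OVER RATING)
```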
The rule-of-thumb question is: when the local grid undergoes an obvious switch with an up or down jump in voltage, can you detect a difference in the operation of the device, AND does it moderate its operation with any kind of feedback sensor affected by the machine itself? (Other than maybe a photodiode - you can’t really influence sunlight the way you might an internal temperature, and the borderline effects don’t change much between a 25 W bulb and a 150 W one, because both are very dim next to the sun.) If the answer to the first part is yes AND the second part is no, then it’s worth doing. Otherwise, maybe not.
E.g., do you notice a continually-lit (or PIR-triggered) lightbulb changing its brightness in response to the switch, or a continually running motor subtly changing its speed? Heating and cooling effects are rather harder to judge, but generally, if there’s a thermostat involved, the total energy consumed won’t change much with the input voltage; you’ll just wait longer for the element or compressor to switch off each time it turns on. There might be a slight saving from the heated or cooled enclosure spending more time closer to the outside temperature and so having a weaker energy gradient on average, but that essentially just means it isn’t working as well as it should.
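The thermostat case, in the same vein - a simplified sketch assuming a purely resistive 2 kW element and a fixed daily heat demand, ignoring any standing-loss changes from the longer warm-up times:

```python
# Sketch: why a thermostat-controlled heater gives little saving at reduced
# voltage. The element is resistive, so its instantaneous power falls, but
# the thermostat holds it on for longer to deliver the same amount of heat.

def on_time_hours(heat_needed_wh: float, element_power_w: float) -> float:
    return heat_needed_wh / element_power_w

HEAT_NEEDED_WH = 3000.0          # heat the thermostat calls for per day
ELEMENT_POWER_AT_230V = 2000.0   # nominal rating of the element

for volts in (230, 220):
    power = ELEMENT_POWER_AT_230V * (volts / 230) ** 2   # resistive scaling
    hours = on_time_hours(HEAT_NEEDED_WH, power)
    print(f"{volts} V: {power:.0f} W element on for {hours:.2f} h "
          f"-> {power * hours:.0f} Wh consumed")
# Both cases consume ~3000 Wh; only the on-time changes.
```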
In all cases, if less power is running through your meter, the equipment you power is doing less work. The question is whether all of that work is being made use of, or whether a lot of it is essentially wasted, and whether you have any other practical way of reducing that waste. If something runs continually when it could actually be turned off half the time, you may be much better off installing some kind of simple switch on its circuit, or replacing it with a radically more efficient alternative, rather than installing an expensive voltage chopper that will produce a more fractional saving, may have untoward effects on other equipment, and has a nonzero line load of its own.