Someone sent me an article from The Economist suggesting that a proposed new version of the USB standard, allowing higher power to be drawn from (or delivered to) the port, would revolutionise domestic house wiring and render the grid obsolete.
Hmm - much hype and minimal basic science understanding is about what you'd expect from a magazine called The Economist.
A couple of comments from my (probably limited) understanding, as there does seem to be a certain amount of hype in the article, and it needs cross-checking against the laws of physics.
The idea of having a separate low voltage DC ring main in a house for items which are designed to operate from low voltage (eg almost anything you have that runs off a battery) has been around for a long time. The problem from the laws of physics comes from the fact that the electrical power (in watts) is the voltage (in volts) multiplied by the current (in amps).
So an item that needs 60 watts of electrical power (eg this laptop) will require (draw) a current of a quarter of an amp at 240 volts, or 5 amps at 12 volts. In other words, a 12 volt ring main would need to be able to carry 20 times the current of your existing 240 volt ring main (it doesn't matter whether it's AC or DC). Once you start to put several devices on the same low-voltage ring, the current flowing through the wires quickly adds up.
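That sum is easy to check with a couple of lines of Python (the 60 watt laptop figure is the one from above):

```python
def current_amps(power_watts, voltage_volts):
    """Current drawn for a given power: I = P / V."""
    return power_watts / voltage_volts

laptop_watts = 60
print(current_amps(laptop_watts, 240))  # 0.25 A at 240 V mains
print(current_amps(laptop_watts, 12))   # 5.0 A on a 12 V ring
print(current_amps(laptop_watts, 12) / current_amps(laptop_watts, 240))  # 20.0 - twenty times the current
```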
That in itself would not necessarily be a problem, particularly as a lot of modern devices require much less power - LED lights are brilliant from this point of view - but anything that converts electricity into heat or physical motion (toasters, hairdryers, kettles, white goods, heaters, etc) is pretty much stuck with needing a given amount of power. These items probably make up the bulk of your electricity use.
The real killer, though, is that the cross-section of wire (conductor) required to carry a current without getting red-hot and melting grows roughly with the square of the current. So to deliver the same power at 12 volts as at 240 volts would require wires with a cross-sectional area 400 times greater (20 times the current, squared).
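That square law can be sketched in a couple of lines - for the same power delivered and the same resistive loss per metre of cable, the required cross-section scales with the square of the current:

```python
def area_scale_factor(v_old, v_new):
    """How much thicker the conductor must be (by cross-sectional area)
    when dropping from v_old to v_new at the same delivered power.
    Current scales as v_old / v_new, and area must scale with current squared."""
    current_ratio = v_old / v_new
    return current_ratio ** 2

print(area_scale_factor(240, 12))  # 400.0
```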
To put that a different way, the wire in your 13 amp ring main is specified to safely carry 13 amps (it'll get warm if you overload it) whatever the voltage. 13 amps at 240 volts is 3.12 kW of power - which is why your cooker has to be on a separate run of thicker (60 amp) wire from the fuse box: it is probably using getting on for 10 kW with everything turned on, at which point it may draw 40 amps or so.
Using the same wire for a 12 volt ring main, the maximum current you could safely carry would still be 13 amps, so the power you could deliver with it would only be 156 watts. Probably not enough to power a desktop computer (which will typically have at least a 300 watt power supply).
Your existing lighting circuits use cable specified for a 5 amp maximum load - about 1200 watts at 240 volts, or 60 watts at 12 volts - so a 12 volt distribution would probably be OK for the lighting circuits if your house only used LED lights (typically a 7-8 watt LED bulb has a similar light output to an old-style 60 watt filament bulb).
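Plugging the cable ratings above into the same P = I × V sum shows how little a low-voltage ring can deliver:

```python
def max_power_watts(rated_current_amps, voltage_volts):
    """Maximum power a cable can deliver within its current rating: P = I * V."""
    return rated_current_amps * voltage_volts

for name, amps in [("13 A ring main", 13), ("5 A lighting circuit", 5)]:
    print(f"{name}: {max_power_watts(amps, 240)} W at 240 V, "
          f"{max_power_watts(amps, 12)} W at 12 V")
# 13 A ring main: 3120 W at 240 V, 156 W at 12 V
# 5 A lighting circuit: 1200 W at 240 V, 60 W at 12 V
```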
Incidentally, this is why the national grid distributes electricity on its big pylons at extremely high voltages - 400 thousand volts. By doing so, the current required for a given power (in watts, or thousands of watts - kilowatts, kW - or millions of watts - megawatts, MW) is much lower, so the cables on the pylons don't have to be unfeasibly thick. It's also why the solar panels on your roof may be wired in 'series' to produce a higher voltage at the inverter input, so that you don't have to have wires the size of thick ropes coming down from the roof.
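The same I = P/V sum shows the scale of the effect. As a rough illustration (the 2 GW figure here is just an example of a large power station's output, not a number from the article):

```python
def current_amps(power_watts, voltage_volts):
    """Current required to carry a given power: I = P / V."""
    return power_watts / voltage_volts

station_watts = 2e9  # 2 GW - an illustrative figure for a large power station
print(current_amps(station_watts, 400_000))  # 5000.0 A at 400 kV grid voltage
print(current_amps(station_watts, 240))      # about 8.3 million amps at mains voltage
```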
The second area where I would ask for a quick sanity check is when the article refers to the inefficiencies of power supplies. Again, I am not a formal expert beyond A level physics and some practical electronics experience, but my understanding is that modern 'switched mode' power supplies are very efficient.
The power supply is the bit that takes the 240 volt mains (240 used to be the UK standard - I'm showing my age there; the nominal voltage has in fact been 230 volts since European harmonisation in the 1990s) and converts it into the lower voltage (eg 12 volts) required by portable devices and battery chargers. It may also convert AC to DC.
It may be built into the equipment, or - these days very often - be a separate small black box with a mains wire in and a low voltage wire out that connects to your device.
In the old days (pre 1980) power supplies were generally of a class known as 'linear' - they had a relatively big, heavy transformer inside that converted an AC high voltage to a lower one (also AC) through the use of magnetism. If a DC output was required then a separate component (a rectifier) made the AC into DC, and some big capacitors were required to smooth out any remaining bumps in the voltage. These linear power supplies were indeed pretty inefficient (largely due to losses in the transformer) - typically 50% efficiency would be average, and 67% would be a highly optimised and expensive design with carefully selected components. So with those devices up to half your power was indeed being wasted before it even got to the radio you were trying to turn on.
Since then, however, there has been a revolution in power supply design with the introduction of 'switched mode' power supplies. These do away with the big, bulky, inefficient transformer (they still have a little one, used for a different purpose, that doesn't affect the overall efficiency) and achieve typical efficiencies well in excess of 80%, with figures in the mid 90s being normal nowadays. They are also much smaller, cheaper and easier to produce, and don't generate as much heat (a symptom of waste) - they do have a couple of drawbacks, though, which make them unsuitable in some specialist applications.
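Using the efficiency figures above (50% for an old linear supply, mid-90s for a modern switched-mode one), a quick sketch of the wasted power for a device that needs, say, 60 watts at its output:

```python
def input_power_watts(output_watts, efficiency):
    """Mains power drawn for a given output: P_in = P_out / efficiency."""
    return output_watts / efficiency

output = 60  # watts the device actually needs
for name, eff in [("linear, 50%", 0.50), ("switched-mode, 95%", 0.95)]:
    p_in = input_power_watts(output, eff)
    print(f"{name}: draws {p_in:.1f} W, wastes {p_in - output:.1f} W as heat")
```

The linear supply draws 120 W and wastes as much as it delivers; the switched-mode one draws about 63 W and wastes only about 3 W.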
So nowadays there is much, much less power being wasted in power supplies. If you do the sums, I think you'll find that the losses from all of the power supplies you might use in your house - for things like charging phones and running laptops from a 240 volt supply - will be much less than the additional losses you would introduce by dropping your internal distribution voltage to 12 volts. That would increase the current in the ring main cables, meaning you'd have to either use thicker cables (more copper resource tied up) or lose some power as heat as the cables get warm.
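A rough sketch of the cable-loss side of that comparison - the 0.2 ohm cable resistance here is an assumption for illustration (roughly what a few tens of metres of domestic cable might present), not a figure from the article:

```python
def cable_loss_watts(power_watts, voltage_volts, cable_resistance_ohms):
    """Resistive loss in the cable: I = P / V, then loss = I^2 * R."""
    current = power_watts / voltage_volts
    return current ** 2 * cable_resistance_ohms

r = 0.2   # ohms - assumed round-trip cable resistance, for illustration only
load = 60  # watts, eg the laptop from earlier
print(cable_loss_watts(load, 240, r))  # 0.0125 W lost at 240 V - negligible
print(cable_loss_watts(load, 12, r))   # 5.0 W lost at 12 V - over 8% of the load
```

Because loss goes with the square of the current, the same cable wastes 400 times more power on the 12 volt ring.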
Incidentally, there is an interesting side effect of switching to LED (or even compact fluorescent) lights that I think we discussed on this list before - old-style filament bulbs 'wasted' a lot of power as heat, but in a sense this heat was often not wasted at all: it helped heat up the house, meaning the central heating didn't have to work so hard.
Also note that modern inverters (which convert DC from solar panels to AC to feed into the mains) expect to achieve efficiencies in the mid 90% range.
Finally, note that this is discussing low-voltage (less than 50 V) DC systems - things change slightly with high voltage DC, where I think the main issue becomes safety: putting your hands across a 240 V DC rail will tend to weld them in place, whereas putting your hands across a 240 V AC rail doesn't - either will kill you, though, if you don't remove your hands PDQ!
Things are seldom as simple as marketeers want us to believe.