GE – The Appliance of Ignorance
Published in Smart Energy
Back in the 1980s, when they were trying to establish themselves in the British market, Zanussi ran a campaign for their products using the advertising strapline “the Appliance of Science”. I was reminded of it this week when I was reading a white paper from another appliance manufacturer – GE. Not because it had anything to do with science; in fact, just the opposite – it was about the most unscientific paper I’ve ever come across.
It was written to promote GE’s view on which wireless standard should be chosen for the Home Area Network (HAN). These are designed to connect devices around the home to a smart meter or a home gateway that has access to information about your current energy tariffs. GE thinks the best choice should be ZigBee because “ZigBee is better than Wi-Fi”. One of the paper’s authors is an active editor for the ZigBee Alliance Smart Energy Profile, so that’s not surprising – he’s entitled to be enthusiastic about the technology he’s part of. And it may be that ZigBee is a good choice. But GE’s analysis doesn’t provide any evidence as to why it might be. Instead it provides an evidence-free quasi-analysis that does ZigBee more harm than good.
We’ve had a year of hype as different wireless standards vie for the crown of being chosen as the de facto one for smart metering. Much of that obscured the facts which need to be considered to make that choice. In the last few months I thought the industry had settled down and was beginning to be a bit more logical. This rant from GE suggests that some of those involved in the debate still have a lot to learn. If you want to see how not to make a reasoned argument, download and read the GE white paper. I’ll highlight what is so wrong about it.
There is no shortage of wireless standards that would like to be picked. The current runner setting the pace is ZigBee, but Wi-Fi and Bluetooth are there as well, plus a bunch of prominent also-rans, including Z-Wave, Wavenis and Wireless-MBUS, along with a clutch of rank outsiders. Most of these standards would admit that they still have some way to go. None of the standards are yet fully fit for purpose for smart energy, but most of them recognise their deficiencies and are working on fixing them. There’s a valid argument that none are really good enough and the industry should take the best bits of each and come up with something new, but we probably no longer have time for that approach.
Because none of them are perfect, it’s proving difficult to make a choice, as government-led groups around the world are discovering. Many different factors need to be weighed up, of which the most important are range, power consumption, robustness to interference, security, interoperability and IP ownership. Throughput and latency, which are important in other wireless applications, aren’t particularly relevant for smart metering, as so little data is transferred, none of which is time critical.
In their white paper, GE is quite content to ignore robustness, security, interoperability and IP ownership, which makes the job a lot easier for them. And they limit the contenders to Bluetooth, ZigBee and Wi-Fi. So they’re making their task relatively easy by not being very representative. Yet they still manage to mess it up.
They start their analysis with range. Range is one of the most misunderstood parameters in wireless connectivity. For any of the standards operating at 2.4GHz, if the radios are transmitting at the same power, with the same antenna, the range is predominantly determined by the receive sensitivity. That depends on the ability of the receiver to extract a signal from the noise, which is a function of the symbol rate (the rate at which you modulate the signal), how many symbols you use to encode each piece of data, and to a lesser degree the modulation method, modulation index, the degree of error correction applied to the packets and the amount of receiver diversity you use. For more about how these all work together, have a look at my book – “Essentials of Short Range Wireless”.
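To make the point concrete, here’s a minimal sketch of how receive sensitivity drives range in a simple log-distance link budget. All the numbers are illustrative, not drawn from any particular chip:

```python
import math

def max_range_m(tx_dbm, rx_sensitivity_dbm, freq_mhz=2400.0, path_loss_exp=2.0):
    """Estimate range from a simple link budget.

    Illustrative only: real indoor range depends on walls, fading margin,
    antenna gains and the path-loss exponent (2.0 = free space; 3 to 4
    is more typical indoors).
    """
    link_budget_db = tx_dbm - rx_sensitivity_dbm
    # Free-space path loss at a 1 m reference distance:
    # FSPL(dB) = 20*log10(f_MHz) + 20*log10(d_km) + 32.44, with d = 0.001 km
    fspl_1m_db = 20 * math.log10(freq_mhz) + 20 * math.log10(0.001) + 32.44
    remaining_db = link_budget_db - fspl_1m_db
    return 10 ** (remaining_db / (10 * path_loss_exp))

# A 0 dBm transmitter into a -95 dBm receiver, free space:
print(round(max_range_m(0, -95), 1))
# The same radios with a -85 dBm receiver reach only about a third as
# far, which is why receive sensitivity dominates the comparison.
```

Swapping a –95dBm receiver for a –85dBm one cuts the free-space range by a factor of about 3.2 (10dB at 20dB per decade of distance), regardless of which standard’s logo is on the chip.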
That’s obviously all a bit too complicated for the GE team. Instead they make the observation that the Bluetooth range on their mobile phone is only a few metres. On that basis, they discard it from further consideration. There’s a reason the range is limited on most phones: phone manufacturers understand power consumption. For them, the most important parameter is battery life. As most Bluetooth applications only need a range of a few metres – typically from the phone to a headset – they cut the transmit power down to minimise the power draw. The power draw of 2.4GHz radios rises steeply with transmit power. So a device transmitting at 4dBm will consume 30mA, whilst one transmitting at 14dBm will probably take up 250mA. For a phone manufacturer, 30mA is acceptable; 250mA is not. They do the same for Wi-Fi implementations, which is why your phone’s Wi-Fi range doesn’t match that of your laptop. But I suspect the people writing this paper haven’t yet worked out how to use Wi-Fi on their phones.
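The dBm scale is logarithmic, so those two transmit settings are further apart than they look. A quick sketch of the conversion (note that the 30mA and 250mA figures above are whole-radio supply currents, not RF output):

```python
def dbm_to_mw(p_dbm):
    """Convert transmit power in dBm to milliwatts of RF output.
    This is the power leaving the antenna, not the battery draw."""
    return 10 ** (p_dbm / 10)

# Each 10 dB step is a factor of ten in RF output power:
print(round(dbm_to_mw(4), 2))   # ~2.51 mW
print(round(dbm_to_mw(14), 2))  # ~25.12 mW
# Power amplifier efficiency falls off at higher output levels, which
# is why total radio current climbs so steeply between the two settings.
```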
You can see that GE engineers might not appreciate this subtlety – after all, they’re not experts in power management, as their devices don’t have to run on batteries. It’s ironic that the poor efficiency of household appliances is one of the reasons we need a smart grid in the first place. If you have a look on the energy bible site for dishwashers, the best performing GE product comes in at position 147 out of 605. And over half of their products fall into the 16% of worst performing dishwashers. (I chose dishwashers as there’s less variation in size, so it seems a fair comparison.) Hence GE can hardly claim to be qualified to pontificate about power consumption. Although that doesn’t stop them.
That’s not the only deficiency this paper has. They state that Bluetooth is based on the IEEE 802.15.1 standard. It’s not. Bluetooth writes its standards independently. It contributed an early version to the IEEE, but that’s five versions out of date. They also refer to the Bluetooth Alliance. A quick check of the website, or any of the specifications, all of which are publicly available, would show them it’s the Bluetooth Special Interest Group. There’s no such thing as the Bluetooth Alliance. Maybe they’re confused because Alliance rhymes with Appliance.
But on with their range experiment, which is now down to deciding between ZigBee and Wi-Fi. The first point they score against Wi-Fi is the claim that ZigBee “is the lowest power”. Putting aside the fine detail that power consumption depends on topology and application, I’m afraid that’s not true either. Like for like, Bluetooth low energy, Z-Wave and Wavenis are all lower power than ZigBee. But they’ve already been excluded, so according to GE, they don’t count.
But then Wi-Fi stumbles again, because according to GE, ZigBee is the only standard that supports mesh and ZigBee is generally used in a mesh. Unfortunately for GE’s analysis, ZigBee is not the only standard for mesh. Mesh can be run over most networks. There are plenty of commercial mesh networks running over Bluetooth. And ZigBee isn’t the only wireless standard that has specified mesh. There’s another one. It’s called 802.11s, which is part of the 802.11 set of standards which underpin Wi-Fi. In fact, most ZigBee implementations don’t use mesh. The standard includes it, but the bulk of commercial deployments only use ZigBee in a star network or cluster tree topology.
Then comes the issue of cost. Apparently Wi-Fi chips are around $2 more expensive than ZigBee. That will come as a shock to the companies who bought around 700 million Wi-Fi chips last year, a volume that helped drive the price down. In contrast, less than 10 million ZigBee chips were sold, according to IMS, which means they’re typically more expensive than Wi-Fi. And both are a lot more expensive than Bluetooth, which is now shipping for less than $1.
Now we get the really weird bit. GE claims that an additional processor is needed to run the protocols for these chips and that for Wi-Fi it costs $11 more than it does for ZigBee. My experience is that people using embedded Wi-Fi and Bluetooth are writing their application code directly on these chips. After all, most of these chips already contain two or three processors. So developers don’t need to use another separate microprocessor. And certainly not one costing over $10. So GE doesn’t seem to know how to design products cost effectively, and are using that inadequate engineering to try and tell the rest of the world what to do. But maybe that’s why GE appliances cost so much.
Having tried to bludgeon us into submission with a concerted display of ignorance, they now change tack to prove their point with an experiment. They take a ZigBee chip (a good, low power one) and compare it with an 802.11n chip, which they claim is low power. In one sense they’re right, the 802.11n chip they use is designed to be low power. Or at least low power for 802.11n, which is capable of constantly pushing several hundred Mbps of data across a link. If they had wanted to do a valid comparison, they should have chosen an 802.11 chip designed for low power sensor applications. In other words, they should have used one operating at 1Mbps using DBPSK modulation, such as the ones from Gainspan and G2 Microsystems.
Instead they perform a comparison of bicycles against jumbo jets and come to the surprising conclusion that bicycles are lower power. Which means that in GE’s eyes, ZigBee wins.
In a final act of madness, they do a calculation of the savings that a customer would make if an appliance used ZigBee instead of Wi-Fi. They don’t separate out the power taken by their unnecessary external microprocessor (they never tell us what that is), so it’s all very dubious. But they claim that choosing ZigBee rather than Wi-Fi would save a customer a massive 4.2 kWhr per appliance per year.
Let’s go back to those energy bible figures. If you sort them by annual power usage, GE dishwashers take up 48 of the bottom 60 places in terms of energy guzzling consumption. They consume around 324 kWhr per year, against the most efficient dishwasher on the list – the Fisher & Paykel DS605, which takes only 157 kWhr per year – less than half the power of the GE appliances. So if they spent their time concentrating on designing decent appliances rather than writing silly papers like this, the GE team could save the user 167 kWhr per year, which makes pontificating about the choice of radio based on a 4.2 kWhr saving seem rather irrelevant. If you’re beginning to think that GE engineers who live in glass houses shouldn’t throw stones, you’re probably right.
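Laying that arithmetic out explicitly, using the figures quoted above:

```python
# Annual energy figures as quoted in the post (kWh per year):
ge_dishwasher = 324   # typical of the GE models on the energy bible list
best_in_class = 157   # Fisher & Paykel DS605
radio_saving = 4.2    # GE's claimed ZigBee-vs-Wi-Fi saving

appliance_gap = ge_dishwasher - best_in_class
print(appliance_gap)                        # 167 kWh per year
print(round(appliance_gap / radio_saving))  # ~40x the claimed radio saving
```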
What is worrying is that this has been submitted to the study groups looking at wireless for smart grid as a serious paper. Every proponent of a wireless standard will be biased to some degree – that’s to be expected, as you’d expect the people working on a standard to believe in it. But this paper trivialises the whole process. As I said, it ignores important points like robustness, IP, interoperability and security. One of the most thorough analyses I’ve seen regarding these points is in a paper put out by the Bluetooth SIG, which is worth reading. In comparison, this GE white paper is little more than an evidence-free rant.
I don’t think that this does the cause of Smart Energy any good. Nor does it bring any honour to the ZigBee Alliance, who I suspect are embarrassed by it. And it certainly does nothing to enhance anyone’s opinion of GE. If this is the level of due diligence they apply to technology it’s no wonder their appliances sit at the bottom of the league tables for energy efficiency. On the basis of this report I’ll keep on buying Fisher & Paykel and Zanussi.
The reason I didn’t mention Z-Wave is that this post was a response to what I considered very poor analysis in the GE report, which dismissed Bluetooth and Wi-Fi because it failed to make valid comparisons. Your complaint in this case should be to GE, as to why their report didn’t even acknowledge Z-Wave’s existence 🙂
It was also written a month before CES, which, as you say, saw a lot more evidence of Z-Wave. I’ve previously commented on the fact that 2.4GHz is not ideal – the industry would be far better looking at a lower frequency – ideally even lower than 868MHz. The problem is twofold – lack of global spectrum and a lack of high volume silicon. So I suspect we may end up with 2.4GHz, purely because it is the volume choice for radios.
I do think that smart appliances and the HAN are a distraction. The industry is still some years away from getting standards and data analysis sorted out for metering. That’s really the first step. Whilst there is already a market for high end smart appliances, I suspect they will remain a marginal geek toy for most of the next decade. I’ll make a diary note to check back on the accuracy of that prediction in 2020.
Surprised Z-Wave doesn’t come up in this discussion. At CES they were a lot bigger than ZigBee in home control and have a lot of supporters. Their mesh runs at 868/915MHz, which is a better place to be than 2.4GHz.
The appliance industry has been studying this for years and every time concludes that they cannot afford any cost increase. The most sensible approach for them has often seemed to me to be a powerline technology – no new wires and virtual wireless.
ZigBee is definitely a mesh network, and far and away the most developed mesh standard, but it is not alone. As you say, standards like 802.11s are being designed for different applications, but that doesn’t mean they will not be used for in-home meshes. 802.11 was not primarily designed for the role of a home access point – it was envisaged that it would be a backbone for corporate or factory installations which would require roaming between access points. But that’s not how it’s used most of the time. Consumers and manufacturers have a habit of subverting standards for what they want, not what the standards bodies intended. And many in the Wi-Fi space would love to add the mesh “tick-box” to their basket of features.
The current issue that 802.15.4 / ZigBee has is the cost of silicon, which is still around four times that of Bluetooth. I suspect that is a temporary issue – it’s largely down to the fact that the market has not been big enough to allow enough chip spins. The ZigBee dies I’ve seen are still relatively large, so it’s difficult to see how they could hit a price point of $1, even at half a million pieces. However, the indications are that the next generation of chips will have geometry shrinks that will make that $1 price point possible. At which point the playing field for wireless becomes a lot more interesting.
Akiba is right – ZigBee is definitely a mesh network. Also 802.11s is really aimed at municipal wireless networks as opposed to a home area network or sensor network, where you mesh the APs.
6LoWPAN is not a stack. It is two IETF documents which describe header compression and neighbor discovery. The ZigBee Alliance is making good progress in developing a specification and test plan which stitches all these IETF documents together to make an IPv6 stack suitable for home area networks.
The bottom line is of course that IP can run over any technology, wired or wireless. The fact is 802.15.4 checks pretty much all of the boxes for a home area network, especially with regard to price point and power. That’s not to say other technologies don’t, but that is why ZigBee and 802.15.4 are out in front.
A lot of what 6LoWPAN has done is being embraced by the ZigBee Alliance in their Smart Energy Profile 2.0, to allow every device to be addressable. I’m not sure that 6LoWPAN by itself has a role in smart energy, as there is too much else that it would need to add to its specification to meet the demands of those specifying the smart grid. Instead it’s likely to be taken and used as a building block by others.
I’d personally like to see some more debate on the best way to address individual nodes. Whilst 6LoWPAN is an elegant solution, I’m not convinced that it is the best low power approach. Instead there’s an argument for address translation at the next node up (the router or relay) to minimise the address element of the packet sent to the final device. The reasoning behind that is that for very low power devices, which need to run for more than ten years on a battery, or use energy harvesting, every byte of data sent over the air has an impact, and even 6LoWPAN imposes an overhead which may not be efficient enough.
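To give a feel for why every byte matters, here’s a hedged sketch of the energy cost of each over-the-air byte, using the 250kbps 802.15.4 rate at 2.4GHz and an illustrative transmit current – the absolute numbers will vary from radio to radio:

```python
def microjoules_per_byte(tx_current_ma, supply_v=3.0, bit_rate_bps=250_000):
    """Energy to transmit one byte, assuming the radio draws
    tx_current_ma for the byte's airtime. 250 kbps is the 802.15.4
    rate at 2.4 GHz; the 20 mA current below is illustrative."""
    airtime_s = 8 / bit_rate_bps            # 32 us per byte at 250 kbps
    joules = tx_current_ma * 1e-3 * supply_v * airtime_s
    return joules * 1e6                     # convert to microjoules

# Roughly 2 uJ per byte - small per packet, but header overhead on
# every report compounds over a ten-year battery or an energy
# harvesting budget.
print(round(microjoules_per_byte(20), 2))
```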
And you’re totally right about the lower frequencies. Something down at 400-600MHz would be better still and probably optimal. Below that antennas get to be too large for small devices. The issue here is a lack of globally available spectrum, which is why everyone is concentrating on 2.4GHz. Several billion 2.4GHz chips are shipped every year across Bluetooth, Wi-Fi, 802.15.4 variants and proprietary RF. That’s built up a lot of design and production knowledge, which has driven down the cost of a chip. So although it’s not the best frequency, it’s probably the one that we’ll end up with.
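The antenna constraint is easy to quantify: a quarter-wave monopole scales inversely with frequency, which is why sub-GHz devices need noticeably more room for their antennas.

```python
def quarter_wave_cm(freq_mhz):
    """Length of a quarter-wave monopole in centimetres:
    lambda/4 = c / (4 * f)."""
    c = 299_792_458  # speed of light, m/s
    return c / (4 * freq_mhz * 1e6) * 100

# Lower frequencies need physically larger antennas:
print(round(quarter_wave_cm(2400), 1))  # ~3.1 cm
print(round(quarter_wave_cm(868), 1))   # ~8.6 cm
print(round(quarter_wave_cm(433), 1))   # ~17.3 cm
```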
> ZigBee is better than Wi-Fi
And 6LoWPAN is better than both?
6LoWPAN is, like ZigBee, based on IEEE 802.15.4, but offers IPv6 all the way up to IPsec.
http://en.wikipedia.org/wiki/6LoWPAN
There are manufacturers that are using 2400 MHz and 868 / 915 MHz range. I prefer 868/915, because the coverage inside buildings is better.
The break-even numbers vary enormously depending on circumstances. I’ve worked with companies who have had valid arguments for going down the route of chip on board at just 500 pieces, and at the other extreme with companies who were still using modules for products that had production runs in excess of 100,000.
There are a lot of items that people need to be aware of in making this decision. The first one is whether they have the knowledge and can get the necessary support for doing a chip based design. If not, many go down a consultancy route, which has a cost. Once complete, they need to go through qualification if they’re using Bluetooth, ZigBee or Wi-Fi, as well as national approvals for the countries they plan to ship in. Then there are the less obvious costs of production test equipment for a radio design, which they may not need if they use a module. And for some products, there’s the effect on time to market, where modules are normally a faster route.
Often the RF side is the easy bit. The harder part is understanding and interfacing the protocol stack into your application. For simple cable replacement they may not be too difficult. For more complex topologies it can be a major development, particularly the first time around. Against which, once you’ve gone through the process, that knowledge is an asset which you have within the company.
What I would recommend is that any designer makes sure they understand all of these before they rush into a chip on board design. I’ve seen several companies struggle for six to twelve months on a chip on board design before going back to the drawing board and using a module. And that’s a year of lost sales.
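The module-versus-chip decision can be sketched as a simple break-even calculation. All the numbers below are hypothetical – real module prices, BOM costs and one-off engineering costs vary widely, as the figures traded in these comments show:

```python
def break_even_units(module_price, chip_bom_cost, nre_cost):
    """Volume at which a chip-on-board design recoups its one-off
    engineering, qualification and approvals cost.
    All inputs here are hypothetical, not quoted market prices."""
    per_unit_saving = module_price - chip_bom_cost
    return nre_cost / per_unit_saving

# Hypothetical: a $12 module vs a $5 chip-down BOM, with $50,000 of
# design, qualification and approvals cost:
print(round(break_even_units(12.0, 5.0, 50_000)))  # ~7,143 units
```

On those assumptions the crossover sits around 7,000 units; change the price delta or the one-off cost and it moves dramatically, which is why the break-even figures quoted in this thread disagree.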
Nick, your break-even numbers are too high. The crossover is closer to 4,000 when you take into account mechanical issues, connectors and the antenna. I don’t need support either; just put the chip in Digikey, publish a real datasheet and I will probably never call you. We have 802.15.4 and Bluetooth chips in products and have never talked to the factory on a technical issue.
So to get around the secretive wifi mess we have a USB connector in the box and plug USB wifi sticks into it. I can buy all we need at under $5 each. The downside to this is the mechanical issues and the need for a MCU with USB host support.
I fail to see why wifi is treated differently than 802.15.4 and Bluetooth. They are pretty much the same thing to the FCC.
Pricing is always a little difficult to place a figure on for wireless systems, as there are a number of different factors involved. At low volumes, typically below 10,000 pieces per year, most manufacturers will use modules rather than chips. The reason for this is that it means that manufacturers can usually get around the need for standards qualification and approvals, as they can rely on the results of the module manufacturer. Otherwise, there’s a cost which is typically between $20,000 and $50,000. That means buying a module is better value. Above 10,000 pieces it may be more cost effective to design with a chip, but this requires a fair degree of experience of the wireless standard, RF design and the approval process. Because of this, a lot of semiconductor companies are reluctant to support manufacturers who are buying fewer than 100,000 chips.
Once you pass this point, the price point can fall significantly. At a million pieces per year, which is where I’d expect companies like GE to be, I’ve seen pricing for low power 802.11 chips which is below $5.
Roving Networks just announced our newest ultra low power Wifi module.
http://www.rovingnetworks.com/171.php
our new RN-171 is true plug and play (on-board IP stack)
based on our G2 SOC we have the lowest power specs on the market (4uA sleep, 30mA RX, 120mA TX)
pricing is only $19.95 at 1K, <$15 at 10K, and $9.50 at 100K.
Are any of these Wifi SOC solutions really available for under $10 Q1000 (not Q10M)? — I doubt that any of these SOC wifi companies have a real Q10M order on the books.
I can easily get 802.15.4 and Bluetooth SOC solutions sub $5 in Q1000. Just order them from Digikey. That proves they are real solutions.
Cheapest real Wifi solution I have found is the CSR chip with an add-on MCU. I can get that for under $10 Q1000.
We use more than Q1000, but being able to easily order a test lot at Q1000 without jumping through complicated hoops is a major judge of how easy the vendor is to work with. There are several wifi companies we find too hard to work with, they are only looking for $1M PO’s and don’t care about anything else.
Although I agree that the GE claim was moronic, I do have to disagree with your claims as well:
“In fact, most ZigBee implementations don’t use mesh. The standard includes it, but the bulk of commercial deployments only use ZigBee in a star network or cluster tree topology.”
There is no cluster tree topology and star topology is only run on pure 802.15.4. Zigbee only has mesh routing so if you’re running Zigbee, by default, you’re running Zigbee’s mesh routing as well.
I’m also involved in the smart grid effort. Other people involved in it already know there is a lot of protocol bashing between groups and 802.11 vs Zigbee/802.15.4 has seen its fair share as well as *ahem* Bluetooth & Bluetooth Low Energy.
Most people in the standards groups know what the real advantages and disadvantages are for each of the protocols. GE’s paper is mostly read out of morbid interest…