Smart Energy, mHealth and the Chocolate Factory
Although they may seem strange bedfellows, the mHealth and smart metering industries (both favourite children of the technology world) are facing the same problem. Both are moving from a world of almost no data to a level of data overload they never imagined, even in their worst nightmares. Whether it’s from an annual health check or a visit from the meter reader, both are used to getting one data point per customer per year. The advent of connected sensors means that is changing to anything up to one reading per second.
It’s a bit like the case of a child who has hitherto only been allowed chocolate on Christmas Day. Now they’re being led into a chocolate factory and told they can eat as much as they want. The inevitable result is a very happy child for a few hours, until they’re violently sick. At which point they either vow never to eat another chocolate, or learn to treat it in a more sensible manner.
Today the medical industry and energy utilities are being shown the doors of the chocolate factory. We have yet to see how they behave once they enter it. Some may emerge as triumphant Charlies, but others risk becoming the commercial equivalent of Augustus Gloop and Veruca Salt.
The first point to address in this new world of data overload is the assumption that we’ll be able to do lots of useful things once we have this data. There are lots of companies painting a picture of automated homes and lifestyle medical devices based on analysing this tsunami of data, but as yet we don’t know how much can be inferred from it, let alone how we will be able to use it to control other devices. The assumption that having over a million times the volume of data every year (one reading every three seconds instead of one per year) is going to tell us anything useful is still exactly that – an assumption. The nightmare scenario is that it doesn’t – it’s just random noise.
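To put the scale of that assumption in perspective, here is a quick back-of-the-envelope calculation (the three-second interval is the illustrative figure used above, not an industry standard):

```python
SECONDS_PER_YEAR = 365 * 24 * 3600  # 31,536,000

# One reading every three seconds versus one meter reading per year.
smart_readings_per_year = SECONDS_PER_YEAR // 3
manual_readings_per_year = 1

print(smart_readings_per_year)   # 10,512,000 readings per customer per year
print(smart_readings_per_year / manual_readings_per_year)  # a ten-million-fold increase
```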
I fervently hope that’s not the case and that the data is useful, but to confirm that needs a lot more work. mHealth and smart energy aren’t the only markets facing this problem – the U.S. military acknowledged it recently, when they said “we’re going to find ourselves swimming in sensors and drowning in data.” We are moving from drawing a straight line through two points to drawing one through a million of them. At the most basic level, it involves a fundamental change in the underlying business model. Both the medical profession and the energy utilities currently work on the assumption that if they hear nothing from us, they can ignore us for the next year. Now they’ll be hearing from us every few seconds. It’s not just the volume of data that is available, but the question of how to react to it. That new granularity will show deviations from the straight line, whether it’s raised blood pressure or turning on the hosepipe to water the petunias. What should a supplier do about it?
In the past, the safe route has been to ignore everything, not least because you don’t know about it, and it will probably have gone away by the time of the next data point. Once you let the cat out of the bag and tell the consumer that you are monitoring their every move or cup of tea, they will expect more feedback. That means more resources on the part of the provider, which is likely to mean more cost. Where’s the business model that supports that?
It suggests that the industry needs to step back from some of the more complex technology and fanciful gadget push that is appearing in the market and instead concentrate on answering the basic question: “what can I usefully do with the data?” That means working with simple sensors that can collect the data, and back-end systems that can then aggregate and mine it. When the UK’s Technology Strategy Board was collecting input for their Assisted Living Innovation Program, I argued that they should do exactly that – deploy ten thousand or more sensors of whatever variety and concentrate on collecting and analysing the data. I’m pleased to say that they’ve embraced that approach.
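As a rough illustration of the unglamorous first step such a back end has to perform before any mining can start, here is a minimal sketch (the field names are hypothetical; it assumes readings arrive as timestamped values and simply rolls them up into daily totals per sensor):

```python
from collections import defaultdict
from datetime import datetime

def aggregate_daily(readings):
    """Roll raw (sensor_id, timestamp, value) readings up into daily totals.

    `readings` is any iterable of tuples; timestamps are ISO-8601 strings.
    This is the boring aggregation step that has to happen before any
    analysis or mining can even begin.
    """
    totals = defaultdict(float)
    for sensor_id, timestamp, value in readings:
        day = datetime.fromisoformat(timestamp).date()
        totals[(sensor_id, day)] += value
    return dict(totals)

# Example: two half-hourly readings from one smart meter on the same day.
sample = [
    ("meter-42", "2011-03-01T08:00:00", 0.4),   # kWh used in the interval
    ("meter-42", "2011-03-01T08:30:00", 0.6),
]
print(aggregate_daily(sample))   # meter-42 on 2011-03-01 -> 1.0 kWh
```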
It is a critically important task for anyone who is moving into M2M (and that is essentially what mHealth and smart metering are). You need to start by understanding your data. Only when you have done that can you start to decide what value it has and whether a large-scale deployment is justified. That justification might be because it makes your business more efficient, because you can offer additional services to your customers, or because you gain a competitive advantage, possibly by disrupting the market. Or it could be because a government pays you to do it; but if they do, will they continue to pay the long-term, day-to-day operating cost of working with that data?
The problem is that you’re unlikely to know the answer to these questions until a year or more after you’ve deployed your first ten thousand devices and collected and analysed that year’s worth of data. That’s a large initial expense with no immediate return.
If the resulting business model is customer oriented, rather than profiting from internal business efficiencies, then it needs to include some compelling feedback if the user is going to want to continue to use it. That in itself is a new area for both the medical and energy industries. Neither uses a language which the consumer understands, at least until the day the bill arrives. Instead they stick to scientific jargon, with BTUs, kWh, systolic and diastolic pressures.
Consumers are far more interested in comparisons – for them, these provide the compelling feedback. That means simple comparisons such as “are we spending more than we were?”, “more than our neighbours?”, “are we getting better?”, “should I have eaten that extra doughnut?” need to be developed. None of these is the type of information these industries have experience with, but if they are going to provide a compelling service they need to take customer psychology into account. Even then it may not have the desired effect, as evidenced by the recent report which found that when told they are using less energy than their neighbours, Republicans tend to compensate by increasing their energy usage.
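As a rough sketch of what that comparative feedback might look like in code (the figures, thresholds and wording below are purely illustrative assumptions, not taken from any real product):

```python
def compare_usage(this_period, last_period, neighbour_average):
    """Turn raw consumption figures into the simple comparisons people actually respond to."""
    messages = []
    if last_period > 0 and this_period > last_period:
        pct = 100 * (this_period - last_period) / last_period
        messages.append(f"You're using {pct:.0f}% more than last month.")
    else:
        messages.append("You're using less than last month - keep it up.")
    if this_period > neighbour_average:
        messages.append("You're also above the average for similar homes nearby.")
    return messages

# Example: 320 kWh this month, 290 last month, neighbourhood average of 300.
print(compare_usage(this_period=320, last_period=290, neighbour_average=300))
```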
To add another level of complexity, many of these comparisons raise privacy issues that are new to these industries. Comparisons are normally more persuasive when they’re made with a group of peers, rather than just comparing past performance. But how many companies are aware of what they are allowed to do in comparing an individual against data from other customers? How much granularity can you use in comparisons with a neighbour?
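One common safeguard is to only ever report an aggregate over a sufficiently large peer group, so that no individual household can be picked out. The sketch below assumes a minimum cohort size of 20, an arbitrary figure chosen purely for illustration, not a regulatory threshold:

```python
MIN_COHORT_SIZE = 20  # assumed threshold; real privacy rules vary by jurisdiction

def peer_average(cohort_readings):
    """Return the cohort's average consumption only if the group is large
    enough that no individual household can be inferred; otherwise return None."""
    if len(cohort_readings) < MIN_COHORT_SIZE:
        return None
    return sum(cohort_readings) / len(cohort_readings)
```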
Some companies are trying to leapfrog the data learning stage by selling a vision to customers. A good example is Fitbit, who are using their initial customers both to build their database and to provide feedback. However, for established businesses, which are the ones that will be supplying 99% of energy and healthcare users, that’s probably not an option.
Equally difficult is the question of how often feedback should be provided. Should it be real-time – “turn it off now” – or after the event? Should it aim to change behaviour in the moment, or review it retrospectively? Even before we get to schemes such as load shedding, which will turn appliances off, we need to know much more about the usage models behind the data before bringing further automation into the picture.
These are difficult questions, both for new and established industries. However, the fundamental order remains unchanged. First you need to acquire the data. Then you need to understand what it means. Only then can you determine what that implies for your business model. Keep an open mind and be flexible in building those business models. Whether they be improved efficiency, customer retention, Government mandate, future sales through behavioural modification or company acquisition, they all need a company to take the time to understand the data and develop a consistent model. Otherwise, you may look back and wish that you’d kept chocolate as a once a year treat, and never entered the chocolate factory.
Nick,
“Understanding the data is a necessary first phase” is a risky strategy. If I understand you correctly, the aim is to infer structure and patterns from masses of data.
For the vast majority of applications, however, there are underlying models (e.g. for ECG monitoring one is looking for existence and periodicity of P-waves, PR intervals, Q-waves etc; for health monitoring of engines one may look at manifold and exhaust chamber temperature variations etc.). This design approach defines what signals to monitor, the frequency of monitoring, the signal processing required to analyse these wave forms and rules to apply when looking for faults.
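To make that concrete, a toy sketch of my own (it assumes R-peaks have already been detected upstream, and the threshold is illustrative rather than clinical): because the model tells you in advance which feature matters, the “analysis” collapses to a handful of lines rather than open-ended mining.

```python
def rr_check(r_peak_times_s, max_cv=0.15):
    """Toy rule-based rhythm check on pre-detected R-peak times (in seconds).
    Flags the rhythm as irregular if the RR intervals vary more than an
    assumed threshold (max_cv is illustrative, not a clinical figure)."""
    rr = [b - a for a, b in zip(r_peak_times_s, r_peak_times_s[1:])]
    if len(rr) < 2:
        raise ValueError("need at least three R-peaks")
    mean = sum(rr) / len(rr)
    cv = (sum((x - mean) ** 2 for x in rr) / len(rr)) ** 0.5 / mean
    return 60.0 / mean, cv > max_cv   # (heart rate in bpm, irregular?)

print(rr_check([0.0, 0.8, 1.6, 2.5, 3.3]))   # roughly 73 bpm, regular rhythm
```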
The whole notion of amassing large volumes of data and then analysing this data is a recipe for long and costly development projects. There is also no guarantee of success – to paraphrase Donald Rumsfeld, if you torture the data for long enough it will tell you what you want.
Ken,
I don’t disagree that other industries have developed the tools, but you still need to do the work to analyse the data and make sense of what it means. The supermarket loyalty card is a good example. Depending on who you talk to, you’ll be told that it took between five and ten years for the supermarkets to work out how to make money out of these. In the UK, most failed – Tesco did an excellent job, but most of the others have dropped their card schemes as they were losing money.
My argument is that understanding the data is a necessary first phase. Instead of concentrating on that, much of the industry is being beguiled by the hype of what can be delivered before they’ve even started the data analysis phase.
Hello Nick – is all this anxiety warranted?
There are examples of well orchestrated (mega) data management systems already in the market e.g. supermarket loyalty cards.
Also, there are commercial implementations of pattern recognition systems out in the market (e.g. predictive maintenance for machinery, sleep-apnea monitoring, cardiac arrhythmia monitoring etc.).
The engineering and signal processing communities have tools and modeling techniques to address these problems.
Spot on comments and analogy! The problem is, from my perch, the utility industry is being thrust into this factory of data called the Smart Grid with stimulus dollars that were requested by grant writers with less-than-20/20-vision about the data production. And like Mike TV or Amanda Glump (whatever her name is in the movie) they are overly anxious to consume something they know nothing about. They are doomed. Some readily admit this data factory has them scared.
And that falls into two parts: what the business model thinks they need and what the customer believes they need. I suspect in many cases they may not align.
I totally agree that many are currently not looking beyond the dazzle of the shiny trinkets.
Great blog!
I was with you until the end. Here’s where I depart: as businesses and individuals, we need to decide FIRST what the need is, what we are looking for and why. We should know these answers before we start collecting data. For many of us, we don’t have to eat the whole box of chocolates to realize we only wanted the pecan clusters.
I believe we will look back and realize we are still in technology “infancy” – still dazzled by the shiny trinkets of technology. We must move into the human development areas of knowing what we need and how to get it.
Thanks for the great thoughts!
I’m not sure the form the backhaul takes matters too much, as long as it’s there and reliable. Wi-Fi has advantages (it’s cheap and moderately ubiquitous) and disadvantages (the complexity of setting up security and relatively fast product replacement cycles).
It’s important that it works, but it shouldn’t be anywhere near the top of the list of technical issues – it’s just a transport pipe. The far more important issues are making the product easy to use, getting the data into a database, working out what the data means and finding a compelling way to make use of that information.
Good blog.
I reckon wireless for healthcare (at least) has to include / assume broadband and wifi @ home, as GSM/SMS is wholly inappropriate at the best of times.
Wifi no longer being the preserve of the middle classes means this is less of an inclusion issue than it once was, and it becomes one of how to make devices secure and easy to interop at the same time (many patients don’t take the right dosage of prescriptions, let alone putting a WEP key into a remote BM or sphygmomanometer!).
Good food for thought here…