Analytics of Things: Bigger than Big Data

Originally published 16 February 2017

In this article, Ron Powell, independent analyst and expert with the BeyeNETWORK and the Business Analytics Collaborative, talks with Dan Graham, director of technology marketing for Teradata. They discuss the Analytics of Things and how the analysis of sensor data is poised to change the way we live and work.

Dan, it’s always great to see you, and this time you are working on Analytics of Things. What is the Analytics of Things?

Dan Graham: Well, the Analytics of Things is a subset of a huge market trend – a trend that is just overwhelming. It’s probably the biggest thing since the Internet itself. We’re talking trillions of dollars being spent on the Internet of Things over the next five years. And within that clamor between vendors and customers, there’s a subset – a smaller group of functionality areas that comprise analytics. In the Wall Street Journal, Tom Davenport coined the term “Analytics of Things” to draw a circle around the parts of the Internet of Things that are analytically oriented. The important point is that analytics runs all throughout the Internet of Things. It’s not just the data warehouse or the data lake – it’s across the entire ecosystem, right inside all the operational environments. You still need a data warehouse and a data lake to make things happen.

So if you search Google for “Internet of Things,” you get about 15 million results. If you narrow your search to “Analytics of Things,” you’re much more likely to find what you want.

Can you give us some examples of what customers are doing with the Analytics of Things?

Dan Graham: There are a lot of great customers that have already been successful with the Analytics of Things. At Partners, I attended four presentations by our customers. They are really fantastic stories. 

We have great customer stories. A lot of manufacturers have been doing this for a long time. Volvo is one example. They’ve been analyzing the data in a data warehouse with their Six Sigma engineering team, and they’ve been doing it for quite a few years. You know how Volvo has the reputation of being the safest car on the road. Well, there’s a reason for that. They’re analyzing all the sensor data, going through it, and making Volvos much more reliable so that you won’t break down and you won’t have crazy things happening with the brake pedal. But imagine this – Volvo’s home office is up in Sweden, and they’re building cars that talk to each other. So if you’re a customer who opted in by accepting their privacy rules and you’re driving down one of the roads in Sweden that is iced over, another Volvo will signal back to your car over a wireless connection and alert it to a black ice patch ahead. Then your car will automatically slow down and adjust for conditions. This is really cool. The cars are talking to each other and helping out.

We had an Aster customer that had a very small data warehouse. They were using SQL Server. They didn’t really want a large infrastructure, and they didn’t need it. They had approximately 500 customers, but their customers were buying monstrous paper-making machines. They manufacture the machines, and the machines manufacture cardboard, tissue, 8x10 paper, and so on. These machines are about the length of a football field. They’re gigantic, and they have 10,000 sensors on them. Our customer wanted to do a better job designing these machines, so they implemented the Aster system. It ran in parallel and did cool stuff with the sensor data. The analysis indicated how to optimize everything about the machine so that it runs better and runs longer.

The fun part, though, was that because these machines are so important, you don’t want them to turn off. You don’t want to stop a machine for maintenance – every minute it is turned off, the operator is losing thousands of dollars. So spreading out the time between scheduled maintenance and eliminating all unscheduled maintenance is critical. The Aster analytics helped them optimize performance – the yield – but they were also looking at what they call consumables – felts, belts, certain parts of the rollers – that are under extreme stress from pressure, heat and chemicals, and wear out quickly. When they started analyzing this, they realized that their competition had been selling low-cost, knock-off felts and belts. So they analyzed their entire customer base and benchmarked them. They learned that the customers using the low-cost knock-offs had to shut down more often for replacements. So while the knock-offs were cheaper to buy, they were actually more expensive because the machines had to be shut down more times per year. What’s more important – the downtime or the low-cost procurement of the consumables?
They gave these reports to their sales force to show to the operators and the procurement departments. Bang. $35 million the first year. Paid for the Aster and then some. They took home a handsome profit on that.
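The economics Dan describes can be sketched with some back-of-the-envelope arithmetic. The figures below are purely hypothetical assumptions (the article only says downtime costs "thousands of dollars every minute"); the point is just that once lost production is priced in, the cheaper consumable can lose:

```python
# Illustrative only: all prices, swap times and swap counts below are
# assumed numbers, not the customer's actual figures.

DOWNTIME_COST_PER_MINUTE = 2_000   # assumed dollars lost while the machine is stopped
SWAP_TIME_MINUTES = 240            # assumed time to replace a worn felt or belt

def annual_cost(felt_price, swaps_per_year):
    """Yearly cost of a consumable: purchase price plus the
    production lost while the machine is stopped for each swap."""
    downtime = swaps_per_year * SWAP_TIME_MINUTES * DOWNTIME_COST_PER_MINUTE
    return swaps_per_year * felt_price + downtime

oem = annual_cost(felt_price=50_000, swaps_per_year=4)       # wears out less often
knockoff = annual_cost(felt_price=30_000, swaps_per_year=7)  # cheaper, wears out faster

print(f"OEM felts:       ${oem:,}/year")        # → $2,120,000/year
print(f"Knock-off felts: ${knockoff:,}/year")   # → $3,570,000/year
```

Under these assumed numbers, the knock-offs cost roughly $1.4 million more per year despite the lower purchase price – the kind of benchmark report the sales force could put in front of a procurement department.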

So we see stories like that everywhere. We see it in the manufacturing sector, the transportation sector, and the utility sector. You live in Wisconsin, so you’re participating in the Internet of Things already with utility smart meters. What they can do with that information is profound – the way they can reshape your pricing, reshape your behavior, but – more importantly – rebalance their electrical grids. They can rebalance their supply by time of day. So all of these things ripple right back into the system.

Let’s talk about airplane engines. Every engine emits an incredible amount of sensor data, right?

Dan Graham: Well, that’s interesting. The engines do that, but those stories you’ve heard about terabytes per hour tend to be from test and development. And there are really only a few engine manufacturers around, and we don’t get anything from them. Most don’t want to talk about their sensors.

But there was a great Boeing presentation at Partners this year. Their session about their sensor data knocked me off my chair. I couldn’t believe it. They started with all the sensor data on the aircraft itself. They put it into a Hadoop data lake, and the engineers were going through this data looking for root causes of failures. They wanted to determine:
  • Why does this part fail?
  • When does it fail?
  • How can we design this differently so that it doesn’t fail as often or ever?
Obviously, they don’t want downtime. They can’t really take this monstrous machine out of production. While the engineers were doing this, a DBA – who is a fantastic guy that I know well – told them he was going to connect the sensor data in Hadoop to the Teradata system using this new system called QueryGrid. 

QueryGrid is a little bit like a federated query. It goes out of the Teradata system in parallel, and pulls the data back from Hadoop in parallel. When they did that, the business users who don’t speak Java and couldn’t use the data lake were able to use their business intelligence tools to reach over to Hadoop and join that sensor data with the Teradata system. They were then able to more accurately schedule maintenance. Knowing when parts are going to wear out, they could schedule maintenance to make sure that the parts, the repair people, a repair bay and all the necessary tools would be there. 
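The payoff of that federated pattern is the join itself: warehouse maintenance records enriched with lake-side sensor readings. Here is a minimal stand-in sketch of that idea in plain Python – the table names, columns and threshold are all hypothetical, and in reality QueryGrid runs the join in parallel inside Teradata SQL, not in application code:

```python
# Stand-in for sensor summaries pulled back from the Hadoop data lake
# (part IDs, column names and values are invented for illustration)
lake_sensors = {
    "P1": {"avg_vibration": 0.8},   # assumed wear indicator per part
    "P2": {"avg_vibration": 2.4},
    "P3": {"avg_vibration": 0.5},
}

# Stand-in for the warehouse's maintenance-schedule table
warehouse_parts = [
    {"part_id": "P1", "next_service": "2017-06-01"},
    {"part_id": "P2", "next_service": "2017-06-01"},
    {"part_id": "P3", "next_service": "2017-09-01"},
]

# The join business users could finally express through a BI tool:
# each warehouse row enriched with its lake-side sensor reading
joined = [
    {**row, **lake_sensors[row["part_id"]]}
    for row in warehouse_parts
    if row["part_id"] in lake_sensors
]

# Flag parts whose readings suggest pulling maintenance forward,
# so parts, repair people, a bay and tools can be scheduled in advance
at_risk = [row["part_id"] for row in joined if row["avg_vibration"] > 2.0]
print(at_risk)  # → ['P2']
```

The design point is that the business side never touches Java or Hadoop directly – the sensor data simply appears as another joinable table.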

That sounded great. But then, because the business users had to go to the engineers and ask them about the data, these two communities – the business and the engineers – bonded. They built a cultural bridge because they had something in common that they really wanted to work on together. This produced ten more use cases. So the business people are happy. They’re getting access to data they didn’t have before, using a simple point-and-shoot BI tool, and the engineers are helping them. The engineers feel great. The machines are being managed better.

In addition, when the engineers at Boeing got the sensor data back into the Teradata system, there is an ANSI-standard SQL capability called temporal that they were able to use. It was originally designed for multidimensional designs. Fundamentally, it looks at data as it is being loaded, and if a value hasn’t changed, it isn’t stored – why bother when nothing has changed? Then, when you run a query asking for all the values from then to now, it just gives you the same value 50 times. Well, it turns out that’s perfect for sensor data. So he started using it on sensor data, and he was getting 60 to 80 times compression. Columnar, which is something of a benchmark for compression, gets about 8 times compression. And he reported on Sunday that on one sensor he managed over 300 times compression. So we don’t have to be afraid of “terror” bytes. You don’t have to be terrorized by sensor data if you’re using the right technology.
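The change-only storage idea behind those ratios is easy to demonstrate. This is a minimal sketch, not Teradata’s temporal implementation, and the sensor series below is invented – but it shows why a slowly changing reading compresses so well: only the rows where the value changes are stored, and a query re-expands the runs.

```python
def compress(readings):
    """Keep (index, value) pairs only where the value changes."""
    stored = []
    prev = object()  # sentinel that never equals a real reading
    for i, v in enumerate(readings):
        if v != prev:
            stored.append((i, v))
            prev = v
    return stored

def expand(stored, length):
    """Rebuild the full series by repeating each value until the next change."""
    out = []
    for (i, v), (next_i, _) in zip(stored, stored[1:] + [(length, None)]):
        out.extend([v] * (next_i - i))
    return out

# A slowly changing sensor: long runs of identical values (invented data)
readings = [20.0] * 50 + [20.5] * 30 + [20.0] * 20

stored = compress(readings)
print(len(readings), "->", len(stored), "stored rows")  # → 100 -> 3 stored rows

# Round trip: the query layer reproduces every original value
assert expand(stored, len(readings)) == readings
```

On this toy series the ratio is 100:3; how close a real sensor gets to the 60x to 300x Dan cites depends entirely on how often its value actually changes.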

What do you see for the future of the Analytics of Things? 

Dan Graham: I’ve been working on this for a while, and I couldn’t be more excited. The first thing is that this is much bigger than big data. It will make big data look like a drop in the ocean. Arguably, the whole purpose of sensor data is to analyze it. Whether the “thing” is a wind turbine, an automobile, or a heart monitor so I can see how granny is doing, it’s going to affect every part of our lives. This is the biggest trend since the Internet, and there’s a chance it will be even bigger than that. I’m overstating it, but the point is that it is going to affect everybody’s lives.

It’s going to affect your health care. It’s going to affect how you buy things in the grocery store. The grocers are working with the hospitals, and they’re working with the home improvement centers. For example, if you have some personal health issues and you sign off a little bit of privacy information – not much – you allow the retailer to understand what you need, and they’re ready to recommend it and help you get it. The home improvement guys are ready to help you install it. All these things are starting to connect.

Your cars are talking to other cars. You’ll drive through your main city, and your car will talk to the city planning system that guides the traffic. You won’t need to use Waze, the driving app, or any other GPS on your cell phone while you’re driving – and you shouldn’t be, right? Your car will just tell you to turn left at the next intersection. It will get you through faster, so we’ll have fewer traffic jams. We’ll help you find a parking place because we know where they are. And then when you drive home, your car is going to call ahead and talk to your home. It will tell your home you’ll be there in about 30 minutes, so turn on the air conditioning. Everywhere you look, it’s going to touch your life.

IDC is saying this is a $2.1 trillion business now. That’s a whole lot bigger than big data. That’s a whole lot bigger than the analytics business itself. It is going to touch everybody in every walk of life. And it’s going to be a fun ride.

Well, Dan, you always seem to be in the right spot at the right time, and it seems like the Analytics of Things is going to keep you busy for a while!


  • Ron Powell
    Ron, an independent analyst and consultant, has an extensive technology background in business intelligence, analytics and data warehousing. In 2005, Ron founded the BeyeNETWORK, which was acquired by TechTarget in 2010. Prior to founding the BeyeNETWORK, Ron was cofounder, publisher and editorial director of DM Review (now Information Management). Ron also has a wealth of consulting expertise in business intelligence, business management and marketing. He may be contacted by email at rpowell@powellinteractivemedia.com.

    More articles and Ron's blog can be found in his BeyeNETWORK expert channel. Be sure to visit today!

