BeyeNETWORK Spotlights focus on news, events and products in the business intelligence ecosystem that are poised to have a significant impact on the industry as a whole; on the enterprises that rely on business intelligence, analytics, performance management, data warehousing and/or data governance products to understand and act on the vital information that can be gleaned from their data; or on the providers of these mission-critical products.

Presented as Q&A-style articles, these interviews conducted by the BeyeNETWORK offer the behind-the-scenes view that you won't read in press releases.
This BeyeNETWORK Spotlight features Ron Powell's interview with Alan Meyer, Marketing Manager of Data Warehousing on System z, IBM Software Group. Among the topics they discuss are business analytics, federation, and how mainframes can reduce cost and complexity while delivering data safely and securely.

Alan, one of the biggest trends we see today is that enterprises are looking to reduce cost and complexity. Given that trend, is anyone buying mainframes today? If so, why would they spend all that money when there are less expensive options?
When customers look at their environments today, they're looking for ways to reduce cost and simplify, and that is driving the movement toward consolidation. It gives them the ability to lower their overall costs, increase security, make administration much easier and make better use of their resources. As far as costs are concerned, most customers are finding that mainframes are actually cheaper. In fact, a study conducted by Dr. Howard Rubin, whose consulting practice focuses on the economics of IT, has shown that customers who were mainframe-centric – that is, using the mainframe as their primary platform – had anywhere from 25% to 34% lower IT cost per item produced than those who weren't. It's really a less expensive option, and it's one of the reasons people are rediscovering the mainframe.

That's intriguing because most people think of the mainframe as a major expense. Does it also reduce complexity?
That's right, because you're now managing one system instead of hundreds of small systems spread across the data center. In addition, organizations are not only running both their OLTP systems and their data warehouse on the same platform, they are also consolidating their data marts onto it. This saves hardware, software, floor space and administrative costs as well.

Another major trend that continues to accelerate – and one where IBM has been an early pioneer – is operational BI and business analytics. A lot of people in an enterprise, as well as customers and suppliers, need access to the information in the data warehouse for analytics. How does System z help make that information available in a safe and secure way?
Operational business analytics is another reason people are returning to the mainframe. They're looking at how to make use of the information that's been locked away in the data warehouse – it's become the new silo. In fact, Gartner states that only about 8.2% of an organization's people have access to the insights that are in the data warehouse. These are the knowledge workers who use the BI tools. Organizations are asking how they can share those insights with the other 92%, as well as with their vendors and customers, to enable everyone to make more effective decisions at the right time, in the right context. These businesses are making insights available to the entire organization through their operational systems. But this means the data warehouse needs the same operational characteristics. For the same reasons that organizations run their operational systems on System z – availability, recoverability and security – they are now looking at bringing those attributes to the data warehouse. These operational insights provide the ability to accurately identify each customer's need, and meet that need, at the time of transaction, all in a highly available and secure environment.

That also coincides with what we are seeing for business intelligence at the BeyeNETWORK: the next big wave is to reach everyone in the enterprise over the next two to five years. Looking specifically at analytics and IBM's acquisition of Netezza, how have you integrated DB2 and zSeries with Netezza, and what benefits does that integration provide?
In 2010 we came out with a product called the Smart Analytics Optimizer, which used in-memory database and columnar store technologies and was tightly integrated with DB2. Around that same time, we acquired Netezza and recognized that we had a technology that would allow us to do hardware acceleration for particular kinds of queries. We realized that capitalizing on the Netezza technology offered even better performance, capacity and appliance characteristics, so we replaced the Smart Analytics Optimizer engine with Netezza technology but kept it fully integrated into DB2 for z/OS. That marked the birth of the IBM DB2 Analytics Accelerator (IDAA). What we have done is taken the two technologies, each optimized for specific workloads, and blended them into the best of both worlds. This allows us to use the Netezza hardware acceleration for complex queries – full table scans, aggregations, and tasks of that nature – and DB2 for z/OS for the more traditional mixed workloads where it excels. This enables our customers to capitalize on all the capabilities of Netezza while retaining all the attributes of System z and DB2.
The DB2 Analytics Accelerator is transparent to the DB2 application or end user; queries just run faster when it is present. This is a full and tight integration in that it uses the DB2 Optimizer to route each query to the most appropriate technology, either DB2 or the DB2 Analytics Accelerator. The results are remarkable. We've seen our beta customers run queries against tables of millions to billions of rows that used to take hours, and those queries now complete in seconds or sub-seconds. It's very encouraging. It's probably one of our most exciting System z products in recent times, and it's changing the way people are looking at things.
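The routing decision described here can be caricatured as a cost-based dispatcher. The sketch below is purely illustrative Python, not IBM's optimizer: the markers and heuristics are assumptions chosen to show the idea that scan-heavy, aggregating queries favor the accelerator while selective lookups stay in DB2.

```python
# Illustrative sketch of cost-based query routing (NOT the real DB2
# Optimizer). The heuristic here is an assumption: aggregation and
# grouping suggest a full-scan workload suited to hardware acceleration,
# while anything else stays on the row-oriented engine.

def route_query(sql: str) -> str:
    """Return 'accelerator' for scan/aggregation-style queries, else 'db2'."""
    text = sql.upper()
    scan_markers = ("GROUP BY", "SUM(", "AVG(", "COUNT(")
    looks_analytic = any(marker in text for marker in scan_markers)
    return "accelerator" if looks_analytic else "db2"

# A grouped aggregation scans whole tables -> accelerator.
print(route_query("SELECT region, SUM(amount) FROM sales GROUP BY region"))
# A selective point lookup exploits DB2's indexes -> db2.
print(route_query("SELECT name FROM customers WHERE id = 42"))
```

The value of doing this inside the optimizer, as the interview stresses, is that neither users nor DBAs ever make this choice by hand.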
Many people today talk about appliances, but this offering truly meets the definition. Every customer that has installed it has been running queries less than two days after the box arrives. It runs DB2 queries, so it is simply a matter of installing the system and loading the data. DB2 for z/OS remains the system of record for the data, which retains all of the security and operational attributes of DB2 data. A second copy of the data is maintained on the Accelerator, synchronized with the DB2 data, and the Accelerator uses this copy for high-speed analysis. The Accelerator is fully managed via DB2 stored procedures, making it a truly plug-and-play appliance.

How does this change the dynamics of the architecture, and why is that so important to customers?
For some data warehouse applications, data has often been moved to other platforms to provide higher speed analysis. With the new DB2 Analytics Accelerator, that is no longer a requirement. In fact, it provides a platform that enables organizations to blend both their operational data store (ODS) and their data warehouse. In a typical data warehouse environment, data is accumulated and taken off the operational systems, put through a transformation and loaded into the data warehouse. That movement increases data latency. It also creates a lot of cost and overhead in moving the data around, and it opens the data to possible intrusions. With System z, everything is kept on the same platform, and DB2 and SQL are used to move the data and do the transformations; the data never leaves the platform. This lowers data latency, reduces costs, and makes for a simpler overall environment.
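The on-platform pattern described above (transform with SQL inside the database rather than extracting data to another system, often called ELT) can be sketched in a few lines, with Python's built-in SQLite standing in for DB2 for z/OS. The `orders` and `daily_sales` tables are invented for illustration.

```python
import sqlite3

# ELT sketch: the transformation runs as SQL inside the database, so the
# rows never leave the engine. SQLite is a stand-in for DB2 for z/OS;
# table and column names are invented for this example.
conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE orders (order_day TEXT, amount REAL);
    INSERT INTO orders VALUES ('2011-06-01', 100.0), ('2011-06-01', 50.0),
                              ('2011-06-02', 75.0);
    CREATE TABLE daily_sales (order_day TEXT, total REAL);
""")

# The "load" into the warehouse table is a single in-database statement:
# no export files, no transfer between platforms, no added latency.
conn.execute("""
    INSERT INTO daily_sales
    SELECT order_day, SUM(amount) FROM orders GROUP BY order_day
""")

totals = dict(conn.execute("SELECT order_day, total FROM daily_sales"))
print(totals)  # -> {'2011-06-01': 150.0, '2011-06-02': 75.0}
```

Contrast this with classic ETL, where the same rows would be unloaded, shipped to another server, transformed there, and reloaded, adding each of the costs the interview lists.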
By leveraging the same database, managing the data is simple, and operational processes are already in place, which minimizes the effort to deploy. Since administration of the Netezza technology is done with stored procedures within DB2, those procedures can be placed into standard automation routines. As a result, the DB2 Analytics Accelerator looks like a service within DB2 itself. It's completely transparent to end users. They continue to use DB2 in their applications, and the only thing that's different from their perspective is that things just happen a lot faster.
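The appliance model Meyer describes earlier, a synchronized second copy of the DB2 data that the Accelerator scans at high speed while DB2 remains the system of record, can be caricatured in a few lines. This toy model is an assumption-laden sketch, not IBM's synchronization mechanism, which runs through DB2 stored procedures.

```python
# Illustrative sketch of the "synchronized second copy" idea. The dicts
# are stand-ins: updates always land in the system of record first, and
# the analytic copy is brought back in line on refresh.

system_of_record = {1: "row-a", 2: "row-b"}   # stands in for DB2 for z/OS
accelerator_copy: dict = {}                    # stands in for the Accelerator

def refresh(source: dict, copy: dict) -> None:
    """Bring the accelerator's copy in line with the system of record."""
    copy.clear()
    copy.update(source)

refresh(system_of_record, accelerator_copy)
system_of_record[3] = "row-c"                 # an OLTP update hits DB2 first...
refresh(system_of_record, accelerator_copy)   # ...and reaches the copy on refresh
print(accelerator_copy)
```

The design point is that analytics read the copy, so heavy scans never contend with the operational workload, while security and recoverability stay with the single system of record.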
We are also discovering that customers have many creative ways to benefit from this technology. Beyond using IDAA for data warehouses and data marts, customers are seeing that their operational reporting can be accelerated as well. Today, a lot of regulation requires operational reporting to be far more complex than it has been in the past, so here is a way to apply the technology and substantially reduce the time it takes to create those operational reports. It changes the way people look at their overall enterprise architecture and changes some of the basic assumptions about where to run particular types of workloads.
So the Accelerator makes that determination based upon the query, right?
Exactly. If the two technologies were not tightly integrated, developers or users would have to figure out which technology to submit each query to and how to optimize the query for that technology. With this integration, it's not something the end user has to worry about. The optimizer within DB2 analyzes each query and determines the fastest and most cost-effective way to execute it; there is no requirement for manual intervention of any kind. Most organizations have a database analyst who spends a great deal of time looking at query plans for their warehousing systems, trying to figure out whether to add an index to a table or rewrite the query. All of that inspection and work goes away. In the DB2 Analytics Accelerator there are no indexes to create or manage; it is all done within the hardware acceleration of the Netezza technology. The SQL is moved out to the hardware accelerator, so the end user and the database administrator don't have to worry about it. You continue to use the same applications, and they have much faster query times.

It gets back to reducing complexity for the enterprise.
Absolutely.

Another big area of interest for our audience is data federation. What is your definition of data federation, and what kind of federation options does System z provide?
Federation is a process of accessing data in a variety of different places without actually having to move it. It allows me to find the records I want and bring them into my environment, application or data warehouse without necessarily having to move the entire file. This is helpful when I don't own the data. For instance, if I'm going out to a remote service, capturing information on the Web, or going after legacy file systems or legacy databases that require a variety of complex interfaces, federation gives me a great opportunity: I have a simple DB2 SQL interface and can access a wealth of different information, irrespective of where it is. Our federation product has interfaces to almost every data type, and we use it ourselves. Many IBM solutions leverage federation technology as their gateway into the dispersed and heterogeneous world, including products such as DataStage, Optim and Event Publisher.

In recent years, IBM has made other analytics acquisitions with Cognos and SPSS. Have the SPSS capabilities been integrated into the mainframe environment?
Absolutely. Today, we run SPSS on Linux on z, which is either SUSE or Red Hat, and we have the full modeling and predictive analytics capabilities of SPSS. It allows us to do those predictive analytics in line, in the database, and we are also in the process of integrating that into our Cognos application so Cognos will be able to capitalize on SPSS algorithms at the time of reporting.
Predictive analytics are becoming an even greater part of our customers' information environments, and we are working on ways to make them easier to implement and more robust in their ability to find meaningful and actionable insights.

Do you have the latest growth statistics regarding mainframe adoption?
We've certainly seen very strong growth in the mainframe, especially now that we announced the z196 last year and the z114 this year. The z114 really changes the perspective on the cost of a mainframe. In fact, you can buy an entry-level z114 for as little as $75,000, so it really changes the dynamics. It allows you to have all of the attributes of a System z environment at an entry-level price point. In the last quarter of 2010 we added 35 new System z customers, giving us an increase of 77 for the full year. So as you can see, customers are recognizing the real value in this growing platform.

I really appreciate you taking the time to bring us up to speed on the DB2 Analytics Accelerator and what's happening with Netezza, and to give us some insight into zSeries and mainframes.
SOURCE: Predictive Analytics, Operational BI and Data Warehousing on the Mainframe: A Spotlight Q&A with Alan Meyer of IBM