Blog: Mike Ferguson


Welcome to my blog on the UK Business Intelligence Network. I hope to help you stay in touch with hot topics and reality on the ground in the UK and European business intelligence markets, and to provide content, opinion and expertise on business intelligence (BI) and its related technologies. I would also be delighted if you shared your own valuable experiences. Let's hear what's going on in BI in the UK.

About the author

Mike Ferguson is Managing Director of Intelligent Business Strategies Limited, a leading information technology analyst and consulting company. As lead analyst and consultant, he specializes in enterprise business intelligence, enterprise business integration, and enterprise portals. He can be contacted at +44 1625 520700 or via e-mail at mferguson@intelligentbusiness.biz.

November 2009 Archives

Everywhere I look at the moment, my clients are talking about needing to benchmark themselves against the market, to understand customer and prospect sentiment on social networking sites, and to understand competitors in much more detail. I am not the only one to have recognised this need; young startup companies have also seen this gap in the market. Over the last few days I have spent some time talking to Andrew Yates, CEO of Artesian, and Christian Koestler, CEO of Lixto, about their solutions in this area.

Artesian are focused on monitoring media news, competitor intelligence and market intelligence that can be fed into front-office processes - in particular sales force automation. Integration with SalesForce.com is provided, as is delivery to mobile devices for salespeople on the road. On media intelligence, for example, their intention is to track coverage across all media channels, contextually matched to commercial triggers or specific areas of interest. What I like about Artesian is that they have looked at how to drive revenue from intelligence derived from web content by plugging it into front-office processes. Also, by adopting social software attached to front-office systems, such as SalesForce.com's new Chatter offering, it becomes possible to collaborate on this intelligence. I would like to see this solution integrate with Microsoft SharePoint and IBM Lotus Connections for wider use in large enterprises. Nevertheless, recognising the need to focus attention on content that has real value in the front office is a real strength of this young startup.

Lixto has an integrated development environment that allows you to build analytic applications that pull data from web sites - competitor price information and new competitor marketing campaign data, for example - and load it into customisable analytic applications to monitor competitors.

Extracting insight from external data is definitely on the increase, with YellowBrix and Mark Logic also in on the act. IBM jumped into the market back in October with its announcement of IBM Cognos Content Analytics. This market is heating up, and it seems to me that the start-ups are out there with competitive offerings.


Posted November 27, 2009 10:16 AM

Following on from my last blog on data federation, the next data federation pattern I would like to discuss is a Data Warehouse Virtual Data Mart pattern. This is as follows:

Pattern Description

This pattern uses data virtualization to create one or more virtual data marts on top of a BI system, thereby providing multiple summarised views of detailed historical data in a data warehouse. Different groups of users can then run ad hoc reports and analyses on these virtual data marts without interfering with each other's analytical activity.

Pattern Diagram

[Image: Virtual DM Pattern.JPG]

Pattern Example Use Case

Multiple 'power user' business analysts in the risk management department of a bank often need their own analytical environment to conduct specific in-depth analyses in order to create the best scoring and predictive models. This pattern facilitates the creation of multiple virtual data marts without the need to hold data in many different data stores.

Reasons For Using It

This pattern reduces the proliferation of data marts and also prevents inadvertent 'personal' ETL development by power users, who have a tendency to want to extract their own data to create their own data marts. It is often the case that each power user wants a detailed subset of data from a data warehouse that overlaps with the data subsets required by other power users. This pattern avoids inadvertently inconsistent ETL processing on extracts of the same data by each and every power user. It also avoids duplication of the same data in every data mart, improves power user business analyst productivity, reduces the time to create data marts and reduces the total cost of ownership.
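To make the pattern concrete, here is a minimal sketch of the idea in Python, using SQLite views to play the role of the data virtualization layer. All table, column and view names are invented for illustration; a real deployment would put a data virtualization server over the warehouse rather than defining views inside the warehouse database itself.

import sqlite3

# SQLite stands in for the data warehouse; in practice a data
# virtualization server would sit over the warehouse instead.
conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE dw_loan_transactions (
    customer_id INTEGER,
    product     TEXT,
    tx_date     TEXT,
    amount      REAL
);

-- A virtual data mart for the risk team: a summarised view, not a copy.
CREATE VIEW vm_risk_exposure AS
SELECT customer_id,
       product,
       SUM(amount) AS total_exposure,
       COUNT(*)    AS tx_count
FROM dw_loan_transactions
GROUP BY customer_id, product;

-- A second virtual mart for another user group, defined over the same
-- detailed data: no duplicated storage and no separate ETL stream.
CREATE VIEW vm_product_activity AS
SELECT product, COUNT(DISTINCT customer_id) AS customers
FROM dw_loan_transactions
GROUP BY product;
""")

conn.execute("INSERT INTO dw_loan_transactions VALUES (42, 'mortgage', '2009-11-01', 150000.0)")

# Power users query their own 'mart' while the detail stays in one place.
for row in conn.execute("SELECT * FROM vm_risk_exposure"):
    print(row)    # (42, 'mortgage', 150000.0, 1)

Because each 'mart' is just a definition, adding one for a new group of analysts requires no new storage and no new ETL development.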

 

 

 


Posted November 27, 2009 7:01 AM

Following on from my last blog on data federation, the next data federation pattern I would like to discuss is a Data Warehouse Holistic Data View pattern. This is as follows.

 

Pattern Description

This pattern, also known as the schema extension pattern, uses data virtualization to create a complete, holistic view of business activity by combining the latest operational transaction activity in one or more operational systems with the corresponding detailed historical data in data warehouses and data marts.

Pattern Diagram

[Image: Holistic Data View Pattern.JPG]

Pattern Example Use Case

Front-office staff at a call centre or in a branch of a bank may need to view the current risk exposure of the customer they are on the phone with, while also looking at that customer's risk exposure trend across all loan products held. A second use case is regulatory compliance reporting, where operational and historical data may both be needed.

Reasons For Using It

This pattern allows companies to quickly show a holistic view of business activity that combines the most recent transactional activity with historical activity. This data can be presented for analysis and reporting even if the latest transactional data has not yet reached the data warehouse.
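Here is a minimal sketch of the idea in Python, again with SQLite standing in for the stores being federated. The ATTACH simulates a data virtualization server sitting over two separate systems; all table and column names are invented for illustration.

import sqlite3

# Two separate stores: the data warehouse (main) and an operational
# system (ops). A data virtualization layer would federate these.
conn = sqlite3.connect(":memory:")
conn.execute("ATTACH ':memory:' AS ops")

conn.executescript("""
CREATE TABLE dw_exposure_history (customer_id INTEGER, as_of TEXT, exposure REAL);
CREATE TABLE ops.current_exposure (customer_id INTEGER, as_of TEXT, exposure REAL);

INSERT INTO dw_exposure_history VALUES (42, '2009-10-31', 120000.0);
INSERT INTO ops.current_exposure VALUES (42, '2009-11-16', 125500.0);

-- The holistic view: warehouse history extended with the latest
-- operational rows that have not yet been loaded into the warehouse.
CREATE TEMP VIEW holistic_exposure AS
SELECT customer_id, as_of, exposure FROM dw_exposure_history
UNION ALL
SELECT customer_id, as_of, exposure FROM ops.current_exposure;
""")

# Front-office staff see current exposure and the trend in one query.
for row in conn.execute(
        "SELECT * FROM holistic_exposure WHERE customer_id = 42 ORDER BY as_of"):
    print(row)

The federated view effectively extends the warehouse schema with rows the warehouse has not yet loaded, which is why this is also called the schema extension pattern.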

 

Look out for the next data federation data warehouse patterns, on the virtual data mart and the virtual data source, coming soon.


Posted November 16, 2009 10:08 AM

As you probably know, Informatica announced Informatica 9 yesterday in a blaze of publicity with, I am led to believe, over 10,000 people registered to view the announcement. So I thought I would make a few comments on what was announced.

The three main strands of the announcement were:

· Relevant data through business-IT collaboration

· Trustworthy data through pervasive data quality

· Timely data through open SOA-based services

Relevant data through business-IT collaboration includes new browser-based analyst tools that let analysts directly specify their business requirements, automatic generation of implementation details from business specifications, and a common metadata repository allowing business analysts and IT developers to collaborate and share specification and implementation artifacts with each other.

Pervasive data quality allows data quality rules to be specified once and reused repeatedly, ensuring consistency across applications. In addition, role-based tools are offered to allow stakeholders to take ownership of their own data quality requirements. Data quality scorecards, simple analyst tools and productive developer tools are also available to empower business users, business analysts, data stewards and IT developers to be directly involved in measuring and improving data quality.
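The 'specify once, reuse everywhere' idea is worth illustrating. Below is a minimal sketch of the general approach in Python - emphatically not Informatica's API; the rule registry and all names are invented - showing one set of centrally defined rules applied by two different applications.

import re

# Central rule definitions, owned once by data stewards.
RULES = {
    "not_null": lambda v: v not in (None, ""),
    "email_format": lambda v: re.fullmatch(r"[^@\s]+@[^@\s]+\.[^@\s]+", v or "") is not None,
}

def check(record, bindings):
    """Apply centrally defined rules to a record; return the failures."""
    return [(field, rule) for field, rule in bindings
            if not RULES[rule](record.get(field))]

# The same rules reused, unchanged, by two different applications.
crm_bindings = [("name", "not_null"), ("email", "email_format")]
billing_bindings = [("contact_email", "email_format")]

print(check({"name": "", "email": "jo@example.com"}, crm_bindings))
# -> [('name', 'not_null')]
print(check({"contact_email": "bad-address"}, billing_bindings))
# -> [('contact_email', 'email_format')]

Because both applications bind to the same central rules, a steward who tightens the email rule tightens it everywhere at once - the consistency benefit the announcement is claiming.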

 

SOA data services include support for:

· Information catalog services to enable users to discover relevant data, be it on-premise or in the internet cloud

· Logical data objects

· Multi-modal data provisioning services to deliver data in multiple formats using various protocols such as web services and SQL (see the sketch after this list)

· Policy-based data services governance
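To show what multi-modal provisioning of a logical data object might look like in principle, here is a generic Python sketch - not Informatica's implementation; all names are invented - in which one 'customer' definition is served both as a SQL result and as the JSON payload a web service would return.

import json
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE customers (id INTEGER, name TEXT, segment TEXT)")
conn.execute("INSERT INTO customers VALUES (1, 'Acme Ltd', 'corporate')")

# One logical data object: a single definition of 'customer'.
CUSTOMER_QUERY = "SELECT id, name, segment FROM customers WHERE id = ?"

def customer_via_sql(customer_id):
    """SQL-style delivery, as a BI tool would consume it."""
    return conn.execute(CUSTOMER_QUERY, (customer_id,)).fetchone()

def customer_via_service(customer_id):
    """Web-service-style delivery: the same object as a JSON payload."""
    row = customer_via_sql(customer_id)
    return json.dumps(dict(zip(("id", "name", "segment"), row))) if row else None

print(customer_via_sql(1))      # (1, 'Acme Ltd', 'corporate')
print(customer_via_service(1))  # {"id": 1, "name": "Acme Ltd", "segment": "corporate"}

The point is that the object is defined once and the delivery protocol is just a wrapper, so adding another format does not mean redefining the data.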

 

In my opinion, the differentiators include policy-based data services governance (which is unique) and the business analyst tools and collaboration support. The web-based business analyst tools look a very compelling story, although I would have liked to see more integration with Microsoft and IBM Lotus collaborative tools and workspaces.

Data federation and consolidation on the same platform, off the same metadata, with auto-generation is a very strong capability. IBM has the same function, but auto-generation in its case comes out of two separate tools (InfoSphere Data Architect generates data federation logical objects and mappings, while InfoSphere FastTrack generates ETL jobs for DataStage; both IBM tools use common metadata, however). I would have liked to see Informatica go the extra mile and auto-generate XSLTs for XML message translation by ESB/message broker products. I don't see this support, but equally I don't see it anywhere else either as yet. In addition, I would have liked to see MapReduce functionality in the announcement to handle Big Data integration. No doubt this is coming.

With respect to data services, I don't see the ability to publish data services to an enterprise service repository so that these services can be managed centrally in a common place with all other types of service, although UDDI support was announced. Some competitors can publish services to ESRs, e.g. IBM with InfoSphere Information Services Director. Informatica's approach to cloud data integration also appears seamless, but more information is needed; I understand a new announcement is coming soon, although they have already announced support for running PowerCenter on Amazon's EC2 cloud. In terms of competition, Microsoft can already run SSIS on the SQL Azure cloud to integrate cloud data. In addition, IBM also has multi-modal support on InfoSphere Information Server beyond SQL and web services: they also support Java RMI and REST as well as SOAP, SQL and XQuery.

I would also have liked to see Informatica stick their neck out and acquire a data modelling tool rather than just integrate with everyone else's products. However, overall this is a strong announcement, with another cloud announcement to come. There is no doubt that integrated data management platforms are here now, with Informatica and IBM leading the way with Eclipse-based tool suites. SAP BusinessObjects and SAS DataFlux are clearly not far behind. Expect more from Oracle and Microsoft in 2010.

Looking at the trend here, it is clear that companies need to look seriously at moving from separate data management tools from many different suppliers, each with its own metadata, to single platforms with integrated, shared metadata.


Posted November 11, 2009 8:26 AM