Blog: Rick Barton

Rick Barton

Hello and welcome to my blog. I am delighted to blog for the BeyeNETWORK, and I'm really looking forward to sharing some of my thoughts with you. My main focus will be data integration, although I don't plan to restrict myself just to this topic. Data integration is a very exciting space at the moment, and there is a lot to blog about. But there is also a big, wide IT and business world out there that I'd like to delve into when it seems relevant. I hope this blog will stimulate, occasionally educate and, who knows, possibly even entertain. In turn, I wish to expand my own knowledge and hope to achieve this through feedback from you, the community, so if you can spare the time please do get involved.

Rick

About the author

Rick is the director (and founder) at edexe. He has more than 20 years of IT delivery experience, ranging from complex technical development through to strategic DI consultancy for FTSE 100 companies. With more than 10 years of DI experience from hands-on development to architecture and consulting and everything in between, Rick’s particular passion is how to successfully adopt and leverage DI tools within an enterprise setting. Rick can be contacted directly at

September 2009 Archives

I'm back from my holidays, feeling very refreshed and raring to go. I'd like to say thanks to Phil for covering for me, and I will be following on from his theme in a later blog. (For those following: I know I have made a promise like this before, in a blog about warehouse delivery times, and haven't yet delivered, but please hang on in there. It is on my guilt list and due soon!)

Anyway, on to the business at hand: I had the pleasure of attending Enzee Universe in London last week. It was a fabulous affair, and the new TwinFin appliance was unveiled. With TwinFin, Netezza have again raised the bar from a price/performance perspective.


Although I have only had a brief overview of the new architecture, the key difference is the move to commodity hardware in the form of Intel-based blades, making the technology much more approachable. The FPGA technology is still present, retaining Netezza's "secret ingredient" and one of the key components behind the outstanding performance.


Netezza have also been busy building a set of "on the box" match-ups with other vendors, and one I am particularly enamoured of is the new KONA platform. KONA is a marriage of Kalido and Netezza, and for anyone considering a new warehouse or re-architecting a legacy platform it is very much worth a look. Kalido's ease of set-up and long-term maintenance combined with Netezza's performance is a potent mix indeed. It is, in effect, a whole data warehouse in a single box, which is a very attractive prospect from a management perspective. At the moment this offering is geared towards specific verticals; however, I would anticipate it being opened up in time.


So, all in all, a great day, and once again Netezza are demonstrating the forward thinking that has made them the market leader in appliance technologies.

Posted September 28, 2009 2:18 PM

The market for data integration tools has never been as strong or as rich as it is today. More than 10 years ago, today's giants in the marketplace offered the classic Extract, Transform and Load (ETL) capability, and they have been evolving their platforms ever since to produce the current batch of feature-rich, high-performance packages that now extend to cover data migration, data quality (profile, monitor and cleanse), web services and enterprise metadata management.
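For readers newer to the space, the Extract, Transform and Load pattern these tools automate can be reduced to three stages. Here is a minimal, purely illustrative sketch in Python; all the names and sample records are my own invention, and real DI platforms do this at scale with connectors, metadata and monitoring layered on top:

```python
# Hypothetical sketch of the classic ETL pattern; names and data are
# illustrative only, not taken from any particular DI tool.

def extract(rows):
    """Extract: read raw records from a source (here, an in-memory list)."""
    return list(rows)

def transform(records):
    """Transform: cleanse and reshape the data (a simple quality rule plus
    a formatting fix, standing in for a tool's profiling/cleansing stage)."""
    cleaned = []
    for rec in records:
        if rec.get("id") is None:
            continue  # data-quality rule: reject rows missing a key
        cleaned.append({"id": rec["id"], "name": rec["name"].strip().title()})
    return cleaned

def load(records, target):
    """Load: write the cleansed records into a target store (here, a dict
    keyed by id, standing in for a warehouse table)."""
    for rec in records:
        target[rec["id"]] = rec["name"]
    return target

source = [{"id": 1, "name": "  alice  "}, {"id": None, "name": "bob"}]
warehouse = load(transform(extract(source)), {})
print(warehouse)  # {1: 'Alice'}
```

The value a commercial DI tool adds is not this core loop, which any 3GL programmer can write, but the surrounding machinery: reusable connectors, lineage metadata, scheduling and restartability.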


A recent Gartner paper (available courtesy of Syncsort here) reveals that the market for data management and integration tools is nearly $1.7 billion and is expected to grow to $2.7 billion by 2013. This growth is driven by an awareness of the high cost of delivering data-centric projects with the manually intensive programming techniques of 3GL languages, and businesses are increasingly attracted to the productivity gains offered by data integration tools. At the same time, businesses listen intently to the marketing machines and accept at face value that the data integration tool they end up buying really will be the silver bullet that solves all their data problems.


Wrong, wrong, wrong.


Choosing a data integration technology is just the first step on the journey to improving productivity and responsiveness within a business; making it work over the long term is rather more difficult. Having worked on countless data integration projects over the last 12 years, my biggest source of frustration is when the customer has been given unrealistic expectations about how easy the technology is to work with.


Yes, DI tools are certainly an order of magnitude easier than hand-cranking code, but the architecture will not take care of itself, and the out-of-the-box settings almost always last little more than a few months before progress falters - it may even halt until things are fixed.


Why does this happen?


Most DI tools find their way into a business following a proof of concept (POC) project. Proofs of concept are just that: they prove something, and the intent is to prove the concept as quickly as possible. They don't usually produce production-ready code, and the environment isn't usually set up at this stage to support wide-scale usage. The results are usually impressive, but the work can also be very quick and dirty.


It also helps to understand the business model of the vendor, and therefore their ultimate motivation. Some vendors don't have a professional services arm and rely on licenses and maintenance as their main income stream. This raises the possibility that the immediate sale becomes the focus - at the expense of making it last.


Credit to the vendor presales teams who perform the POCs. They are highly skilled and deliver at tremendous speed. The problem, however, is that they make it all seem very easy... too easy. Remember: just because the vendor does it with consummate ease and the tool is shiny, it doesn't mean that even your brightest technical people can achieve the same feat the day after returning from the training course. Your team will need support to get to the top of that learning curve as quickly as possible, particularly around implementing a stable and scalable architecture.


So ask your vendor what their business model is. If they're not geared up to provide professional services, you probably need to find a partner to help with the transition of your delivery team from newbies to experts - and the sooner the partner gets involved, the fewer long-term problems you're likely to see.


Any integration partner worth their salt will start with a discovery phase, auditing the environment and mapping the business needs to a technology strategy and plan. I've been in this position many times, and in short order I often find that the software has been configured in a way that won't scale - in performance, complexity, enterprise growth, or all three. What follows is a difficult conversation with the business sponsor, who still can't believe that the wonder tool has failed to deliver.


The moral of the tale is this: when you start evaluating a data integration tool, also start evaluating an integration partner that you can trust and work closely with. Get them in early, preferably while you're doing the proof of concept with the vendor, so they can ensure a smooth transition between the vendor and your internal team. They can help with your evaluation criteria, your architecture and your governance, and help you avoid pain later in your delivery programme.


It's often said that there's no substitute for experience, and that's never truer than on data integration projects; a few days' consulting early in the project lifecycle can save tens or even hundreds of thousands of pounds down the line.



Posted September 7, 2009 9:05 AM