It’s amazing how similar the various “warnings” are when it comes to data warehouse projects. An article by Dr. Paul Dorsey lists several cautions on this same issue. Dorsey specifically states that “most organizations that undertake warehouse projects [are] dismally disappointed with the results…[because] data warehouse projects are much harder to complete successfully than traditional systems development projects.” Why is this? Here are some fundamental reasons: 1. Designing a data warehouse is fundamentally different from designing an OLTP structure.

Rushing into a data warehouse project is one of the worst mistakes a company can make, as was the case with Close Call. The CEO of that company made the first blunder in being tempted by a software vendor. Acting on his own malformed “instinct” that he had to “act fast” in order to keep up with growth, the CEO impulsively bought into the vendor’s sales pitch and set two unrealistic goals: to develop the data warehouse in three to four months and to budget the project at only $250,000. Another initial mistake was not starting out with an experimental pilot project (sometimes it is best to start with a mini version, such as a data mart, rather than a full-blown project). As it turned out, the project team ended up extending the initial build time to five months and persuading the CEO to instigate a pilot project first. But even with the pilot in place, the Close Call project was doomed from the start due to one primary fault: the lack of a clearly defined business objective.

During the requirements phase, the functional requirements model revealed a highly complex set of business requirements; this was further complicated by “an inconsistent group of data ‘facts’ that would populate the warehouse.” Obviously, no one had given serious thought to data preprocessing stages such as cleaning, integration, and transformation. Not only was the data “dirty,” much of it was not useful (or even available) in its current state due to legacy and antiquated-technology issues. Data migration is paramount. According to Dorsey, “data migration can sink the project. Not only are migration scripts large and complex, they must be maintainable because they have to keep the warehouse in synch with the production system when the structure of either changes. This is not like a legacy migration script that is used once and discarded. Because it must be run periodically, the script must be tuned to run efficiently and maintained easily.” (A sketch of what such a periodically rerun script might look like appears at the end of this piece.) The Close Call project overran its deadline and failed due to these
mistakes. According to the article, this project raised several “red flags.”

The author of Ten Mistakes to Avoid (http://www.dw-institute.com/papers/10mistks.htm) offers some sound advice. In addition to the mistakes listed above, which of the mistakes listed in that article did Close Call make? From Close Call’s own account: “Panicked at the thought of breaking that news to the executive sponsors, the team jury-rigged a way to populate the pilot by parsing the DOS-based Reflex reports and manipulating the report data into a relational database format. But the handwriting was on the wall—without replacing the proprietary switching systems, there would be no data warehouse.” The best move Close Call made was to abandon the project. It took a heavy toll (costing the company a considerable amount of money and half its IT staff), but had the company insisted on continuing, it would have cost much more. Unbelievably, this fiasco started at a friendly golf game.

Some tips for approaching a data warehouse project: A. The project leader must be experienced. He or she should
have completed other successful data warehouse projects and be aware
of the different types of end-user tools including flexible reporting,
ad hoc query and OLAP alternatives.
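To make Dorsey’s point about migration concrete, here is a minimal sketch of a periodic, rerunnable warehouse-sync step in Python. The production “orders” table, the warehouse “fact_orders” table, and the column names are hypothetical illustrations, not details from the Close Call case; the sketch only shows the kind of cleaning, transformation, and idempotent loading such a script has to perform on every run.

import sqlite3

def sync_orders(prod: sqlite3.Connection, dw: sqlite3.Connection) -> int:
    """Copy rows changed since the last run from production into the warehouse.

    Hypothetical schema: assumes fact_orders has a UNIQUE constraint on order_id.
    """
    # A high-water mark kept in the warehouse makes the script safe to rerun.
    dw.execute("CREATE TABLE IF NOT EXISTS etl_state (last_sync TEXT)")
    row = dw.execute("SELECT last_sync FROM etl_state").fetchone()
    last_sync = row[0] if row else "1970-01-01T00:00:00"

    # Extract only what changed since the previous run, not the whole table.
    changed = prod.execute(
        "SELECT order_id, customer, amount, updated_at"
        " FROM orders WHERE updated_at > ? ORDER BY updated_at",
        (last_sync,),
    ).fetchall()

    loaded = 0
    for order_id, customer, amount, updated_at in changed:
        # Cleaning: skip rows that would corrupt the warehouse.
        if customer is None or amount is None:
            continue
        # Transformation: normalize values the way the warehouse expects them.
        customer = customer.strip().upper()
        # Idempotent load: an upsert, so reruns never duplicate facts.
        dw.execute(
            "INSERT INTO fact_orders (order_id, customer, amount, updated_at)"
            " VALUES (?, ?, ?, ?)"
            " ON CONFLICT(order_id) DO UPDATE SET"
            " customer = excluded.customer, amount = excluded.amount,"
            " updated_at = excluded.updated_at",
            (order_id, customer, amount, updated_at),
        )
        loaded += 1

    if changed:
        # Advance the high-water mark so the next run starts where this one stopped.
        dw.execute("DELETE FROM etl_state")
        dw.execute("INSERT INTO etl_state (last_sync) VALUES (?)", (changed[-1][3],))
    dw.commit()
    return loaded

Even a toy like this shows why Dorsey treats migration as the place where a warehouse project can sink: the script is easy to write once, but keeping it efficient and maintainable as either schema evolves is an ongoing cost that has to be planned and budgeted for.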