Remy Rosenbaum lays out several reasons why he’s seen business intelligence projects on Hadoop fail:
In order to set up and run an effective Big Data Hadoop project that provides reliable BI, your organization will need to adopt a new mindset that addresses not only the technology, but also the organization's enterprise information management (EIM) practices. You will need to conduct a comprehensive analysis of your business with the help of analysts, internal domain experts, and strategists to come up with robust and relevant business use cases. You will also need buy-in from management, and you will have to take company politics into consideration.
Your Big Data project needs to work with your existing BI tools, along with your security and monitoring systems. Data security deserves particular attention: standard Hadoop implementations offer relatively weak security, and many organizations are wary of keeping all their data in one location.
I do agree with these reasons, though I'm a bit surprised that I didn't see much about "classic" BI problems: the inability of the company to standardize on terminology or definitions (the problem the Kimball method addresses with conformed dimensions), the desire to tackle too much of the problem at once, rapidly changing source systems (and how BI team members tend to be the last to know that something has changed), etc.
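For readers unfamiliar with the Kimball term, here is a minimal sketch of the conformed-dimension idea: two fact tables share one dimension table, so a label like "month" means the same thing in both business processes. The table and column names below are hypothetical, chosen only for illustration:

```python
import pandas as pd

# One "conformed" date dimension, shared by every fact table that needs dates.
dim_date = pd.DataFrame({
    "date_key": [20240101, 20240102, 20240201],
    "month":    ["2024-01", "2024-01", "2024-02"],
})

# Two separate business processes, each recorded in its own fact table,
# both keyed to the same dimension.
fact_sales = pd.DataFrame({
    "date_key": [20240101, 20240102, 20240201],
    "revenue":  [100.0, 150.0, 200.0],
})
fact_returns = pd.DataFrame({
    "date_key": [20240102, 20240201],
    "refund":   [20.0, 35.0],
})

# Because both facts join to the same dimension, monthly revenue and refunds
# can be compared directly -- there are no competing definitions of "month".
sales_by_month = fact_sales.merge(dim_date, on="date_key").groupby("month")["revenue"].sum()
returns_by_month = fact_returns.merge(dim_date, on="date_key").groupby("month")["refund"].sum()
print(pd.concat([sales_by_month, returns_by_month], axis=1).fillna(0))
```

When each team instead maintains its own date (or customer, or product) table, the joins above stop lining up, and that is exactly the terminology-standardization failure mode mentioned above.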