This is our blog about designing and implementing intelligent organizations. Within this area of expertise we often write about the following subjects: BI, analytics, tools, decision management, data visualization, BI success, Business Intelligence concepts, data management, continuous improvement, and the organization of BI & Analytics.
Of course you can order our book here too.
A’dam – October 17th, 2007 – Business intelligence is a priority for enterprise leaders, but BI efforts are being undermined by poor organization and communication. One way to address this, according to Gartner, is to create a ‘BI competency centre’ within the business. So what does this entail?
For the second year in a row, studies have revealed that business intelligence (BI) tools are the number one technology priority for business leaders. Earlier this year, Gartner surveyed 1,400 CIOs, with the vast majority emphasising the importance of BI as a strategic initiative. But even though firms are acknowledging that BI is instrumental in driving business effectiveness and innovation, there remains a sticking point when it comes to the actual execution of the business intelligence strategy.
In a recent poll on this website we asked our visitors to vote for what they think contributes most to the success of Business Intelligence. Most voters (approx. 50%) think that the success depends highly on using the best Business Intelligence tool.
In a recently held survey among nearly 400 organizations across several industries, it became clear that the success of Business Intelligence depends highly on the analytical skills of people: information should be used for analysis and action. The results of the survey are presented in the research paper “Making BI successful”.
The year 2010 is well on its way and many of you have posted your insights and predictions for the upcoming year.
If you combine those predictions/outlooks with this enormous list compiled by the Cutter Consortium you’ll see a lot of stuff that repeatedly gets mentioned. Cloud, Agile and Social Media seem to be the hottest trends for the years ahead, and Open Source BI tools, Operational Business Intelligence and Analytical Databases are also part of almost every list.
When it comes to your body’s health, exercise, diet, and proper rest are key to a productive life. The same may be said about getting the most out of your enterprise’s business intelligence (BI) systems.
Shaklee, a 50+ year-old private company that makes nutrition, personal care, and eco-friendly household products, as well as water treatment systems, knows about this. The company operates through a network of more than 750,000 independent members and distributors in the US, Mexico, Canada, Japan, Malaysia, Taiwan, and China.
“Use it up. Wear it out. Make it do. Or do without.” That adage from the Great Depression is making a comeback these days among corporations that are digging deep to maintain profitability using business tools they already have in-house.
One of those companies is Creativity Inc., which two years ago was facing a serious threat to its business model. The company, which designs crafting products and markets and distributes its wares to specialty retailers, was being undercut by overseas manufacturers as retailers began to buy direct. The trend preceded the current economic downturn, but it hit with renewed vigor when the recession deepened.
The recession is fostering interest in Business Intelligence software, which helps companies analyze the data they collect for new cost-cutting or sales opportunities.
By Rachael King
With restaurants stretching from Seattle to the Osan Air Base in South Korea, the Chili’s Grill & Bar chain gets a lot of visitors. For parent company Brinker International, that makes for a lot of Triple Dippers to keep track of. Also the owner of such chains as On The Border Mexican Grill & Cantina, Brinker has to gather information on sales, inventory, and other operations at 1,700 restaurants in 27 countries that receive more than 1 million customers a day. An even bigger challenge, though, is sifting through all that data, organizing it, and then using it to make the business run more profitably.
Question: Which ETL tools can support Data Vault modeling out-of-the-box? What are the challenges and issues building a data vault with ETL tools?
1. Marcel de Wit – See another discussion on LinkedIn (still active; Dutch)
2. Daan van Beek – Thanks Marcel, I did read the comments in that discussion too; in fact it was the reason I started this one. After reading it, it was still unclear to me whether ETL tools support Data Vault out-of-the-box, the way they support slowly changing dimensions. So, who knows the answer(s)? The vendors?
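Out-of-the-box support like slowly changing dimensions is easiest to picture against the hand-coded alternative. Below is a minimal, hypothetical Python sketch of a Type 2 slowly changing dimension merge, roughly the logic an ETL tool generates for you when it supports SCDs natively; whether a given tool can do the same for Data Vault structures is exactly the open question in this thread.

```python
from datetime import date

# Hand-coded Type 2 slowly changing dimension merge: the kind of logic
# that "out-of-the-box" SCD support in an ETL tool generates for you.
# All keys and attribute names here are hypothetical.

def scd2_merge(dimension, incoming, today):
    """Expire changed rows and append new versions.

    dimension: list of dicts {key, name, valid_from, valid_to};
               valid_to is None for the current version of a key.
    incoming:  dict mapping business key -> latest attribute value.
    """
    current = {row["key"]: row for row in dimension if row["valid_to"] is None}
    for key, name in incoming.items():
        row = current.get(key)
        if row is None:
            # new business key: insert its first version
            dimension.append({"key": key, "name": name,
                              "valid_from": today, "valid_to": None})
        elif row["name"] != name:
            # attribute changed: expire the old row, append a new version
            row["valid_to"] = today
            dimension.append({"key": key, "name": name,
                              "valid_from": today, "valid_to": None})
    return dimension

# demo run: one existing member, one changed name, one new key
dim = [{"key": 1, "name": "Acme",
        "valid_from": date(2005, 1, 1), "valid_to": None}]
dim = scd2_merge(dim, {1: "Acme Corp", 2: "Beta"}, today=date(2008, 1, 1))
```

Writing and maintaining this by hand for every dimension is precisely the effort that native tool support removes.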
After users become familiar with your BI project’s benefits, they’ll likely want more. Be prepared to provide analysis of unstructured data. We’ll show you how to start.
You’ve spent the last five years defining, establishing, and building an analytical environment for your organization. You received accolades for finally providing access to structured information from your company’s transactional systems through a business intelligence (BI) tool with underlying data marts, a data warehouse, and a data integration tool. Now — all of a sudden, it seems — your colleagues are asking for access to other kinds of content such as email, documents, and audio-visual media through your analytical architecture so they can use this content for predictive analytics in the BI application. Where should you start?
You can still hand-code an extract, transform and load system, but in most cases the self-documentation, structured development path and extensibility of an ETL tool is well worth the cost. Here’s a close look at the pros and cons of buying rather than building.
The Extract, Transformation, and Load (ETL) system is the most time-consuming and expensive part of building a data warehouse and delivering business intelligence to your user community.
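For contrast with buying a tool, a hand-coded ETL step can be as small as the sketch below (the feed layout, table and column names are all hypothetical). It works, but every feed added this way also adds undocumented code, which is where a tool’s self-documentation and structured development path earn their cost.

```python
import csv
import io
import sqlite3

# Minimal hand-coded extract-transform-load step. Fine for one small
# feed; the maintenance burden grows quickly as feeds, transformations
# and error handling accumulate.

def etl(source, conn):
    """Extract CSV rows from a file-like source, transform them,
    and load them into a 'sales' table."""
    conn.execute("CREATE TABLE IF NOT EXISTS sales (region TEXT, amount REAL)")
    for row in csv.DictReader(source):
        region = row["region"].strip().upper()  # transform: normalize codes
        amount = float(row["amount"])           # transform: cast to number
        conn.execute("INSERT INTO sales VALUES (?, ?)", (region, amount))
    conn.commit()
    return conn.execute("SELECT COUNT(*) FROM sales").fetchone()[0]

# demo run against an in-memory database
feed = io.StringIO("region,amount\n nl ,10.5\nus,20\n")
loaded = etl(feed, sqlite3.connect(":memory:"))
```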
Many companies are seeing very significant increases in data volumes and these are having an impact on their data warehouse programmes, according to the latest survey from PMP Research. The research has been commissioned by the Evaluation Centre.
Most of the organisations polled (68%) report that data volumes have increased substantially over the past three years, with a further 25% indicating more modest rises. Only 2% reckon that data volumes have stayed constant over that time period.
“I am tired of IT people. They’ll talk to you for half an hour about what they do, but by the time they’re finished, you still don’t have any idea what they’ve told you or why it’s important,” so complained a friend of mine in a recent conversation.
What was the irritant giving rise to my friend’s frustration? Ahem … my own ineffective attempt to explain to him what it is that I, a data warehousing/business intelligence (DW/Business Intelligence) practitioner, do. Sometimes the right metaphor is helpful. It can clarify abstract concepts for the uninitiated and, even for the expert, be a means for synchronizing designs and vocabularies and analyzing problems. To that end, let me propose a metaphor for describing what data warehousing and Business Intelligence is all about and, perhaps more importantly, suggest where the field is broken: the metaphor is that of an information supply chain.
A’dam – September 22nd, 2007 – The seeming simplicity of a term like “business intelligence” belies the rich potential its implementation can provide to a small or midsize company. Data mining is the backbone of BI: it means sorting through company data and identifying and extracting valuable information about operations.
In a customer-centric business, the information can be particularly compelling. We turned to Gordon Linoff, a principal at Data Miners, to learn what data mining for BI can do for small and midsize businesses.
When is right-time not real-time enough? That, as it happens, is the million-dollar question. Business intelligence vendors have made an awful lot of noise about a “new generation” of “real-time”, “near-real-time”, or “decisioning” tools, touting everything from their ability to provide “actionable business insight” to other intangible (and usually unspecified) competitive advantages.
Industry watchers, on the other hand, like to stress that there is no single “real-time” Rx for every company or every industry; instead, they argue, companies need to figure out a “right time” window that’s, er, right for them, and go from there. Who’s right? Who’s wrong? Does it matter? Well, everyone. And no one. And, yes, it does matter. In fact, once you get down to brass tacks with most Business Intelligence and Performance Management vendors, you’ll find that their views are actually very similar—for the most part identical—to those of many prominent industry talking heads.
We have recently been investigating why so many data migration projects (84%) run over time or budget. Over half of the respondents in our survey who had run over budget blamed inadequate scoping (that is, they had not properly or fully profiled their data) and more than two thirds of those that had gone over time put their emphasis in the same place.
I mention this because it is symptomatic of all data integration and data movement projects: data quality needs to start before you begin your project (so that you can properly budget time and resources) and continues right through the course of the project and, where it is not a one-time project like data migration, is maintained on an on-going basis through production. Further, in order to maintain quality you need to be able to monitor data quality (via dashboards and the like) on an ongoing basis as well. This is especially important within the context of data governance.
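The ongoing monitoring described above can start very small. The sketch below (field names and rules are hypothetical) computes per-field completeness scores of the kind a data-quality dashboard might display:

```python
# Sketch of an ongoing data-quality check: rule-based completeness
# scores per required field, suitable for feeding a dashboard.
# The required fields and the sample records are hypothetical.

def quality_metrics(rows, required=("customer_id", "postcode")):
    """Return, per required field, the fraction of rows where
    the field is present and non-empty."""
    total = len(rows)
    return {field: sum(1 for r in rows if r.get(field)) / total
            for field in required}

# demo run: one complete record, one with a missing postcode
metrics = quality_metrics([
    {"customer_id": 1, "postcode": "1011"},
    {"customer_id": 2, "postcode": ""},
])
```

Run before the project starts, such checks inform the scoping and budgeting the survey respondents said they missed; run in production, they become the ongoing monitoring that data governance requires.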
A’dam – March 3rd, 2008 – With several books on data warehousing to his name, and a business that trains IT staff in the field, Ralph Kimball has been named in some quarters the “father of data warehousing” — though he appears to share that title with another data warehousing pioneer, Bill Inmon, if a Google search of the two is anything to go by.
He and Inmon have different approaches to the art of data warehousing, but Kimball, who is teaching in New Zealand this month, says that doesn’t make them enemies.
Ab Initio has been around in the ETL market space long enough to make a name for itself. Its marketing approach appears to be one of mystique: maintain secrecy around the product while letting out some information about high-end customers, creating interest through the tantalizing tidbits it provides.
Its website reads more like that of an advertising firm with nothing meaningful to say than that of a technology company. Years later (2015) this is still the case: the website loads very slowly, is built entirely in Flash (no plain HTML), and there is not even a phone number or contact form. Our guess: they are finished.
IBM today announced plans to acquire DataMirror Corporation, a business information firm based in Ontario, Canada, for approximately $161m in cash.
DataMirror provides technology that identifies and captures data that has been added, updated or deleted, and allows the changed data to be delivered to processes, applications and databases.
“Organizations need the ability to capture and use information in real time to help them make better business decisions, better serve their customers and increase operational efficiencies,” said Ambuj Goyal, general manager at IBM’s Information Management Software unit.
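The change-capture idea behind DataMirror’s technology can be illustrated with a naive snapshot diff (real CDC products typically read database transaction logs instead, which is far cheaper than comparing full snapshots; all data here is made up):

```python
# Illustrative change-data-capture by snapshot comparison: identify
# the rows that were added, updated or deleted between two snapshots
# so that only the changes need to be delivered downstream.

def diff_snapshots(old, new):
    """old/new: dicts mapping primary key -> row contents.
    Returns (inserts, updates, deletes) between the two snapshots."""
    inserts = {k: v for k, v in new.items() if k not in old}
    deletes = {k: v for k, v in old.items() if k not in new}
    updates = {k: new[k] for k in old.keys() & new.keys() if old[k] != new[k]}
    return inserts, updates, deletes

# demo run: row 1 deleted, row 2 changed, row 3 added
inserts, updates, deletes = diff_snapshots({1: "a", 2: "b"},
                                           {2: "B", 3: "c"})
```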
Talend, a global open source software leader, today announced version 5 of its unified platform for integration, becoming the only open source vendor to offer a comprehensive technology portfolio that spans all enterprise integration needs, including data, applications and business processes.
With this new version, Talend introduces a holistic approach to integration, enabling IT organizations to converge traditionally disparate integration efforts and practices through a common set of products, tools and best practices. The Talend Unified Platform provides a common foundation for the products to be used individually or in concert depending on the needs of the organization.
Today from PASS Summit 2010, Microsoft Corp. unveiled the next generation of SQL Server, Microsoft SQL Server code-named “Denali,” and made available SQL Server 2008 R2 Parallel Data Warehouse.
The company also announced the new Microsoft Critical Advantage Program, which provides an end-to-end experience across the complete life cycle of mission-critical appliances, such as SQL Server 2008 R2 Parallel Data Warehouse. Combined, the product and program help enterprise customers successfully design, deploy and optimize appliance-based solutions to help ensure the highest level of scalability, availability and reliability for the most demanding data warehouse requirements.
Oracle today unveiled the Oracle Exalytics Business Intelligence Machine, the industry’s first in-memory hardware and software system engineered to run analytics faster than ever, provide real-time speed-of-thought visual analysis, and enable new types of analytic applications.
Oracle Exalytics enables organizations to make decisions faster in the context of rapidly shifting business conditions while broadening user adoption of business analytics through the introduction of interactive visualization capabilities that make every user an analyst.
Informatica Corporation, the world’s number one independent leader in data integration software, today announced continued momentum in the adoption of Informatica Cloud across the insurance industry.
An increasing number of insurance companies rely on Informatica to address their unique challenges of integrating disparate data from a multitude of channels including adjusters, brokers, service providers, underwriters and other related parties.
MicroStrategy Incorporated (Nasdaq: MSTR), a leading worldwide provider of Business Intelligence (BI) software, and Informatica Corporation (Nasdaq: INFA), the world’s number one independent leader in data integration software, today announced a strategic alliance to make Informatica’s enterprise-class Data Integration Platform a core component of the MicroStrategy Cloud.
MicroStrategy Cloud will offer both multi-tenant and dedicated versions of Informatica’s Data Integration technology as an optional service to MicroStrategy Cloud customers. These data integration services will be delivered under the MicroStrategy Cloud brand name.
Integration vendors launch commercial add-on products for the hot open-source framework. Here’s how four products streamline high-volume workloads.
By Doug Henschen
Apache Hadoop is one of the fastest-growing open-source projects going, so it’s no surprise that commercial vendors are looking for a piece of the action.
Witness a spate of recent announcements from well-known data-integration vendors including Informatica, Pervasive Software, SnapLogic, and Syncsort, all of which are aimed at making it faster and easier to work with a very young big data processing platform.
For the last three months of last year, Passionned Group ran a poll on their ETLtool.com site asking visitors what they thought were the most important requirements when choosing an ETL tool. The ETLtool.com site has been running since 2005; it advises visitors on the various ETL (data integration) tools available, their strong and weak points, and how to choose an ETL tool, and sells a 100-page report in which popular products are analyzed and compared to facilitate choosing the right product for individual circumstances.
This chapter focuses on a new design technique for the analysis and design of data integration processes. This technique uses a graphical process modeling view of data integration similar to the graphical view an entity-relationship diagram provides for data models.
There is a hypothesis regarding the issue of massive duplication of data integration processes, which is as follows:
If you do not see a process, you will replicate that process.
Jaspersoft has delivered a new release of its open-source Business Intelligence software, offering an improved user interface framework and new customization and integration capabilities geared toward developers who build the toolset into their applications.
Sixty percent of the company’s business comes from ISVs and developers who embed Jaspersoft within their applications, said CEO Brian Gentile, in an interview.
A lot of business intelligence vendors are selling their cloud offerings, pitching them to us as ‘heaven on earth’: cheaper, faster, more scalable and more reliable, they say. But can true Business Intelligence be in the cloud? Can we draw parallels between water from the tap and Business Intelligence in the cloud?
In our view it’s running your business better using key information about your processes, your clients and the market. To be able to do that you should gather all kinds of data from a variety of source systems inside and outside your company network, integrate the data and transform it into information to produce insights in such a way that we can speak of ‘intelligence’. Each company has its own intelligence, which can bring a real competitive advantage, allowing companies to swim in the profit pool.
To help organizations achieve better business visibility and alignment, Oracle today introduced new releases of its complete, integrated and scalable business intelligence products including Oracle Business Intelligence, Oracle Business Intelligence Applications and Oracle Real-Time Decisions. The new product capabilities delivered span out-of-the-box iPad and iPhone support, extended OLAP and in-memory platform support, enhanced real-time decision management features, new certifications, and more.
At the recent annual SQL Server user conference (PASS) there was some exciting news, and Microsoft’s BI team recaps the most important announcements as follows. Corporate Vice President Ted Kummert of Microsoft shared his views on the ‘New World of Data’ and made a number of important announcements.
Ted unveiled the final name for the next SQL Server release: SQL Server 2012. Not really a big surprise.
REDWOOD CITY, Calif., June 6, 2011 – Informatica Corporation (NASDAQ: INFA), the world’s number one independent provider of data integration software, today announced the Informatica 9.1 Platform. Based on years of pioneering innovation, Informatica 9.1 is the first unified data integration platform designed to unleash the full business potential of Big Data and empower the data-centric enterprise.
Information Builders, an independent leader in operational Business Intelligence (BI) solutions, today announced a new complete end-to-end data warehousing solution as part of the Teradata Accelerate program. The solution is built on a Teradata data warehouse appliance, and leverages Information Builders’ software to both load and query the data. The new solution is designed to speed up the time to value and return on investment for companies building a data warehouse.
Companies just don’t get it… Just because business intelligence sounds impressive and has spawned a whole industry doesn’t mean companies know what it’s really for, says the Naked CIO. I have just been asked to talk at a conference about what not to do when implementing a business intelligence platform.
Given my views on business intelligence, they may not have realized quite what they’re letting themselves in for. Even the term ‘business intelligence’ confounds me. Apart from the two words often being contradictory, my particular beef is that business intelligence is no more than a construct. It is a result of a workaround to systems and applications not doing what they were supposed to in the first place. What’s worse, our collective incompetence at making systems and applications work properly has spawned an entire industry of Business Intelligence software. Even a piece of software that was designed as a workaround for integration is often not deployed properly. Am I the only one to see the irony in this?
The Business Intelligence vendor QlikTech has released a new version of its flagship QlikView Business Intelligence software. In this version a completely new concept is added to the existing capabilities: set-based analysis.
Vendors like BusinessObjects and Cognos were focused on reporting from the beginning. Business Intelligence software pioneer QlikTech, by contrast, has never offered reporting; instead, it invested heavily in developing advanced, often very user-friendly tools for analyzing information. With the introduction of set-based analysis it has added a completely new concept to its award-winning Business Intelligence software for both businesses and (local) governments.
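The idea behind set-based analysis can be illustrated outside QlikView with plain Python sets (the data and field names here are invented): questions like “which customers bought in 2007 but not in 2008?” become set differences and intersections over the selected records.

```python
# Set-based analysis in miniature (hypothetical sales data): comparing
# sets of customers across selections instead of reporting row by row.

sales = [
    {"customer": "A", "year": 2007},
    {"customer": "B", "year": 2007},
    {"customer": "A", "year": 2008},
]

def customers_in(rows, year):
    """The set of customers with at least one sale in the given year."""
    return {r["customer"] for r in rows if r["year"] == year}

churned  = customers_in(sales, 2007) - customers_in(sales, 2008)  # lost
retained = customers_in(sales, 2007) & customers_in(sales, 2008)  # kept
```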
Microsoft continues its shopping spree to bolster its SQL Server database platform to make it more suitable for large-scale enterprise deployments. On Thursday the company said it plans to buy DATAllegro, a privately held maker of data-warehouse appliances.
The terms of the deal, which comes on the heels of one announced last week to purchase data-quality technology vendor Zoomix, were not disclosed. Microsoft will retain most of the 93 DATAllegro employees, who will continue to work out of their existing office in Aliso Viejo, California.
SAP, the German company that sells and implements ERP software, has acquired Business Intelligence vendor BusinessObjects. Both companies have agreed on the price per share and other conditions: SAP will pay 42 EUR in cash per ordinary share, and John Schwarz, CEO of BusinessObjects, will stay in charge of the BusinessObjects product group.
With the acquisition of BusinessObjects, SAP gains a substantial market share in the highly competitive Business Intelligence market. The company already had a Business Intelligence offering with SAP Business Warehouse, the query tool BEx and the business process platform SAP NetWeaver BI, but these run only with SAP. In addition, these solutions were far from innovative, and their functionality and usability were relatively poor compared to BusinessObjects and most other Business Intelligence tools.