This article describes a practical method for BI professionals, business and information analysts, and controllers to systematically develop proper KPIs. Our method of defining KPIs, the SMART KPI Toolbox, is well-documented and also makes KPIs technically implementable. One of the advantages of good business intelligence is that every user can effortlessly see the information relevant to them on the screen. Besides descriptive reports and analysis possibilities, that means relevant Key Performance Indicators (KPIs) that provide quick insight into the performance of a team, department, or segment of the business. KPIs are a part of performance management.
If the challenges associated with Big Data are handled well by making things relevant, digestible, and specific, you can go from Big Data to Right Data and then make the Right Decisions. Then, you'll be able to make the four Vs of Big Data work to your advantage: you'll see more and see better as an organization! Volume: see more. Velocity: spot things faster. Variety: see more detail and more nuance. Veracity: trust what you see.
Do you want to stay ahead of the competition and give your customers the best possible experience? Become a data-driven organization. Instead of making decisions based on opinions, gut feeling, who yells the loudest, or because "that's just the way we've always done it," your organization will take action based on data and facts. A study by the MIT Center for Digital Business shows that organizations that make data-driven decisions have 4% higher productivity and earn 6% more profit. That may not seem like a lot to some, but on $1 million in profit, we're already talking about an extra $60,000. That amount of money can get you a pretty decent BI system. Other studies have shown that successfully using BI can lead to 33% more satisfied customers. And with IoT applications, we can save $63 billion worldwide in healthcare. You can find many advantages of data-driven working in all kinds of areas. That's why you should make data an integral part of your company's DNA now, in just five steps.
The latest edition of the ETL Tools Survey is out: a good reason to chat with Rick van der Linden, Business Intelligence and ETL expert, about the latest trends in ETL. "ETL stands for extract, transform, and load. ETL can be compared to purifying water. First, undrinkable water (data) is extracted from various rivers. This dirty water is purified using a tool. The purified water is stored in a container that you can drink from," says Rick van der Linden.
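Rick's water analogy maps directly onto the three stages of an ETL pipeline. As a minimal illustration (the source rows, cleaning rules, and in-memory "warehouse" below are hypothetical examples, not any specific tool's API), a sketch in Python:

```python
# Minimal ETL sketch mirroring the extract / transform / load stages.
# All data and cleaning rules are made-up examples for illustration.

def extract():
    # "Undrinkable water": raw rows pulled from various sources.
    return [
        {"customer": "  Alice ", "revenue": "1200"},
        {"customer": "Bob", "revenue": None},        # dirty record
        {"customer": "  Alice ", "revenue": "1200"}, # duplicate
    ]

def transform(rows):
    # "Purify": trim whitespace, drop incomplete rows, de-duplicate, cast types.
    seen, clean = set(), []
    for row in rows:
        if row["revenue"] is None:
            continue
        record = (row["customer"].strip(), int(row["revenue"]))
        if record not in seen:
            seen.add(record)
            clean.append({"customer": record[0], "revenue": record[1]})
    return clean

def load(rows, warehouse):
    # "Store it in a container you can drink from": append to the target store.
    warehouse.extend(rows)

warehouse = []
load(transform(extract()), warehouse)
print(warehouse)  # [{'customer': 'Alice', 'revenue': 1200}]
```

Real ETL tools add scheduling, lineage, and error handling on top of these same three steps, which is where much of their value lies.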
The Gemiva-SVG Group aids people with mental, physical, or multiple disabilities, as well as people with acquired (non-congenital) brain injury. Gemiva asked Passionned Group for advice on the architecture and organization of its data warehouse, plus advice on which ETL tool to use. The advisory project was completed to Gemiva's great satisfaction and was rated a 9 out of 10. There were four main questions plaguing Gemiva that they wanted Passionned Group to answer.
In this technology-driven world, we like to believe in progress. It can be easy to forget that “new” isn't always better. A prime example of this is the “data lake” phenomenon: a figurative lake of data, which became a hype in Business Intelligence (BI) and Big Data circles. Many companies rushed to jump on the bandwagon and built their own data lake. But was this such a good idea, or would it have been better to think about what can be accomplished using a data lake first?
The demand for data scientists is growing rapidly. The role is becoming increasingly important within organizations, and salaries are growing to match. Research by the McKinsey Global Institute shows that the lack of analytical and management talent for successfully implementing Big Data is one of the biggest challenges the USA is facing. The McKinsey Global Institute estimates that there are four to five million openings for data scientists in 2018 alone. The hunt for data scientists is on. This jack of all trades in your organization should possess many talents in order to help shape and direct the explosive amount of new possibilities provided by Big Data. But is this realistic?
During our interviews with the different ETL vendors at the beginning of 2018, we saw a number of general trends and topics emerge. The vendors operating on the cutting edge in particular are very eloquent about where they see ETL and Data Integration going. Apart from the “hard”, technical developments, we see that a small number of vendors have a keen eye for the “soft” side of ETL and Data Integration: data governance and the role of human intervention in maintaining data quality. This was perhaps the most surprising aspect of our survey. We believe that this is also where vendors can truly distinguish themselves.
Within a few months, the unique Business Intelligence training course from the international research and consultancy firm Passionned Group will take place. In just two days, you can discover the extreme power of Business Intelligence and Data Integration and learn how to make your data and data warehouse profitable. Our complete Business Intelligence training also covers the technical side of Business Intelligence and Analytics. The main goals and roles of the data warehouse are discussed, and we'll give you an overview of the different ETL tools on the market, as well as the different Business Intelligence tools. In addition, the trainer, Daan van Beek, will deal with Predictive Analytics (data mining), and you will learn the details of all 8 steps you need to take to build a predictive application.
In this article I’m going to share some of my experiences from the SAS Analyst Conference, May 27-29, 2015, in Marbella. I had a one-on-one meeting with Scott Gidley, senior director of Research and Development, and we had an interesting discussion about the future of data management, Hadoop, Big Data, data warehouses, and Data Lakes. I see a clear trend in data management and integration: the role of the data warehouse is changing rapidly, and a lot of analytics will be done without it. Data warehouse developers should be warned!
The Passionned Group ETL Tools & Data Integration Survey has existed since 2003, and is a 100% supplier-independent market comparison and analysis report. Hundreds of organizations use the report worldwide to make the best choice for an ETL tool or data integration solution quickly. The December 2014 edition was recently published. “In fact, it's not merely an update: all the parts have been completely revised,” said Passionned Group editor Rick van der Linden.
A decade ago, the vast majority of data warehouse systems were hand-crafted, but the market for ETL software has steadily grown, and the majority of practitioners now use ETL tools in place of hand-coded systems. Does it make sense to hand-code (SQL) a data warehouse today, or is an ETL tool a better choice? What are the 7 biggest benefits of using an ETL tool? We now generally recommend using an ETL tool, but a custom-built approach can still make sense, especially when it is model-driven. This publication summarizes the seven biggest benefits of ETL tools and offers guidance on making the right choice for your situation.
Oracle has optimized its in-memory capabilities across applications, middleware, databases, and systems. This increases transactional performance and enables organizations to discover business insights in real time where this previously could take hours. Hundreds of end-users have tested the in-memory option for over 9 months. Yahoo was one of the companies that beta-tested it and states that it can even be used on its largest data warehouses. Oracle claims that deploying the In-Memory Database with any existing Oracle Database-compatible application is easy: no alterations to these applications are required. It is fully integrated with Oracle Database's scale-up, scale-out, storage tiering, availability, and security technologies.
Informatica’s Cloud Spring 2014 is now available. Ash Kulkarni, senior vice president and general manager of Data Integration at Informatica, calls it a game-changer. [global name="Press release warning"] IT developers get a drag-and-drop palette for advanced integration scenarios, and line-of-business application users get a wizard-based, self-service interface – all in one unified design environment.
Contrary to many studies that evaluate ETL tools based only on their current market share and functionality, our study (the ETL Tools & Data Integration Survey) also attempts to evaluate ETL tools based on their expected future performance. With the market changing as quickly as it is today, market share is about the past, and says little about the future. Many products, including WordPerfect, Lotus 123, and Harvard Graphics once had a huge market share, but proved to have very little future potential. The growth potential as defined within the confines of this ETL study looks at how frequently a vendor releases new, valuable features; an important indicator for innovation and the strategic importance of the product to the vendor.
The maximum number of points awarded in our ETL Tools & Data Integration Survey to an ETL tool for its ease-of-use was eleven. A maximum of two points was awarded for each of the four attributes, indicated by one or more plus (+) signs, and points were allocated depending on the number of days of training suggested by the vendors (the less training required, the more points for ease of use). If the verdict was negative, indicated by one or more minus (-) signs, points were deducted. If, for example, ease of use, WYSIWYG, and logical order earned a plus sign and screen design a minus sign, the tool was awarded two points (one point for each plus sign, minus one point for the minus sign). Since it can take considerable time to improve the usability of a product, we compared each product (where possible) over a three-year period.
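The plus-and-minus tally described above is simple arithmetic; as an illustration only (the `ease_of_use_score` helper and the verdict strings are hypothetical, not part of the survey's actual tooling), the worked example can be computed like this:

```python
def ease_of_use_score(verdicts):
    """Tally ease-of-use points from plus/minus verdicts per attribute.

    Hypothetical helper: each '+' earns one point, each '-' deducts one,
    and the total is capped at the survey's stated maximum of 11 points.
    """
    score = sum(v.count("+") - v.count("-") for v in verdicts.values())
    return min(score, 11)

# The worked example from the survey: three plus signs and one minus sign.
verdicts = {
    "ease of use": "+",
    "WYSIWYG": "+",
    "logical order": "+",
    "screen design": "-",
}
print(ease_of_use_score(verdicts))  # 2
```

The cap at eleven mirrors the survey's stated maximum; training-day points would be added to the same tally in the real scoring.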
The ETL tools were evaluated in our ETL Tools & Data Integration Survey based on the following characteristics: functionality, clarity and re-usability, debugging, real-time ETL/EAI/Web services, ETL functionality, data sources/targets, architecture and infrastructure, ease-of-use, growth potential and market stability, pricing, connectivity, and platform support. The ease-of-use criterion was the only aspect that involved a measure of subjectivity. We looked at the products from the point of view that they should be usable by a broad range of professionals and not just a few specialized IT professionals. This is clearly not everybody’s view of the world; some of the ETL suppliers we talked to were appalled, suggesting that allowing business users the chance to build data warehouses would guarantee failure. Others agreed that their major market was the business user and not the IT professional. We have made every effort to judge the products objectively in terms of ease-of-use, but admit to a bias in the direction of broad use.
Since 2003, we have been closely monitoring the market for ETL and data integration tools. In the past, the focus was on the market leaders, who were often seen as visionaries. Many organizations used to assume that they had automatically made the right choice if they purchased a tool from one of the market leaders. Since the late nineties, however, the market has changed substantially. Practically all the Business Intelligence (BI) vendors have purchased or developed their own ETL tools. Since a centralized data warehouse is one of the cornerstones of a successful BI solution, this has turned out to be a wise choice. Market estimates show that 70-80% of the costs of a successful BI system relate to the creation of reliable ETL processes and data integration.
Informatica Corporation, the world’s number one independent provider of data integration software, today announced Informatica MDM 9.6, a major milestone in Informatica’s roadmap for powering universal master data management (MDM). The announcement came at Informatica’s annual user conference - Informatica World 2013. [global name="Press release warning"] Universal MDM depends on four essential capabilities: universal services, universal domains, universal governance, and universal solutions. Informatica delivers on all four capabilities with a truly flexible master data management technology that enables companies to start small by solving their most immediate business problem and scaling to others across the enterprise. Informatica MDM 9.6 advances this universal MDM strategy, delivering such key benefits as:
Oracle today announced the availability of Oracle Big Data Appliance X3-2 Starter Rack and Oracle Big Data Appliance X3-2 In-Rack Expansion. Oracle Big Data Appliance X3-2 Starter Rack enables customers to jump-start their first Big Data projects with an optimally-sized appliance. Oracle Big Data Appliance X3-2 In-Rack Expansion helps customers easily and cost-effectively scale their footprint as their data grows.
There's been a lot of discussion about BI in difficult times; to what extent is it rewarding to invest in BI, and why do so many projects appear to be unsuccessful? When we talk about Business Intelligence, especially for big companies, we talk about processing data (from databases and applications) into actionable information. This is the traditional Business Intelligence that companies like Cognos and Business Objects have been working on for years, and that Gartner publishes an annual BI Quadrant about. One of the problems with this type of solution is that a lot of the data needed to get the proper management information isn't contained in a clearly-defined Oracle database with a metadata layer on top of it. It's also not in SAP R/3 with structure and security that we, with some difficulty, can reach. No, it's in web pages, PDF files, PowerPoint presentations, emails, and Word documents, where there are very few standard definitions, and even fewer access rules.
Oracle today announced the availability of Oracle Exalytics In-Memory Machine, the industry’s first high-speed engineered system featuring in-memory business intelligence (BI) software and hardware to deliver extreme performance for analytic and performance management applications. [global name="Press Release Warning"] Oracle today also announced availability of a new release of Oracle Business Intelligence Foundation Suite that features 87 new product capabilities and enhancements including new visualizations, contextual interactions, performance and scale improvements, optimizations for Oracle Exalytics, and simplification for end users, developers and administrators.
Question: Which ETL tools can support Data Vault modeling out-of-the-box? What are the challenges and issues building a data vault with ETL tools? 1. Marcel de Wit - See another discussion on LinkedIn (still active; Dutch) 2. Daan van Beek - Thanks Marcel, I did read the comments of that discussion too, it was the reason I started this discussion actually. After reading it, it was still unclear to me whether ETL tools do support Data Vault out-of-the-box like slowly changing dimensions or not. So, who knows the answer(s)? The vendors?
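For readers unfamiliar with the slowly-changing-dimensions feature the question refers to: out-of-the-box support means the tool generates this kind of history-keeping logic for you instead of forcing you to hand-code it. A minimal hand-coded sketch of a Type 2 slowly changing dimension follows (the row layout and the `scd2_upsert` helper are hypothetical illustrations; Data Vault satellites keep history in a comparable way):

```python
from datetime import date

def scd2_upsert(dimension, key, new_attrs, today=None):
    """Type 2 slowly changing dimension: expire the current row for `key`
    and append a new current row whenever the attributes have changed.

    Hypothetical row layout: dicts with key, attrs, valid_from/valid_to,
    and a `current` flag.
    """
    today = today or date.today()
    for row in dimension:
        if row["key"] == key and row["current"]:
            if row["attrs"] == new_attrs:
                return  # no change, keep the existing version
            row["current"] = False   # close out the old version
            row["valid_to"] = today
            break
    dimension.append({"key": key, "attrs": new_attrs,
                      "valid_from": today, "valid_to": None, "current": True})

dim = []
scd2_upsert(dim, 42, {"city": "Utrecht"}, today=date(2013, 1, 1))
scd2_upsert(dim, 42, {"city": "Gouda"}, today=date(2014, 6, 1))
# dim now holds two versions: the expired Utrecht row and the current Gouda row
```

Whether a given ETL tool generates this pattern (or the hub/link/satellite loads of a Data Vault) automatically is exactly what the discussion above is trying to establish.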
After users become familiar with your BI project's benefits, they'll likely want more. Be prepared to provide analysis of unstructured data. We'll show you how to start. You've spent the last five years defining, establishing, and building an analytical environment for your organization. You received accolades for finally providing access to structured information from your company's transactional systems through a business intelligence (BI) tool with underlying data marts, a data warehouse, and a data integration tool. Now -- all of a sudden, it seems -- your colleagues are asking for access to other kinds of content such as email, documents, and audio-visual media through your analytical architecture so they can use this content for predictive analytics in the BI application. Where should you start?
You can still hand-code an extract, transform and load system, but in most cases the self-documentation, structured development path and extensibility of an ETL tool is well worth the cost. Here's a close look at the pros and cons of buying rather than building. The Extract, Transformation, and Load (ETL) system is the most time-consuming and expensive part of building a data warehouse and delivering business intelligence to your user community.
Many companies are seeing very significant increases in data volumes and these are having an impact on their data warehouse programmes, according to the latest survey from PMP Research. The research has been commissioned by the Evaluation Centre. Most of the organisations polled (68%) report that data volumes have increased substantially over the past three years, with a further 25% indicating more modest rises. Only 2% reckon that data volumes have stayed constant over that time period.
A'dam - March 3rd, 2008 - With several books on data warehousing to his name, and a business that trains IT staff in the field, Ralph Kimball has been named in some quarters the “father of data warehousing” — though he appears to share that title with another data warehousing pioneer, Bill Inmon, if a Google search of the two is anything to go by. He and Inmon have different approaches to the art of data warehousing, but Kimball, who is teaching in New Zealand this month, says that doesn’t make them enemies.
We have recently been investigating why so many data migration projects (84%) run over time or budget. Over half of the respondents in our survey who had run over budget blamed inadequate scoping (that is, they had not properly or fully profiled their data), and more than two thirds of those that had gone over time put their emphasis in the same place. I mention this because it is symptomatic of all data integration and data movement projects: data quality work needs to start before you begin your project (so that you can properly budget time and resources), continue right through the course of the project, and, where it is not a one-time project like data migration, be maintained on an ongoing basis through production. Further, in order to maintain quality you need to be able to monitor data quality (via dashboards and the like) on an ongoing basis as well. This is especially important within the context of data governance.
When is right-time not real-time enough? That, as it happens, is the million-dollar question. Business intelligence vendors have made an awful lot of noise about a "new generation" of "real-time", "near-real-time", or "decisioning" tools, touting everything from their ability to provide "actionable business insight" to other intangible (and usually unspecified) competitive advantages. Industry watchers, on the other hand, like to stress that there is no single "real-time" Rx for every company or every industry; instead, they argue, companies need to figure out a "right time" window that’s, er, right for them, and go from there. Who’s right? Who’s wrong? Does it matter? Well, everyone. And no one. And, yes, it does matter. In fact, once you get down to brass tacks with most Business Intelligence and Performance Management vendors, you’ll find that their views are actually very similar—for the most part identical—to those of many prominent industry talking heads.
A'dam - September 22nd, 2007 - The seeming simplicity of a term like "business intelligence" belies the rich potential its implementation can provide to a small or midsize company. Data mining is the backbone of BI--it means sorting through company data and identifying and extracting valuable information about operations. In a customer-centric business, the information can be particularly compelling. We turned to Gordon Linoff, a principal at Data Miners, to learn what data mining for BI can do for small and midsize businesses.
"I am tired of IT people. They'll talk to you for half an hour about what they do, but by the time they're finished, you still don't have any idea what they've told you or why it's important," so complained a friend of mine in a recent conversation. What was the irritant giving rise to my friend's frustration? Ahem … my own ineffective attempt to explain to him what it is that I, a data warehousing/business intelligence (DW/Business Intelligence) practitioner, do. Sometimes the right metaphor is helpful. It can clarify abstract concepts for the uninitiated and, even for the expert, be a means for synchronizing designs and vocabularies and analyzing problems. To that end, let me propose a metaphor for describing what data warehousing and Business Intelligence is all about and, perhaps more importantly, suggest where the field is broken: the metaphor is that of an information supply chain.
Talend, a global open source software leader, today announced version 5 of its unified platform for integration, becoming the only open source vendor to offer a comprehensive technology portfolio that spans all enterprise integration needs, including data, applications and business processes. [global name="Press Release Warning"] With this new version, Talend introduces a holistic approach to integration, enabling IT organizations to converge traditionally disparate integration efforts and practices through a common set of products, tools and best practices. The Talend Unified Platform provides a common foundation for the products to be used individually or in concert depending on the needs of the organization.
Today from PASS Summit 2010, Microsoft Corp. unveiled the next generation of SQL Server, Microsoft SQL Server code-named “Denali,” and made available SQL Server 2008 R2 Parallel Data Warehouse. [global name="Press Release Warning"] The company also announced the new Microsoft Critical Advantage Program, which provides an end-to-end experience across the complete life cycle of mission-critical appliances, such as SQL Server 2008 R2 Parallel Data Warehouse. Combined, the product and program help enterprise customers successfully design, deploy and optimize appliance-based solutions to help ensure the highest level of scalability, availability and reliability for the most demanding data warehouse requirements.
IBM today announced plans to acquire DataMirror Corporation, a business information firm based in Ontario, Canada, for approximately $161m in cash. DataMirror provides technology that identifies and captures data that has been added, updated or deleted, and allows the changed data to be delivered to processes, applications and databases. "Organizations need the ability to capture and use information in real time to help them make better business decisions, better serve their customers and increase operational efficiencies," said Ambuj Goyal, general manager at IBM's Information Management Software unit.
Ab Initio has been around in the ETL market space long enough to make a name for itself. Their marketing approach appears to be one of mystique: maintain secrecy around the product while letting out some information about their high-end customers, creating interest with the tantalizing tidbits they provide. Their website is more typical of an advertising firm with nothing meaningful to say than of a technology company. Years later (2015), this is still the case. Their website loads very slowly and is built entirely in Flash (no HTML), and there isn't even a phone number or contact form. Our guess: they are finished.