For the majority of my career I have railed against “Out-of-the-Box” BI solutions, because businesses don’t come out of a box, processes don’t come out of a box, and innovation doesn’t come out of a box. I’ve prided myself on being able to re-engineer OOTB solutions to achieve better performance, extra metrics, and better analytics. Today, I’m eating a few of those words. Do I still believe in the custom solution? Absolutely, but I now have a better understanding of the right time, place, and situation. In the end, every company needs to pick a strategy that is right for them, and unless Babylon is burning, they need to stick with it and support it, since data is their most valuable asset.
The biggest concern with an out-of-the-box solution is generally: how much customization will be needed? Or, conversely, how much will my business processes need to change to fit in the box?
For the first three quarters of my career I didn’t pay attention to Online Analytical Processing (OLAP) technology; everything I did was with relational structures, because everything I did was for users who wanted reports handed to them. In the last few years that has changed: users are more technically educated, and the reports being generated don’t move at the speed of business and can’t answer the question “Why is this number the way it is?”
Giving an analyst direct access to the data via a cube makes sense because only they know the questions that will pop into their head when looking at a static report. By giving them this access, they don’t have to come back to IT and ask for a new report, and it affords them a level of “self-service” business intelligence. Users can slice and dice the data as they see fit, drill down to new and deeper levels of detail, and find things within their data that they might not have uncovered with a standard report or dashboard. Cubes also aggregate the data, so even with tens of millions of rows you can get the answer you need in seconds. This helps the end user for sure, but it also helps the IT report developer or dashboard creator, whose reports benefit from the same improvement in data access and retrieval speed.
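The speed gain comes from pre-aggregation: the cube rolls the detail rows up along its dimensions ahead of time, so each “slice and dice” query reads a handful of summary rows instead of scanning the whole fact table. A minimal sketch of the idea in Python using pandas (the table and column names here are hypothetical, not from any real cube):

```python
import numpy as np
import pandas as pd

# Hypothetical fact table -- a real warehouse might hold tens of millions of rows.
rng = np.random.default_rng(0)
facts = pd.DataFrame({
    "region": rng.choice(["East", "West"], size=100_000),
    "product": rng.choice(["A", "B", "C"], size=100_000),
    "sales": rng.random(100_000) * 100,
})

# Pre-aggregate once, the way a cube does at processing time...
cube = facts.groupby(["region", "product"], as_index=False)["sales"].sum()

# ...so each ad-hoc slice now reads 6 summary rows, not 100,000 detail rows.
east_by_product = cube[cube["region"] == "East"]
print(east_by_product)
```

The same trade-off holds regardless of tooling: pay the aggregation cost once up front, and every subsequent drill-down or pivot is nearly instant.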
Now, with the addition of tools like PowerPivot and Tableau, users can take these cubes and build their own visualizations, which makes the cubes even more powerful. Think of the cube as a semantic layer on steroids!
In addition, cubes are more agile and easier to maintain and extend than a relational structure, because you can create calculations as they are needed rather than building ETL packages, modifying data structures, and modifying reports.
My Current Situation
I currently work for a company that has SAP Business Warehouse (BW), fed from our SAP ERP system, which it uses for financial reporting. If every piece of data we had were native to SAP, it would make sense to use SAP BW exclusively and handle all the reporting in that manner, but we don’t. We have data in external systems, cloud/hosted systems, and spreadsheets, to the point that some BI professionals would sit under their desk and weep. We need an enterprise data warehouse to bring all those pieces of data together, and we have a user base that is hungry to access that data, so the obvious choice is a cube. Cubes present very well in Excel, which our users are well versed in and very comfortable with. To date, our cubes have been well received, and we have had very positive feedback on their day-to-day business use to answer the questions that arise at any given point in the day.
In my previous position we graduated to cube usage as the users got more and more comfortable with BI and with using the data. When I left, we had just deployed our first cube, based on general ledger information, and from what I am told it is still being heavily used and relied upon by the user community there.
Should You OLAP?
That’s a valid question, and I suppose it depends on the organization and user base, but if you have a product like Microsoft SQL Server you already have the tool in-house, so why not give it a try? If you are already building star schemas, you are more than halfway there! Analysis Services is fairly easy to learn and use, just like most Microsoft products, and you can have a fully functioning cube in less than a day, even on your first attempt. There are plenty of resources on the web, and I’m sure if you are really serious there are training classes in your local area.
If you look at the needs of your users, the majority of them, probably more than 50%, just need access to the data, so why not present a cube to them? They don’t have to learn SQL, they can drag and drop in Excel to build what they need, and they will stop coming to IT every time they need a new view of the data. The ROI on a cube can be incredible!
In a recent “Tweet Chat” hosted by Howard Dresner, a group discussed IT departments using business intelligence to measure internal performance. Internal analytical evaluation is an interesting concept, but this type of project may be a hard sell to a corporation that is looking to cut costs. Is internal BI something companies should consider? Yes!
Until you are able to measure how productive your IT staff is, there is no way to justify IT staffing levels or secure more budget dollars for future projects. If a CEO asks a question like “What were the labor savings realized by Project A?”, a CIO should be able to give that answer. Performance metrics aren’t just for the sales or financial dealings of the business. Every department in a corporation should look at internal performance metrics to see where they are spending their time, where they are adding value, and where they can ultimately make improvements, cut costs, and streamline themselves as a department… which, in the end, affects the business and its bottom line. It is surprising that CEOs don’t demand this type of analytical information for an IT department (or any department) that, in some cases, makes up such a large part of a company’s operating budget.
So what do you measure and how do you measure it?
For IT to measure their productivity, they must accurately track their time against the correct project or projects that they are working on or supporting. This data will show hours spent on new development versus production support issues, which projects are getting the most time, and other areas where time is spent (e.g., training and education, meetings, help desk tasks). Included in this data would also be IT salary and departmental cost information, so that you can measure dollars spent against time for each project. It would make sense to average departmental costs across the headcount unless there are specific costs associated with certain people or teams.
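The cost-allocation step described above can be sketched in a few lines of Python. Every project name, hour count, and dollar figure here is hypothetical, and the departmental cost is averaged across headcount as suggested rather than using actual salaries:

```python
# Hypothetical tracked hours charged per project or activity.
hours_by_project = {
    "Project A": 420,            # new development
    "Project B": 180,            # production support
    "Help desk / meetings": 200, # other time sinks worth surfacing
}

annual_dept_cost = 1_200_000       # assumed: salaries + overhead for the team
headcount = 10                     # assumed team size
billable_hours_per_person = 1_800  # assumed working hours per year

# Average the departmental cost across headcount to get a blended hourly rate.
hourly_rate = annual_dept_cost / (headcount * billable_hours_per_person)

# Dollars spent per project = tracked hours * blended rate.
cost_by_project = {
    project: round(hours * hourly_rate, 2)
    for project, hours in hours_by_project.items()
}
for project, cost in cost_by_project.items():
    print(f"{project}: ${cost:,.2f}")
```

Even this crude blended-rate model is enough to show where the department’s dollars actually go, which is the starting point for everything that follows.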
Next, you need to find a way to measure the impact of IT projects and tasks on the business. This is a bit trickier to track, as project ROI is hard to nail down. A simple way to start may be to initiate a survey of specific project users from the business, using questions like “How many hours does Project A save you in an average week?” and “How long have you been utilizing Application X?” You will need to ask targeted questions for each project or application that IT is supporting. Coupling this data with salary data from HR (average salary, not actual salary), you can then see what the savings are in labor costs for the business.
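The labor-savings arithmetic from that survey data might look like the following sketch. The survey answers, the average salary, and the number of working weeks are all hypothetical placeholders:

```python
# Hypothetical survey answers to "How many hours does this save you in
# an average week?" -- one number per respondent, per project.
survey = {
    "Project A": [3.0, 5.0, 2.5, 4.0],
    "Application X": [1.0, 2.0, 1.5],
}

avg_business_salary = 65_000               # average (not actual) salary from HR
hourly_rate = avg_business_salary / 2_080  # 52 weeks * 40 hours
work_weeks = 48                            # assume ~4 weeks of leave/holidays

def annual_labor_savings(hours_saved_per_week):
    """Yearly labor-cost savings summed across all surveyed users."""
    return sum(hours_saved_per_week) * work_weeks * hourly_rate

for project, answers in survey.items():
    print(f"{project}: ${annual_labor_savings(answers):,.0f}/year")
```

Because the rate is built from an average salary, the output is an estimate of business-side savings, not an exact payroll figure, which is all this exercise needs.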
Once you have those pieces of data, you can start to formulate an IT Spend to Business Impact ratio. This is where it might get scary for CIOs and upper management in IT, because now the blinds are up and everyone can see which projects were worthwhile and which ones were “flops” on ROI. This exposure is a good thing, however, because it helps you understand and hopefully pinpoint where projects go wrong. If too much time is being spent on support and enhancement requests, then perhaps you need to do a better job capturing requirements, or perhaps you need a more rigorous QA process. Do one or two project managers constantly run past SLAs or production deadlines? Are they underestimating their time, or are there other issues that need addressing? The list could go on and on as far as analytical study of IT metrics.
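One simple way to formulate that ratio is business savings divided by IT spend, so anything above 1.0 means the project paid for itself. The per-project figures below are hypothetical:

```python
# Hypothetical per-project figures: IT dollars spent vs. business labor savings.
projects = {
    "Project A": {"it_spend": 28_000, "business_savings": 21_750},
    "Project B": {"it_spend": 12_000, "business_savings": 45_000},
}

def impact_ratio(p):
    """Business savings per IT dollar spent; > 1.0 means payback."""
    return p["business_savings"] / p["it_spend"]

for name, p in projects.items():
    verdict = "worthwhile" if impact_ratio(p) > 1.0 else "potential flop"
    print(f"{name}: ratio {impact_ratio(p):.2f} ({verdict})")
```

This is deliberately blunt; a real program would refine the threshold and amortize spend over the application’s life, but even the blunt version is enough to separate the worthwhile projects from the flops.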
Effective budget planning could also benefit from implementing BI within the IT structure. When developing project plans and designating resources, you can show your projected spend for a project against the IT budget for the year. Another area to explore would be a measure of expected ROI from a project: based on requirements gathering, IT groups could capture the time spent on a task and the potential time that would be saved by employees using the application. This type of data could be useful during project prioritization, and eventually you could show perceived savings versus actual savings against the total cost of the project. By using BI for budgeting and project prioritization, management can see the true impact of IT versus the dollars spent and ensure budget dollars are best used to support the business, all while getting the greatest “bang for the buck”.
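The perceived-versus-actual comparison reduces to two small calculations once a project has gone live: how accurate the original estimate was, and what ROI was actually realized against the total project cost. All figures below are hypothetical:

```python
# Hypothetical project: savings projected during prioritization vs.
# savings measured after go-live, against the total project cost.
projected_savings = 60_000
actual_savings = 42_000
total_project_cost = 35_000

# How close did the prioritization-time estimate come to reality?
accuracy = actual_savings / projected_savings

# What the business actually got back per dollar of project cost.
realized_roi = (actual_savings - total_project_cost) / total_project_cost

print(f"Estimate accuracy: {accuracy:.0%}")
print(f"Realized ROI:      {realized_roi:.0%}")
```

Tracking the accuracy figure over many projects is what eventually makes the expected-ROI inputs to prioritization trustworthy.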
How accurate is the data?
This is the biggest obstacle I wrestle with for this type of reporting, and questions about the accuracy of the data exist in multiple places.
First, is the IT staff accurately recording their time, and are they recording it against the proper projects? Does one hour of charged time REALLY equal one hour, or was it really a half hour that you fudged because the management team gets upset when your time sheet isn’t at 40 hours or more? For this problem, I think IT managers need to realize that 40 hours of real work may not occur each week… some weeks it is 32 and some weeks it is 50. Lording over your employees because their time sheet appears short of the standard 40 hours per week is not a way to manage. When management adopts this more flexible, realistic attitude, you will see the accuracy of time reporting improve.
Secondly, is the business accurately reporting the time spent on its daily tasks? It may have taken them one hour to put a report together, but they don’t think about the 6–12 hours of data wrangling they may have done to prepare for it. It’s very easy to forget about the earlier steps once you are working on the last phase of a task.
Is this project worth the investment?
Whether or not to invest in an internal IT BI program is a very valid question, but it may not have a straightforward answer. In theory, it makes sense to use a sustainable method to measure departmental performance. In practice, data validity is “iffy” unless certain aspects of the general “IT culture” are brought into question and ultimately changed to support this type of evaluation. With that said, I think that if an IT group “feels” like they are providing value, it is positive for morale. Consequently, if the IT group actually KNOWS and can quantify their impact on the business, I can only imagine that morale would be higher and productivity would improve. In addition, this type of performance monitoring comes in handy in many situations, including budget discussions for areas of IT. As the performance reporting improves, you could perhaps use this data for individual performance reviews, or at least as supporting data for that activity. Would this investment eventually pay for itself by way of improved productivity, better project management, and better project planning? I think in the long run it would, but the key would be to have patience in allowing the internal BI solution to mature so the results become more accurate.
So, is your IT department ready to look in the mirror?