#Data15 Recap – Viva Las Vegas!

Background

There are 10,000 stories from #Data15, and this one is mine.

As a veteran of three TDWI World Conferences held in Las Vegas, I had some idea of how the week would go: lots of learning, a few drinks, some networking, some gambling, and generally a good time had by all in attendance.

As a first-time Tableau Conference attendee, I think I underestimated things a bit.

Monday

I tried to attend the “Conference Newbie” session, but when I arrived there were 50-75 people standing outside; the session was full.

Pro Tip #1 – If you aren’t registered, show up early.

So, I wandered around the conference area a bit, checked out the Expo, and waited for the Welcome Reception to start up. If nothing else, Tableau knows how to throw a party. I watched in wonder as food and bars were set up all around the conference center.

My favorite activity for the week was starting random conversations with folks in crowds. It’s a good way to meet people, and you never know who you might run into using this method.

Case in point, I reconnected with someone I hadn’t seen since college (20 years to be exact). He’s using Tableau at the University of Michigan (Hi Matt!). I thanked someone from Southwest Airlines for getting me to Vegas 30 minutes early… he took full credit and we had a good laugh and conversation.

Pro Tip #2 – Just get out there and talk to people.

The reception was great, but it was only the tip of the iceberg, as I would soon discover.

Tuesday

After grabbing the first of many delicious breakfasts, we headed to the Garden Arena for the kick-off of the conference, the opening keynote address. It was at this moment I saw the scale of the conference. 10,000 people is just a number until you see them all in a large arena. The lights and the big screens set the stage for what was to come next. CEO Christian Chabot took the stage first to welcome us all and talk about some amazing user stories, and then handed it over to the developers to tell us all about the really cool stuff that is coming to the product soon. For developers, they presented better than most executives I’ve seen, and the work they have done to improve the product is nothing short of amazing.

Bad News – After the keynote, squeezing 10K+ people who are all really excited to get to their sessions out of one arena is problematic. It took 20-30 minutes before everyone got to their sessions.

Luckily I was pre-registered for my first session.

Pro Tip #3 – Pre-register for sessions if possible.

Everyone. Loves. Sets. was my first session. I’d played with sets a bit, but this session really expanded my knowledge and use of them. The presenters were very knowledgeable and did a great job working with such a large group of students. Shout-outs to the room helpers as well.

My next session was “Getting Your Performance Up”. While I’m sure it was useful for some, I didn’t get much out of it and actually exited in search of other content.

Pro Tip #4 – If you aren’t getting what you need from a session, check the app and find your alternatives.

The session I did find was excellent: Turbo-Charging Your Dashboard Performance. While it wasn’t hands-on, the content was solid and very applicable to what is going on at work. Shout-out to Dr. Kate Morris and Rapider Jawanda for a great session!

The last event of Tuesday was Data Night Out. My team decided to take a cab and check out Fremont Street a little early. I’m glad we did, as we were able to get in before the majority of people arrived and got the lay of the land. Tableau rented out all of Fremont Street. All. Of. It. Food trucks, bars and two performance stages made up the landscape, and once the conference attendees arrived, the party was amazing! The bands were incredible, and getting to meet more of the attendees, as well as Tableau employees, was awesome. Truth be told, the evening ended as a bit of a blur, but it was amazing nonetheless.


Wednesday

Wednesday started with a keynote address from Dr. Daniel Pink. I was a little late in waking up, so I opted to watch the keynote from the Expo Hall; this was an excellent choice. The seats were much more comfortable, and the crowd much smaller. I was able to have a couple of conversations around the Expo Hall, and had no problems exiting after the keynote address to make my next session. Dr. Pink’s keynote was very interesting, and pointed out a few things about motivation that pleasantly surprised me, since my current company is already doing many of them. This is probably why it’s consistently ranked as one of the best places to work.

My first session that morning was 50 Shades of Data: A Zen Master’s Guide to Color. I’d been looking forward to Matt Francis’s presentation for a while, and I was not disappointed. He’s funny, smart and knows a hell of a lot about Tableau. I highly recommend checking out his blog and podcast. I can’t wait to apply some of the techniques he talked about at work.

Next up was The Data is in the Details: Advanced LOD Expressions. I’d played with LODs quite a bit since they came out, but I wanted to take this class to better understand how they worked and whether I was missing anything. We concentrated primarily on the FIXED calculation and learned how we could nest LODs (AWESOME). Bronson Shonk did a great job with this, and shared part of his blog on the Tableau site that you should check out for some good Tableau challenges.
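For anyone who hasn’t seen a nested LOD yet, here’s a rough sketch of the idea outside of Tableau: the inner {FIXED [Customer] : SUM([Sales])} piece and the outer average, expressed against a tiny made-up sales table in pandas. The data and column names are invented; the point is just the two levels of aggregation.

```python
# A minimal sketch (not Tableau syntax) of what a nested FIXED LOD does,
# using a tiny, made-up sales table in pandas.
import pandas as pd

sales = pd.DataFrame({
    "region":   ["East", "East", "East", "West", "West"],
    "customer": ["A", "A", "B", "C", "D"],
    "amount":   [100, 50, 200, 75, 125],
})

# Inner LOD -- {FIXED [Customer] : SUM([Amount])} -- total per customer,
# regardless of whatever dimensions are on the view.
per_customer = sales.groupby(["region", "customer"], as_index=False)["amount"].sum()

# Outer aggregation -- average of those per-customer totals at the region
# level, i.e. "average customer spend by region".
avg_customer_spend = per_customer.groupby("region")["amount"].mean()

print(avg_customer_spend)
# East    175.0   (customers A=150, B=200)
# West    100.0   (customers C=75, D=125)
```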

Last event for the day was Neil deGrasse Tyson’s keynote address. I again opted to watch from the Expo Hall. Dr. Tyson killed it on stage, and his story about harassing James Cameron about the Titanic star field was hilarious. The other movie inaccuracies he pointed out were funny as well.

Thursday

Dr. Hannah Fry kicked off the morning with her keynote, which I was really impressed with and thought was a great session. I grabbed my breakfast and settled into the Expo Hall (see a trend?). She studies some really depressing and heavy subjects through data, but her presentation of it was great. Her information about Wikipedia “rabbit-holing” was really interesting, especially that over 90% of all topics can be drilled down to philosophy. I must remember to pick up her book and learn more about her research.

Today’s star session was Tableau Jedi Calcs: Welcome to the Dark Side. Big shout-out to Lauren Bearden and Keshia Rose for a great presentation, and for answering my questions as I pieced things together and stepped further into the matrix of advanced calculations. What they presented was totally applicable and will add so much value to my current work.

Last up was Sir Ken Robinson to deliver the closing keynote address.

Wait, what? Closing Keynote Address? How did the week pass so fast?

Sir Ken was awesome… so funny and so brilliant at the same time. He delivered an inspiring keynote injected with humor and thought-provoking content, which I’m sure caused everyone to think a little deeper and tap into their creative rivers.

To top things off, I got extremely lucky and ran into Sir Ken in the MGM’s elevator lobby, snapped a selfie, and shook his hand, thanking him for a great talk.

What a week!

Final Thoughts

Tableau has an amazing community of users, zen masters, jedi, and employees. Putting them all in close proximity to each other makes for quite an amazing time for all in attendance. I’d love the opportunity to attend this conference again, and if the opportunity arises I will take full advantage. There is SO MUCH CONTENT that you will walk away wondering if you made the right choices in the sessions you attended… but honestly, I don’t think you can make a wrong choice.

Thanks to all involved with putting on an amazing conference!


Can One Size Fit All?

Out of the Box vs. Custom Build

For the majority of my career I have railed against “Out-of-the-Box” BI solutions, because businesses don’t come out of a box, processes don’t come out of a box, and innovation doesn’t come out of a box. I’ve prided myself on being able to re-engineer OOTB solutions and achieve better performance, extra metrics, and better analytics. Today, I’m eating a few of those words. Do I still believe in the custom solution? Absolutely, but I now have a better understanding of the right time, place and situation. In the end, every company needs to pick a strategy that is right for them, and unless Babylon is burning, they need to stick with it and support it, since data is the most valuable asset.

Size Matters

When does an out of the box solution make sense, and when is it the only rational solution?

Size matters. If the company or client you are dealing with has a global presence, and implementation speed is a necessity, then the OOTB solution has to be considered. Dealing with multiple locations on multiple continents, with multiple conversion rates and currencies? OOTB may be the solution you need. If you are dealing with a small company with a small footprint, a custom solution makes total sense, and the ROI will be more achievable than dropping millions of dollars on an SAP or Oracle solution. That said, something like SAP’s “Rapid Mart” solution may fit the bill for some small to medium-sized companies; you just have to do the proper evaluation.

 Some of the “Pros” of Buy vs. Build:

  • Standard Data Models and ETL processes
  • Standard Reporting Platform
  • Extendability (modifying the ETL/data model to meet needs)
  • Quicker deployment to users when compared to a custom warehouse


The biggest concern with an out-of-the-box solution is generally how much customization will be needed, or conversely, how much your business processes will need to change to fit in the box.

Some of the “Pros” of Build vs. Buy:

  • Tailored solution for the business
  • Freedom to select “Best of Breed” tools that fit your budget
  • No concern over updates and upgrades from the OOTB Vendor

The biggest hit a custom solution will take is always the time to deliver the solution, which depending on size can run anywhere from 3 months to 18 months or longer. At the pace business moves, and given the importance of a competitive edge, it may be hard to convince the business that it is worth the wait. (Agile BI, anyone?)

Regardless of the direction, you will still need talented staff to build, maintain and extend your BI solution; there is no choice in that matter.

Balancing Corporate Reporting with Local Reporting

If you are a multi-national company with billions in revenue, having a corporate set of standard KPIs is a must to have any idea of your performance and your areas of improvement. There is a balance that must be struck, though, and that is with the local needs of the business. A location in China may look at their business in a different way than a location based in Southern California, and those two locations might look at things differently than corporate headquarters does. While ultimately all three should be looking at and managing to the corporate KPIs, the day-to-day operational needs still have to be met. This is where the extendability of an out-of-the-box solution is key. I have seen OOTB solutions that were not flexible at all, and those solutions tend to get replaced by spread-marts or shadow reporting initiatives. Larger, more mature OOTB vendors realize this and allow for extension and customization of their solutions. The statistic you hear regularly is that a good OOTB solution meets about 60%-75% of the needs, and the remaining percentage is where the customization comes into play.

Show Me the Numbers

In the end it comes down to the numbers, and all the business users really want and need is to see the numbers that drive their business. They don’t care if it is a big “Out-of-the-Box” solution or a home-grown data warehouse. What they do care about is accuracy, consistency and validity. If you are giving them those three things, the rest is semantics.

Why OLAP?

For the first three quarters of my career I didn’t pay attention to Online Analytical Processing (OLAP) technology; everything I did was with relational structures, because everything I did was for users who wanted reports handed to them. In the last few years that has changed, because users are more technically educated, and the reports being generated don’t move at the speed of business and can’t answer the question “Why is this number the way it is?”

Giving an analyst direct access to the data via a cube only makes sense, because only they know the questions that are going to pop into their heads based on a static report. By giving them this access, they don’t have to come back to IT and ask for a new report, and it affords them a level of “Self-Service” business intelligence. Users have the ability to slice and dice the data as they see fit, drill down to new and deeper levels of detail, and find things within their data that they might not have uncovered with a standard report or dashboard. Cubes also aggregate the data, so if you have a cube built on tens of millions of rows you can get the answer you need in seconds. This helps the end user for sure, but it also helps the IT report developer or dashboard creator, since their report performance and speed of data retrieval should improve as well.
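To make the speed argument concrete, here is a hedged sketch (made-up fact table, pandas standing in for the OLAP engine) of why pre-aggregation feels instant to the analyst: the expensive pass over the detail rows happens once, and every subsequent slice or dice runs against the small aggregate.

```python
# Rough sketch of why a cube feels fast: aggregate a large detail table once,
# then every "slice and dice" question is answered from the small aggregate.
# Table and column names here are invented for illustration.
import numpy as np
import pandas as pd

rng = np.random.default_rng(0)
n = 1_000_000  # stand-in for a large fact table
fact = pd.DataFrame({
    "year":    rng.choice([2013, 2014, 2015], n),
    "region":  rng.choice(["East", "West", "North", "South"], n),
    "product": rng.choice(["Widgets", "Gadgets", "Gizmos"], n),
    "sales":   rng.random(n) * 100,
})

# Build the aggregate once (the "cube processing" step).
cube = fact.groupby(["year", "region", "product"], as_index=False)["sales"].sum()

# Slicing, dicing and drilling now run against 36 rows instead of a
# million-row scan -- and the analyst never comes back to IT for a new report.
print(cube[cube["region"] == "West"])                        # slice
print(cube.pivot_table("sales", index="year", columns="product",
                       aggfunc="sum"))                       # dice
```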

Now with the addition of tools like PowerPivot and Tableau users can take these cubes and build their own visualizations which make these cubes even more powerful. Think of the cube as a semantic layer on steroids!

In addition, cubes are more agile and easier to maintain and extend than a relational structure because you have the ability to create calculations as they are needed rather than creating ETL packages, modifying data structures and modifying reports.
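As a small illustration of that agility, assuming a made-up aggregate table: a new business question becomes a calculation defined on top of what already exists rather than a new ETL package (in Analysis Services this would be a calculated member).

```python
# Sketch: adding a derived measure to an existing aggregate without touching
# the ETL or the underlying tables. The data below is invented.
import pandas as pd

cube = pd.DataFrame({
    "region": ["East", "West"],
    "sales":  [500.0, 300.0],
    "profit": [100.0, 45.0],
})

# New business question arrives ("what's our margin by region?") --
# answered with a calculation, not a new ETL package or schema change.
cube["margin_pct"] = cube["profit"] / cube["sales"] * 100
print(cube)
```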

 My Current Situation

I currently work for a company that has SAP Business Warehouse (BW), which it uses for financial reporting and which is fed from our SAP ERP system. If every piece of data we had were native to SAP, it would make sense to use SAP BW exclusively and handle all the reporting in that manner, but we don’t. We have data in external systems, cloud/hosted systems and spreadsheets, to the point that some BI professionals would sit under their desks and weep. We need an enterprise data warehouse to bring all those pieces of data together, and we have a user base that is hungry to access that data, so the obvious choice is a cube. Cubes present very well in Excel, which our users are well versed in and have a high comfort level with. To date, our cubes have been well received, and we have had very positive feedback on their day-to-day use to answer the questions that arise at any given point in the day.

In my previous position we graduated to cube usage as the users got more and more comfortable with BI and with using the data. When I left we had just deployed our first cube based on general ledger information, and from what I am told it is still being heavily used and relied upon by the user community there.

 Should You OLAP?

That’s a valid question, and I suppose it depends on the organization and user base, but if you have a product like Microsoft SQL Server you already have the tool in-house, so why not give it a try? If you are already building star schemas you are more than halfway there! Analysis Services is fairly easy to learn and use, just like most Microsoft products, and you can have a fully functioning cube in less than a day, even your first time developing with Analysis Services. There are plenty of resources on the web, and I’m sure if you are really serious there are training classes in your local area.
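A hypothetical example of why star schemas are “more than halfway there”, with invented table names: the fact-to-dimension join below is the same shape Analysis Services asks for as its source, and the rollup is the kind of answer the finished cube serves instantly.

```python
# Sketch: a tiny star schema (fact table + one dimension) and the join and
# rollup a cube is built from. All table and column names are invented.
import pandas as pd

fact_sales = pd.DataFrame({
    "date_key":     [20150101, 20150101, 20150102],
    "customer_key": [1, 2, 1],
    "sales_amount": [250.0, 125.0, 75.0],
})
dim_customer = pd.DataFrame({
    "customer_key": [1, 2],
    "customer":     ["Acme Corp", "Globex"],
    "segment":      ["Enterprise", "SMB"],
})

# The join the cube performs when it processes its measure group...
star = fact_sales.merge(dim_customer, on="customer_key")

# ...and the kind of rollup the finished cube serves up on demand.
print(star.groupby("segment")["sales_amount"].sum())
```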

If you were to look at the needs of your users, the majority of them just need access to the data (probably more than 50% fall into that category), so why not present a cube to them? They don’t have to learn SQL, they can drag and drop in Excel to build what they need, and they will stop coming to IT every time they need a new view of the data. The ROI on a cube can be incredible!

BI Teams: Building Them and Joining Them

If you had to build your BI Dream Team, who would be the players? Your first instinct might be to go the ultra-experienced route, filling the team with seasoned veterans bringing their wealth of knowledge to the company and letting them drive to the BI Promised Land. To be honest, that was my thought… bring in the most experienced people with the brightest BI minds and that can only make the BI solution that much better, right?

I was marching down this path, within a budget of course, until I had a conversation with Dallas Marks over lunch a couple of months ago. He talked about building a BI team and compared it to making beef stew. I scratched my head and pondered this parable… and it started to make sense.

While a good BI team does have a sprinkling of veterans, there is a need for mid-level team members as well as “newbies”. When you make beef stew, you don’t fill a pot with 4 pounds of filet mignon; you’ve got to have celery, potatoes and onions, as well as a good beef stock and some seasonings, to get a quality beef stew.

 Where Experience Counts

The first piece that needs to be in place is the Business Intelligence Architect. This person oversees all aspects of the solution from the requirements gathering to the training, and preferably they have experience in many, if not all, of these areas so they can act as a mentor to the entry-level and mid-level employees. The BI Architect needs a few implementations under their belts so they can react to and anticipate curves in the road as the project goes along.

The second area where some expertise is needed is in data modeling and ETL development. These areas have to be executed near flawlessly for the other pieces of the solution to fall into place. These are areas where you can augment the team with entry and mid-level employees, but they need experienced mentors who are willing to teach them the skills needed to be successful.

As you move further towards the users, with cubes, reports and dashboards, you can have the mid-level people more involved, because this process may be more iterative and there can be more room for “error” as the developer and the user go back and forth on features and aspects of the consumables. Reporting in general is a good place to start in BI, if you have a good mentor in place, and once they “master” the art of reporting and visualization they can start to move backwards in the process and learn cubing, ETL programming and data modeling.

Mentoring

I mentioned mentoring a couple of times already, and I am a staunch supporter of the mentoring process. In my first few years I had some good managers, but I didn’t really get a good mentor until a few years into my career. My early managers taught me more about business in general and dealing with users, but they did put me on the right path by exposing me to Ralph Kimball in the form of sending me to one of his seminars on the Data Warehouse Life Cycle. I was also given the opportunity to learn on my own through trial and error, as BI tools were loaded on my computer and I was asked to produce. Today, I value my mentor a great deal, and even though I consider myself established I still ask questions and seek advice when I’m considering a problem I’ve never faced. Finding a great mentor can be as challenging as finding the right job opportunity, but once you find them I suggest you hold on to them.

BI Career Survey

I conducted an informal BI Career Survey on this website to see how others got into data warehousing and business intelligence, and thought I would share some of the results here for those who may be looking for the best route to break into the field. The survey had a relatively small sample size, but I think the results still have some value.

Of the 95 people who took the survey, they broke down as follows (not all respondents answered all the questions):

 80% Male — 20% Female

77% working within IT — 23% working in a line of business or finance department

BI Certifications

61% do not hold a BI specific certification

23% had a software specific certification

9% had a Microsoft Certification

5% held a CBIP from TDWI

One question I found interesting was “How did you come to work in the BI/DW area?” It was nearly evenly split, with 46% starting as entry-level employees while 52% transferred from another department (or from within the IT department). The fact that 52% transferred in supports my thought that, while there are distinct differences between a traditional IT worker’s skill set and a BI/DW worker’s skill set, there is a good amount of crossover and the right person can make that transition.

The people who took the survey also came from a very good mix of disciplines within BI as you can see. (Respondents were able to mark multiple areas of specialty.)

 

[Chart: What area of BI do you work in? Respondents by specialty]

And lastly, I asked each respondent to share some words of wisdom on how to get into BI/DW, and here are a few of the best:

 “Technical skills only get you so far, business skills make a big difference.”

“Read Kimball and Inmon and start with Data Warehouse Modelling.”

“Learn the methodology; the software can vary. Have LOTS of patience.”

“It’s really rewarding yet challenging. Develop a solid understanding of the fundamentals.”

“Get into the community, there are a ton of smart people to learn from that freely give.”

“Get as broad a knowledge base as possible. Don’t tie your career to any one technology.”

“For technically skilled people: Try to start entry-level at a specialized IT consulting firm so you can learn all the aspects whilst working on various BI projects and see what you like.”

“Don’t become just a tool specialist; develop an understanding of the fundamentals and never forget SQL.”

“1. Study, study, study. 2. Learn excellent communications skills, you will need them.”

I hope this helps shed some light not only on building a BI team, but also on getting started with your career in BI. If you still have questions, please post a comment and I will do my best to answer them, or I’m sure other BI professionals will share their thoughts as well.

Developing Change Management for BI

One of the things I struggle with is change management for business intelligence. One of the toughest things to explain to non-BI people, whether they are in QA/QC or general IT, is that BI is not a transactional system or IT application, so a traditional change management process doesn’t really fit the mold. In some regards the pace of change in a mature BI implementation can be a day or even a few hours for some aspects, like reports or dashboards, whereas your database schemas, aggregate tables and cubes change at a slower pace. The challenge comes from trying to fit one change management structure onto processes that spin at different rates and have different responsiveness needs.

The first thing we should address is the name, because we are not managing change in a typical application development situation; we are managing extension or addition. Rarely do we “change” anything in a traditional sense. We are developing new ways of looking at the data and throwing out the outdated parts. We are creating new views of the data or incorporating additional data into our warehouses.

Secondly, I think it makes sense to break the deliverables of a BI project into six pieces, because these pieces have different churn rates. At a high level, those six pieces are Staging, Stars, Aggregates, Cubes, Semantic Layers and Reports/Dashboards/Visualizations.

[Chart: Frequency of change by BI layer]

If you are familiar with Gartner’s Pace-Layered Application Strategy, you will understand when I say that the Staging and Stars are “Systems of Record”, and you wouldn’t want these things changing at a rapid pace because of the impacts that could cause downstream. These systems should follow a rigid process for object promotion because of the potential implications. Your aggregates and cubes are more like a “System of Differentiation”, where you will see more frequent changes than in the staging and star schemas, so there needs to be more flexibility in how you manage object promotion through your development, quality and production environments. Finally, the semantic layers and reports can be viewed as a “System of Innovation”, where you are constantly experimenting with the data, developing new analytics and mining the data for new predictors and indicators of business performance. These items have the potential to change daily, and if you have followed a rigid process for the “system of record” and painstakingly tested the code for validity and reliability, then you should have the confidence that the reporting will be correct based on the requests and whims of the business users.
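One way to make those different churn rates concrete is to codify them as promotion rules. The sketch below is purely illustrative: the layer names come from this post, but the cadences, approval counts and testing flags are placeholders you would tune to your own organization.

```python
# Hedged sketch of codifying the three pace layers as promotion policies.
# The specific cadences and approval counts are illustrative, not prescriptive.
LAYERS = {
    "system_of_record": {            # staging, star schemas
        "objects": ["staging", "stars"],
        "promotion_cadence": "monthly",
        "required_approvals": 2,
        "full_regression_test": True,
    },
    "system_of_differentiation": {   # aggregates, cubes
        "objects": ["aggregates", "cubes"],
        "promotion_cadence": "weekly",
        "required_approvals": 1,
        "full_regression_test": True,
    },
    "system_of_innovation": {        # semantic layers, reports, dashboards
        "objects": ["semantic_layers", "reports", "dashboards"],
        "promotion_cadence": "daily",
        "required_approvals": 0,
        "full_regression_test": False,
    },
}

def promotion_policy(object_type: str) -> dict:
    """Return the change-management rules that apply to a given BI object."""
    for layer in LAYERS.values():
        if object_type in layer["objects"]:
            return layer
    raise ValueError(f"Unknown BI object type: {object_type}")

print(promotion_policy("cubes")["promotion_cadence"])  # -> weekly
```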

As an aside, you have to take your project management method into consideration. If you are working in an Agile fashion you obviously can’t work on a monthly promotion cycle, because by the time your last minimal marketable feature moves into production, you’ve completed the next sprint and you will always be playing catch-up from a change management perspective. If you are still working in a “traditional waterfall” you have a bit more room to plan for those production moves.

Lastly, consider what the business truly needs and place your organization on the Business Intelligence Maturity Model before deciding what change management strategy is right. If you do not work for an analytically minded company, then your initial BI offerings aren’t going to be used as a system of innovation; you are going to be replacing reporting that exists in legacy systems and pushing towards the next level of maturity. As your BI program matures, you need to reassess your change management needs to fit the company’s level of maturity and look at your business users’ current needs. These practices and procedures should be reassessed every 6 to 12 months, depending on the speed of development within the BI program.
