Oct 19 2018

BI Evolution

Categories: General Dave Rathbun @ 11:12 am

As a long-time Business Objects user / consultant / trainer it has been interesting to watch the growth and evolution of BI tools. I work for a fairly large company with a significant investment in Business Objects technology. It’s not going anywhere. But a few years back Tableau got a foothold with their “try it free for 30 days” strategy. Tableau became the flavor-of-the-month because people could download it for free, point it at Excel data (which no doubt was provided via a Business Objects extract), and make pretty pictures.

Fast forward to today: we have a Tableau infrastructure with over 100 cores, and the number of Tableau users is in five digits. And the new flavor-of-the-month is Microsoft Power BI because, quote, “Tableau is too expensive.” :lol: But ultimately the tool should not matter. A plastic spork gets food into your mouth as effectively as a silver spoon, right? A good (or bad) dashboard design is good (or again bad) no matter which tool is used to deliver it.

I think Tableau today is where Business Objects was years ago. Business Objects was one of the first tools to eliminate the need for business users to learn SQL to work with data. Tableau was one of the first tools to focus on visual exploration of data. Business Objects started out as a “departmental” BI tool. Anyone could set up an environment, build and distribute a universe, and let users start building reports. Tableau has also nailed departmental functionality and is working towards becoming a true enterprise tool. They’re missing some important parts like a true shared semantic layer, enterprise scheduling and distribution, and security. Business Objects has all of these things, plus the capability of exporting to Excel. Remember how long it took for us to get that feature? Anyone want in on the pool for how long it takes Tableau to bow to the same pressure? ;) In my opinion Tableau also needs to get better at working with live data connections instead of requiring extracts for everything.

I actually anticipated this way back when I started this blog, which is why I named it “Dave’s Adventures In Business Intelligence” rather than Business Objects. The goal is to distribute smart, intelligent, useful information. I should not care about the tool.

But tools are important, so next week I will be at the Tableau user conference in New Orleans. It will be weird to be at a conference with so many people (20,000+), none of whom know who I am. 8-)


Sep 13 2018

Copying Content Is Not A Compliment

Categories: Rants Dave Rathbun @ 12:45 pm

It was brought to my attention recently that someone has taken content from my blog and re-posted it to blogspot. I have used the contact form available on their blog to request the removal of my content and have gathered the information I need to file a DMCA complaint if that is not successful.

I don’t have advertisements on this site, and I don’t ask for donations. Everything I have done over the years remains in place and available for the broader user community. Why someone would think they should (or could) take what I have created and post it as their own work is beyond me. :-? Bottom line: if you find my content posted elsewhere on the Internet (quotes are fine, references are fine, copying is not) please let me know. Thanks.


Aug 28 2018

Resetting a Running Function

Categories: Calculation Context, General, Report Techniques, Variables! Dave Rathbun @ 11:02 am

I’ve been inactive on BOB for a while, as well as here… but recently I was reminded that some of the old questions still come up periodically. For example, how do you reset a running total? It’s not hard, but the syntax is not immediately obvious. Continue reading “Resetting a Running Function”
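Quick preview before you click through, sketched from memory (assume a block with a [Year] dimension and a [Revenue] measure; the object names are just for illustration): the trick is that RunningSum() accepts an optional reset dimension as an additional argument.

=RunningSum([Revenue];([Year]))

That accumulates revenue down the block and starts over from zero each time the [Year] value changes.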


Aug 14 2018

The More Things Change…

Categories: Calculation Context, General, Report Techniques Dave Rathbun @ 7:53 am

…the more they stay the same.

I’ve been fairly quiet here for a long time. Part of the reason is I’ve been busy with family and work. My older son will be a senior in high school this year, and my younger son is just starting driving lessons. But something happened a few weeks ago that made me want to write again.

At work I’m now overseeing the Tableau team. We still have a lot of Business Objects users, but Tableau has become the “flavor of the month” that everyone wants to use. Tableau certainly has the right idea in that their product works identically whether viewed on the web or via a mobile device, and their mapping options are really nice.

But at the core, it’s still a data presentation engine.

A few weeks back, I was trying to solve a problem that I knew how to solve in Web Intelligence. Specifically, I wanted to generate a grand total that would return the same value no matter where I dropped it on my dashboard. In Web Intelligence this is done via Calculation Context and it looks something like this:

=Sum([Revenue]) In Report

By adding the keywords “In Report” I am telling the tool that I want to see the grand total. I’ve covered calculation context before (Calculation Context Part I: Overview) and it remains one of my more popular posts. It’s definitely the one that has the most people asking, “Where’s part 2?” Other calculation options in Web Intelligence include “ForEach” and “ForAll” to go along with the “In” that I have already mentioned. Here are the definitions I wrote the first time around, with a quick worked example after the list:

  • In is used to specify exactly which dimensions to include in a context. Other dimensions in the block are ignored. Adding / removing elements from the block does not impact this calculation unless a removed dimension was specified in the context. In that case a #MULTIVALUE error is displayed.
  • ForEach is used to include a dimension in a context. The calculation context is still affected by other values in the block.
  • ForAll is used to exclude a dimension from a context. Other dimensions of the block will still be considered. Adding or removing values from a block might change the value, but it will always ignore the ForAll items.
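To make those definitions concrete, here is a sketch from memory, assuming a simple block with [Year] and [Quarter] dimensions and a [Revenue] measure (the object names are just for illustration):

=Sum([Revenue]) In ([Year])

That returns the yearly total on every quarter row for that year, because [Quarter] was not named in the context.

=Sum([Revenue]) ForAll ([Quarter])

Same result, written by exclusion: start from the block context and ignore [Quarter]. Finally, in a block that shows only [Year]:

=Max(Sum([Revenue]) ForEach ([Quarter]))

That returns the best single quarter within each year, even though [Quarter] does not appear in the block at all.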

Why bring this up again? Because Tableau has exactly the same options. :lol: Instead of Context, Tableau calls the feature Level of Detail. You activate a Level of Detail operation by using curly braces { } around your calculation. If you want a grand total, it looks like this:

{Sum([Revenue])}

That is the equivalent of “In Report” for Web Intelligence. What about the others?

Tableau provides FIXED, INCLUDE, and EXCLUDE as additional tags for their Level of Detail operations. Each of those is a direct equivalent to a Web Intelligence option.

FIXED is the equivalent of In. These two statements are equivalent ways to request revenue by year, even if other dimension values are present in the block or worksheet.

{FIXED [Year]: Sum([Revenue])}
=Sum([Revenue]) In [Year]

INCLUDE is the equivalent of ForEach. These statements are going to make sure the Year is included in the calculation, even if it’s not part of the displayed block or worksheet.

{INCLUDE [Year]: Sum([Revenue])}
=Sum([Revenue]) ForEach ([Year])

Note that if there are multiple Years then Web Intelligence will show a #MULTIVALUE error, and Tableau will show a * as the result.

EXCLUDE is the match to ForAll. These formulas will specifically ignore the Year value when generating Total Revenue, even if it’s present in the block or worksheet.

{EXCLUDE [Year]: Sum([Revenue])}
=Sum([Revenue]) ForAll ([Year])
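One more parallel, as a sanity check that the two features really do line up: percent of grand total. This is another sketch from memory using the same hypothetical [Revenue] object. In Web Intelligence:

=Sum([Revenue]) / (Sum([Revenue]) In Report)

And in Tableau:

Sum([Revenue]) / Min({ Sum([Revenue]) })

The only wrinkle on the Tableau side is that a Level of Detail expression returns a row-level value, so it has to be wrapped in an aggregate before it can be mixed with Sum(). Min() works because the expression returns the same number on every row.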

The person explaining the Level of Detail feature in Tableau started out by saying, “Now this is one of the more complex techniques that Tableau offers, so it may take a bit to get a handle on it.” As soon as he started the technical explanation I cut him off, saying, “Oh, it’s just like Calculation Context. I’m good.”

:-)

So yes, the more things change, the more they stay the same…


May 19 2016

Visualize Whirled Peas

Categories: General Dave Rathbun @ 9:37 am

Keeping with the food theme from my prior blog post, I present to you: 40 Maps That Explain Food in America. In fact, it was this article that led me to the Waffle House article to begin with.

The 40 maps use a variety of visualization techniques to show different food-related subjects. For example, what’s the most typical barbecue in Texas? It would appear to be brisket. Do you drink soda? Or pop? Or that red drink? (I’m not allowed to type out the name…) How far are you from a McDonald’s?

There are some predictable visualizations including color-coded maps, line charts, bar charts, and of course even a pie.

Mmmmm, pie.

(Image: a pie pie chart)


Mar 15 2016

What Is Your Waffle House?

Categories: Rants Dave Rathbun @ 9:14 pm

With “reporting tools” we have become extremely good at letting people see things they already know. At some companies it can be very difficult to break out of that mode. I can’t tell you how many consulting engagements I’ve had where I was simply asked to take something they already had and recreate it in a newer tool. While possible, work like that was hardly exciting.

On an unrelated note, FEMA, or the Federal Emergency Management Agency, apparently likes waffles.

What do these two concepts have to do with each other?

FEMA realized that Waffle House was an excellent indicator of whether electricity was available in an area, and therefore could be considered an indicator of an area they would have to assist. If a Waffle House was open, the area had electricity and passable roads. If the Waffle House was closed, maybe not so much. Even better, Waffle House locations were sprinkled all along the Gulf Coast as well as the Eastern seaboard, both hurricane-prone areas. You can read the full article (originally published in Popular Science) at the link below.

What I found interesting about the article was the fact that something that on the surface would seem to be completely unrelated becomes interesting when viewed through the proper perspective. I have been looking within my current company, trying to find something similar, and perhaps I am too close to things but so far I have not been successful.

What about you? Have you found your Waffle House?

Related Links


Feb 22 2016

When Charts Go Bad…

Categories: Rants Dave Rathbun @ 4:16 pm

I was reading about the FBI versus Apple issue earlier today and came across this graphic:

Pie charts are viewed with anything from disdain to downright hatred at times. I don’t want to open a debate on that. But you would think for something as simple as this, they would have managed to get it right…

I emailed the author of the original article and hopefully they’ll get the picture fixed.


Feb 17 2016

“Working As Designed”

Categories: IDT, Rants Dave Rathbun @ 12:19 pm

Those three little words: “Working as designed.” I hate them.

I don’t know why it took me so long to experience this, since I found this topic on BOB that raised the issue years ago. That being said, here’s what’s going on…

In Universe Designer (UDT) a designer can change the state of an object from visible to hidden without impacting a report. I use this technique to create objects that have a specific purpose but keep them out of general circulation. For example, I will often create a class called “Report Objects” that contains objects with special exception handling or a unique purpose. That class is visible in the development environment only, which means my report developers have access to this class. Before the universe is migrated to the QA (and production) environments that entire class is hidden.

This allows me to create report-specific objects without risking an ad-hoc report using one of them incorrectly. It’s a great technique.

A secondary reason why I use the “hide” option is to retire old objects. I don’t want to break reports, but I don’t want any new reports to be created with these objects. Hiding them means existing reports continue to work.

In Information Design Tool neither of these strategies will work. Once an object is hidden, any report that includes that object fails. :evil: Based on information from SAP, that’s the expected behavior. They acknowledge that it’s a change in behavior, but so far I have not found the reason for the change. The bottom line is that it’s “working as designed” and will not be fixed.

Keep this in mind when you consider converting your UNV universes to UNX: if you’re using the hidden-object trick for any of the reasons I outlined above (or others that I may not have considered), that technique will fail in IDT.


Nov 05 2015

Downgrading from Multi-Source to Single Source

Categories: IDT, Universe Design Dave Rathbun @ 10:01 am

I mentioned a few weeks ago that I had found a way to “downgrade” a multi-sourced universe to a single-source. I wanted to create a beautiful blog post with screen shots to walk everyone through the process, but have not managed to carve out the time. :oops: Rather than continue the radio silence, I’m going to write up the steps and hopefully come back and make this post prettier later on. I also have a request to make which will be at the end of this blog post, so please read to the end, and if you can help, please do so.

Background

We, like (I am guessing) many legacy Business Objects customers, have been slow to dip our toes into the “multi-source” waters offered by Information Design Tool (IDT). When we did our major upgrade to BI 4 a few years back, we evaluated whether we needed to convert all of our UNV universes to UNX, and decided not to, for a variety of reasons. But for new development we are typically looking at IDT as our universe creation tool. Continue reading “Downgrading from Multi-Source to Single Source”


Oct 12 2015

Yes, Virginia, You Can Switch a Multi-Source Universe Back to Single Source

Categories: IDT, Universe Design Dave Rathbun @ 10:07 pm

What’s the first decision you have to make when creating a new universe with the Information Design Tool (IDT)? You have to specify whether you want a single-source or multi-source data foundation. Once that selection is made, it cannot be changed.

Well, sort of.

We had an odd performance challenge with a particular universe. It seemed that when it was created, the developer thought they might want to eventually (perhaps) use multiple sources, so they went ahead and created a multi-sourced data foundation. But the project never ended up needing a second data source, so for over a year they’d been using a single-source multi-source universe. (Did you follow that?) As a diagnostic tool, I thought about recreating the universe with a single-source data foundation and re-pointing the various reports and dashboards so they would use the new universe. That would have been a lot of work, with no guarantee that it would have any impact on the issue, much less fix it.

Then I wondered to myself, what if I could figure out a way to “downgrade” the existing multi-source universe to a single source? That way I could still test my theory that our data federator engine wasn’t working as well as it should without having to re-point each report and redo each dashboard.

Spoiler alert… it worked. :) I was able to convert a multi-sourced universe to a single-sourced version without impacting the reports and dashboards, and we’ve been running on that version ever since.

How was it done? Well, I’m working on a presentation for the local DFW ASUG chapter meeting this coming Friday, and once I have that done I’ll post the details here as well. You’ll just have to wait a few days. :P

Update: I won’t be presenting at the DFW chapter meeting after all, but I will still be posting this solution soon, hopefully next week.

Secondary note: I have also done the process in reverse… converted a single-source universe to a multi-sourced version, but with significantly more work involved. If you have a large number of reports, however, it may be easier to rework the universe than to re-point every report. Time will tell if I’m successful or not…

