Aug 13 2013

Mastering Business Analytics with SAP

Categories: Conferences Dave Rathbun @ 12:27 pm

I have said this before: if you ever get invited to speak at a conference hosted by The Eventful Group, don’t wait, don’t think, just do it! They treat speakers fantastically, and their events are well put together. Thanks to Sophie, Debra, and crew for making my stay in Melbourne wonderful despite the fact that my luggage got there 36 hours after I did…

Josh Fletcher tweeted the following link to a collection of tweets and pictures on Storify that summarize the event:

I had two sessions. The first, on Monday, was scheduled opposite Mico, so I didn’t have that many people attend. 😉 In that session I talked about PepsiCo and our initial success (after a long and winding road) with Explorer. I will be repeating the talk (with some minor changes) at the upcoming SBOUC event in California. On Tuesday my second session covered the Information Design Tool and various items to consider when upgrading to BI4. I will be posting more about that topic here on my blog in the coming months.


Aug 01 2013

Is External Data Always Good?

Categories: Rants Dave Rathbun @ 6:43 pm

Note to readers: I started this post back in 2011. After taking a break from blogging I am going back and looking through some of my old drafts and seeing what might still be current, and what has expired. I thought that this one merited some additional attention and publication, even though some of the notes are from two years ago. — Dave

SAP had some fun at the BI 4.0 launch in New York a while back. For years SAP (and other vendors) have been talking about their ability to bring in external data from various social media sources. Two SAP presenters at the launch event took a vote via Twitter as to which tie would meet the “Scissors of Destiny” at the end of the session. (Steve Lucas made an impassioned plea to save his tie, which he said was a gift from his wife, versus Dave’s tie, which he “… just bought last night.” Steve won, and his blue tie survived.) It was a fun display of technology, but is it really that important? How impressive would it have been if the “fail whale” had picked that moment to make an appearance?

I don’t usually spend a lot of time here on my blog talking about philosophical aspects of BI as I am personally more interested in technical issues and solving problems. But the apparent consensus as to the importance of social media bugs me.

The Internet is a wild place where rules are not always followed. If there is money to be made, then someone will figure out a way to abuse the system. It’s not just the “little guys” either, as evidenced by the way retailer JC Penney apparently took specific steps to trick Google during the holiday shopping season. Again, this was back in 2011.

What do you do with the information?

Does it do any good to listen to what is being said on social media without having an action plan to respond?

Do you really trust an external entity (such as Facebook) to host critical data?

Did you know that you can reportedly buy Twitter followers now? (Seriously, google for “buy twitter followers” and see what you find.)

There are rumors that Sarah Palin got caught setting up a secondary Facebook account just so she could “like” herself and skew the results shown on her main page. This type of abuse – if performed manually – should have minimal impact. However, it is apparently far too easy to set up bots to do the same sort of thing at scale. In fact, there are companies that you can legitimately hire (as opposed to going underground) to do this for you. One term I came across while researching background for this article was quite amusing: hacktivist. 🙂

Is there a point to all of this rambling? Not really, I guess. Or if there is, it’s that despite SAP and everyone else apparently really wanting to make social media relevant, I find myself asking why it is so important.

Human behavior – online or not – often boils down to risks and rewards. The problem is that rewards can inspire the wrong behavior. I talked about this in a guest blog at The Decision Factor: Lessons in Business Intelligence: Be Careful What You Wish For. The cost of setting up a web site today is extremely minimal. The advertising revenue a single site can generate, however, is also very minimal. Suppose it costs $10 a year to host a site and the site makes $0.50 per month in revenue. It’s hardly worth doing, right? But what if you scale that up? Spend $1,000,000 to set up 100,000 sites at $10 each, and those same numbers generate $50,000 per month in income. I can’t find a link at the moment, but there was some guy who was making millions of dollars buying expired domains and putting junk content on them.

By one estimate, the Internet will soon have more garbage than valuable content! Some might say that this has already happened.

That being said, there are certainly valid reasons to consider using social media. The recent (yes, this really is recent) phenomenon of Sharknado proved that. 😉


Jul 30 2013

Pivot UserResponse() Values Into Rows

Categories: Report Techniques,Variables! Dave Rathbun @ 6:15 am

Several years ago I wrote a post that has generated a fair number of follow-up questions and comments. The post was related to splitting user response values onto separate rows and it used some basic string operations to do that. The important distinction is that the values were on different rows but remained in a single cell, which meant the output was suitable for a header cell.

Recently I got a comment that posed this question:

In one of my reports there is a prompt to select the Fiscal Year, and the user can select multiple LOVs. The prompt name is “Year”. Say for example the user enters 2011, 2012, and 2013. On using the function UserResponse() the report will show 2011;2012;2013

My requirement is to identify the minimum value from the LOVs that the user has entered. In this case the report should show the minimum value as 2011. Can you please guide me on how to achieve this?

Continue reading “Pivot UserResponse() Values Into Rows”


Jul 25 2013

Updated Strategy Renders Schema Change SDK Tool Obsolete

Categories: Universe Design,VBA Tools Dave Rathbun @ 10:23 am

Many years ago, when I first started working with Teradata, we had a challenge. The DBA team had defined a standard where each table was tagged to indicate its environment. Specifically, a development table would be called PROJ_D.TABLE_NAME and the equivalent table in production was called PROJ_P.TABLE_NAME. Why do this? In a Teradata connection I connect to a server, not to a schema (or instance, as Oracle would call it). One of the DBA strategies was to use the QA or “system test” hardware as a preliminary DR (disaster recovery) platform. That means they keep a copy of the tables in PROJ_S while the same tables exist as PROJ_P on the same server. In order to have queries sent to the correct tables I had to include the schema name (or prefix) on every table name in my universe. During DR testing I could reset my schema from PROJ_S to PROJ_P without moving the universe (it’s still on the QA server), and I would then be using the production tables.

While this did provide the advantage of being specific, it also presented a problem during migrations because I had to change my code. I first wrote about this back in 2008 when I shared a utility that I wrote to make this process easier. (Read the details in the Using the Designer SDK to Ease Migrations post.)

With the advent of BI4 the new challenge is that we don’t (yet) have an SDK. At the same time we have become a much more mature Teradata shop. Our DBA team recently introduced a new strategy that eliminates the issue altogether, meaning I don’t have to worry about either of the issues listed above.

New View Layer

Our standard for reporting says that our tools (Business Objects included) never access the source tables. As a result, the naming convention I described above was used on all of the views that I used in my universe. Assume we have views called PROJ_D.TABLE1, PROJ_D.TABLE2, and PROJ_D.TABLE3. In the “old days” I would have to update each of these views whenever I wanted to migrate my universe out of the development environment. Our new strategy seems simple on the surface, probably because it is :), but it solves both issues.

For each and every table that I am using we have now created a “semantic layer” view on top of the base view (which is on top of the base table). So for each of the above tables I now have this:

create view SEM_PROJ.TABLE1
as
select * from PROJ_D.TABLE1

The “PROJ” portion of the schema relates to the project, so it remains part of the view name. The “SEM” prefix is of course short for “semantic layer” and indicates that this view is used by some sort of external tool. What is missing from the new view name is the location tag (_D for development, _S for system test, or _P for production). This seems like a very simple solution (and it is), but it took a while for us to get here. We have created the semantic layer views for one project so far, and it’s a real pleasure to be able to migrate a universe without touching it anymore. 😎 I anticipate we’ll be using this strategy for all of our Teradata projects from this point forward. Obviously the select clause changes in each environment, but finally I have a consistent view name everywhere.
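As an illustration (a sketch only; the actual DDL and naming are whatever your DBA team defines), the production copy of the same view keeps the identical SEM_PROJ name, and only the schema inside the select changes:

```sql
-- Production server: the view name is unchanged, only the
-- underlying schema differs, so the universe needs no edits
create view SEM_PROJ.TABLE1
as
select * from PROJ_P.TABLE1
```

Because the universe only ever references SEM_PROJ.TABLE1, the same universe works unmodified on every server.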

When I have to reset my universe due to a disaster recovery exercise, it’s now up to the DBA team to re-point the semantic layer views to the proper schema. When I migrate the universe I no longer have to touch anything, except to perhaps change a connection. It’s a much cleaner process, and no longer requires me to be concerned about waiting for the full SDK to become available in BI4.


Jul 23 2013

Really Cool Stock Market Visualization

Categories: General Dave Rathbun @ 3:08 pm

I was looking over some financial sites the other day and ran across this visualization that is based on the stock market. The overall presentation is divided into blocks of companies grouped by sector. The size of the company within their sector is based on their market capitalization, and the color indicates the current stock price trend. Clicking on a sector allows the viewer to “drill” down into the sector data. At the bottom of the frame is a drop-down that allows you to specify the time range used to identify the up or downward trend, with options for since last close, 26 weeks, 52 weeks, or year to date. While the color (green or red) shows the direction of the stock price move, the intensity of the color shows the percentage of the move. A company like Barrick Gold which has dropped substantially during 2013 (down ~48%) is fairly bright red, while PepsiCo (up ~25% YTD) is a muted green.

I think this is one of the best uses of this visualization style that I have seen, and decided to share it.


Mar 16 2013

Funny For The Day

Categories: General Dave Rathbun @ 9:30 am

Click through for more from the same artist.


Feb 27 2013

Dagira Universe Compare Tool Bug Fix

Categories: VBA Tools Dave Rathbun @ 2:28 pm

A user recently commented that outer join settings were not being properly captured by my universe compare tool. It turns out there is a very simple fix. If you have previously downloaded a copy of the tool, simply open the VBA editor and find this section of code:

aColTypes(dsCharacterColumn) = "Character"
aColTypes(dsDateColumn) = "Date"
aColTypes(dsNullColumn) = "Null"
aColTypes(dsNumericColumn) = "Numeric"
aColTypes(dsTextColumn) = "Text"
aColTypes(dsUnknownColumn) = "Unknown"

Immediately after that (or after a blank line if you prefer) add these new lines of code:

' Map the Designer SDK join type constants to readable names
' so that outer join settings are captured in the comparison
aJoinTypes(dsFullOuter) = "Full Outer"
aJoinTypes(dsNoOuter) = "No Outer"
aJoinTypes(dsOuterLeft) = "Outer Left"
aJoinTypes(dsOuterRight) = "Outer Right"

That is the missing piece that was causing the compare tool to skip recording the outer join types.

I will update the downloadable version and upload it shortly. For now, this will fix the bug.


Feb 18 2013

Back After A Nice Break…

Categories: General Dave Rathbun @ 5:26 pm

Hello folks, I’m back after a nice break with a quick comment from our favorite curly tie engineer:

I needed a break from various things, but I should be back and blogging soon. Conference season is starting up. BI2013 was going to be first on my list, then SAPPHIRE. I had to back out of BI2013, and the jury is still out as to whether I’ll make it to the SAPPHIRE/ASUG Annual Conference.

I have finally started to do some serious work using the Information Design Tool, and my first blog post related to work flow differences between IDT and Universe Designer should be ready to come out shortly.

Finally, I have added a new permanent page to my side menu titled, “Donating To Dagira.” Please check it out and see if it’s something you would be willing to support, thanks!

I am way behind on approving comments. 😳 I will be working through those this week.


Nov 27 2012

Guest Blog Post: Be Careful What You Wish For

Categories: General Dave Rathbun @ 9:40 am

It has been a busy couple of weeks (months!) and my production here has suffered. My most recent post is not here but a guest blog at The Decision Factor: Be Careful What You Wish For. I was inspired by a post at the Freakonomics Blog that discussed human nature and the unanticipated result of incentive programs as I remembered one particular client experience with a dashboard. If you haven’t read it yet, give it a try.

I’ll be back here soon.

Author Note: The original link to The Decision Factor no longer works so I have recreated the post here. The link above will take you to that post instead.


Oct 10 2012

2012 SAP Business Objects Presentation Posted Soon

Categories: 2012 SBOUC Dave Rathbun @ 6:07 pm

My virtual machine that I used at the conference seemingly crashed, or at least it would not come up in any way that I could interact with it. I fixed that this morning and created the PDF file that I need to send to ASUG and post here. My apologies for the delay; it will be up within the next few days.

