Friday, December 11, 2009

Wednesday, October 28, 2009

OBIEE 10gR3 and Essbase System 9/11: Changing the measure dimension of a cube import in the RPD

Something I've wanted to post for a while, and was just asked about again today: when you import a cube with multiple measure hierarchies into an OBIEE rpd, what are the steps to switch from using one measure hierarchy to another?

Let's start with a very simple cube which was imported using "Account" as its default measure hierarchy:

In the structure you see the second measure hierarchy "Scenario" as one of the dimensions. I want to use "Scenario" as our measure dimension and hence our fact in the Business Model. The first thing to do is to clone the whole physical object "Sample Accounts":

In step 2, I double-click the "Custom" physical cube table, go to the hierarchy tab and open the "Account" hierarchy.

Note that the dimension type is currently "Measure Dimension". I change this to "Other" to transform "Account" into a normal dimension as far as OBIEE is concerned.

Now confirming with "Ok" shows us the effect of this change:

Now both our dimensions "Account" and "Scenario" are showing as hierarchy objects. The physical cube columns "Net Sales", "Margin" etc. are useless now, so I delete them. Next up is turning "Scenario" into the new measure hierarchy and hence the fact. Again I go into the properties of the hierarchy object and change its dimension type to "Measure Dimension":

That removes the "Scenario" dimension from the display in the physical layer:

What is missing now are the actual members inside the Scenario dimension. During import all the account members are created automatically as physical cube columns. A simple change of the dimension type doesn't automatically create the corresponding cube columns, so I do it manually:

It's actually very simple; you just need to take care to exactly match the MEMBER_NAME property or the ALIAS from Essbase. And make sure to follow the standard: if there's an alias, use the alias, otherwise use the member name. In my case I just quickly create two physical cube columns, "Actual" and "Budget":

And with that I'm done. I now have a new representation of my cube using the scenario dimension with its members as the measure dimension for analysis and the fact of the OBIEE Business Model:


Wednesday, October 21, 2009

OBIEE - Dynamic data source switching

I stumbled upon a post from Shiv Bharti and thought I'd expand a bit on the idea. First, to avoid having to use the "Advanced" tab, one of the "don't do this at home, kids" parts of Answers. Second, because just recently there was yet another question on OTN asking for advice on how to assign or force new values onto a session variable at runtime.

Let's do the example with an Essbase cube as the data source, to show that this approach works with any data source. First, I create an initialization block against an Oracle 10gR2 source so I can write a simple dummy select statement from DUAL, indicating CoFSM42 as my default Essbase server. CoFSM41 is a backup instance, and hence the Essbase server I want to switch to in order to see what the backup contains in terms of data loads (in this example, 42 contains data up until today whereas 41 holds last week's backup, so no data for this week).

select 'CoFSM42' from DUAL

Of course you can implement more elaborate solutions with control tables holding the servers and schemas of your different environment and instances.
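As a sketch of that more elaborate idea (the table and column names here are purely hypothetical), the initialization block could read from such a control table instead of DUAL:

```sql
-- Hypothetical control table, one row per environment:
--   CREATE TABLE obiee_env_control (env_name VARCHAR2(30), essbase_server VARCHAR2(64));
-- Initialization block SQL: fetch the Essbase server for the active environment.
SELECT essbase_server
  FROM obiee_env_control
 WHERE env_name = 'PROD';
```

The select still has to return exactly one value, which then populates the session variable just like the DUAL statement above.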

In the connection pool of my Essbase source, I change the "Essbase Server" parameter to "VALUEOF(NQ_SESSION.Essbase_Server)".

After starting the server, let's do a quick check of the data through Answers using the approach Shiv proposed. First, I create the request on my standard subject area, which currently points to CoFSM42 due to the session variable.

I see that the data I entered for this week is present. Using the SET VARIABLE prefix, I change the source to CoFSM41.

Ok, that's empty. Now, off to making this switch available on the dashboard through a prompt. The prompt is a fake prompt, simply unioning my two server names. Here again, you may use a control table which holds your servers.
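A minimal version of the SQL behind such a fake prompt (assuming you stick with literals rather than a control table) could look like this:

```sql
-- Two selectable Essbase servers, unioned into one result set
SELECT 'CoFSM42' FROM DUAL
UNION ALL
SELECT 'CoFSM41' FROM DUAL
```

Each row of the result becomes one selectable value in the prompt's drop-down.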

The important bit is to choose "Request Variable" under "Set Variable", and to enter "Essbase_Server" (my session variable used in the connection pool) as the variable name. Now I combine the prompt and the request in a dashboard to see the effect the "Set Variable" has on the request I've built. Bear in mind that there is no "is prompted" specification or filter on that request apart from my week specification.

Here's the results prompted with the current server:

And here the ones for the backup server:

So the prompt nicely switches between the different data sources for us, without needing Answers access or having to play with the query prefix within Answers. Plus, it has shown how prompts can be used to change the values assigned to session variables at runtime.

Now for all those who were hoping for another OBIEE/Essbase post or are thinking about using this to switch cubes easily in their architecture:
Unfortunately this approach can NOT be used to easily switch between cubes. And who's to blame? The substitution variables.

Essbase substitution variables, when defined at database level ("cube" in OBIEE terminology), arrive in the form "server:application:database:variable". E.g.: "CoFSM42:Sample:Basic:vCurQtr".
The "CoFSM42" bit can't be switched out using a variable, since the variable name is always interpreted as a literal string. Taking the example from above, something like "VALUEOF(NQ_SESSION.Db_Server):Sample:Basic:vCurQtr" won't work, since the variable is then actually called "VALUEOF..."

Pity, since that would make your development and testing of different Essbase sources extremely flexible.


Thursday, October 15, 2009

OBIEE / Essbase 9.3.1: UDML for cube outline changes

UDML constantly keeps popping up in conversations, in questions I receive and, as a quick search shows, on the OBIEE OTN forum. So before I go on with this post, a reminder: UDML is NOT supported as an rpd modification mechanism! Everything you do is at your own risk.

Right-o. I'd like to tie my post to the official OBIEE-Essbase modelling guide which can be found here.

Page 8, paragraph 4.1 "Subsequent Changes to the Essbase Outline" mentions the following:

"Cube structure changes (that is, adding or deleting dimensions, and levels) require either a re-import of the cube, or manual modification to the BI Server physical metadata objects to reflect changes."

This is something quite a few people have contacted me about, and honestly, I doubt there's really any case that justifies re-importing a cube if you know your way around the Admin Tool and UDML. Even though it's an Essbase source! (I'll stick to using the term "cube" to denote Essbase "databases" for this post.)
One thing needs to be noted though. It's a small thing, but it basically forces you to use UDML rather than manual modification through the Admin Tool.

Let's start with a basic cube which I had already imported into my RPD while I was still developing on the Essbase side. So far, as far as OBIEE is concerned, it only consists of several accounts which I can analyze by time.

On the Essbase side, the cube has grown somewhat and I've added my "Scenario" dimension.

To get this into OBIEE, I have two possibilities: re-import the cube, or manually create the dimension in the physical layer. Not wanting to lose my work on the BMM and Presentation layers, I choose the latter.

Right-click the "Physical Cube Table" object and select "New Object" -> "Hierarchy".

Then create two new "Physical Cube Columns" below the physical cube table:

Now I have the hierarchy and the two columns:

Next we create the actual hierarchy tree out of them. "New Physical Level":

And add the column:

And the same thing for level 2 giving us this:

Now this structure is correct, usable and transformable into a corresponding business model:

However, in the background one little thing goes wrong which can give you a headache in Answers... especially since tracking down the source of the weird errors it produces is a real pain. I have to admit it took me a while to figure it out.

Let's take both the time and the scenario hierarchy and copy+paste them into a text editor.

Looking at the top level of the exported UDML, the two hierarchies are alike and don't differ:

DECLARE HIERARCHY "Demo"."Sample".."CustomDemo"."Time" AS "Time" UPGRADE ID 2161957949 HAVING (
"Demo"."Sample".."CustomDemo"."Time"."Gen6,Time" ) MEMBER TYPE ALL EXTERNAL "Time"
DECLARE PHYSICAL LEVEL "Demo"."Sample".."CustomDemo"."Time"."Gen1,Time" AS "Gen1,Time" UPGRADE ID 2161959167 HAVING (
"Demo"."Sample".."CustomDemo"."Gen1,Time" )
KEY "Demo"."Sample".."CustomDemo"."Gen1,Time"

DECLARE HIERARCHY "Demo"."Sample".."CustomDemo"."Scenario" AS "Scenario" UPGRADE ID 2161960605 HAVING (
"Demo"."Sample".."CustomDemo"."Scenario"."Gen2,Scenario" ) MEMBER TYPE ALL EXTERNAL "Scenario"
DECLARE PHYSICAL LEVEL "Demo"."Sample".."CustomDemo"."Scenario"."Gen1,Scenario" AS "Gen1,Scenario" UPGRADE ID 2161960612 HAVING (
"Demo"."Sample".."CustomDemo"."Gen1,Scenario" )
KEY "Demo"."Sample".."CustomDemo"."Gen1,Scenario"

Looking at the two respective extracts for the second level, we see the difference:

DECLARE PHYSICAL LEVEL "Demo"."Sample".."CustomDemo"."Time"."Gen2,Time" AS "Gen2,Time" UPGRADE ID 2161959169 HAVING (
"Demo"."Sample".."CustomDemo"."Gen2,Time" )
KEY "Demo"."Sample".."CustomDemo"."Gen2,Time"
LEVEL NUMBER EXTERNAL 1

DECLARE PHYSICAL LEVEL "Demo"."Sample".."CustomDemo"."Scenario"."Gen2,Scenario" AS "Gen2,Scenario" UPGRADE ID 2161960614 HAVING (
"Demo"."Sample".."CustomDemo"."Gen2,Scenario" )
KEY "Demo"."Sample".."CustomDemo"."Gen2,Scenario"
LEVEL NUMBER EXTERNAL 0

For the imported hierarchy "Time", the "LEVEL NUMBER EXTERNAL" is correctly incremented and stored as "1" (and in fact represents the level number in Essbase), while for the manually created hierarchy "Scenario" the external level number stayed at "0".
If you have hierarchies with more than two levels, every level from 1 to N ends up with an external level number of "0".

In the rpd, there is no way for you to affect the external level number, so UDML is your only choice. In all honesty, I normally write new dimension hierarchies (the ones that need to be reflected in the rpd due to cube changes) directly in a text editor: starting from an existing hierarchy which I copy and paste, I write the UDML to fit the Essbase outline and then adapt the external level number to match the real Essbase level number.
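As an illustration of that workflow (the syntax is modeled on the exports shown above, so double-check it against your own rpd's UDML before using it), a hand-written second level for "Scenario" with the external level number adapted would look roughly like this:

```
DECLARE PHYSICAL LEVEL "Demo"."Sample".."CustomDemo"."Scenario"."Gen2,Scenario" AS "Gen2,Scenario" HAVING (
"Demo"."Sample".."CustomDemo"."Gen2,Scenario" )
KEY "Demo"."Sample".."CustomDemo"."Gen2,Scenario"
LEVEL NUMBER EXTERNAL 1
```

The "LEVEL NUMBER EXTERNAL" line is the bit the Admin Tool won't let you set through the UI.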

With that problem out of the way, there's really nothing in terms of cube outline changes that you can't represent without having to re-import the whole thing.

So much for today. Until next time!

Thursday, September 3, 2009

Thomas Siebel's Hydaspes moment

Tom Siebel was attacked by an elephant while on a photo safari back in August, but I didn't actually hear about it until now:

What can I say? Damn, what a holiday...wish you a complete recovery!

Jeff McQuigg's new blog location

Jeff McQuigg moved his blog away from IT Toolbox so I thought I'd quickly share this:


Monday, August 31, 2009

Essbase cube imports do not correctly generate the business model (dimensions, logical tables,…)

I realized this one while importing a rather large cube on a test machine. After the initial import of the cube definitions into the physical layer, I pulled everything over into the BMM layer and saw that dimensions I’d expect were missing completely from the business model. Checking back on the physical layer I saw the corresponding hierarchies were missing as well.

Re-importing and fumbling around didn't resolve this, while a cross-check import on my laptop produced a correct representation in both the physical layer and the BMM layer.

Luckily, it seems I wasn’t the only one hitting this issue since a Metalink search yielded document 872342.1. The issue is, that the Essbase API doesn’t find enough open ports to import the outline successfully.

Workaround: Open the registry and navigate to:


Modify or create the DWORD parameter "MaxUserPort" with value 65534. Then apply the changes, start the Admin Tool again and re-run the import.
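For reference, the change can also be scripted; "MaxUserPort" conventionally lives under the Windows TCP/IP parameters key, but verify the exact path against document 872342.1 before applying this on your machine:

```bat
REM Raise the upper bound of user ports so the Essbase API finds enough open ports
reg add "HKLM\SYSTEM\CurrentControlSet\Services\Tcpip\Parameters" /v MaxUserPort /t REG_DWORD /d 65534 /f
```

A reboot (or at least a restart of the affected services) may be required before the new value takes effect.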


Tuesday, July 28, 2009

RittmanMead Oracle BI Training Days 2009

I'm pleased to announce that the RittmanMead training days for Oracle BI are now officially open for registration.

Speakers at this event will be Mark Rittman, Venkat Janakiraman and myself.


Day 1
o Oracle BI and EPM architecture overview – Mark Rittman
o Oracle BI EE Data Modeling against DW and 3NF sources – Mark Rittman
o Oracle BI Delivers + Integration with Java and BI Publisher – Venkat Janakiraman
o What’s new in Oracle BI, DW and EPM from Oracle Open World – Mark Rittman

Day 2
o Oracle BI EE Data Modeling against Essbase – Venkat Janakiraman
o Leveraging MDX functions and calculations in OBIEE – Christian Berg
o Integrating Security across OBIEE and EPM – Venkat Janakiraman
o I can do this in Hyperion – how do I do it in OBIEE? – Christian Berg and Venkat Janakiraman

Day 3
o OBIEE Systems Management with OEM BI Mgmt Pack – Mark Rittman
o OBIEE Configuration Management Best Practices – Christian Berg
o ODI functionality in Oracle BI Applications – Mark Rittman
o ODI Integration with Essbase, Planning and Oracle EPM Suite – Venkat Janakiraman

Once more, here's the link to the full event details:

You can find the registration page at


Thursday, July 23, 2009

OBIEE - Catalog path containing "."

I was just testing multiple web catalogs when I realized that it's a bad idea to put version numbers into the web catalog name.

As an example, I have my web catalog "samplesales_paint_v1.3" sitting ready in OracleBIData\web\catalog and my instanceconfig.xml looking like this:

<?xml version="1.0" encoding="utf-8"?>
<WebConfig>
  <ServerInstance>
    <DSN>AnalyticsWeb</DSN>
    <CatalogPath>F:\OracleBIData\web\catalog\samplesales_paint_v1.3</CatalogPath>
  </ServerInstance>
</WebConfig>

Starting up the server will not load the referenced catalog, but rather create a new one from scratch: "samplesales_paint_v1"

And the Oracle BI Presentation Services Administration duly notes:

Physical Presentation Catalog Path \\?\F:\OracleBIData\web\catalog\samplesales_paint_v1\root

The sawlog0.log reads as follows:

Type: Error
Severity: 40
Time: Wed Jul 21 23:46:36 2009
File: project/webcatalog/localwebcatalog.cpp Line: 1507
Properties: ThreadID-4328

Could not load catalog F:\OracleBIData\web\catalog\samplesales_paint_v1.3. Either it does not exist or insufficient permissions.
Type: Warning
Severity: 40
Time: Wed Jul 21 23:46:36 2009
File: project/websubsystems/httpserverinit.cpp Line: 49
Properties: ThreadID-4328

Creating Catalog F:\OracleBIData\web\catalog\samplesales_paint_v1.3.

The log is incorrect on both counts: "F:\OracleBIData\web\catalog\samplesales_paint_v1.3" does exist, and while the creation warning uses the correct name, what actually gets created is the folder "F:\OracleBIData\web\catalog\samplesales_paint_v1".

XML normally accepts "." inside element content, so I guess this is a legacy fragment from Siebel Analytics versions where the web catalog was a .webcat file. Why? Well, using this element:

<CatalogPath>F:\OracleBIData\web\catalog\paint.webcat</CatalogPath>

starts up my "paint" catalog folder ;-)
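By the same token, the safest workaround is simply to keep dots out of the catalog name altogether, e.g. by renaming the folder with an underscore and pointing the element at the dot-free name:

```xml
<CatalogPath>F:\OracleBIData\web\catalog\samplesales_paint_v1_3</CatalogPath>
```

With no "." in the path, nothing gets truncated and the intended catalog loads.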

Any comments on this are welcome.


Monday, July 20, 2009

"in between" filters for MDX sources

Fiddling around with some more of the functional options we've all come to know, like and use frequently, I found that "in between" filters in Answers requests going against MDX sources (Essbase, MS Analysis Services etc.) don't work as expected.

To showcase this, I have created a simple request. Planning and Current numbers by Fiscal Month within Europe. And I'm interested in all months between Q1 2009 and Q4 2009:

Checking the results tab we see that it's not necessarily what we'd expect:

Doing a full data scroll reveals that basically all Fiscal Month members are being pulled up.

Here's the MDX from the log:

set [Geo3] as 'Filter([Geo].Generations(3).members, (([Geo].CurrentMember.MEMBER_ALIAS = "Europe" OR [Geo].CurrentMember.MEMBER_Name = "Europe")))'
set [Time Period3] as 'Filter([Time Period].Generations(3).members, ( NOT (([Time Period].CurrentMember.MEMBER_ALIAS < "Q1-09" OR [Time Period].CurrentMember.MEMBER_Name < "Q1-09"))) AND ( NOT (("Q4-09" < [Time Period].CurrentMember.MEMBER_ALIAS OR "Q4-09" < [Time Period].CurrentMember.MEMBER_Name))))'
set [Time Period4] as 'Generate({[Time Period3]}, Descendants([Time Period].currentmember, [Time Period].Generations(4),SELF), ALL)'
member [Scenario].[MS1] as 'AGGREGATE(crossjoin {[Geo3]},Scenario.[Current])', SOLVE_ORDER = 100
member [Scenario].[MS2] as 'AGGREGATE(crossjoin {[Geo3]},Scenario.[Planned])', SOLVE_ORDER = 100
{ [Scenario].[MS1],
[Scenario].[MS2] } on columns,
NON EMPTY {{[Time Period4]}} properties ANCESTOR_NAMES, GEN_NUMBER on rows
from [Sample.Sample]

And here's the important bit:

set [Time Period3] as 'Filter([Time Period].Generations(3).members, ( NOT (([Time Period].CurrentMember.MEMBER_ALIAS < "Q1-09" OR [Time Period].CurrentMember.MEMBER_Name < "Q1-09"))) AND ( NOT (("Q4-09" < [Time Period].CurrentMember.MEMBER_ALIAS OR "Q4-09" < [Time Period].CurrentMember.MEMBER_Name))))'
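For comparison, what one would expect from an "in between" filter is an actual member range; Essbase MDX offers the MemberRange function for exactly this (a sketch only, using the member names from the log above):

```
set [Time Period3] as 'MemberRange([Time Period].[Q1-09], [Time Period].[Q4-09])'
```

Instead, the generated Filter() compares MEMBER_ALIAS/MEMBER_NAME as strings, which is why members far outside the intended range come back.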

I've opened an SR for this, since there's no mention of it on Metalink, in the release notes or in the new features guides.


Updated my blog list

Added two new Oracle BI blogs to my blog list:

Changing print controls through catalog manager

Recently I was off to change the print options on all of my dashboard pages. Naturally, as with all settings, this is encoded in the XML and can be mass-manipulated through the Catalog Manager.

Here's the section in the XML, changed for landscape format, A4 paper size (yes, no weird US sizes ;-)) and explicit exclusion of headers and footers:

<?xml version="1.0" encoding="UTF-8"?>
<sawd:dashboardpage sawd="" saw="" xmlversion="200705140" isempty="false" duid="hsdfj3478387bs82" personalselections="true">
<saw:pageheader show="false"/>
<saw:pagefooter show="false"/>
</sawd:dashboardpage>


Friday, July 17, 2009

Sun shareholders "ok" Oracle takeover

Yes, I know they're always talking about "mergers"...just like they "merged" with Siebel or they "merged" with Peoplesoft ;-)

"There can be only ONE!"


Monday, July 13, 2009

OOW 2k8 and 2k7 presentations

Thanks to Steven Chan for reposting the links to the OOW presentations. Very useful since I had lost the URL myself a while ago.


Thursday, July 9, 2009

You know you're surrounded by geeks when...

...on, there's a posting on the return of Monkey Island!

God, I love this industry. And yes, I have the game on order already and definitely will get each and every incarnation of the old classics from Steam, and then the ones they're preparing for the Wii ;-)

Tuesday, July 7, 2009

OBIEE / Essbase MDX generation issues finally resolved

In one of my recent posts I was talking about the different patch levels for and and their impact on functionality and the MDX produced. Interestingly enough, not all of the problems appear with every cube outline imported into OBIEE. Oracle initially had trouble reproducing some of the errors (using the normal OOB Essbase sample cubes).

To recap: the changes to the connectivity DLLs first introduced in patch 7349048 (for enabled OBIEE/Essbase security integration, but had some side effects: performance dropped by up to 3000% in some cases, standard report-level totals and pivot totals stopped working, etc. This is mostly due to wrongly produced MDX. Especially for the grand totals, the situation is quite clear:

set [Currency2] as 'Filter([Currency].Generations(2).members, (([Currency].CurrentMember.MEMBER_ALIAS = "US Dollar" OR [Currency].CurrentMember.MEMBER_Name = "US Dollar")))'
set [Time Period2] as 'Filter([Time Period].Generations(2).members, (([Time Period].CurrentMember.MEMBER_ALIAS = "2009" OR [Time Period].CurrentMember.MEMBER_Name = "2009")))'
set [Time Period3] as 'Filter(Generate({[Time Period2]}, Descendants([Time Period].currentmember, [Time Period].Generations(3),SELF), ALL), (([Time Period].CurrentMember.MEMBER_ALIAS = "Quarter 1" OR [Time Period].CurrentMember.MEMBER_Name = "Quarter 1")))'
member [Currency].[CurrencyCustomGroup] as 'Sum([Currency2])', SOLVE_ORDER = AGGREGATION_SOLVEORDER
member [Time Period].[Time PeriodCustomGroup] as 'Sum([Time Period3])', SOLVE_ORDER = AGGREGATION_SOLVEORDER
{ [Account].[Sales]
} on columns
from [Sample.Sample]
where ([Currency].[CurrencyCustomGroup],[Time Period].[Time PeriodCustomGroup])

The MDX simply contains no "on rows" specification. These changes remained in all subsequent patches and are also part of Hence, this release suffers from the same problems.

On, there exists patch 8444119, which basically was constructed after. 8444119 itself does not solve any of the issues mentioned above. It does, however, include a small, undocumented switch which resolves all of them. NOTE: this is NOT documented and will only work on! Oracle told me they're working on a similar fix for and on updating the patch information, but there's no sign of either yet.

In a environment, install patch 8444119, then create a new environment variable: "OBIS_Essbase_CustomGroup_Generation". Set the value to "0" (without the quotes). This will correct the issues and fix the MDX fired against Essbase. Also, performance will stabilize and go back to the level of an unpatched
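On Windows, a sketch of setting the variable machine-wide (setx /M needs an elevated prompt, and the BI Server service must be restarted before new processes pick the variable up):

```bat
REM Persist the switch for all new processes, then restart the Oracle BI Server service
setx OBIS_Essbase_CustomGroup_Generation 0 /M
```

Setting it as a system variable rather than a user variable matters when the BI Server runs as a service under a different account.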

Oracle unfortunately didn't specify in detail what this OBIS_Essbase_CustomGroup_Generation "performance knob" (as they call it) does behind the scenes, nor what the other two available settings, "1" and "2", entail. Hopefully they update patch 8444119 soon, since its documentation still says nothing about the existence of this environment variable.

Document 845594.1 has been created and published on Metalink. This is the only existing document which points to or mentions OBIS_Essbase_CustomGroup_Generation. So if you're out there, going against Essbase cubes and running with 8444119 (or any other MDX-influencing patch below) or a full blown, then check it out.

WARNING: While document 845594.1 outlines the issue and the solution, the version it refers to is WRONG!

"Applies to:
Business Intelligence Suite Enterprise Edition - Version: [1900] - Release: 10g"

Patch 8444119 and the environment variable "OBIS_Essbase_CustomGroup_Generation" only apply to patch. You cannot use this solution with


Monday, July 6, 2009

OBI Apps ODI vs. 7.9.6 Informatica

I was just asked what's actually in the BI Applications release which comes with ODI rather than Informatica. In general, the ODI release is, while the newest Informatica one is 7.9.6.

Here's the link to the ODI Apps (Controlled Availability Release) documentation:

So BI Apps with ODI is an Oracle eBusiness-exclusive release, with eBusiness required to be at release 11.5.10. These are the applications covered in the release:
  • Oracle Financial Analytics
  • Oracle Human Resources Analytics
  • Oracle Supply Chain and Order Management Analytics
  • Oracle Procurement and Spend Analytics
The newest DAC on the other hand (it's no longer included in the files you download for or 7.9.6) is and can be found here:


Saturday, June 27, 2009

Time Series and Fragmentation

I received a hint on OBIEE time series functions from my friend Peter S. Apparently the time series functions AGO and TO_DATE don't work when you use them on tables with fragmented LTSs.

A quick test yielded this:

State: HY000. Code: 10058. [NQODBC] [SQL_STATE: HY000] [nQSError: 10058] A general error has occurred. [nQSError: 22042] AGO function may not be used on 'Amount' because its source is horizontally partitioned. (HY000)
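For context, the error above is triggered by any request using AGO on such a measure; the column formula looks along these lines (subject area, table and level names are made up for illustration):

```sql
-- AGO(measure, time dimension level, number of periods back):
AGO("Sales"."Facts"."Amount", "Sales"."Time Dim"."Month", 1)
```

As soon as the measure's logical table source is horizontally partitioned (fragmented), the BI Server refuses the function with nQSError 22042.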

Checking Metalink, I found document 739584.1, "AGO function with combined multiple sources". Enhancement request #7438154 has been raised and is targeted for 11g.

Cheers all!

Thursday, June 11, 2009

Silence of the posts

My blog has been playing the sounds of silence recently. I don't really get the chance to write posts a lot these days, at least not with sufficient detail. And yes, I know I still owe you several posts / topics.

Sorry guys. Content coming up soon ;-)

Thursday, May 28, 2009

The vendor client relationship

Thanks for tipping me off on this one, it really made me laugh :-D Plus, it is SO true!

Thursday, May 21, 2009

Update on the OBIEE/Essbase security integration

Hi guys,

Just a quick note to let you all know: I'm not cutting you out of the loop by not writing up the details, but if I had to document all the tests I run and all the issues I find, I'd still be here next month.

Bottom line:
- watch out for with 7349048
- watch out for with 8444119
- watch out for

With those three versions be VERY careful that you test each and every report you have running against Essbase.

I will update you as soon as I can with a finalized overview. Probably only once Oracle has come back...

Tuesday, May 19, 2009

Reminiscing Brighton

Last week I had the chance and (honestly) the privilege to take part in the BI Forum in Brighton. First and foremost, I'd like to say a huge thanks to the RittmanMead team for organizing this and giving us geeks a chance to unite. We've got to do this more often!

Next up, something personal. I was extremely happy to finally meet Venkat face to face. We seem to have developed one of those friendships that start online, and when you meet, it's like you've known each other for years. Also, I was extremely pleased to meet Ed again. Damn, already a year since we finished our last project together.

Let's get back to business. The lineup of speakers was quite something, to say the least. I'm not going to name anyone in particular since I'd then need to name each and every one of them.......ok, here we go: Craig Stewart, Edward Roske (We're not wooorthy!), Mark Rittman, John Minkjan, Venkat (I think you're actually entitled to carry "Venkat" as your sole identifier. Everyone else must change his name ;-)), Adam Bloom, Andreas Nobbmann, Mike Durran, Emiel von Bockel, Maarten-Jan Kampen, Adrian Ward, Antony Heljula and Peter Brink. Total quality in all presentations, and here are my 10 cents on them:

a) "BI Apps with ODI" from Craig Stewart: now here we have an interesting new piece in Oracle's product portfolio. Good for new customers. Questionable (at best) for existing ones.

b) Edward Roske's "Internals of Essbase". What can I say? Edward Roske. On Essbase. Ocean. Lots of water. Understatement. And I just loooved his Texan humour.

c) Mark Rittman spoke about "BI Apps Optimization". Do I have to say "see b)"? Didn't think so.

d) John Minkjan gave a very nice presentation on caching. I would have liked to tape it and hit a lot of people over the head with the tape (and the camera). Listen to him, people. Caching is a tool, a technique, something that can help you. It's NOT the solution to your problems! Thanks, John.

e) Venkat had a cool gimmick prepared. He had actually written a program to keep OBIEE and Essbase security in line. Extremely nice stuff! It was his first presentation in a forum like this and in Europe. Well done! I'm going to steal that ;-)

f) Then we "blew up a BI server" with Adam Bloom. Talk about a small cause with a huge effect. Ouch.

g) Andreas Nobbmann's "UDML and XML for grown-ups" session. I vote for his presentation to be judged "Too dangerous to be widely known!". Please don't let kids touch that one. They destroy enough as it is. Personally... I loved it ;-)

h) Friday started with Mike Durran on 11g. I won't say more. Or I'd have to kill you. Seriously. Ed Roske has the gun.

i) Emiel von Bockel's presentation. Now THAT spawned discussions. Great stuff! I'd love to see how their solution evolves over the next 3 to 5 years. I hope he keeps us up to date.

j) Oh yeah... I'm still not over Maarten-Jan Kampen's statement about using "a bit of code" for his "OBIEE and Mapviewer" integration. Dude... copyright that stuff.

k) Adrian gave me permission to skip his session since I had stuff to do, so not much to say here.

l) Antony Heljula's "SOA" bit was cool. Reminded me of my first trials after Mark's masterclass last year. Gotta say, though: very nicely presented. Better than most of Oracle's sales force.

m) Peter Brink's "Cube Organized Materialized Views". Essbase cubes? Oracle OLAP? Cube-organized materialized views? It seems we're actually getting too many options from Oracle right now. It would be interesting to see how Oracle positions those options with clients.

Wrapping the whole event up in one word: awesome. In two words: fraking awesome! Loved it, look forward to the next one, and now I want to do one myself in Switzerland. (Check my blog or Andreas' for news on that idea.)

That's it from my side for today. Off to catch some sleep.


Grand Totals in Essbase sourced requests after

I noticed something very disturbing after applying to our development environment: report grand totals for tables and row totals for pivots don't yield any results anymore, unless you manually set all fact columns to "Sum" or "Avg" in the report. The "Aggr_External" which normally had its effect (by being imported from the cube through the physical and BMM layers to the presentation layer) no longer takes care of this.

Let's look at the issue in detail:

Upgrading to, this is what we find as a default behaviour in a simple table report:

And in a pivot:

Now think for a second about the concept behind building an rpd on top of an Essbase source. Ponder what Venkat wrote about aggregations in his four-article series here, here, here and here. No longer utilizing Aggr_External and simply no longer aggregating at all is just plain stupid. (Yes, I'm on a bit of a rant here...)

Now to correct that quick & dirty in the report, we assign an aggregation rule in the column formula:

And...oh miracle...we have data again:

Now, as I said, I find this nothing short of absurd.
Why has this been changed? Is this actually expected behaviour? If so, what's the rationale behind it? Normally, aggregation rules set in the BMM layer always persist unless they are explicitly overridden, either in the column formula or, for pivots, in the measures. Why drop this all of a sudden?

My manual intervention is a workaround, but I'm facing several cubes and almost a hundred reports. Furthermore, one of the Account dimensions has more than 400 members, so manually changing 400 potentially mapped account members isn't fun...

So the solution (if I actually keep and don't just drop it for a downgrade) would be to modify the BMM layer and change all normal account members to "Sum" and all percentages to "Avg" as the aggregation rule.

Stay tuned for my upcoming rant on dimension members and filters. Another extremely intelligent change with

Venkat reacted to my post here. I tested it, and it seems that despite having label-only top-level members like he mentioned, some analyses do work with totals and grand totals. This will need more investigation.



I still haven't blogged about the BI Forum in Brighton, the things I saw there, the things I wanted to test afterwards OR the things I need to blog about in detail to explain my ad-hoc speech...

...all because of bloody! God, that release is a pain when you're using Essbase. I know it's bad form to blog about what you're about to blog about, but I will definitely fire off a huge rant with detailed descriptions soon.


Saturday, April 25, 2009

OBIEE is out

John Minkjan has won the race for the first post on!

You just gotta love the little things he immediately found.
  • 100% installation
  • old samplesales
  • BIP still not working OOB
Well, let's hope that the background looks better and all the bugs that plagued us in the base release have been fixed. Off to search for a list of fixed bugs and included patches.


Thursday, April 23, 2009

Swiss readers?

No use hiding, folks. I can see there are people from CH browsing my blog. Yes, and I've also seen the connections from Baden and the Oracle CH offices ;-)

So please leave a comment. I'd like to see who's actually interested in this here in Switzerland.


Tuesday, April 21, 2009

The ominous patch 8293083

Greetings, programs!

Back on April 3rd, when I initially wanted to get patch #8293083, I hit a wall.

Product Oracle BI Suite EE
Last Updated 18-MAR-2009

The patch had been made obsolete over night. So I raised an SR with Oracle. First response on the 14th:

"Bug 8293083 is a duplicate of Bug 8332167.

Bug 8332167 is fixed in
Also in patch for Linux ARU 11105211
No patch for Windows."

Sweet. So because it was a duplicate of a Linux patch, the Windows patch was made obsolete? My mind boggles trying to understand that logic, but anyway, I insisted on getting the patch and voilà! The patch exists again for Windows as of the 17th.

Sounds good? No, not really.

8293083 is actually QF420, and in order to install it you need to replace your whole OracleBI\server\bin folder. In itself not an issue, but there's also patch 8290868 (QF410), a patch on top of 7482937, which resolves a rather serious issue in MDX generation for and/or filters.

The issue being that 8290868, like 8293083, replaces the whole bin folder. And Oracle still can't confirm that 8293083 is a cumulative patch that includes all of 8290868.

So right now there's a choice:
a) use 8293083, but without certainty that the fixes from 8290868 and 7482937 still function correctly, or
b) use 8290868 and 7482937 and wait for a final statement from Oracle.

I've gone for b), since it's extremely difficult to track down each and every weird behaviour in the MDX generation for all filters, EVALUATE statements, unions etc.
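Whichever option you pick, back up the existing bin folder before letting a QF patch overwrite it wholesale. A minimal shell sketch — the directory layout is assumed from a default install, so adjust the path to your environment:

```shell
# Copy OracleBI/server/bin aside before applying a quick fix,
# so a bad patch can be rolled back by restoring the copy.
# Pass the OracleBI home as the first argument.
backup_bin() {
    src="$1/server/bin"
    dest="$1/server/bin.pre-patch.$(date +%Y%m%d_%H%M%S)"
    cp -R "$src" "$dest" || return 1
    echo "$dest"
}

# usage: backup_bin /opt/OracleBI
```

Restoring is then just a matter of stopping the BI Server and copying the saved folder back over bin.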

Of course I will keep you guys updated with my latest findings and Oracle's statements.

End of line.

Troubleshooting problems with Target Level – Sampling Tables

Greetings, programs!

A little excursion into the little-known world of Marketing Analytics, namely sampling tables for target levels. Here are some tips for getting them to work correctly:

I.) Check that the physical table mapped to the W_ table defined as SAMPLE_W_ has Object Type “None” on the “General” tab…otherwise you can’t select a dynamic name for the table:

II.) Check that the correct Sampling Table identifier is associated with the physical table in the “Dynamic Name” tab:

III.) Creating new target levels in Siebel Analytics:
  • After creating new target levels for marketing segmentation, it is necessary to administer them correctly in the Presentation Server
  • Start the BI Server with the new rpd
  • Go to “Admin”
  • Go to “Manage Marketing Defaults”
  • Choose the “Default Campaign Load File Format”, the “Default Global Audience Segment” and the “Profile Dashboard” for each Target Level
  • If you don’t yet have these items, create them

IV.) Creating an Answer request as "Profile Dashboard" for a target level:
  • Create the request as usual with the columns you want to have displayed.
  • To make it work for the drill down on the segment count, add a filter that prompts for the ID. E.g.: "ROW_ID is prompted"
End of line.

Monday, April 20, 2009

Special characters in EVALUATE-wrapped MDX

Greetings, programs!

Something I ran into recently is a little bug in the way OBIEE produces MDX out of an EVALUATE wrapper when the MDX contains special characters. (yes, it is a bug)

Let's start with a basic and pure OBIEE report. We want to see the average coverage % by the 3rd generation of our product hierarchy:

Now let's switch out the presentation column for an EVALUATE function with MDX as what we really want is the average coverage % for the scenario "Actual":

EVALUATE('(%1.dimension.currentmember, [Scenario].[Actual],[Account].[Coverage % Avg]).value' as INTEGER, Product."Gen3,Product")

Let's run it.

"Unknown Member Coverage % Avg"? Liar. I see it directly in front of me. Off to checking the log:

set [Product3] as 'Generate([Product].Generations(2).members, Descendants([Product].currentmember, [Product].Generations(3), leaves))'
member [Account].[MS1] as '([Product].Generations(3).dimension.currentmember,[Scenario].[Actual],[Account].[Coverage %  Avg]).value'
member [Account].[MS2] as 'RANK([Product].Generations(3).dimension.currentmember,[Product].Generations(3).members)'
select
{[Account].[MS1], [Account].[MS2]} on columns,
{{[Product3]}} properties ANCESTOR_NAMES, GEN_NUMBER on rows
from [MyCube.Ess931_ASO]

-------------------- Query Status: Query Failed: Essbase Error: Unknown Member Coverage %  Avg used in query

Do you see the difference? While the GUI states "Coverage % Avg", the log and the MDX which is actually fired reference "Coverage %  Avg". TWO spaces!
So let's work around this by changing the EVALUATE to include the "StrToMbr" function:

EVALUATE('(%1.dimension.currentmember, [Scenario].[Actual],StrToMbr("[Account].[Coverage % Avg]")).value' as INTEGER, Product."Gen3,Product")

When running this, we get the correct results again:

So if you want to reference members containing special characters, use StrToMbr, since otherwise the BI server will generate non-functioning MDX.
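If you have many formulas to retrofit, the rewrite itself can be scripted. A small sketch of mine (not an OBIEE API) that wraps a member path in StrToMbr whenever a segment contains a problematic character — the character list is a guess based on what I've seen break, extend it as needed:

```python
import re

# characters that have broken plain member references for me; list is a guess
SPECIAL_CHARS = set("%&+-/")

def safe_member(path: str) -> str:
    """Wrap an MDX member path such as [Account].[Coverage % Avg]
    in StrToMbr(...) when any [...] segment contains a special character;
    otherwise return the path unchanged."""
    segments = re.findall(r"\[([^\]]*)\]", path)
    if any(ch in SPECIAL_CHARS for seg in segments for ch in seg):
        return f'StrToMbr("{path}")'
    return path
```

For example, safe_member('[Account].[Coverage % Avg]') returns 'StrToMbr("[Account].[Coverage % Avg]")', while safe_member('[Scenario].[Actual]') comes back untouched.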

End of line.