Want to know The Truth About CPM?

18 May 2015

I don’t ordinarily write about conferences other than Kscope, but when I do…

…I write about the Rittman Mead BI Forum 2015

Tim German and I just spoke at Rittman Mead’s annual BI Forum.  Alas, we didn’t go to the one in Brighton (I attended the Comshare EMEA conference there in 1992 but I cannot find any links to it – I think it was in the hotel where the IRA tried to assassinate Margaret Thatcher, although I note that RM’s conference was in a different hotel).  Instead we went to the one in Atlanta (come to think of it, I believe I attended a Comshare conference there as well – maybe 1991?  1993?  I can’t remember for the life of me) and it was absolutely brilliant.

Why?  I find technology just fascinating (erm, you would imagine that based on this blog) and the technology presented at this level is all new to me.  Fascinating.

NB – For those of you who think this is a sycophantic recounting of their conference, it isn’t.  I’m pretty hard to impress and I was just blown away by the conference.  Think of this post as enthusiasm, and well deserved at that.

NB #2 – Mark went out of his way to let the two token Essbase guys feel at home.  Many thanks, Mark.

Day by day

Day 0

The conference started out with a master class on Oracle Big Data.  We’ve all heard a lot of hype and hyperbole about Big Data.  I have to say that after watching Jordan Meyer and Mark present on it, I think Big Data might just deserve it all.  I also have to admit this is the first time I’ve had a bird’s eye view of R, EDA, Oracle Big Data SQL, and I-don’t-know-how-many-other-technologies, but the kinds of data, the statistical analyses, and the reporting just blew me away, as did the rest of the conference.

Day 1 & 2

Rather than try to give my impressions of each session – which would be difficult ‘cos I was mostly completely at sea, as the conference covered subjects and technologies that are not in the toolkit of at least this Essbase hacker – I’ll list them all and try to give you a feel for my overall impression of the conference.

The list


  • Brian MacDonald or Chris Lynskey, Oracle Corporation : “Looking Ahead to Oracle BI 12c and Visual Analyzer”
  • Robin Moffatt, Rittman Mead : “Smarter Regression Testing for OBIEE”
  • Bill Creekbaum and Gabby Rubin, Oracle Corporation : “Solid Standing for Analytics in the Cloud”
  • Hasso Schaap, Qualogy : “Developing strategic analytics applications on OBICS PaaS”
  • Tim German / Cameron Lackpour, Qubix / CLSolve : “Hybrid Mode – An Essbase Revolution”
  • Kevin McGinley, Red Pill Analytics : “Agile BI Applications: A Case Study”
  • Stewart Bryson, Red Pill Analytics : “Supercharge BI Delivery with Continuous Integration”
  • Ryan Stark or Chris Lynskey, Oracle Corporation : “Big Data Discovery”
  • Andy Rocha & Pete Tamisin, Rittman Mead : “OBIEE Can Help You Achieve Your GOOOOOOOOOALS!”
  • Christian Screen, Sierra-Cedar : “10 Tenants for Making Your Oracle BI Applications Project Succeed Like a Boss”
  • Sumit Sarkar, Progress Software : “Make sense of NoSQL data using OBIEE”

All of the sessions were good (somehow I think even the session Tim and I presented on Hybrid was at least adequate – sorry Tim), but the ones that I found most fascinating were the conceptual ones that could apply to EPM as well.  Robin’s presentation in particular really hit home as it captured the kind of structure and rigor that we should be bringing to testing in EPM projects.  Stewart also almost made me believe in Agile, which is really quite an accomplishment.  Even the more BI-specific technologies and approaches were absolutely fascinating.

A data visualization bake-off let BI geeks take a known dataset around public school classroom projects that need funding and present their solutions at the conference.  Sorry, I can’t tell you who won as I was continuing my OMG-this-is-so-f’ing-cool bit of shock and awe, but both of the solutions were really great.

There was also a debate on self-service BI vs. the traditional IT-oriented and controlled top-down BI.  Oddly, there was quite a bit of discussion about buffets vs. Michelin three-star restaurants (it was a metaphor for the different kinds of BI but it all left me a bit peckish).  I have to say that I was on the side of the former – this is where Essbase got its start and where my heart with the product still is.  Empowering people to take control of their data is one of the things that makes Essbase so great.  Self-service won.  :)

In action

If you want to download the presentations, have a look here.

Want to see us in action?  Check us out on Flickr.

One other thing

This sounds kind of ridiculous, but the food at this conference was just fantastic.  Perhaps that’s down to the hotel, or perhaps to the small size (60-ish attendees), which allowed a different kind of buffet.

Wrapping it all up

Will I go again?  If I can afford it (remember, yr. obt. svt. funds himself) yes, I will.  It was that good.

Thanks again to RM for inviting Tim and me to present; hopefully we didn’t let the Essbase side down.

Be seeing you.

15 May 2015

EPM documentation – The little things mean a lot

The problem

If you’ve ever needed to send a link to a particular subject within the Essbase Technical Reference (henceforth called Tech Ref ‘cos I am lazy), you may have had a frustrating experience.  Why?

Copying and pasting said links doesn’t work

What do I mean?  Watch and (not) marvel.

Filter MDX function documentation


Yr. obt. svt.’s attempt to copy and paste

The copy

The paste

Alas and alack

We’re right back where we started from, as you can see.

Getting round this

I whined (whinged for you Commonwealth readers) about this to the EPM Information Development team and they, perhaps in an effort to get me to shut up or perhaps in an effort to help the general EPM community out – I prefer to think of the latter but the former is likely the case – showed me how to get to the right place.

It’s just this little chromium switch

Instead of copying the frame URL, look at the frame’s information.  Here are the steps in Firefox: right-click within the frame, choose This Frame, and then View Frame Info – the Address field in the resulting dialog holds the real URL.

Here’s looking at you, kid

Grab it and go

This copy is going to result in the URL http://docs.oracle.com/cd/E57185_01/epm.1112/essbase_tech_ref/mdx_filter.html.  Could this be right?

Oh yes it is

Isn’t that nice?

So is this important?

If you share links from EPM documentation, yes, absolutely.  I know it seems trivial but I think each and every one of us Essbase Hackers has been frustrated by this.

Again, my thanks to the great people at the EPM documentation team for showing how this works.  If you’re not taking advantage of their offerings beyond the base documentation page, you really ought to.

Here they are on YouTube, Google+, Twitter, Facebook, and LinkedIn.  You can’t say they aren’t trying to reach out to us.   

Be seeing you.

07 May 2015

Stupid Programming Trick No. 24 – Sub Vars to tables via Calc Scripts

Introduction

This is another in a series of good ideas that have been shamelessly stolen from, er, inspired by Martin Slack.  I’m learning an awful lot from Martin; I’m not entirely sure that he is from me.  Such are the fortunes of war.

The problem

I needed a way to get the current month value from an Essbase substitution variable so I could do all sorts of weird and wonderful data integration things with it.  How to do it?


The idea that springs to mind for most of us (or at least yr. obt. svt.) is some way of getting it out of MaxL using the display variable syntax, piping it to a log file, and then facing the somewhat unpleasant task of reading in a text file, parsing it via some language (VBScript, Perl, PowerShell, etc.), and then from that getting it into a table.
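
If you’re curious, that rejected approach looks something like the MaxL sketch below – the credentials, server, log path, and database name are all placeholders, so adjust to taste:

/* Hypothetical MaxL – log in, capture statement output, list the sub vars */
login admin password on localhost;
spool on to 'c:\\tempdir\\subvar.log';
display variable on database Sample.Basic;
spool off;
logout;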

Ugh.

Stolen glory

Martin said something to the effect of, “What if you could just get it from a calc script?”  A calc script?  Hmmm.  Calc scripts, even using DATAEXPORT, write data, not metadata.  Or do they?

The Stupid Trick its own self

DATAEXPORT is an aptly named command – it exports data.  But what if there was no data to export?  Even more useless, right?  Here’s the Stupid Trick bit:  this is exactly what is needed – no data is fine because we’re going after metadata.  So how does metadata come from data?  And how does it get into a relational table?  Read on, Gentle Reader, and all will be revealed.

It’s all in your head in four parts

Part the first – the outline

Using my VFEDITWWW, I’ve added members that, at least in this blog, can never have data.  To wit, I have added the members No Measures, No Product, and No Market.  You will likely have different requirements unless you are The Beverage Company.


Part the second – the calc script

This member combination now gets exported to disk.  Ah, you say, yet another worthless bit of programming drivel from Cameron.  You’re wrong, for once, to think that.

Looking at the code below, note the structure of the export, with the DataExportColHeader option set to Year.  When that is combined with an export of the Members That Can Never Have Data, look at what gets exported.
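
The script itself was a screenshot in the original post; reconstructed from the description, it would look something like this – the substitution variable name &CurrMonth, the tab delimiter notation, and the output path are my assumptions:

SET DATAEXPORTOPTIONS
{
    DataExportColHeader "Year";
    DataExportOverwriteFile ON;
};

/* With the Year members as the column header and a FIX on the Members
   That Can Never Have Data plus the sub var, no data rows get written –
   but the column header row (the current month) still does */
FIX ("No Measures", "No Product", "No Market", &CurrMonth)
    DATAEXPORT "File" "\t" "c:\tempdir\Current_Period.txt";
ENDFIX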

Part the third – the output

Brilliant, isn’t it?  And I mention that because the brilliance is all Martin’s.

I now have an output file with just the current month and nothing else.  This is a lot easier than trying to run a MaxL file and then parse the output.  But how to get that into a relational data store?
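
For those reading along without the screenshots, the exported Current_Period.txt holds a single quoted line, as the BULK INSERT result further down confirms:

"Oct"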

Part the fourth – the SQL

The last problem now facing this approach is getting the value of Oct into a relational database.  Again, there are all sorts of complex and sophisticated ways of doing this – ODI or SSIS spring to mind – but there is a significantly easier way, one that can even be done within a SQL query.

Say hello to your new friend BULK INSERT

In SQL Server (in Oracle there is a concept called external tables that does much the same thing), there is a command called BULK INSERT.  While it is mostly used for doing just what it sounds like – a bulk insert of lots of data – in fact it can be used to bring any amount of data into a database from a file.  And if it can do that, and there is a file that has the current month, connecting the dots is pretty straightforward.

Here’s what my code looks like:
BULK INSERT Current_Period FROM
    '\\epm11123\C$\Tempdir\Current_Period.txt'
    WITH
    (
        CODEPAGE = 'ACP',          -- the ANSI code page
        DATAFILETYPE = 'char',     -- plain character data
        FIRSTROW = 1,              -- read the first row...
        LASTROW = 1,               -- ...and nothing but the first row
        FIELDTERMINATOR = '\t',    -- DATAEXPORT's tab delimiter
        ROWTERMINATOR = '\n'       -- DATAEXPORT's line feed
    )
One consideration is that there’s no way to map the columns in the data source to the target.  Instead BULK INSERT is positional, e.g., the first field in the source file is loaded into the first field in the target table.  This is different from INSERT INTO…SELECT FROM syntax which allows an alteration of field order.
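
For contrast, here is what the column-mapped flavor looks like – the staging table and column names are made up purely for illustration:

-- Hypothetical: INSERT INTO...SELECT maps columns by name, not position
INSERT INTO Target_Periods (Period_Name, Load_Date)
SELECT Month_Name, GETDATE()
FROM Staging_Periods;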

I used UNC naming not because I had to – c:\tempdir would have worked as well – but because BULK INSERT reads the file from the SQL Server machine’s point of view: a remote file must be referenced by its UNC name, not a mapped drive letter, so it’s best to just get in the habit.  Also, this continues the tradition of Martin + UNC so what’s not to like?

Lastly, I defined the file by specifying the character set, the file type, the first and last rows, the tab field terminator, and DATAEXPORT’s line feed row terminator.  For giggles I threw in the first and last row settings, but really everything other than the CODEPAGE and DATAFILETYPE settings is just a belt for the braces.

What does it look like?

Pretty good, actually, but there is a fly in the soup:

Do you see the issue?  The text file’s contents are “Oct” – double quotes included – and that is exactly what BULK INSERT imports.

There’s a pretty good chance that the double quotes are superfluous; the REPLACE function ought to take care of that:
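
The query was a screenshot in the original; assuming the table’s lone column is also named Current_Period, it is just:

SELECT REPLACE(Current_Period, '"', '') AS Current_Period
FROM Current_Period;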

There’s only one more issue to fix

This query will be run more than once, and that’s a problem as can be seen below when I ran the query twice:

The issue is that the query as it stands doesn’t clear out the table before inserting the Current_Period.txt file.  My brilliant data integration code expects one and only one record, so perhaps it is not all that brilliant and will thus go KABOOM when two records are encountered.  Even worse, this table could have multiple values, e.g., if this were run in both September and October the first record would be Sep, which isn’t the current month.

The solution is to run a TRUNCATE TABLE first, then the BULK INSERT, and finally the SELECT REPLACE command as below:
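
The finished script was also a screenshot; reconstructed under the same assumptions as above, it is:

TRUNCATE TABLE Current_Period;    -- start empty so one and only one record survives

BULK INSERT Current_Period FROM
    '\\epm11123\C$\Tempdir\Current_Period.txt'
    WITH
    (
        CODEPAGE = 'ACP',
        DATAFILETYPE = 'char',
        FIRSTROW = 1,
        LASTROW = 1,
        FIELDTERMINATOR = '\t',
        ROWTERMINATOR = '\n'
    );

SELECT REPLACE(Current_Period, '"', '') AS Current_Period
FROM Current_Period;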

Now no matter how many times I run this, I get one and only one result.  Huzzah!

So what do we have?

A combination Stupid Trick that pulls out metadata, not data, via DATAEXPORT and then painlessly imports an evaluated Essbase Substitution variable into a table with nary a hint of MaxL script output or complex data import to SQL.  Take this query and stick it into a stored procedure, an ODI variable or procedure, etc.  The sky is the limit with this and it is dead easy.

Thanks again, Martin.  You keep thinking them up, I keep stealing them, and world+dog gets the benefit.  

Be seeing you.