Configuring Single Sign On for Cognos Express

I've seen people having issues with configuring SSO for CX a few times already, so it's probably worth putting this quick tip up here.
If you followed the mighty document called "IBM Cognos Proven Practices: Enabling Kerberos SSO in IBM Cognos Express on Windows Server 2008" and Single Sign-On still doesn't work, think about how much you actually need Kerberos authentication. If you, like me, don't really care about the canine family that much, you can probably use something called REMOTE_USER for authentication instead. It's (potentially) less secure, but you don't need to set up Kerberos delegation (which can be a lot of tedious work and troubleshooting). Most of the Cognos BI sites I know use it, because it's easier to maintain.

How to do it? Cognos Express is basically BI + TM1 in one integrated box, so we just need to follow the REMOTE_USER setup instructions for the BI part.
1) Navigate to <your CX installation folder>/configuration
2) Run the cogconfig.bat file; the familiar Cognos BI Configuration utility will launch
3) Apply the changes from the link above to the Active Directory namespace: add the singleSignOn=IdentityMapping option and enter binding credentials (any domain user will do)
4) Save the configuration and (important!) do not start the Cognos BI service; just close the Cognos Configuration utility

Restart your Express service and everything should work.

If you start the Cognos BI service in step 4), you'll end up with a newly registered, non-working Cognos BI service that you can safely remove later.

Cognos BI JavaScript: Add a Listener to Tab Selection and All Sorts of Prompt Manipulation

Time for another multi-page-dashboard-related JS snippet. It's fairly complex (I'm not a JS programmer, so I'm biased) and works for 10.1 (and should work onwards and backwards, since I don't rely on Cognos prompt syntax). The story behind it is quite simple: we started building dashboard pages as a global prompt on top and multiple report frames below. All went well up until the point when the global prompts had to change depending on the selected tab (some versions available only on certain tabs, some have to be renamed, etc.).

So this script adds an onclick function call to every tab name, so when a user selects a tab you can do additional processing, for example show/hide prompts or add/remove prompt values. I'm manipulating prompts as basic HTML select objects, so for now it only works for drop-down prompts.

I generally try to avoid JavaScript at all costs, so after looking at the resulting code for a day or two, we rolled up our sleeves and "moved" the global prompts into the detail reports. Way easier to understand and maintain, and no JavaScript is used anymore. JavaScript is the dark side of the Force; use it wisely.

I'm leaving some tab names in the script as an example.
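What follows is not the original script but a minimal sketch of the same approach, with made-up tab names ('Sales', 'Inventory') and a plain HTML select standing in for the drop-down prompt:

<script type="text/javascript">
// Sketch only: attach an extra onclick call to every tab link and react to the selected tab.
// The tab names and the way the prompt is located are hypothetical examples.
function onTabSelected(tabName)
{
    // drop-down prompts are rendered as plain HTML select elements
    var versionPrompt = document.getElementsByTagName('select')[0];
    if (!versionPrompt) return;

    // hide the prompt on tabs where it makes no sense, show it everywhere else
    versionPrompt.style.display = (tabName == 'Inventory') ? 'none' : '';
}

function addTabListeners()
{
    var allElements = document.getElementsByTagName('a');
    for (var i = 0; i < allElements.length; i++)
    {
        // tabs are rendered as links with role="tab" in Cognos 10
        if (allElements[i].outerHTML.indexOf('role="tab"') !== -1)
        {
            (function(link, tabName)
            {
                var existing = link.onclick;
                link.onclick = function()
                {
                    if (existing) existing.apply(this, arguments);
                    onTabSelected(tabName);
                };
            })(allElements[i], allElements[i].innerHTML);
        }
    }
}

addTabListeners();
</script>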

Learning@Coursera: Data Analysis

I've recently taken my first online course: Data Analysis from Coursera. I enrolled while playing with SPSS and, as usual, it started in the middle of working madness.
There's a nasty part of me that believes that piling things higher actually means more gets accomplished, and that part is really bad at learning from experience. So 10+ hours at work and then a couple of hours of stats almost killed me. Of course, I didn't complete the course (I quit at week 7 of 8 with a 100% test score) due to my perfectionism: I had to submit 2 assignments and couldn't sign my name under whatever I could come up with in a couple of hours of conscious effort.

Other than that, the course was brilliant! It was good fun to play with R (I clocked my first 1k lines in a new language), to reread some familiar material, read a lot of new stuff (I underestimated linear regressions before this course) and play with interesting data. And the assignments were fantastic: how about trying to reconstruct a credit risk scoring model from a loans dataset as the first assignment, or predicting what a person is doing given 500+ spatial measurements from their smartphone's GPS as the second?
The course itself was very intense, nowhere near the estimated workload of 3-5 hours per week. I spent around 6-8 hours weekly, and that was without assignments, just lectures and tests (each took at least an hour); judging by the complaints on the forums, that was pretty low, people casually mentioned 20 hours.

I had a "how hard can that be" approach to online learning before, but now I'm pretty convinced that the answer is "hard enough to make you regret enrolling" ) The tests were sets of programming assignments (a very clever idea) like "download this data set, perform regression / clustering / whatever, apply the model to this set: what's the error rate in the third element from the bottom?" Ten such questions and you revisit all the lectures, reread a few textbooks and sift through a pile of R examples.

The dust is settling and I really do think it was a great learning experience and a very good chance to reassess what I know about stats and data analysis.
Will I do other courses in the future? I sure will.
Will I repeat this one next year? Only if I have a vacation scheduled to tackle the assignments. ;)

Cognos BI JavaScript to select a tab in a multi-page dashboard

Overwhelmed with work but still alive, and here's some proof: a quick JavaScript recipe that lets you add hyperlinks to select tabs in multi-page dashboard reports. Something like a multi-page tab drill-through ) I googled around and didn't find anything copy-ready, so here it goes.

 

We want to have a link like in the picture below; clicking on it should transfer us to the Sales tab.

How to do this:

1) Add the following JavaScript to the page:

<script type="text/javascript">
function clickTabByName(tabName)
{
    // all links on the page; tabs are rendered as links
    var allElements = document.getElementsByTagName('a');
    for (var i = 0; i < allElements.length; i++)
    {
        // keep only the links with role="tab"
        if (allElements[i].outerHTML.indexOf('role="tab"') !== -1)
        {
            // check the tab name
            if (allElements[i].innerHTML == tabName)
            {
                // alert(allElements[i].innerHTML);
                allElements[i].click();
            }
        }
    }
}
</script>

2) Add a new hyperlink object to the report with the following URL:

javascript:clickTabByName('Sales');

Tested only on 10.1.1 FP1, but it should work in all 10.x versions. I'm relying on the role="tab" attribute that Cognos 10 (the Dojo framework) generates for tab names; this could be different in 8 or CRN (but I doubt it and am too tired to check).

Update 11/06/2013:
I was asked for a Cognos 8 version (the one above is 10+ only), so here it goes. Only one line (the one that filters tabs out of the other links) is changed:

<script type="text/javascript">
function clickTabByName(tabName)
{
    // all links on the page; tabs are rendered as links
    var allElements = document.getElementsByTagName('a');
    for (var i = 0; i < allElements.length; i++)
    {
        // in Cognos 8 all tabs have the href="javascript:noop()" attribute
        if (allElements[i].outerHTML.indexOf('href="javascript:noop()"') !== -1)
        {
            // check the tab name
            if (allElements[i].innerHTML == tabName)
            {
                // alert(allElements[i].innerHTML);
                // alert(allElements[i].outerHTML);
                allElements[i].click();
            }
        }
    }
}
</script>

2012 results

I've long avoided it, but let's try a "results & plans" post for 2012. I hoped to publish it over the Xmas break, then on New Year, then on Old New Year, and then just thought that "done is better than good" and decided to post it anyway.

2012 was a busy year (although I really can't remember a non-busy year since I started working, seems like a life attitude); changing countries and starting in a new market was a good challenge.

 

Project stats:

* 2-3 (depending on the way you count) Enterprise Planning projects. Controller integration for one of them, deciphering PAD contents for another. Using dbms_lob.instr and substr is now a proven way to do dependency analysis in EP. Wish I didn't know that;

* A full-blown Cognos BI dashboard with a bit of PL/SQL in the background, multiple sources / countries / iterations; all in all long and tedious, and the last project I'll ever do without signed requirements upfront;

* A very interesting, although unexpected, SPSS gig. Unfortunately, we only did the first 2-week step (though I even got a small working model in that timeframe) and it didn't turn into anything more serious. Reading tons about how to approach the problem (and about the problem itself) and toying with SPSS and Weka (I used it to compare SMOTE results with SPSS) was really a cool experience, and I'm disappointed it didn't grow into a proper project. I still think that using data mining requires a tremendous level of data literacy and maturity from an organisation (or a well-specified and isolated business domain), so such projects are pretty hard to come by;

* A very, I would not say big, but complicated (with a bit of overkill to it) DWH with Cognos BI that we're now supporting / enhancing. With DataStage as the ETL tool and Excel files as the source, there are tons of things you could do better. I strive to design systems so that the poor sods supporting them don't have to handle things like multiple report copies for security. And there's a lot of "proper process", with deployment committees and proper documenting. Real fun, as you can imagine;

* A couple of TM1 gigs. There are a lot of TM1 gurus around, so I'm only helping a bit. The TM1 stress testing tool got a few updates and TM1MN was released in 2012. And some mind-numbing Excel exercises;

* And a recently started DWH + BI project with great potential. The most thorough approach to a project I've encountered so far, and the kind of proper documentation every consultant can only dream of. There's both a set of complicated reports (with an iPad version, all bells and whistles) and the start of a proper DWH, a really interesting one. Let's see how it goes; very exciting so far.

 

The so-famous “other” category:

* Attended a cool IBM BI Forum in Melbourne; the sessions I visited were top level and I managed to learn a few new names and shake a few hands. It's a pity there won't be one this year, I was getting to like it. Passed the TM1 Master certification test there; no more TM1 certs out there for me;

* Started working with tools from BSP and Apparo, very useful stuff (deserving a separate post);

* Jira and Confluence, which I stubbornly rolled in during my second working week, are starting to pick up at PMSquare. Good old fun, the same uptake story as at Croc: Jira is more useful from the start, the wiki needs a lot of nurturing;

* Finally got into the cloud story with Amazon EC2. Converted all PM2 machines into EC2 instances, so we don't have a physical server anymore (not that there was a lot to start with);

* Obviously, there was a lot less blogging than I wanted.

 

Plans for 2013:

* More BI & DWH work and, equally important, posts on this blog. There's a 2-year-old draft about common DWH modelling techniques that I use, and that's just the beginning. For some reason there are only a few DWH/ETL related posts on this blog, and I really like doing these things. Maybe that's because the books are enough and it's pretty hard to add anything?

* More "advanced" stuff like SPSS or other "data science". I'm looking at all the hyped Big Data (IBM BigInsights, for example) and social media analytics and keep thinking that I have quite a nice background for this (I did my PhD prototype using Hadoop, Cognos is what I do daily, and SPSS is something I like and have enough stats background for). But I'm still sceptical about the whole field and the way it's sold right now. Let's see, maybe by some odd chance I'll find something like that on my plate.

TM1 Active Forms Excel Bursting — Cannot Empty Clipboard Excel Error

When you're TM1-to-the-bone and somebody says "and then we need to send these reports out", you think "Well, that's easy, it's just an Active Form and a bit of VBS to convert TM1 formulas to values, pack and send". And it's all a nice and straightforward journey up to the point when you hit refresh in an AF and a "Cannot Empty Clipboard" message box pops up. No worries if you're refreshing something manually, just click OK and off it goes, but it's a complete showstopper for any automated update script (triggered by TIs in my case). I'm really interested whether somebody else has seen the same error.

 

I tried all possible Excel versions (2003, 2007, 2010) and got this error occasionally (refreshing 2 AFs at the same time would cause it almost certainly). TM1 was fixed at 9.5.2 FP2 and we couldn't upgrade, so it might be solved in later versions of the add-in.

Converting TM1 formulas to values is well described on the TM1 forums and bihints, but I found that the traditional "loop through every cell, look at its formula and store its value if it's a DBRW or anything else TM1-related" approach is terribly slow in big Excel reports, so I switched to "paste the whole sheet as values" (processing time dropped from 10 minutes per report to 20 seconds).
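For what it's worth, a rough JScript (Windows Script Host) sketch of the paste-as-values part only; the script name and workbook path are hypothetical and this is not the production code (the TM1 refresh itself isn't shown):

// paste_as_values.js - run with: cscript //nologo paste_as_values.js C:\reports\report.xls
// Sketch only: replaces every formula in the workbook (DBRWs included) with its current value.
var path = WScript.Arguments(0);
var excel = new ActiveXObject("Excel.Application");
excel.Visible = false;
excel.DisplayAlerts = false;                   // suppress clipboard / overwrite prompts
var workbook = excel.Workbooks.Open(path);
try {
    for (var i = 1; i <= workbook.Worksheets.Count; i++) {
        var sheet = workbook.Worksheets(i);
        sheet.UsedRange.Copy();                // copy the whole used range in one go
        sheet.UsedRange.PasteSpecial(-4163);   // -4163 = xlPasteValues
        excel.CutCopyMode = false;             // let go of the clipboard
    }
    workbook.Save();
} finally {
    workbook.Close(false);
    excel.Quit();
}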

The paste-as-values approach contains a nasty nuance (did I mention I "love" Excel already?): TM1 loves to store "almost-zero" values with very high precision (Enterprise Planning loves to do the same in publish containers; you end up rounding to some meaningful 6-7 digits). If you've ever seen a negative zero, that's it. Negative zeros pasted as values cause a "potential data corruption" error when opening the worksheet. Setting "Precision as displayed" (http://support.microsoft.com/kb/214118) helps.

Back to the clipboard errors: they appear when refreshing data from TM1, and even switching from TM1REFRESH (which refreshes all worksheets in the workbook) to per-worksheet refreshes (there's an undocumented function for it) doesn't make them go away. IBM recommends cleaning the clipboard (http://www-01.ibm.com/support/docview.wss?uid=swg21362037) before refreshing, but it's a useless approach: user32.EmptyClipboard empties the system clipboard, which has nothing to do with the Office Clipboard that causes this error.

At the end of the day, I just wrote an overseer script that runs the script updating the Excel report, grabs the Process Identifier (PID) of the started Excel process and monitors its completion within a given timeframe. If there's a clipboard error, Excel "hangs" and is restarted after the timeout. Tested it with lots of parallel updates and was amazed at the cruel "clipboard contention" scenarios. But that's really a sledgehammer solution, and I don't like going down to OS-level programming (PIDs, reliable inter-script communication, etc.) for such a menial task.
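Roughly, the overseer looks like the sketch below (again JScript / WSH, and simplified: it watches the child script's status rather than the actual Excel PID; the script name and timeout are hypothetical):

// overseer.js - run with: cscript //nologo overseer.js
var TIMEOUT_SECONDS = 600;                                      // give the refresh 10 minutes
var shell = new ActiveXObject("WScript.Shell");
var child = shell.Exec("cscript //nologo refresh_report.js");   // the script that refreshes the Active Form

var waited = 0;
while (child.Status === 0 && waited < TIMEOUT_SECONDS) {        // Status 0 = still running
    WScript.Sleep(1000);
    waited++;
}

if (child.Status === 0) {
    // Timed out: assume Excel is stuck on the clipboard error, kill every EXCEL.EXE we can find,
    // then the refresh can be relaunched.
    var wmi = GetObject("winmgmts:{impersonationLevel=impersonate}!\\\\.\\root\\cimv2");
    var procs = new Enumerator(wmi.ExecQuery("SELECT * FROM Win32_Process WHERE Name = 'EXCEL.EXE'"));
    for (; !procs.atEnd(); procs.moveNext()) {
        procs.item().Terminate();
    }
}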

PS: Honestly, just use the Cognos BI bursting functionality, it's way easier )

Update 23/01/2013: Looks like Clipboard errors are mitigated / fixed in 9.5.2 FP3 — http://www-01.ibm.com/support/docview.wss?uid=swg1PM77007

Splunk for Cognos?

I was reading about Splunk the other day. I heard about it a few years ago, and a recent post by Altis Consulting made me revisit their site. Basically, it's a log aggregator on steroids with search capabilities and nice dashboards. And it's Big Data, heh )

 

Given that it wouldn't be so hard to add an app / reader for Cognos BI logs (we all know their structure, right?), and there are already apps for JMX, J2EE servers, Windows and *nix servers, you could get real-time stats on:

    Cognos errors (for example, BIBus processes terminating)
    CPU and memory stats
    JVM metrics
    Tomcat / WAS stats

on all servers in your Cognos cluster.
And they'll be available for post-error forensics. And there's a real-time monitoring dashboard in your browser. Then I thought: that would look a lot like Accelatis )

Adding TM1 logs shouldn't be a huge issue either. And if you could tap into Operations Console stats (possible with the Java API, I would guess), that would close the loop for TM1 as well. Just imagine: client operations and CPU / memory usage on the same screen. And spanning back a few days to avoid the "it was working better yesterday" conundrums )

PS: I once spent a few weeks tying nmon logs from an AIX box to Cognos BI logs and adjusting settings on both for maximum performance, and I would've loved something integrated back then.

Using Amazon EC2

Virtual machines are the greatest thing since sliced bread and most of us techies use them daily. In this post I'll talk about using Amazon EC2 as a VM hosting platform (as a company we moved everything up there over the last 4 months) and, as a bonus, describe the process of converting existing VMware Workstation images to EC2 instances.

There are so many ways virtual machines (VMs) help you if you're a consultant:

  • you can configure the same environment as on the client site,
  • run through product updates & play with new features without screwing up any physical boxes,
  • be able to undo any changes (Ctrl-Z for a whole server),
  • run trainings on a "definitely working" image,
  • configure a "vanilla" demo server and use it for all your "show & sell" meetings. I remember the time we had special "extra-powered" laptops for that. Screwing with those laptops was severely punished (by having to reinstall everything and getting your share of mockery).

 

Anyway, you all know that and are surely using lots of VMs in your daily life. Using VMs leads to quite a few problems, though: lack of processing power and lack of hard disk space.

In a small company (and in a big one as well), it grows quite painful over time. You can do a lot with modern laptops (mine has a quad-core i7, 16GB of RAM and fast disks), but that's really a single-user story; you can't do proper development like this. You usually have a few powerful servers that you use as VM hosts, and most of the time they look like this:

Although there are probably a lot more sucker fish (USB portable HDDs) attached to them )

 

And then you need "just another VM" for "another important project" and you're running around trying to figure out which VM you can stop and copy to an external drive, because there's no more disk space. Rings a bell?

Or one of those big boxes collapses and all hell breaks loose.

 

The consulting VM usage pattern is quite different from the normal infrastructure story: you usually need quite powerful machines for relatively short periods of time (development), but you have to be able to put them back online quickly if something needs investigation. Which makes Amazon's pay-by-usage model quite a good idea.

Enter Amazon EC2 (or any other cloud provider, to be frank)

You've probably heard about Amazon EC2, and if not, just read one of these links (official, wikipedia). In general, it's a utility computing cloud where you can rent a server at quite low cost on a per-hour basis. I used Amazon EC2 for testing the Hadoop-based OLAP implementation I built for my PhD almost 5 years ago, and they've since added Windows support and the ability to import VMware images.

Pros of using Amazon EC2:

  • you can add more servers on the fly; launching a new one from scratch takes about 15 minutes,
  • you can choose and change the "instance size" of existing ones, like adding 64 GB of RAM for a day to test out a few TM1 server issues and then reverting back to 16 (see the sketch after this list),
  • automated VM backups and no more "part-time administering a server in the closet" (after a while it's not fun anymore and your time becomes too valuable),
  • snapshots that let you restore VMs to a selected point in time, or just start new VMs from a snapshot. This turned out to be quite useful when I branched off a copy of the DataStage server to test IBM patches on and kept it running in parallel while we tested all the support suggestions,
  • cost reduction, because you're never using servers 24*7 in consulting; it's more "burst while developing, gather dust while supporting".
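We mostly drove this through the AWS console and the standard command-line tools, but just to show how scriptable it is, here's a small sketch using the AWS SDK for Node.js (the AMI id, region and instance types are placeholders):

// launch_dev_box.js - illustration only, not our actual tooling
var AWS = require('aws-sdk');
var ec2 = new AWS.EC2({ region: 'ap-southeast-1' });

// Spin up one development server from a prepared image
ec2.runInstances({
    ImageId: 'ami-00000000',       // placeholder AMI with the BI / TM1 stack pre-installed
    InstanceType: 'm1.large',
    MinCount: 1,
    MaxCount: 1
}, function (err, data) {
    if (err) { return console.error(err); }
    console.log('Started instance', data.Instances[0].InstanceId);
});

// Resizing an existing (stopped) instance is a similar one-liner:
// ec2.modifyInstanceAttribute({ InstanceId: 'i-00000000', InstanceType: { Value: 'm2.4xlarge' } }, callback);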

Cons:

  • don't expect too much performance-wise; Amazon CPUs are a lot slower than "physical" ones. There are instance types with decent allocated CPUs, but they aren't available yet in the regions we're using,
  • network configuration issues. Each EC2 instance gets a dynamic IP on launch, so if you want a demo box with properly configured Cognos Mobile, you'll need an Elastic IP and a bit of configuration,
  • bandwidth. Really a pain in AU; uploading / downloading VMs can hurt. We started with Singapore as the region (it's our second office anyway), so all uploads went through my NBN connection and still took a whole night per VM. There's a Sydney region now (ay!), haven't tried it out yet.

Overall, after 4 months of usage, the impression is quite positive. We're using EC2 for a fair share of development and all training gigs (you don't copy and leave a VM behind after training, performance is stable, and there's no running around with HDDs).

 

Bonus: how do you convert an existing VMware Workstation VM to EC2?

It's not a straightforward process and it involves command-line manipulations (scaaary). There's built-in VM conversion for ESX server VMDKs, but not for Workstation / Fusion ones yet.

What I did was:

1) Convert the VMware image to VHD (Microsoft's VM format) using StarWind or VirtualBox

2) Upload the resulting VHD with the standard EC2 command-line tools

Downloading back is the same process in reverse: download as VHD, then import into VMware (built into Fusion).

 

PS: Our Amazon machines were recently attacked by Trojans, so be mindful, it's hard out there; bring band-aids and antivirus. Although it still puzzles me that somebody managed to attack a server with a dynamic IP that's available only 30 minutes a day and has just a couple of ports open )

Using Cognos Connection Jobs for Orchestration (with Cognos Enterprise Planning and Controller in the pit)

Most of the systems we implement don't live in a vacuum; they have to communicate with lots of neighbours. People often overlook the ability to call stored procedures from Cognos BI reports as a way of triggering an action, so I'd like to steal 5 minutes of your time to explain how to use it to build better systems.

Let's imagine a very common scenario: when budget data is "ready", it needs to be transferred to some other system. When a process is that user-driven (it takes a real financial analyst to estimate budget "readiness"; just looking at submission states doesn't work), I try to make its execution as simple as clicking a link on a web page. Clicking Next through multiple screens and copying text files around is out of the question.

If it's a major project with a designated ETL tool, I try to implement all the required workflow in the ETL tool and give users the ability to run ETL packages by clicking links. I usually put all these URLs in Cognos Connection to ensure there's a single entry point for the whole system.

But sometimes there's no ETL tool around, and in those cases Cognos Connection itself may be enough.

In a recent project I had to build a Cognos Enterprise Planning to Cognos Controller import flow. Each part of the equation may be substituted with something else in this example; the logic stays the same: we use Cognos Connection to define the workflow.

In this instance the Cognos Connection job would:

* run an Incremental Publish job to get the EP data into a readable format
* run a stored procedure call to load the data into Controller. To run a stored procedure, you add it as a query subject in Framework Manager; running any report on that query subject will then trigger the stored procedure execution
* (optionally) send an "all done" report to a group of users



For users it’s just a link that they can run directly or schedule for later.

And you get Cognos Connection's job security settings, run history and scheduling capabilities for free )

 

Bonus: Generalised controller import stored procedure, MS SQL Server


TM1 Stress Test tool gets an update

I've updated the TM1 Stress / Load Testing Tool over the weekend (version 0.2, probably).

It now handles memory a lot better and has a new parameter called seconds_between_actions in session configurations. The parameter is quite obvious: it says how long the coffee break between reading and writing bursts will be within each session. To make things less deterministic, the break is not exactly seconds_between_actions, but a random number that is less than seconds_between_actions. My coffee breaks never go for as long as I plan them, for sure.
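In JavaScript terms (the tool itself isn't written in JavaScript, this is just the idea):

// the pause between read/write bursts within a session
var secondsBetweenActions = 30;                          // value from the session configuration
var pause = Math.random() * secondsBetweenActions;       // anywhere between 0 and the configured maximum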