SAP BusinessObjects Monitoring – Part 3

The Monitoring Application is a new application in SAP BusinessObjects 4.0. You will find the application in the CMC under Manage. This is the 3rd, and final, article in the series on Monitoring. In Part 1, we covered how to set up the application. Make sure you do this before trying to use the application. In Part 2, we covered how to use the application. Now, in Part 3, we will discuss the creation of the Monitoring universe, so you can track the health of the system over time.

By default, the monitoring trending data are stored in four tables in a Derby (Java) database. However, with SP04, we now have the option to transfer this data to the Audit database. This provides several advantages:

  • The Derby tables will only store three months' worth of data. The Audit DB tables will store as much data as you want.
  • Connecting to the audit tables, to build a universe, is much easier than connecting to Java tables.

Therefore, it is recommended to migrate the data from the Derby tables, to the Audit database. Let’s talk about how to do that.

Migrating the Trending Data

The Trending database contains four tables:

  • MOT_MES_DETAILS
  • MOT_MES_METRICS
  • MOT_TREND_DATA
  • MOT_TREND_DETAILS

We’ll cover what’s in these tables later. Prior to migrating the trending data from the Derby database to the Audit database, these four tables need to be created in the Audit database. SAP provides the scripts to create the tables. They are on the BusinessObjects server in the following location:

<Install Dir>\SAP BusinessObjects\SAP BusinessObjects Enterprise XI 4.0\Data\TrendingDB
Choose the script that matches your Audit database platform. Then, using a tool that can access the Audit database, log into the database with an account that has full control, and run the SQL script. It will create all four tables, as well as the needed indexes for those tables.
Next, you'll need to export the data currently in the Derby tables to CSV files. It's pretty simple: Log into the CMC, and go to Applications –> Monitoring. Click on the button that says, "Export Data from Embedded database as CSV files". This will output a CSV file for each table, to the following location:
<Install Dir>\SAP BusinessObjects\SAP BusinessObjects Enterprise XI 4.0\Data\TrendingDB
Before you import the CSV files into the Audit database, you may need to turn IDENTITY_INSERT on for the four tables. The reason for this is that the tables have identity columns, with auto-generated values. Some databases do not allow the insertion of values into those columns. To turn on IDENTITY_INSERT, run the following command against the database, for each of the four tables:
SET IDENTITY_INSERT <TABLE NAME> ON
NOTE: IDENTITY_INSERT can only be turned on for one table at a time. You will need to turn it on for a table, then import the data to that table, then turn it off again before proceeding to the next table.
Once the data has been imported, make sure you turn IDENTITY_INSERT off again by running the following command:
SET IDENTITY_INSERT <TABLE NAME> OFF
Proceed to import the CSV files, one at a time, into the Audit tables, using the appropriate tool for your database (a SQL Server sketch of the per-table pattern follows the list below). It is recommended that the files be imported in the following order:
a. MOT_TREND_DETAILS
b. MOT_TREND_DATA
c. MOT_MES_DETAILS
d. MOT_MES_METRICS
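If your Audit database happens to be SQL Server, the per-table pattern might look like the following sketch. The file path is only an example, and the details should be checked against your environment. Note that BULK INSERT uses the KEEPIDENTITY option to preserve the exported identity values; SET IDENTITY_INSERT is what matters when your import tool generates plain INSERT statements instead.

-- Example for one table; repeat the same three steps for each of the four tables, in the order above.
SET IDENTITY_INSERT MOT_TREND_DETAILS ON;

BULK INSERT MOT_TREND_DETAILS
FROM 'C:\Temp\MOT_TREND_DETAILS.csv'  -- example path to the exported CSV file
WITH (FIELDTERMINATOR = ',', ROWTERMINATOR = '\n', KEEPIDENTITY);

SET IDENTITY_INSERT MOT_TREND_DETAILS OFF;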
Activating the Audit Database
The next step is to activate the Audit database for use with the Monitoring application. Start by editing the SBO file, adding the needed alias to it. You will find the SBO files in the following location on the server:
<Install_Dir>\dataAccess\connectionServer\odbc\<db Type>.sbo
Open the one that matches the Audit database platform that you are using. Here's an example of what to add for SQL Server:
<DataBase Active="Yes" Name="MS SQL Server 2008">
<Libraries>
<Library>dbd_wmssql</Library>
<Library>dbd_mssql</Library>
</Libraries>
<Parameter Name="Extensions">sqlsrv2008,sqlsrv,odbc</Parameter>
<Parameter Name="CharSet Table" Platform="Unix">datadirect</Parameter>
<Parameter Name="Driver Name">SQL (Server|Native Client)</Parameter>
<Parameter Name="SSO Available" Platform="MSWindows">True</Parameter>
</DataBase>
Refer to the Data Access Guide for more information on editing the SBO file(s).
Finally, log into the CMC, and go to Applications –> Monitoring, and change the Trending database from Embedded to Audit. You will need to restart the Monitoring service after making this change. The Monitoring service is located in an Adaptive Processing Server.
OK. You are ready to start creating the Monitoring universe.
Creating the Monitoring Universe
BI4 doesn’t come with a universe for the Trending database. So, you’ll need to create one. Unfortunately, finding information on the schema, such as a data dictionary, can be very challenging. Here is the basic schema for the Trending tables:
[Schema diagram of the four Trending tables]
Here's a basic description of what is stored in each of these tables:
  • MOT_TREND_DETAILS: Dimension table; stores information about metrics, probes, and watches
  • MOT_TREND_DATA: Fact table; collects trend data about metrics, watches, and probes, including DateTime and value (By default, values are collected every 15 seconds)
  • MOT_MES_DETAILS: Fact table; records information about threshold breaches and alert notifications
  • MOT_MES_METRICS: Dimension table; stores information about Watches and the metrics that are part of each Watch
From these tables, the following are the columns that you will most likely be interested in:
MOT_TREND_DETAILS
  • MetricName: Obviously, this is simply the name of the metric. This will be a dimension object in the universe
  • Type: There are three possible types of metrics: “Subscription”, “ManagedEntityStatus”, or “Probe”.
  • Name: If the Type is “ManagedEntityStatus”, this will provide the name of the Watch. Otherwise, it will simply contain the same text as the Type, in caps.

MOT_TREND_DATA

  • Time: This is the time at which the data was collected. It is a BIGINT or NUMBER or FIXED field.
  • Value: This is the value of the metric at the time it was collected.
  • MessageKey: If an error is thrown, this will contain the error message key. If it was successful, this will be null. For Watches, it can also contain either “watchEnabled” or
    “watchDisabled”.

MOT_MES_DETAILS

  • Time: This is the time at which the data was collected. It is a BIGINT or NUMBER field.
  • AlertType: This represents the notification delivery type. It is a SMALLINT or NUMBER data type.

MOT_MES_METRICS

  • Name: This is the name of the watch.

As you can see, the universe will be fairly simple. Of course, you may want to create some filter objects, so that you can do reporting based on specific historical periods, such as current month, current year, etc.
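To make that concrete, here is a minimal SQL Server sketch of the kind of query the universe would generate. The join column is a guess on my part, and I am assuming the Time column stores epoch milliseconds; verify both against your actual schema before building objects on them.

-- Hypothetical example: trend values for Probe metrics, current month only.
SELECT  d.MetricName,
        DATEADD(SECOND, CAST(t.Time / 1000 AS INT), '1970-01-01') AS CollectedAt,
        t.Value
FROM    MOT_TREND_DATA t
        INNER JOIN MOT_TREND_DETAILS d
            ON d.DetailId = t.DetailId  -- assumed join column; check your schema
WHERE   d.Type = 'Probe'
        AND DATEADD(SECOND, CAST(t.Time / 1000 AS INT), '1970-01-01')
            >= DATEADD(MONTH, DATEDIFF(MONTH, 0, GETDATE()), 0)  -- start of current month
ORDER BY CollectedAt;

A "current month" filter object in the universe would simply wrap that last condition, once the Time column has been converted to a date.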

I hope you have found this series as useful as I found researching and writing it. Let me know if you learn anything that I haven't covered here.

SAP BusinessObjects Monitoring – Part 2

In Part 1, we looked at setting up the new Monitoring Application in SAP BusinessObjects 4.0. So, now that we have the application set up, in Part 2, we will look at using the application. Monitoring is a Flash application (What were they thinking?) built into the CMC. So, to get there, log into the CMC, and click Monitoring from the main menu. There were some substantial changes made to the Monitoring application in Support Pack 4 (SP04). This article will be based on SP04, but I will point out what is different from the BI4.0 base install.

Dashboard

There are five tabs across the top of the application. The first tab is the Dashboard, so that’s what you see when you first launch the application. The Dashboard has five panes (four prior to SP04):

  • Overall Health: A single icon gives you a quick overview of the health of the system. When you mouse over it, it looks like you can click to drill into it, but you can’t.
  • Recent Alerts: We will talk about Alerts later, but this pane shows any Alerts that were triggered within the past 24 hours, so you can get a quick view of the recent health of the system.
  • BI Landscape: We’ll talk more about Watches later, but this pane includes a graphical or tabular view of the Watches within the system. You can drill into any of the Watches shown here to get more information.
  • Watch Graph: The Watch Graph is directly below the BI Landscape pane, and displays a graph for the Watch selected above. There are two views of each Watch. There is a Live view, which constantly updates with the status of the Watch, and there is a Historical view that allows you to view the status of that Watch over a historical time period.
  • KPIs: This pane displays three Key Performance Indicators for the system. You can click on any of them to view more details for Running Jobs, Pending Jobs, or User Sessions.

[Screenshot: Monitoring Dashboard, BI4 SP04]

Metrics

Metrics are individual pieces of information that contribute to the overall health indication of the system. They are items that are measured within the system. The system comes with over 250 metrics. In addition, you can create your own metrics, based on the built-in metrics. A few examples of standard metrics would include the following:

  • Number of Defined Web Intelligence Reports (I believe they meant documents, not reports)
  • Current Number of Tasks (Web Intelligence Processing Server)
  • Completed Jobs (CMS)

What you do with these metrics is completely up to you. You can use them in Watches and Alerts, setting thresholds for Warning and Danger alerts. The list of Metrics is quite impressive, and can generally tell you almost anything you want to know about the system.

You can view the current, or historical, value of a metric by selecting the metric in the left pane, then adding it to the right pane.

Note that the default view is a live view that is updated every 15 seconds. Click the Go to History button to view historical data. When viewing historical data, you can define the time frame by clicking the calendar buttons.

Watchlist

The Watchlist tab allows you to monitor the Watches in the system. What is a watch? No, it's not that thing on your arm that tells you the current time. 🙂 A Watch allows you to set thresholds for the Metrics, and determine what values for a Metric represent healthy, warning, or danger levels. For example, you may decide that when the disk space for the Input FRS reaches 70% capacity, you want to trigger a warning, and when it reaches 85% capacity, you want it to trigger a danger status.

There are a number of built-in Watches, which you will probably want to edit, changing the thresholds to levels that make sense for your environment. You can also decide which Watches will be a KPI (shown on the dashboard) and/or written to the trending database, allowing you to see historical data for that Watch.

If a Watch evaluates to True, what action do you want to see happen? This is called the Notification. And how long should the Watch evaluate to True before that action is taken? This is called the Throttle. For example, if the number of concurrent running jobs exceeds X, should the system send a notification immediately, or wait until it sustains that level for X number of minutes? These are options that you can set within a Watch.

Some of the options on watches include sending an email, or triggering a Probe. You can have an email sent in case of a Caution alert, or Danger alert, or both. To do this, click the Directory button next to the Notification Settings.

Probes

A Probe is an application that executes a workflow, mimicking the actions of a user. The purpose of a probe is to test user actions, and make sure that they work as expected. There are nine built-in Probes. Some of them can be run as is. Some of them will need additional setup before they can be run. For example, the CMS Ping Probe, which simply checks to see if the CMS is up and running, can be run as is. However, the Interactive Analysis Probe (which you should rename as the Web Intelligence Probe) needs to have the CUID of a Web Intelligence document, so it can use that document to test whether a document can be opened, refreshed, and saved in various formats.

You will need to modify the URL of the BI Launchpad Probe. Here is the correct URL:

http://<servername>:8080/BOE/portal (Make sure you replace <servername> with the name of your server)

For the Web Intelligence Probe, I would recommend using one of the Sample documents, built against the eFashion universe. This way, the probe doesn’t hit your production database, and the document runs quickly. Remember, the purpose of this probe is simply to make sure that users can open and refresh Web Intelligence documents.

The system allows you to register new Probes, either Java based, or Script based. You can also delete existing Probes that you don’t need.

Alerts

All of the Alerts that you set up in the Watchlist will show up here. This is simply a list of all alerts that have been raised within the system. You can click on an alert to see its details. Keep in mind that alerts can also be sent to email addresses. For example, you can set up an alert that emails you if it crosses one or both thresholds. This is set up in the Watchlist tab.

Conclusion

Using the Monitoring application can be a challenging task, given the number of built-in Metrics and the lack of training available from SAP. But don't give up. It's a powerful tool, with a lot of potential. If there is something that you want to do, and can't figure it out, comment here, and I'll see if I can help.
In Part 3, we'll discuss building a universe on the Monitoring trend data, so you can do historical reporting on the system's health.

SAP BusinessObjects Monitoring – Part 1

SAP BusinessObjects 4.0 includes a new application, available through the Central Management Console (CMC), called Monitoring. What is this new application all about? Well, the bad news is that the documentation and training available for this new application are virtually nonexistent. Aside from a brief section in the Administrator's Guide, there is no training or documentation available on how to use Monitoring. So, I've spent quite a bit of time trying to figure it out. This article is designed to document my learnings. This will be the first in a series of posts explaining how to set up monitoring, how to use monitoring, and how to create a universe for monitoring.

What is Monitoring?

Monitoring is a built-in application that allows administrators to monitor the health of the system. The most important aspects of Monitoring are Watches and Probes. Watches allow you to set thresholds for over 250 metrics within the system. You can be notified when these thresholds are breached. For example, you can have a Watch that will monitor the disk space consumed by the Output FRS, and be notified when that consumption reaches a specific point.

Probes are applets that perform workflows, just to make sure that the system is working as intended. For example, there is a probe that will open a Web Intelligence document, refresh it, export it as Excel, or PDF (or both), and then close the document. If any part of that fails, you can be notified.

Monitoring includes a dashboard that allows you to quickly see the health of the system. You can see the current, or the historical, state of the system.

Setting up Monitoring

The data for Monitoring is stored in the Monitoring database. That's right, BI 4.0 has three databases. But the Monitoring database doesn't have to be set up. It is installed with the system. The Monitoring data is stored in a Derby database. This is simply a set of Java tables. By default, they are stored in the install directory of BusinessObjects. And, since they are not stored in a typical RDBMS, we don't ask a DBA to back them up. Backups are done within BusinessObjects, using the CMC.

On your BusinessObjects server, create a directory for the backup of the Derby tables. For example:

E:\MonDBBackup

Next, log into the CMC, go to Applications, and double-click on the Monitoring Application. This will open the Properties window of the Monitoring Application. Make sure that "Enable Monitoring Application" is checked. In the "Trending database backup directory" field, enter the backup location that you just created. Click Save and Close.

Next, go to the Servers area of the CMC, and restart the Adaptive Processing Server (APS). If you have more than one of them, restart the one with the Monitoring Service.

Once the APS is running, go back to the Applications area, and open the Monitoring Application again. Next to the “Run database backup task” label, you will see a button labeled “Now”. Click that button. This will create a backup of the Derby Database into the specified directory. Depending on how much data has been collected, this may take a while. Be patient. Once it completes, you will get a message that the backup completed successfully.

You can also, in the same place, schedule automatic backups, anywhere from every hour, to every 750 hours. I would recommend no more than 24 hours for your automatic backups. If you plan to create a universe on these tables, you may want to back up even more often, depending on your reporting needs. I'll cover creating the universe in Part 3 of this series.

Monitoring History

The data that is stored in the Monitoring database tables isn't kept forever. By default, the database has a 1 GB limit, although you can increase that size in the CMC – Applications – Monitoring Application. However, the application will slow down if you let the tables get too large. If you want to keep the historical data for a long time, I would suggest extracting the data from these tables, and storing it in an RDBMS.

The Monitoring Application also integrates well with SAP Solution Manager, as well as Wily Introscope.

Conclusion

That's all there is to setting up the application. In Part 2 of this series, I'll cover tips on using the application. And in Part 3, I will cover the creation of a universe based on the Monitoring database. Stay tuned.

Differences Between Desktop Intelligence and Web Intelligence

What are all the differences between Desktop Intelligence (Deski) and Web Intelligence (Webi)? The question has been asked so many times, including from clients, that I decided to attempt to put together a comprehensive list of the differences. Why is such a list important? Well, with the release of SAP BusinessObjects 4.0 in September 2011, Desktop Intelligence is no longer available, and is now completely replaced by Web Intelligence. So, for organizations that use Desktop Intelligence, it’s important to know what those differences are, as they transition from Deski to Webi.

The differences can be divided into two categories:

  • Deski features that are missing from Webi
  • Webi features that never existed in Deski

Fortunately, that first category is getting shorter with each new release of Webi. However, since that category is most important to those migrating from Deski to Webi, we’ll cover that category first.

Deski Features that are Missing from Webi

  • Display a table in multiple columns: In Deski, you can set a table so that, when it reaches the bottom of a page, it "wraps" to the next column on the same page, similar to how a newspaper column continues on the same page.
  • Grouping button: Deski has the ability to create grouped variables with the click of a button. In Webi, we have to write the If – Then – Else logic manually to create such functionality. This can be quite tedious. I have submitted an idea to SAP to have this functionality added to Webi. You can vote for this idea here.
  • Freehand SQL queries: I’ve never been a big fan of this, as it defeats the whole purpose of having universes, and also creates maintenance nightmares. But there is some value for it in a test environment. This idea has been submitted to SAP here, and it is currently under review.
  • Save as RTF, TXT, HTML: Currently, Webi can save as Excel, PDF, or Webi only.
  • Edit List of values: In Deski, users could edit a List of Values for a universe object, so that it would only display the values that are valuable to them.
  • List of values in formula editor: In Deski, when creating a formula, you can select values for a particular object, from a list, provided in the formula editor. This idea has been accepted by SAP, and should appear in a future version of Webi.
  • Center on page: Deski has a Page Layout option to center a block, or cell, on the page, between the margins.
  • VBA Macros: Deski supports Visual Basic macros. I don't know if Webi ever will support this, but I suspect it will support some kind of programming language in the future.
  • Delete multiple tabs simultaneously: This seems like such a simple thing, but Webi still doesn’t have this capability.
  • Export multiple documents simultaneously: That’s right. In Webi, we can only export documents to the repository one at a time.
  • Create a variable in its current context: In Deski, we could create a variable from a formula, and lock in the current context of the formula, even if it was not explicitly included in the formula.
  • Support for all installed fonts: Webi uses its own font library, not the Windows fonts. So, unlike Deski, we can't add additional fonts to Webi. An idea has been submitted for this here.
  • Schedule to a printer: I realize that printing is becoming more and more obsolete, but it isn’t there yet. Deski reports could be scheduled to print automatically. Webi reports cannot.
  • Automatic custom sort for month names and weekday names: Deski has built-in custom sorts for Months and Weekdays.

So there you have it. As far as I know, that’s a complete list of features that Webi is still missing from Deski. If you know of more, please mention them in the comments below.

Webi Features that never existed in Deski

  • Track Data Changes: Allows you to see the difference between two different refreshes of a document, and compare the results.
  • Hyperlinking Wizard: You can create an OpenDocument link in about 5 minutes, as opposed to the days or weeks spent doing this in Deski.
  • More built-in formulas: Webi has more built-in formulas than Deski. These formulas come in the form of standalone cells that you can simply drag onto your report.
  • Save as Excel with 65K+ rows: You can save a Webi document to the Excel 2007 format, which allows for over 1 million rows in a single worksheet.
  • Add temporary values to custom sort: When creating a custom sort, you can add values that are missing, to the sort, so that, when they eventually show up on the report, they will sort in the correct order.
  • Box chart: A new chart in Webi that allows you to see a measure at five different data points across a dimension.
  • Tag cloud chart: A new chart in Webi that allows you to easily compare two measures against a single dimension.
  • Heat map chart: A new chart in Webi. It’s the equivalent of a crosstab table, but in chart format.
  • Bubble chart: A new chart in Webi that compares two measures against a single dimension, in an easy to read format.
  • Optional Prompts: Prompts created in the query can be marked as optional, allowing users to ignore the prompts, if they wish.
  • Display query summary on report: Webi has a QuerySummary function that allows you to easily display information about the query on the report.
  • Inner join between queries: Webi's Merge Dimension feature allows you to choose how the data is displayed on the report. It doesn't have to be a full outer join, like in Deski.
  • Query Drill: Allows you to keep your data set small, while drilling through large amounts of data.
  • Drill in InfoView: Webi documents, of course, can enter drill mode in InfoView (BI Launchpad). Deski documents cannot.
  • Edit in InfoView: Webi documents can be easily edited in InfoView (BI Launchpad), by anyone who has permission.
  • Alternate row shading: We used to have to do some sleight of hand with variables to get this to work in Deski. But Webi has it built in.
  • Input Controls: This allows for user interaction in the reports, making What-if analysis quite simple. It also makes it easy to build powerful dashboards in Webi.
  • Advanced formatting rules: That’s the new name for Alerters. Webi’s Formatting Rules editor allows for more powerful options, including using formulas in the rule.
  • Publish as a BI Service: Individual blocks in Webi can be published as a Web Service that can be consumed by other BI applications.
  • Compatible with Live Office: Report parts from Webi documents can be consumed by Live Office, allowing integration with MS Office.
  • Compatible with BI Widgets: Webi report parts can be consumed by BI Widgets, allowing them to be dropped onto the desktop as standalone objects.
  • BEx queries: Webi can run queries directly against BEx queries.
  • Query stripping: Allows for the automatic removal of objects from a query against an OLAP cube, that do not directly contribute to the report(s).
  • Synchronize drill on report blocks: When drilling on one report block, other blocks on the same report can drill at the same time, staying in sync.

As far as I know, that’s everything that Webi offers over and above what Deski had. If you know of any others, feel free to mention them in the comments, below.

Conclusion

So there you have it. By my count, Webi wins, 24 to 14. While there are still some features missing from Webi, there aren’t that many, and hopefully, none of them will be show stoppers for you.

Impressions of Web Intelligence 4.0

SAP BusinessObjects Web Intelligence 4.0 has been Generally Available since September, 2011, and has generated a significant amount of buzz in the BusinessObjects community. I have no plans to create more buzz. My only desire here is to give an honest assessment of the new version. And by honest, I mean I will share what I have seen, both good, and bad.

Ribbons are for Packages

With this version, SAP has gone the way of Microsoft, replacing toolbars with ribbons. I’ve been using the ribbons in MS Office for several years now, and I still prefer the old menus and toolbars. The same is true for Webi. I don’t like the ribbons, for several reasons:

  • I have to remember which ribbon to select, for the feature that I need. If I don’t remember, then I end up hunting through ribbons. This wastes time. I don’t like to waste time. I can easily remember which ribbon to select for the commonly used features. But for the less used ones, I rarely remember which ribbon to choose.
  • Operations that used to take one click (simply click the button) now take two clicks. Click the appropriate ribbon, then click the appropriate button.

I’m sorry, but ribbons don’t make me more productive. They only slow me down.

Charts and More Charts

Web Intelligence now shares a common chart library with other tools. That was a smart decision. It creates a common look and feel across the various tools. But it also includes a host of new charts that we’ve never had before. Here are a few examples.

Box Chart:

The box chart shows the following five different values for a pair of dimensions, with a measure:

  • Minimum
  • Maximum
  • First quartile
  • Third quartile
  • Median

Users can mouse over the chart to see the values.

Column chart:

Yes, we’ve always had column charts, but now we can make the columns round, and we have many more powerful options for these charts.

Tag Cloud Chart:

The Tag Cloud chart allows the comparison of two different measures against a dimension.

Heat Map Chart:

The Heat Map chart is the charting equivalent to a crosstab table. It displays the intersection of two different dimensions with a measure.

Bubble Chart:

The Bubble Chart allows for the comparison of two measures against a common dimension.

Some of the new charts look quite useful. But, in addition to the new charts, we have a host of new features for the charts. There are far too many to mention. But, to name a few, you can base the chart title on a formula. You can also use gradient colors on charts.

Unfortunately, we’ve lost the ability to create custom color palettes. I’m disappointed with that.

Improved Compatibility with MS Excel

We now have the ability to save a Webi document to the new .xlsx format. When we do, we can save over a million rows of data to a single worksheet. Of course, we still have the ability to save in the old format (.xls) as well. In that case, if you have more than 65,536 rows, it gets split into multiple worksheets. So now you have a choice as to which format to choose.

What Happened to Tracking?

I don’t know how this was missed in UAT, but the Data Tracking feature, which was introduced in XI 3.x, is broken in 4.0. Here’s the issue: If you use Data Tracking, the only rows that will be formatted by the Data Tracking feature, are the rows that are also formatted by the Alternate Row Shading feature. Seriously.

So, if you use Data Tracking, here's the trick: Set your Alternate Row Shading frequency to 1, and Data Tracking will work as expected. Yes, SAP is aware of this bug, and plans to issue a fix in SP04, which is currently scheduled to be out in July 2012. It is being tracked as ADAPT01618287.

What’s in a Name?

Some things have been renamed in Webi 4.0. Some of the new names make sense, others do not. The problem, of course, is that, with the new Ribbons, buttons have been moved around, and features have been renamed. This makes it that much more difficult to find what you need.

Alerters have been renamed to Formatting Rules. That makes sense.

The three axes of a 3D chart have also been renamed:

  • X Axis is now called Category Axis (This makes sense)
  • Y Axis is now called Value Axis (This one makes sense, too)
  • Z Axis is now called Region Color (Huh? What were they thinking?)

Since the rest of the world still refers to these as the X, Y, and Z axes, I have no idea why SAP decided to go against the grain.

The Data Manager is a Welcome Addition

One of the complaints from people moving from Desktop Intelligence to Web Intelligence has been the lack of a Data Manager. In Deski, we could always see the raw data, and information about the data in the Data Manager. Now Web Intelligence has the same feature. It’s a big improvement. The only thing missing, that Deski had, is the ability to see metadata for multiple past refreshes.

Hide and Go Seek?

For a long time now, I’ve missed some of the hide capabilities in Deski. They’ve now been added to Webi. 🙂

We can now hide a block, based on a formula. If the formula evaluates to true, the block is hidden. Otherwise, the block is shown.

We also now have the ability to hide columns in a block. Simply right-click in a column, and hide it. So simple. In fact, it’s easier than Deski. 🙂

Don’t Have a Fit

Fit to Page has finally been added. That’s another feature that former Deski users have been begging for, and now we have it. Thank you, SAP!

Conclusion

As you can see, not all news is good news with Webi 4.0. But SAP has made significant strides in adding in the missing functionality from Deski. There is still room for improvement, but we’re getting closer. Look for another post detailing the differences between the two products.

If you’ve been using Webi 4.0, what is your opinion?

Impressions of Information Design Tool

It's been well over two years since I got my first look at the SAP BusinessObjects Information Design Tool (IDT). However, that was pre-Beta, and the tool has undergone some changes since then. The version that went GA in September 2011 wasn't the same version that I originally saw in early 2010. Since the GA release, I've spent quite a bit of time with IDT, and would like to record my current impressions.

It's important to keep in mind that the Universe Design Tool (formerly known as Designer) is a very mature product, and quite stable. IDT, on the other hand, is a relatively young tool, and, as such, has some quirks and bugs. I'll address some of those here.

Let’s Start With The Good News

If you’ve read my earlier post on What’s New in Universe Design, you would know that the universe has a new structure, consisting of three different layers:

  • Connection Layer (.cnx)
  • Data Foundation Layer (.dfx)
  • Business Layer (.blx)

A Data Foundation can consume one or more connections. So you can bring tables from multiple databases into a single Data Foundation. The Data Foundation contains the tables, joins, and contexts. Optionally, you can build multiple Business Layers on a Data Foundation. The following diagram summarizes the new universe structure.

This structure adds some significant benefits over the old method of building universes. The ability to build a universe with tables from multiple databases has been a long time coming. And, the ability to build the Data Foundation once, and reuse it for multiple universes, means less work in the long run.

These layers are built in a local, or shared, project. One of the benefits of using shared projects is that multiple people can work on the same project at the same time. For example, one person can be working on the Data Foundation, while another is working on a Business Layer.

One of the new features that I really like is the ability to add calculated columns to tables. You can add a column to a table in the Data Foundation, based on a calculation using columns from that table, or constant values. Once created, the column appears as just another column in the table.
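For example (purely hypothetical table and column names), a line-total column could be added to an order-lines table in the Data Foundation with an expression such as:

ORDER_LINES.QUANTITY * ORDER_LINES.UNIT_PRICE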

You can also create time hierarchy columns automatically. This is so much better than the old Automatic Time Hierarchy feature.

I also like the new Families feature. A Family is a collection of related tables. Families can be identified by colored backgrounds and fonts. You can create as many families as you want, and add as many tables as you want to each family. A table can belong to one, and only one Family. You can then search for tables by Family, Name, Columns or Contexts. You can search by wildcards as well.

Another feature that has good potential is that Lists of Values, as well as Prompts, can be built as standalone objects in the Data Foundation. This means that they can be reused in multiple objects in a universe. But they can also be reused in multiple universes. However, the process of creating them is cumbersome and quirky. The various options aren’t well organized, and require too much experimentation to get them right. But once built, they work quite nicely.

In the Universe Design Tool (UDT), we’ve always had an issue with testing as we build. We could test in Desktop Intelligence, or, more recently, in Web Intelligence Rich Client. There is a Query Panel in UDT, but it’s mostly useless. However, in IDT, we can fully test as we go, without relying on another tool. We can build, and execute, queries, within the Business Layer. These queries can be saved in the universe so they can be used again. Once we run these queries, we can analyze the results with sorts, filters, and changing the order of the columns.

A feature that completely eliminates the need for Linked Universes is Business Layer Views. When you create a Business Layer, that becomes the Master layer. You can then create Views of that Business Layer, choosing which folders and objects to expose in that view. You can then grant rights to those views for various user groups.

Not All News is Good News

As I mentioned, IDT is a fairly young tool, and, as such, has some bugs and missing features. In fact, some of these missing features can cause huge problems.

I always recommend applying formatting to every numeric object in a universe. After all, we don’t want to force the users to apply formatting every time they use these objects. In UDT, we could select multiple numeric objects at once, and format them all at the same time. However, in IDT, we have to format objects one at a time. And the process of creating a format is very weird. For example, here is what the format for typical currency might look like:

That’s right. You can’t simply type in the format. You have to use the objects on the left, and add the items to your format. And, to make matters worse, once you create a custom format, you can’t save it. If you want to format another object the same way, you have to create the same custom format again. It’s very time consuming, and will likely discourage many people from formatting objects in the universe.

In the UDT, when we are using the @Select or @Where functions, we can check a box to “Show object SQL” to see the full SQL of the object. It’s very helpful for troubleshooting. Unfortunately, this option is missing from IDT. There is no way to see the full SQL when using these functions.
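For example (hypothetical class and object names), a Margin object might be defined as:

@Select(Sales Measures\Revenue) - @Select(Sales Measures\Cost)

In UDT, "Show object SQL" would expand that into the underlying SQL, something like SUM(FACT_SALES.REVENUE) - SUM(FACT_SALES.COST). In IDT, you have to open each referenced object yourself and piece the final SQL together.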

Another missing feature that is, in my opinion, a serious issue, is the ability to change a dimension to an attribute (formerly known as a detail), or vice versa. For example, if you create an object as an attribute, and later decide that it needs to be converted to a dimension, so it can be included in a Navigation Path (formerly known as a Hierarchy), you can't do it. You have to recreate it as a dimension. This, of course, would cause existing reports that use that object to fail. In my opinion, this is a huge problem.

The process of converting UNV universes to the new UNX format is very buggy. I have seen parameters get changed, and universe settings also get changed. As my friend, Dallas Marks, discovered, even the eFashion universe cannot be converted.

Now, to be fair, SAP seems to be doing a good job of listening to our concerns with the new tool. I have submitted quite a few ideas on the SAP Idea Place, and several of them have already been accepted. That impresses me. Now we’ll see how quickly these issues are addressed. In the meantime, I suggest keeping your current UNV universes in their current format, and creating new universes in IDT.

If you’ve worked with IDT, what is your experience so far?

The Importance of Governance

I don’t know if this is a new phenomenon, or if I’m just more observant of it, but there seems to be a trend in BI today that is rather troubling: the lack of governance. What is governance, you ask? That’s a good question.

Before anything is published into a BI system, it should be subjected to a process that ensures that it meets certain standards. In other words, someone needs to look at it and see if it is formatted correctly, has the right naming convention, includes the right information, or disclaimers, etc. This process is known as governance.

In the world of SAP BusinessObjects, governance can be applied to Connections, Universes, Documents, Dashboards, Workspaces, etc. Let’s take a look at the things that can be applied to the governance process.

  1. Naming Standards
    1. Should define a guideline for everything that can be built in BusinessObjects
    2. The goal is to create a consistent experience for users, and an easily maintainable and supportable system for developers and administrators
    3. Naming conventions can be applied to the following objects:
      1. Database connections
      2. Universe names
      3. Universe class names
      4. Universe object names
      5. InfoView folder names
      6. Document names
        1. SAP BusinessObjects Web Intelligence (Webi)
        2. SAP Crystal Reports
        3. Xcelsius (SAP BusinessObjects Dashboards)
        4. Other
  2. Security – three approaches:
    1. See everything except what is forbidden
    2. See nothing except what is needed
    3. Hybrid (usually the preferred approach)
  3. Reporting Guidelines
    1. Define how a report is made public
      1. How it is done now?
      2. What works well?
      3. What doesn’t work well?
      4. How should it be done?
    2. Establish formatting standards for public documents
      1. Creates a common look and feel to all public reports
      2. Makes it easier for users to find what they need on public reports
  4. Best Practices
    1. Job scheduling
      1. Limit the number of users who can schedule, to one or two per group
      2. Take advantage of Events
      3. Don’t schedule everything for the same time
      4. Apply a global limit to instances
      5. Avoid scheduling the same document multiple times
    2. Universe techniques
      1. Keep universes small and focused (no more than 700 objects)
      2. Never use underscores in anything that a user will see
      3. Give the universe a user friendly name and description
      4. Set the universe connection to “Disconnect after each transaction”
      5. Set the array fetch size to an appropriate level
      6. Set reasonable universe limits
      7. Set parameters appropriate to your database
      8. Every object should have a user friendly description
      9. Never have more than 3 levels of classes/subclasses
      10. Use proper case for object names
      11. Use full words for object names
        1. Use “Date” rather than “Dt”
        2. Use “Number” rather than “No” or “Nbr”
      12. Use Index Awareness to improve performance
      13. Every measure object should have an aggregate function in the select (see the example after this list)
      14. Format all numeric objects in the universe
      15. LOVs should have meaningful names
      16. No two objects should have the same name
      17. Use only custom hierarchies
      18. Set cardinalities on all joins

Now that we know what’s possible in a governance process, let’s take a deeper look at some of these.

Naming Convention Suggestions:

Connections: The connection name should be devised so that it includes information about the database that it is connecting to. For example:

Syntax: Environment_Platform_Database

Example: DEV_TD_CSR (Development Environment, Teradata, CSR database)

Universes: Universe long names should match the application that the data belongs to, as well as the focus of the universe. For example, if the data is from the CSR application, and the focus of the universe is sales, the universe could be named “CSR Sales”. The idea is to use a name that makes sense to the users.

Universe Class and Object Names: These should be consistent between universes. For example, if the Month object from Universe A, and the Month Name object from Universe B, return the same values, they should both be called the same thing. This creates a consistent user experience, and avoids confusion.

Documents: It can be very helpful to begin the name of a document with a code that identifies the universe or group that it belongs to, as well as sequential number. For example:

Syntax: Universe-SequentialNumber Document Name

Example: CS0001 Quarterly Sales By Title

In this example, CS represents Customer Service, and 0001 is simply a sequential number. This type of naming allows the documents to be sorted easily by universe, and also helps the user to specify which document they are referring to when they need to discuss documents.

Security

As a starting point, we should review the current security, and see what works well, and what doesn’t. If the current security meets the needs of the organization, then we can keep the current security, and tweak as needed. The three common approaches to security are as follows:

See everything except what is forbidden. In this approach, everyone sees everything by default. Then, content is hidden that is not allowed to be seen by specific users. This approach can be very hard to manage.

See nothing except what is needed. In this approach, the default is to see nothing, and only grant access to content that is required. This is easier to maintain, but can create a bureaucracy that will only frustrate users looking for content that they can’t see.

Hybrid. In this approach, most, but not all, content is hidden by default. Some content is deemed appropriate for all users, and, therefore, is granted to all users. Other content is only granted as needed. This is the easiest to maintain, and is also the best for the users. Users don’t want to see everything, as the volume of content can be overwhelming. By limiting them to what they need, as well as what they are allowed to see, their view becomes more manageable.

Reporting Guidelines

There should be a standard format applied to all public documents. This common “look and feel” makes it easier for users, as they always know what they’re looking at, and always know where to find what they need on the report. They also can tell, very easily, when they are looking at a document that has not received official approval. Standards may include, but are not limited to, the following:

  • Cover page that includes standard information about the document:
    • Document name
    • Document author
    • Document description
    • Universe(s) in use in this document
    • Last Refresh Date/Time
    • Query filters, include prompts
    • Disclaimers
    • Company logo
      • Which logo?
      • Size
      • Position
    • Standard report colors
    • Document/Report naming conventions

All documents will need to pass the governance process before being placed into a public folder. This process should be streamlined, so as not to create a bottleneck.

There should be one or two people, in each group, who are trained to apply the governance process to a document, and who have permission to add objects to a public folder. These people will be responsible for making sure that all documents meet enterprise standards before being published.

Conclusion

Everything we do should be designed to make life better for the users. We need to treat them like customers, and we want their business. So make sure that your governance process is designed to do just that. Don’t let it become a hurdle that the users need to jump over. Streamline the process as much as possible.

Setting up a governance process is difficult. It needs to involve key users who will give input on what the standards should be. The problem is that everyone will feel strongly that their way is best. So your job is to try to find a solution that meets everyone's needs as much as possible. Give it a shot, and let me know how it goes.