Saturday, June 29, 2013

Recipe for a Top 1...N SSAS / SSRS solution

Scenario: OLAP-based SSRS reports with one or more cascading parameters.
One of the parameters is Top Count, letting the user specify whether to display the Top 5, Top 15, ... or Top N best selling products, most profitable companies, or the like.
The list of values is predefined according to the business requirements.
Here are the main steps and the pseudocode for implementing the solution using the Analysis Services data provider:

1) Create one dimension, TopN, with a single attribute, TopN.
Source it from a view or a named query with a trivial T-SQL statement like:

select <first value> as TopN
union
select <second value> as TopN
......
union
select <last value> as TopN
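
A minimal sketch of such a view, assuming (hypothetically) that the business wants the values 5, 10, 15 and 20:

CREATE VIEW dbo.vw_TopN
AS
SELECT 5  AS TopN
UNION
SELECT 10 AS TopN
UNION
SELECT 15 AS TopN
UNION
SELECT 20 AS TopN;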


Set the property IsAggregatable = False for the attribute TopN, so the attribute cannot be aggregated in any hierarchy.
The dimension is not related to any measure groups.
Process the dimension, browse its only attribute and verify that there is just one level and no (All) level at all.

2) Create a calculated member [Top N Value]  as :

CREATE MEMBER CURRENTCUBE.[Measures].[Top N Value]
 AS [Top N].[Top N].currentmember.member_caption,
VISIBLE = 1 ,  DISPLAY_FOLDER = '<your folder>';


If your reporting solution also needs sets for filtering based on the newly created [Top N Value], you may add them as:

CREATE DYNAMIC SET CURRENTCUBE.[Your set]
 AS TOPCOUNT([Dimension].[Hierarchy].[Attribute].MEMBERS, [Measures].[Top N Value], [Measures].[Your measure]), DISPLAY_FOLDER = '<folder for displaying sets>';

Deploy and process the cube.

In your SSRS report:

3) Create a parameter; let's call it TopNTopN.
This parameter is normally one of the last in the chain of cascading parameters.
That is why the source MDX for the available values may include some subcubes:

WITH
MEMBER [Measures].[ParameterCaption] AS [Top N].[Top N].CURRENTMEMBER.MEMBER_CAPTION
MEMBER [Measures].[ParameterValue] AS [Top N].[Top N].CURRENTMEMBER.UNIQUENAME
MEMBER [Measures].[ParameterLevel] AS [Top N].[Top N].CURRENTMEMBER.LEVEL.ORDINAL
SELECT {[Measures].[ParameterCaption], [Measures].[ParameterValue], [Measures].[ParameterLevel]} ON COLUMNS ,
[Top N].[Top N].ALLMEMBERS ON ROWS
 FROM ( SELECT ( STRTOSET(@<param1>, CONSTRAINED) ) ON COLUMNS
 ...........
 FROM ( SELECT ( STRTOSET(@<paramn>, CONSTRAINED) ) ON COLUMNS
 FROM [Your cube]) ...)


Select one value among <first value>, <second value>, ... <last value> as the default parameter value. You should now be ready to test your report.

For an alternative solution using the OLE DB Provider for Analysis Services, please see pages 583-584 of the book 'Applied Microsoft Reporting Services' by Teo Lachev.

Monday, May 13, 2013

Two tips when saving your tablix-based SSRS reports to Excel

Here are two tips related to tablix-based SSRS reports with parameters that are saved to Excel.
1) The saved report has to start at column A and row 1.
Two conditions have to be fulfilled here:

  • If you have a report title and/or logo, either in the report header or in the body above the tablix, then delete them. Dynamically setting the Hidden property of a textbox or image does not help. Recall that you can only statically hide the report header, but this does not help either.
  • In the properties window for your tablix, set the location to 0, 0 (Left = 0, Top = 0).
2) After saving the report to Excel, the users must be able to see the parameter values.
To accomplish this, you may create a new table with two rows: the first one with headers containing your parameter names (for instance, the Prompt values you entered for each of your parameters).
The second row holds an expression that concatenates - for a multivalue parameter - the values selected by the user.
An example, which also distinguishes between 'All' and the other values, might look like:
= Microsoft.VisualBasic.Interaction.IIF(Parameters!<param1>.Label(0) <> "All", Microsoft.VisualBasic.Strings.Join(Parameters!<param1>.Label, " "), "All")  

 

where you will replace <param1> with your own parameter(s).

You may also add a page break before the tablix and set the DocumentMapLabel and PageName properties in order to customize the link name in the Document Map and the name of the worksheet allocated to the parameters.

Wednesday, April 24, 2013

A non-existing sale or a sale with 0 revenue?

Who does not want to distinguish between a non-existing event - a sale, for instance - and an existing event with 0 as its numeric result: 0 revenue due to a promotional discount, an exceptional 0 cost, etc.?

Your data path may look like :

Data Source -> Data Mart -> SSAS Cube -> Client Reporting tool (SSRS, Excel)

The absence of data has to be carried along the whole path from the data source to the client.
Simply put, do not turn the absence of data into 0's or ' ', neither in the data mart, the SSAS cube, nor at the report layer.

Your 'false friends' in this process may be :

1) Lack of awareness of this issue by at least one person in the usual chain:
ETL developer -> SSAS developer -> Client developer.

2) Using functions such as isnull() or coalesce() in the ETL processes. I still find a misconception, even among skilled database developers, that null values should be avoided at the data mart level because they are 'difficult to tackle in T-SQL calculations' (see the sketch after this list).

3) The default SSAS NullProcessing option at the measure level is Automatic. A better alternative, in my opinion, is Preserve. Recall that Preserve keeps the NULL value, while Automatic is equivalent to the ZeroOrBlank option: it converts the null value to zero for numeric columns or to a blank string for text-based columns.

4) You are not safe at the client level either. Your Excel OLAP pivot table may be set to show 0 for empty cells, or you may use SSRS expressions where even a simple division of field values by 2 may accidentally create false 0's.
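
A minimal T-SQL sketch of the 'false friend' from point 2 (table and column names are made up): a NULL revenue means 'no sale recorded', while 0 means 'a sale with zero revenue', and an isnull() in the ETL makes the two indistinguishable downstream.

-- Hypothetical staging data
CREATE TABLE #SalesStaging (SaleID int, Revenue decimal(10,2) NULL);
INSERT INTO #SalesStaging VALUES (1, 0), (2, NULL);

-- After the isnull(), both rows look identical in the data mart
SELECT SaleID,
       Revenue            AS Revenue_Preserved,   -- keeps the distinction
       ISNULL(Revenue, 0) AS Revenue_Flattened    -- the distinction is lost
FROM #SalesStaging;

DROP TABLE #SalesStaging;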

Monday, April 15, 2013

Exists, the MDX variant of 'Cogito ergo sum'

While reviewing or maintaining SSRS 2008/2012 reports with cascading parameters based on SSAS OLAP cubes, I quite often see a lack of 'narrowing' of the values of parameter n based on the values of the preceding 1...n-1 parameters.
Scenario: The reporting solution deals with customer sales from a time perspective.
When selecting dates from a YMD hierarchy, for instance, only customers with sales within the selected period of time must be retrieved and nothing else.
And if there are no such customers, the user has to be stopped right there instead of continuing to select values for the next parameters and finally experiencing an empty dataset message.
Using cascading parameters appropriately can save your users a lot of frustration.

The MDX template for defining the parameters may look like :

WITH MEMBER [Measures].[ParameterCaption] AS <<[Dimension].[attribute]>>.CURRENTMEMBER.MEMBER_CAPTION
MEMBER [Measures].[ParameterValue] AS <<[Dimension].[attribute]>>.CURRENTMEMBER.UNIQUENAME
MEMBER [Measures].[ParameterLevel] AS <<[Dimension].[attribute]>>.CURRENTMEMBER.LEVEL.ORDINAL

SELECT {[Measures].[ParameterCaption], [Measures].[ParameterValue], [Measures].[ParameterLevel]} ON COLUMNS ,
<<[Dimension].[attribute]>>.Members ON ROWS FROM
( SELECT ( STRTOSET(@param1, CONSTRAINED) ) ON COLUMNS FROM
( SELECT ( STRTOSET(@param2, CONSTRAINED) ) ON COLUMNS FROM
( SELECT ( STRTOSET(@param3, CONSTRAINED) ) ON COLUMNS FROM
......................
( SELECT ( STRTOSET(@param(n-1), CONSTRAINED) ) ON COLUMNS
 FROM   <<Cube>>))))... )


That will only work in real life if @param1 ... @param(n-1) and the attribute on ROWS are sourced from attributes of the same dimension, thanks to the 'autoexists' feature.
As soon as at least one parameter is based on another dimension, the 'narrowing' effect is gone.
To restore it, you may replace the line
<<[Dimension].[attribute]>>.Members ON ROWS with an Exists construction, which at a basic level may look like:
Exists( <<[Dimension].[attribute]>>.ALLMEMBERS , <<[Other Dimension].[Other attribute]>>.AllMembers , <<Measure Group related to the two dimensions>> )

The members in the set { <<[Dimension].[attribute]>>.ALLMEMBERS } must be related to the members in the set { <<[Other Dimension].[Other attribute]>>.AllMembers } through the measure group <<Measure Group related to the two dimensions>>.

Sunday, March 17, 2013

How is your week, a.k.a. YWD, hierarchy looking?

I'm quite surprised to realize the lack of 'week corrections' in the Time dimension of many SSAS BI solutions.

Scenario (affecting the weeks whose days span two consecutive years):

The YQMD hierarchy is right but the YWD hierarchy is wrong. Example: 20120101 is wrong in the YWD hierarchy because its 'week year' is 2011 (the day belongs to a week of 2011), while - using the same Year attribute in both hierarchies - we set it to 2012 (which is right, of course, for the YQMD hierarchy).

You can fix it either at the relational level - view or table - or at the UDM level, and either in a generic way or at a detailed level. The recommended way is at the relational level and in a generic way, but you can combine the two axes, (generic or detailed) x (relational or UDM), and get four combinations to choose from.
I'll try to exemplify two of the alternatives:
 
Generic x Relational :
 
In your view you may write a statement like :
Select ..., <Week Year> = cast(case
           when Month_Number = 12 and WeekNo = 1 then Year_Number + 1
           when Month_Number = 1 and WeekNo in (52,53) then Year_Number - 1
           else Year_Number
end as nvarchar(4)) + '-' + [Week]
from <Time view or table> 
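
If your Time view does not already expose Month_Number and WeekNo columns, a minimal self-contained sketch of the same logic, assuming ISO 8601 weeks and a hypothetical DimDate table with a FullDate column, could be:

select FullDate,
       datepart(iso_week, FullDate) as WeekNo,
       cast(case
                when month(FullDate) = 12 and datepart(iso_week, FullDate) = 1 then year(FullDate) + 1
                when month(FullDate) = 1  and datepart(iso_week, FullDate) in (52, 53) then year(FullDate) - 1
                else year(FullDate)
            end as nvarchar(4)) as [Week Year]
from dbo.DimDate  -- replace with your own Time view or table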

Detailed x UDM :

In the Data Source View, select your Date table, right-click it and select New Named Calculation. Name it, for instance, Week Year and create it with an expression like:

CASE <DateKey>
When 20071231 then 2008
When 20081229 then 2009
...............................................
When 20120101 Then 2011
…………………………………………….
ELSE <Year>
END

Replace <DateKey> and <Year> with the corresponding names of your columns.

Last common steps 

In your Date dimension, drag and drop Week Year from the DSV to the dimension attributes pane.
Update the attribute relationships correspondingly.
In your week hierarchy (YWD) only, replace the Year attribute on the first level with the Week Year attribute.

Process the Date dimension and the cube, and test the 'week adjustments' at the start/end of the years where you implemented them.

Tuesday, February 19, 2013

Customize and automate reports delivery with SSRS and data-driven subscriptions

Scenario :
Some weeks ago I was surprised to realize that a few analysts spent a lot of time extracting data from an application and sending reports saved in Excel format to customers. The extract was based on customer account numbers, and a customer might have one or even ten account numbers. The reports were slightly customized and delivered by mail on a daily, weekly or monthly basis. Besides, there was an extra requirement regarding the timeline of each type of report: the weekly reports could reflect the latest n weeks (n = 1...24), and n was different from one customer to another. The same idea applied to the monthly and daily reports.

Challenge: To fully automate this task and give the analysts the opportunity to maintain it: to create new customers or change data for existing customers, such as delivery time and e-mail addresses, or to cancel one or more deliveries for good or for a certain period of time, etc.

Keywords: Data-driven subscriptions on a Reporting Services in SharePoint integrated mode infrastructure, and a dialogue with the business.
Here are the main steps of the solution:

1) Agree on a separate prototype for the daily, the weekly and the monthly report. It's a matter of 'aggregating' all content requirements of all customers for each type of report. Some customers may receive some extra information not originally required, but this is seldom a problem.

2) Ask questions. Agree on all the definitions if you're in doubt. Weekly, for instance: American or European way? Calendar week or the latest 7 days? Can we agree on three schedule definitions for the daily, weekly and monthly deliveries?

3) You are now ready to start designing the three reports and heavily parameterize your datasets. Create 10 parameters, @acc_1 ... @acc_10, covering up to ten account numbers per customer, and a @dd parameter covering the timeline.

4) Create a table for your data-driven subscription already at this stage. The statement may look like:

CREATE TABLE [dbo].[SubscriptionTable](
 [SubscriptionID] [smallint] IDENTITY(1,1) NOT NULL,
 [Name] [nvarchar](50) NULL,
 [Acc_1] [int] NOT NULL,
 [Acc_2] [int] NULL,
 [Acc_3] [int] NULL,
 [Acc_4] [int] NULL,
 [Acc_5] [int] NULL,
 [Acc_6] [int] NULL,
 [Acc_7] [int] NULL,
 [Acc_8] [int] NULL,
 [Acc_9] [int] NULL,
 [Acc_10] [int] NULL,
 [Email] [nvarchar](150) NOT NULL,
 [Daily] [smallint] NULL,
 [Weekly] [smallint] NULL,
 [Monthly] [smallint] NULL,
 CONSTRAINT [PK_SubscriptionTable] PRIMARY KEY CLUSTERED
 (
  [SubscriptionID] ASC
 ) WITH (PAD_INDEX = OFF, STATISTICS_NORECOMPUTE = OFF, IGNORE_DUP_KEY = OFF, ALLOW_ROW_LOCKS = ON, ALLOW_PAGE_LOCKS = ON) ON [PRIMARY]
) ON [PRIMARY]
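
For illustration, a hypothetical subscription row for a customer with two account numbers, receiving only a weekly report covering the latest 4 weeks (all values are made up), could be inserted like this:

INSERT INTO [dbo].[SubscriptionTable] ([Name], [Acc_1], [Acc_2], [Email], [Daily], [Weekly], [Monthly])
VALUES (N'Contoso Shipping', 1001, 1002, N'reports@contoso.example', NULL, 4, NULL);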


5) Design the queries for the above mentioned parameters. The @acc_1 ... @acc_10 parameters are based on datasets like: select distinct Acc_1 from SubscriptionTable, and so on. That is why the data-driven subscription table has to be created so early. Your main query, used by the tablix, is UNION based, with a prototype like:
select
  <information from the columns>
from
  <join here all the tables you need to retrieve the information>
where
  C.Customerid in ( @acc_1 )
  and datediff( day, <your extract date>, getdate() ) between 1 and @dd

UNION
....

UNION

select
  <information from the columns>
from
  <join here all the tables you need to retrieve the information>
where
  C.Customerid in ( @acc_10 )
  and datediff( day, <your extract date>, getdate() ) between 1 and @dd

       
6) Test all the queries by assigning values to the parameters and finish designing the three reports. In my case, the weekly and monthly reports displayed the same summary and detailed information, but the time perspective was of course different.

7) Deploy the three reports to a SharePoint library and create a data-driven subscription for each of them.

8) The query for the weekly subscription may look like:

select
  Name + ' ' + ' - Weekly report for latest ' + convert(char(2), Weekly) + ' week(s) issued ' + convert(varchar(12), getdate(), 104) as Subject,
  [Acc_1], [Acc_2], [Acc_3], [Acc_4], [Acc_5], [Acc_6], [Acc_7], [Acc_8], [Acc_9], [Acc_10],
  [Email],
  Weekly
from SubscriptionTable
where Weekly > 0


9) You may perhaps realize now that the table contains one row per customer. Name is a user-friendly customer name used as part of the mail subject, Email contains the customer's e-mail addresses, and Weekly holds the number of weeks of data covered for that particular customer.

10) Find a user-friendly way for the business users to set up and maintain the solution, and instruct them. It's often easier and more effective to produce a video clip for this purpose: http://youtu.be/ADoMp_SdGYo
If you have other alternatives, or even a way to scale the solution better - when a customer can be related to more than 10 accounts - then do not hesitate to contact me.

Thursday, September 27, 2012

An MSSQL / SSRS / SharePoint integration solution

Scenario: A data change event in a table in a relational database - like the sailing of a ship, in my case - should trigger the delivery of an SSRS report in SharePoint integrated mode as an e-mail attachment to a certain list of recipients.
Start: some T-SQL code in the trigger of the table where the data change occurs.
End: a standard or data-driven subscription to your report.
But how can we link the 'loose ends' together?
I hope the following guide and pieces of code may solve your problem:

1) The template for your trigger code is perhaps the most trivial part and may look like:

CREATE TRIGGER <trigger_name>
   ON <your table name>
   AFTER UPDATE
AS
IF UPDATE(<your column name>)
BEGIN

     SET NOCOUNT ON;

     DECLARE <variables>

     SET <variables> from the Inserted and Deleted tables, etc.

     <Your way to extract, transform and separate from here the parameters that you will use in the query for your data-driven subscription>

     --- code that fires the Manifest subscription (see points 4 and 5 below)

END


2) Create a shared schedule that will never fire, i.e. one that executes once at a date in the past.

3) Create a subscription to your report. I used a data-driven subscription. The query of the subscription has to provide the values for your report parameters.
Write down your subscription ID.
You now have to fire a TimedSubscription event for your Reporting Services instance, and you can accomplish this in two different ways:

4) By calling a remote stored procedure on the database server that hosts the ReportServer database:
       exec <remote db server>.ReportServer.dbo.AddEvent @EventType='TimedSubscription', @EventData=<your subscription id from point 3>
Remember, when configuring the linked server, that the RPC and RPC Out options are set to 'TRUE', that your remote login account has permission to execute the stored procedure above,
and that you solve potential collation conflicts in your subscription query.
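
A minimal sketch of that linked server setup (the linked server name RSHOST is hypothetical and stands in for <remote db server> in the call above):

-- Hypothetical linked server pointing to the server hosting the ReportServer database
EXEC sp_addlinkedserver @server = N'RSHOST', @srvproduct = N'', @provider = N'SQLNCLI', @datasrc = N'<remote db server>';
-- Allow remote procedure calls in both directions
EXEC sp_serveroption @server = N'RSHOST', @optname = N'rpc', @optvalue = N'true';
EXEC sp_serveroption @server = N'RSHOST', @optname = N'rpc out', @optvalue = N'true';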

5) By writing a .NET application that calls the FireEvent method of the report server SOAP API. In the FireEvent call, set the first parameter, EventType, to "TimedSubscription" and the second one to your subscription ID.
Your app may be a console application, so your final trigger statement may be: xp_cmdshell <your exe .net application> (a sketch follows below).
Make sure you grant the account that runs this process permission for the 'Generate events' system task.
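
A minimal T-SQL sketch of that final trigger statement (the executable path is hypothetical, and xp_cmdshell must be enabled on the instance first):

-- One-time setup (requires sysadmin); skip if xp_cmdshell is already enabled
EXEC sp_configure 'show advanced options', 1; RECONFIGURE;
EXEC sp_configure 'xp_cmdshell', 1; RECONFIGURE;

-- Inside the trigger: launch the console app that fires the TimedSubscription event
EXEC xp_cmdshell N'C:\Apps\FireTimedSubscription.exe';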
                                                          
6) The idea in your .NET app is to add a web reference to the 'right' Reporting Services web service. If SSRS 2008 is in SharePoint integrated mode, your web service URL may look like:
http://<web site>/_vti_bin/ReportServer/ReportService2006.asmx
Some 'basic' code to start with in your C# Main function:
static void Main(string[] args)
{
    // Proxy class generated from the web reference added above
    <your web service reference>.ReportingService2006 rs = new <your web service reference>.ReportingService2006();

    rs.Url = "http://intranet.dk.dfds.root/_vti_bin/ReportServer/ReportService2006.asmx";
    rs.Credentials = System.Net.CredentialCache.DefaultCredentials;

    string site = <your site>;

    // Get the current user's subscriptions on the site
    <your web service reference>.Subscription[] subs = rs.ListMySubscriptions(site);
    // or: rs.ListAllSubscriptions(site);

    try
    {
        if (subs != null)
        {
            // Fire the first subscription in the list
            rs.FireEvent("TimedSubscription", subs[0].SubscriptionID, site);
            Console.WriteLine("Event fired.");
        }
    }
    catch (Exception ex)
    {
        Console.WriteLine(ex.Message);
    }
}


I implemented the first option, as described in point 4).
If you have other alternatives to solve this problem, then do not hesitate to contact me.