Saturday, December 13, 2008

SSAS Cube Testing

1.1 BVT / Smoke Testing Scenarios:

1. Ensure that only users with the required permissions are able to connect to the cube

2. Validate Data Source Connection string for cube

E.g. Provider=SQLOLEDB.1;Data Source=<Test ETL Server>.redmond.corp.microsoft.com;Integrated Security=SSPI;Initial Catalog=dbCPRMart

3. Validate that you are able to successfully process the cube

4. Validate that you are able to browse the cube

5. Configure and validate Analysis Services query logging

1.2 Data Validation Scenarios:

Approach 1: Browse the cube using BIDS or SQL Server Management Studio and compare the output against the Mart using a SQL query

For a newly developed cube, browse the cube and drag measures onto the data area and a dimension onto the rows axis. Then write a T-SQL query against the Mart that performs the equivalent aggregation; the two outputs should match.

Eg:

OWC Output:

                 SALES    PROFIT
AP
  CHINA            200        60
  INDIA            250        50
US
  CHICAGO          500        65
  WASHINGTON       550        65

Equivalent T-SQL:

SELECT G.REGION, G.CITY, SUM(S.SALES), SUM(S.PROFIT)
FROM GEOG G
JOIN SALES_FACT S ON G.ID = S.ID   -- assuming the fact table is named SALES_FACT
GROUP BY G.REGION, G.CITY
ORDER BY G.REGION, G.CITY
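Once both outputs are available, the comparison itself can be automated. Below is a minimal, hypothetical Python sketch; fetching rows via ADOMD/ODBC is out of scope here, so the result sets are hard-coded using the sample figures above:

```python
# Hypothetical sketch: compare cube browse output with the Mart query
# output. In practice the rows would come from ADOMD (cube) and an
# ODBC/OLE DB connection (Mart); here they are hard-coded.

def rows_match(cube_rows, mart_rows, tolerance=0.01):
    """Treat each row as (key columns..., SALES, PROFIT) and compare."""
    cube = {tuple(r[:-2]): r[-2:] for r in cube_rows}
    mart = {tuple(r[:-2]): r[-2:] for r in mart_rows}
    if cube.keys() != mart.keys():  # same region/city members on both sides?
        return False
    return all(
        abs(cube[k][0] - mart[k][0]) <= tolerance      # SALES
        and abs(cube[k][1] - mart[k][1]) <= tolerance  # PROFIT
        for k in cube
    )

# Sample figures from the OWC output above
cube_rows = [("AP", "CHINA", 200, 60), ("AP", "INDIA", 250, 50),
             ("US", "CHICAGO", 500, 65), ("US", "WASHINGTON", 550, 65)]
print(rows_match(cube_rows, list(cube_rows)))  # True when outputs agree
```

The tolerance parameter allows for rounding differences between OLAP aggregations and T-SQL SUMs.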

Approach 2: Compare MDX query results with a SQL query executed against the Mart

2.1 For simpler Cube & Dimensional Models

Write an MDX query against the cube and compose a SQL query against the Mart with the same conditions; the results should match.

2.2 Automatically generate MDX using SQL Profiler & Reporting tools and compare it against SQL

SQL Profiler and reporting tools such as PPS and ProClarity can capture or generate the MDX issued against the cube. Compare the results of that MDX against SQL queries you write yourself based on the relational/dimensional model of the Mart.
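As an illustration (all cube, dimension, and table names here are hypothetical), the MDX captured from a tool and the hand-written SQL for the same question might look like:

```
-- Hypothetical MDX captured via SQL Profiler while browsing the cube
SELECT
  { [Measures].[Sales], [Measures].[Profit] } ON COLUMNS,
  [Geography].[Region].[Region].Members ON ROWS
FROM [SalesCube]
```

```sql
-- Hand-written SQL over the Mart answering the same question
SELECT G.REGION, SUM(S.SALES), SUM(S.PROFIT)
FROM GEOG G JOIN SALES_FACT S ON G.ID = S.ID
GROUP BY G.REGION
```

The two result sets should agree row for row once both are sorted by region.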

Approach 3: Black box testing using sample test data

Insert, update, or delete test data in the backend, calculate the expected outcome from the desired functionality/requirements without looking at the cube, and then verify that your expected value matches the cube's output in the final reports.
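A minimal T-SQL sketch of this approach, assuming a hypothetical SALES_FACT table:

```sql
-- Hypothetical example: seed known rows in the Mart fact table,
-- then verify the cube reports the precomputed total.
INSERT INTO SALES_FACT (ID, SALES, PROFIT) VALUES (1, 100, 10);
INSERT INTO SALES_FACT (ID, SALES, PROFIT) VALUES (1, 150, 20);

-- Expected: after reprocessing the cube, the report should show
-- SALES = 250 and PROFIT = 30 for the corresponding member.
```

Because the expected value is computed by hand from the requirements, this test stays independent of the cube's own aggregation logic.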

Approach 4: AMO / ADOMD object compare

Create an automatic verification mechanism between OLAP and SQL objects

http://msdn.microsoft.com/en-us/library/cc281460.aspx

http://msdn.microsoft.com/en-us/library/cc280975.aspx


Figure: a high-level view of the AMO object model for OLAP objects

Reference:

http://microsoft.apress.com/feature/74/introduction-to-analysis-management-object-amo-programming
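The essence of such a verification mechanism is a metadata diff. The sketch below (plain Python, with the AMO/catalog enumeration stubbed out as hard-coded lists) shows only the comparison step; all object names are hypothetical:

```python
# Hypothetical sketch of the verification step: diff object names
# enumerated from the cube (via AMO/ADOMD, stubbed here) against
# column names from the Mart's catalog. All names are made up.

def diff_objects(olap_names, mart_names):
    """Return (missing_in_cube, missing_in_mart), case-insensitively."""
    olap = {n.lower() for n in olap_names}
    mart = {n.lower() for n in mart_names}
    return sorted(mart - olap), sorted(olap - mart)

cube_measures = ["Sales", "Profit"]
mart_columns = ["SALES", "PROFIT", "DISCOUNT"]
missing_in_cube, missing_in_mart = diff_objects(cube_measures, mart_columns)
print(missing_in_cube)  # ['discount']
print(missing_in_mart)  # []
```

In a full implementation the two lists would be populated by walking the AMO object model on one side and querying the Mart's system catalog on the other.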

1.3 Cube Design Scenarios:

1. Validate all measures

  • Open the Analysis Services project in Visual Studio (BIDS)
  • Browse to and open the cube
  • Double-click the cube and go to the "Measures" pane
  • Select the measure <measure name>
  • Go to Properties and check the "Source" field

     Expected Result:

  • The measure's source should be set to the correct source table and source column in the Mart
  • Ensure the measure has all the required fields/columns that are present in the Mart
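On SSAS 2008, these measure properties can also be checked in bulk through a dynamic management view, queried from an MDX window in SQL Server Management Studio (the cube name below is a placeholder):

```sql
-- SSAS dynamic management view: list all measures and their definitions
SELECT MEASURE_NAME, MEASURE_AGGREGATOR, EXPRESSION
FROM $SYSTEM.MDSCHEMA_MEASURES
WHERE CUBE_NAME = 'YourCube'   -- placeholder cube name
```

The returned list can then be diffed against the Mart column list instead of checking each measure by hand.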

2. Validate all dimensions & dimension hierarchies

  • Open the Analysis Services project in Visual Studio (BIDS)
  • Browse to and open the cube
  • Double-click the cube, open the data source views, and open the <Dimension name> dimension under "Dimensions"

   Expected Result:

  • Make sure the table columns in the Mart are mapped to the dimension correctly
  • Make sure the dimension key is correctly mapped to the key column of the dimension table in the Mart
  • Make sure all the required columns/fields present in the Mart are also present in the dimension
  • Ensure the hierarchy is correctly defined, e.g.:
  • Fiscal Year -> Fiscal Month -> Fiscal Week -> Calendar Date
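Dimension hierarchies can likewise be listed in bulk via an SSAS 2008 dynamic management view (placeholder cube name):

```sql
-- List dimensions and hierarchies exposed by the cube
SELECT DIMENSION_UNIQUE_NAME, HIERARCHY_NAME
FROM $SYSTEM.MDSCHEMA_HIERARCHIES
WHERE CUBE_NAME = 'YourCube'   -- placeholder cube name
```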

3. Validate all calculated measures

  • Open the Analysis Services project in Visual Studio (BIDS)
  • Browse to and open the cube, then go to the "Calculations" tab
  • Check the expression for the calculated measure <calculated measure name>

    Expected Result

    The MDX expression defined should be accurate as per your requirements.

     Eg. [Measures].[ChangePoint Total Backlog] + [Measures].[Siebel High Pipe] + [Measures].[Siebel Low Pipe]

4. Validate “Data Source Views” of your cube against your design


 

1.4 Security Testing Scenarios:

1. Ensure each user belonging to a cube role has appropriate access


2. A user with Read permission should only have access to browse the cube

Read Definition checkbox should be selected


3. A user with Process permission should be able to process the cube as well

Process Database checkbox should be selected

4. A user with Admin permission should be able to browse and process the cube, and to make changes to it as well

Full control (Administrator) checkbox should be selected

5. Cube roles should be mapped to the correct users and groups


6. Cube roles should have restricted or unrestricted access to dimension data, based on the design and project needs

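One way to exercise dimension security is to connect under the role being tested (for example, via the Roles connection-string property in SSMS, or logged in as a member of that role) and run a simple MDX query; members outside the role's allowed set should not come back. All names below are hypothetical:

```
-- Connect with the connection-string property Roles=<RoleName>,
-- then run a query that spans the restricted dimension:
SELECT [Measures].[Sales] ON COLUMNS,
       [Geography].[Region].Members ON ROWS
FROM [SalesCube]
-- Regions the role is not allowed to see should be absent from ROWS.
```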

1.5 Miscellaneous Scenarios:

1. Backup and Restore:

Take a backup of the cube and restore it. The cube should continue to work exactly as before.
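The backup and restore can be scripted as XMLA commands (run from an XMLA query window in SSMS); the database ID and file names below are placeholders:

```xml
<!-- Back up the cube database -->
<Backup xmlns="http://schemas.microsoft.com/analysisservices/2003/engine">
  <Object>
    <DatabaseID>SalesCubeDB</DatabaseID>
  </Object>
  <File>SalesCubeDB.abf</File>
</Backup>
```

and then restore it:

```xml
<!-- Restore from the backup file -->
<Restore xmlns="http://schemas.microsoft.com/analysisservices/2003/engine">
  <File>SalesCubeDB.abf</File>
  <DatabaseName>SalesCubeDB</DatabaseName>
  <AllowOverwrite>true</AllowOverwrite>
</Restore>
```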

1.6 Performance Scenarios:

1. Optimize Cube Aggregations

Before running your stress tests, ensure that your OLAP design has optimized aggregations. To optimize aggregations, first turn on SQL Server Analysis Services query logging, then run the Usage-Based Optimization Wizard.
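Once query logging is enabled, the captured queries land in the query log table (dbo.OlapQueryLog by default) in the relational database named by the server's query log connection string. A quick look at the slowest captured queries:

```sql
-- Inspect the query log written by SSAS (default table name)
SELECT TOP 20 MSOLAP_User, Dataset, StartTime, Duration
FROM dbo.OlapQueryLog
ORDER BY Duration DESC
```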

2. Run stress tests using a load simulator

2 comments:

Mohammad Ashik Elahi (scornik) said...

Hello,

This post proved to be very helpful for me.

Do you know any information about converting an MDX query to a SQL query?

regards
Ashik

Anonymous said...

For the scenarios named "Data Validation Scenarios", "Cube Design Scenarios" and "Performance Scenarios", you can use NBi for automation, or just to make things easier. This framework is compatible with NUnit and doesn't require C# skills (not even a compiler). It's open-source and free. You should check it out at http://nbi.codeplex.com