
Reduce Costs and Deliver Faster Results

Authors: Huib Oudemast, René van Kooten
Published: September 2012

7 Features of DDM Studio 3 for SQL Server that will Reduce Costs and Deliver Faster Results

Summary: The difficult economic climate has created new priorities among the challenges that most organizations have to confront. A few years ago, increasing Return on Investment (ROI) seemed to be the main driver, where the investment was immediate and the elusive return was often spread over a number of years. Today many organizations have to show ROI over an ever shorter term, which can only be achieved by a lower investment, that is lower costs, and a more timely return, usually within the same financial year. Both lower costs and prompter returns are attainable through proven methodologies and maximized efficiency in applying them. This white paper provides an overview of how DDM Studio 3 for SQL Server (DDM Studio) can help organizations acquire a powerful Business Intelligence solution at low cost and in record delivery time.


This is a preliminary document and may be changed substantially prior to final commercial release of the software described herein.

The information contained in this document represents the current view of OCEAN BI on the issues discussed as of the date of publication. Because OCEAN BI must respond to changing market conditions, it should not be interpreted to be a commitment on the part of OCEAN BI, and OCEAN BI cannot guarantee the accuracy of any information presented after the date of publication.


Complying with all applicable copyright laws is the responsibility of the user. Without limiting the rights under copyright, no part of this document may be reproduced, stored in or introduced into a retrieval system, or transmitted in any form or by any means (electronic, mechanical, photocopying, recording, or otherwise), or for any purpose, without the express written permission of Ocean BI.

OCEAN BI may have patents, patent applications, trademarks, copyrights, or other intellectual property rights covering subject matter in this document. Except as expressly provided in any written license agreement from OCEAN BI, the furnishing of this document does not give you any license to these patents, trademarks, copyrights, or other intellectual property.

©2010 Ocean BI. All rights reserved.

Ocean BI and DDM Studio are either registered trademarks or trademarks of Ocean BI and/or affiliated companies in The Netherlands and/or other countries.

The names of actual companies and products mentioned herein may be the trademarks of their respective owners.


Huib Oudemast has been working as a consultant since 1996. He was responsible for the implementation of Enterprise Resource Planning (ERP) systems at five companies before specializing in Business Intelligence and data warehousing. Huib is the founding architect of DDM Studio.

Quote: “Simplicity is the key for success”.

Huib is bilingual in English and Dutch. He is married, has two sons and one daughter, and resides with his family in IJmuiden.

You can contact Huib on

René van Kooten worked for almost two decades at a healthcare organization in a variety of IT disciplines before joining Ocean BI in 2007.

Throughout his career René has always been involved with reporting demands. Since the mid-nineties he has been dedicated to building and maintaining data warehouses.

Quote: “A data warehouse is at most 30% of the total BI solution”.

René is bilingual in English and Dutch. He is married, has two daughters and resides with his family in Amsterdam.

You can contact René on rvankooten@localhost

Management Summary

In today’s difficult economic climate, most organizations are confronted by the challenge of reduced resources, which may assume a different form for different types of organizations. For private sector organizations, it may mean slower investments and lower sales; for public sector organizations, tighter and more controlled funding; for non-profit organizations, decreased funding and smaller donations.

Across all sectors, organizations of all sizes are prioritizing cutting costs quickly above anything else, even above supporting new opportunities to drive revenue.

In many boardrooms the survival instinct to protect profits places immediate cost slashing at the top of the agenda. That often translates into quickly cutting IT budgets. At the same time, there is an extraordinary opportunity to maximize the advantage from existing IT resources and to invest in proven technologies and methodologies that maintain a long-term perspective while taking short-term steps to meet current economic challenges.

DDM Studio represents a proven technology and methodology for acquiring a powerful Business Intelligence solution that will reduce costs and deliver faster results. DDM Studio empowers organizations to deliver mission-critical Data Warehousing and Business Intelligence solutions at a fraction of the cost and time of competing solutions. It enables organizations to build dynamic new solutions at the lowest cost and in record time, and to extend existing systems in ways that permit them to seize new business opportunities.

This white paper describes 7 ways in which DDM Studio can immediately help your organization meet today’s financial challenges and prepare for tomorrow’s opportunities by reducing costs and delivering faster results.

Challenges Facing Organizations

The current economic climate has imposed a policy of cost reduction on all types of organizations.

Organizations in the private sector may consider reducing costs a necessary short-term measure to survive an increasingly competitive business environment. Organizations in the public sector may regard cutting costs as an indispensable expedient to endure more limited budgets.

However, the indiscriminate practice of slashing costs, even as a short-term recourse, can cause long-term consequences ranging from harmful to disastrous.

On the one hand, a public sector organization may indiscriminately cut costs in one area of its domain; this may not only disproportionately diminish the benefit provided to citizens, but may subsequently also become the source of even greater costs to rectify the damage.

A private sector organization, on the other hand, may arbitrarily slash certain costs in one aspect of its business; this may not only damage the organization’s potential for continued growth, but in turn even jeopardize its chances of surviving and thriving in the future.

When facing the dilemma of costs rationalization, an organization should in the first
instance look at how it can make the most of what is already there by maximizing its
existing investments.

In the second instance, an organization should prioritize strategic new investments only if they will bring a tangible profit, which in the case of the private sector translates into returns and proceeds, and in the case of the public sector signifies advantage and benefit to citizens.

When deciding where to cut costs, organizations need to prioritize those areas of technology that will yield the greatest return within the context and challenges of their own sector. These include the following:

  • The explosive growth of data experienced by organizations in recent years is not showing any sign of slowing down. In a tough economic climate it is increasingly important for an organization to make sense of its data for effective decision support.
  • Organizations that not only survive but sometimes thrive in difficult times are those that respond in the most agile manner to unstable and inconsistent circumstances. That increases the need for short go-live times for business intelligence solutions that analyze organizational data in new and relevant ways.
  • In difficult economic conditions, organizations and units within organizations tend to merge more frequently. The ability to quickly and easily integrate data coming from heterogeneous systems and aggregate it at a higher level is a critical success factor.

Reduce Costs and Deliver Faster Results with DDM Studio 3 for SQL Server

So, how exactly can DDM Studio help reduce costs and deliver faster results? This section lists 7 features of DDM Studio that you can use to start saving time and money now.

1. Metadata-Driven Repository

In order to make a generic solution possible, DDM Studio stores all aspects of a data warehouse as metadata in the DDM Studio Repository (the DDM Metadata Database).

For instance, the definitions of the dimensions and facts of the star schemas, as well as data transformation and data enrichment definitions, are stored as comprehensive metadata in the DDM Studio Repository.

By letting DDM Studio take care of all technical aspects and systematically store them in its Repository, you can concentrate on the business aspects of your Business Intelligence solution.

DDM Studio works seamlessly in the background to ensure that your Business Intelligence solution is built with a proven methodology and effective technology.

Because the initial build and successive changes to your project are created and re-created from the DDM Studio Repository, you save enormous amounts of time and your project can be completed even within what others would consider impossible deadlines.

2. Unified Dimension Loading

Within the framework outlined in the previous point, DDM Studio enables you to use a unified solution to load data for dimensions.

DDM Studio supports Kimball Update Types 1, 2, 3 and also the various combinations of these update types by using the dimension metadata stored in the DDM Studio Repository.
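To make the three Kimball update types concrete, the following is a minimal illustrative sketch in Python, not DDM Studio code: dimension records are plain dicts, and the function names, attribute names, and the `load_date` parameter are assumptions chosen for the example.

```python
from datetime import date

def apply_kimball_update(existing, incoming, update_type, load_date=date(2010, 1, 1)):
    """Apply one Kimball update to a dimension record (a plain dict).

    Type 1 overwrites the attribute and keeps no history; Type 2 expires
    the current record and inserts a new version; Type 3 preserves the
    previous value in a separate "Prior" column.
    """
    if update_type == 1:
        # Type 1: overwrite in place; no history is kept.
        updated = dict(existing)
        updated.update(incoming)
        return [updated]
    if update_type == 2:
        # Type 2: close the current record and add a new version.
        expired = dict(existing)
        expired["EndDate"] = load_date
        current = dict(existing)
        current.update(incoming)
        current["StartDate"] = load_date
        current["EndDate"] = None
        return [expired, current]
    if update_type == 3:
        # Type 3: keep the prior value alongside the new one.
        updated = dict(existing)
        for column, value in incoming.items():
            updated["Prior" + column] = existing[column]
            updated[column] = value
        return [updated]
    raise ValueError("unsupported Kimball update type: %s" % update_type)
```

A Type 2 update of a city record thus yields two rows: the expired version and the new current version, which is exactly the behavior a generic, metadata-driven dimension loader must reproduce for every dimension.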

3. Generic Fact Loading

As when loading dimensions, DDM Studio enables you to use one generic and comprehensive solution to load data for first-level facts.

For example, you can load facts with transactional records with history, transactional records without history, and transactional snapshots by using metadata stored in the DDM Studio Repository.

By using a methodology that can be generically applied to any requirement and a technology that consistently applies best practices, the part of your Business Intelligence solution that creates and loads dimensions and facts is built in record time.

4. Role-Playing Dimensions

There are situations where the same dimension is associated with more than one fact attribute.

For instance, you may have the Date dimension that is related to the attributes SaleDate, PurchaseDate, and DeliveryDate in the Sales, Purchases and Deliveries fact tables respectively.

In order to maintain Role-Playing Dimensions in a generic solution, DDM Studio stores their relevant metadata in the DDM Studio Repository.

In the current example, DDM Studio stores metadata in the DDM Metadata Database to record that the Date dimension is related to the attribute SaleDate in the Sales fact table, PurchaseDate in the Purchases fact table, and DeliveryDate in the Deliveries fact table.

Subsequently, this metadata is used by DDM Studio during the automated creation of the views for Role-Playing Dimensions, so that what would otherwise be a complicated and difficult-to-manage solution is built consistently and in the shortest possible time.
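The idea of generating role-playing views from metadata can be sketched as follows. This is an illustration only: the function name, the view-naming scheme, and the metadata shape are assumptions for the example and not DDM Studio’s actual conventions.

```python
def role_playing_views(dimension_table, roles):
    """Generate CREATE VIEW statements exposing one physical dimension
    table under several role names.

    `dimension_table` is the physical table (e.g. "DimDate");
    `roles` maps each role name to the fact table it serves, mirroring
    the kind of metadata a repository could hold.
    """
    statements = []
    for role in sorted(roles):
        # One view per role, all reading the same physical table.
        statements.append(
            "CREATE VIEW Dim{role} AS SELECT * FROM {table};".format(
                role=role, table=dimension_table
            )
        )
    return statements
```

Given the Date example above, roles SaleDate, PurchaseDate, and DeliveryDate would each yield a view over the single DimDate table, so the physical dimension is maintained only once.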

5. Late-Arriving Dimensions

DDM Studio provides a completely automated solution for the support of Late-Arriving Dimensions that further decreases the time to build your Business Intelligence solution.

It is possible that a record loaded into a fact table contains a dimension value that does not yet exist in the relevant dimension table.

For example, a sales record containing a ProductCode equal to “MS501” is being loaded into the Sales fact table, but the Products dimension table does not contain a record for that product code.

DDM Studio then inserts a record into the relevant dimension table with default values for its attributes and marks it as a Late-Arriving Dimension record.

In the current example, a record is added to the Products dimension table with ProductCode equal to “MS501” and its other attributes set to their default values; for instance, ProductDescription is set to “Product description from manufacturing catalog.”

In a successive load of that dimension, the actual values for the attributes of that dimension record are set by applying a Kimball Update Type 1.

In our example, the ProductDescription for ProductCode equal to “MS501” in the Products dimension table will be updated to its actual value, for instance “MiniSonde 501.”
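The placeholder-insert step can be sketched as below. Assumptions are flagged: the dimension is modeled as a dict keyed by business key, and the `IsLateArriving` flag and function name are invented for the example rather than taken from DDM Studio’s schema.

```python
def ensure_dimension_member(dimension, business_key, defaults):
    """Insert a placeholder for a late-arriving dimension value.

    `dimension` maps business keys to attribute dicts. If an incoming
    fact references a key that is not present yet, a record is added
    with default attribute values and flagged as late-arriving; a later
    dimension load overwrites the defaults (Kimball Type 1 behavior).
    """
    if business_key not in dimension:
        record = dict(defaults)
        record["IsLateArriving"] = True  # marker for the next dimension load
        dimension[business_key] = record
    return dimension[business_key]
```

With this in place the fact load never fails on an unknown ProductCode: the fact row gets a valid dimension member immediately, and the descriptive attributes catch up on the next dimension load.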

6. Automatic Surrogate Keys

DDM Studio automatically generates surrogate keys for dimensions, which allows you to spend more time on business matters rather than on technical aspects of the solution, which are applied consistently and effectively under the hood.

A surrogate key is a unique identifier that can be used as the primary key of a table; surrogate keys are particularly convenient in dimension tables.

For example, the Cities source table may have the CityCode column as the primary key. Let’s suppose that a record in that table has CityCode equal to “AMS” and the CityName attribute equal to “Amstelveen”. And let’s suppose that starting from 1 January 2010 that CityCode is reassigned to “Amsterdam”.

If we used the CityCode column also as the primary key in the Cities dimension table, it would not be possible to distinguish when “AMS” means “Amstelveen” and when it means “Amsterdam”.

Instead, by using a surrogate key in the Cities dimension table it is possible to have two separate records – one stating that the CityCode “AMS” means “Amstelveen” until the 31 December 2009, and another one stating that CityCode “AMS” means “Amsterdam” from 1 January 2010.

Similar examples can be found for parent-child relationships, with results that can be even more damaging and at times disastrous if left unresolved.

For instance, let’s suppose that the Clients source table contains a record with the attributes ClientCode equal to “C444” and SalesManagerCode equal to “SM01”, which corresponds to “John Smith”. Let’s suppose that starting from 1 January 2010 the SalesManagerCode for that client is changed to “SM02”, which corresponds to “Carol Green”.

If we use the ClientCode column also as the primary key in the Clients dimension table, all sales made by John Smith in 2009 are probably going to be assigned to Carol Green.

Instead, by using a surrogate key in the Clients dimension table it is possible to have two separate records for the client with ClientCode equal to “C444” – one with John Smith as their Sales Manager until 31 December 2009, and another one with Carol Green as their Sales Manager from 1 January 2010.

These examples illustrate that it is in general possible to retrieve the correct surrogate key by using the business key of the dimension (e.g. ClientCode) and the main date of the fact (e.g. SalesDate).

When surrogate keys are used, every dimension record contains the business key as one of its attributes, together with a start date and an end date that determine the interval during which the values of the dimension attributes are valid.
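The lookup described above, from business key plus fact date to surrogate key, can be sketched in a few lines. The row structure (a list of dicts with BusinessKey, StartDate, EndDate, SurrogateKey) is an assumed illustration, not DDM Studio’s physical schema.

```python
from datetime import date

def resolve_surrogate_key(dimension_rows, business_key, fact_date):
    """Return the surrogate key whose validity interval covers fact_date.

    The current version of a dimension member is represented with an
    open-ended interval (EndDate is None). Returns None if no version
    of the business key is valid on the given date.
    """
    for row in dimension_rows:
        if row["BusinessKey"] != business_key:
            continue
        starts = row["StartDate"] <= fact_date
        ends = row["EndDate"] is None or fact_date <= row["EndDate"]
        if starts and ends:
            return row["SurrogateKey"]
    return None
```

Applied to the “AMS” example: a sale dated in 2009 resolves to the Amstelveen record and a sale dated in 2010 to the Amsterdam record, even though both carry the same business key.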

Surrogate keys are also used for transactional fact tables. A clustered index is built on this key, which is one of the key components in supporting Microsoft Fast Track Data Warehouse.

Note that using surrogate keys can also improve performance. First, surrogate keys avoid primary keys composed of multiple columns, which can lead to SQL statements with an excessive number of joins. Second, an index on the numeric values of a surrogate key will in general perform better than an index on alphanumeric values.

7. Fine-Grain Logging

Microsoft SSIS provides logging at a granularity associated with packages, configured at the package level.

Besides the standard SSIS logging, DDM Studio allows a Fine-Grain Logging that provides more detailed operational information.

Some examples of Fine-Grain Logging include the following:

  • More extensive information is provided on the execution of the ETL processes for each individual dimension and fact.
  • For a dimension load, the log provides the number of new records and the number of changed records depending on Kimball type 1, 2, or 3.
  • The log provides extensive information when a source table structure has changed.
  • Every executed SQL statement is logged, when the relevant configuration setting in DDM Studio is turned on.

Fine-Grain Logging provides extra information to understand what DDM Studio does at the functional level.

While building a solution, Fine-Grain Logging is an invaluable tool that helps developers and testers pin-point the cause of an error and solve any problem in record time.
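A per-dimension load summary of the kind listed above can be sketched as follows; the message format, function name, and logger name are illustrative assumptions, not DDM Studio’s actual log layout.

```python
import logging

def log_dimension_load(name, new_count, changed_by_type,
                       log=logging.getLogger("etl")):
    """Emit a fine-grain summary for one dimension load: the number of
    new records plus changed records broken down by Kimball update type.
    Returns the formatted message so callers (or tests) can inspect it.
    """
    parts = ["dimension=%s" % name, "new=%d" % new_count]
    for kimball_type in sorted(changed_by_type):
        parts.append("type%d_changed=%d" % (kimball_type, changed_by_type[kimball_type]))
    message = " ".join(parts)
    log.info(message)
    return message
```

A single line per dimension load, with counts split out by update type, is exactly the level of detail that lets a developer or tester see at a glance whether a load behaved as expected.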

Latest DDM Studio Features

The DDM Studio solution is constantly improving. The latest version of DDM Studio is version 3.01, which contains the following newly added features, among others.

Multiple Enterprises

By implementing “enterprise” as a separate entity type, it is possible to build and maintain the data warehouses of multiple enterprises on one SQL Server instance.

Flexible Database Names and Source Metadata

Whereas the names of the data warehouse databases were fixed in previous versions of DDM Studio, they are now flexible and are assigned by the development team. The physical aspects of the databases (file locations, filegroups) are also stored as metadata in the DDM Studio Repository. The name of the DDM Studio Repository itself, however, remains fixed.

The metadata of the source objects is now also stored in the DDM Studio Repository, and the Staging Area Delta objects are based on it. This makes it possible to trace the data in the data warehouse back to its source at the metadata level.

Staging Area Delta Objects

Because building a dimensionally modeled data warehouse has become a simple activity, the focus shifts to getting the source data into a format that can be loaded into the dimensionally modeled data warehouse. This is done by building Staging Area Delta (SAD) objects and by defining dependencies between them. The complexity is determined by the characteristics of the source data: for instance, how well modeled is the source transactional system? The DDM Studio Workmethod gives a number of guidelines for building SAD objects that are also maintainable.

History in Source Data

Data sources can contain history, which should be taken into account while loading the data warehouse. DDM Studio 2 can deal with the presence of a start date and an end date in the source data, which determine the valid period of a registration.

DDM Studio 3 can now also deal with a single date in the source data that determines the start of a new registration, for instance the start of a new employment agreement.

Future DDM Studio Features

The foundation of the DDM Studio solution is its support of the Dimensional Modeling methodology of Ralph Kimball (though he prefers to speak of a set of techniques rather than a methodology).

Datavault Support

In the next release of DDM Studio, the Datavault methodology of Dan Linstedt will also be supported. The metadata of Hubs, Links, and Satellites is stored in the DDM Studio Repository, but of course the Datavault still has to be modeled by data modelers.

A choice has to be made between Datavault Classic and Datavault Surrogate Keys, because their ETL processes differ.

Businessvault

Extracting data from the Datavault for reporting and ad-hoc analysis purposes is not easy. For these purposes a Businessvault is built. In the view of the DDM Studio architects, a Businessvault is a dimensionally modeled data warehouse. When the Datavault is built with DDM Studio, DDM Studio can use the metadata of the Datavault components stored in the DDM Studio Repository to generate and build Staging Area Delta objects automatically. Because DDM Studio adds an extra metadata attribute to Satellite tables to mark whether a record has already been loaded into the data warehouse, incremental loads of the Businessvault from the Datavault become possible.
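The loaded-flag mechanism for incremental Businessvault loads can be sketched as follows. The representation is an assumption for illustration: Satellite rows are plain dicts and the flag is a `Loaded` key, whereas DDM Studio stores its marker as a metadata attribute on the Satellite table.

```python
def incremental_rows(satellite_rows):
    """Select Satellite rows not yet loaded into the Businessvault.

    Rows without the loaded marker (or with it set to False) are
    selected and then flagged, so the next run skips them and only
    the delta flows into the Businessvault.
    """
    pending = [row for row in satellite_rows if not row.get("Loaded")]
    for row in pending:
        row["Loaded"] = True  # mark as transferred for the next run
    return pending
```

Run twice over the same Satellite data, the second call returns nothing: that is the property that turns a full reload into an incremental one.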

Conclusion

Today’s difficult economic climate has re-prioritized the challenges that most organizations have to confront. DDM Studio can help you acquire a powerful Business Intelligence solution that reduces costs and delivers faster results.


You can find out more about reducing costs and delivering faster results with DDM Studio 3 for SQL Server in the following resources:




Did this paper help you? Please send us your feedback. On a scale of 1 (poor) to 5 (excellent), how would you rate this paper, and why? For example:

  • Are you rating it high due to having good examples, excellent screen shots, clear writing, or another reason?
  • Are you rating it low due to poor examples, fuzzy screen shots, or unclear writing?

This feedback will help us improve the quality of white papers we release.
Send feedback to info@localhost with the subject “Feedback Whitepaper DDM Studio for SQL Server”.