Monday, 2 June 2008

Data Integration Challenge – Parent-Child Record Sets, Child Updates

Certain special sets of records, like a Loan and its Guarantor details in a banking system, come in pairs: each Loan record can have one or more Guarantor records. Similarly, in a services-based industry, Contracts and their Contract Components exist. These sets can be called parent-child records, where for one parent record, such as a Loan, we might have zero to many child Guarantor records.
During data modeling we would have one table for the parent-level records and their attributes, and a separate table for the child records and their attributes.
As part of the data load process, I have seen situations where a complete refresh (delete & insert) of the child records is required whenever certain attributes of a parent record change. This requirement can be implemented in different ways; here we look at one of the best ways to accomplish it.
The following steps would be involved in the ETL process:
  1. Read the incoming parent-child record set
  2. Determine whether the incoming parent record has changed
  3. If a change has occurred, issue a delete for that parent’s set of child records
  4. Write the corresponding incoming new child records into a flat file
  5. Once steps 1 to 4 are completed for all parent records, have another ETL flow bulk load the records from the flat file into the child table
We don’t issue an insert of the new incoming child records immediately after the delete because the deleted records would not yet have been committed, and an insert could lock the table. We could issue a commit after every delete and then follow it with an insert, but committing after each delete would be costly; writing the inserts to a flat file handles this situation perfectly.
The alternative of inserting first with a different key and then deleting the older records would also be costly, in terms of locating the records that need to be deleted.
We could also have updated the records in place of deleting them, but then we would at times end up with dead records in the child table: records that have been deleted in the source would still exist in the target child table. Updating a record can also disturb contiguous memory, whereas delete and insert keep the pages intact.
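To make the two passes concrete, here is a minimal sketch in Python using the Loan/Guarantor example above; sqlite3 stands in for the warehouse database, and the table and column names (LOAN, GUARANTOR, loan_id, status) are illustrative assumptions rather than any particular system’s schema.

```python
# Minimal sketch of the delete-then-bulk-load pattern described above.
# Assumed schema: LOAN(loan_id, status) and GUARANTOR(loan_id, guarantor_id, name).
import csv
import sqlite3

def refresh_children(conn: sqlite3.Connection, incoming: list[dict], child_file: str) -> None:
    """Steps 1-4: for each changed parent, delete its child set and stage
    the new child rows in a flat file (no insert yet, so the uncommitted
    deletes cannot lock us out of the child table)."""
    with open(child_file, "w", newline="") as f:
        writer = csv.writer(f)
        for rec in incoming:
            row = conn.execute(
                "SELECT status FROM LOAN WHERE loan_id = ?", (rec["loan_id"],)
            ).fetchone()
            # Step 2: has the tracked parent attribute changed?
            if row is not None and row[0] != rec["status"]:
                # Step 3: delete the existing child set for this parent.
                conn.execute("DELETE FROM GUARANTOR WHERE loan_id = ?", (rec["loan_id"],))
                # Step 4: stage the incoming child records in the flat file.
                for g in rec["guarantors"]:
                    writer.writerow([rec["loan_id"], g["guarantor_id"], g["name"]])
    conn.commit()  # a single commit covers all the deletes

def bulk_load_children(conn: sqlite3.Connection, child_file: str) -> None:
    """Step 5: a separate flow bulk loads the staged rows into the child table."""
    with open(child_file, newline="") as f:
        conn.executemany(
            "INSERT INTO GUARANTOR (loan_id, guarantor_id, name) VALUES (?, ?, ?)",
            csv.reader(f),
        )
    conn.commit()
```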


Monday, 19 May 2008

Let’s talk EPM – Part 2 on Metrics Profiling

In my earlier post on Enterprise Performance Management (EPM), I had enumerated the six steps of a practical EPM strategy in an organization. They were:
  1. Business Process Maps – Understand the business process
  2. Metrics Identification – Get hold of the metrics
  3. Metrics Profiling – Understand the metrics in depth
  4. Metrics Maps – Understand the cause and effect relationships between metrics
  5. Metrics Visualization – Implementation of Metric Maps on BI Tools
  6. Watch and Improve – Monitor Metrics and Improve business process as required
It is important to realize that building a data warehouse (enterprise wide) or data mart (functional area wise) or simply an integrated, subject-oriented data repository (without getting lost in semantics!) is implicit in the set of steps outlined above.
Steps 1 and 2 (Business Process and Metrics Identification) are self-explanatory. Though getting hold of the right metrics is easier said than done, it is fairly well understood that the measures/metrics selected for analysis should align with the organization’s mission, business model and value creation aspects.
Step 3 – Metrics Profiling is, in my opinion, the step most often missed in EPM implementations, and arguably a major cause of failures in such programs. Metrics Profiling, stated simply, is a way of understanding your metrics in depth. Given below is a sample template for profiling your metrics, which can be customized for each organization.
Profiling Parameters:
1. Metric Name – Name of the metric
2. Metric Definition – Brief definition of the metric
3. Metric Type – Is it a ratio, absolute number, trended value, etc.?
4. Sources of Data – Identify the source of data for the metric and the owners
5. Application – Brief description of how the metric helps in managing the business better
6. Potentially Affected Metrics – Identify the other metrics that are impacted (positively or negatively) by this metric
7. Example – Provide an example of the metric’s usage. (For example: ABC Computers released three new product lines during the last 12 months, generating $15 million in new revenue out of total annual revenue of $125 million. New Products Index = 15 ÷ 125 = 12%)
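To keep profiles consistent across metrics, the template can be captured as a simple record structure. Below is a minimal Python sketch whose fields mirror the seven parameters above; the New Products Index values come from the example, while the data source owners and affected metrics listed are purely illustrative assumptions.

```python
# A metric profile as a plain data structure; one instance per metric.
from dataclasses import dataclass, field

@dataclass
class MetricProfile:
    name: str                    # 1. Metric Name
    definition: str              # 2. Metric Definition
    metric_type: str             # 3. Metric Type (ratio, absolute number, trended value, ...)
    data_sources: list[str]      # 4. Sources of data, with their owners
    application: str             # 5. How the metric helps manage the business better
    affected_metrics: list[str] = field(default_factory=list)  # 6. Potentially affected metrics
    example: str = ""            # 7. Example of the metric's usage

# The New Products Index example from the template above.
new_products_index = MetricProfile(
    name="New Products Index",
    definition="Share of total revenue from products released in the last 12 months",
    metric_type="ratio",
    data_sources=["Product master (Product Management)", "Revenue ledger (Finance)"],  # assumed owners
    application="Shows how strongly new product lines drive revenue growth",
    affected_metrics=["Total Revenue"],  # illustrative
    example="New Products Index = 15 / 125 = 12%",
)
```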
Metrics Profiling is a very important step in the implementation of an enterprise-wide performance management system. I will discuss the other aspects of EPM in my subsequent posts.
Thanks for reading.

Thursday, 15 May 2008

Data Integration Challenge – Storing Timestamps

Storing timestamps along with a record, to indicate its new arrival or a change in its value, is a must in a data warehouse. We take this for granted, adding timestamp fields to table structures, and tend to miss how much storage space a timestamp field can occupy: in many databases, such as SQL Server and Oracle, a timestamp takes almost double the storage of an integer data type, and if we have two fields, one for the insert timestamp and another for the update timestamp, the required storage doubles again. There are many instances where we could avoid using timestamps, especially when they are used primarily for determining incremental records or stored just for audit purposes.
How do we manage the data storage effectively and still get the benefit of a timestamp field?
One way of managing the storage of timestamp fields is to introduce a process id field and a process table. Following are the steps involved in applying this method, both in the table structures and in the ETL process.
Data Structure
  1. Consider a table named PAYMENT with two fields of timestamp data type, INSERT_TIMESTAMP and UPDATE_TIMESTAMP, used for capturing the changes for every record present in the table
  2. Create a table named PROCESS_TABLE with columns PROCESS_NAME Char(25), PROCESS_ID Integer and PROCESS_TIMESTAMP Timestamp
  3. Now drop the fields of the TIMESTAMP data type from table PAYMENT
  4. Create two fields of integer data type in the table PAYMENT, named INSERT_PROCESS_ID and UPDATE_PROCESS_ID
  5. These newly created id fields INSERT_PROCESS_ID and UPDATE_PROCESS_ID would be logically linked to the table PROCESS_TABLE through its field PROCESS_ID
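A minimal sketch of the resulting structures follows, using sqlite3 as a stand-in database; the business columns in PAYMENT (PAYMENT_ID, AMOUNT) are illustrative assumptions.

```python
# Steps 1-5 above as DDL: PAYMENT carries integer process ids instead of
# two timestamp columns, and PROCESS_TABLE holds the actual timestamps.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE PROCESS_TABLE (
    PROCESS_NAME       CHAR(25),
    PROCESS_ID         INTEGER PRIMARY KEY,  -- populated from a sequence
    PROCESS_TIMESTAMP  TIMESTAMP
);

CREATE TABLE PAYMENT (
    PAYMENT_ID         INTEGER,        -- illustrative business columns
    AMOUNT             DECIMAL(12,2),
    INSERT_PROCESS_ID  INTEGER,        -- replaces INSERT_TIMESTAMP
    UPDATE_PROCESS_ID  INTEGER         -- replaces UPDATE_TIMESTAMP
);
""")
```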
ETL Process
  1. Let us consider an ETL process called ‘Payment Process’ that loads data into the table PAYMENT
  2. Now create a pre-process that runs before the ‘Payment Process’; in the pre-process, build the logic to insert a record with values like (‘payment process’, sequence number, current timestamp) into the PROCESS_TABLE. The PROCESS_ID in the PROCESS_TABLE could be generated by a database sequence function.
  3. Pass the current process id from the pre-process step to the ‘Payment Process’ ETL process
  4. In the ‘Payment Process’, if a record is to be inserted into the PAYMENT table then the current process id value is set in both the columns INSERT_PROCESS_ID and UPDATE_PROCESS_ID; if a record is being updated in the PAYMENT table then the current process id value is set only in the column UPDATE_PROCESS_ID
  5. So now the timestamp values for the records inserted or updated in the table PAYMENT can be picked from the PROCESS_TABLE by joining its PROCESS_ID with the INSERT_PROCESS_ID and UPDATE_PROCESS_ID columns of the PAYMENT table
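Continuing the sketch above, the pre-process, the stamping logic and the audit join of steps 2 to 5 might look like the following; the row format fed to the load is an assumption for illustration.

```python
# Pre-process, load and audit query for the process-id approach.
from datetime import datetime

def pre_process(conn, process_name: str) -> int:
    """Step 2: register the run in PROCESS_TABLE and return its process id."""
    cur = conn.execute(
        "INSERT INTO PROCESS_TABLE (PROCESS_NAME, PROCESS_TIMESTAMP) VALUES (?, ?)",
        (process_name, datetime.now().isoformat()),
    )
    return cur.lastrowid  # sqlite's rowid stands in for a database sequence

def load_payment(conn, process_id: int, rows) -> None:
    """Steps 3-4: inserts get both ids; updates get only UPDATE_PROCESS_ID."""
    for payment_id, amount, is_new in rows:
        if is_new:
            conn.execute(
                "INSERT INTO PAYMENT VALUES (?, ?, ?, ?)",
                (payment_id, amount, process_id, process_id),
            )
        else:
            conn.execute(
                "UPDATE PAYMENT SET AMOUNT = ?, UPDATE_PROCESS_ID = ? WHERE PAYMENT_ID = ?",
                (amount, process_id, payment_id),
            )

# Step 5: recover the timestamps through a join instead of storing them per row.
AUDIT_SQL = """
SELECT p.PAYMENT_ID,
       ins.PROCESS_TIMESTAMP AS INSERT_TS,
       upd.PROCESS_TIMESTAMP AS UPDATE_TS
FROM PAYMENT p
JOIN PROCESS_TABLE ins ON ins.PROCESS_ID = p.INSERT_PROCESS_ID
JOIN PROCESS_TABLE upd ON upd.PROCESS_ID = p.UPDATE_PROCESS_ID
"""
```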
Benefits
  • The fields INSERT_PROCESS_ID and UPDATE_PROCESS_ID occupy less space when compared to the timestamp fields
  • Both the columns INSERT_PROCESS_ID and UPDATE_PROCESS_ID are Index friendly
  • It’s easier to handle these process id fields when picking records to determine incremental changes or for any audit reporting

Monday, 28 April 2008

Let’s talk EPM – Part 1

Welcome to the world of Enterprise Performance Management (EPM), considered the Holy Grail of Business Intelligence. EPM and its various manifestations, creatively named Business Performance Management (BPM), Corporate Performance Management (CPM), etc., constitute a set of processes that help organizations optimize their business performance.
Does it sound good? – Of course, yes! Show me an organization that does not want to optimize!!
Does it sound practical? – Not really! Don’t know where to start!!
EPM means many things to many people – optimization of business performance can mean optimization at the business process level (local optima), optimization at the organizational level (global optima), and many flavors in between.
With many BI vendors jumping onto the EPM bandwagon, the problem is that EPM is immediately equated to the solutions provided by tools like Business Objects, Cognos, Hyperion, etc. That view, in my opinion, is far removed from the truth.
In this series of posts, I would like to share some thoughts on making EPM a practical reality in organizations. To start with, let me enumerate the components of an EPM strategy:
  1. Business Process Maps – Understand the business process
  2. Metrics Identification – Get hold of the metrics
  3. Metrics Profiling – Understand the metrics in depth
  4. Metrics Maps – Understand the cause and effect relationships between metrics
  5. Metrics Visualization – Implementation of Metric Maps on BI Tools
  6. Watch and Improve – Monitor Metrics and Improve business process as required
A keen observer will immediately realize that implementing EPM involves a lot more pen & paper work (substitute your favorite analysis tool here!) before technology comes into the picture. Also, in my opinion, there is no silver bullet – no single metric map can fit companies across industries or even within the same industry. The EPM framework for an organization has to evolve in phases based on the company’s growth, its corporate vision, the important numbers at different stages, and so on; in other words, ‘EPM is very personal to an organization’.
EPM, for a BI practitioner, represents a convergence of many things –
  • Domain Understanding
  • Quantitative Play
  • BI Tool capability
  • Closed-Loop BI Architecture
  • Knowledge of proven methodologies like Six Sigma, Balanced Scorecard etc.
I will try to explain some of the interesting aspects of an EPM strategy, like Metrics Profiling and Metrics Maps, in the next few posts. Meanwhile, you can take a look at resources like this one (http://www.dmreview.com/issues/20050501/1026062-1.html) to understand the ‘big picture’ with respect to Enterprise Performance Management.
Thanks for reading!

Tuesday, 15 April 2008

Using Analytic Hierarchy Process (AHP) for BI Tool Evaluation

Enterprise wide BI architecture utilizes a multitude of tools within its landscape, each serving a specific functionality – Extract, Transform and Load (ETL), Data Cleansing, Metadata Management, Databases (both relational and multidimensional), Reporting and Analytics (OLAP), Data Mining, etc. For example, in the OLAP area alone, there are more than 40 different products that can potentially solve a customer problem. You can imagine the number of combinations possible when the tool options are combined across the overall landscape. This makes tool evaluation one of the most challenging and vexing problems in the Business Intelligence domain.
Tool evaluation and selection has become strategic to the implementation of enterprise wide Business Intelligence. Traditionally, tool selection involved comparing the technical features of the tools, looking at demos by product vendors, reading industry reports, getting word-of-mouth referrals and then taking a final decision. In my humble opinion – that is not sufficient any more.
Technical features, though important, cannot be the definitive criteria for selecting a particular tool. More crucial than technical features is what I term as the “Business Fitment Index”. The selected tool should fit with the characteristics of the business process prevalent in the organization and should take into account the requirements of different classes of users. The concept of Business Fitment can be classified as a Multi Criteria Decision Making (MCDM) problem and one of the powerful tools in this category is the Analytic Hierarchy Process (AHP).
AHP is a systematic procedure that helps to:
  1. Represent the elements of any problem, breaking it down into smaller constituents
  2. Assign weightages to each constituent by following a pairwise comparison technique
  3. Leverage expert judgment and intuitive feel into a coherent framework for problem solving
Though AHP can be used in many situations, Hexaware’s BI practice has perfected the art of leveraging its power in the realm of “BI Tools Evaluation”. There are three steps to calculating the Business Fitment Index using AHP, and a small worked sketch follows the steps below.
Step 1 – Pair-wise comparison of business parameters by customer stakeholders is done in this step. The parameters can be things like – Real Time Data Integration, Data Volumes, Data Quality, Business Rules Flexibility etc.
Step 2 – Relative ranking of Business Parameters based on the AHP (Analytic Hierarchy Process) technique
Step 3 – Each of the short-listed tools is evaluated against the business parameters and a final rating is arrived at, taking into account the organization readiness factors
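As a rough illustration of the arithmetic behind Steps 1 to 3, here is a minimal Python sketch using the common geometric-mean approximation of AHP’s principal eigenvector; the parameters, the pairwise judgments and the simple weighted-sum fitment score are illustrative assumptions, not Hexaware’s actual model.

```python
# AHP priority weights from a reciprocal pairwise-comparison matrix,
# via the geometric mean of each row (an approximation of the
# principal eigenvector), plus a weighted-sum fitment score.
from math import prod

def ahp_weights(pairwise: list[list[float]]) -> list[float]:
    """Step 2: derive relative weights from pairwise comparisons."""
    geo_means = [prod(row) ** (1 / len(row)) for row in pairwise]
    total = sum(geo_means)
    return [g / total for g in geo_means]

def fitment_index(tool_scores: list[float], weights: list[float]) -> float:
    """Step 3: score one tool against the weighted business parameters."""
    return sum(s * w for s, w in zip(tool_scores, weights))

# Saaty's 1-9 scale: pairwise[i][j] says how strongly parameter i is
# preferred over parameter j; pairwise[j][i] holds the reciprocal.
params = ["Real Time Data Integration", "Data Volumes", "Data Quality"]
pairwise = [
    [1,   3,   1/5],
    [1/3, 1,   1/7],
    [5,   7,   1  ],
]
weights = ahp_weights(pairwise)
for name, w in zip(params, weights):
    print(f"{name}: {w:.2f}")
print("Tool A fitment:", round(fitment_index([0.6, 0.8, 0.9], weights), 2))
```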
Bottom-line is that the technical features of the tools have to be taken in conjunction with the fitment level of tool to the characteristics of the business. That alone would ensure the success of the tool for enterprise wide BI initiatives.
AHP is a simple yet powerful way of arriving at a decision by consensus. There are wide ranging applications of AHP in BI and this is a great area for practitioners to get interested. If you have some thoughts on other applications of AHP in the BI world, please do share it with us. Thanks for reading!