Monday, 28 January 2008

Business Intelligence and Six Sigma

I just finished a Six Sigma project and was left wondering why BI practitioners are not using more of that Six Sigma power in Business Intelligence. Let me delve into this subject a bit more.
The Six Sigma project that I just completed was on “Developing a Function Point based estimation model for ETL loads”. Essentially, I was facing a lot of problems in estimating the effort for ETL (in this case, Informatica) loads, which led to “effort variances” beyond specified limits. So we kicked off a Six Sigma project with the following DMAIC phases:
1. Define – Definition of the problem (Ex: Estimation process is out of whack)
2. Measure – We measured the effort variances before the start of the project and also set ourselves a target of where it should be.
3. Analyze – Analyzed the root cause of the problem. The solution was to let go of the complexity-based estimation done initially and to adopt Function Points. In fact, this FP-based estimation model was presented at the International Software Estimation Colloquium last year and won the runner-up prize (http://www.qaiasia.com/Conferences/sec2007/leadership.htm)
4. Improve – Based on a pilot within the project, a Function Points based linear regression model was arrived at (a minimal sketch follows this list) and the team was educated on the estimation process. The improvements to the estimation process (effort variances) were measured on a regular basis.
5. Control – Periodic checks to ensure the institutionalization of the process and also fine-tune wherever necessary.
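To make the Improve phase concrete, here is a minimal sketch of what a Function Points based linear regression estimator could look like. The historical data points and fitted coefficients below are invented for illustration and are not the actual model from the project:

```python
# Hedged sketch: fit effort = slope * FP + intercept by ordinary least squares.
# The historical figures below are illustrative only.
import numpy as np

function_points = np.array([12, 25, 40, 55, 70, 90])    # FP counted per ETL load
actual_effort   = np.array([30, 55, 85, 110, 140, 175])  # person-hours actually spent

slope, intercept = np.polyfit(function_points, actual_effort, deg=1)

def estimate_effort(fp: float) -> float:
    """Estimate effort (person-hours) for a new ETL load from its function points."""
    return slope * fp + intercept

print(f"Estimated effort for a 48 FP load: {estimate_effort(48):.1f} person-hours")
```

Once such a model is fitted, the effort variance tracked in the Measure and Control phases is simply (actual minus estimated) divided by estimated, for each completed load.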
That, in a nutshell, is what my Six Sigma project was all about. Basically, Six Sigma improves process efficiency by following the phases mentioned above.
Now let’s see the connection to Business Intelligence. Analytics, at this stage of evolution (in the majority of organizations), is used to find the improvement area at a given point in time. The improvement area can be a problem (Ex: a trend chart showing that sales in the West region have dropped by 10% in each of the last 3 quarters) or an opportunity (Ex: the market potential for a product is huge and our share is small). BI is reasonably good at providing this information and it will only get better. But BI by itself does not enforce the process / execution rigor that successful organizations require.
To summarize, Six Sigma needs an improvement opportunity as the starting point to unleash its power to improve processes. BI generates a lot of these opportunities with its DW/Reporting/Analytics components but does not enforce the rigor of process implementation. I feel there is a lot of synergy in bringing the two together – Six Sigma the left hand and BI the right hand, which when brought together can earn a lot of claps in the quest to create learning, performing organizations.
Just to sample the power of Six Sigma techniques, please take a look at the following link: http://www.kaushik.net/avinash/2007/01/excellent-analytics-tip-9-leverage-statistical-control-limits.html, which illustrates the use of control charts (one of Six Sigma’s potent tools) in metrics / KPI management. Fascinating!
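The core of the technique is simple enough to sketch in a few lines: compute the mean and standard deviation of a KPI series and flag any observation outside mean ± 3 standard deviations. The weekly figures below are made up for illustration:

```python
# Hedged sketch of statistical control limits for a KPI series.
import statistics

weekly_kpi = [2.1, 2.3, 1.9, 2.2, 2.0, 2.1, 2.4, 1.8, 2.2, 2.0,
              2.3, 1.9, 2.1, 2.2, 2.0, 5.4, 2.1, 2.3, 1.9, 2.2]

mean = statistics.mean(weekly_kpi)
stdev = statistics.stdev(weekly_kpi)
ucl = mean + 3 * stdev   # upper control limit
lcl = mean - 3 * stdev   # lower control limit

for week, value in enumerate(weekly_kpi, start=1):
    if value > ucl or value < lcl:
        print(f"Week {week}: {value} is outside the control limits ({lcl:.2f}, {ucl:.2f})")
```

Points inside the limits are “common cause” variation and need no reaction; only the flagged points (week 16 above) deserve root-cause analysis.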
Agree or disagree, have more thoughts on this topic, think this post is good or rubbish – for anything, please do send in your comments.
Information Nugget: Having talked about execution rigor, let me recommend one of the best books I have read in that area: “Execution – The Discipline of Getting Things Done” by Larry Bossidy and Ram Charan (http://www.amazon.com/Execution-Discipline-Getting-Things-Done/dp/0609610570)

Wednesday, 16 January 2008

Linking BI Technology, Process and People – A Theory of Constraints (TOC) View


With the advent of a new year, let me recap what I have discussed across my 15-odd posts in 2007 and also set the direction for my thoughts in 2008.
I started with the concept (http://blogs.hexaware.com/business_intelligence/2007/06/business-intell.html) of BI Utopia, in which information is available to all stakeholders at the right time and in the right format. The bottom line is to help organizations compete on analytics in the marketplace. With that concept as the starting point, I explored some technology enablers like Real Time Data Integration, Data Modeling and Service Oriented Architecture, and also some implementation enablers like the Agile Framework, Calibration of DW systems and Function Points based estimation for DW. In my post on Data Governance, I introduced the position of a CDO (Chief Data Officer) to drive home the point that nothing is possible (at least in BI) without people!
To me, BI is about 3 things – Technology, Process and People. I consider these three the holy triumvirate for the successful implementation of Business Intelligence in any organization. Not only is each area important in its own right; the link between the three matters even more. Organizations that are serious about ‘Analytics’ should continuously elevate their technology, process and people capabilities and, more importantly, strengthen the link between them – after all, any business endeavor is only as good as its weakest link.
Theory of Constraints (http://en.wikipedia.org/wiki/Theory_of_Constraints) does offer a perspective, which I feel is really useful for BI practitioners. I will explore more of this in my subsequent posts.
My direction for posts on this blog in 2008 is:
  1. Continue with my thoughts on Business Intelligence along Technology, Process and People dimensions

  2. Provide a “Theory of Constraints” based view of BI with focus on strengthening the link between the 3 dimensions mentioned above.

Almost every interesting business area – Six Sigma, Balanced Scorecard, System Dynamics, Business Modeling, Enterprise Risk, Competitive Intelligence, etc. – has a relationship with BI, and we will see more of this in 2008.
Please do keep reading and share your thoughts as well!

Thursday, 3 January 2008

BI Appliances


What is a BI Appliance?
If a data warehouse-class database product, a reporting product, a data integration product or an all-in-one software package comes pre-installed in a preconfigured hardware box, then such a “hardware + software” box is called a ‘Business Intelligence Appliance’. The very purpose of the appliance model is to hide the complexity and intricacies of the underlying software components and make operating the system as simple as operating a TV.
How Did the Appliance Model Evolve?
As businesses gathered huge volumes of data, the demand for faster and better ways of analyzing that data increased and the data warehouse evolved as a software technology; ever since, there have been continuous efforts to build software systems that are cognizant of data warehouse environments.


We have seen IBM and Oracle release their data warehouse-specific database editions
We have seen the growth of data warehouse-specific databases like RedBrick (now part of IBM), Teradata, Greenplum…
We have seen simple list-reporting tools move into proprietary cube data structures, giving rise to the acronyms MOLAP, HOLAP, ROLAP and DOLAP
We have seen an entirely new software market created for ETL and EII products
We have seen more new BI-related software applications being created – BAM, CPM, Metadata Management, Data Discovery – and lots more being defined in the market every day…

As organizations set up their BI infrastructure or enhanced their existing BI environments with the different BI software packages they needed, they also took on different platforms and hardware, and maintaining all of these became daunting. Getting started with a BI project became a sizeable project in itself; we needed to spend sufficient time not just on choosing the right set of BI products but also on the supported hardware, dependent software packages and the platform. No BI vendor currently addresses the complete BI stack, and this has been a driving factor for more acquisitions.
Products like Netezza (database appliance) and Cast Iron (ETL appliance) came up with their ‘software in a box’ concept, where we can buy or rent preconfigured ‘hardware + software’ boxes, which in a way addresses the needs of the ‘ready to use’ BI market. Many of these boxes carry Linux, open source databases, web servers, message queues and proprietary software.
The appliance-based model is not new; IBM has been renting its ‘mainframe + software’ for decades. IBM has addressed the BI market with its ‘Balanced Warehouse’, a preconfigured ‘hardware + software’ offering whose OS can vary across Windows, AIX and Linux, with DB2 as the database and reporting that can vary from DB2 Cubes to Crystal to Business Objects. HP has similarly come out with its Neoview platform, a revitalized version of the NonStop SQL database and NonStop OS.
CIOs have always needed ways to shorten the application deployment cycle and reduce the maintenance burden of servers; appliance-based products meet these KRAs of a CIO and are gaining wide acceptance.
The Future
More Appliances, Focus on Performance:
We will see more BI appliances coming into the market. Since the appliance model covers what is underneath, and in many cases the internal details are not available, the buying focus will be more on what the products deliver than on what they have inside.
Common Appliance Standards:
Getting best-of-breed software and hardware from a single vendor will not happen. We might see software and hardware vendors defining a set of basic standards among themselves for the appliance model. New organizations similar to “tpc.org” may evolve to define performance standards for appliances. We might also see companies similar to Dell come up that assemble best-of-breed components and deliver a packaged BI appliance.
More Acquisitions: The current Business Intelligence market landscape can also be interpreted as:
  1. Hardware + Software or Appliance based vendors – HP, IBM
  2. Pure software or Non-Appliance based vendors – Oracle, Microsoft, SAP

Once the current BI software consolidation gets established, the next wave of consolidation will be companies like Oracle looking for hardware companies to add to their portfolios.
Appliance products by technology:
  • Database – Netezza, Teradata, DATAllegro, Dataupia
  • Data Integration – Cast Iron
  • Reporting-Dashboard – Cognos NOW (Celequest LAVA)
  • Configurable Stack (with third-party support) – IBM Balanced Warehouse, HP Neoview

Thursday, 27 December 2007

“What Management Is” – The crucial link between Business and Intelligence


Let’s for a moment accept the hypothesis that the true intent of Business Intelligence is to help organizations manage their business better. “Better” in this context tends to be a rather elastic adjective as it straddles the entire spectrum of firms using BI for simple management reporting to the other extreme of using BI to ‘Compete on Analytics’ in the marketplace.
“Managing business better” presents the classic question of “What aspects of business can BI help manage better”. The Answer – “Pretty much everything”.
In this post, I would like to list the different business areas that ought to be managed for the better, and then drill down into the applicability of BI to each of these areas in future posts. The primary reference for my list is one of the best management books I have read to date – “What Management Is” by Joan Magretta and Nan Stone (http://www.amazon.com/What-Management-Works-Everyones-Business/dp/0743203186). This book really helps in drawing the boundaries around management concepts and, for BI practitioners like me, shows the direction for the evolution and business applicability of BI.
BI practitioners need to understand the following business areas:
  1. Value Creation – BI can help in providing the critical “Outside-in” perspective

  2. Business Model – Is this the right business to be in?

  3. Strategy – Validation and tuning of strategy through Business Intelligence

  4. Organization Boundaries – BI can help solve the Build vs Buy conundrum

  5. Numbers in Business – Really the sweet spot for BI applications

  6. Mission and Measures – Connecting the company’s mission with the measures

  7. Innovation and Uncertainty – Domain of Predictive Analytics & its ilk

  8. Focus – Realm of Pareto’s Law vis-à-vis the more recent “Long-Tail” phenomenon

  9. Managing People – Human Resource Analytics is one of the most happening analytics application areas at this point in time.
A bit of marketing here – after all, this is a corporate blog – my company Hexaware is a specialty provider of HR Analytics solutions. Please do visit http://www.hexaware.com/new_hranalytic.htm for more information.
To me, the list above presents the most comprehensive high-level thought process for approaching the implementation of BI in organizations. In my consulting engagements, the litmus test is to see whether the BI strategy covers the different aspects of business noted above – the more the coverage, the better the BI vision.
Information Nugget
I was quite fascinated by the range of analytical apps available “On-Demand” at http://www.salesforce.com/appexchange/category_list.jsp?NavCode__c=a0130000006P6IoAAK-1. I personally feel that “On-Demand” has the potential to disruptively change the way BI services have been delivered to customers. More on that later!
Have a Merry Christmas and a Happy New Year 2008!

Tuesday, 18 December 2007

Data Integration Challenge – Building Dynamic DI Systems – II


The following design aspects help make a DI system dynamic:
  1. Avoiding hard references, usage of parameter variables

  2. Usage of lookup tables for code conversion

  3. Setting and managing threshold values through tables

  4. Segregating data processing logic into common reusable components

  5. Ensuring that the required processes are controllable by the Business team with the required checks built in

We covered the first two aspects in the earlier post; let us look at the scenarios and approach for the other three items.
Setting and managing threshold values through tables
In the data validation process we also verify incoming data in terms of the count or sum of a variable; here the derived count or sum is checked against a predefined number usually called the ‘Threshold Value’. Some typical validations of this kind are listed below:
  1. The number of new accounts created should not be more than 10% (Threshold Value) of the total records

  2. The number of records received today and the number of records received yesterday cannot vary by more than 250 records

  3. The sum of the credit amount should not be greater than 100,000

These threshold values differ across data sources, but in many cases the metric to be derived is similar across sources. We can put these threshold values into a relational table and integrate this ‘threshold’ table into the data integration process as a lookup table; this enables the same threshold-based data validation code to be implemented across different data sources while applying each source’s specific threshold value.
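Here is a minimal sketch of the idea, with a plain dictionary standing in for the relational threshold table; the source names, metric names and limits are invented for illustration:

```python
# Hedged sketch: table-driven threshold validation. In a real DI system the
# thresholds would live in a relational table joined in as a lookup; a dict
# stands in for that table here.
THRESHOLDS = {
    # (source_system, metric): threshold value
    ("HR_EAST", "new_account_pct"):   10.0,       # max % of new accounts per batch
    ("HR_WEST", "new_account_pct"):   15.0,
    ("FINANCE", "credit_amount_sum"): 100_000.0,
}

def within_threshold(source: str, metric: str, observed: float) -> bool:
    """Return True if the observed value is within the source-specific limit."""
    return observed <= THRESHOLDS[(source, metric)]

# The same validation code serves every source; only the lookup row differs.
print(within_threshold("HR_EAST", "new_account_pct", 12.5))   # False – breaches 10%
print(within_threshold("HR_WEST", "new_account_pct", 12.5))   # True – within 15%
```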
Segregating Data Processing Logic into Common Reusable Components
Having many reusable components in itself makes a DI system dynamic or adaptable, because reusable components rest on the parameterization of the inputs and outputs of an existing process, and parameterization is a key ingredient in making a DI system dynamic (see the sketch after this list). Some of the key characteristics to look for in a DI system that would help carve out a reusable component are:
  1. Multiple data sources providing data for a particular subject area like HR data coming from different HR systems

  2. Same set of data being shared with multiple downstream systems or a data hub system

  3. Existence of an industry-standard format like SWIFT or HIPAA, either as source or target

  4. Integration with third party systems or their data like D&B, FairIsaac

  5. Changing data layouts of the incoming data structure

  6. Systems that capture survey data
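As a minimal sketch of the parameterization idea, the generic routine below handles two HR source systems with differing layouts; the file names and field mappings are invented for illustration:

```python
# Hedged sketch: one reusable load routine, parameterized by a field map,
# serves multiple HR source systems with different layouts.
import csv
from typing import Dict, List

def load_hr_feed(path: str, field_map: Dict[str, str]) -> List[dict]:
    """Read a delimited HR feed and rename source columns to the common model."""
    with open(path, newline="") as f:
        return [{target: row[source] for source, target in field_map.items()}
                for row in csv.DictReader(f)]

# Each source contributes only its own parameter set; the logic above is shared.
PEOPLESOFT_MAP = {"EMPLID": "employee_id", "DEPTID": "department"}
SAP_HR_MAP     = {"PERNR": "employee_id", "ORGEH": "department"}

east_records = load_hr_feed("hr_east.csv", PEOPLESOFT_MAP)   # hypothetical file
west_records = load_hr_feed("hr_west.csv", SAP_HR_MAP)       # hypothetical file
```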

Ensuring that the required processes are controllable by the Business team with the required checks built in
In many situations we now see requirements wherein the business provides regular inputs to the IT team running the DI systems; these are the situations where we can design and place portions of the DI system parameters under business control. Typical examples of such scenarios are:
  • In ‘threshold value’ based data validation, the values are business driven, i.e., the ‘threshold table’ can be managed by the business team, who can make changes to it without code changes and without IT support
  • In many scenarios invalid data undergoes multiple passes and needs to be validated by the business at different passes; the input from the business could be as simple as starting the process, or could include providing input data as well
  • Data to be pulled out of a warehouse based on a feed from an online application – a typical web service problem and solution
The need for the business team to control or feed the DI systems is common in companies that handle a lot of external data, such as market research firms and Software as a Service (SaaS) firms. The web service support from the leading Data Integration vendors plays a major role in fulfilling these needs.
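To illustrate, here is a minimal sketch of such a business-facing service (assuming Flask purely for brevity; the endpoint names and the in-memory store are illustrative stand-ins for a real threshold table):

```python
# Hedged sketch: a tiny web service that lets the business team read and
# update a threshold value without ETL code changes or IT involvement.
from flask import Flask, jsonify, request

app = Flask(__name__)
thresholds = {"new_account_pct": 10.0}   # would normally live in a database

@app.route("/thresholds/<metric>", methods=["GET"])
def read_threshold(metric):
    return jsonify({metric: thresholds.get(metric)})

@app.route("/thresholds/<metric>", methods=["PUT"])
def update_threshold(metric):
    # The business team adjusts the value here; the DI jobs pick it up on
    # their next run through the threshold lookup.
    thresholds[metric] = float(request.get_json()["value"])
    return jsonify({metric: thresholds[metric]})

if __name__ == "__main__":
    app.run(port=5000)
```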