
Thursday 5 September 2013

Business Intelligence - Are you game for the Moneyball Process?

The importance of data-driven decision making, and of looking at data from different perspectives, was popularized among the general public by the Brad Pitt starrer ‘Moneyball’, a Hollywood movie based on a true story.

A quick snapshot of the storyline:
“Oakland Athletics general manager Billy Beane (Brad Pitt) is upset by his team’s loss to the New York Yankees in the 2001 postseason. With the impending departure of star players, Beane attempts to devise a strategy for assembling a competitive team for 2002 but struggles to overcome Oakland’s limited payroll. Billy turns baseball on its ear when he uses statistical data to analyze and place value on the players (not star players though) he picks for the team. This resulted in the Oakland Athletics setting a team record of 20 wins in a row. A similar strategy was adopted by the Boston Red Sox, who in 2004 won their first World Series since 1918.”

(Sources: WIKI, IMDB)

What Beane did differently to turn the game around was apply Sabermetrics (a statistical method of analyzing data points in the game of baseball). The analytics-led application of Sabermetrics helped Beane question traditional methods of evaluation such as RBI (Runs Batted In) and batting average. It took in-depth analysis to conclude that matches were not won by players with a higher batting average but by those with a higher On-Base Percentage (OBP) and Slugging Percentage (SLG). Beane formed a team based on these new metrics and other parameters.
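For readers who want to see the arithmetic behind those two metrics, here is a minimal sketch in Python; the formulas are the standard baseball definitions, and the sample numbers are purely illustrative.

```python
def on_base_percentage(hits, walks, hit_by_pitch, at_bats, sacrifice_flies):
    """OBP: how often a batter reaches base per plate appearance."""
    return (hits + walks + hit_by_pitch) / (at_bats + walks + hit_by_pitch + sacrifice_flies)


def slugging_percentage(singles, doubles, triples, home_runs, at_bats):
    """SLG: total bases per at-bat, rewarding extra-base hits."""
    total_bases = singles + 2 * doubles + 3 * triples + 4 * home_runs
    return total_bases / at_bats


# Illustrative season line, not real player data.
print(round(on_base_percentage(hits=150, walks=70, hit_by_pitch=5, at_bats=500, sacrifice_flies=5), 3))
print(round(slugging_percentage(singles=100, doubles=30, triples=5, home_runs=15, at_bats=500), 3))
```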

Monday 16 February 2009

Industry Specific BI – What's the common denominator?

My previous post on business process fundamentals concluded with a friendly exhortation to BI practitioners, encouraging them to view their craft from the point of optimizing business processes.
So the next time you are involved in any BI endeavor, ask yourself and the people involved in the project this question – “So which business process is this BI project supposed to optimize, why and how?” I define ‘optimization’ loosely as anything that leads to bottom-line or top-line benefits.
Business processes, by their very definition, belong to the industry domain. Companies have their own business processes – some of them are standard across firms in a particular domain and many of them are unique to specific companies. The efficiency of business processes is a source of competitive advantage, and the fact that ERP vendors like SAP have special configurations for every industry illustrates this point. So, by corollary, for BI to be effective in optimizing business processes, it has to be tied to specific industry needs, creating what can be called “Verticalized Business Intelligence” (V-BI for short).
At Hexaware’s Business Intelligence & Analytics practice (the company and team that I belong to), we have taken the concept of V-BI pretty seriously and have built solutions aimed at industry verticals. You can view our vertical specific BI offering at this link and we definitely welcome your comments on that.
Though Verticalized BI is a powerful idea, companies typically need an “analytics anchor point” to establish a BI infrastructure before embarking on their domain specific BI initiatives. The analytics anchor point, mentioned above, should have the following characteristics:
  • All organizations, regardless of domain, should have a need to implement it
  • The business processes associated with these analytics should be fairly standardized and handled by experts
  • Should involve some of the most critical stakeholders within the organization as the success of this first initiative will lay the foundation for future work
Based on my experience in providing consulting services to organizations laying down an Enterprise BI roadmap, I feel that “Financial Analytics” has all the right characteristics to become the analytics anchor point for companies. Financial Analytics, the common denominator, typically comprises:
  • General Ledger Analysis – (also known as Financial Statements Analysis)
  • Profitability Analysis (Customer / Product Profitability etc.)
  • Budgeting, Planning & Forecasting
  • Monitoring & Controlling – The Dashboards & Scorecards
  • General Ledger Consolidation
The above-mentioned areas are also classified under Enterprise Performance Management. The convergence of Performance Management and BI is another interesting topic (recent announcements from Microsoft have made the subject doubly interesting!) and I will write about it in future posts.
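To make one of these areas concrete, here is a minimal sketch of customer profitability analysis in Python with pandas; the table, column names and figures are hypothetical, not taken from any particular package.

```python
import pandas as pd

# Hypothetical transaction-level data; in practice this would come from the GL or a data mart.
txns = pd.DataFrame({
    "customer": ["Acme", "Acme", "Beta", "Beta", "Gamma"],
    "revenue":  [1200.0, 800.0, 950.0, 400.0, 300.0],
    "cost":     [700.0, 500.0, 600.0, 350.0, 280.0],
})

# Profitability by customer: total revenue, total margin and margin percentage.
profit = (txns.assign(margin=txns["revenue"] - txns["cost"])
              .groupby("customer")[["revenue", "margin"]]
              .sum()
              .assign(margin_pct=lambda df: 100 * df["margin"] / df["revenue"])
              .sort_values("margin_pct", ascending=False))
print(profit)
```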
In my humble opinion, the prescription for Enterprise BI is:
  • Select one or more areas of Financial Analytics (as mentioned above) as your first target for Enterprise BI.
  • During the process of completing step 1, establish the technology and process infrastructure for BI in the organization
  • Add your industry specific BI initiatives (Verticalized Business Intelligence) as you move up the curve
I, for one, truly believe in the power of Verticalized BI to develop solutions that provide the best fit between business and technology. That business and IT people can sit across the table and look at each other with mutual respect is another important non-trivial benefit.
Thanks for reading. Do you have any other analytics anchor points for organizations to jumpstart their BI initiatives? Please do share your thoughts.

Thursday 5 February 2009

Analytics, choosing it

We observe many BI project sponsors explicitly asking for an Analytics Package implementation to meet business needs; the benefit is that it saves time. By deciding on an analytics package we can get the application up quickly, and it comes with all the typical benefits of a ‘buy’ solution over a ‘build’ solution.
So what are the key parameters we need to look for when choosing an Analytics Package? The following are the points to consider, in order of importance.
1. The effort to arrive at the right data model for a BI system is huge and quite tedious, so a comprehensive set of data models, metrics and calculations from the package is very important.
2. The flexibility and openness in managing the data model are also very critical. Some of the capabilities to look for in managing data model elements are (a small sketch of one such check appears at the end of this post):
  • Ability to browse the data elements and its definitions
  • Support for customization of the data model without getting back to the database syntax
  • Auto Source System profiling and field mapping from the source systems to the data model
  • Enabling validation of data type, data length of the data model against the source system field definitions
  • Means to ensure that customization of the data model in terms of field addition doesn’t happen when a similar element exists
  • Availability of standard code data as applicable to the functional area
  • Supporting country specific needs in terms of data representation
3. The ETL process for a BI system is also a major effort. Though the absolute effort of pulling the data and making it available to the package in the required format cannot be avoided, the availability of plug-ins that understand the data structures of typical systems like ERP would save a good amount of effort.
4. Availability of standard data validation steps as part of the ETL process is also a must; integration with a data quality product would be valuable
5. Ability to support audit and compliance requirements for data usage and reporting
6. Integration of the package with industry specific research data from vendors like D&B, IMS etc to enable benchmarking the performance metrics against industry peers/competitors
7. Customizable Security Framework
8. Semantic layer definition with formulas, hierarchies etc
9. Ready to use Score Cards and dashboard layouts
10. Pre built reports and portal
Often all the pre-delivered reports undergo changes and are almost completely customized when implemented. So the availability of a large list of reports does not in itself mean a lot, since most of the reports would be minor variations of one another. Certain compliance reports would be useful when they come along with the package; these would be published, industry-standard report formats.
An evaluation phase to test the analytics product’s capability on a sample of the data before choosing it is definitely a must; the above ten points would be the evaluation criteria during this exercise.
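As promised under point 2, here is a minimal sketch of one such data model check: validating the data type and length of the package’s data model against the source system field definitions. The field metadata dictionaries are hypothetical stand-ins for whatever the package and source system actually expose.

```python
# Hypothetical field metadata: (data_type, length) keyed by field name.
package_model = {
    "customer_name": ("VARCHAR", 100),
    "order_amount":  ("DECIMAL", 12),
    "order_date":    ("DATE", None),
}
source_system = {
    "customer_name": ("VARCHAR", 150),   # longer than the model allows -> risk of truncation
    "order_amount":  ("DECIMAL", 12),
    "order_date":    ("DATE", None),
}

for field, (src_type, src_len) in source_system.items():
    if field not in package_model:
        print(f"{field}: missing from the package data model")
        continue
    pkg_type, pkg_len = package_model[field]
    if src_type != pkg_type:
        print(f"{field}: type mismatch (source {src_type} vs model {pkg_type})")
    elif src_len and pkg_len and src_len > pkg_len:
        print(f"{field}: source length {src_len} exceeds model length {pkg_len}")
```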

Sunday 25 January 2009

Business Process for BI Practitioners – A Primer

Business Intelligence has a fairly wide scope but at the fundamental level it is all about “Business Processes”. Let me explain a bit here.
BI, without the bells and whistles, is about understanding an organization’s business model and its business processes, and ultimately finding the reasons (analytics) and ways to optimize those processes. The actions are carried out based on informed judgments (aided by BI), to make the organization better at whatever endeavor it has set itself to accomplish.
Assuming that BI practitioners are convinced that understanding business process is critical to their work, let me delve a bit into the basics of it.
1) What is a business process? (As a side note, one of the best explanations of business models is given by Joan Magretta in her book ‘What Management Is’.)
A business process is a set of activities, within or outside an organization, that work together to produce a business outcome for a customer or for the organization. The fact is that for an organization to function, there are many such outcomes that need to happen on a daily basis.
2) What are BPM Tools?
Business Process Management (BPM) tools are used to create an application that is helpful in designing business process models, process flow models, data flow models, rules and also helpful in simulating, optimizing, monitoring and maintaining various processes that occur within an organization.
3) The Mechanics of Business Modeling
Business Process Modeling is the first step, followed by Process Flow Modeling and Data Flow Diagrams. These 3 diagrams and their associated documentation help in getting the complete picture of an organization’s business processes. A brief explanation of the 3 types is given below:
a) In Business Process Modeling, an organization’s functions are represented by using boxes and arrows. Boxes represent activities and arrows represent information associated with that activity. Input, Output, Control and Mechanism are the 4 types of arrows. A box and arrows combination that describes one activity is called a context diagram and obviously there would be many context diagrams to explain all the activities within the enterprise.
b) Process Flow Modeling is a model that is a collection of several activities of the business. IDEF3 is the process description capture method and this workflow model explains the activity dependencies, timing, branching and merging of process flows, choice, looping and parallelism in much greater detail.
c) Data Flow Diagrams (DFDs) are used to capture the flow of data between various business processes. DFDs describe data sources, destinations, flows, data storage and transformations, and contain five basic constructs: activities (processes), data flows, data stores, external references and physical resources.
Just as a data modeler goes through conceptual, logical and physical modeling steps, a business process modeler creates the Business Process Models, Process Flow Models and Data Flow Diagrams to get a feel for the business processes that take place within an enterprise.
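Purely as an illustration of the five DFD constructs listed above, here is a minimal sketch of how they might be captured as simple Python data classes for documentation purposes; the class names, fields and the order-handling example are my own, not taken from any BPM tool.

```python
from dataclasses import dataclass, field
from typing import List


@dataclass
class Activity:             # a process that transforms data
    name: str


@dataclass
class DataStore:            # where data rests between activities
    name: str


@dataclass
class ExternalReference:    # an entity outside the scope of the diagram
    name: str


@dataclass
class PhysicalResource:     # a physical thing moved or consumed by the process
    name: str


@dataclass
class DataFlow:             # data moving between any two constructs
    label: str
    source: str
    destination: str


@dataclass
class DataFlowDiagram:
    name: str
    flows: List[DataFlow] = field(default_factory=list)


# Example: an order is received from a customer, processed and stored.
dfd = DataFlowDiagram("Order handling", flows=[
    DataFlow("new order", source="Customer", destination="Process Order"),
    DataFlow("validated order", source="Process Order", destination="Orders data store"),
])
print([f.label for f in dfd.flows])
```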
Thoughts for BI Practitioners:
  1. Consider viewing BI from the point of optimizing business processes
  2. It might be worthwhile to learn about Business Process Modeling, Process Flow Modeling and Data Flow Diagrams
  3. Understand the working of BPM tools and their usage in the enterprise BI landscape
  4. Beware of the acronym BPM. BPM is Business Process Management but can also be peddled as Business Performance Management.
  5. My view is that Performance Management sits at a higher level, in the sense that it is a collective (synergistic) view of the performance of individual business processes. A strong performance management framework can help you drill down to the specific business processes that can be optimized to increase performance.

Monday 19 January 2009

Analytics, its Evolution

What is ‘Analytics’? A business intelligence application with ready-to-use components for data analysis; we also refer to it as ‘packaged analytics’. ‘Business Analytics’ refers to analytics applications that support analysis of the data collected as part of a business process.
Along similar lines, we can define an analytics application that supports analysis of the data collected as part of a computer user’s daily activity as ‘Personal Analytics’.
Business systems evolved from the state of building custom applications to a state of configurable generic Enterprise Resource Planning (ERP) systems. Now we have configurable generic business intelligence applications called ‘Business Analytics’ which have evolved from the state of building custom business intelligence applications.
ERP systems are designed to collect business data, whereas Business Analytics systems are designed to analyze the collated business data, so one of the key sources for a Business Analytics application is an ERP system. Data analysis is the next logical step after data collection, yet ERP vendors like Oracle, SAP and Microsoft were late in addressing this specific requirement. In the last two years we have seen some of the finer business intelligence products being acquired by the ERP vendors. Clearly, customers who are on ERP products will get a better platform that can talk to their ERP applications for data analysis.
It is a reality that not many companies, at least the larger ones (>USD 500 million), run their entire business on one ERP system. Consolidating all applications onto a single ERP platform will not happen immediately; multiple ERP and custom applications get added as a company grows through acquisitions, hence the existence of multiple transaction systems cannot be avoided. The number of customers embracing packaged analytics from the ERP vendors will increase as the flexibility of these applications matures to accept data from other, outside applications.
Logical Data Model to Packaged Reports
Business analytics applications grew step by step as follows:
  • 1. Logical data model – as a first step towards the formation of packaged analytics, companies like IBM and Teradata provided industry-specific logical data models (LDMs) to help customers build their enterprise data warehouse. The LDM was based on the business process and provided the required jumpstart to enable the effective integration of data from multiple source systems. We also have certain industry-endorsed LDMs like the Supply-Chain Operations Reference model (SCOR) and the Public Petroleum Data Model (PPDM).
  • 2. Metrics definition – LDMs led to the next step of defining metrics to measure the performance of the business process. The data required for the metrics specific to a business process was extracted (virtually or physically) into data marts structured as fact-dimension analytic data models (a small sketch of such a model follows this list).
  • 3. Semantic layers – the next step was the creation of a semantic layer over the data mart to enable ad-hoc querying and report generation
  • 4. Reports and dashboards – then we had sets of reports and dashboards delivered over the semantic layer
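As referenced in step 2, here is a minimal sketch of a fact-dimension data mart and one metric computed over it, using an in-memory SQLite database from Python; the table names, columns and sample rows are hypothetical.

```python
import sqlite3

conn = sqlite3.connect(":memory:")
cur = conn.cursor()

# A tiny star schema: one fact table and one dimension table.
cur.execute("CREATE TABLE dim_product (product_key INTEGER PRIMARY KEY, category TEXT)")
cur.execute("CREATE TABLE fact_sales (product_key INTEGER, sale_amount REAL)")
cur.executemany("INSERT INTO dim_product VALUES (?, ?)",
                [(1, "Hardware"), (2, "Software")])
cur.executemany("INSERT INTO fact_sales VALUES (?, ?)",
                [(1, 100.0), (1, 250.0), (2, 400.0)])

# Metric: revenue by product category, the kind of measure a packaged data mart would pre-define.
for category, revenue in cur.execute("""
        SELECT d.category, SUM(f.sale_amount) AS revenue
        FROM fact_sales f
        JOIN dim_product d ON f.product_key = d.product_key
        GROUP BY d.category
        ORDER BY revenue DESC"""):
    print(category, revenue)
```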
Packaged analytics are still positioned as data mart applications addressing a specific business process like HR or customer relationship, unlike ERP systems, which address the complete end-to-end business processes of an organization. There is still some way to go before an Enterprise Analytics Application is established.

Friday 2 January 2009

What is “Safe to Bet On” in Business Intelligence?

While the phrase “Safe to Bet On” is an oxymoron of sorts, it is that time of the year where we first look at the past, derive some insights and look forward to what the future has in store for us. I have no doubts that 2009 will be doubly interesting for BI practitioners as compared to 2008.
Having said that, I decided to do a bit of introspection to figure out what skills (which can also be read as competencies) I should be looking at to stay relevant in the Business Intelligence world far into the future, say in 2020. Hopefully that resonates with some of you.
Let me first try to define the skills required for Business Intelligence and Analytics. The trick here is to stay “high-level”, as any BI person will acknowledge that once we get down to looking at the trees (rather than the forest), the sheer number of skills required for enterprise-level BI can get daunting.
Taking inspiration from the fact that any business can be condensed into 2 basic functions, viz. Making & Selling, I propose that there are 3 key skills that make for successful BI.
Skill 1 – Business Process Understanding: If you are a core industry expert and can still talk about multi-dimensional expressions, that’s great! But most BI practitioners have their formative years rooted on the technology side and have implemented solutions across industries. The ability to understand the value chain of any industry, map out business processes, identify optimization areas and translate IT benefits into business benefits are the key sub-skills in this area.
Skill 2 – Architecting BI Solutions: This skill is all about answering the question of “What is the blue-print” for building the Business Intelligence Landscape in the organization. Traditionally, we have built data warehouses & data marts either top-down or bottom-up, integrated data from multiple sources into physical repositories, modeled them dimensionally, provided ad-hoc query capability and we are done! – NOT ANYMORE. With ever increasing data volumes, real-time requirements imposed by Operational BI, increased sophistication for end-user analytics, the clamor for leveraging unstructured data on one hand and the advent of On-Demand Analytics, Data Mashups, Data Warehouse appliances, etc., there is no single best way to build a BI infrastructure. So the answer to “What is the blue-print?” is “It depends”. It depends on many factors (some of which are known today and many which aren’t) and the person / organization who appreciates these factors and finds the best fit to a particular situation is bound to succeed.
Skill 3 – BI Tools Expertise: Once a blue-print is defined and optimization areas are identified, we need the tools that can turn those ideas into reality. BI practitioners have many tools at their disposal, straddling the entire spectrum from Excel spreadsheets at one end to high-end data mining tools at the other extreme. If you bring in the ETL and data modeling tools, the number of industry-strength tools gets into the 50s and beyond. With the convergence of web technologies, XML, etc. into mainstream BI, it probably makes sense to simplify and say “Anything you imagine can be done with appropriate BI tools”. “Appropriate” is the key word here and it takes a good amount of experience (and some luck) to get it right.
In essence, my prescription for BI practitioners to stay relevant in 2020 is to be aware of developments on these 3 major areas, develop specific techniques / sub-skills for each one of them and more importantly respect & collaborate with the BI practitioner in the next cubicle (which translates to anywhere across the globe in this flat world) for he/she would bring in complementary strengths.

Monday 22 December 2008

Business Intelligence Challenge – Product Upgrades & Migrations, Validation – 5

Once the code has been moved to the target platform (Moving the Code), whether it’s an upgrade to a newer version or migration to another newer platform, the next step is to validate the objects moved.
Validation Process involves verification or testing of the objects in the target platform to ensure that they deliver the same output as the older objects in the source platform.
Validation is the key process by which the migration or upgrade is certified as successful; it is usually a laborious and time-consuming process. Let us see how the Validation Process can be broken into different steps and automated, to save time and improve accuracy. We can look at the Validation process as encompassing three steps:
  • Metadata Validation
  • Run Validation
  • Output Validation
Metadata Validation involves comparison of the metadata definitions between the existing source environment and the target environment. This requires that the metadata of the source and the target environment be captured for the comparison.
Steps Involved:
  • Capture the source metadata into a relational structure (as part of Object Consolidation we would already have captured the source metadata)
  • Capture the target platform metadata in a similar way into a relational structure
  • Run SQL queries to automate the metadata comparison process
Metadata comparison is done at the level of semantic layer definitions and individual reports. Let us take the case of metadata comparison between two semantic layers; in the case of Business Objects, the Universe is the semantic layer definition. After an upgrade from an older version of Business Objects to a newer one, the first level of metadata validation between the universes would be to check whether the object counts match (classes, objects, filters), followed by a further comparison of their definitions.
If there are differences in the definitions and they fall within the known differences between the two versions (source and target), then they are fine; otherwise the upgraded object would require code fixing.
Since we usually validate reports by the output they produce, the validation process is limited by the data fed in; we could miss scenarios such as a filter clause that never gets exercised. Metadata Validation overcomes this limitation of having to prepare data for every test scenario. If a report passes Metadata Validation, we could say with 100% confidence that the report has been upgraded or migrated effectively.
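A minimal sketch of the automated comparison step, assuming the source and target metadata have already been captured into two relational tables; the table and column names here are hypothetical.

```python
import sqlite3

conn = sqlite3.connect(":memory:")
cur = conn.cursor()
for side in ("src_metadata", "tgt_metadata"):
    cur.execute(f"CREATE TABLE {side} (object_type TEXT, object_name TEXT, definition TEXT)")

cur.executemany("INSERT INTO src_metadata VALUES (?, ?, ?)",
                [("object", "Revenue", "sum(sales.amount)"),
                 ("filter", "Current Year", "year = 2008")])
cur.executemany("INSERT INTO tgt_metadata VALUES (?, ?, ?)",
                [("object", "Revenue", "sum(sales.amount)")])   # the filter is missing after the upgrade

# 1) Compare object counts by type between source and target.
print(cur.execute("SELECT object_type, COUNT(*) FROM src_metadata GROUP BY object_type").fetchall())
print(cur.execute("SELECT object_type, COUNT(*) FROM tgt_metadata GROUP BY object_type").fetchall())

# 2) List objects that are missing in the target or whose definition has changed.
print(cur.execute("""
    SELECT s.object_type, s.object_name
    FROM src_metadata s
    LEFT JOIN tgt_metadata t
           ON s.object_type = t.object_type AND s.object_name = t.object_name
    WHERE t.object_name IS NULL OR s.definition <> t.definition""").fetchall())
```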
Benefits:
  • Sets up a strong base of metadata understanding, as the objects between the different platforms have to be mapped and the gaps identified in order to run automated metadata validation
  • Improved accuracy in the validation process, overcomes the limitation in data preparation
  • Enables determining issues without running the report against the data
Run Validation is a dry run of the reports, performed in an automated way, to determine whether the reports run (open) successfully or not.
When we give a report to a tester, the first activity he would perform is to run the report; if it doesn’t go through, the problem is reported or analysed further. Run Validation tries to catch this problem in an automated way.
Steps Involved:
  • Have scripts to invoke the reports in batch mode; as soon as the objects are upgraded, invoke (open) all the upgraded reports in batch mode
  • Capture the errors while opening/running the report into a log
  • Classify them into two categories ‘reports that ran’ and ‘reports that failed’
Some reports could fail to open because of incorrect connection details, some because an object is not found, and so on. This quick automated run enables us to locate the failed reports immediately and also helps determine the reasons for the failures in one go. Limiting the data input should be considered while invoking the reports.
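A minimal sketch of such a batch run follows; open_report is a hypothetical stand-in for whatever SDK or command-line call your BI platform provides to open a report, and is assumed to raise an exception on failure.

```python
import csv

def open_report(report_name):
    """Hypothetical stand-in: call the BI platform's SDK/CLI to open the report.
    Should raise an exception (connection error, object not found, ...) on failure."""
    raise NotImplementedError("replace with the platform-specific call")

reports = ["Sales Summary", "Inventory Ageing", "Customer Churn"]   # illustrative report names
passed, failed = [], []

for name in reports:
    try:
        open_report(name)
        passed.append(name)
    except Exception as err:            # capture the failure reason into the log
        failed.append((name, str(err)))

# Classify into 'reports that ran' and 'reports that failed' and persist the log.
with open("run_validation_log.csv", "w", newline="") as fh:
    writer = csv.writer(fh)
    writer.writerow(["report", "status", "error"])
    writer.writerows([(n, "ran", "") for n in passed])
    writer.writerows([(n, "failed", e) for n, e in failed])
```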
Benefits:
  • Saves time in determining errors due to report opening or running
  • Enables building a common solution for the code fixing team, as the ‘run errors’ are consolidated
Output Validation is to validate the output delivered by the reports. There are two levels of output validation: Format Validation and Data Validation.
Format Validation checks the format of the presented data, such as font size, colour, bold, label location, etc., which does not relate to the data values.
Data Validation is to check cell by cell the data value content between the two reports.
Steps:
  • Run the source report and export the output data to excel/word
  • Run the target report and export the output data to excel/word
  • Compare the outputs for the format and the data
The best means of comparing the output of two reports is to export them to Excel and then perform a comparison between the two Excel files. If we can export the reports to a Word format, we can leverage Word’s compare utility; even an export to XML would enable the use of readily available utilities. In the case of Excel, we would need to build a utility that can compare the two sheets.
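A minimal sketch of such an Excel comparison utility, assuming the two report outputs have already been exported to .xlsx files and that the openpyxl library is available; the file names are hypothetical.

```python
from openpyxl import load_workbook

def compare_sheets(source_path, target_path):
    """Compare the active sheets of two workbooks cell by cell and return the differences."""
    src = load_workbook(source_path, data_only=True).active
    tgt = load_workbook(target_path, data_only=True).active
    diffs = []
    max_row = max(src.max_row, tgt.max_row)
    max_col = max(src.max_column, tgt.max_column)
    for r in range(1, max_row + 1):
        for c in range(1, max_col + 1):
            s_val = src.cell(row=r, column=c).value
            t_val = tgt.cell(row=r, column=c).value
            if s_val != t_val:
                diffs.append((r, c, s_val, t_val))
    return diffs

# Hypothetical exports of the same report from the source and target platforms.
for row, col, old, new in compare_sheets("report_source.xlsx", "report_target.xlsx"):
    print(f"Row {row}, Col {col}: source={old!r} target={new!r}")
```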
The above three validations cover some of the key aspects of validating semantic layer objects and reports; let me know your thoughts on other means of validation…

Monday 15 December 2008

The Esoteric World of Predictive Analytics

Let me start with the definition of Predictive Analytics as used in the literature – “The nontrivial extraction of implicit, previously unknown and potentially useful information from data”. If that doesn’t sound esoteric enough, you are probably more advanced than this post gives you credit for!
For a BI practitioner, it is important to get an understanding of Predictive Analytics (also known as Data Mining) as this subject definitely deserves a place in the wide spectrum of Business Intelligence disciplines. BI at a broad level is about optimizing business through “Hindsight, Insight and Foresight”. Predictive analytics adds the powerful “Foresight” part to business decision making.
Most BI practitioners tend to equate statistics with predictive analytics and this post explains why such a view is inaccurate. To understand this let’s start at the very beginning (a la Alice in Wonderland). Broadly, this world is divided into 2 types of systems:
  • Physical Systems – Has causality and hence can be modeled mathematically with relative ease
  • Human Behavioral Systems – Lacks causality and can be modeled only with specialized techniques
Predictive analytics for business decision making is all about modeling human behavioral systems.
Why is Traditional Statistics Insufficient?
Though the entry into predictive analytics requires that we understand the implications of traditional statistical analysis, statistics by itself is insufficient in the business context. Traditional statistical analysis allows us to understand the general group behavior and is primarily concerned with common behavior within the group – the central tendencies.
In business we generally develop models to anticipate human behavior of some type. Human behavior is inconsistent, lacks causality and distributions based on human behavior almost always violate the assumptions of traditional statistical analysis (like normal distribution of data, stability of mean and standard deviation etc). The strength of data mining comes from the ability of the associated techniques to deal with the tails of the distributions, rather than the central tendencies, and from the techniques’ ability to deal with the realities of the data in a more precise manner.
In the realm of predictive analytics, we are concerned with modeling human behavior and hence are interested in the tail of our distribution – the small percentage of the population that responds to a campaign, commits a fraud, leaves our business or purchases the next service.
Though there are specialized techniques used for Predictive Analytics (viz. non-linear statistics, induction algorithms, cluster analysis and neural networks, to name a few), a BI practitioner is only expected to appreciate their usage in different business situations, prepare and model data as required by the tools and interpret the results correctly (a much less daunting task indeed!)
Typically the model development process involves the following steps – a) Define Project, b) Select Data, c) Prepare Data, d) Transform Variables, e) Process Model, f) Validate Model, g) Implement Model. I will explain these steps in more detail in subsequent posts.
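To make those steps concrete, here is a minimal sketch of steps (b) through (f) using scikit-learn's logistic regression as a stand-in for the specialized techniques mentioned above; the file name, column names and churn scenario are hypothetical, and a real project would involve far more data preparation and validation.

```python
import pandas as pd
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import roc_auc_score
from sklearn.model_selection import train_test_split

# b) Select data: a hypothetical customer file with a churn flag (the "tail" behaviour we want to model).
data = pd.read_csv("customers.csv")   # assumed columns: tenure_months, monthly_spend, complaints, churned

# c) Prepare data / d) Transform variables: keep a few predictors and fill missing values.
features = data[["tenure_months", "monthly_spend", "complaints"]].fillna(0)
target = data["churned"]

# e) Process model on a training split.
X_train, X_test, y_train, y_test = train_test_split(features, target, test_size=0.3, random_state=42)
model = LogisticRegression(max_iter=1000).fit(X_train, y_train)

# f) Validate model on held-out data; AUC measures how well the model ranks the rare churners.
scores = model.predict_proba(X_test)[:, 1]
print("Hold-out AUC:", round(roc_auc_score(y_test, scores), 3))
```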
Fundamentally, an end-to-end BI view requires the practitioner to learn the concepts around statistics and predictive analytical techniques as available in tools (like say SQL Server Analysis Services) in addition to their technology bag of tricks around data integration, data modeling and OLAP.