Showing posts with label Business Intelligence. Show all posts

Monday 19 November 2012

Transitioning to a New World – An Analytical Perspective

Recently, I had the opportunity to speak at the Silicon India Business Intelligence Conference. The topic I chose focused on providing a BI & Analytics perspective for companies transitioning to a new world. You can view my presentation at this link: http://bit.ly/VLDDfF

The gist of my presentation is given below:

1) First, I established that the world is indeed changing, using some statistics:

  • Data Deluge: The amount of digital data created in the world right now stands at 7 Zettabytes per annum (1 Zettabyte = 1 billion Terabytes)
  • Social Media: Facebook has touched 1 billion users, which would make it the 3rd largest country in the world
  • Cloud: Tremendous amount of cloud infrastructure is being created
  • Mobility: There are 4.7 billion mobile subscribers, covering about 65% of the world's population

2) Enterprises face a very different marketplace due to profound changes in the way people buy, sell, interact with one another, and spend their leisure time.

3) To ensure that BI can help businesses navigate the new normal, there are 3 key focus areas:

  • Remove Bottlenecks – Give business what they want
  • Enhance Intelligence
  • End-to-End Visibility by strengthening the fundamentals

For each of the 3 areas mentioned above, I gave specific examples of trends in the BI space.

1) For Removing Bottlenecks, the impact of in-memory and columnar databases was elaborated.

2) For Enhancing Intelligence, working with unstructured data and using big data techniques were discussed.

3) For the 3rd point, the focus was on strengthening the fundamentals of the BI landscape.

Please do check out my complete presentation at http://bit.ly/VLDDfF and let me know your views.

Thanks for reading.

Sunday 25 January 2009

Business Process for BI Practitioners – A Primer

Business Intelligence has a fairly wide scope, but at the fundamental level it is all about “Business Processes”. Let me explain a bit here.
BI, without the bells and whistles, is about understanding an organization’s business model and its business processes, and ultimately finding the reasons (analytics) and ways to optimize those processes. Actions are then carried out based on informed judgments (aided by BI), to make the organization better at whatever endeavor it has set itself to accomplish.
Assuming that BI practitioners are convinced that understanding business processes is critical to their work, let me delve a bit into the basics.
1) What is a business process? (As a side note, one of the best explanations of business models is given by Joan Magretta in her book “What Management Is”.)
A business process is a set of activities, within or outside an organization, that work together to produce a business outcome for a customer or for the organization. For an organization to function, many such outcomes need to happen on a daily basis.
2) What are BPM Tools?
Business Process Management (BPM) tools are used to create an application that is helpful in designing business process models, process flow models, data flow models, rules and also helpful in simulating, optimizing, monitoring and maintaining various processes that occur within an organization.
3) The Mechanics of Business Modeling
Business Process Modeling is the first step, followed by Process Flow Modeling and Data Flow Diagrams. Together, these 3 diagrams and their associated documentation give a complete picture of an organization’s business processes. Brief explanations of the 3 types are given below:
a) In Business Process Modeling, an organization’s functions are represented using boxes and arrows. Boxes represent activities and arrows represent information associated with an activity. Input, Output, Control and Mechanism are the 4 types of arrows. A box-and-arrows combination that describes one activity is called a context diagram, and naturally there will be many context diagrams to explain all the activities within the enterprise.
b) Process Flow Modeling produces a model that is a collection of several activities of the business. IDEF3 is the process description capture method; this workflow model explains activity dependencies, timing, branching and merging of process flows, choice, looping and parallelism in much greater detail.
c) Data Flow Diagrams (DFDs) are used to capture the flow of data between various business processes. DFDs describe data sources, destinations, flows, data storage and transformations. DFDs contain five basic constructs, namely: activities (processes), data flows, data stores, external references and physical resources.
Just as a data modeler goes through conceptual, logical and physical modeling steps, a business process modeler creates Business Process Models, Process Flow Models and Data Flow Diagrams to get a feel for the business processes that take place within an enterprise.
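As a toy illustration of the five DFD constructs described above (this is not tied to any BPM tool, and all the names are invented), a small data structure can hold activities, data stores and external references, and validate that every data flow connects known nodes:

```python
from dataclasses import dataclass, field

@dataclass
class DataFlowDiagram:
    # The basic DFD constructs from the text
    activities: set = field(default_factory=set)
    data_stores: set = field(default_factory=set)
    external_refs: set = field(default_factory=set)
    flows: list = field(default_factory=list)  # (source, data, destination)

    def add_flow(self, source, data, destination):
        known = self.activities | self.data_stores | self.external_refs
        # Every flow must connect nodes that exist somewhere in the diagram
        if source not in known or destination not in known:
            raise ValueError(f"Unknown node in flow: {source} -> {destination}")
        self.flows.append((source, data, destination))

# A hypothetical order-fulfilment fragment
dfd = DataFlowDiagram()
dfd.activities |= {"Take Order", "Ship Goods"}
dfd.data_stores |= {"Orders DB"}
dfd.external_refs |= {"Customer"}
dfd.add_flow("Customer", "purchase order", "Take Order")
dfd.add_flow("Take Order", "order record", "Orders DB")
dfd.add_flow("Orders DB", "picked order", "Ship Goods")
print(len(dfd.flows))  # 3 flows captured
```

The validation step mirrors what a BPM tool does when it refuses to draw an arrow to a node that is not on the diagram.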
Thoughts for BI Practitioners:
  1. Consider viewing BI from the point of optimizing business processes
  2. It might be worthwhile to learn about Business Process Modeling, Process Flow Modeling and Data Flow Diagrams
  3. Understand the working of BPM tools and their usage in the enterprise BI landscape
  4. Beware of the acronym BPM. BPM is Business Process Management but can also be peddled as Business Performance Management.
  5. My view is that Performance Management is at a higher level, in the sense that it is a collective (synergistic) view of the performance of individual business processes. A strong performance management framework can help you drill down to the specific business processes that can be optimized to increase performance.

Monday 15 December 2008

The Esoteric World of Predictive Analytics

Let me start with the definition of Predictive Analytics as used in the literature: “The nontrivial extraction of implicit, previously unknown and potentially useful information from data”. If that doesn’t sound esoteric enough, you are probably more advanced than this post gives you credit for!
For a BI practitioner, it is important to get an understanding of Predictive Analytics (also known as Data Mining) as this subject definitely deserves a place in the wide spectrum of Business Intelligence disciplines. BI at a broad level is about optimizing business through “Hindsight, Insight and Foresight”. Predictive analytics adds the powerful “Foresight” part to business decision making.
Most BI practitioners tend to equate statistics with predictive analytics, and this post explains why such a view is inaccurate. To understand this, let’s start at the very beginning (a la Alice in Wonderland). Broadly, the world is divided into two types of systems:
  • Physical Systems – have causality and hence can be modeled mathematically with relative ease
  • Human Behavioral Systems – lack causality and can be modeled only with specialized techniques
Predictive analytics for business decision making is all about modeling human behavioral systems.
Why Is Traditional Statistics Insufficient?
Though the entry into predictive analytics requires that we understand the implications of traditional statistical analysis, statistics by itself is insufficient in the business context. Traditional statistical analysis allows us to understand the general group behavior and is primarily concerned with common behavior within the group – the central tendencies.
In business we generally develop models to anticipate human behavior of some type. Human behavior is inconsistent, lacks causality, and distributions based on human behavior almost always violate the assumptions of traditional statistical analysis (such as normally distributed data and stable means and standard deviations). The strength of data mining comes from the ability of the associated techniques to deal with the tails of the distributions, rather than the central tendencies, and from the techniques’ ability to deal with the realities of the data in a more precise manner.
In the realm of predictive analytics, we are concerned with modeling human behavior and hence are interested in the tails of our distributions: the small percentage of the population that responds to a campaign, commits fraud, leaves our business or purchases the next service.
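A quick simulated sketch (the 2% response rate is hypothetical) shows why a central-tendency view hides exactly the behavior the business cares about:

```python
import random
import statistics

random.seed(1)
# Simulated campaign outcomes: roughly 2% of people respond (the "tail")
population = [1 if random.random() < 0.02 else 0 for _ in range(10_000)]

# A central-tendency summary says "almost nobody responds"...
mean_rate = statistics.mean(population)
print(round(mean_rate, 3))

# ...but all the business value sits in the small group of responders
responders = sum(population)
print(responders)
```

A model judged only on overall accuracy could predict "no one responds" and look excellent, while being useless for targeting the tail.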
Though there are specialized techniques used for Predictive Analytics (viz. non-linear statistics, induction algorithms, cluster analysis and neural networks, to name a few), a BI practitioner is only expected to appreciate their usage in different business situations, prepare and model data as required by the tools, and interpret the results correctly (a much less daunting task indeed!)
Typically the model development process involves the following steps – a) Define Project, b) Select Data, c) Prepare Data, d) Transform Variables, e) Process Model, f) Validate Model, g) Implement Model. I will explain these steps in more detail in subsequent posts.
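The seven steps above can be sketched as a simple pipeline. This is a hedged illustration only: the data, the normalisation, and the deliberately naive threshold “model” are all invented for the example, not part of any real methodology or tool.

```python
def define_project(state):
    state["goal"] = "predict campaign response"   # a) Define Project
    return state

def select_data(state):
    # b) Select Data: a tiny in-memory sample (hypothetical records)
    state["rows"] = [
        {"spend": 120, "responded": 1},
        {"spend": 40,  "responded": 0},
        {"spend": 95,  "responded": 0},
        {"spend": 200, "responded": 1},
    ]
    return state

def prepare_data(state):
    # c) Prepare Data: drop rows with missing values (none in this sample)
    state["rows"] = [r for r in state["rows"] if r["spend"] is not None]
    return state

def transform_variables(state):
    # d) Transform Variables: normalise spend to the 0..1 range
    hi = max(r["spend"] for r in state["rows"])
    for r in state["rows"]:
        r["spend_norm"] = r["spend"] / hi
    return state

def process_model(state):
    # e) Process Model: a naive threshold "model", for illustration only
    state["model"] = lambda r: r["spend_norm"] > 0.5
    return state

def validate_model(state):
    # f) Validate Model: accuracy on the same tiny sample
    m = state["model"]
    hits = sum(m(r) == bool(r["responded"]) for r in state["rows"])
    state["accuracy"] = hits / len(state["rows"])
    return state

def implement_model(state):
    # g) Implement Model: here, simply hand the state back
    return state

pipeline = [define_project, select_data, prepare_data, transform_variables,
            process_model, validate_model, implement_model]
state = {}
for step in pipeline:
    state = step(state)
print(state["accuracy"])  # 1.0 on this toy sample
```

In practice each step is far richer (sampling, variable selection, cross-validation), but the pipeline shape stays the same.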
Fundamentally, an end-to-end BI view requires the practitioner to learn the concepts around statistics and predictive analytical techniques as available in tools (like say SQL Server Analysis Services) in addition to their technology bag of tricks around data integration, data modeling and OLAP.

Wednesday 10 December 2008

Business Objects Security

In the current business scenario, securing data and restricting which rows and columns of data users can and cannot see is very important. We can secure rows of data through row-level security; some people call this ‘fine-grained access control’. We can secure columns of data through column-level security, popularly called ‘object-level security’ in Business Objects.
ROW LEVEL SECURITY
There are various ways through which the row level security can be implemented in a Business Objects environment.
One way is by securing the datamart. In this approach, the security policies and rules are written in the datamart itself. Technically, a security table can be created and maintained, holding the users/groups with their corresponding access rights. Security policies can contain logic to compare the active logged-in user against the security table, so that users accessing the datamart are given access to their own data only after the security policies execute. We can also embed the security policies and rules in a view. A good example of row-level security: non-managers cannot see co-workers’ data, but managers can see the data of their subordinates. In Oracle (for example), we can create non-manager and manager views with the security rule (<security_table.user> = “USER”). The security views are imported into the Business Objects (BO) universe, and reports use these security views through the universe. The main ADVANTAGE of securing your datamart is that your security rules can also be used by other BI tools (Cognos, MicroStrategy), as the rules are built in the datamart and NOT in Business Objects.
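The security-table idea can be sketched outside any database; in a real deployment this logic lives in the datamart views described above, and the users, departments and column names here are hypothetical:

```python
# Each entry grants a user visibility over one department's rows
SECURITY_TABLE = {
    ("alice", "SALES"),
    ("bob",   "SALES"),
    ("bob",   "FINANCE"),  # bob is a manager, so he sees both departments
}

DATA = [
    {"dept": "SALES",   "revenue": 100},
    {"dept": "FINANCE", "revenue": 250},
]

def secured_view(rows, active_user):
    """Mimics a security view: WHERE (user, dept) IN security_table."""
    return [r for r in rows if (active_user, r["dept"]) in SECURITY_TABLE]

print(len(secured_view(DATA, "alice")))  # alice sees only the SALES row
print(len(secured_view(DATA, "bob")))    # bob sees both rows
```

The key property is that the filter is applied centrally, on the active user, before any report sees the data.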
A second way is by building the security rules in Business Objects. Here, the security rules comparing the logged-in user with the security data can be written in a virtual table, i.e. a universe derived table. BO reports use the derived table to access the datamart tables. Alternatively, we can define security filters in a BO universe; these are called condition/filter objects in the BO universe world. With this approach you can take maximum ADVANTAGE of the BO features; the disadvantage is that when you move to a different BI tool like Cognos, you need to rewrite the business security rules in the new tool.
In projects dealing with the migration of PeopleSoft transactional reporting to Business Objects analytical reporting, we can potentially reuse/import some security tables and security policies from PeopleSoft into our analytical datamart. These reusable components can save time in building a secured datamart and reporting environment.
COLUMN LEVEL SECURITY
Like row-level security, we can implement column-level security either in the datamart or in Business Objects. In the financial industry, business users do not want revenue amounts, social security numbers, tax id numbers and other sensitive columns to be shown to unauthorized users. In such cases, we can mask the sensitive columns by showing a restricted tag in their place, while non-sensitive columns like first name, last name, gender and age are shown as-is to the end business user. This logic can be implemented in a Business Objects universe derived table or in datamart views using decode / ‘if-then-else’ / case statements.
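The masking logic is equivalent in spirit to a DECODE / CASE statement in a datamart view or a universe derived table. A minimal sketch, with hypothetical column names:

```python
SENSITIVE = {"revenue", "ssn", "tax_id"}

def mask_row(row, authorized):
    # CASE WHEN authorized THEN value ELSE 'RESTRICTED' END, per column
    return {col: (val if authorized or col not in SENSITIVE else "RESTRICTED")
            for col, val in row.items()}

row = {"first_name": "Jane", "age": 41, "ssn": "123-45-6789", "revenue": 5000}
print(mask_row(row, authorized=False)["ssn"])      # RESTRICTED
print(mask_row(row, authorized=True)["revenue"])   # 5000
```

Non-sensitive columns pass through untouched either way, matching the behavior described above.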
Alternatively, we can use the universe object restriction feature in BO Designer to define restrictions on universe objects. Whenever a business user tries to drag a restricted object from the universe, the restriction rules are invoked, authorization occurs, and access to the object is given only if the user is successfully authorized for that object.
I’m signing off on BO security for now. The contents are based on my knowledge and BO experience across various projects. Thanks for reading. Please share your thoughts on this blog, and let me know your project experiences with row- and column-level security in Business Objects.

Thursday 20 November 2008

Zachman Framework for BI Assessments

The Zachman Framework for Enterprise Architecture has become the model around which major organizations view and communicate their enterprise information infrastructure. Enterprise Architecture provides the blueprint, or architecture, for the organization’s information infrastructure. More information on the Zachman Framework can be obtained at www.zifa.com.
For BI practitioners, the Zachman Framework provides a way of articulating the current state of the BI infrastructure in the organization. Ralph Kimball in his eminently readable book “The Data Warehouse Lifecycle Toolkit” illustrates how the Zachman Framework can be adapted to the Business Intelligence context.
Given below is a version of the Zachman Framework that I have used in some of my consulting engagements. This is just one way of using this framework but does illustrate the power of this model in some measure.
[Figure: Zachman Framework adapted for BI assessments]
Some Salient Points with respect to the above diagram are:
  • The framework answers the basic questions of “What”, “How”, “Who” and “Where” across 4 important dimensions – Business Requirements, Conceptual Model, Logical/Physical Model and Actual Implementation.
  • Zachman Framework reinforces the fact that a successful enterprise system combines the ingredients of business, process, people and technology in proper measure.
  • It is typically used to assess the current state of the BI infrastructure in any organization
  • Each cell that lies at the intersection of a row and a column (Ex: Information Requirements of Business) has to be documented in detail as part of the assessment document
  • Information on each cell is gathered through subjective and objective questionnaires.
  • Scoring Models can be developed to provide an assessment score for each of the cells. Based on the scores, a set of recommendations can be provided to achieve the intended goals.
  • Another interesting thought is to create an As-Is Zachman framework and overlay it with a To-Be one in situations where re-engineering of a BI environment is undertaken. This will help provide a transition path from the current state to the future state.
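The scoring idea above can be sketched as follows. The cell scores are hypothetical questionnaire results on a 1-5 scale; any real scoring model would weight cells according to the engagement's goals:

```python
# Rows and columns of the adapted framework, as described in the post
ROWS = ["Business Requirements", "Conceptual Model",
        "Logical/Physical Model", "Actual Implementation"]
COLS = ["What", "How", "Who", "Where"]

# A neutral baseline score of 3 for every cell, with a few overrides
scores = {(r, c): 3 for r in ROWS for c in COLS}
scores[("Business Requirements", "What")] = 5   # well documented
scores[("Actual Implementation", "Where")] = 1  # the weakest cell

def weakest_cells(scores, threshold=2):
    """Cells scoring at or below the threshold become recommendations."""
    return sorted(cell for cell, s in scores.items() if s <= threshold)

print(weakest_cells(scores))  # [('Actual Implementation', 'Where')]
```

Ranking cells this way turns the assessment into a prioritized list of recommendations, as the bullet above suggests.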
Thanks for reading. If you have used the Zachman framework differently in your environment, please do share your thoughts.

Tuesday 21 October 2008

Business Intelligence Value Curve

Every business software system has an economic life. This essentially means that a software application exists for a period of time to accomplish its intended business functionality after which it has to be replaced or re-engineered. This is a fundamental truth that has to be taken into account when a product is bought or for a system that is developed from scratch.
During its useful life, a software system goes through a maturity life cycle. I would like to call it the “Value Curve” to establish the fact that the real intention of creating the system is to provide business value. As a BI practitioner, my focus is on the “Business Intelligence Value Curve”, and in my humble opinion it typically goes through the following phases, as shown in the diagram.
[Figure: The Business Intelligence Value Curve and its four stages]
Stage 1 – Deployment and Proliferation
The BI infrastructure is created at this stage, catering to one or two subject areas. Both the process and technology infrastructure are established, and there are tangible benefits for the business users (usually the finance team!). Seeing the initial success, more subject areas are brought into the BI landscape, which leads to the first list of problems: poor data quality, incomplete data, and duplication of data across data marts / repositories.
Stage 2 – Leveraging for Enterprise Decision Making
This stage takes off by addressing the problems seen in Stage 1, and an overall enterprise data warehouse architecture starts taking shape. There is increased business value compared to Stage 1, as the Enterprise Data Warehouse becomes the single source of truth for the enterprise. But as the data volume grows, the value is diminished by scalability issues. For example, data loads that used to take ‘x’ hours to complete now need at least ‘2x’ hours.
Stage 3 – Integrating and Sustaining
The scalability issues seen at the end of Stage 2 are alleviated, and the BI landscape sees much higher levels of integration. Knowledge is built into the setup by leveraging metadata, and user adoption of the BI system is almost complete. But the emergence of a disruptive technology (for example, BI appliances), a completely different service model for BI (Ex: cloud analytics), or a regulatory mandate (Ex: IFRS) may force the organization to start evaluating completely different ways of analyzing information.
Stage 4 – Reinvent
The organization, after appropriate feasibility tests and ROI calculations, reinvents its business intelligence landscape and starts constructing one that is relevant for its future.
I do acknowledge that not all organizations will go through this particular lifecycle, but based on my experience architecting BI solutions, most do have stages of evolution similar to the ones described in this blog. A good understanding of the value curve helps BI practitioners provide the right solutions to the problems encountered at different stages.

Friday 10 October 2008

Business Intelligence Challenge – Product Upgrades & Migrations, Object Consolidation – 2

As an initial step, one of the key tasks to be considered in any Business Intelligence product upgrade or migration is ‘Object Consolidation’.
What is Object Consolidation? It is the process of understanding the current BI environment by means of its metadata, and analysing that metadata with a view to determining and eliminating redundant objects. The ‘objects’ in a BI product are its reports and semantic layer definitions (like the Universe in Business Objects).
Steps Involved in Object Consolidation
1. Locate all objects (reports and semantic definitions). These objects could be in a central repository as well as in individual user folders and desktops
2. Check whether the objects’ metadata is available in relational storage (a metadata repository); otherwise, build processes that collect the metadata of the objects and store it in a relational structure
3. Run SQL queries against the relational structure to determine:
a. ‘Duplicates’ – objects that have the same metadata elements
b. ‘Clusters’ – objects that have similar metadata elements; when objects (reports) differ by only 1 or 2 metadata elements, they are grouped as a ‘Cluster’
c. ‘Dormant’ – objects that are no longer used
d. Complexity of the objects, in terms of factors like the number of metadata elements used in an object
4. Share the object consolidation findings with the users for confirmation and verification
5. Prepare the consolidated list of objects by eliminating the duplicate and dormant objects and including only the prime object from each cluster
a. Duplicate objects are directly removed
b. From a cluster, only the key object is considered for upgrade; after the upgrade of the key object, the rest of the objects in the same cluster are derived from this upgraded key object
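The duplicate/cluster/dormant classification can be sketched by reducing each object's metadata to the set of metadata elements it uses. The object names, element sets and thresholds below are all hypothetical:

```python
objects = {
    "SalesByRegion":      {"region", "revenue", "quarter"},
    "SalesByRegion_copy": {"region", "revenue", "quarter"},              # duplicate
    "SalesByRegionYTD":   {"region", "revenue", "quarter", "ytd_flag"},  # cluster
    "OldInventoryRpt":    {"sku", "stock"},
}
last_used_days_ago = {"SalesByRegion": 3, "SalesByRegion_copy": 10,
                      "SalesByRegionYTD": 5, "OldInventoryRpt": 400}

def classify(objects, usage, dormant_after=365, cluster_tolerance=2):
    names = list(objects)
    duplicates, clusters, dormant = set(), set(), set()
    for name in names:
        if usage[name] > dormant_after:
            dormant.add(name)            # not used for over a year
    for i, a in enumerate(names):
        for b in names[i + 1:]:
            diff = len(objects[a] ^ objects[b])  # symmetric difference
            if diff == 0:
                duplicates.add(b)        # identical metadata: keep a, drop b
            elif diff <= cluster_tolerance:
                clusters.update({a, b})  # differ by only 1-2 elements
    return duplicates, clusters, dormant

dups, clus, dorm = classify(objects, last_used_days_ago)
print(sorted(dups), sorted(dorm))
```

In a real engagement the same comparisons run as SQL over the metadata repository, but the set logic is identical.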
The consolidated list of objects, together with an understanding of the complexity of the existing environment, becomes one of the key inputs for planning the upgrade process.
Benefits of Object Consolidation
1. Eliminates the upgrade of unwanted objects, saving effort and cost
2. Enables building a clean system on the newer version or platform, ensuring easier system maintenance
3. Enables effective upgrade planning based on an understanding of the environment
4. Improves the understanding of the existing environment through its metadata links
Object Consolidation Challenge: Accessing the metadata of the objects can be a challenge, since many BI products don’t expose metadata that can be queried through SQL. But almost every product provides an SDK through which the metadata can be accessed, or exposes the metadata as XML files. We would need to build tools that pull the metadata using the SDKs or, in the case of XML files, build XML readers/parsers to pull the required metadata.
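The XML route can be sketched with the standard library. The XML layout below is entirely hypothetical, not any vendor's actual export schema; a real parser would follow the product's documented format:

```python
import xml.etree.ElementTree as ET

xml_export = """
<objects>
  <report name="SalesByRegion">
    <element>region</element>
    <element>revenue</element>
  </report>
  <report name="OldInventoryRpt">
    <element>sku</element>
  </report>
</objects>
"""

rows = []
for report in ET.fromstring(xml_export).iter("report"):
    for element in report.iter("element"):
        # One row per (object, metadata element), ready to load into the
        # relational metadata structure for SQL analysis
        rows.append((report.get("name"), element.text))

print(rows[0])  # ('SalesByRegion', 'region')
```

Once the (object, element) rows are in a relational table, the duplicate/cluster/dormant queries from the previous post's steps can run directly against them.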

Tuesday 30 September 2008

Business Intelligence – The Reusability Gene

One issue that confronts me time and again while executing BI projects is “Reusability”, actually the lack of it. Let me give an example. 
In the many migration and upgrade projects that Hexaware (my company) has executed, I always find that the number of reports finally migrated/upgraded to the new environment is only 40-50% of the number initially provided to us by the customer. Report rationalization has become such a critical step that we have developed specific metadata tools that help rationalize the reporting environment. Coming back to the topic: the reason for such a divergence between the final number of reports and the initial number is the lack of ‘reusability’. Business users keep their own versions of standardized (?) reports on their desktops, which are nothing but small variations (usually with a new filter added) of an already existing report.
Another similar example on the data integration side is the creation of ad-hoc ETL routines as and when required. This results in duplication of ETL jobs and also results in a non-standard BI environment.
Lack of re-use causes two major problems:
1) The BI environment becomes bloated with unwanted components that use valuable computing resources, resulting in delays in the availability of more important information.
2) Any attempt at upgrading/re-engineering the existing system results in high costs and undesirable heartburn among business users.
The Prescription:
1) Establish a corporate level BI team whose primary responsibility is to ensure that any component addition (ETL, Reports, and Models etc.) is justified based on its purpose. This team has to ensure that existing standards and components are reused to the maximum extent.
2) Strengthen the “Business Metadata” architecture within the organization. In one of my earlier posts, I had explained my view of BI metadata and that is very relevant to the task of improving reusability.
Basically, the “Reusability gene” seems to be a little muted in its functioning among BI practitioners. It is time that BI teams within organizations and system integrators like Hexaware look at reusability as a critical parameter while developing and deploying BI solutions.

Wednesday 17 September 2008

Business Intelligence Challenge – Product Upgrades & Migrations – 1

Product Upgrades are situations where we move from one version of a product to the latest version of the same product. Upgrades happen:
  • to ensure support from the product vendor
  • to leverage new features provided by the latest version in terms of performance and user experience
  • because some other new product being added to the architecture doesn’t talk to the existing version
Product Migrations are situations where we move from one vendor’s platform to another vendor’s platform. Migrations happen:
  • as ‘BI Standardization’ initiatives drive organizations to move towards a common platform to operate BI systems at a lower cost and provide uniform user experience
  • because of bad experience with the current product not meeting the business needs in terms of performance or usability or product support or license cost
  • because of recent mergers and acquisitions, which lead organizations to think of a ‘safer’ platform
Why is an upgrade a challenge? Major products, especially ones like Business Objects and Cognos, undergo such rapid change that newer versions of the same product come out on a different architecture with an entirely new set of components. Upgrades are no longer mere upgrades; they have become effort-intensive product migrations, almost similar to moving from one BI vendor to another.
Let us call either an upgrade or a migration an ‘Upgrade’, as any such initiative is for a better, upgraded experience for both business and IT.
“Can we do this upgrade next year?” is a common response when an IT team requests a Business Intelligence product upgrade. The upgrade is one of the key items that comes up for discussion during BI budget allocation in every organization. Fears persist among the business that upgrade projects will consume many of their hours without much benefit to them. For IT, an upgrade is a bigger challenge due to the unpredictability of the problems they will face during the course of the project, and the need to ensure minimal disturbance to the business team. Hence BI initiatives related to product upgrades go through multiple rounds of scrutiny before budget approval. Such projects are seen as IT initiatives, and a clear definition of business benefits is difficult to build.

Tuesday 9 September 2008

Business Intelligence – The Unconquered Territories

Bill Bryson, one of my favorite authors, writes this way in the book “A Short History of Nearly Everything” and I quote:
“As the nineteenth century drew to a close, scientists could reflect with satisfaction that they had pinned down most of the mysteries of the physical world: electricity, magnetism, gases, optics, kinetics, and statistical mechanics, to name just a few. If a thing could be oscillated, accelerated, perturbed, distilled, combined, weighed or made gaseous they had done it, and in the process produced a body of universal laws so weighty and majestic that we still tend to write them out in capitals. The whole world clanged and chuffed with the machinery and instruments that their ingenuity had produced. Many wise people believed that there was nothing much left for science to do”
Now we all know how much science went on to invent and discover in the 20th century.
Sitting here in 2008, when I hear people speaking about BI, I sometimes get the feeling that we are on the verge of accomplishing everything in this space. Alas! That is as far from the truth as it gets. There are so many “unconquered territories” in BI that if you thought the past was challenging enough, it is time to get rejuvenated for wrestling with bigger challenges in the future.
My top ten “Unconquered Territories” for BI Practitioners are:
1) The majority of BI decision making is geared towards analysis of structured data. Usage of unstructured data is minimal at best and non-existent in many cases.
2) There is still a lot of work to be done in integrating the process rigor of Six Sigma or a quality management methodology (say, CMMI) into the BI paradigm. Unless that is done, BI will not be sustainable in the long run.
3) Lack of valuation techniques. BI systems are corporate assets like human resources, brands, etc., and there have to be concrete models for valuing them.
4) Predictive analytics / data mining is used effectively by only a handful of organizations. There is no shortage of techniques, but the world is probably short of people who can apply high-end analytical techniques to solve “common-sense”, real-world business problems.
5) Let’s face it – There are technology limitations. Operational BI (Lack of real-time data access), Guided analytics (Lack of comprehensive business metadata), Information as a Service (Lack of SOA based BI architecture) are some of those technology limitations that come to my mind.
6) Data Quality is a nightmare in most organizations. Either the data is already ‘dirty’ or there is really no governance process which leaves the only option that data will become ‘dirty’ eventually.
7) Here is a mindset challenge – BI practitioners, in my view, need to develop a higher level of “business process” oriented thinking, which seems to be lacking given the ever-increasing technology complexity of BI tools.
8) Simulations!! – Businesses run with a lot of interdependent variables. Unless a simulation model of the business is built into the analytical landscape, there is really no way of having a handle on the future state of business. Of course, ‘Black Swans’ will continue to exist but that’s a different subject matter altogether.
9) On-demand analytics – I accept that I am being a little unfair here in expecting BI to catch up with the nascent world of “cloud” computing so early. But the fact remains that much work can be done in this area of “Cloud Analytics”.
10) Packaged analytics is a step in the right direction – organizations can quickly deploy analytical packages and spend more time on optimizing business decisions. Having said that, the implementation difficulty, combined with the lack of flexibility in packages, is an area of concern to be alleviated.
Each one of us will have our own list of “unconquered territories”. Probably it is worthwhile to put everything down on paper and nudge your BI environments towards conquering all those areas and beyond.