Friday, December 31, 2010

Social Media

An excellent blog post on social media, analytics tools, and marketing research. It covers the social media fundamentals in these areas.

Analytics and In-Memory Databases are Changing Data Centers

This is a solution for real-time analytics on large data volumes. My first exposure to this technology was about 18 months ago, and I was impressed with its capabilities.

Another solution for real-time analytics is the use of intelligent agents (think of the crawlers currently in use, but with embedded analytics capabilities added on to the recognition and identification algorithms). I wrote an article about this in 2007:

In-memory database analytics and intelligent agents are not mutually exclusive.  If anything, these technologies are complementary in nature.

Thursday, December 30, 2010

Data Mining Awards 2010

Bringing Down the High Cost of Business Forecasting

Excellent and on point article by MIT Review about the decreasing costs of business analytics using cloud computing analytics tools.

Sunday, December 26, 2010

Robotic Surgery and Business Analytics

Robotic Surgery and Business Analytics
by Alberto Roldan
Copyright 2010 Alberto Roldan

Imagine a future where we can use the techniques, technologies, and methodologies of robotic surgery in business analytics.  For example:
  1. What if we could "see", diagnose, and repair the health of a business using 3D visualizations, predictive algorithms, and statistical process control?
  2. What if we could examine in detail the healthy parts of our business and compare them, in real time, with the parts that are diseased or in need of repair?
  3. What if we could monitor the different components of the health of a company (i.e., financial, operational, marketing, CRM, and social media) in real time while making strategic and tactical decisions?
The surgical outcomes of robotic surgery are less pain and faster recovery.  We trust an organism as complex as the human body to robotic surgery.  What is preventing us from using the same science, technology, and mathematical methods in business analytics?  I believe that we are living in the future, because the future is now.

Click the link above and tell me if you are interested.

Contact me at

Thursday, December 16, 2010

Trends in Business Analytics for 2011: Alberto's Predictions

There are three main trends that we can see occurring in 2011 across all industries, regardless of company size: (a) a large increase in the volume of work, (b) utilization of social media as a key data source, and (c) challenges in implementation and innovation. First, the year 2011 will bring a tremendous increase in the volume of work in business analytics. Specifically, the area of predictive analytics gained momentum in the last half of 2010 that will carry into 2011 without any abatement in sight. My prediction is that companies will increase their appetite for predictive analytics in 2011 by over 200% from 2010.

The need for more efficient ways to work and to adapt to swift changes in the economic environment will feed companies' craving to use business analytics. The challenges accompanying this increased appetite for predictive analytics will include the following:

  1. Prioritization of analytics projects to align with corporate strategic and tactical objectives
  2. Analytics resource identification - profiling the individuals who know predictive modeling techniques
  3. Institutionalization of budgeting for analytics projects, or no more unplanned budgets for analytics projects

Whether business analytics resides in the IT or the business organization, companies will need to modify their business models to reflect the contributions and needs of an analytics organization. Companies need to decide whether they will bring in analytics talent by acquisition, hiring, or outsourcing. Offshoring will be substantially different for analytics than for IT (rate cards, project management, and resources). Adjustments to delivery expectations for analytics projects may be needed, since the analytics portion of a project can be a short-term exercise while the enterprise implementation of the same project can be a long-term effort.

Social media will become an even more important key data source. It will be used to reduce the time that it takes to predict trends affecting companies and their competitors. Companies that successfully leverage analytics in social media to detect future trends, and make changes to their strategy, will differentiate themselves in the marketplace by using “swift insights” to quickly adapt to changing market conditions. The best example for understanding the importance of swift insights is the experience of Coca-Cola vis-à-vis Gap. In 1985 Coca-Cola introduced a new Coke formula, and it took nearly three months of public outcry for the company to return to its original formula. Gap, on the other hand, introduced its new logo, and the public backlash in social media was so pronounced that one week later it returned to the original logo. In mathematical terms this means that social media accelerated the identification of the need for change by roughly eleven times (11x). In other words, if an average car ran at 50 miles per hour in 1985, it would run at 550 miles per hour in 2010.

The involvement of business personnel will become essential in defining the social media analytics strategy and in testing the results of any analytics project. This involves additional time commitments that must be managed both strategically and operationally. A savvy, innovative, and experienced staff with consulting, technology, finance, and analytics skills will become critical in 2011 for companies to successfully integrate social media analytics into their business models. Any licensed driver can drive a car at 50 miles per hour, but only skilled experts can maneuver a car at 550 miles per hour.

The areas of innovation and implementation are connected by a combination of best practices and repeatable processes in best-of-breed companies. Enterprise analytics implementation requires a combination of knowledge of statistics, analytical tools, and optimization techniques. Companies must guard against unwise investments in analytics by following best practices: references, due diligence, proof of value, pilot or POC, evaluation, budgeting, project planning, and implementation. Once the analytics problems have been defined and aligned with strategic goals, companies should look at their internal project planning process to ensure the availability of the right resources, skills, and budget.

Planning a phased approach is recommended in all analytics projects. This allows for the evaluation of the business lift; in addition, it gives time for improvements and modifications that are department- and geography-specific. The key to planning analytics capabilities within a company is to build a small but strong foundation of business, technical, and analytics skills and then move from small projects to larger projects.

The integration of technology innovations with analytics will be a crucial test for many companies in 2011 and beyond. Three main technologies will make an impact on the way we do business in 2011: mobile devices, visualization, and speech technologies. The delivery of predictive analytics results on mobile devices, like the iPad, tablets, and smart phones, allows executives and field personnel to have access to swift insights. Those insights will let decision makers at all levels of a company know the impact of their decisions on revenues, costs, and profitability.

The ability to use analytics and mobile devices to deliver filtered, ready-to-act information (converting data into information) will depend on innovative visualization techniques. The screen space on mobile devices is smaller than on laptops and PCs, hence the need for smarter visualizations. Spreadsheets are not efficient on smart phones, and for companies with hundreds of products that need to represent multiple dimensions or variables (i.e., predicted revenues, profits, and commissions), line and bar charts are also of limited use. The use of interactive 3D visualizations on mobile devices to represent analytics outcomes will become a new breakthrough in the business world. 3D visualizations and predictions are common practice in web-based games, and those algorithms will be integrated into the business world in 2011.

Speech technology is another innovation that will be making its mark in 2011 and beyond. In December 2010 two high school students won a prize at the Siemens Competition by developing a speech recognition algorithm that can detect a speaker’s emotion better than any current technology. Imagine how many errors individuals make when they are in a hurry or otherwise distracted. Using a combination of this speech recognition technology and predictive analytics outcomes, all delivered through mobile devices, the business impact of preventing such errors could be billions of dollars annually.

Finally, my last prediction: in order to flourish and quickly adapt in these rapidly changing economic times, we need to listen carefully to those that are our future. The examples above from the Siemens Competition (speech recognition), Gap (social media), and web-based games (visualizations) involve technologies commonly used by the 9-to-30-year-old population. One of my main roles as an innovator is to listen to those voices and use my experience to provide guidance in implementing those new technologies and methodologies for businesses. A warning and advice to companies: listen carefully to those that represent our future. The future belongs to them, and our job is to provide guidance based on our experience.

My wish for companies in 2011: be the future, not the past. Companies need to be open to new ideas and new ways of doing business using analytics. Learn from the proposed Blockbuster-Netflix partnership in 2000: Blockbuster laughed Netflix out of its office, thinking that the online subscription service model would not be successful in the movie rental business. Now Netflix is a thriving business with 16 million members, while Blockbuster is in bankruptcy with $900 million in debt.

Friday, December 03, 2010

Smart-grid analytics to be $4 billion industry

I recently completed a successful predictive analytics project in smart grid analytics for a utility with about 1 million smart meters. The methodology we developed allows us to predict meter events, as well as power outages and spikes in consumption at the meter level. See,

For additional details contact me.

Thursday, December 02, 2010

Applicability to Business Analytics of New Science Discoveries in Stem-Cell Biology and Astrophysics

In the last week, there were two major scientific discoveries: scientists “tricked” cells into changing identities, and there are three times more stars in the universe than previously thought. What are the lessons that we can learn and apply from these discoveries?

First, the methodology used by scientists for transforming one specialized cell into a different specialized cell is a “reverse and conversion” approach. In other words, molecular biologists reversed the development process of a specialized cell and turned it back into a basic, or stem, cell. Once this “reverse” step was completed, the scientists guided the new basic cell to transform into a different specialized cell. In retail and consumer goods/services, companies are always coming up with new strategies to increase sales and profitability by “converting” a competitor’s consumer into one of their own. In other words, the emphasis is on “conversion”, but there is no prior “reverse” methodology. Business analytics techniques can be used to identify actionable “reverse” methodologies that can be combined with conversion methodologies to increase revenues and profitability. In other words, companies must have a methodology to determine how to get a potential customer to reverse an established purchasing pattern before attempting to “convert” that customer.

Second, the study concluding that the number of stars in the universe had previously been undercounted, and that the correct number is three times the previous estimate, is a case study in applying good science to known facts. One of the reasons for the undercount was the assumption that the distribution of most galaxies is spiral, like our own Milky Way. The recent study found this assumption to be incorrect. Moreover, the study found that the ratio of dwarf stars to sun-like stars is about 1,000 times greater in elliptical galaxies than in spiral galaxies. What is the application of these discoveries in the area of business analytics? There are two main lessons. One, assumptions about patterns, ratios, distributions, and correlations in large data sets that are based on observation of a limited (and not statistically valid random) sample could end up unsupported by reliable scientific methods. Two, it is imperative to have a methodology that deals with missing data before reaching conclusions and recommendations.
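The first lesson above can be sketched in a few lines of Python: an estimate built from a convenient but non-random subset of a data set can miss the true value badly, while a random sample of the same size lands close. The data here are synthetic and purely illustrative.

```python
import random

random.seed(42)

# Synthetic "population": 10% of records have some attribute overall,
# but those records happen to be concentrated at the front of the file
population = [1] * 100 + [0] * 900
true_rate = sum(population) / len(population)           # 0.10

# A convenient, non-random subset: just the first 100 records
convenient = population[:100]
biased_rate = sum(convenient) / len(convenient)         # 1.0 -- wildly off

# A statistically valid random sample of the same size
random_sample = random.sample(population, 100)
sampled_rate = sum(random_sample) / len(random_sample)  # close to 0.10
```

The biased subset estimates the rate at 100% when the truth is 10%; the random sample of identical size comes in near the true value, which is the whole point of insisting on valid sampling before drawing conclusions.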

In summary, companies should look at methodologies and discoveries in the sciences and apply that knowledge to their business analytics strategy and tactics.

Wednesday, December 01, 2010

Business Analytics Project Prioritization - ©Copyright 2010 Alberto Roldan

As companies get involved in the area of predictive analytics, the issue of project prioritization has become more important. Companies need an analytics project prioritization framework that allows them to maximize resources while simultaneously reducing costs and improving profitability. Traditionally, companies have looked at predictive analytics as an isolated function within the organization. As predictive analytics, data mining techniques, and other advanced methodologies start to permeate the enterprise, prioritization has become a keystone of the analytics strategy.

Initially, companies looking for a prioritization framework look at their IT project or IT reporting processes in an attempt to adapt those processes to business analytics. Companies tend to find that those processes are insufficient, since an analytics project is more a combination of an IT, reporting, and research and development (R&D) project.

Let me suggest the components of a framework to help companies prioritize their analytics projects:
1. Identify an analytics project prioritization committee composed of:
a. Executive sponsor
b. Business sponsor
c. IT lead
d. Analytics lead
e. Other interested parties – they may not vote but should have input in the prioritization process – for example, the project management lead, finance lead, human resources lead, and administrative support.

2. Develop standard operating procedure for meetings (agenda, minutes, decision making criteria) – meetings could be by teleconference and members of the committee can give their documented input by email

3. Follow the six M’s as the entry point for the prioritization of all analytics projects:
a. Must be Easy
b. Must be Friendly
c. Must be Accessible
d. Must be Efficient
e. Must be Documented
f. Must be Consistent

4. Identify how the project aligns with corporate and department-specific strategic goals

5. Prepare business case – the key is to quantify the business case: investment costs, quantifiable savings, changes and disruptions, soft benefits and ROI.

6. Prioritize analytics projects using a quantifiable framework approach along the following dimensions:
a. Strategic and Tactical initiatives
b. Costs
c. Business Impact
d. Disruption of other initiatives
e. Impact on organization financial health
f. Analytics maturity level of end-users
g. Resource availability
h. Budget availability
i. Technology availability
j. Leverage current IT environment
k. Leverage current business environment
l. High-level execution Plan
m. Exceptions
n. Process improvement suggestions

7. Document prioritization methodology so that the process is transparent

8. Notification of project prioritization with quantifiable scores and/or reasons for exceptions
a. It should include the high-level execution plan
b. All stakeholders, interested parties and committee members should be notified.

9. Process improvement analysis and recommendations

Although the aforementioned process is not all-inclusive, it gives a general overview of how to develop an efficient and documented business analytics project prioritization framework.
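As a rough illustration of step 6, the scoring dimensions can be collapsed into a single comparable number with a weighted sum. The weights, the 1-5 scoring scale, and the example projects below are all hypothetical; a real committee would calibrate them to its own strategy.

```python
# Hypothetical weights over a subset of the prioritization dimensions;
# they sum to 1.0 so scores stay on the committee's 1-5 scale
WEIGHTS = {
    "strategic_fit": 0.35,
    "business_impact": 0.30,
    "cost": 0.15,               # higher score = lower cost
    "resource_availability": 0.20,
}

def priority_score(scores):
    """Collapse per-dimension 1-5 scores into one weighted number."""
    return sum(WEIGHTS[dim] * value for dim, value in scores.items())

# Two made-up candidate projects scored by the committee
projects = {
    "churn-model": {"strategic_fit": 5, "business_impact": 4,
                    "cost": 3, "resource_availability": 4},
    "ad-hoc-report": {"strategic_fit": 2, "business_impact": 2,
                      "cost": 5, "resource_availability": 5},
}

# Rank projects for review, highest score first
ranked = sorted(projects, key=lambda p: priority_score(projects[p]), reverse=True)
```

Publishing the weights alongside the scores also serves steps 7 and 8: the methodology is transparent, and every stakeholder can see why one project outranked another.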

Monday, November 22, 2010

Smart Grid Business Analytics: Adding Value to the Smart Grid

Most electric utility companies are investing in deploying smart meters to replace their aging infrastructure. Recently, I finished a smart grid analytics project with one of the largest electric utility companies in the US and found that smart meter data can be used to accurately predict (R-squared of .86 to .97) the following:

1. A meter that is going to go bad within a 30-day period;
2. A power outage at the meter level, 1 to 3 hours in advance; and
3. A spike in consumption at the meter level, 1 to 3 hours in advance

The general smart grid analytics methodology that we developed can be used across the board and is technology agnostic. The value to electric utility companies is that it allows them to integrate predictive capabilities into their Meter Data Management Systems (MDMS) in near real time. These predictive capabilities can be used in load forecasting, settlement, distribution planning, and the efficient deployment of field staff for problems that require manual intervention.
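To make the idea concrete, here is a toy sketch of how meter-level event counts might be turned into a failure-risk flag. The field names, weights, and threshold are invented for illustration; the actual project used fitted predictive models, not hand-set weights.

```python
def meter_risk_score(comm_errors, voltage_events, reading_gaps):
    """Weighted anomaly score: more recent anomalies -> higher failure risk.

    The weights here are hand-picked placeholders; a production model
    would learn them (e.g., via logistic regression) from historical
    meter-failure labels.
    """
    return 0.5 * comm_errors + 0.3 * voltage_events + 0.2 * reading_gaps

# Hypothetical 30-day event counts for two meters
meters = {
    "M-001": meter_risk_score(comm_errors=8, voltage_events=3, reading_gaps=5),
    "M-002": meter_risk_score(comm_errors=0, voltage_events=1, reading_gaps=0),
}

# Meters above a chosen threshold get routed to field staff via the MDMS
flagged = [meter for meter, score in meters.items() if score > 4.0]
```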

Thursday, November 18, 2010

Business Analytics: Thoughts that guide my journey as an applied mathematician

-It never ceases to amaze me, the insight of the incredibly intelligent and hardworking individuals I work and associate with. Life is a journey of never-ending acquisition of knowledge, and the greatest source of that knowledge is always around us.
-When I take the time to consciously hear, see, and feel what others are trying to teach me, the learning curve gets substantially reduced.
-I prefer working with a group of people who work hard to achieve a common objective.
-Awareness of my own knowledge limitations makes me a better learner.
-The depth and width of a specific insight has no relationship with the physical characteristics of the individual presenting the insight.
-Younger minds do contribute as much to insights as more mature intellects.
-Passion sometimes is the only differentiator.
-Mathematics is a precise science; do not mess around with it.
-Results are just results. Interpret them, change the methodology, or discard them, but never make them up.
-I would rather be respected for my integrity than accepted for my lack thereof.
-Just because something has always been done a certain way that does not mean that there is not a better or different way.
-People would rather live with a problem that they understand than with a solution that they do not.
-A simple solution sometimes is the best solution.
-I will get up every time that I fall down.
-Falling hurts, but not getting up will end up hurting more.
-Ignore those that tell you that it cannot be done.
-If you cannot get the same results, it is because you did not follow the same process.
-If you are successful some people are going to say that the results were obvious and the work easy. At that time, remember to smile and say something nice about them. Some people are just negative by nature.
-Fear is an emotion that needs to be tempered at all times. It is like salt: too little and the food is bland, too much and it ruins a meal.
-My ratio is about 20:1. For every 20 things that I have tried only one is successful. I improve my ratio by not repeating the same mistakes more than once.
-My job in life is to inspire and build up those around me. I have found that honesty is the only rule, humor keeps us human, joy is a choice, and there is beauty all around.

Wednesday, October 13, 2010

Marketing Analytics Conference, Boston, Nov. 10, 2010

I want to invite you to the marketing analytics conference in Boston on Nov. 10th, where I will be a speaker in the area of advanced marketing analytics. Our Business Analytics group on LinkedIn now has over 14,000 members, and this will be a time to exchange ideas. See you at the conference. Our members will receive a $100 discount on the full delegate price and the academic/government rates – they need to quote the promotional code Roldan to receive this discount. For further information on the event, the website is:

To register for the event the URL is

Thursday, September 23, 2010

IBM Quest for Analytics Dominance

IBM has capabilities in data mining, analytics software, ETL/BI, and advanced purpose-built hardware through SPSS, Cognos, and Netezza, and it should make a handsome profit with all those capabilities. Nevertheless, the companies that use these tools need to utilize them effectively to gain an edge in their respective markets. Just buying these analytics tools is not enough to build an analytics system. For example, I may own all the best power tools, but that does not mean that I can build a house.

Companies that use business analytics are somewhat like restaurants. They may have a beautiful site and a great location, but customers go to a restaurant because of the food. People go to fast food restaurants because they are simple and convenient, but fast food restaurants do not have the ability and flexibility to change their menus to adapt to market conditions. Companies that want simple and convenient analytics solutions will find themselves making substantial investments but not getting any substantial ROI.

The way that companies can differentiate in the analytics market place is by adhering to these principles:

  1. Time-to-market - The ability to turn around complex analytics projects in no more than 30-90 days is essential. Companies have a need for speed and are driven by quarterly earnings reports.

  2. Low-cost business model - Offshore/onsite models and revenue-sharing business models are essential in today's economic environment.

  3. Technology neutral - It is extremely important to understand that most advanced analytics tools in the marketplace are based on the same mathematics at the back end.

  4. Visualization and ease of use - Enabling end users, from field personnel to executive management, to "see" and act on relevant and pertinent information is a keystone of transforming data into actionable information.

  5. Industry specific knowledge - The knowledge of the industry issues and specific data elements is something that cannot be ignored for a successful analytics project.

In the next 3-6 months we will see the quest for analytics dominance shift from software and hardware to the effective application of those products to solve business problems.

Saturday, September 11, 2010

How Priority Inbox works

My hat is off to Google for bringing this innovation to market. Although time and input from users will make this tool more accurate in its predictions, it is a leap in terms of putting predictive analytics technology at the fingertips of all users. It takes the idea of bringing predictive modeling and analytics to the masses to a new level. Just imagine the possibilities of this technology in banking, healthcare, travel, and other industries. As I have stated before, "We are living in the future".

Tuesday, August 31, 2010

Marketing Analytics: Understanding Segmentation and Prediction

By Alberto Roldan, Copyright 2010 Alberto Roldan
Campaign management, media mix optimization, cross-sell, and up-selling are some of the terms that are commonly used in marketing analytics. Although those terms are understood in a business context, the advanced analytics techniques behind those terms are not as well understood. The purpose of this article is to explain some of the analytics techniques used in marketing analytics, such as segmentation and predictive analytics, so executives understand the capabilities and limitations involved. A secondary objective is to help analytics professionals explain concepts to businesses.

Segmentation is the process of dividing a large market into groups that have similar characteristics. There are two main issues that I have found in explaining this process to businesses: 1) the different processes in arriving at granular vs. aggregated segmentations; and 2) the difference between “similar” vis-à-vis “equal” characteristics within a segment.

Granular vs. Aggregated Segmentations
Companies need to be able to separate the clusters of data and identify the driving factors for marketing and sales purposes. Hence, a limited number of segmentations that is directly related to the core business is necessary for strategic and tactical decision making. The limited number of segmentations (about 6) is what I refer to as aggregated segmentations. Examples of aggregated segmentations are best customers, next best, infrequent buyers, and power buyers. The process of arriving at a consensus of aggregated segmentations is a combination of business knowledge and experience with statistical data analysis of granular segmentations. Aggregated segmentations allow businesses to efficiently analyze large datasets and make decisions that will impact profit, revenues, and costs.

Granular segmentation refers to the separation of clusters of data using advanced analytics techniques like hierarchical partitioning, k-means clustering, distribution analysis, and correlation analysis. Therefore, the main difference between granular and aggregated segmentations is that the former is determined using proven mathematical techniques, while the latter employs business knowledge together with advanced analytics techniques. A frequently asked question is: why do we need granular segmentations before arriving at aggregated segmentations? The answer is that analytics requires following the scientific method (observation, theory, experiment, and outcome). The scientific method allows for objectivity, reduces bias in the interpretation of results, and brings measurable precision to the process. Moreover, the accuracy of a prediction is based on granular segmentations rather than aggregated segmentations.
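A minimal one-dimensional k-means sketch shows what finding granular segments with proven mathematical techniques means in practice. Real projects cluster on many variables with dedicated tools; the single variable here (annual purchase frequency), the data, and the starting centroids are all illustrative.

```python
def kmeans_1d(values, centroids, iterations=10):
    """Tiny 1-D k-means: alternate assignment and centroid update."""
    for _ in range(iterations):
        # Assign each value to its nearest centroid
        clusters = {c: [] for c in centroids}
        for v in values:
            nearest = min(centroids, key=lambda c: abs(v - c))
            clusters[nearest].append(v)
        # Recompute each centroid as the mean of its cluster
        centroids = [sum(vs) / len(vs) for vs in clusters.values() if vs]
    return sorted(centroids)

# Made-up purchases-per-year for ten customers
freq = [1, 2, 2, 3, 11, 12, 13, 48, 50, 52]
centers = kmeans_1d(freq, centroids=[1, 10, 50])
# Three granular segments emerge: infrequent, regular, and power buyers
```

The three centroids that emerge correspond to mathematically homogeneous granular segments; business judgment would then combine or label them as aggregated segments such as "power buyers".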

Similar Characteristics vs. Equal Characteristics
A challenge encountered in attempting to explain segmentation to businesses is the difference between similar vs. equal characteristics in a segment. The members of a granular segment might have equal characteristics and, hence, be homogeneous. The members of an aggregated segment will have similar but not equal characteristics. The granular segment characteristics are determined by using advanced analytics techniques; therefore, the homogeneity of the segments is determined with mathematical precision.

On the other hand, aggregated segments are determined by combining homogeneous granular segments with business value and experience. Since multiple distinct granular segments are combined in aggregated segmentations, those segments should have similar but not equal characteristics. For example, the best customer segment will have customers with similar characteristics such as frequency of purchase, but not all the frequencies will be the same (i.e., once a week, twice a week, or daily).

Understanding the differences within an aggregated segment allows decision makers to be precise in their strategic decisions while simultaneously considering a manageable set of segments.

Predictive Analytics
Availability and Data Quality
Predictive analytics refers to the ability to accurately predict an event or occurrence, for example, that a customer will purchase a product or service at a set price or within a certain price range. This area is so broad that I am only going to address two issues: 1) data availability and quality; and 2) accuracy of prediction. A baby must first learn to crawl before it can run. Although this concept is fairly obvious, sometimes its application in marketing analytics is not well understood. In order to make a prediction, data must be available and of acceptable quality. When a new product is launched into the market, predictions are difficult because of lack of data. Sometimes the initial analytics outcome is limited to comparing similar characteristics of a new product with an existing product. The next step is to make an inference (a weighted value) that the new product may perform similarly to the existing product. As the new product gains traction in the market and data becomes available, the accuracy of any prediction will substantially improve. The availability of this new data will prove, disprove, or modify the inference that was initially made. Availability of data also means that the data is accessible in the correct format for analysis.
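The inference step described above can be sketched as a similarity-weighted estimate that is gradually replaced by observed data. All numbers, including the similarity weight and the blending rule, are hypothetical.

```python
# Weekly sales of a comparable existing product, and a judged similarity
# (overlap in price point, category, and audience) -- both made up
existing_weekly_sales = 1000
similarity = 0.8

# Initial, low-confidence estimate before any real sales data exists
estimated_sales = similarity * existing_weekly_sales

# As actual weekly figures arrive, blend them in and phase the inference out
observed = [650, 700, 720]
blend = len(observed) / (len(observed) + 3)   # weight on real data grows
revised = (1 - blend) * estimated_sales + blend * (sum(observed) / len(observed))
```

With three weeks of data the estimate is already half inference and half observation; after a few more weeks the observed average dominates, which is exactly the "prove, disprove, or modify" behavior described above.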

Data quality refers to the percentage of individual variables that have correct information, as well as how aggregate data quality issues impact the accuracy of any prediction. The old computer science axiom “junk in, junk out” holds true in marketing analytics. Therefore, it is crucial that a thorough ETL (extraction, transformation, and load) process, including a data quality hub, be in place prior to attempting any enterprise predictive analytics. In other words, this is the seam where best practices in business intelligence (BI) and advanced analytics meet. It is important to remember that the accuracy of any prediction is directly correlated with the quality of your data. Therefore, executives should address data quality issues at the beginning of any analytics project.
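A data-quality check can start as simply as measuring, per field, the share of records that carry a usable value before any modeling begins. The records and the 90% threshold below are hypothetical.

```python
# A few synthetic customer records with some missing values
records = [
    {"customer_id": "C1", "last_purchase": "2010-11-02", "revenue": 120.0},
    {"customer_id": "C2", "last_purchase": None, "revenue": 85.5},
    {"customer_id": "C3", "last_purchase": "2010-12-01", "revenue": None},
    {"customer_id": "C4", "last_purchase": "2010-12-15", "revenue": 40.0},
]

def completeness(rows, field):
    """Fraction of records with a non-missing value for the field."""
    present = sum(1 for row in rows if row.get(field) is not None)
    return present / len(rows)

fields = ("customer_id", "last_purchase", "revenue")
quality = {f: completeness(records, f) for f in fields}

# Fields below a chosen completeness threshold get routed to the ETL /
# data quality hub before modeling starts
weak_fields = [f for f, q in quality.items() if q < 0.9]
```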

Accuracy of Prediction
There are two issues that I would like to address regarding accuracy of prediction: variables and analytics tools. In the IT and BI world we speak of fields or data elements. In the analytics world, we talk about variables. A dataset may have hundreds of data elements, but analytics uses a limited number of relevant and pertinent variables. In order to understand whether their company can successfully implement a predictive analytics project, business decision makers must be able to distinguish the fundamental differences between the IT and Analytics languages. I like to think about this as the difference between learning how to say “food” in English and in Chinese—both words are necessary if you want to eat in each country.

One of the most common mistakes in predictive analytics is thinking that if we input data elements into a predictive analytics tool such as SAS, SPSS, or KXEN, we are going to obtain accurate predictions. This reflects a lack of understanding of the internal workings of regression algorithms. Regression works on a set of independent variables and a dependent variable, and it reads independent variables as separate from one another. If a ratio between two independent variables is pertinent and relevant to a prediction, that ratio must be created as an independent variable of its own. For example, if the variables are “date of first purchase” and “date of last purchase,” and you think that the relevant variable is “days between purchases,” then you need to create this variable. Otherwise, the regression algorithm reads those variables as independent from one another. Experience in variable creation is one of the areas business decision makers should examine when evaluating a predictive analytics project. The accuracy of a prediction is directly related to the variables used in the analytics model.
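The "days between purchases" example above can be sketched directly. The field names and dates are illustrative; the point is only that the derived variable must be materialized explicitly before it can enter a regression.

```python
from datetime import date

# Two made-up customers with raw date fields that a regression would
# otherwise treat as unrelated independent variables
customers = [
    {"first_purchase": date(2010, 1, 5), "last_purchase": date(2010, 12, 5), "purchases": 12},
    {"first_purchase": date(2010, 6, 1), "last_purchase": date(2010, 6, 15), "purchases": 2},
]

for c in customers:
    span_days = (c["last_purchase"] - c["first_purchase"]).days
    # Explicitly create the derived independent variable:
    # average days between consecutive purchases
    c["avg_days_between_purchases"] = span_days / (c["purchases"] - 1)
```

Once materialized as its own column, the new variable can be fed to SAS, SPSS, or any regression tool alongside the originals, and the model can finally "see" the relationship the analyst had in mind.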

Analytics Tools
I have found that companies want to talk analytics tool evaluation right away when considering a predictive analytics project. This tendency is driven by IT experience with estimating cost of software, hardware, and staffing with qualified resources. Although analytics tool selection is an integral part of any predictive analytics project, it is neither the most important consideration, nor should it be the driving motivation for an analytics project. For example, I can go to a hardware store and buy the best carpenter tools, but if I do not understand their proper use, my success rate in building anything will be significantly reduced. A master carpenter with an old hammer and a hand saw will build a house faster and better than a novice with the best power tools.

One of the most important practices that decision makers tend to overlook is creating a separate environment for advanced analytics. I have found that on occasion executives do not understand that predictive analytics consumes a large amount of internal memory and tends to degrade the performance of current operational systems. The solution is fairly simple: build your analytics engine in a separate environment.

A recent survey found that three out of four executives understand that predictive analytics is essential to the operations of their business. Decision makers can and should use proven advanced analytics techniques to improve profitability. If executives learn the fundamentals of business analytics, including its possibilities and limitations, they will be able to make better-informed decisions about investing in these new technologies.

Wednesday, May 05, 2010

Implementing Business Analytics

By Alberto Roldan - Copyright 2010

Companies are eager to implement business analytics to help them reduce costs and increase revenues. These companies face uncharted territory in the area of analytics and need assistance in solving implementation issues. Among those issues are mapping business objectives to analytics, resources, budget, data understanding, and planning. The objective of this article is to assist companies in dealing with those issues.
For the purposes of this article, we will equate analytics with the ability to predict the probability of an occurrence or event in the future. Companies are using analytics in many ways to predict the probability of:
1. Transactions that have the potential to be fraudulent, in market surveillance of institutional trading and in health care claims processing.
2. Clients that should be targeted to buy different software products that are bundled together.
3. A CPG competitor’s product gaining traction in the marketplace, through the use of social media data combined with internal data.
4. The patients that are more susceptible to multiple chronic diseases.
5. The failure of machine parts within a specific product at the customer site and during the manufacturing process.
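Each of these use cases reduces to scoring the probability of an event. As an illustrative sketch, a logistic score maps a weighted sum of features to a probability between 0 and 1; the features, weights, and bias below are purely hypothetical (a real model would fit them from historical data rather than hand-set them):

```python
import math

def event_probability(features, weights, bias):
    """Logistic score: maps a weighted sum of features to a 0-1 probability."""
    z = bias + sum(w * x for w, x in zip(weights, features))
    return 1.0 / (1.0 + math.exp(-z))

# Hypothetical fraud screen. The weights would come from a fitted model;
# they are hand-set here only to make the sketch self-contained.
weights = [2.0, 1.5]   # e.g., claim-amount z-score, provider anomaly score
bias = -3.0

p = event_probability([1.2, 0.8], weights, bias)   # score one suspicious claim
```

Transactions whose score exceeds a chosen threshold would then be routed for review.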

Organization Structure
Companies want to know the number of resources needed for a successful analytics project. Although the number varies from project to project, the rule of thumb is that the core team consists of three people: a statistical modeler, an analyst, and a developer. A fully functional team also includes a project leader or manager, business analysts, and evaluators/testers.
One of the main issues is how to find and budget for the proper resources. Individuals with statistical backgrounds are hard to find in the marketplace, as are developers experienced in specialized analytics software. A common mistake is to equate a developer with a statistical modeler. These skill sets are different, and understanding this difference is one of the keys to successfully implementing business analytics.
Outsourcing the analytics project is a good and cost-effective solution if the outsourced company has industry-specific knowledge and statistical modeling experience. There are different business models that allow a company to be successful using this approach:
1. Start with a proof of concept (POC) and move to larger projects.
2. Ramp up with the outsourced company, then bring the analytics function fully in-house.
3. Fully outsource for the long term using a revenue-sharing model.
Other companies prefer a staff augmentation model. Personally, I have only seen this model work for companies that already have a well-established organizational structure for their analytics.
Budgeting for these resources and the analytics software is another challenge. It is a traditional market-forces issue: high demand for these resources but low supply in the marketplace. The result is that you pay a premium for them. In order to justify these resources, you need to show a return on investment (ROI). It is difficult to show an ROI in an area in which you have no historical experience. The best solution is to start with a small, well-defined project (including the budget) and use the results to extrapolate an ROI. Outsourcing the POC is a potential solution to this issue.
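The extrapolation itself can be a back-of-the-envelope calculation. All figures below are hypothetical and serve only to show the shape of the arithmetic:

```python
# ROI extrapolation from a proof of concept (all figures hypothetical).
poc_cost = 50_000.0        # cost of the outsourced POC
poc_savings = 120_000.0    # savings identified on the POC sample
poc_coverage = 0.05        # POC covered 5% of transactions

# Naive full-scale estimate: assume the POC sample is representative.
projected_savings = poc_savings / poc_coverage

full_project_cost = 400_000.0
roi = (projected_savings - full_project_cost) / full_project_cost
```

The "sample is representative" assumption is exactly what the POC is meant to test, so the projection should be revisited as real results come in.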

Mapping Business Objectives
One of the keystones of successfully implementing business analytics is to map the business objectives specifically to the question that you want to solve. Although on its face this seems like a fairly simple issue, it is a recurring flaw that I see. Companies want to know how they can use predictive analytics instead of first defining the issues for which they are seeking a solution. The first step in implementing analytics within a company is to have a clear business understanding of the issue. Analytics is based on a combination of mathematics and business knowledge, and that knowledge is applied precisely to solve a specific question.
The leader of the analytics team must ensure that the best practices are followed, including mapping out business objectives to the specific questions that analytics answers. The manager of the analytics team is responsible for identifying the metric for success used to evaluate the project. An analytics project should always be measured in business terms. How much potential revenue or cost savings does it identify?

Data Understanding
Two of the main issues are data quality and data availability. The results of an analytics project are directly related to these two issues. The best practice is to address data-quality issues before undertaking an analytics project. The data does not need to be perfect for the project to be successful, but leaders and managers must understand and agree upon the limitations facing the project. There are also statistical techniques that can refine the results and minimize data-quality issues, such as segmenting the results into “expected” results and those with a high probability of data-quality issues (“unexpected” results) for evaluation purposes.
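A minimal sketch of the “expected”/“unexpected” segmentation, assuming a simple field-completeness ratio as the data-quality measure; the record layout and the 0.8 threshold are illustrative choices, not fixed rules:

```python
# Scored records with a count of key fields actually populated (hypothetical).
records = [
    {"id": 1, "score": 0.91, "fields_present": 10, "fields_total": 10},
    {"id": 2, "score": 0.88, "fields_present": 6,  "fields_total": 10},
    {"id": 3, "score": 0.40, "fields_present": 9,  "fields_total": 10},
]

def completeness(r):
    """Fraction of key fields populated -- a crude data-quality score."""
    return r["fields_present"] / r["fields_total"]

# "Expected" results rest on reasonably complete data; "unexpected" results
# have a high probability of data-quality issues and get reviewed first.
expected   = [r for r in records if completeness(r) >= 0.8]
unexpected = [r for r in records if completeness(r) < 0.8]
```

Evaluating the two segments separately keeps data-quality noise from discrediting the model as a whole.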
The issue of data availability is more complex: in some cases we can extrapolate results even without the data being available, while in others this is impossible without additional data. A CPG company could extrapolate the results of an ethno-demographic trade promotion analytics project into stores for which it does not have data but that fall within the same classification as similar stores where data is available. The key to validating the extrapolation is whether the segmentation or classification is valid. For example, a small store (by sales volume) in Southern California may not be comparable with a small store in Miami: although both cater to Latino customers, the characteristics of the two populations may differ. On the other hand, a segmentation of small stores within a specific geographical area of Southern California could be used to extrapolate results to stores within that area that do not have data available.
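The within-segment extrapolation described above can be sketched as follows; the stores, segment labels, and promotion-lift figures are all hypothetical:

```python
from collections import defaultdict

# Stores with known promotion lift, keyed by (region, size) segment.
known = [
    ("SoCal", "small", 0.12),
    ("SoCal", "small", 0.10),
    ("Miami", "small", 0.05),
]
# Stores with no data, to receive an extrapolated estimate.
missing = [("SoCal", "small"), ("Miami", "small")]

# Average lift per segment. Extrapolate only within a segment -- never
# across regions, even when the store sizes match.
by_segment = defaultdict(list)
for region, size, lift in known:
    by_segment[(region, size)].append(lift)
estimates = {seg: sum(v) / len(v) for seg, v in by_segment.items()}

predicted = {store: estimates[store] for store in set(missing)}
```

Whether the segment averages are trustworthy is exactly the validation question raised above: the segmentation must group genuinely comparable stores.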

Implementation Planning
Implementation is essential in analytics because that is the actual transformation of data into actionable information. Planning how you intend to use analytics, and for what audience, is essential to maximizing your ROI. An understanding that analytics has different meanings within a company is fundamental. Executive management may want to know whether tactics are properly aligned with strategic goals. Line management would like to know how to best accomplish their specific monthly objectives. The employee in the field wants to know how to accomplish today’s target.
The visualization of analytics results through dashboards provides the means for decision makers within a company to quickly grasp the meaning of the information. Companies should think about how they want the information layer to be presented before embarking on an analytics project. The information layer should be directly correlated to the business objectives, and it should be flexible enough to accommodate new requirements.
Companies should consider utilizing advances in visualization techniques when planning a dashboard, including the ability to see the condition of a company through business metrics in a 3-D manner. A 3-D graph that incorporates dollar value, statistical control process comparison, and predictive analytics is a powerful tool for using analytics in both strategic and operational areas. The use of 3-D graphs can increase perception and the ability to detect patterns by over 40 percent. Companies should ask, “Is there value in increasing our ability to detect patterns in the data by 40 percent?”
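As a sketch of what would feed such a graph, the three axes for one business unit might be computed as below; the revenue series and the naive moving-average forecast are illustrative placeholders for a real metric and a real predictive model:

```python
from statistics import mean, pstdev

# Monthly revenue for one business unit (hypothetical figures).
revenue = [100.0, 104.0, 98.0, 101.0, 97.0, 130.0]

# Axis 1: dollar value -- the raw business metric for the current month.
current = revenue[-1]

# Axis 2: statistical control signal -- distance of the current month
# from the historical mean, in standard deviations.
mu = mean(revenue[:-1])
sigma = pstdev(revenue[:-1])
control_z = (current - mu) / sigma

# Axis 3: predicted value -- here a naive 3-month moving average stands in
# for a real forecasting model.
forecast = mean(revenue[-3:])

point = (current, control_z, forecast)   # one point in the 3-D graph
```

Plotting many such points (one per business unit or region) is what lets a viewer spot out-of-control units and forecast gaps at a glance.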

As companies embrace analytics and integrate analytics projects into their operations, they should plan for issues that can derail their goals. Just a few years ago, it was common to hear that 50 percent of all IT projects were never completed. One of the lessons learned during that time was the importance of limiting scope, budgeting for resources, and project planning. A modified version of those lessons should be applied when planning analytics projects. Think big, but start with small, measurable projects. Avail yourself of best practices, identify potential issues early, and use experienced resources to ensure the success of your project.

About the Author - Alberto Roldan is responsible for the Enterprise Analytics Practice - North America at Cognizant Technologies. He is a published author with over 18 years of experience in business analytics. He has a BA from the University of Michigan and a JD from the University of Puerto Rico Law School. He can be reached at
