Category Archives: Predictive Analytics

Get More Value From Operational Assets with Predictive Analytics

Sharpening operational focus and squeezing more efficiency out of production assets—these are just two of the objectives that have COOs and operations managers turning to new technologies. One of the most promising is predictive analytics. Predictive analytics isn’t new, but a growing number of companies are applying it to predictive maintenance, quality control, demand forecasting, and other manufacturing functions to deliver efficiencies and make improvements in real time. So what is it?

Predictive analytics is a blend of mathematics and technology that learns from experience (the data companies are already collecting) to predict a future behavior or outcome within an acceptable level of reliability.

Predictive analytics can play a substantial role in redefining your operations. Today, let’s explore three additional cases of predictive analytics in action:

  • Predictive maintenance
  • Smart grids
  • Manufacturing

Predictive Maintenance

Predictive maintenance assesses equipment condition on a continuous basis and determines if and when maintenance should be performed. Instead of relying on routine or time-based scheduling, like having your oil changed every 3,000 miles, it promises to save money by calling for maintenance only when it is actually needed, or to head off imminent equipment failure.

While equipment is in use, sensors measure vibration, temperature, high-frequency sound, air pressure, and more. Predictive models make sense of this streaming data and score it on the likelihood of a failure occurring. Coupled with in-memory technologies, predictive maintenance can detect a machine failure hours before it occurs and avoid unplanned downtime by scheduling maintenance sooner than planned.
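As a rough sketch of that scoring step, the fragment below applies a logistic model to a single sensor snapshot. The weights, healthy baselines, and alert threshold are illustrative placeholders, not values from any real deployment; in practice they would be learned from the plant’s own failure history.

```python
import math

def failure_score(vibration_mm_s, temperature_c, acoustic_db):
    """Score one sensor snapshot on the likelihood of imminent failure.

    All coefficients below are hypothetical; a trained model supplies
    them in a real predictive-maintenance system.
    """
    z = (
        0.9 * (vibration_mm_s - 2.0)     # healthy vibration ~2 mm/s RMS
        + 0.05 * (temperature_c - 60.0)  # healthy bearing temp ~60 C
        + 0.08 * (acoustic_db - 70.0)    # healthy acoustic level ~70 dB
        - 3.0                            # intercept: failures are rare
    )
    return 1.0 / (1.0 + math.exp(-z))    # logistic link -> probability

def needs_maintenance(score, threshold=0.5):
    """Flag the asset for early maintenance when risk crosses a threshold."""
    return score >= threshold

healthy = failure_score(2.1, 61.0, 71.0)    # readings near baseline
degraded = failure_score(7.5, 85.0, 90.0)   # strong vibration + heat
```

Each incoming reading is scored the same way, so streaming data can be evaluated continuously as the paragraph describes.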

This all means less downtime, decreased time to resolution, and optimal longevity and performance for equipment operators. For manufacturers, predictive maintenance can streamline inventory of spare parts and the ongoing monitoring services can become a source of new revenue. And as predictive maintenance becomes part of the equipment, it also has the potential to become a competitive advantage.

Smart Grids

Sensors and predictive analytics are also changing the way utilities manage highly distributed assets like electrical grids. From reliance on unconventional energy sources like solar and wind to the introduction of electric cars, the energy landscape is evolving. One of the biggest challenges facing energy companies today is keeping up with these rapid changes.

Smart grids emerge when sensor data is combined with other data sources such as temperature, humidity, and consumption forecasts at the meter level to predict demand and load. For example, combined with powerful in-memory technologies, predictive analytics can be used by electricity providers to improve load forecasting. That leads to frequent, less expensive adjustments that optimize the grid and maintain delivery of consistent and dependable power.

As more houses are equipped with smart meters, data scientists using predictive analytics can build advanced models and apply forecasting to groups of customers with similar load profiles. They can also present those customers with some ideas to reduce their energy bill.
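A minimal sketch of that kind of load forecast, assuming a weighted moving average over recent meter readings with a simple weather adjustment on top. The temperature coefficient is a made-up illustration; a real model would learn it per customer segment.

```python
def forecast_load(history, temperature_c, temp_coeff=1.5):
    """Naive next-period load forecast for a meter group (illustrative).

    history: recent load readings in kW, oldest first.
    temp_coeff: assumed extra kW of demand per degree above 20 C
    (e.g. air conditioning); hypothetical, not a real parameter.
    """
    # Weighted moving average: more recent readings count more
    weights = range(1, len(history) + 1)
    base = sum(w * x for w, x in zip(weights, history)) / sum(weights)
    # Simple weather adjustment on top of the consumption baseline
    return base + temp_coeff * max(0.0, temperature_c - 20.0)

baseline = forecast_load([100.0, 110.0, 120.0], 20.0)  # mild day
hot_day = forecast_load([100.0, 110.0, 120.0], 30.0)   # +10 C demand bump
```

Applied per cluster of similar load profiles, this is the kind of rolling forecast that lets providers make frequent, inexpensive grid adjustments.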


Manufacturing

The manufacturing industry continues its relentless drive for customization and “Lot sizes of 1” with innovations such as the connected factory, the Internet of Things, next-shoring, and 3D printing. It is also hard at work extracting maximum productivity from existing facilities, which traditionally has been accomplished through automation and IT resources. According to Aberdeen, the need to reduce the cost of manufacturing operations is now the top reason companies seek more insight from data.

Quality control has always been an area where statistical methods have played a key role in whether to accept or reject a lot. Now manufacturers are expanding predictive analytics to the testing phase as well. For example, tests on components like high-end car engines can be stopped long before the end of the actual procedure thanks to predictive analytics. By analyzing a component’s ongoing test data against the data from other engines, engineers can identify potential issues faster. That, in turn, maximizes the capacity available for testing and reduces unproductive time. It is only one of the many applications manufacturers are finding for predictive analytics.
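One hedged way to picture that early-stopping logic: compare the readings gathered so far against the distribution from previously passed engines, and stop the rig run once the evidence is clear either way. The rule below is a simplified illustration, not an actual test-bench procedure; the thresholds are arbitrary.

```python
from statistics import mean, stdev

def early_stop(readings, reference, z_limit=3.0, min_samples=5):
    """Decide whether an engine test can stop before its full duration.

    readings: measurements collected so far from the unit under test.
    reference: the same measurement from previously passed engines.
    Returns "fail-early", "continue", or "pass-early" (labels are
    illustrative; a production rule would be statistically validated).
    """
    if len(readings) < min_samples:
        return "continue"          # not enough evidence yet
    mu, sigma = mean(reference), stdev(reference)
    z = (mean(readings) - mu) / sigma
    if abs(z) > z_limit:
        return "fail-early"        # clearly off-nominal: abort the run
    if abs(z) < 0.5 and len(readings) >= 2 * min_samples:
        return "pass-early"        # tracking the healthy fleet closely
    return "continue"

reference = [100, 101, 99, 100, 102, 98, 100, 101, 99, 100]
```

Freeing the rig early in either direction is what recovers the test capacity mentioned above.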

Innovations on the Shop Floor

Predictive analytics provides an excellent opportunity for COOs and operations managers to extract additional value from production assets. It can also be an opportunity to create critical differentiators in the way products are created and delivered to customers—by providing it as a paid service (predictive maintenance) or as insight (predicting future electricity consumption).

However a company chooses to use it, predictive analytics can be the key to beating the competition.

Discover and Follow

And join the predictive conversation by following me on Twitter @pileroux.


Dresner’s Advanced and Predictive Analytics Study Ranks SAP #1 for Second Time in a Row

By  Chandran Saravana,  Senior Director Predictive Analytics Product Marketing

For the second year in a row, SAP has received the number one ranking in the Wisdom of Crowds 2016 Advanced and Predictive Analytics Market Study by Dresner Advisory Services. The Dresner study reached over 3,000 organizations and vendors’ customer communities across more than 20 industry verticals, with organization sizes ranging from 100 to 10,000+.

Study findings include:

  • Organizations view advanced/predictive analytics as building on existing business intelligence efforts.
  • Over 90% of respondents agree on the importance and value of advanced and predictive analytics.
  • Statisticians/data scientists, business intelligence experts, and business analysts are the greatest adopters of advanced and predictive analytics.
  • Regression models, clustering, textbook statistical functions, and geospatial analysis are the most important analytic user features/functions.
  • Usability features addressing sophisticated advanced/predictive analytic users are almost uniformly important today and over time, led by easy iteration, advanced analytic support, and model iteration.
  • In-memory analytics and in-database analytics are the most important scalability requirements to respondents, followed distantly by Hadoop and MPP architecture.

I find it interesting that the Dresner study finds “hybrid roles are also evident,” which confirms how SAP’s customer organizations use predictive analytics. The study looked at core advanced and predictive features, data preparation, usability, scalability, and integration as the key criteria for ranking vendors. Though the usability criteria covered many things, I would like to highlight one in particular: “Support for easy iteration,” which ranked as most important.

In the scalability criteria, “In-memory analytics” ranked as most important, followed by “In-database analytics” and “In-Hadoop analytics (on file system).”

Read the Complete Dresner Report

You can find lots more in the 92-page Dresner Wisdom of Crowds report. I invite you to take a look.

In the New Digital Economy, Everything Can Be Digitized and Tracked: Now What?

Welcome to a world where digital reigns supreme. Remember when the Internet was more of a ‘push’ network? Today, it underpins how most people and businesses conduct transactions, providing peer-to-peer connections where every single interaction can be tracked.

Enterprises are still not taking full advantage. With hundreds of millions of people connected, it’s possible for them to connect their suppliers with their customers and their payment systems, and reach the holy grail of seamlessly engaging in commerce, where a transaction can be tracked from purchase, to order received, to manufacturing, through to shipment—all in real time. It’s clear that end-to-end digitization delivers enormous potential, but it has yet to be fully tapped by most companies.

In the latest #askSAP Analytics Innovations Community Webcast, Reimagine Predictive Analytics for the Digital Enterprise, attendees were given an introduction to SAP BusinessObjects Predictive Analytics, along with some key use cases. The presentation covered native in-memory predictive analytics, deploying predictive analytics on Big Data, and how to bring predictive insight to Business Intelligence (BI).

The live, interactive call was moderated by SAP Mentor Greg Myers and featured expert speakers Ashish Morzaria, Global GTM Director, Advanced Analytics, and Richard Mooney, Lead Product Manager for Advanced Analytics.

The speakers noted that companies used to become leaders in their industries by establishing an unbeatable brand or by having a supply chain more efficient than anyone else’s. While this is still relevant in the digital economy, companies now have to think about how they can turn this new digital economy to their advantage. One of the keys is turning its main driver, the data, to their advantage.

Companies embracing digital transformation are outperforming those who aren’t. With predictive analytics, these companies can use historical data to predict behaviors or outcomes, answer “what-if” questions, and ensure employees have what they need to make optimized decisions. They can fully leverage customer relationships with better insight, and make meaningful sense of Big Data.

One big question delved into during the call: How can companies personalize each interaction across all channels and turn each one into an advantage? The answer: By getting a complete digital picture of their customers and applying predictive analytics to sharpen their marketing focus, optimize their spend, redefine key marketing activities, and offer product recommendations tailored to customers across different channels.

Real-World Customer Stories

The call also focused on some real-world examples of customers achieving value by using and embedding predictive analytics in their decisions and operations, including Cox Cable, Monext, M-Bank, and Mobilink.

These companies have been able to improve performance across thousands of processes and decisions, and also create new products, services, and business models. They’ve squeezed more efficiencies and margins from their production assets, processes, networks, and people.

One key takeaway is the importance of using algorithms, as they provide insights that can make a business process more profitable or competitive, and spotlight new ways of doing business and new opportunities for growth.

The speakers also presented a very detailed customer case study on Harris Logic. The company is using SAP BusinessObjects Predictive Analytics for automated analytics and rapid prototyping of its models. It deploys models in SAP HANA for real-time predictions using a native logistic regression model. This approach allows the identification of the key predictors that most heavily influence a behavioral health outcome.

Learn More

Lots of food for thought. See what questions people were asking during the webcast and get all of the answers here. Check out the complete presentation, and continue to post your questions and watch for dates for our upcoming webcast in the series via Twitter using #askSAP.

What You Need to Know About Supply Chain Risk

#3 in a series by Matthew Liotine, Ph.D., Strategic Advisor, Business Intelligence and Operations, and Professor, University of Illinois

In our previous articles, we discussed how disruptions to a supply chain can originate from a multitude of sources. Current trends make it apparent that there is a continued rise in measured losses from disruptions such as natural events and business volatility. Traditionally, supply chains are designed for lean operational efficiency wherever possible, yet such efficiency requires minimizing excess capacity, inventory, and redundancy – the very things needed to create resiliency against disruptive risks. Risk assessment tools and methodologies help decision-makers identify the most cost-effective controls, those that strike the right balance between cost and risk reduction to protect against disruption. Typically, the most cost-effective controls are those that minimize the common effects arising from multiple disruptive threats. To understand which controls could be effective, one must recognize the risk outcomes of common supply chain vulnerabilities, which is the focus of this article.

What is Risk?

Before continuing, it is worth revisiting some of the terminology from our previous discussions in order to understand how risk is derived. Fundamentally, risk is the chance (or probability) of a loss or unwanted negative consequence. For decision purposes, it is often calculated numerically as a function of probability and impact (sometimes called single loss expectancy) and expressed quantitatively as an “expected” loss in monetary value or some other unit. A common flaw with risk values is that they mask the effects of impact versus probability. For example, an expected loss of $100 does not reveal whether high impact is overwhelming low probability or high probability is overwhelming low impact: the value could come from an event that occurs 10% of the time and causes $1,000 in damages, or from an event that occurs 20% of the time and causes $500 in damages. For this reason, risk values must be used in conjunction with probability and damage values, along with other metrics, so the decision maker can compare one risk against another. Risk values are not precise and should not be used as standardized values for business management. Nevertheless, they can give decision makers a means to distinguish risks and control options on a relative basis. Figure 1 illustrates the fundamental parameters used to construct risk values and how they relate to each other.
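The masking effect is easy to verify in code; both hypothetical scenarios below collapse to the same expected loss even though their risk profiles are very different:

```python
def expected_loss(probability, impact):
    """Single loss expectancy: expected monetary loss for one event type."""
    return probability * impact

# Two very different risk profiles collapse to the same expected loss,
# which is why probability and impact must be reported alongside it.
rare_but_severe = expected_loss(0.10, 1000)    # 10% chance of $1,000 damage
frequent_but_mild = expected_loss(0.20, 500)   # 20% chance of $500 damage
```

Reporting the (probability, impact) pair next to the expected-loss number preserves the distinction the paragraph warns about.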


Figure 1 – Fundamental Components of Risk

Hazards, conditions and triggers are situations that increase or cause the likelihood of an adverse event (sometimes referred to as a peril). In our last article, we examined numerous sources of hazards that can threaten a supply chain. Vulnerabilities are factors that can make a system, in our case a supply chain, susceptible to hazards.  They are usually weaknesses that can be compromised by a hazardous condition, resulting in a threat. The likelihood, or probability, of a threat circumstance occurring must be considered, for reasons discussed above. If it occurs, failures can take place, whose effects are quantified as impacts. When impacts are weighed against the likelihood of the threat, the result is a risk that poses an expected loss. Controls are countermeasures that a firm can use to offset expected losses.

With respect to a supply chain, there are many ways to classify risk. Academics have made many attempts to try to classify risks according to some kind of ontology or framework (Harland, Brenchley and Walker 2003) (Gupta, Kumar Sahu and Khandelwal 2014) (Tummala and Schoenherr 2011) (Peck 2005) (Monroe, Teets and Martin 2012) (Chopra and Sodhi 2004). Some of the more common supply chain risk classifications include:

Recurring risks – These risks arise within the operational environment from the inability to match supply and demand on a routine basis. The ensuing effects are lower service levels and fill rates.

Disruptive risks – These risks result from a loss of supply or supplier capacity, typically driven by some disruptive event.

Endogenous risks – These risks arise within the operational environment and are process driven (e.g., poor quality control, design flaws), usually within the direct influence of the firm. They typically require preventive mechanisms for control.

Exogenous risks – These risks originate externally, from either the supply side or the demand side, and may not be under a firm’s direct influence. They typically call for responsive mechanisms for control.

While many classification attempts have been noble in nature, it is ultimately difficult to classify risks according to a single scheme, for several reasons. First, the lines of demarcation between risk categories can blur, and categories can overlap; from the list above, one can easily argue about the differences between endogenous and recurring risks. Second, every firm is different, so one framework may not fit all. Finally, risk methodologies differ somewhat across industries, as evidenced by differing industry best practices and standards for risk analysis.

Supply chains can exhibit many kinds of vulnerabilities, but quite often these can be viewed as either structural or procedural in nature. Structural vulnerabilities stem from deficiencies in how the supply chain is organized, provisioned and engineered. Single points of failure can arise when there is insufficient diversity across suppliers, product sources or the geographical locations of sources. Inadequate provisioning can create shortages in inventory or capacity to meet customer demands. Procedural vulnerabilities stem from deficiencies in business or operational processes. Gaps and oversights in planning, production or transport processes could adversely affect a firm’s ability to respond to customer needs. Insufficient supply chain visibility could render a firm blind to oversights in supplier vetting and management practices, quality assurance and control, or demand planning.

Such vulnerabilities, combined with a hazardous condition of the kind described earlier, result in the supply chain failing in some fashion. Table 1 illustrates some of the more common modes of supply chain failure.

Table 1 – Common Supply Chain Failure Modes

  • Degraded fill rate
  • Degraded service level
  • High variability of consumption
  • Higher product cost
  • Inaccurate forecasts
  • Inaccurate order quantity
  • Information distortion
  • Insufficient order quantities
  • Longer lead times/delays
  • Loss of efficiency
  • Lower process yields
  • Operational disruption
  • Order fulfillment errors
  • Poor quality supplied
  • Supplier stock out


Ultimately, such supply chain failures result in increased costs, loss of revenue, loss of assets, or a combination thereof. Common risks are typically assessed as increases in ordering costs, product costs, or safety stock costs. Product stock-out losses can be assessed as backorder costs or as lost sales and business revenue. Different kinds of firms are prone to different types of risks. For example, a manufacturing firm with long supply chains will be more susceptible to ordering-variability (or bullwhip) effects, while a shorter retail supply chain will be more sensitive to fill rate and service level variability. Understanding and characterizing these risks is necessary in order to develop strategies to control or manage them. Quantifying risks gives the decision maker a gauge to assess risk before and after a control is applied, and thereby the prospective benefit of a potential control. Using quantified risk values, in combination with other parameters, enables a decision maker to prioritize potential control strategies according to their cost-effectiveness.
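A toy sketch of that prioritization step, assuming expected losses before and after each control, and the control costs, have already been quantified. All figures and control names below are hypothetical.

```python
def risk_reduction_per_dollar(risk_before, risk_after, control_cost):
    """Expected-loss reduction bought per dollar of control cost."""
    return (risk_before - risk_after) / control_cost

# Hypothetical controls: (expected loss before, after, annualized cost)
controls = {
    "dual sourcing":   (500_000, 200_000, 120_000),
    "safety stock":    (500_000, 350_000,  40_000),
    "supplier audits": (500_000, 420_000,  10_000),
}

# Rank controls by cost-effectiveness, most efficient first
ranked = sorted(
    controls,
    key=lambda c: risk_reduction_per_dollar(*controls[c]),
    reverse=True,
)
```

Note the cheapest control can rank first even though it removes the least absolute risk; that is exactly the before/after comparison the paragraph describes.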


Risk is the chance, or probability, of a loss or unwanted negative consequence. Inherent supply chain weaknesses such as sole sourcing, process gaps, or lack of geographical sourcing diversity can render a supply chain more vulnerable to a hazardous, unforeseen condition or trigger event, such as a strike or a major storm, resulting in undesirable increases in costs, asset loss, or revenue loss. Such risks can be quantified to some extent, often in monetary units, and can be used to facilitate cost-benefit analysis of potential control strategies. In our next article, we will take a look at some of the most favored strategies for controlling supply chain risk.


Chopra, S., and M. Sodhi. “Managing Risk to Avoid Supply-Chain Breakdown.” MIT Sloan Management Review, 2004: 53-61.

Gupta, G., V. Kumar Sahu, and A. K. Khandelwal. “Risks in Supply Chain Management and its Mitigation.” IOSR Journal of Engineering, 2014: 42-50.

Harland, C., R. Brenchley, and H. Walker. “Risk in Supply Networks.” Journal of Purchasing & Supply Management, 2003: 51-62.

Monroe, R. W., J. M. Teets, and P. R. Martin. “A Taxonomy for Categorizing Supply Chain Events: Strategies for Addressing Supply Chain Disruptions.” SEDSI 2012 Annual Meeting Conference Proceedings. Southeast Decision Sciences Institute, 2012.

Peck, H. “Drivers of Supply Chain Vulnerability.” International Journal of Physical Distribution & Logistics Management, 2005: 210-232.

Tummala, R., and T. Schoenherr. “Assessing and Managing Risks Using the Supply Chain Risk Management Process (SCRMP).” Supply Chain Management: An International Journal, 2011: 474-483.



10 Data Visualizations You Need to Know Now

No one likes reading through pages or slides of stats and research, least of all your clients. Data visualizations can help simplify this information, not only for them but for you too! These ten data visualizations will help you present a wide range of data in a visually impactful way.

1. Pie Charts and Bar Graphs—The Usual Suspects for Proportion and Trends

New to data visualization tools? Start with the traditional pie chart and bar graph. Though these may be simple visual representations, don’t underestimate their ability to present data. Pie charts are good for visualizing market share and product popularity, while bar graphs are often used to compare sales revenue across years or regions. Because they are familiar to most people, they don’t need much explanation—the visual data speaks for itself!

2. Bubble Chart—Displaying Three Variables in One Diagram

When you have data with three variables, pie charts and bar graphs (which can represent at most two variables) won’t cut it. Try bubble charts, which are generally a series of circles or “bubbles” on a simple X-Y axis graph. In this type of chart, the size of each circle represents the third variable, usually a quantity or magnitude.

For example, if you need to present data on the quantity of units sold, the revenue generated, and the cost of producing the units, use a bubble chart.  Bubble charts immediately capture the relationship between the three variables and, like line graphs, can help you identify outliers quickly. They’re also relatively easy to understand.

3. Radar Chart—Displaying Multiple Variables in One Diagram

For more than three variables in a data set, move on to the radar chart. The radar chart is a two-dimensional chart shaped like a polygon with three or more variables represented as axes that start from the same point.

Radar charts are useful for plotting customer satisfaction data and performance metrics. Primarily a presentation tool, they are best used for highlighting outliers and commonalities, as radar charts are able to simplify multivariate data sets.

4. Timelines—Condensing Historical Data

Timelines are useful for depicting chronological data. For example, you can use one to chart company milestones, like product launches, over the years.

Forget the black and white timelines in your history textbooks with few dates and events charted. With simple tools online, you can add color and even images to your timeline to accentuate particular milestones and other significant events. These additions not only make your timeline more visually appealing, but easier to process too!

5. Arc Diagrams—Plotting Relationships and Pairings

The arc diagram utilizes a straight line and a series of semicircles to plot the relationships between variables (represented by nodes on the straight line), and helps you to visualize patterns in a given data set.

Arc diagrams are commonly used to portray complex data; the number of semicircles in the diagram depends on the number of connections between the variables. They are often used to chart the relationships between products and their components, social media mentions, and brands and their marketing strategies. The diagram itself can become complex, so play around with line width and color to keep it clear.

6. Heat Map—For Distributions and Frequency in Data

First used to depict financial market information, the heat map has nothing to do with heat but does display data “intensity” and size through color. Usually utilizing a simple matrix, the 2D area is shaded with different colors representing different data values.

Heat maps are used to show not only financial information but also web page traffic, sales numbers, and company productivity. If you’ve honed your data viz skills well enough, you can even create a heat map that depicts real-time changes in sales, the financial market, and site engagement!
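Under the hood, a heat map is just a value-to-color mapping over a matrix. The sketch below buckets a small sales grid into three named shades; the bucket names are placeholders, and real charting libraries interpolate continuously over an actual color palette.

```python
def shade(value, vmin, vmax, palette=("low", "mid", "high")):
    """Map a data value to a color bucket, as a heat-map cell would.

    Three named buckets keep the idea visible; palette entries are
    stand-ins for real colors.
    """
    if vmax == vmin:
        return palette[0]
    t = (value - vmin) / (vmax - vmin)            # normalize to [0, 1]
    idx = min(int(t * len(palette)), len(palette) - 1)
    return palette[idx]

sales = [[12, 45], [78, 30]]                      # e.g. region x quarter
flat = [v for row in sales for v in row]
grid = [[shade(v, min(flat), max(flat)) for v in row] for row in sales]
```

The hottest cell jumps out immediately, which is the whole point of encoding intensity as color.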

7. Choropleth and Dot Distribution Maps—For Demographic and Spatial Distributions

Like heat maps, choropleths and dot distribution maps use color (or dots) to show differences in data distribution. However, they differ from heat maps because they’re specific to geographical boundaries. Choropleths and dot distribution maps are particularly useful for businesses that operate regionally or want to expand into more markets, as they can help present the sales, popularity, or potential need of a product to the client in compelling visual language.

8. Time Series—Presenting Measurements over Time Periods

This looks something like a line graph, except that the x-axis only charts time, whether in years, days, or even hours. A time series is useful for charting changes in sales and webpage traffic. Trends, overlaps, and fluctuations can be spotted easily with this visualization.

Because it is a precise graph, the time series is not only good for presentations (you’ll find many tools online to help you create colorful and even dynamic time series), it’s useful for your own records as well. Professionals in both business and scientific studies typically use time series to analyze complex data.

9. Word Clouds—Breaking Down Text and Conversations

It may look like a big jumble of words, but a quick explanation makes the word cloud a strong data visualization tool. Word clouds use text data to depict word frequency. In an analysis of social media mentions, instead of simply reporting that “exciting” was used x number of times while “boring” was used y number of times, the most frequently used word appears largest, while a word that hardly appears is set in the smallest font.

Word clouds are frequently used in breaking down qualitative data sets like conversations and surveys, especially for sales and branding firms.
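The frequency-to-size mapping behind a word cloud can be sketched in a few lines. The font-size range here is arbitrary, and stop-word removal and stemming are omitted for brevity; layout (where each word lands) is the part a real word-cloud library adds.

```python
from collections import Counter

def word_sizes(text, min_pt=10, max_pt=48):
    """Map each word's frequency to a font size, the core of a word cloud.

    The point range is an arbitrary illustration; real tools also handle
    stop words, stemming, and placement.
    """
    words = [w.strip(".,!?").lower() for w in text.split()]
    counts = Counter(w for w in words if w)
    top = counts.most_common(1)[0][1]          # highest frequency seen
    return {
        w: min_pt + (max_pt - min_pt) * (n - 1) / max(top - 1, 1)
        for w, n in counts.items()
    }

mentions = "exciting exciting exciting launch launch boring"
sizes = word_sizes(mentions)
```

The most frequent word gets the largest font and the rarest the smallest, exactly as described above.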

10. Infographics—Visualizing Facts, Instructions and General Information

Infographics are the most visually appealing visualization on this list, but also require the most effort and creativity. Infographics are a series of images and text or numbers that tell a story with the data. They simplify the instructions of complex processes, and make statistical information easily digestible. For marketers, infographics are a popular form of visual content and storytelling.

Get more information on building charts, graphs and visualization types.


The Nature of Supply Chain Risk

Contributed by Matthew Liotine, Ph.D.

In our last article, we looked at the magnitude of the supply chain risk problem and how it is a major concern for most companies, large or small. Studies have shown that most companies experience one or a few supply chain disruptions annually, each resulting in some significant loss. Many of these disruptions involve key suppliers or those below Tier 1. Nevertheless, many firms still lack commitment to controlling supply chain risk because of the costs and complexity involved. Consequently, many firms tend to favor short-term ROI solutions over longer-term solutions that involve investing capital to improve both their supply chain infrastructure and their operational resilience. Larger firms manage risk more strategically, using a combination of executive governance and data-driven approaches. While operational data is becoming increasingly available, much work is still needed to leverage such data for strategic risk management. The nature of risk in the supply chain lies in a firm’s exposure to potential disturbances to the supply chain operation. These disturbances can manifest in various ways, usually in the form of single, multiple, or recurring events, conditions, or phenomena. In this article, we will examine what kinds of hazards, events, or triggers can compromise supply chain weaknesses and ultimately threaten supply chain operations.

The Changing Nature of Threats

When one thinks about threats to a supply chain, natural disasters usually come to mind first. Figure 1 shows the trend in major U.S. disaster declarations as reported by the Federal Emergency Management Agency (FEMA, 2011). While there has clearly been a rising trend in declarations, the reasons may vary, from an increase in severe weather events due to climate change to political influences. Figure 2 shows the trend in worldwide natural catastrophes (Munich RE, 2014).


Figure 1 – Trend in U.S. Disaster Declarations


Figure 2 – Trend in Worldwide Natural Catastrophic Losses

As is evident in Figure 2, there is an ever-growing trend in measured losses. While natural catastrophes have occurred since the beginning of time, their effects have become more far-reaching over the years due to population growth and insurability trends. These trends, combined with human-made disruptions, have created an environment of increased volatility for supply chains, as depicted in Figure 3 (Martin & Holweg, 2011).


Figure 3 – Trend in Supply Chain Volatility

This figure shows the annual volatility in a composite set of key business parameters such as exchange rates, interest rates, shipping costs, and raw material prices. They are combined into a single volatility index using the coefficient of variation (CoV) of the business indices representing these parameters, producing a normalized volatility metric. While in the past supply chains returned to stability in a timely fashion after adverse events, the recent increase in volatility bandwidth calls into question whether that pattern will continue. The high collective swings (versus individual swings) in key business parameters, which may be correlated with one another, suggest that an alternative approach to designing supply chains and managing supply chain risk may be preferable.
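The coefficient-of-variation construction described above can be sketched as follows. The equal weighting across indicators and the toy series are assumptions for illustration; the published index combines many more business parameters.

```python
from statistics import mean, pstdev

def coefficient_of_variation(series):
    """CoV = standard deviation / mean: a scale-free volatility measure."""
    return pstdev(series) / mean(series)

def volatility_index(indicators):
    """Combine several business indices into one normalized volatility metric.

    indicators: dict of series (exchange rates, shipping costs, ...);
    equal weighting is an assumption for this sketch.
    """
    covs = [coefficient_of_variation(s) for s in indicators.values()]
    return mean(covs)

calm = volatility_index({
    "fx": [1.00, 1.01, 0.99, 1.00],
    "shipping": [100, 101, 99, 100],
})
turbulent = volatility_index({
    "fx": [1.00, 1.40, 0.70, 1.10],
    "shipping": [100, 160, 60, 120],
})
```

Because each series is normalized by its own mean, parameters measured in different units can be combined into a single index.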

Volatility can arise from many possible undesirable hazards, conditions or trigger events. The likelihood of such events compromising a supply chain’s vulnerability is regarded as a threat. Table 1 lists categories of possible threat sources and examples within each category. The list was compiled from several studies and is not meant to be all-inclusive (Tummala & Schoenherr, 2011) (World Economic Forum, 2012) (Accenture and World Economic Forum, 2013) (Chopra & Sodhi, 2004).


Table 1 – Supply Chain Threat Sources

Disruptions

  • Natural disasters
  • Terrorism and wars
  • Labor disputes/shortage
  • Single source of supply
  • Insufficient supplier capacity or responsiveness
  • Extreme weather

Capacity

  • Capacity inflexibility
  • Capacity cost increase
  • Geographical concentration
  • Insufficient capacity

Information System

  • Over-reliance on systems
  • Information infrastructure outages
  • Insufficient system/network integration
  • Incompatible IT platforms
  • Unavailable data/information
  • Inaccurate data/information

Sovereign

  • Regional instability
  • Conflict & political unrest
  • Government regulations
  • Loss of control
  • Intellectual property breaches
  • Corruption
  • Export/import restrictions
  • Illicit trade & organized crime
  • Ownership/investment restrictions

Strategy & Operations

  • Lean processes

Demand

  • Frequent changes in demand
  • Sudden unforeseen demand surges/dips

Process

  • Design changes
  • Communication gaps
  • Inaccurate specifications
  • Supplier non-compliance

Procurement

  • Unqualified supplier
  • Inflexibility of supplier
  • Poor supplier quality or process yield
  • Supplier insolvency
  • Rate of exchange
  • Flawed supplier sourcing
  • Commodity price volatility
  • Global energy shortages
  • Lack of supplier transparency

Transportation

  • Paperwork and scheduling
  • Strikes
  • Port capacity/congestion
  • Higher costs of transportation
  • Piracy
  • Infrastructure failures
  • Custom clearances at ports
  • Excessive handling
  • Border delays
  • Transportation breakdowns

Structural

  • Fragmentation along the supply chain
  • Extensive subcontracting
  • Dependency on a single source of supply
  • Extensive outsourcing
  • Extensive offshoring
  • Product/supply network complexity


The Changing Course in Risk Management

Many supply chains are designed under the assumption of operating in a stable environment (Martin C. H., 2011). While approaches such as Just-in-Time (JIT) and product-focused production are designed to minimize variation, maximize efficiency, and ultimately reduce costs, they require a rigid command-and-control management strategy that may not respond well in a volatile environment. In addition, the effects of volatility can be further amplified in a rigid supply chain that lacks resiliency. Building supply chain resiliency may run counter to the notion of an efficient operation, since it requires adding and re-allocating capacity, inventory, and other resources that can serve as shock absorbers to withstand disruption. Since these controls entail added costs, a risk analysis methodology is an effective tool for helping firms identify, evaluate, and prioritize the most cost-effective risk-control options. As Table 1 makes clear, there can be numerous sources of threats to a supply chain. However, since many threats have similar effects on a supply chain operation, control options can be devised using an “all hazards” philosophy, which entails implementing controls that minimize the common effects of multiple threats or threat categories.
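As an illustration of the kind of risk analysis described above, the sketch below scores threats by expected loss (likelihood times impact) and ranks candidate controls by risk reduction per unit cost; an “all hazards” control earns credit across every threat it covers. All figures and names are invented for illustration, not drawn from the cited studies.

```python
# Threat: (annual likelihood, impact in $M if it occurs) -- invented numbers.
threats = {
    "port congestion": (0.30, 4.0),
    "supplier insolvency": (0.10, 9.0),
    "IT outage": (0.20, 2.5),
}

# Control: (cost in $M, fraction of expected loss mitigated, threats covered).
# An "all hazards" control can cover several threat categories at once.
controls = {
    "dual sourcing": (0.8, 0.6, ["supplier insolvency", "port congestion"]),
    "backup data center": (0.5, 0.9, ["IT outage"]),
}

expected_loss = {t: p * impact for t, (p, impact) in threats.items()}

def benefit_per_cost(control):
    """Expected loss avoided per $M spent on the control."""
    cost, mitigation, covered = controls[control]
    return mitigation * sum(expected_loss[t] for t in covered) / cost

# Rank controls by risk reduction per dollar -- most cost-effective first.
ranked = sorted(controls, key=benefit_per_cost, reverse=True)
```

The point of the ranking is that a control covering several threat categories can justify its cost even when no single threat would.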


Supply chain disruptions can arise from many sources, both natural and man-made. Current trends indicate a continued rise in measured losses from natural events and increased business volatility in response to man-made events. Traditional supply chain structures designed for operational efficiency may not be able to withstand disruptions arising from numerous threat sources. Creating a more resilient supply chain may require risk assessment tools and methods that help decision-makers identify the most cost-effective controls to minimize the common effects of multiple threats. In the next article, we will examine some common supply chain vulnerabilities and their ensuing risks.


Accenture and World Economic Forum. (2013). Building Resilience in Supply Chains. Accenture.

Chopra, S., & Sodhi, M. S. (2004). Managing Risk To Avoid Supply-Chain Breakdown. MIT Sloan Management Review, 46(1), 53-61.

FEMA. (2011). Democratic Blog News. Retrieved from

Martin, C. H. (2011). Supply Chain 2.0: Managing Supply Chains in the Era of Turbulence. International Journal of Physical Distribution & Logistics Management, 41(1), 63-82.

Martin, C., & Holweg, M. (2011). Supply Chain 2.0: Managing Supply Chains in the Era of Turbulence. International Journal of Physical Distribution & Logistics Management, 41(1), 63-82.

Munich RE. (2014, January). Topics Geo: After the Floods. München: Munich RE.

Tummala, R., & Schoenherr, T. (2011). Assessing and Managing Risks Using the Supply Chain Risk Management Process (SCRMP). Supply Chain Management: An International Journal, 16(6), 474–483.

World Economic Forum. (2012). New Models for Addressing Supply Chain and Transport Risk. World Economic Forum.

Reimagine Predictive Analytics for the Digital Enterprise


As part of a broad announcement made at SAPPHIRE NOW 2016, SAP announced a range of new features and capabilities in its analytics solutions portfolio. Because predictive capabilities play an important role in the portfolio, I thought I’d take this opportunity to share the details of our innovations in both SAP BusinessObjects Cloud and SAP BusinessObjects Predictive Analytics.

Innovations in SAP BusinessObjects Cloud

Predictive analytics capabilities have been added to the SAP BusinessObjects Cloud offering. Business users can use an intuitive graphical user interface to investigate business scenarios by leveraging powerful built-in algorithmic models. For example, users can perform financial projections with time series forecasts, automatically identify key influencers of operational performance, and determine factors impacting employee performance with guided machine discovery.
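SAP’s built-in algorithms are not exposed as code here, but the underlying idea of a time-series projection can be sketched in a few lines of plain Python: fit a least-squares trend to historical values (made-up quarterly revenue in this example) and extrapolate it forward.

```python
# Made-up quarterly revenue ($M) for eight quarters.
quarters = list(range(8))
revenue = [10.0, 10.4, 10.9, 11.2, 11.8, 12.1, 12.7, 13.0]

# Ordinary least-squares fit of revenue = intercept + slope * quarter.
n = len(quarters)
mean_t = sum(quarters) / n
mean_y = sum(revenue) / n
slope = (sum((t - mean_t) * (y - mean_y) for t, y in zip(quarters, revenue))
         / sum((t - mean_t) ** 2 for t in quarters))
intercept = mean_y - slope * mean_t

# Project the next two quarters from the fitted trend line.
forecast = [intercept + slope * t for t in (8, 9)]
```

Production tools layer seasonality, confidence intervals, and automated model selection on top of this basic idea.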

Learn more about our predictive capabilities in SAP BusinessObjects Cloud.

Innovations in SAP BusinessObjects Predictive Analytics

Predictive analytics features that aim to help analysts easily deliver predictive insights across an enterprise’s business processes and applications are planned for availability in the near term.

Planned innovations include:

  • Automated predictive analysis of Big Data with native Spark modeling in Hadoop environments
  • Enhancements for SAP HANA including in-database social network analysis and embedding expert model chains
  • A new simplified user interface for the predictive factory and automated generation of segmented forecast models
  • Integration of third-party tools and external processes into predictive factory workflows
  • The ability to create and manage customized models that detect complex fraud patterns for the SAP Fraud Management analytic application

Learn more about what SAP Predictive Analytics has in store.

Upcoming Release of SAP Predictive Analytics

Watch the video about our upcoming release of SAP Predictive Analytics for more information.

Thank you to Pierre Leroux, Director, Predictive Analytics Product Marketing, SAP for writing this informative article.


The Human Aspect of Predictive Analytics

By Ashith Bolar , Director AmBr Labs, Amick Brown

The past decade and a half has seen a steady increase in Business Intelligence (BI). Every company boasts a solid portfolio of BI software and applications. The fundamental feature of BI is Data Analytics, and corporations that hold large amounts of data do indeed derive a lot of value from it. A natural progression of Data Analytics is Predictive Analytics (PA).

Think of data analytics as a forensic exercise in measuring the past and the current state of the system. Predictive Analytics is the extension of this exercise: Instead of just analyzing the past and evaluating the current, predictive analytics applies that insight to determine, or rather shape the future.

Predictive Analytics is the natural progression of BI.

The current state of PA in the general business world is, for the most part, in its infancy. Experts in the field talk about PA as a panacea – the be-all and end-all solution to every business problem. This is very reminiscent of the early days of BI around the turn of the century. Hype though it was, BI did end up taking center stage over the ensuing years. BI was no longer merely a solution to business problems; it became mandatory for the very survival of a company. Companies no longer implement BI to be on the leading edge of the industry; they implement it just to keep up. Without BI, most companies would not be competitive enough to survive market forces.

Very soon, PA will be in a similar state. PA will not be the leading-edge paradigm that gives one company a head start over others; instead, PA will be what you do just to survive. All technologies go through this phase transition – from leading-edge to must-have-to-survive – and PA is no different.


Having made this prediction, let’s take a look at where PA stands. Some industries (and some organizations) have been using predictive analytics for decades. One not-so-obvious example is the financial industry. The ubiquity of FICO scores in our daily lives does not make it obvious, but they are predictive analytics at work: your FICO score predicts, with a certain degree of accuracy, the likelihood of your defaulting on a loan. That simple number, which may or may not be accurate in individual cases, has arguably fueled the enormous economic machinery of this country, saving trillions of dollars for the banking industry as well as for ordinary people like you and me.
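The mechanics behind a score like this can be illustrated with a toy logistic model: borrower attributes map to a default probability, and a uniform cutoff decides the loan. The coefficients and cutoff below are invented for illustration and bear no relation to the real FICO scorecard.

```python
import math

def default_probability(utilization, late_payments, years_of_history):
    """Toy logistic model: higher utilization and more delinquencies raise
    risk, a longer credit history lowers it. Coefficients are invented."""
    z = -2.0 + 3.0 * utilization + 0.8 * late_payments - 0.1 * years_of_history
    return 1 / (1 + math.exp(-z))

def approve(applicant, cutoff=0.20):
    """Lend only when predicted default risk is below the cutoff --
    a rule applied uniformly, with no room for gut feel."""
    return default_probability(**applicant) < cutoff

low_risk = {"utilization": 0.10, "late_payments": 0, "years_of_history": 12}
high_risk = {"utilization": 0.95, "late_payments": 3, "years_of_history": 1}
```

The decision rule is deliberately mechanical: two applicants with the same attributes always get the same answer.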

Another example is the marketing departments of large retail stores. They have put formal PA to use for years in a variety of applications, such as product placement. If it works for them, there is no reason it should not work for you.

This is easier said than done. Implementing predictive analytics is not a trivial task. It’s not as if you buy a piece of software off the Internet, install it on a laptop, and boom – you’re predicting the future. Although I have to admit that is a good starting point. Implementing the initial PA infrastructure requires meticulous planning. It’s a time-consuming effort, but at this point in time, a worthy one.

Let’s take a look at a high-level task list for such a project:

  1. Build the PA infrastructure
  2. Choose/build predictive model(s)
  3. Provision Data
  4. Predict!
  5. Ensure there is company-wide adoption of the new predictive model. Make PA a key part of the organization’s operational framework, and ensure that people in the company trust the predictive model and do not try to override it with their own human intelligence.

Steps 1 through 4 are the easy bits. It’s the fifth step that requires significant effort.
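Steps 2 through 4 can be sketched in a few lines, using a deliberately simple threshold model on hypothetical machine-usage data rather than any particular PA product:

```python
# Step 3: provision data -- historical (hours_of_use, failed) records.
history = [(120, 0), (300, 0), (650, 1), (720, 1), (410, 0), (690, 1)]

# Step 2: "train" a model -- here, the midpoint between the average usage
# of machines that failed and of those that did not.
failed = [hours for hours, did_fail in history if did_fail]
ok = [hours for hours, did_fail in history if not did_fail]
threshold = (sum(failed) / len(failed) + sum(ok) / len(ok)) / 2

# Step 4: predict -- flag machines whose usage exceeds the threshold.
def predict_failure(hours):
    return hours > threshold
```

Real models are far richer, but the shape of the work is the same; the hard part, per step 5, is getting the organization to act on what `predict_failure` says.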

Most of us have relied on our own intellect when making serious decisions, and most of us believe that this decision-making process yields the best decisions. It is hard for us to imagine that a few numbers and a simple algorithm could yield better decisions than the depths of our intellect.

However, it is important to change your organization’s mindset about predictive analytics. If you intend your business to run consistently on mathematical predictive models, acceptance from the user community is crucial. Implementing PA is a substantial effort in organizational change management.

Remember the financial services industry. Lenders do not let their loan officers make spot decisions on the loan-eligibility of their clients based on appearance, style of speech, or other sensory cues. Although individual loan officers might claim to be better judges of character, the financial industry does not rely on their human intellect to measure the risk of loan default. Generally, a single three-digit number makes that decision for them.

The next time you go to the supermarket for a loaf of bread and return with a shopping cart full of merchandise you serendipitously found on the way back from the bread aisle (including the items along the checkout counter), you can thank (or curse) the predictive analytics employed by the store’s headquarters, probably a thousand miles away.


The Time to Change is Now

The world is speeding ahead at a significant pace toward a major revolution—the data-driven economy. Several data-driven start-ups of the last decade have become large corporations (Google, Facebook, Twitter), with billions of people reached and influenced by their innovations. Here is a list of the hottest start-ups that are looking to mature to the big league.

As the momentum picks up, major organizations across industry verticals are on a quest to exploit the opportunities arising from the humongous amount of data their businesses generate, directly and indirectly. Philip Evans, Senior Partner, Boston Consulting Group, discussed in his TED talk what businesses will look like in the future, and the impact that Big Data will have on business strategies.

Whether businesses want to use data to make the world a better place, to understand the wishes of customers before they’re expressed, to be more proactive than reactive in decision making based on predictive technologies, or something else, there are several challenges that we must all face.

These challenges include the following.

  1. Data volumes are ever-increasing, and most of the data is unstructured (text, video, graphs, and so on) rather than transactional and structured.
  2. Decision cycles are becoming shorter. We expect millisecond response times from the systems we interact with, and for mission-critical applications the response time can be shorter still.
  3. Thousands of predictive models are required to cover all the predictive scenarios an application can create.
  4. Traditional methods of modelling are very time-consuming. The quest to find a perfect model drains valuable time and money before the model can be put to business use.
  5. The knowledge workers who understand data science, and who can mine useful, actionable nuggets from the data, are rare. Demand for such skilled workers is ever-increasing, and their scarcity is creating a massive skills gap.

With Challenges Come Opportunities

However, with challenges come opportunities.

Consider the Industrial Revolution. At that point in history, the move was to automate processes that were repetitive or labor-intensive, freeing our most valuable resources—the brain and imagination—to focus on even larger problems. The result is the modern world we now live in.

Now the data revolution is demanding a new change in the way we work with data. We must find ways to automate most of the repetitive workflows and modelling processes that are applicable industry-wide. This frees the very valuable time of the data scientist to focus on tough problems that cannot be solved without human intervention.

With the several thousand models that a data-driven company runs on, it is also important to be able to monitor the performance of these models in real time. This means decommissioning models whose performance deviates significantly from their performance when they were deployed on production systems.
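A minimal sketch of that monitoring loop: compare each model’s live accuracy against its accuracy at deployment and flag models whose drop exceeds a tolerance. The model names, accuracy figures, and tolerance below are all illustrative.

```python
# Accuracy of each model at deployment vs. measured live -- illustrative.
deployed_accuracy = {"churn_model": 0.88, "demand_model": 0.91, "fraud_model": 0.91}
live_accuracy = {"churn_model": 0.86, "demand_model": 0.71, "fraud_model": 0.90}

TOLERANCE = 0.05  # maximum acceptable accuracy drop before retirement

def models_to_decommission(deployed, live, tolerance=TOLERANCE):
    """Return models whose live accuracy has drifted beyond tolerance."""
    return sorted(name for name in deployed
                  if deployed[name] - live[name] > tolerance)

flagged = models_to_decommission(deployed_accuracy, live_accuracy)
```

At the scale of thousands of models, a check like this has to run automatically; no team can eyeball that many dashboards.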

This paves the way for the need for a Massive Predictive Factory: a single source of truth and heartbeat monitor for the entire organization.

For more on Predictive Analytics, Follow Amick Brown

Data-Driven Decisions Improve Your Business

Fact-Based Decision Making

For many companies, the first reporting and analytics question is, “What specific items should my company measure?” However, what you measure should be driven by the results you need from your data to make measurable change in the organization. The real first question is, “What are my business goals, and what measurable components can help me achieve or miss them?”


Some examples are:

  1. Churn reduction
  2. Retirement possibilities in the next year
  3. Employees who leave in less than a year
  4. Departments with the highest attrition
  5. Supply chain service improvement
  6. JIT miscalculations by department or location
  7. Customer service complaints and late deliveries to customers
  8. Staffing variations and the effect they have on production
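To show how a goal on this list becomes a concrete metric, here is a small sketch computing attrition rate by department (goal 4) from hypothetical HR records:

```python
# One (department, left_this_year) record per employee -- made-up data.
employees = [
    ("sales", True), ("sales", False), ("sales", True), ("sales", False),
    ("engineering", False), ("engineering", False), ("engineering", True),
    ("support", True), ("support", True), ("support", False),
]

headcount, exits = {}, {}
for dept, left in employees:
    headcount[dept] = headcount.get(dept, 0) + 1
    exits[dept] = exits.get(dept, 0) + int(left)

# Attrition rate per department, and the department needing attention first.
attrition = {dept: exits[dept] / headcount[dept] for dept in headcount}
highest = max(attrition, key=attrition.get)
```

A number like this ties directly back to a business goal, which is exactly the point of measuring it.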

Why fact-based is important…

Another common occurrence in the reporting world is decisions made without trackable, measurable facts. Unbelievably, companies still make decisions based on historical process, experience, and, to some degree, their “gut.”

With the availability of big data, your competitors will have access not only to information about themselves but also about your customers and your ability to perform. If your company does not use this data well, you will lose business.

So how exactly does your company aggregate data, produce reports, or glean insight to meet its goals? If you are like the vast majority of companies, it is a big data dump into a spreadsheet that is picked through and interpreted by the individual who requested it.

Many companies have very slick reporting solutions but are not leveraging them to their full potential – not by a long shot. Why? The pervasive gaps are that people are creatures of habit who want to keep reporting the way they always have, and change management and training are not factored in. The poor IT manager put in charge of the BI project is inundated with ongoing data-dump and help requests.

Circling back to data-driven decisions, the very first things that must be carefully and completely determined are:

  1. What challenges prevent me/my company from beating the competition, increasing revenue, operating smoothly, and so on?
  2. Which people drive the resolution of these challenges?
  3. What data and report metrics do these users need to show the golden road to overcoming the challenges?

Begin with the decisions that need to be made, then the people who will drive goal attainment, and then the data and metrics that roll up to an answer. With that, the first big hurdle to Business Intelligence success will have been overcome.

For more on this subject, watch this space…