All posts by Amick Brown

SAP BW/4HANA – The Value of Agility

Thank you to Neil McGovern, SAP Senior Director of Marketing, for this article.

On August 31, 2016, we released SAP BW/4HANA. A great deal of ink and pixels has been spilled outlining its capabilities, but I’d like to look at one of the key reasons we built this new product.
As we stated in our earlier post, the Forrester research we sponsored showed a correlation between business agility and revenue growth.
With SAP BW/4HANA, the option to deploy on SAP HEC and Amazon AWS (with others to follow), in addition to on-premise, gives customers the ability to start a new BW/4HANA instance in under an hour.
BW/4HANA users will also have the advantage of enhanced modeling coupled with a dramatically simplified set of data warehousing objects and simplified governance, enabling agile data warehousing development that delivers better real-time business insights faster, and at less cost, than before. Business applications can leverage either “pre-built” data warehousing or “SQL-based” development environments, which deliver faster go-to-market and let applications work with other SQL-based solutions.
Agility is key for BW/4HANA. We have improved agility in three ways:
SIMPLICITY
• The number of data objects is reduced, which eliminates data redundancies, increases consistency, and results in a smaller footprint
MODERN INTERFACE
• Administration efforts to maintain data objects and error-prone data flows are also reduced
• Customers can build their own HANA models on top of the BW/4HANA models
• The user interface for administrators will improve, as next-generation HANA Studio and browser front ends replace SAP GUI front ends
OPENNESS
• Customers see great benefit in the ability to expose BW/4HANA models as native HANA views
• Customers will be able to use HANA in “BW” mode and in native “SQL” mode or a combination of the two
One of our first BW/4HANA customers is Fairfax Media. Fairfax is an experienced BW customer with business challenges that fit BW/4HANA’s new capabilities. With BW/4HANA, the project took half the time anticipated: three months instead of six. The flexibility and simplicity of the new object set and development environment, plus the ability to develop in the cloud, were key to this success. The resulting system was 10 times faster for end users and helped identify cost savings in their expenses.

BW/4HANA, HANA Cloud Integration, and S/4HANA Finance are areas where Amick Brown can help your company succeed. AmickBrown.com

Learn more about BW/4HANA at sap.com/bw4hana

 

Get More Value From Operational Assets with Predictive Analytics

Sharpening operational focus and squeezing more efficiencies out of production assets—these are just two objectives that have COOs and operations managers turning to new technologies. One of the best of these technologies is predictive analytics. Predictive analytics isn’t new, but a growing number of companies are using it in predictive maintenance, quality control, demand forecasting, and other manufacturing functions to deliver efficiencies and make improvements in real time. So what is it?

Predictive analytics is a blend of mathematics and technology that learns from experience (the data companies are already collecting) to predict a future behavior or outcome within an acceptable level of reliability.

Predictive analytics can play a substantial role in redefining your operations. Today, let’s explore three additional cases of predictive analytics in action:

  • Predictive maintenance
  • Smart grids
  • Manufacturing

Predictive Maintenance

Predictive maintenance assesses equipment condition on a continuous basis and determines if and when maintenance should be performed. Instead of relying on routine or time-based scheduling, like having your oil changed every 3,000 miles, it promises to save money by calling for maintenance only when it is actually needed, or when imminent equipment failure must be headed off.

While equipment is in use, sensors measure vibrations, temperature, high-frequency sound, air pressure, and more. Predictive models make sense of this streaming data and score it on the likelihood of a failure occurring. Coupled with in-memory technologies, predictive maintenance can detect machine failures hours before they occur and avoid unplanned downtime by scheduling maintenance sooner than planned.
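To make the scoring step concrete, here is a minimal sketch of the idea, assuming a toy scoring rule that stands in for a model trained offline on historical sensor data. All names, weights, and thresholds are invented for illustration and are not any specific SAP API.

```python
# Minimal sketch: score one streaming sensor reading for failure risk.
# The weights and thresholds stand in for a model trained offline on
# historical sensor data; everything here is illustrative.
from dataclasses import dataclass

@dataclass
class SensorReading:
    vibration_mm_s: float  # vibration velocity
    temperature_c: float   # bearing temperature
    acoustic_db: float     # high-frequency sound level

def failure_score(r: SensorReading) -> float:
    """Toy scoring rule standing in for a trained predictive model:
    weight each normalized signal and clamp the result to [0, 1]."""
    score = (0.5 * min(r.vibration_mm_s / 10.0, 1.0)
             + 0.3 * min(max(r.temperature_c - 60.0, 0.0) / 40.0, 1.0)
             + 0.2 * min(r.acoustic_db / 100.0, 1.0))
    return min(score, 1.0)

ALERT_THRESHOLD = 0.8  # illustrative cut-off for raising a work order

reading = SensorReading(vibration_mm_s=8.5, temperature_c=92.0, acoustic_db=75.0)
if failure_score(reading) >= ALERT_THRESHOLD:
    print("High failure risk: schedule maintenance before the next shift")
```

In production, the same pattern runs continuously over the sensor stream, so a rising score can trigger maintenance hours before the predicted failure.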

This all means less downtime, decreased time to resolution, and optimal longevity and performance for equipment operators. For manufacturers, predictive maintenance can streamline inventory of spare parts and the ongoing monitoring services can become a source of new revenue. And as predictive maintenance becomes part of the equipment, it also has the potential to become a competitive advantage.

Smart Grids

Sensors and predictive analytics are also changing the way utilities manage highly distributed assets like electrical grids. From reliance on unconventional energy sources like solar and wind to the introduction of electric cars, the energy landscape is evolving. One of the biggest challenges facing energy companies today is keeping up with these rapid changes.

Smart grids emerge when sensor data is combined with other data sources such as temperature, humidity, and consumption forecasts at the meter level to predict demand and load. For example, combined with powerful in-memory technologies, predictive analytics can be used by electricity providers to improve load forecasting. That leads to frequent, less expensive adjustments that optimize the grid and maintain delivery of consistent and dependable power.

As more houses are equipped with smart meters, data scientists using predictive analytics can build advanced models and apply forecasting to groups of customers with similar load profiles. They can also present those customers with some ideas to reduce their energy bill.
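As a rough illustration of meter-level load forecasting, here is a minimal sketch using linear regression in Python with scikit-learn. The features and figures are invented; a real deployment would train on far more history and run on an in-memory platform.

```python
# Illustrative sketch: forecast hourly grid load from weather and history.
import numpy as np
from sklearn.linear_model import LinearRegression

# Features per hour: [temperature_c, humidity_pct, load_same_hour_last_week_mw]
X = np.array([
    [28.0, 60.0, 410.0],
    [31.0, 55.0, 455.0],
    [24.0, 70.0, 380.0],
    [35.0, 50.0, 498.0],
    [22.0, 75.0, 360.0],
])
y = np.array([420.0, 470.0, 385.0, 515.0, 365.0])  # observed load (MW)

model = LinearRegression().fit(X, y)

# Predict tomorrow's 3 p.m. load from forecast weather plus last week's load.
tomorrow = np.array([[33.0, 52.0, 480.0]])
print(f"Forecast load: {model.predict(tomorrow)[0]:.0f} MW")
```

The same kind of model, refit per group of similar load profiles, is what enables those frequent, less expensive grid adjustments.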

Manufacturing

The manufacturing industry continues its relentless drive for customization and “lot sizes of 1” with innovations such as the connected factory, the Internet of Things, next-shoring, and 3D printing. It’s also hard at work making sure it extracts the maximum productivity from existing facilities, which traditionally has been accomplished by using automation and IT resources. According to Aberdeen, the need to reduce the cost of manufacturing operations is now the top reason companies seek more insight from data.

Quality control has always been an area where statistical methods play a key role in deciding whether to accept or reject a lot. Now manufacturers are expanding predictive analytics to the testing phase as well. For example, tests on components like high-end car engines can be stopped long before the end of the actual procedure thanks to predictive analytics. By analyzing data from the component’s ongoing test against data from other engines, engineers can identify potential issues faster. That, in turn, maximizes the capacity available for testing and reduces unproductive time. That is only one of the many applications manufacturers find for predictive analytics.
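As a hypothetical sketch of that early-stopping idea, the snippet below compares a partially completed engine test against a 3-sigma envelope built from engines that previously passed, and aborts as soon as the trace leaves the envelope. All checkpoints and figures are invented.

```python
# Sketch: stop a component test early when its trace leaves the envelope
# of engines that previously passed. Data and tolerances are invented.
import numpy as np

reference_mean = np.array([100.0, 140.0, 170.0, 190.0, 200.0])  # kW per checkpoint
tolerance = 3 * np.array([2.0, 2.5, 3.0, 3.5, 4.0])             # 3-sigma band

def first_failure(trace):
    """Return the first checkpoint where the trace leaves the passing
    envelope, or None if it stays inside so far."""
    for i, value in enumerate(trace):
        if abs(value - reference_mean[i]) > tolerance[i]:
            return i
    return None

partial_trace = [101.0, 139.0, 181.0]  # test still running: 3 of 5 checkpoints
failed_at = first_failure(partial_trace)
if failed_at is not None:
    print(f"Abort test at checkpoint {failed_at}: out of envelope")
else:
    print("Within envelope so far: continue the test")
```

Freeing the rig the moment a failure is statistically certain is what recovers the testing capacity described above.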

Innovations on the Shop Floor

Predictive analytics provides an excellent opportunity for COOs and operations managers to extract additional value from production assets. It can also be an opportunity to create critical differentiators in the way products are created and delivered to customers—by providing it as a paid service (predictive maintenance) or as insight (predicting future electricity consumption).

However a company chooses to use it, predictive analytics can be the key to beating the competition.

Discover and Follow

And join the predictive conversation by following me on Twitter @pileroux.

AmickBrown.com

 

5 Ways to Drive Value with BI Proof of Concepts

by Kaan Turnali, Global Senior Director, Enterprise Analytics

Proofs of concept (POCs) specifically designed for business intelligence (BI) projects can be invaluable because they can help to mitigate or eliminate the risks associated with requirements, whether we’re working with a new BI technology, asset, or data source.

POCs (sometimes referred to as proof of principle) may be presented with slightly varying interpretations in different areas of business and technology. However, a BI POC attempts to validate a proposed solution that may cover one or more layers of the BI spectrum through a demonstration with a small number of users.

There are many reasons why a BI POC may be needed, and they may come in different shapes and sizes. Some focus on the end user; others may deal with data or the ETL process. BI POCs can be small, quick, and even incomplete. Or they can be involved, measured, and lengthy. Some are initiated ad-hoc and executed informally while others may require a process as strict as a full-scale project and the same level of funding as a formal engagement.

Here are five ways to drive value with your BI POCs.

1. Focus More on the Value and Less on the Mechanics

You can’t lose sight of the big picture—it doesn’t matter how simple the BI requirement may appear or how informal the process you’re asked to follow may be. BI teams often concentrate on the technical details (a necessary step), but you need to go beyond the mechanics and think about the value. A technical solution alone may not be adequate, because technology is only half of the solution—and BI is no different.

2. Identify All of the BI Layers in Question

In a typical BI project, there are usually several layers involved: data, ETL, reports, access, and so on. Depending on the size and/or scope of your BI project, identifying the correct BI layers that need validation becomes critical. For example, you may be looking at a report design, but you can’t simply ignore the underlying data source or required data transformation rules.

3. Cheat on Sample Data, but Not on the Logic

Time is an extremely scarce resource in business, and POCs are often executed at a higher velocity. As you manage the process, it’s completely acceptable to cut corners such as hard coding a value in a report instead of fully defining the formula or building an integrated process to calculate it. But if you cheat, you should always cheat on time and not on concept.
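As a tiny, hypothetical illustration of cheating on time but not on concept: hard-code the input you have not built a feed for yet, flag it loudly, and keep the logic under validation real.

```python
# POC sketch: stub the data feed, keep the business logic real.

# TODO(POC): replace with the integrated FX-rate feed before go-live.
EUR_USD_RATE = 1.08  # hard-coded sample value, acceptable for the POC

def revenue_in_usd(revenue_eur: float) -> float:
    """The conversion logic is the concept under validation, so it is
    implemented properly even though its input rate is stubbed."""
    return revenue_eur * EUR_USD_RATE

print(f"${revenue_in_usd(125_000.0):,.2f}")
```

The stub saves days of integration work during the POC, while the calculation everyone is evaluating behaves exactly as it would in production.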

4. Define and Manage the Scope

No matter how informal your BI POC may be, you need to define and maintain a POC scope. Open-ended or prolonged efforts result in waste, and BI POCs are not immune to this virus. A POC may not require intricate project- or change-management processes, but you still need to have a plan and execute around that plan.

5. The Right Talent Matters

Identifying the right talent with the right background is critical to your BI POC’s success. It goes without saying that subject matter expertise in BI, as well as in areas related to business content and processes, is a prerequisite. However, equally important are the soft skills, starting with critical thinking.

Bottom Line

If our goal is to enable faster, better-informed decisions, technical know-how alone won’t guarantee successful outcomes, because a POC is only as good as its assumptions and the BI team that’s executing it.

It all starts and ends with leadership that can pave the way for executing a BI vision where technology becomes a conduit to delivering business growth and profitability through the talent and passion of our teams.

What other ways do you see that can drive value with BI POCs?

AmickBrown.com

 

What if Beethoven and Mozart Invented Their Own Notation Systems?

To appreciate how semantic notation can impact your business, take a step back for a moment and imagine if every composer from Mozart to Beethoven had used a different notation system. How would conductors and musicians interpret the music at a glance without standardized notation? What if engineers didn’t have a standardized notation system? Most likely they wouldn’t be able to communicate vast amounts of information clearly and quickly.

Yet in business, there is an overabundance of ways to lay out corporate reports and dashboards. Even within a single company, you will find forecast data or averages defined and displayed differently. With a standard layout, pattern recognition lets you immediately understand the context of that information. This is the essence of a standard notation system, which brings clearer, data-driven insights and faster visualization turnaround.

Communicate Vast Amounts of Data-Driven Insights with Clearer, More Aligned Messages

Executives can digest and act on visual data faster when it is always laid out the same way: forecasts, averages, and historical data always look the same and appear in the same layout. People learn quickly to recognize patterns, and this helps them interpret volumes of data. Critical to good business decision-making is the ability to portray very dense amounts of information while maintaining clarity. This is vital when pulling together multiple metrics to understand how they relate to the business.

A visualization that shows a percentage breakdown of revenue into products is, in itself, not very useful. To act and make better decisions, you need to understand how revenue has changed over time and to compare it to other product lines, the budget, profit margin, and market share. Also, executives spend less time trying to align the data into one version of the truth when all metrics are calculated and portrayed using the same standards across all business units. Having a standard notation system in your business can help you foster a data-driven culture and alignment for better decision making.
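To make this concrete, here is a minimal sketch of what a standard notation system can look like in code: one shared plotting function that every report uses, so actuals, forecasts, and budgets are always drawn the same way. The conventions below are simplified stand-ins, not the actual IBCS rules, written in Python with matplotlib purely for illustration.

```python
# Sketch: one shared "notation" applied to every chart in the company.
import matplotlib.pyplot as plt

STYLE = {
    "actual":   {"color": "black", "linestyle": "-"},   # solid = actual
    "forecast": {"color": "black", "linestyle": "--"},  # dashed = forecast
    "budget":   {"color": "grey",  "linestyle": ":"},   # dotted = budget
}

def plot_metric(ax, periods, series_by_kind, title):
    """Every report goes through this one function, so readers learn
    the visual pattern once and reuse it everywhere."""
    for kind, values in series_by_kind.items():
        ax.plot(periods, values, label=kind, **STYLE[kind])
    ax.set_title(title, loc="left")
    ax.legend(frameon=False)

fig, ax = plt.subplots()
nan = float("nan")  # Q4 actual not yet booked
plot_metric(ax, ["Q1", "Q2", "Q3", "Q4"],
            {"actual": [10, 12, 11, nan],
             "forecast": [10, 12, 11, 13],
             "budget": [11, 11, 12, 12]},
            "Revenue (m USD)")
plt.show()
```

Because the styling lives in one place, every new dashboard inherits the house notation for free.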

Faster and Improved Visualization of Analysis and Insights

Although content creators spend less time inventing their own system and layout, they still follow guidelines. In speaking with people who have adopted the International Business Communication Standards (IBCS), I found, for example, that the average time to create a dashboard dropped three-fold. Through standard notation systems, they shortened implementation times and improved the outcome of their analytics investments.

Don’t Start from Scratch – These Are Best Practices

One of the best-developed semantic notation systems—chosen by the SAP Executive Board back in 2011—is based on an open source project called International Business Communication Standards. Anyone can join the association and benefit from thought leadership and best practices developed over decades. Plus, the community is included in the evolution of these standards.

Register for the Standards Course

openSAP lets you learn anywhere, anytime, and on any device, with free courses open to the public.

This blog originally appeared on D!gitalist Magazine by SAP and the SAP BusinessObjects Analytics blog, and has been republished with permission.

 

Dresner’s Advanced and Predictive Analytics Study Ranks SAP #1 for Second Time in a Row

By Chandran Saravana, Senior Director, Predictive Analytics Product Marketing

For the second year in a row, SAP has received the number one ranking in the Wisdom of Crowds 2016 Advanced and Predictive Analytics Market Study by Dresner Advisory Services. The Dresner study reached over 3,000 organizations and vendors’ customer communities across 20+ industry verticals, with organization sizes ranging from 100 to 10,000+.

Study findings include:

  • Organizations view advanced/predictive analytics as building on existing business intelligence efforts.
  • Over 90% agree about the importance and value of advanced and predictive analytics.
  • Statisticians/data scientists, business intelligence experts, and business analysts are the greatest adopters of advanced and predictive analytics.
  • Regression models, clustering, textbook statistical functions, and geospatial analysis are the most important analytic user features/functions.
  • Usability features addressing sophisticated advanced/predictive analytic users are almost uniformly important today and over time, led by easy iteration, advanced analytic support, and model iteration.
  • In-memory analytics and in-database analytics are the most important scalability requirements to respondents, followed distantly by Hadoop and MPP architecture.

I find it interesting that the Dresner study finds “hybrid roles are also evident,” which matches how SAP’s customer organizations use predictive analytics. The research study looked at core advanced and predictive features, data preparation, usability, scalability, and integration as key criteria to rank the vendors. Though the usability criteria covered many things, I would like to highlight one key criterion—“support for easy iteration”—which ranked as most important.

In the scalability criteria, “In-memory analytics” ranked as the most important, followed by “In-database analytics” and “In-Hadoop analytics (on file system).”

Read the Complete Dresner Report

You can find lots more in the 92-page Dresner Wisdom of the Crowds report. I invite you to take a look.

AmickBrown.com

I Think We Need a GRC Tool. Where Do We Start?

by Thomas Frenehard, SAP GRC Solution Management

I’m not going to say that I get this question every Monday morning, but it does pop up rather more often than I would have expected. This is especially the case when risk, compliance, or audit departments ask their IT counterparts to send them a list of suitable governance, risk and compliance (GRC) vendors but fail to really explain what they need. In essence, the request to their mind is pretty simple—just get me the list and rankings published by “[…] and […]” (fill in the blanks with your favourite analyst companies). That should do the trick.

This is usually what triggers the question above, with the IT department reaching out to ask what a GRC solution actually is, to ensure that they only source relevant options to present to their internal clients.

As mentioned by my colleague Jan Gardiner a few weeks ago (GRC ≠ Access Authorization Management), for us at SAP, the GRC portfolio is a combination of more than 10 solutions.

As a result, before even going further into any discussion, my answer is always the same: “Well, what do you need to do?” I could spend hours explaining and illustrating the benefits of a full internal control solution, but if the original request comes from the audit team, who are looking for a tool to support their risk-based auditing process, I’m not sure it would be of much use.

What’s the Requirement?

So first things first—define the need. Easier said than done, of course, but it will be the foundation for everything. If the requesting team doesn’t have a predefined idea of exactly what it needs in terms of detailed requirements, you can always reach out internally and see what is already available and being used (tools, spreadsheets, shared drives, and so on).

Even if you then decide that none are the right option, this will still give you a good idea of what people are using today and for what purpose. And if you push your investigation further and interview the key users, they might even tell you what they currently lack. This is essential as it may lead to needs or pain points that are beyond the ones initially expressed.

Prepare for Today but Plan for Tomorrow

Now that you know the requirements, define where you are today and where you want to be tomorrow (or the day after). And keep in mind that a tool will never solve all your problems at once but it can bring new ones if expectations aren’t managed properly.

Using the information you collected above, work on a roadmap—what are the first features needed today to facilitate work life, and what is, at the moment, a “nice to have” that you know will be important in the future?

With this in mind, start prioritizing so that the tool selected will be able to answer the immediate requirements, but also accompany the company as it evolves.

Leave Tabula Rasa to Aristotle!

Your company already has a wealth of risk registers, control libraries, audit repositories, and so on.

This is the “GRC memory” of your company and you certainly don’t want to get rid of it.

Collect as much as you can and then work with the business owners to review the data—define what should be carried forward, what is redundant and can be let go, and so forth.

And for what you decide is worth keeping, ensure that it’s complete and well documented. This way, not only will you embark on a new tool, but you will also have secured the consistency of the imported data. No need to have a Formula 1 car if you don’t have the right fuel, right?

Of course, I have oversimplified the process, but for a short blog that was my intent. I hope this has still given you some food for thought for the next time your business owners call saying, “We need a GRC tool—what do you suggest?”

I look forward to reading your thoughts and comments either on this blog or on Twitter @TFrenehard

 

The Self-Service BI Application Dinner: Restaurant Guests and Home Cooks

In a recent thread on social media, there was an interesting discussion about just how “self-service-like” today’s self-service analytics components really are. Some of the thread contributors doubted whether self-service BI was really something one could hand over to a business end user. They are concerned whether self-service really can exist in the day-to-day life of an end user. “Isn’t there always some ICT intervention needed?” someone asked. It’s an interesting discussion that doesn’t have a black-and-white answer. So let’s take a closer look with the help of a restaurant analogy.

The doubters in the social media thread were talking about self-service for data analysts. But there is a small but strict difference between self-service for end users or consumers, and self-service for data analysts. To explain this, I’ll need to use the analogy of an analytics dinner, and consider the differences between the home cook and the restaurant guest.

The BI Restaurant Guest

Our guests “equal” the business end users of analytics. A dinner can be seen as a collection of analytical insights. The insights are carefully selected: our guests either pick from the menu, ordering à la carte, or go to the buffet and choose from dishes already prepared for consumption. Ordering à la carte refers to end users opening specific dashboards, reports, or storyboards from the business analytics portal.

The BI restaurant guest’s workflow is:

  • Screen the menu and roughly select the type and amount of items they want. Our analytics end user chooses whether he/she needs financial or logistics info, and what level of detail is needed.
  • Next, our guest chooses a specific item from the menu. In analytics terms, the user decides which reports, dashboards, and/or storyboards he/she needs to get the required insights. Our user also decides on the prompts or variables needed to set the specific scope of the insights.
  • When dinner is served, our guest simply enjoys what he/she asked for, leaving leftovers if feeling like it.

The BI restaurant buffet guest’s workflow is similar, with the difference that special requests (like a steak well done) are not possible. However, the buffet allows the guest to sample multiple small plates according to their individual needs, just as an analytical end user might consume reports and dashboards in random order.

Our guest will typically be a user of existing SAP BusinessObjects Design Studio applications or SAP BusinessObjects Cloud storyboards. I have described how they work in this article.

The BI Analytics Home Cook

Our next ‘flavor’ of self-service user is the home cook who has to cook for him/herself. This user is more like a data analyst: somebody who may not have a clear view of what kind of insight is needed, or who requires insight on non-corporate data that is not explored on a regular basis.

Here the workflow differs. Imagine the workflow of the TV cooks we all see on television every single day; it is the exact same workflow as our self-service end user’s.

1. Our home cook opens the fridge and explores the ingredients needed; think of the data analyst who accesses the data sources he/she requires to start exploring data.

2. Next, our home cook starts cleaning, cutting, seasoning, mixing, and combining the ingredients. Only those pieces of the ingredients that are needed for the meal are used. This is where our data analyst starts filtering, enriching (hierarchies, formulas), blending (combining data sources), and cleaning the data.

3. When this is all done, we typically see the home cook putting the selected ingredient mix in the pot on the stove. This is where the data analyst starts creating the visualizations, graphs, and maps and combines them into a final storyboard that might be shared with others later on.

4. Our home cook makes quite an important decision in the last step: either serve the plate to the guests (colleagues or management), or put the final meal on a buffet for guests/users to consume.

The Final Analysis

So in the end, I believe self-service always needs to be seen in the context of the type of end user. Do we talk about a guest in our restaurant who wants to digest analytics, play with the data to some extent, and draw conclusions on the fly, or do we talk about a home cook who needs to create the insights from scratch?

In terms of the guest, self-service BI 100% exists today in the sense that they can use applications and reports and do anything (!) with the data as long as that data is part of the menu. For home cooks, there is a bit more work to be done—they need to open the fridge and make choices. Maybe some of the ingredients are not in the fridge, and our cook needs to go to the shop to buy them. Also, the personal touch given to the meal depends fully on the creativity and capability of our cook.

Oh, and You Mr. Restaurant-Owner, What Do You Think?

If you happen to be the restaurant owner—the BICC or ICT manager—you of course decide on the quality of the overall meals presented by managing ingredients and menus, but you also monitor the experience your guests go through. We might call this governance and organization. Even in self-service environments, the restaurant owner is key to the success of the restaurant. If you fail, your guests will go somewhere else.

This blog is excerpted from Iver van de Zand’s article, “How ‘Self-Service Like’ Are BI Applications Really? Buffet or a la Carte.” Read the complete article at the Iver van de Zand blog.

In the New Digital Economy, Everything Can Be Digitized and Tracked: Now What?

Welcome to a world where digital reigns supreme. Remember when the Internet was more of a ‘push’ network? Today, it underpins how most people and businesses conduct transactions – providing peer-to-peer connections where every single interaction can be tracked.

Enterprises are still not taking full advantage. With hundreds of millions of people connected, it’s possible for them to connect their suppliers with their customers and their payment systems, and reach the holy grail of seamlessly engaging in commerce, where a transaction can be tracked from purchase, to order received, to manufacturing, through to shipment— all in real time. It’s clear that end-to-end digitization delivers enormous potential, but it has yet to be fully tapped by most companies.

In the latest #askSAP Analytics Innovations Community Webcast, Reimagine Predictive Analytics for the Digital Enterprise, attendees were given an introduction to SAP BusinessObjects Predictive Analytics, along with some key use cases. The presentation covered native in-memory predictive analytics, deploying predictive analytics on Big Data, and how to bring predictive insight to Business Intelligence (BI).

The live, interactive call was moderated by SAP Mentor Greg Myers and featured expert speakers Ashish Morzaria, Global GTM Director, Advanced Analytics, and Richard Mooney, Lead Product Manager for Advanced Analytics.

The speakers noted that companies used to become leaders in their industries by establishing an unbeatable brand or by having a supply chain more efficient than anyone else’s. While this is still relevant in the digital economy, companies now have to think about how they can turn the new digital economy to their advantage. One of the keys is harnessing its key driver: the data.

Companies embracing digital transformation are outperforming those who aren’t. With predictive analytics, these companies can use historical data to predict behaviors or outcomes, answer “what-if” questions, and ensure employees have what they need to make optimized decisions. They can fully leverage customer relationships with better insight, and make meaningful sense of Big Data.

One big question delved into during the call: How can companies personalize each interaction across all channels and turn each one into an advantage? The answer: By getting a complete digital picture of their customers and applying predictive analytics to sharpen their marketing focus, optimize their spend, redefine key marketing activities, and offer product recommendations tailored to customers across different channels.

Real-World Customer Stories

The call also focused on some real-world examples of customers achieving value by using and embedding predictive analytics in their decisions and operations, including Cox Cable, Monext, M-Bank, and Mobilink.

These companies have been able to improve performance across thousands of processes and decisions, and also create new products, services, and business models. They’ve squeezed more efficiencies and margins from their production assets, processes, networks, and people.

One key takeaway is the importance of using algorithms, as they provide insights that can make a business process more profitable or competitive, and spotlight new ways of doing business and new opportunities for growth.

The speakers also presented a detailed customer case study on Harris Logic. The company is using SAP BusinessObjects Predictive Analytics for automated analytics and rapid prototyping of its models. It executes models in SAP HANA for real-time predictions using a native logistic regression model. This approach allows Harris Logic to identify the key predictors that most heavily influence a behavioral health outcome.
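The webcast did not detail the model internals, but as a rough analogue of that pattern (train a logistic regression model on historical outcomes, then score incoming records), here is a minimal sketch in Python with scikit-learn. The features and data are invented for illustration; Harris Logic’s production models run natively in SAP HANA.

```python
# Illustrative analogue: fit logistic regression on historical records,
# then score a new record in (near) real time. All data is invented.
import numpy as np
from sklearn.linear_model import LogisticRegression

# Historical records: [prior_visits, days_since_contact, risk_flags]
X_train = np.array([
    [1, 200, 0],
    [4,  30, 2],
    [2,  90, 1],
    [6,  10, 3],
    [0, 365, 0],
    [5,  15, 2],
])
y_train = np.array([0, 1, 0, 1, 0, 1])  # observed outcome per record

model = LogisticRegression().fit(X_train, y_train)

# Score an incoming record.
new_record = np.array([[3, 25, 2]])
print(f"Predicted outcome probability: {model.predict_proba(new_record)[0, 1]:.2f}")

# The fitted coefficients hint at which predictors weigh most heavily.
print(dict(zip(["prior_visits", "days_since_contact", "risk_flags"],
               model.coef_[0].round(2))))
```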

Learn More

Lots of food for thought. See what questions people were asking during the webcast and get all of the answers here. Check out the complete presentation, and continue to post your questions and watch for dates for our upcoming webcast in the series via Twitter using #askSAP.

AmickBrown.com

What You Need to Know About Supply Chain Risk

#3 in a series by Matthew Liotine, Ph.D., Strategic Advisor, Business Intelligence and Operations; Professor, University of Illinois

In our previous articles, we discussed how disruptions to a supply chain can originate from a multitude of sources. Current trends make it apparent that there is a continued rise in measured losses from disruptions such as natural events and business volatility. Traditionally, supply chains are designed for lean operational efficiency wherever possible, yet such efficiency requires the minimization of excess capacity, inventory, and redundancy – the very things that are needed to create resiliency against disruptive risks. Risk assessment tools and methodologies help decision-makers identify the most cost-effective controls, those that strike the right balance between cost and risk reduction to protect against disruption. Typically, the most cost-effective controls are those that minimize the common effects arising from multiple disruptive threats. To understand the kinds of controls that could be effective, one must recognize the risk outcomes that stem from common supply chain vulnerabilities, which is the focus of this article.

What is Risk?

Before continuing, it is worth revisiting some of the terminology from our previous discussion in order to understand how risk is derived. Fundamentally, risk is the chance (or the probability) of a loss or unwanted negative consequence. For decision purposes, it is often calculated numerically as a function of probability and impact (sometimes called single loss expectancy) and expressed quantitatively as an “expected” loss in monetary value or some other unit. A common flaw with using risk values is that they mask the effects of impact versus probability. For example, an expected loss of $100 does not reflect whether high impact is overwhelming low probability, or high probability is overwhelming low impact: it is not clear whether this value is the expected loss due to an event that occurs 10% of the time and causes $1,000 in damages, or due to an event that occurs 20% of the time and causes $500 in damages. For this very reason, risk values must be used in conjunction with probability and damage values, along with many other metrics, for the decision maker to compare one risk against another. Risk values are not precise and should not be used as standardized values for business management. Nevertheless, they can provide decision makers with a means to distinguish risks and control options on a relative basis. Figure 1 illustrates the fundamental parameters that are used to construct risk values, and how they relate to each other.


Figure 1 – Fundamental Components of Risk
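In symbols, the single loss expectancy described above is simply probability times impact, and the two example events collapse to the same figure:

```latex
\mathrm{Risk} = P(\text{event}) \times \mathrm{Impact}, \qquad
\underbrace{0.10 \times \$1{,}000}_{\text{rarer, costlier event}}
= \underbrace{0.20 \times \$500}_{\text{likelier, cheaper event}}
= \$100
```

That collapse is exactly why a risk value must always be read alongside its probability and impact components.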

Hazards, conditions and triggers are situations that increase or cause the likelihood of an adverse event (sometimes referred to as a peril). In our last article, we examined numerous sources of hazards that can threaten a supply chain. Vulnerabilities are factors that can make a system, in our case a supply chain, susceptible to hazards.  They are usually weaknesses that can be compromised by a hazardous condition, resulting in a threat. The likelihood, or probability, of a threat circumstance occurring must be considered, for reasons discussed above. If it occurs, failures can take place, whose effects are quantified as impacts. When impacts are weighed against the likelihood of the threat, the result is a risk that poses an expected loss. Controls are countermeasures that a firm can use to offset expected losses.

With respect to a supply chain, there are many ways to classify risk. Academics have made many attempts to classify risks according to some kind of ontology or framework (Harland, Brenchley and Walker 2003) (Gupta, Kumar Sahu and Khandelwal 2014) (Tummala and Schoenherr 2011) (Peck 2005) (Monroe, Teets and Martin 2012) (Chopra and Sodhi 2004). Some of the more common supply chain risk classifications include:

Recurring risks – These risks arise within the operational environment due to the inability to match supply and demand on a routine basis. The ensuing effects are lower service levels and fill rates.

Disruptive risk – These risks result from loss of supply or supplier capacity, typically driven by some disruptive event.

Endogenous risk – These risks arise within the operational environment and are process driven (e.g. poor quality control, design flaws, etc.), usually within the direct influence of the firm. They typically require the use of preventive mechanisms for control.

Exogenous risk – These risks originate externally, either from the supply side or demand side, and may not necessarily be under a firm’s direct influence. They typically involve the use of responsive mechanisms for control.

While many classification attempts have been noble in nature, in the end it is difficult to classify risks according to a single scheme, for a variety of reasons. First, the lines of demarcation between risk categories can blur, and categories can overlap; from the above list, for example, one can easily argue about the differences between endogenous and recurring risks. Second, every firm is different, so one framework may not fit all. Finally, risk methodology approaches differ somewhat across industries, as evidenced by different industry best practices and standards for risk analysis.

Supply chains can exhibit many kinds of vulnerabilities, but quite often these can be viewed as either structural or procedural in nature. Structural vulnerabilities stem from deficiencies in how the supply chain is organized, provisioned and engineered. Single points of failure can arise when there is insufficient diversity across suppliers, product sources or the geographical locations of sources. Inadequate provisioning can create shortages in inventory or capacity to meet customer demands. Procedural vulnerabilities stem from deficiencies in business or operational processes. Gaps and oversights in planning, production or transport processes could adversely affect a firm’s ability to respond to customer needs. Insufficient supply chain visibility could render a firm blind to oversights in supplier vetting and management practices, quality assurance and control, or demand planning.

Such vulnerabilities, combined with one of the aforementioned hazardous conditions, result in the supply chain failing in some fashion. Table 1 lists some of the more common modes of supply chain failure.

Table 1 – Common Supply Chain Failure Modes

• Degraded fill rate
• Degraded service level
• High variability of consumption
• Higher product cost
• Inaccurate forecasts
• Inaccurate order quantity
• Information distortion
• Insufficient order quantities
• Longer lead times/delays
• Loss of efficiency
• Lower process yields
• Operational disruption
• Order fulfillment errors
• Overstocking/understocking
• Poor quality supplied
• Supplier stock out

 

Ultimately, such supply chain failures result in increased costs, loss of revenue, loss of assets, or a combination thereof. Common risks are typically assessed as increases in ordering costs, product costs, or safety stock costs. Product stock-out losses can be assessed as backorder costs or as lost sales and business revenue. Different kinds of firms are prone to different types of risks. For example, a manufacturing firm with long supply chains will be more susceptible to ordering-variability (or bullwhip) effects, whereas a shorter retail supply chain will be more sensitive to fill rate and service level variability. Understanding and characterizing these risks is necessary in order to develop strategies to control or manage them. Quantifying risks gives the decision maker a gauge to assess risk before and after a control is applied, and thereby to assess the prospective benefit of a potential control. Using quantified risk values, in combination with other parameters, enables a decision maker to prioritize potential control strategies according to their cost-effectiveness.
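As a toy illustration of that before-and-after arithmetic, the sketch below (in Python, with invented figures) computes the expected loss of a single threat, applies candidate controls, and ranks them by net annual benefit.

```python
# Sketch: rank candidate controls by cost-effectiveness using the
# expected-loss arithmetic described above. All figures are invented.
def expected_loss(probability: float, impact: float) -> float:
    return probability * impact

baseline = expected_loss(0.20, 500_000)  # e.g. supplier stock-out risk

controls = {
    # name: (annual cost, probability after control, impact after control)
    "second source":   (40_000, 0.05, 500_000),
    "safety stock":    (25_000, 0.20, 250_000),
    "expedite option": (10_000, 0.15, 400_000),
}

def net_benefit(cost, p, impact):
    """Risk reduction delivered by the control, minus its annual cost."""
    return (baseline - expected_loss(p, impact)) - cost

for name, params in sorted(controls.items(),
                           key=lambda kv: net_benefit(*kv[1]),
                           reverse=True):
    print(f"{name:15s} net annual benefit: {net_benefit(*params):>9,.0f}")
```

Ranking by net benefit rather than raw risk reduction keeps the comparison honest: a cheap control that removes less risk can still beat an expensive one.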

Conclusions

Risk is the chance or the probability of a loss or unwanted negative consequence. Inherent supply chain weaknesses such as sole sourcing, process gaps, or a lack of geographical sourcing diversity can render a supply chain more vulnerable to a hazardous, unforeseen condition or trigger event, such as a strike or major storm, resulting in undesirable increases in costs, asset loss, or revenue loss. Such risks can be quantified to some extent, quite often in monetary units, and can be used to facilitate cost-benefit analysis of potential control strategies. In our next article, we will take a look at some of the most favored strategies for controlling supply chain risk.

AmickBrown.com

Bibliography

Chopra, S., and M. Sodhi. “Managing Risk to Avoid Supply-Chain Breakdown.” MIT Sloan Management Review, 2004: 53-61.

Gupta, G., V. Kumar Sahu, and A. K. Khandelwal. “Risks in Supply Chain Management and its Mitigation.” IOSR Journal of Engineering, 2014: 42-50.

Harland, C., R. Brenchley, and H. Walker. “Risk in Supply Networks.” Journal of Purchasing & Supply Management, 2003: 51-62.

Monroe, R. W., J. M. Teets, and P. R. Martin. “A Taxonomy for Categorizing Supply Chain Events: Strategies for Addressing Supply Chain Disruptions.” SEDSI 2012 Annual Meeting Conference Proceedings. Southeast Decision Sciences Institute, 2012.

Peck, H. “Drivers of Supply Chain Vulnerability.” International Journal of Physical Distribution & Logistics Management, 2005: 210-232.

Tummala, R., and T. Schoenherr. “Assessing and Managing Risks Using the Supply Chain Risk Management Process (SCRMP).” Supply Chain Management: An International Journal, 2011: 474-483.

 

 

10 Data Visualizations You Need to Know Now

No one likes reading through pages or slides of stats and research, least of all your clients. Data visualizations can help simplify this information, not only for them but for you too! These ten data visualizations will help you present a wide range of data in a visually impactful way.

1. Pie Charts and Bar Graphs—The Usual Suspects for Proportion and Trends

New to data visualization tools? Start with the traditional pie chart and bar graph. Though these may be simple visual representations, don’t underestimate their ability to present data. Pie charts are good tools in helping you visualize market share and product popularity, while bar graphs are often used to compare sales revenue over the years or in different regions. Because they are familiar to most people, they don’t need much explanation—the visual data speaks for itself!

2. Bubble Chart—Displaying Three Variables in One Diagram

When you have data with three variables, pie charts and bar graphs (which can represent two variables at most) won’t cut it. Try bubble charts, which are generally a series of circles or “bubbles” on a simple X-Y axis graph. In this type of chart, the size of each circle represents the third variable, often a size or quantity.

For example, if you need to present data on the quantity of units sold, the revenue generated, and the cost of producing the units, use a bubble chart. Bubble charts immediately capture the relationship between the three variables and, like line graphs, can help you identify outliers quickly. They’re also relatively easy to understand.
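As a quick illustration, here is a minimal bubble chart sketch in Python with matplotlib; the product figures are invented, and bubble area carries the third variable (production cost).

```python
# Minimal bubble chart: units sold vs. revenue, bubble size = cost.
import matplotlib.pyplot as plt

units_sold = [120, 300, 180, 450]        # x-axis
revenue_k = [240, 510, 300, 820]         # y-axis (k USD)
production_cost_k = [90, 200, 110, 350]  # third variable -> bubble area

plt.scatter(units_sold, revenue_k,
            s=[c * 3 for c in production_cost_k],  # scale cost to point area
            alpha=0.5)
plt.xlabel("Units sold")
plt.ylabel("Revenue (k USD)")
plt.title("Product portfolio: production cost shown as bubble size")
plt.show()
```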

3. Radar Chart—Displaying Multiple Variables in One Diagram

For more than three variables in a data set, move on to the radar chart. The radar chart is a two-dimensional chart shaped like a polygon with three or more variables represented as axes that start from the same point.

Radar charts are useful for plotting customer satisfaction data and performance metrics. Primarily a presentation tool, they are best used for highlighting outliers and commonalities, as radar charts are able to simplify multivariate data sets.

4. Timelines—Condensing Historical Data

Timelines are useful in depicting chronological data. For example, you can use a timeline to chart company milestones, like product launches, over the years.

Forget the black and white timelines in your history textbooks with few dates and events charted. With simple tools online, you can add color and even images to your timeline to accentuate particular milestones and other significant events. These additions not only make your timeline more visually appealing, but easier to process too!

5. Arc Diagrams—Plotting Relationships and Pairings

The arc diagram utilizes a straight line and a series of semicircles to plot the relationships between variables (represented by nodes on the straight line), and helps you to visualize patterns in a given data set.

Arc diagrams are commonly used to portray complex data; the number of semicircles within the diagram depends on the number of connections between the variables. They are often used to chart the relationships between products and their components, social media mentions, and brands and their marketing strategies. The diagram can itself become complex, so play around with line width and color to make it clearer.

6. Heat Map—For Distributions and Frequency in Data

First used to depict financial market information, the heat map has nothing to do with heat but does display data “intensity” and size through color. Usually utilizing a simple matrix, the 2D area is shaded with different colors representing different data values.

Heat maps are not only used to show financial information, but web page visit frequency, sales numbers, and company productivity as well. If you’ve honed your data viz skills well enough, you can even create a heat map to depict real-time changes in sales, the financial market, and site engagement!
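For illustration, here is a minimal heat map sketch in Python with matplotlib, shading an invented month-by-region sales matrix so that color carries the intensity.

```python
# Minimal heat map: darker cells = higher sales. Figures are invented.
import matplotlib.pyplot as plt
import numpy as np

regions = ["North", "South", "East", "West"]
months = ["Jan", "Feb", "Mar"]
sales = np.array([[12, 15, 19],
                  [22, 21, 25],
                  [9, 11, 10],
                  [17, 20, 26]])  # k USD

fig, ax = plt.subplots()
im = ax.imshow(sales, cmap="YlOrRd")  # one shaded cell per matrix value
ax.set_xticks(range(len(months)))
ax.set_xticklabels(months)
ax.set_yticks(range(len(regions)))
ax.set_yticklabels(regions)
fig.colorbar(im, label="Sales (k USD)")
plt.show()
```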

7. Choropleth and Dot Distribution Maps—For Demographic and Spatial Distributions

Like heat maps, choropleths and dot distribution maps use color (or dots) to show differences in data distribution. However, they differ from heat maps because they’re tied to geographical boundaries. Choropleths and dot distribution maps are particularly useful for businesses that operate regionally or want to expand to cover more markets, as they can help present the sales, popularity, or potential need of a product to the client in compelling visual language.

8. Time Series—Presenting Measurements over Time Periods

This looks something like a line graph, except that the x-axis only charts time, whether in years, days, or even hours. A time series is useful for charting changes in sales and webpage traffic. Trends, overlaps, and fluctuations can be spotted easily with this visualization.

As this is a precise graph, the time series graph is not only good for presentations (you’ll find many tools to help you create colorful and even dynamic time series online), it’s useful for your own records as well. Professionals both in business and scientific studies typically make use of time series to analyze complex data.

9. Word Clouds—Breaking Down Text and Conversations

It may look like a big jumble of words, but a quick explanation makes this a strong data visualization tool. Word clouds use text data to depict word frequency. In an analysis of social media mentions, instead of simply reporting that “exciting” has been used x times while “boring” has been used y times, the most frequently used word appears largest, while a word that hardly appears is set in the smallest font.

Word clouds are frequently used in breaking down qualitative data sets like conversations and surveys, especially for sales and branding firms.
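As a short sketch, the snippet below builds a word cloud from toy survey text; it assumes the third-party wordcloud package alongside matplotlib.

```python
# Sketch: render word frequencies from toy survey text as a word cloud.
# Assumes the third-party `wordcloud` package (pip install wordcloud).
import matplotlib.pyplot as plt
from wordcloud import WordCloud

responses = " ".join([
    "exciting exciting useful fast fast fast",
    "boring useful exciting intuitive fast",
])

wc = WordCloud(width=800, height=400, background_color="white").generate(responses)

plt.imshow(wc, interpolation="bilinear")  # most frequent words render largest
plt.axis("off")
plt.show()
```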

10. Infographics—Visualizing Facts, Instructions and General Information

Infographics are the most visually appealing visualization on this list, but they also require the most effort and creativity. Infographics are a series of images and text or numbers that tell a story with the data. They simplify instructions for complex processes and make statistical information easily digestible. For marketers, infographics are a popular form of visual content and storytelling.

Get more information on building charts, graphs and visualization types.

See more at: http://blog-sap.com/analytics/2016/07/11/10-data-visualizations-you-need-to-know-now/