Knowledge Space


Robotic Process Automation – It is all about delivery!

Since 2018, the appeal of digitizing the workplace has grown dramatically. According to Google Trends, interest in technologies like Robotic Process Automation (RPA), Machine Learning (ML) and Artificial Intelligence (AI) has grown at an average weekly rate of 23% since the beginning of 2018. It stands to reason that interest in the technology has increased so dramatically in the last few years. The fact that one minute of work from an RPA program translates into 15 minutes of human activity means employees can be released from the “prison of the mundane” to work on higher priority tasks.

Today it’s not just theory. Companies are seeing real “hard dollar” cost savings by leveraging the technology. According to Leslie Willcocks, professor of technology, work, and globalization at the London School of Economics’ Department of Management, “The major benefit we found in the 16 case studies we undertook is a return on investment that varies between 30 and as much as 200 percent in the first year!” She has found incredible benefits for employees, too. “In every case we looked at, people welcomed the technology because they hated the tasks that the machines now do, and it relieved them of the rising pressure of work.”

The clear advantages of big data, artificial intelligence and machine learning are likely to change the nature of work across a wide range of industries and occupations. According to a recent Oxford University study, The Future of Employment, 47 percent of total US employment will likely be automated or digitized over this decade.

But not everything is coming up roses. Despite widespread global interest in and adoption of RPA, a recent study conducted by Ernst & Young revealed that 30% to 50% of initial RPA projects fail! While critics blame the underlying technology, that is seldom the case. Usually, the root cause lies in inattention to risk and internal control considerations in the design and deployment of the bot technology. And that is what this article is about: how to mitigate the risks involved in RPA deployment and improve the odds of bot deployment success.

It’s All About Delivery

At TPMG Global© we added digital technology, like RPA, to our Lean Management and Six Sigma service offerings in 2018. We found the technology to be a natural extension of our value proposition of delivering better, faster, and less costly value streams for our clients. For those unfamiliar: lean management is all about clinically analyzing internal processes to find and get rid of waste. Six Sigma, on the other hand, is all about defect reduction and standardizing the ruthless pursuit of perfection. The natural outcome of both methods is improved productivity (output per unit of input) and lower cost.

Without the technology, a well-deployed lean six sigma system helps companies improve their operating margins by 25 to 30%. With the technology, companies experience tremendous speed and consistently shorter cycle times. Shortened cycle times and fewer defects in core value streams help companies get rid of order-to-cash backlogs, deliver rapidly to their customers and increase their recognized revenue per quarter by more than 47%.

Above, we mentioned this article is about how to mitigate the risks involved in RPA deployment and improve the odds of bot deployment success. Below we have outlined the four simple steps our obsessive and compulsive lean six sigma black belts use in the deployment of Robotic Process Automation.

Step 1 – Be Clinical

Our lean six sigma black belts think of themselves as doctors and client organizations as patients.  They unbiasedly and unemotionally view the internal operations of a company like the internal systems of the human body – inextricably linked and interdependent.  Before thinking of deploying RPA, they obsessively and compulsively analyze internal value streams from end-to-end.  They examine each step, assess data flows, evaluate the roles of people & technology, and reconcile everything to current methods and procedures.  Like super sleuths they not only search for waste and defects, but they also seek and find the agents responsible for creating both.  This diagnosis serves as the basis for the treatment they use to perform corrective action and mitigate certain types of risks like data security and compliance issues. 

Step 2 – Treat the Patient

Once our black belts have examined the patient, they create and standardize future-state solutions that cure the patient of waste and defects. Only then do they pinpoint and examine the requirements of the job functions of interest for automation.

Step 3 – Test the Technology for Repeatability and Reproducibility

No one knows better than a lean six sigma black belt that achieving perfection is impossible. Despite accepting this reality, TPMG black belts take confidence in the fact that only by pursuing perfection can they catch excellence. We carry this attitude into the development and testing of bot technology. TPMG uses a methodology called Design for Six Sigma (DFSS) to ensure functional requirements are translated into technical requirements, which are then programmed and rigorously tested. As the programming goes through user acceptance testing (UAT), TPMG black belts ruthlessly take developers, employees, and testers through cycles of improvement to maximize the RPA bot’s ability to repeat and reproduce the defect-free work for which it is designed. All jobs have their exceptions. The routine cycles of repeatability and reproducibility testing work to minimize the impact of the risks described above.
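
As an illustration of what those repeatability cycles can look like in practice, here is a minimal sketch, in Python, of tallying pass rates for a bot across repeated UAT runs of the same test cases. The test-case names and results are hypothetical and are not TPMG's actual test data or method.

```python
# Hypothetical UAT log: each tuple is (test_case_id, run_number, passed).
from collections import defaultdict

uat_results = [
    ("invoice-001", 1, True),  ("invoice-001", 2, True),  ("invoice-001", 3, True),
    ("invoice-002", 1, True),  ("invoice-002", 2, False), ("invoice-002", 3, True),
    ("invoice-003", 1, False), ("invoice-003", 2, False), ("invoice-003", 3, False),
]

# Group the pass/fail outcomes by test case.
runs = defaultdict(list)
for case_id, _, passed in uat_results:
    runs[case_id].append(passed)

# Any case that does not pass on every repeated run is a repeatability gap
# worth another improvement cycle (often an unhandled exception path).
for case_id, outcomes in runs.items():
    pass_rate = sum(outcomes) / len(outcomes)
    flag = "" if pass_rate == 1.0 else "  <- investigate exception handling"
    print(f"{case_id}: {pass_rate:.0%} pass rate over {len(outcomes)} runs{flag}")
```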

Step 4 – Hypercare

Once the bots are developed and ruthlessly tested, TPMG deploys them into production and puts them through a process called “hypercare.” Hypercare is an anal-retentive form of bot monitoring in which bot functions are watched for unintended consequences.

Is your organization interested in learning more about Robotic Process Automation?

In which one of these areas are you personally convinced there is room for improvement in your company: scaling for growth, productivity improvement, cost effectiveness, or cycle time reduction? If you are curious, TPMG Process Automation can not only help you answer this question but can also shepherd you through a no risk/no cost discovery process. We can partner with you to identify a job function and set up a complimentary proof of concept RPA bot. As an outcome of the discovery process, you can: 1. benefit from a free cost/benefit analysis, 2. demonstrate the value of RPA for your operation, and 3. discover if RPA is a good fit for your organization.

Contact TPMG Process Automation

ABOUT THE AUTHOR

Gerald Taylor is TPMG Global©’s Managing Director and a Certified Lean Six Sigma Master Black Belt.


Regression Analysis Tutorial and Examples

I’ve written a number of blog posts about regression analysis and I’ve collected them here to create a regression tutorial. I’ll supplement my own posts with some from my colleagues.

This tutorial covers many aspects of regression analysis including: choosing the type of regression analysis to use, specifying the model, interpreting the results, determining how well the model fits, making predictions, and checking the assumptions. At the end, I include examples of different types of regression analyses.

If you’re learning regression analysis right now, you might want to bookmark this tutorial!

Why Choose Regression and the Hallmarks of a Good Regression Analysis

Before we begin the regression analysis tutorial, there are several important questions to answer.

Why should we choose regression at all? What are the common mistakes that even experts make when it comes to regression analysis? And, how do you distinguish a good regression analysis from a less rigorous regression analysis? Read these posts to find out:

Tutorial: How to Choose the Correct Type of Regression Analysis

Minitab statistical software provides a number of different types of regression analysis. Choosing the correct type depends on the characteristics of your data, as the following posts explain.

Tutorial: How to Specify Your Regression Model

Choosing the correct type of regression analysis is just the first step in this regression tutorial. Next, you need to specify the model. Model specification consists of determining which predictor variables to include in the model and whether you need to model curvature and interactions between predictor variables.

Specifying a regression model is an iterative process. The interpretation and assumption verification sections of this regression tutorial show you how to confirm that you’ve specified the model correctly and how to adjust your model based on the results.

  • How to Choose the Best Regression Model: I review some common statistical methods, complications you may face, and provide some practical advice.
  • Stepwise and Best Subsets Regression: Minitab provides two automatic tools that help identify useful predictors during the exploratory stages of model building.
  • Curve Fitting with Linear and Nonlinear Regression: Sometimes your data just don’t follow a straight line and you need to fit a curved relationship.
  • Interaction effects: Michelle Paret explains interactions using Ketchup and Soy Sauce.
  • Proxy variables: Important variables can be difficult or impossible to measure but omitting them from the regression model can produce invalid results. A proxy variable is an easily measurable variable that is used in place of a difficult variable.
  • Overfitting the model: Overly complex models can produce misleading results. Learn about overfit models and how to detect and avoid them.
  • Hierarchical models: I review reasons to fit, or not fit, a hierarchical model. A hierarchical model contains all lower-order terms that comprise the higher-order terms that also appear in the model.
  • Standardizing the variables: In certain cases, standardizing the variables in your regression model can reveal statistically significant findings that you might otherwise miss.
  • Five reasons why your R-squared can be too high: If you specify the wrong regression model, or use the wrong model fitting process, the R-squared can be too high.
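
To make the specification ideas above concrete, here is a minimal sketch in Python using the statsmodels library (rather than Minitab, which the tutorial itself uses) of a model that includes a squared term for curvature and an interaction between two hypothetical predictors, x1 and x2.

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

# Simulated data with curvature and an interaction built in.
rng = np.random.default_rng(42)
df = pd.DataFrame({"x1": rng.normal(size=200), "x2": rng.normal(size=200)})
df["y"] = (3 + 2 * df.x1 - 1.5 * df.x2
           + 0.8 * df.x1 * df.x2 + 0.5 * df.x1**2
           + rng.normal(scale=1.0, size=200))

# I(x1**2) adds a curvature term; x1:x2 adds the interaction between predictors.
model = smf.ols("y ~ x1 + x2 + I(x1**2) + x1:x2", data=df).fit()
print(model.summary())  # coefficients, p-values, R-squared
```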

Tutorial: How to Interpret your Regression Results

So, you’ve chosen the correct type of regression and specified the model. Now, you want to interpret the results. The following topics in the regression tutorial show you how to interpret the results and effectively present them:

Tutorial: How to Use Regression to Make Predictions

In addition to determining how the response variable changes when you change the values of the predictor variables, the other key benefit of regression is the ability to make predictions. In this part of the regression tutorial, I cover how to do just this.

  • How to Predict with Minitab: A prediction guide that uses BMI to predict body fat percentage.
  • Predicted R-squared: This statistic indicates how well a regression model predicts responses for new observations rather than just the original data set.
  • Prediction intervals: See how presenting prediction intervals is better than presenting only the regression equation and predicted values.
  • Prediction intervals versus other intervals: I compare prediction intervals to confidence and tolerance intervals so you’ll know when to use each type of interval.
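
As a rough companion to the posts above, here is a minimal Python sketch (statsmodels rather than Minitab, with simulated data and a single hypothetical predictor x) showing how a fitted model produces both confidence intervals for the mean response and the wider prediction intervals for individual new observations.

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

# Simulated data with a single hypothetical predictor x.
rng = np.random.default_rng(7)
df = pd.DataFrame({"x": np.linspace(0, 10, 100)})
df["y"] = 1.5 + 0.8 * df.x + rng.normal(scale=1.0, size=100)

model = smf.ols("y ~ x", data=df).fit()
pred = model.get_prediction(pd.DataFrame({"x": [2.5, 7.5]}))
frame = pred.summary_frame(alpha=0.05)

# mean_ci_* bounds the average response; obs_ci_* is the wider prediction
# interval for an individual new observation.
print(frame[["mean", "mean_ci_lower", "mean_ci_upper",
             "obs_ci_lower", "obs_ci_upper"]])
```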

Tutorial: How to Check the Regression Assumptions and Fix Problems

Like any statistical test, regression analysis has assumptions that you should satisfy, or the results can be invalid. In regression analysis, the main way to check the assumptions is to assess the residual plots. The following posts in the tutorial show you how to do this and offer suggestions for how to fix problems.

  • Residual plots: What they should look like and reasons why they might not!
  • How important are normal residuals: If you have a large enough sample, nonnormal residuals may not be a problem.
  • Multicollinearity: Highly correlated predictors can be a problem, but not always!
  • Heteroscedasticity: You want the residuals to have a constant variance (homoscedasticity), but what if they don’t?
  • Box-Cox transformation: If you can’t resolve the underlying problem, Cody Steele shows how easy it can be to transform the problem away!
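
Here is a minimal Python sketch (again outside Minitab, on simulated data) of two of the checks listed above: plotting residuals against fitted values and applying a Box-Cox transformation to a skewed response.

```python
import numpy as np
import pandas as pd
import matplotlib.pyplot as plt
import statsmodels.formula.api as smf
from scipy import stats

# Simulated, deliberately skewed response so the Box-Cox step has work to do.
rng = np.random.default_rng(3)
df = pd.DataFrame({"x": rng.uniform(1, 10, 200)})
df["y"] = np.exp(0.3 * df.x + rng.normal(scale=0.3, size=200))  # strictly positive

model = smf.ols("y ~ x", data=df).fit()

# Residuals vs. fitted values: look for random scatter around zero; curvature
# or a funnel shape suggests misspecification or nonconstant variance.
plt.scatter(model.fittedvalues, model.resid, s=10)
plt.axhline(0, color="gray")
plt.xlabel("Fitted values")
plt.ylabel("Residuals")
plt.show()

# Box-Cox requires a strictly positive response variable.
y_transformed, lam = stats.boxcox(df["y"])
print(f"Estimated Box-Cox lambda: {lam:.2f}")
```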

Examples of Different Types of Regression Analyses

The final part of the regression tutorial contains examples of the different types of regression analysis that Minitab can perform. Many of these regression examples include the data sets so you can try them yourself!

 

VOC Advances: New Paths to Understanding Customers

By Kimberly Watson-Hemphill and Anthony E. Curtis

Voice of the Customer Strategy

Most companies today say they are using voice of the customer (VOC) data to make decisions. But what exactly does that mean? In a 2002 survey by the Confederation of British Industry, with responses from more than 400 companies, the VOC methods mentioned included:

  • Surveys, 65 percent
  • Ideas meetings, 53 percent
  • Service/product testing, 50 percent
  • Formal observation, 18 percent

If a company’s goal is to stay ahead of its competition, there are two fatal flaws with this state of affairs:

  1. These traditional forms of VOC collection are unreliable even when the purpose is simply to improve what is already offered to customers. The odds of these methods helping the company push its market boundaries through innovations in products or services are virtually nil.
  2. Most companies are not even making good use of these traditional methods. Pushed for details, most managers will describe doing a survey once or twice a year, or say they get customer input only when testing a completely developed prototype. That is far too late in the design process to have a significant impact.

Beyond the Traditional Forms of VOC

There are some hard truths that businesses today are only just starting to grapple with. Most competitors in a particular field have access to the same customers and the same market information. The company that best understands those customers will end up with the biggest business advantage.

Developing this level of understanding demands skills well beyond traditional VOC techniques. Customers usually cannot explain their needs or wishes that would lead to innovative or transformational products and services because:

  • They do not know a supplier’s capabilities as well as that supplier does – so it does not occur to them that a supplier may be able to help them solve a problem.
  • Customers’ creativity is more likely to be focused on their jobs than on the products or services they use.
  • People are better at reacting to specific ideas than coming up with insights on their own.
  • When customers are asked if they like a new offering, they may lie. They may not want to hurt anyone’s feelings; or they may just want to avoid an argument.

Simply asking customers what they like or do not like about current products or services will not work. Microsoft fell afoul of this by asking customers to attend a focus group, use their software for a few hours, and answer questions interactively. It went something like this:

Question: Did you like the product?
Answer: Yup!
Question: Any features you do not like or want to add?
Answer: Nope!

Based on these answers, it might appear that Microsoft had a winner right out of the gate. But when Microsoft developers began recording keystrokes and videotaping customers’ experience, they discovered a wide range of negative customer reactions – grimaces, hesitations, etc.

Ethnography: The New Science for Understanding Customers

If simply asking customers what they like will not work, what will? The answer is incorporating close, detailed observation of customer behavior into design work. The epitome of this trend is the emerging field of customer ethnography, where a company finds ways to “live with” selected customers to get an in-depth understanding of their needs and how they use a product or service in real life. Ethnography is a discipline built on the principles of social anthropology, studying people in their native habitat. (Of course, in a business context, that habitat is more likely to be an office, school or home than the jungles of New Guinea.)

At its simplest level, ethnography includes any direct observation of customers with an eye towards identifying things that could make their lives easier. For example, Scott Cook noticed how much time it took his wife to pay the monthly bills and how repetitive the task was. This was the birth of his idea for Quicken, the personal finance software, which grew into a billion-dollar company. The practice of observing customer behavior has continued, now alive in Intuit’s “Follow Me Home” research program, which is designed to gather what is being called ethnographic customer data. Because of that continued emphasis on understanding customers’ lives, Quicken and other Intuit products are consistently rated among the easiest-to-use software.

The purpose of ethnography is to generate the kind of deep and intuitive understanding of customer needs and frustrations that cannot help but inspire creative insights. A company will select a few customers or potential customers to observe, typically about 10. (While other VOC methods are concerned with information quantity, ethnography focuses on quality.) A team of trained observers is sent to watch the customers. Their goals are to:

  • Develop a holistic view of customer needs – look at all the behaviors associated with a particular need, not just a single task, including all the activities that surround a product or service a supplier offers.
  • Expose and record “tribal knowledge” – the things that people do automatically, that they do not consciously think about.
  • Identify customer frustrations and areas of less-than-optimal efficiency, whether or not they are related to the product or service a supplier offers.

A Case Study in Ethnography

Other than Intuit’s approach, no financial services businesses have made available reports on their use of ethnography, though Bank of America has set up an experimental branch where it can test any number of customer services. But the experience of a retail chain which wanted to improve customer experiences at its stores provides an example that could easily be adapted to banks with branch locations. This case study shows how ethnography complements more traditional forms of VOC.

The retail company team’s goal was to understand how it could redesign its stores to give shoppers a more pleasant experience (one that would translate into sales, of course). To get started, the team:

  • Looked at the current state of store layout and design and asked how that matched up with what the customer “wanted” – as much as they knew at that point, at least.
  • Reviewed existing quantitative data. Like most good companies, this retail chain had an abundance of market and consumer segmentation studies, market share studies and business results on hand.

The team used this historical data as a starting point. (Many companies will stop here and not go any further, assuming that this data is true and basing all their decisions on it. In fact, such an assumption is seldom true.) Based on what was learned, the team began working on two different fronts:

1. What Other Companies Were Doing (Benchmarking)

  • The team made many trips to competitors’ stores, did subjective evaluations of whether those designs seemed to be working, and looked for design features they could incorporate into their redesign effort.
  • Team members traveled far and wide searching for the newest, hottest store design examples and concepts. For example, they found that European retail stores were much more cutting-edge in their fixturing designs.
  • The team also looked at designs for other types of stores, hoping to find inspiration.

2. What Customers Wanted (VOC Collection)

  • The team recognized that focus groups, surveys and simple interviews would not supply the information it needed. The team went to customers, on their turf, visiting them in their own homes to hear about their issues and concerns.
  • The team also conducted “shop-alongs,” going to various retailers with consumers to observe their actions, asking for clarification on why they did what they did and capturing detailed notes on consumer reactions.
  • The team turned some staff into “mystery shoppers,” who went to stores to shop for certain things and interact with the sales associates to see how customers are treated and what is offered to them.

Based on the information collected, the team moved into the next design phase – prototyping. Though often used only for new product development, prototyping is critical for all development efforts. This team took its research and ideas and incorporated them into miniature store layouts and designs. For example, to test a completely new design of the music section, they constructed (in open warehouse space) a scaled version of the new fixtures and layout. Then they brought in customers to test out the shopability of the new design. The feedback was immediately incorporated into an improved design and a second prototype, which also was tested. The same process was used for each department until the store design was complete.

Conclusion: Getting New Insights

A growing body of case studies shows how ethnography leads to insights that companies simply cannot get any other way. A book about this new discipline, The Art of Innovation by Tom Kelley, profiles IDEO, a firm in Palo Alto, California (USA). The firm has used ethnography to design everything from medical equipment to an office furniture showroom.

One downside of ethnography is that it is time- and labor-intensive. Also, a company needs to guard against designing a product or service based on just a few customers. The experiences of the few people a business chooses to observe in-depth can be a great source of inspiration and provide the starting point for next-generation products and services. But the more traditional forms of VOC – focus groups, phone interviews, etc. – are still needed to validate findings from an ethnographic study.

Defining Critical to Quality Characteristics: A Key Step in the Design Process

By J. DeLayne Stroud

After starting a project and gathering the voice of the customer (VOC), it is time to define the critical-to-quality outputs (CTQs).

CTQs are the key measurable characteristics of a product or process whose performance standards or specification limits must be met in order to satisfy the customer. These outputs represent the product or service characteristics defined by the customer (internal or external). They may include the upper and lower specification limits or any other factors related to the product or service. Typically, a CTQ must be interpreted from a qualitative customer statement to an actionable, quantitative business specification. Establishing CTQs is vital for a company to meet customer needs and keep up with the competition.

VOC Becomes CTQs

The flowchart in Figure 1 provides an overview of the requirements necessary to translate the VOC into usable CTQs.

Operational definitions of the flowchart steps are:

  • Characteristics of product or service: A word or phrase that describes some aspect of the product or service. Example: dry cleaning process time.
  • Measures and operational definitions: A definition of how the product or service’s characteristic is to be quantified. There may be several ways of quantifying a given characteristic. Example: the unit used to measure time between when the cleaner receives clothes and when the clothes are ready for pickup (hours).
  • Target value(s): The aim for a product or service. If there were no variation in the product or service, this is the value that would always be achieved; it is the desired level of performance. Example: clothes ready for pickup in 24 hours.
  • Specification limits: How much variation is the customer willing to tolerate in the delivery of the product or service? Specification limits are performance limits that are acceptable to the customer. Example: Upper specification limit for dry cleaning process time is 28 hours.
  • Defect rate(s): This is how often the producer is willing to produce a product or service outside the specification limits. Example: 3.4 defects per million opportunities.
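
Following the dry-cleaning example in the list above, here is a minimal sketch of how a CTQ's target, specification limit and defect rate might be represented and computed in code. The observed turnaround times are simulated purely for illustration.

```python
from dataclasses import dataclass
import numpy as np

@dataclass
class CTQ:
    name: str
    target: float  # desired level of performance
    usl: float     # upper specification limit

    def defect_rate(self, observations: np.ndarray) -> float:
        """Fraction of observations that fall outside the specification limit."""
        return float(np.mean(observations > self.usl))

# Dry-cleaning example: target of 24 hours, upper spec limit of 28 hours.
turnaround = CTQ(name="dry cleaning process time (hours)", target=24.0, usl=28.0)

# Hypothetical observed turnaround times, simulated for illustration only.
rng = np.random.default_rng(0)
observed_hours = rng.normal(loc=25.0, scale=1.5, size=10_000)

rate = turnaround.defect_rate(observed_hours)
print(f"Defect rate: {rate:.2%} ({rate * 1e6:.0f} DPMO)")
```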

Figure 1: Flowchart for VOC to CTQ

Data Quality and CTQs

Although it is often overlooked, data quality is an important consideration in the design effort. The impact of poor data quality can be very serious. From an organizational perspective, it may create extra costs, rework and low productivity; drive the “wrong” decisions (because of outdated data); and prompt a sense of frustration or lack of trust. From a project perspective, it could result in project delays and impairment of testing. Project teams need to ensure that the data associated with their designs is both accurate and complete. This may be accomplished by defining CTQs for data quality.

Possible data quality CTQs include:

  • Access restriction
  • Age
  • Availability
  • Completeness
  • Definition and format
  • Encryption
  • Timeliness

Types of Data

Data can be discrete or continuous. When possible, practitioners should collect continuous data because it can be recorded at many different points. Examples include length, size, time, temperature and cost. Continuous data can be broken down into smaller parts, meaning practitioners can get more information about what they are measuring than from discrete (attribute) data.

Setting Measurements

The design of a product or service starts with quantified requirements. Practitioners need to develop measures for which targets and limits can be established. There may be several ways to quantify a given characteristic. Practitioners should try to pick measures that can be used as inputs to design and avoid measures that are only relevant after the product or service is being produced or offered (e.g., customer satisfaction, complaints). Also, it is important to consider how the characteristic will be measured. Practitioners must avoid measurement systems that, in themselves, introduce variation into the process.

Choosing the Right Metrics

Practitioners can save a lot of frustration by choosing the right metrics up front. This will not eliminate the need to evaluate the metrics during the design process, but it will cut down on the overall project duration. The selected metrics need to be solution independent and support the product or service as an indicator of customer needs. But keep in mind that all customers are not created equal – the project may require more than one measure per customer need. Again, also choose continuous metrics if possible.

Developing Targets and Establishing Specification Limits

Unfortunately, there is no specific recipe for setting targets and specifications. This is a function of business know-how and technical expertise, so practitioners should use the business or subject-matter experts to assist them with brainstorming and developing these requirements. There are many variables to consider, as shown in Figure 2. (Note: Current or projected capability to achieve a performance level should not be the primary basis for establishing targets. To ensure success in the market, the customer and competitive information should be the primary drivers.)

Figure 2: Considerations and Drivers Used to Identify CTQs

Elements of the House of Quality

One of the most powerful tools used in defining CTQs is the quality function deployment (QFD), also known as the house of quality. This is a structured methodology and mathematical tool used to identify and quantify customers’ requirements and translate them into key critical parameters. QFD helps practitioners to prioritize actions to improve their process or product to meet customers’ expectations.

As Don Clausing and John Hauser write in their article about QFD, “The House of Quality”: “None of this is simple. An elegant idea ultimately decays into process, and processes will be confounding as long as human beings are involved. But that is no excuse to hold back. If a technique like the house of quality can help break down functional barriers and encourage teamwork, serious efforts to implement it will be many times rewarded.”

The QFD was originally developed by Yoji Akao in 1966 when he combined his work in quality assurance and quality control points with function deployment used in value engineering. Akao described QFD as a “method to transform user demands into design quality, to deploy the functions forming quality, and to deploy methods for achieving the design quality into subsystems and component parts, and ultimately to specific elements of the manufacturing process.”

Figure 3 shows the design of the house of quality.

Figure 3: House of Quality

QFD is designed to help planners focus on characteristics of a new or existing product or service from the viewpoints of market segments, company or technology-development needs. The technique yields graphs and matrices.

Basic steps in the creation of the QFD include:

  1. Identify customer needs and wants (collect VOC).
  2. Identify the engineering characteristics of products or services that meet VOC.
  3. Set development targets and test methods for the products or services.

Once again, the QFD helps transform VOC into engineering characteristics (and appropriate test methods) for a product or service, prioritizing each product or service characteristic while simultaneously setting development targets for the product or service, all of which are necessary in defining CTQs.
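
The core arithmetic of the house of quality is simple: weight each relationship between a customer need and an engineering characteristic by the need's importance, then sum down each characteristic. Here is a minimal sketch of that prioritization step, with hypothetical needs, characteristics and ratings that are not taken from the article.

```python
import numpy as np

# Hypothetical customer needs and their importance ratings (1-5).
customer_needs = ["clothes ready quickly", "no damage", "accurate billing"]
importance = np.array([5, 4, 3])

# Hypothetical engineering characteristics and relationship strengths:
# 9 = strong, 3 = moderate, 1 = weak, 0 = none (rows follow customer_needs).
characteristics = ["process time", "inspection steps", "billing error rate"]
relationship = np.array([
    [9, 1, 0],   # clothes ready quickly
    [1, 9, 0],   # no damage
    [0, 3, 9],   # accurate billing
])

# Priority of each characteristic = sum of (importance x relationship strength).
priority = importance @ relationship
for name, score in sorted(zip(characteristics, priority), key=lambda p: -p[1]):
    print(f"{name}: {score}")
```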

One of the biggest advantages of QFD is that the process requires groups of cross-functional representatives to work together to understand customer expectations in a way that focuses on customer requirements by using and strengthening functional teamwork. It provides flexible and easy-to-assimilate documentation and uses competitive positioning and marketing potential to prioritize design goals. Finally, it translates soft customer requirements into measurable goals.

Benefits experienced when using QFD include a reduction in design time, a reduction in design changes and a reduction in start-up costs.

Lessons Learned When Using a QFD

QFD is more of an art than a science. The big benefit comes from the discussion the process generates. Practitioners might be surprised to find that even with the simplest process, a QFD requires a lot of effort. Many entries may look obvious, even after they are written down; however, if there are no “tough spots,” it probably is not being done right. Practitioners must always focus on the end customer and remember that “charts” are not the objective. Most importantly, QFD is a valuable decision support tool; it is not a decision maker.

QFD is an organizing tool – the bulk of the effort lies in gathering the inputs to the house of quality. The QFD should be performed via a cross-functional team and communicated to all involved in the design. Although QFD takes time, it will ultimately save time spent reworking “defective” designs and assist in balancing time commitment with benefits.

Mitigating Potential Impacts

How does the inability to meet major CTQs in the design, or the failure to consider a CTQ at all, impact the customer or the company? Potential customer impacts include an increase in product or service variability, non-functional products or services, delays in delivery time, higher cost of the product or service, and a decrease in value to the customer.

Potential internal impacts include increased rework and costs; loss of profit margin, customers (or repeat customers) and growth opportunities from not keeping up with the competition; and barriers to entry in other markets. Therefore, defining CTQ requirements should be at the top of a project’s priorities.

Revamping Healthcare Using DMAIC And DFSS

By Carolyn Pexton, Bradley Schultz and Richard Stahl

Evidence pointing to the applicability of Six Sigma and related best practices within healthcare has been steadily mounting over the past few years. Based primarily on implementation of the DMAIC process (Define, Measure, Analyze, Improve and Control), we are seeing an ever-widening array of documented and publicized results, from improving turnaround time for patients to receive radiology exam reports, to reducing medication errors and infection rates, and even to literally changing the cultural fabric, or DNA, of an entire organization. But although interest and implementation are growing rapidly, current case studies represent the tip of the iceberg in terms of improving the system as a whole.

If you were to extrapolate the above-mentioned results and extend the DMAIC approach beyond those institutions currently pioneering the way, the impact would be measurable and impressive. It would not, however, represent a total solution and would not completely close the infamous quality chasm that continues to plague the industry. What’s missing? Is there another route we can take?

Profile of the DMAIC Approach in Healthcare

Once they’ve had a chance to see it in action, clinicians have generally embraced the DMAIC approach, since it builds on familiar concepts while adding a level of scientific rigor and sustainability lacking in other initiatives. As one practitioner put it, ‘being able to clearly Define, Measure, Analyze, Improve and Control ANYTHING in the healthcare environment represents a big leap forward.’

DMAIC has been an effective method for improving any process that has measurable response variables, which in healthcare may be classified within four primary groups:

As a technical strategy for process and quality improvement (and particularly when coupled with strong change management tools like change acceleration process [CAP] and Work-out), the DMAIC approach has been successful in driving a wide range of sustainable results, including:

  • Medical error reduction and patient safety improvement
  • Cost management and revenue enhancement
  • Improvements in patient, physician and employee satisfaction
  • Increase in capacity and throughput
  • Improvements in supply chain management
  • Reduction in cycle time for radiology reports
  • Reduction in patient waiting times in ED
  • Identification of market growth opportunities
  • Development of internal leadership capabilities
  • Streamlining and optimizing technologies and related workflow
  • Achieving compliance and meeting regulatory requirements

Figure 1: Formula for Effective Results

Aligning Six Sigma healthcare projects with the fundamental objectives of the organization is one of the keys to success, and all projects must show a clear business case in order to gain the required allocation of time and resources.

A common application of DMAIC is shown below, illustrating cycle time for reporting radiology results:

Figure 2: Capability Analysis
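
For readers who want to see the arithmetic behind a capability analysis like the one in Figure 2, here is a minimal sketch with hypothetical cycle-time data and an assumed upper specification limit; it is not the data behind the figure.

```python
import numpy as np
from scipy.stats import norm

# Hypothetical cycle times (hours) for radiology report turnaround.
rng = np.random.default_rng(1)
cycle_hours = rng.normal(loc=30.0, scale=8.0, size=500)
usl = 48.0  # assumed upper specification limit, in hours

mean, std = cycle_hours.mean(), cycle_hours.std(ddof=1)
cpu = (usl - mean) / (3 * std)            # one-sided capability index (Cpk here)
dpmo = norm.sf((usl - mean) / std) * 1e6  # expected defects per million, if normal

print(f"Cpk (upper) = {cpu:.2f}, estimated DPMO = {dpmo:.0f}")
```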

The DMAIC approach has worked quite well for healthcare processes that have both measurable response variables and causal factors that are primarily controllable. However, since healthcare involves human behavior and a great deal of interaction between people, processes and technology, we often face a situation where critical issues are being driven by uncontrollable factors and require intervention at the design level.

Designing Healthcare for Six Sigma Excellence

Progress applying Six Sigma in healthcare has been steady and significant. The next wave of change, however, will likely go beyond DMAIC to involve the creation of new processes not bound by investments in archaic technologies, outmoded policies and procedures and other encumbrances inherent in the system.

To achieve the level of change described in the Institute of Medicine’s report, Crossing the Quality Chasm, and break through the 5 sigma “wall” that Mikel Harry referred to in Six Sigma, The Breakthrough Management Strategy Revolutionizing the World’s Top Corporations, healthcare organizations will need to do more than simply improve upon the current system. They will need to employ the DFSS or DMADV (Define, Measure, Analyze, Design and Verify) approach and build entirely new processes from the ground up. The design process gathers customer requirements and translates them into process specifications, then into system design requirements and finally into subsystem and process design requirements. As with DMAIC, DFSS involves a structured, five-phased approach and the application of rigorous statistical tools and techniques.

DFSS has been used by many industries for a myriad of purposes – to design new medical equipment, develop superior dishwashers, create better systems for customer relationship management and even to launch new businesses from concept to completion.

Since studies have shown that 80 percent of quality issues are linked to the design of a product or process, DFSS addresses this problem head-on and delivers long-term cost avoidance and customer satisfaction by ensuring the design clearly meets customer specifications and has been rigorously tested against possible defects.

Knowing When to Apply DMAIC or DFSS

As mentioned earlier, a typical DMAIC project will have measurable response variables, controllable factors and clear linkage with the overall business objectives. Some projects may obviously lend themselves to application of either DMAIC or DFSS. Sometimes, however, a Six Sigma team may be halfway through a DMAIC project only to discover that it should really be a DFSS project in order to meet the true objectives and satisfy customer requirements. At this point, the team will need to go back to the drawing board and take the project through each of the five phases of the DMADV process.

It’s important to consider the following questions in order to determine at the outset whether a particular project requires application of the DFSS approach:

    • Is this a situation (perhaps involving new technology) where there is no existing process to build upon?
    • For an existing process, to what extent does it meet customer expectations? Have DMAIC improvements been tried? With what measure of success?
    • Does it require decreased variability alone or a radical shift in the mean?
    • Does your organization have the flexibility to either continue or abandon legacy systems linked with this process?
    • What new developments are planned that may affect the project? (e.g., a new clinical service line or facility renovation)

When an existing process is simply broken beyond repair or the “fix” is precluded by bureaucratic entanglement, DFSS may be a better path to follow. Healthcare organizations may also find it beneficial and refreshing to use DFSS as a chance to take ownership and design new systems that clearly meet customer expectations, instead of copying and perpetuating old processes.

One of the differences between DMAIC and DMADV can be found in the first two phases. The Define and Measure phases of a DMADV project may be summarized as a process of CTQ (critical to quality) flow-down. The Analyze phase can be summarized as a process of capability flow-up. In DMAIC, an understanding of causal factors on a specific process outcome is calculated mathematically, while in DMADV, a specific process may not even exist.

Figure 3: DFSS Process

When new processes, systems and structures are involved, the capability may be projected or forecasted using modeling. In healthcare, the models most relevant to a new service line are those involving capacity, patient queuing, provider resource allocation, and patient routing.
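
As one illustration of the capacity and patient-queuing models mentioned above, here is a minimal sketch of the Erlang C formula for expected waiting time in an M/M/c queue. The arrival rate, service rate and number of providers are assumptions chosen only for the example.

```python
from math import factorial

def erlang_c_wait(arrival_rate: float, service_rate: float, servers: int) -> float:
    """Expected time in queue (same time units as the rates) for an M/M/c queue."""
    a = arrival_rate / service_rate  # offered load
    rho = a / servers                # utilization; must be below 1 for stability
    if rho >= 1:
        raise ValueError("System is unstable: utilization >= 1")
    summation = sum(a**k / factorial(k) for k in range(servers))
    prob_wait = (a**servers / factorial(servers)) / (
        (1 - rho) * summation + a**servers / factorial(servers)
    )
    return prob_wait / (servers * service_rate - arrival_rate)

# Assumed example: 10 patients/hour arriving, 3 patients/hour per provider, 4 providers.
print(f"Expected wait: {erlang_c_wait(10, 3, 4) * 60:.1f} minutes")
```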

For a DMADV project around a new healthcare service line, the outcome of the Analyze phase will enable the team to:

    1. Translate customer needs into specific service line features, service delivery system and service sub-system/process design specification
    2. Match needs and requirements against a mathematical expression of existing or forecasted process capabilities

During the Design phase, an optimal design is selected and implemented based on merging the CTQ flow-down and the capability flow-up into one integrated design scorecard. Capability forecasting and analysis provides insight into how well design requirements will be met and the QFD (quality function deployment) translates this into customer satisfaction. The result is a formula for understanding customer impact associated with specific design alternatives and trade-offs.

Finally, in the Verify phase, the actual performance from a sub-system is measured against predicted performance by confirming customer satisfaction. Within manufacturing, this is done through component, sub-system, and system level testing. In healthcare, however, the opportunity to “test” segments of the service line may not exist. The key then becomes understanding the degree to which proper controls are operationalized to consistently yield predictable results.

During the Verify phase the team also has the opportunity to rethink existing systems and processes. If a hospital is planning to launch a new imaging center, for example, they may find that the existing patient registration process is not ideal for meeting customer expectations. Redesign of this process should prompt a customer-focused reevaluation of patient registration across the entire institution.

Conclusion

Properly implemented with leadership support and the use of change management techniques to address cultural barriers and build acceptance, Six Sigma has achieved measurable success. It has been tested and proven at the individual department level, within small, rural hospitals, in freestanding imaging centers, throughout large teaching facilities and across multi-hospital systems.

DMAIC and DFSS each have their place within any organization striving for excellence. There are advantages to both applications of Six Sigma within healthcare, and organizations should carefully assess their specific needs and preparedness for either targeted or systemic change.

As they continue to face serious challenges in the years ahead, including regulatory compliance, disaster readiness, shifting demographics and never-ending resource constraints, healthcare providers will increasingly seek solutions that produce verifiable results. But we should paint this picture with the brushstroke of reality: even with DMAIC and DFSS on their side, the tough issues will not be easily or immediately resolved. In either form, Six Sigma is not a magic potion. Substantive change will require leadership commitment, grassroots support and dedicated effort toward designing a healthcare system that is efficient, cost-effective and 99.99966 percent error-free.

Using the Five W’s and One H Approach to Six Sigma

By Pradeep Kumar Mahalik

5W1H (who, what, where, when, why, how) is a method of asking questions about a process or a problem taken up for improvement. Four of the W’s (who, what, where, when) and the one H are used to gather details, analyze inferences and judgments, and get to the fundamental facts behind a problem. The last W (why) is often asked five times so that one can drill down to the core of a problem.

The 5W1H of Six Sigma describes the approach to be followed in understanding and analyzing a process, project or problem targeted for improvement. Here is an outline of the 5W1H approach for Six Sigma.

What Is Six Sigma –– a Concept

The “what” of Six Sigma is the concept itself. Six Sigma is a level of quality applied to the variation in any process. Sigma, the Greek letter “σ,” is the symbol used in statistics for standard deviation, a measure of variation in a distribution of values. A Six Sigma level of quality equates to 3.4 defects per million opportunities, providing a stretch goal of near perfection in business products or services.
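
For readers who want to see where the 3.4 figure comes from, here is a minimal sketch of the conventional conversion from sigma level to defects per million opportunities, which applies the customary 1.5-sigma long-term shift.

```python
from scipy.stats import norm

def dpmo(sigma_level: float, shift: float = 1.5) -> float:
    """Defects per million opportunities for a given short-term sigma level."""
    return norm.sf(sigma_level - shift) * 1_000_000

for level in (3, 4, 5, 6):
    print(f"{level} sigma -> {dpmo(level):,.1f} DPMO")
# 6 sigma -> roughly 3.4 DPMO
```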

In the Six Sigma methodology, anything that dissatisfies the customer is a defect, and so understanding the customer and customer requirements is the most important issue in establishing a Six Sigma culture. Six Sigma is a problem-solving management methodology that can be applied to any type of business process to identify and eliminate the root causes of defects, ultimately improving the key business processes and saving cost for the organization. In this regard, the main goal of Six Sigma is that any quality improvements in an organization need to be economically viable.

Six Sigma, as a management philosophy, allows an organization to apply a disciplined, data-driven approach that continuously improves business process performance by reducing the variability in each business process. Six Sigma creates a culture in an organization aimed at learning to build processes that deliver business output with flawless quality. Six Sigma also focuses on measuring and controlling the variation at each stage of the business process. That sometimes creates the mistaken notion that Six Sigma is merely a set of statistical tools and a strategy for their use. In reality, Six Sigma blends the wisdom of an organization with a methodology and an extensive toolkit to improve both the efficiency and effectiveness of the organization in meeting its customer requirements.

Why Six Sigma –– an Objective

The “why” of Six Sigma is its objective. The goal of Six Sigma is to help people improve business processes so they deliver defect-free products and services. Because Six Sigma requires practitioners to consider both the “voice of the customer” and the “voice of the process,” it reduces the gap between the two voices. That leads to more satisfied customers, and that is what makes the Six Sigma initiative a profitable business proposition. Not only does the Six Sigma organization save costs, but it also has a great opportunity to increase sales.

Six Sigma focuses on long-term, sustained success for every improvement project, improving each and every process in the organization. That gives organizations a way to continue improving year after year and even provides a system that rewards “out of the box” thinking, which can accelerate the rate of improvement.

Since it is a data-driven approach to problem solving, Six Sigma builds robustness into daily management. This starts a chain reaction of strategic, tactical and operational improvements, which compels the organization to set stretch targets for every measure of business performance and goals for everyone in the organization.

Figure 1: 5W1H Interrelationships

Who Is Involved in Six Sigma –– People Involvement

Six Sigma eventually involves everyone in the organization, from top management to the operator/staff level. It requires a companywide understanding of the processes, a commitment toward achieving the set goals and an involvement in the projects that accompany those goals. Since Six Sigma addresses the total business process, it involves everyone in the organization.

The top levels of management appoint Sponsors, members of the leadership team who are responsible for selecting Six Sigma projects and are ultimately accountable for project results. Just under the Sponsors are Champions, who typically have day-to-day responsibility for the business process being improved; their role is to ensure the Six Sigma project team has the resources required to successfully execute the project. Next are the Master Black Belts, who teach and mentor the Black Belts, who in turn have been trained to manage Six Sigma projects and serve as leaders of project teams consisting of Green Belts and other employees. Green Belts are core project team members and may even serve as team leaders on smaller projects. The other members of project teams are regular employees.

In well-functioning Six Sigma deployments, everyone in the organization is involved in reducing defects, reducing cycle times and increasing customer satisfaction.

Where to Apply Six Sigma –– Business Process Location

Six Sigma is applied to all business processes. To start with, it can be applied to key business processes which have the highest visible impact on the customers and shareholders. All business processes impacting customer satisfaction and profit growth of the organization need to undergo Six Sigma methodology implementation.

Figure 2: Toward Key Business Processes

When to Apply Six Sigma

As long as an organization has a strong desire to improve business performance by identifying each and every key business process for improvement, the starting point of Six Sigma does not matter. Organizations can implement Six Sigma:

  • When they find out that the customer satisfaction level is eroding.
  • When they want to retain a leadership position through quality in the market.
  • When there is a clear indication of losing market share due to quality.
  • When their processes have not changed for a long time.
  • When the quality of a product is largely dependent on human inspection skills instead of being built-in to the production process.
  • When they think their processes have all reached an improvement plateau.
  • When they are required to improve performance in all areas of their business process.
  • When they decide they want to survive and grow in today’s competitive market.

How to Apply Six Sigma –– Methodology Followed

Depending upon the requirements and the type of the organization, different strategies are followed for Six Sigma implementation.

The three main strategies followed in Six Sigma are:

  • Process Management: An ongoing cross-functional ownership and measurement of core support processes.
  • Process Improvement: Focused on problem solving, aimed at eliminating the vital few root causes. It is most common to use the DMAIC roadmap:
    • Define – Select customer’s critical-to-quality characteristics (CTQs). Define the process improvement goals that are consistent with customer demands and enterprise strategy.
    • Measure – Create a measurement system and validate the system. Measure the current process and collect relevant data for future comparison.
    • Analyze – Identify the sources of variation from the performance objectives. Analyze to verify relationship and causality of factors. Determine what the relationship is and attempt to ensure that all factors have been considered using one or more of the tools in the Six Sigma toolkit.
    • Improve – Discover process relationships and establish new and improved procedures. Improve or optimize the process based upon the analysis.
    • Control – Sustain the gain by implementing process controls. Control to ensure that any variances are corrected before they result in defects. Set up pilot runs to establish process capability, transition to production and thereafter continuously measure the process and institute control mechanisms.
  • Process Design/Redesign: The creation of a new process to achieve exponential improvement and/or meet the changing demands of customers, technology and competition. It must handle totally dysfunctional processes and reengineer them. It is also known as Design for Six Sigma (DFSS). DMADV is the most common roadmap followed for DFSS:
    • Define – Define the goals of the design activity that are consistent with customer demands and enterprise strategy.
    • Measure – Measure and identify CTQs, product capabilities, production process capability and risk assessments.
    • Analyze – Analyze to develop and design alternatives, create high-level design and evaluate design capability to select the best design.
    • Design – Design details, optimize the design, and plan for design verification. This phase may require simulations.
    • Verify – Verify the design, set up pilot runs, implement production process and handover to process owners.

The structure of the 5W1H of Six Sigma helps organizations consider all aspects of a Six Sigma situation in detail, and hence can be applied when analyzing a business process for improvement opportunities. The interrelationships also guide Six Sigma practitioners to take systematic, error-free steps to complete the Six Sigma project successfully, on time and every time.

Why Hire a Six Sigma Consultant?

By iSixSigma-Editorial

“Changing what we have the power to change, accepting the things we cannot change, and becoming wise enough to know the difference between the two.” It is a very old saying, and one that many people aspire to. But when it comes to implementing quality within your business, many feel they have or should have the power to effect change themselves. Sometimes, however, an outsider might be best to help lead the change. Here are a few reasons why:

  • Sometimes a Six Sigma consultant is better than we are.
    I know, it is not so. But just sometimes there are Six Sigma consultants who actually do know more than we do on a certain subject. Why not think about outsourcing, which will allow us to focus our energies on more important tasks? Do not worry, you will still get the credit. Just make sure you hire Six Sigma consultants with the proper credentials.
  • Consultants are more skilled at explanations.
    Let’s face it. If you come from an engineering or statistics background, you have strong analytical and problem-solving skills. In order to earn their living, Six Sigma consultants need strong verbal and written communication skills. Since their core competencies are honed, they might be best to help with the implementation.
  • Company employees may be biased.
    Implementing quality within your organization is not about writing and distributing a manual. It is a new way of doing work with a new set of tools – and it’s going to have profound implications on your compensation and organizational structure. Even though you may be calling ‘straight’ shots, others may feel you have ulterior motives. An outside Six Sigma consultant, however, has nothing to win or lose by their recommendations.
  • Six Sigma consultants are easy to blame.
    Do you have to reorganize? Is someone going to lose power when you implement Six Sigma and process management? Decisions can lead to fallout within your organization, and sometimes it is easier to blame the Six Sigma consultant than to have to take sides within the company. And if the implementation does not work – well, that is the consultant’s fault also!

When it comes down to it, Six Sigma consultants have typically seen and implemented quality within many organizations and because of it are granted more influence in the corporate world than internal staff. We could deny it and get upset about it, but is it worth it? Instead, maybe we would be better served by changing what we can, accepting that which we cannot, and knowing the difference between the two.

Note: The author is not a consultant and this article is not an endorsement for consultants.

Safeguard Against 4 Forms of Six Sigma Resistance

By J. DeLayne Stroud


Over the years, I have had numerous inquiries from readers about overcoming opposition to Six Sigma. Interestingly enough, I have also experienced this opposition when speaking with prospective or existing clients, some of whom have asked me not to mention Six Sigma but to speak to tools in a generic fashion. Obviously, opposition to the method is a common deployment obstacle.

Practitioners typically encounter four types of resistance to Six Sigma: technical, political, organizational and personal. In order to resolve this negative force, they must classify the type of opposition they are encountering, understand its root cause and then adjust their deployment strategies accordingly.

History of the Push Back

In the late 1990s and early 2000s there was a growing consensus that Six Sigma was not a “flavor of the month” process improvement methodology. It had crossed over from manufacturing to service industries, including the financial and healthcare industries. Adoption was relatively high, and opposition low. Companies were delighted with the methodology’s accomplishments and sustainability. More recently, however, stakeholders have become more cautious about using Lean and Six Sigma terminology, tools and methodology.

While the terminology may be intimidating, Lean and Six Sigma tools have brought much success to corporations. Therefore, practitioners must work to overcome this fear. To do this, they must investigate the root cause of the opposition.

Change Should Not Equal Loss

The root cause of opposition can be seen in an exercise featured in the article “The Change Game: Engaging Exercises to Teach Change.” To play, participants are asked to change things about their physical appearance. Surprisingly, people often begin by taking off pieces of jewelry or clothing. During the post-game debrief, participants note a strong tendency to think of change as a loss – they must lose something in order to change.

If change is somehow equated with loss, how can practitioners expect any Lean Six Sigma program to be successful? The answer is to identify, motivate and mobilize their teams in order to increase commitment and eliminate the fear of loss. This can be accomplished through a stakeholder analysis.

Completing a Stakeholder Analysis

Stakeholders control critical resources or own key processes impacted by change. They have needed expertise. They influence how other critical stakeholders think. They can block projects – directly or indirectly – and must approve certain aspects of those projects. It is important to identify the key stakeholders – those with leverage or influence over other stakeholders. Once identified, practitioners should document each stakeholder’s level of support for, or opposition to, the quality initiative.

There are five possible levels of stakeholder support:

  • Strong Support – Advocates who actively work to make things happen.
  • Moderate Support – Those who help with the Six Sigma initiative when asked; they do what they are asked to do and nothing more.
  • Neutral – Those who merely let things happen; they are not proactive advocates of the initiative, nor are they out to sabotage it.
  • Moderately Against – Those who will not comply with what is asked of them.
  • Strongly Against – Those who not only refuse to comply with the efforts underway, but also go out of their way to lobby against the change initiative.

Table 1 shows one of many examples of a stakeholder analysis. The five categories of support are listed across the top of the chart, while stakeholders are listed down the left-hand side. An “O” represents the stakeholder’s current support for the initiative, and an “X” represents where the stakeholder’s support needs to progress to in order to successfully complete the initiative.

Table 1: Stakeholder Analysis

Practitioners do not always need strong support from every stakeholder; however, all stakeholders need to be aware of the change because neutral or moderate support may indicate that they do not know of the initiative. Practitioners should document the plan or actions required to bring the respective stakeholders up to the required level of support (Table 2).

Table 2: Stakeholder Analysis with Action Plans
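
To make the gap analysis concrete, here is a minimal sketch of the same idea in Python. The stakeholder names and levels below are hypothetical; the script simply tabulates the distance between each stakeholder’s current (“O”) and required (“X”) level of support.

```python
# Minimal sketch of a stakeholder analysis table (hypothetical names and levels).
# "current" corresponds to the "O" on the chart; "required" corresponds to the "X".

SUPPORT_LEVELS = [
    "Strongly Against",
    "Moderately Against",
    "Neutral",
    "Moderate Support",
    "Strong Support",
]

stakeholders = {
    "VP of Operations":    {"current": "Neutral",            "required": "Strong Support"},
    "Finance Director":    {"current": "Moderately Against", "required": "Moderate Support"},
    "Call Center Manager": {"current": "Moderate Support",   "required": "Moderate Support"},
}

for name, levels in stakeholders.items():
    # How many levels of support the stakeholder still needs to move up.
    gap = SUPPORT_LEVELS.index(levels["required"]) - SUPPORT_LEVELS.index(levels["current"])
    if gap > 0:
        print(f"{name}: move from {levels['current']} to {levels['required']} "
              f"({gap} level(s)) - document an action plan.")
    else:
        print(f"{name}: already at the required level of support.")
```

In practice, the value of the exercise is less the arithmetic than the conversation it forces about each gap and the action plan that will close it.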

Types of Opposition

In most cases, none of the stakeholders will be where practitioners need them to be to ensure the success of the initiative. Many may be neutral because they have not heard of the initiative. Because of these factors, practitioners likely will encounter at least one of the four forms of opposition.

Technical Opposition

While I’m really good with PowerPoint animations and can maneuver around a database, I am not a wizard with the computer. Most of my technical skills are either self-taught or picked up by observing the best practices of my peers. While I want to be an expert at everything, that is not always possible. So why do I occasionally express opposition to technical aspects? Because I do not want to feel inept. That which is not understood is typically resisted. Although I pride myself on my accomplishments and deliverables, the computer at times makes me feel inadequate.

This scenario applies to participants implementing Lean Six Sigma initiatives. The methodology is commonly associated with statistics, which can make people feel inept. Many Master Black Belts are trained statisticians who rely more on theory than application.

How to Overcome: In the case of technical opposition, it is vital to eliminate theory and concentrate on training the basics. Make concepts easy to understand by providing real-life examples. Encourage participant confidence by building on their own examples and experiences.

Political Opposition

Politics are present within most organizations, and they need to be understood and dealt with accordingly. A former client had a senior manager participate as a stakeholder. This senior manager was extremely intelligent and had a proven track record for her style and approach to getting things done. Unfortunately, she was vocally opposed to the Lean Six Sigma deployment from Day 1 and said she would do her best to get the program cancelled and eliminate the need for my team.

It was extremely difficult not to take this as a personal attack. But I understood this as merely political opposition. Political opposition exists when a change is seen as a threat to the status quo. This individual was comfortable using concepts that she believed in and now someone had come in and expected her to do things differently – perhaps affecting her success.

How to Overcome: Political opposition can stem from real or perceived loss on the part of an individual. Practitioners should distinguish between what is real and what is perceived, and then work quickly to explain what can (and in most cases, will) be gained, versus lost, from the initiative.

Organizational Opposition

Like political opposition, organizational opposition involves feelings of loss. The primary difference is that the loss is attributed to a group or department rather than to a person. Everyone wants to succeed and be recognized; however, when the recognition stems from factors outside the organization – like a never-before-attempted methodology – egos can become bruised. A Lean Six Sigma initiative can give process owners the perception that they are no longer in the driver’s seat, making them feel unable to manage their own business.

How to Overcome: When dealing with organizational opposition, recognize that it involves ego, a sense of ownership, control and pride. Extreme care should be given to ensure the involvement of process owners so that they are fully engaged and feel a greater sense of control in the improvement process.

Personal Opposition

I worked with a client who seemed in full support of Lean Six Sigma. She said all of the right things and did not present a negative viewpoint. However, the initiative timeline had fallen far behind, and a six-month project was nearing nine months. When I asked what she thought the holdups were, what barriers she was facing and how I could assist with their elimination, she described some personal, non-work-related challenges and situations. The root cause of some opposition may be extreme personal stress and fatigue caused by factors outside work.

How to Overcome: Be extremely empathetic with someone who exhibits personal opposition. As coaches, practitioners often find themselves acting as a sounding board for topics unrelated to Lean and Six Sigma. Practitioners should modify their behavior toward these individuals: lessen their workload but increase involvement with them. Everyone benefits from a little compassion, patience and understanding.

Five Guidelines You Need to Follow to Create an Effective Value Stream Map

03 December, 2018

Value stream mapping is a cornerstone of the Lean process improvement methodology, and it is also a recognized tool in Six Sigma. A value stream map illustrates the flow of materials and information as a product or service moves through a process. Creating a “current state” value stream map helps you identify waste and makes it easier to envision an improved future state for the process.

You can use value stream mapping to improve any process. But unless you use the tool properly, your value stream map may not capture all of the opportunities you have to improve quality and efficiency.

Here are five guidelines to getting the most benefit from the energy you put into value stream mapping:

  1. Base the Value Stream Map process on customer requirements.
    You must understand what the customer values, and use that as your starting point. If you don’t, you risk, in the words of my favorite band The Fall, paying “the highest attention to the wrong detail.”
  2. Capture the process as it operates now, not how it’s supposed to operate.
    A process that worked well when you had 20 employees may not perform as efficiently now that the business is a 200-person company. Be sure you map the process as it happens now, not the way it used to work—or how you wish it worked!
  3. Assign a Value Stream Map manager to lead the mapping effort.
    Input from team members and stakeholders is important, but appoint (or elect) one team member to draw the entire value stream map. This ensures that the manager understands the material and information flows.
  4. Walk through the process to ensure that the flow of materials and information is accurate.
    Make sure your map reflects the reality of the process—verifying this by following the process from start to finish can reveal crucial details you might have missed.
  5. Focus on one small step at a time. Make sure you capture each step accurately. For example, don’t trust the clock on the wall to measure cycle times—use a stopwatch.

Creating a value stream map of the current state of your process helps you focus on areas of waste such as excess inventory, non-value-added time, and multiple operators. As you envision the future state of your process, you can vary data on the current state map to explore the effects of possible improvements.
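
As a rough illustration of that “what if” exercise, here is a minimal Python sketch. The process steps, cycle times and wait times are hypothetical; the script computes total lead time, value-added time and process cycle efficiency for a mapped current state.

```python
# Minimal sketch: summarizing a current-state value stream map (hypothetical steps and times).
# Each step records its cycle time and the wait time before it (in minutes),
# plus whether the customer would consider the step value-added.

steps = [
    {"name": "Receive order", "cycle_time": 5,  "wait_before": 0,   "value_added": True},
    {"name": "Credit check",  "cycle_time": 10, "wait_before": 240, "value_added": False},
    {"name": "Pick and pack", "cycle_time": 20, "wait_before": 60,  "value_added": True},
    {"name": "Ship",          "cycle_time": 15, "wait_before": 480, "value_added": True},
]

# Lead time includes both working time and waiting time.
lead_time = sum(s["cycle_time"] + s["wait_before"] for s in steps)
value_added_time = sum(s["cycle_time"] for s in steps if s["value_added"])

# Process cycle efficiency: the share of total lead time that actually adds value.
pce = value_added_time / lead_time

print(f"Total lead time:          {lead_time} min")
print(f"Value-added time:         {value_added_time} min")
print(f"Process cycle efficiency: {pce:.1%}")
```

Rerunning the same calculation with reduced wait times or fewer handoffs gives a quick estimate of how a proposed future state would change the process cycle efficiency.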

While it’s possible to do value stream mapping on paper, software tools like those in Companion by Minitab make value stream mapping a lot easier.

Let Me Count the Ways: Brainstorming Why I Love Dogs with a Fishbone Diagram

Shelby Anderson, 14 February, 2020
Here are some of the cute dogs on our Minitab team; pictured above, from left to right, are Archer, Jake, Archie, Reed, Petey and Rookie. Pictured below are Birdie and King.

It’s Valentine’s Day! Whether you’re single, in a relationship, married, or something in between, Valentine’s Day is a day to celebrate and share messages of affection and love with partners, family, friends and pets. And with average spending on Valentine’s Day expected to rise to $196.31 per person, sometimes the best ways to celebrate, in my opinion, are free.

One thing I’ve realized about myself over the years? Like countless others, I have a big soft spot in my heart for dogs. Big, small, fat, skinny, active, lazy, furry — I love them all. With over 900 million dogs around the world, there are plenty of dogs to go around, so this Valentine’s Day I decided to celebrate by taking a deeper look and figuring out all the reasons why I love dogs.

Where to Begin?

I’m a visual person, so I decided I would need something graphical but still organized to truly get to the bottom of this. Because I needed to brainstorm and list all the causes of why I love dogs, I realized a Fishbone (Ishikawa) diagram would be one of the best tools to use: it lists all the causes while clearly capturing the associations and relationships between them and the effect.

Now that the tool was picked, I had to decide how to create my Fishbone. Paper and pencil would work, but they could get messy quickly, so naturally it hit me that I should use Companion by Minitab.

One Paw at a Time

With Companion open, I began creating my Fishbone diagram by selecting ‘Fishbone‘ in the ‘Brainstorming Tools‘ section under the ‘Insert‘ tab.

As with any proper Fishbone, the first step is to add the effect, which in this case was my love of dogs. Then I began to focus on affinities, also known as the primary categories of causes, to help me group the causes appropriately. I ended up with four affinities: Appearance, Personality, Offers and Actions.

Valentines Day Fishbone on Dogs - Just Affinities

Next, I started to brainstorm and come up with specific reasons why I love dogs. With each idea, I categorized and added it under the best-fitting affinity. I put a timer on as well to ensure I was spending my time wisely and listing the most important causes. Once the timer was up, I paused and took a full look at my Fishbone Diagram.
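
Companion by Minitab handles all of this graphically, but the underlying structure is simple enough to sketch as plain data. Here is a minimal Python sketch; the affinities match the four above, while the individual causes are hypothetical examples rather than my actual diagram.

```python
# Minimal sketch of a fishbone (Ishikawa) diagram as plain data (hypothetical causes).
# The effect sits at the "head"; each affinity is a main bone with its causes attached.

fishbone = {
    "effect": "Why I love dogs",
    "affinities": {
        "Appearance":  ["Fluffy coats", "Expressive eyes"],
        "Personality": ["Always excited to see you", "Loyal"],
        "Offers":      ["Companionship", "A reason to go outside"],
        "Actions":     ["Tail wags", "Learns tricks"],
    },
}

# Print a simple text outline of the diagram.
print(f"Effect: {fishbone['effect']}")
for affinity, causes in fishbone["affinities"].items():
    print(f"  {affinity}:")
    for cause in causes:
        print(f"    - {cause}")
```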

Final Fluffy Results

Valentines Day Fishbone on Dogs

As you can see in my final Fishbone diagram above, I had 26 reasons why I love dogs. Thanks to Companion by Minitab, it was easy to create and visualize all of them in a flash.

See anything you would have added to your Fishbone Diagram on why you like dogs? Or maybe you’re more of a fan of cats!
