Sunday, August 17, 2025

Ignoring the Voice of the Customer

The voice of the customer (VOC) is the driving force behind Quality Function Deployment (QFD), and it sets the direction for a company to improve its products, satisfy its customers, and respond to the competition (Goetsch & Davis, 2021, p. 290). There are many ways this can go wrong. For example, the individuals comprising the VOC may be in conflict - which is addressed by Xiao & Wang (2024). Another way is that the VOC is misrepresented - this is described in another post on this blog. A third way this can go wrong is when a company completely ignores the VOC, when the vox populi is not the vox Dei.

For this post I was going to write about the process by which the U.S. military decided to use the High Mobility Multipurpose Wheeled Vehicle (HMMWV) in Iraq. Insurgents soon learned that the HMMWV and other vehicles were vulnerable from below to Improvised Explosive Devices (IEDs). U.S. troops modified the HMMWVs and other vehicles to improve survivability by adding sandbags to the floor (Haji sandbags) and welding scrap metal to the bottom. We don’t know whether the QFD process was used in choosing the HMMWV, but the VOC certainly did not include input from anyone with experience in asymmetric warfare, where attacking from unexpected directions is a rudimentary and fundamental tactic.

Beyond that one observation, for now, I have nothing more to write on that subject, but there are numerous other examples where the VOC is minimized or misinterpreted. I do have something to write about a more recent example where the VOC was completely ignored...

Consider the 2024 “Copy Nothing” advertising campaign for Jaguar Cars (Jaguar, 2024). This campaign was launched on 19 November 2024 to announce their conversion to an all-electric brand. This conversion was not mentioned in the ad itself, nor was the fact that the ad was even for an automotive manufacturer until the word “Jaguar” appeared at the very end, in a new font and without the stylized image of a leaping jaguar that used to be their logo.

Instead, the advertisement begins with elevator doors opening onto a barren wasteland of a set. Stepping from the elevator are various stunning and brave gender-ambiguous runway models, each feigning purposefulness while really just displaying a mixture of smugness and boredom. Next, there are scenes of the models alongside the phrases “create exuberant,” “live vivid,” “delete ordinary,” “break moulds,” and “copy nothing.” The models then walk out of frame and the name of the brand is finally revealed.

The soundtrack to all this has a heavy beat, which represents the heartbeats of Jaguar stockholders as they experience cardiac arrest upon watching this ad.

Immediately, the advertisement became more popular than the Jaguar car brand itself, and indeed it became an embarrassment to Jaguar. Talk show hosts lampooned it, it was roasted on social media, and people used AI to add a jaguar back into the commercial – with the jaguar attacking the models! (Sunrise Video, 2024)

The traditional customer base of Jaguar consisted of people going for the “James Bond aesthetic.” Even when attempting to attract new customers, existing customers must not be forgotten – they are still customers, and their voices must be part of the VOC. The “Copy Nothing” ad campaign went further: Jaguar not only ignored the traditional base but seemed to reject them. This is verified in an interview with Rawdon Glover, managing director of the automaker, who stated that “We need to re-establish our brand and at a completely different price point so we need to act differently. We wanted to move away from traditional automotive stereotypes” (Brady, 2024).

Jaguar sales dropped 97.5% in Europe following the rebrand (Singh, 2025). The automaker cut 500 management jobs in the United Kingdom, and Adrian Mardell, the CEO of Jaguar’s parent company JLR, will be retiring at the end of this year (Creed, 2025).

It is not clear why some corporations ignore the VOC. Kolarska & Aldrich (1980) note that managers and leaders can become highly unresponsive when a company is in decline, and one of the reasons for this is that formerly loyal customers have switched brands (“exited”). In that case, the VOC itself is harder to interpret because there are fewer customers, so the “smoothing” approach taken in Xiao & Wang (2024) is less effective in arriving at consensus.

Research by Xueming (2007) has shown that customer negative voice in the form of complaint records hurts a company’s stock price and concludes that “investments in reducing consumer negative voice could indeed make financial sense in terms of promoting firm-idiosyncratic stock returns.” This should come as no surprise.

Neither of these reasons – declining customer base or unhappy customer base – is enough to explain why Jaguar approved the “Copy Nothing” ad campaign. Further, the experiences of other brands that “went woke,” such as Bud Light’s 2023 partnership with Dylan Mulvaney, provide direct evidence that nothing good can come from Jaguar’s style of rebranding. One must therefore conclude that their actions were nothing other than corporate suicide.


References

Brady, J. (2024, 23 November). Jaguar boss hits out at 'vile hatred and intolerance' after car fans turned on firm's widely-ridiculed woke rebrand. Daily Mail. https://www.dailymail.co.uk/news/article-14117385/jaguar-boss.html

Creed, S. (2025, 1 August). On the move: Jaguar Land Rover boss behind ‘woke’ pink rebrand to quit after campaign saw carmaker universally panned. The Sun. https://www.thesun.co.uk/motors/36107702/jaguar-land-rover-boss-quits-woke-rebrand-backlash/

Goetsch, D. L. & Davis, S. B. (2021). Quality management for organizational excellence: Introduction to total quality (9th ed.). Pearson.

Jaguar. (2024, 19 November). Jaguar | Copy nothing [Video]. YouTube. https://www.youtube.com/watch?v=rLtFIrqhfng

Kolarska, L. & Aldrich, H. (1980). Exit, voice, and silence: Consumers' and managers' responses to organizational decline. Organization Studies 1(1). https://doi.org/10.1177/017084068000100104

Singh, E. (2025, 3 July). Woke woe: Jaguar sales plummet 97.5% after fierce backlash over woke pink ‘rebrand’ that left fans slamming ‘nonsense’ EV. The Sun. https://www.thesun.co.uk/motors/35669921/jaguar-sales-plummet-woke-pink-backlash/

Sunrise Video. (2024, 30 November). New Jaguar commercial part 2 [Video]. YouTube. https://www.youtube.com/watch?v=9awWSN-h1es

Xiao, J. & Wang, X. (2024). An optimization method for handling incomplete and conflicting opinions in quality function deployment based on consistency and consensus reaching process. Computers and Industrial Engineering 183. https://doi.org/10.1016/j.cie.2023.109779

Xueming, L. (2007). Consumer negative voice and firm-idiosyncratic stock returns. Journal of Marketing 71(3). https://doi.org/10.1509/jmkg.71.3.075

Application of the PDCA Cycle

Introduction

The Plan-Do-Check-Act cycle is an iterative problem-solving technique that can be used to improve the quality of an organization’s products or services and to increase customer satisfaction. This paper begins with a description of this problem-solving technique. Next, it is applied to a typical problem encountered at a fictitious software company. Finally, the paper concludes with a discussion of some modifications to the Plan-Do-Check-Act cycle that increase the speed at which solutions can be found using this process.


Description of the PDCA Cycle

The Plan-Do-Check-Act (PDCA) cycle is a problem-solving method that can be applied to either existent or latent problems (Goetsch & Davis, 2021, p. 272-278). The PDCA cycle begins with the “Plan” step, which presupposes an observation of an undesired behavior, an undesired quality, or an opportunity for improvement; a plan is then created to address it. The “Do” step involves implementing the plan from step 1 on a limited basis. The “Check” step determines the success or failure of the implementation. Finally, the “Act” step – also called the “Adjust” step – involves acting on the results of the “Check” step. If the plan did not work, a brief diagnosis is performed, and the cycle repeats with a new plan based on what was learned from the diagnosis. If the plan was successful, then the plan can be executed on a wider basis, or additional changes can be made, in the next cycle.
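Since this blog is mostly about software, the cycle above can be sketched as a simple control loop. This is only an illustration - the plan, do, and check functions are hypothetical placeholders, not part of any standard PDCA tooling.

```python
# Minimal sketch of the PDCA cycle as a control loop.
# plan/do/check are caller-supplied; they are hypothetical placeholders.

def pdca(plan, do, check, max_iterations=10):
    """Repeat Plan-Do-Check-Act until the check passes."""
    lessons = None  # what was learned from the previous diagnosis
    for _ in range(max_iterations):
        proposal = plan(lessons)      # Plan: create (or revise) a plan
        result = do(proposal)         # Do: implement on a limited basis
        ok, lessons = check(result)   # Check: success? what was learned?
        if ok:
            return proposal           # Act: adopt and roll out more widely
    return None                       # no workable plan found in the budget
```

Each pass through the loop is one turn of the cycle; the `lessons` value carries the brief diagnosis forward into the next “Plan” step.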


PDCA Application

The PDCA cycle will be demonstrated on the following fictitious problem. The software company called Gaggle dot Com makes an online email service called GaggleMail, which is in no way a copy of Google’s Gmail. The PDCA problem-solving method will be used to address customer comments that the user interface (UI) is not very friendly to color-blind users. The names, phone numbers, and email addresses of people experiencing this problem are recorded by customer service representatives and passed on to the team that will be fixing the problem.


Step 1: Plan

To make the UI accessible to color-blind users, the software developers and graphic designers quickly decide to add a toggle for changing the colors of the UI. When the toggle is activated, the screen colors change to make the text readable to color-blind users. A simple web search indicates that there are different types of color blindness (WebAIM, 2021), and that it is sufficient to increase the contrast between text color and background color and to avoid certain color combinations.
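As a sketch of the kind of contrast check the designers might automate, the following implements the WCAG 2.x relative-luminance and contrast-ratio formulas. The function names are my own; WCAG level AA requires a ratio of at least 4.5:1 for normal-size text.

```python
# Sketch: WCAG 2.x contrast-ratio computation for two sRGB colors.

def relative_luminance(rgb):
    """Relative luminance of an sRGB color given as (r, g, b) in 0-255."""
    def channel(c):
        c = c / 255.0
        # Linearize the gamma-encoded channel per the WCAG definition.
        return c / 12.92 if c <= 0.03928 else ((c + 0.055) / 1.055) ** 2.4
    r, g, b = (channel(c) for c in rgb)
    return 0.2126 * r + 0.7152 * g + 0.0722 * b

def contrast_ratio(fg, bg):
    """WCAG contrast ratio between two colors; ranges from 1:1 to 21:1."""
    lighter, darker = sorted(
        (relative_luminance(fg), relative_luminance(bg)), reverse=True
    )
    return (lighter + 0.05) / (darker + 0.05)
```

Black text on a white background gives the maximum ratio of 21:1; candidate color pairs scoring below 4.5:1 would be rejected before reaching the beta testers.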


Step 2: Do

The developers implement the toggle in GaggleMail’s UI. In the process of doing so, the software developers test that the toggle does change text and background colors, and the results can be verified using a tool that simulates the way a color-blind person would see the page (Bureau of Internet Accessibility, Inc., 2022). According to PDCA dogma, this action belongs in the “Check” step, but it makes sense to do it here – it is an easy verification, performed by the individual who is in the position to correct any problems, and it takes almost no time. This check verifies the functionality of the toggle, that it changes text and background colors. But are they the right colors? This is where the next step becomes relevant.


Step 3: Check

As described above, the changes in color are first verified by using a color-blindness simulation tool. Assuming that the new colors work with the simulator, it is time to have the customers who reported the problem check that activating the toggle does indeed make the GaggleMail UI readable to them. This is called “beta testing,” and the customers are called “beta testers.” The changes to the UI will be rolled out on a limited basis, just to the beta testers.


Step 4: Act or Adjust

The responses from the beta testers will fall into three categories: “the page is extremely readable,” “the page readability can be improved,” or “the page is still unreadable.”

If the page is extremely readable to color-blind users when using the toggle, the changes will then be made available to all users.

If the page readability can be improved, better colors are chosen, and the PDCA cycle is repeated. If the page is still unreadable to the color-blind, the toggle will be checked to verify that it is indeed working, and if so, a different set of colors are used. Again, the PDCA cycle repeats.


Conclusion

In a highly competitive industry, of which the fictitious Gaggle dot Com company is a part, it is absolutely necessary to run the PDCA cycle as rapidly as possible. This can be done by minimizing the number of employees involved and limiting the team to only the most relevant people. In this example, the relevant employees are the graphic designers, the software developers, and customer service representatives. Only one of each type of these employees will be required, for a total of three people. A formal QA process is not required.

Another way to speed the PDCA cycle is to partially move the “Check” step into the “Do” step. In this example, the software developer personally tests (checks) the toggle while he implements it during the “Do” step.

By executing the PDCA cycle as described here, the GaggleMail UI is improved to make it extremely usable to color-blind customers in just a few minutes. If a bureaucratic process is used – especially processes that involve litigious web accessibility advocates – the changes may be tied up in committee meetings for weeks.


References

Bureau of Internet Accessibility, Inc. (2022, 4 November). What is color blindness accessibility? https://www.boia.org/blog/what-is-color-blindness-accessibility

Goetsch, D. L. & Davis, S. B. (2021). Quality management for organizational excellence: Introduction to total quality (9th ed.). Pearson.

WebAIM. (2021, 12 August). Visual disabilities: Color-blindness. https://webaim.org/articles/visual/colorblind

Misrepresenting the Voice of the Customer

In Quality Function Deployment (QFD), input requirements from the customers are translated into a set of customer needs, known as the “voice of the customer.” In small companies, there are very few employees standing between the person who determines the Voice of the Customer (VOC) and the person who implements the recommendations of the QFD analysis. In fact, they may be the same person!

When they are not the same, there is a problem not mentioned in Goetsch & Davis (2021, p. 289-302). The problem is the misrepresentation of the VOC. This can result from incorrect analysis of the customer feedback as presented in the customer needs matrix (Goetsch & Davis, p. 291-292), or it can have nefarious causes.

Here is an example of the latter from a former employer, a once large software company. To explain the situation, four pieces of background information are required. Stay with me.

First, software companies usually attempt to cater to as many people as possible. This requires consideration of the types of computers customers are using (Windows or Macintosh) as well as the browsers they are using (this was in the early 2000s, so it was Internet Explorer and Firefox). To minimize costs, software companies try to develop web pages that work on both Windows and Macintosh and in both types of browsers.

Second, the events described below happened shortly after the bursting of the dot com bubble, when even badly-run software companies still had money. This attracted ambulance chasers, and a new player entered the chat: litigious companies going after money under the guise of enforcing the ADA, the Americans with Disabilities Act (ADA National Network, 2023). Company management was too spineless to mount a resistance, so these ADA enforcers frequently called the tune within software companies, even down to the level of individual software developers. The situation was very reminiscent of the diversity, equity, and inclusion (DEI) racket now plaguing universities, organizations, and companies (Lawson, 2025).

Third, there was an extraordinarily strong push for “open source” software, which is software whose internal mechanisms (source code) can be read by anybody. Those advocating open-source software sometimes stand to gain from stealing a competitor’s source code, but in many cases the goal is openness for the sake of openness: most advocates wouldn’t be able to understand the source code even if it were open and easily available.

Finally, the software industry is rife with politics, and not just the usual workplace cattiness. This existed even back in the early 2000s. At present, the level of politics in software companies is turned up to Spinal Tap level 11.

With that background, the nefarious misrepresentation of the VOC can now be described!

“User advocates,” the personification of the VOC, claimed that a certain type of software, Adobe Flash, was unsuitable for use on our web pages. Their reasons were as follows: it was not open source, it was claimed that it was inaccessible under ADA standards, and it was not available to all our users. Because of this, user advocates wanted all the games, financial charting apps, and other engaging user experiences on our websites to be dropped and replaced with other technologies.

An investigation into these claims and recommended actions revealed some disturbing information. While Adobe Flash was indeed closed source, it could be made ADA compliant. Also, only 3% of our users did not have Flash on their computers. For comparison, 10% of our customers used Macintosh computers.

These findings undermined the user advocates’ case for eliminating Flash from our websites. In addition, the user advocates had a visceral hatred of Flash, and those strong emotions compromised their objectivity.

Most damaging to the user advocates was the fact that their proposed actions were simply impractical: viable alternatives to Flash did not exist at the time, and the user advocates did not consider the costs involved in changing from Flash to a (non-existent) alternative technology.

Further investigation showed that the user advocates had no evidence that customers were indeed calling for the elimination of Flash. Instead, the user advocates were advocating their own beliefs and passing them off as the customers’ voice.

In the end, these user advocates won, sort of. Interactive Flash experiences were sometimes replaced with less interactive experiences, but usually they were simply dropped with no replacements. This spread throughout the industry, and the entire world wide web is now a far less interesting place.


References

ADA National Network. (2023). Americans With Disabilities Act: Enforcement options under the Employment Provisions (Title I). https://adata.org/factsheet/enforcement-options-employment-provisions

Goetsch, D. L. & Davis, S. B. (2021). Quality management for organizational excellence: Introduction to total quality (9th ed.). Pearson.

Lawson, T. (2025, 14 January). Black is NOT a credential: The corporate scam of DEI. FIG Ink.

Statistical Process Control

Introduction

Goetsch & Davis (2021, p. 306) define Statistical Process Control as follows:

Statistical process control (SPC) is a statistical method of separating variation resulting from special causes from variation resulting from natural causes in order to eliminate the special causes and to establish and maintain consistency in the process, enabling process improvement.

SPC is a methodology for maintaining and improving quality in production processes. It is implemented to control variation, eliminate waste, make processes predictable, and inform product inspections, all with the goal of continual improvement.
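To make the idea of “separating variation” concrete, here is a minimal sketch that computes 3-sigma control limits from an in-control baseline and flags later measurements that fall outside them. The function names and the use of the sample standard deviation are illustrative simplifications, not the textbook’s charting procedure.

```python
# Illustrative sketch: flag measurements outside 3-sigma control limits,
# the usual signal of a special cause rather than natural variation.
import statistics

def control_limits(baseline):
    """Center line and 3-sigma limits from an in-control baseline period."""
    center = statistics.mean(baseline)
    sigma = statistics.stdev(baseline)
    return center - 3 * sigma, center, center + 3 * sigma

def special_causes(samples, limits):
    """Indices of monitored samples that fall outside the control limits."""
    lcl, _, ucl = limits
    return [i for i, x in enumerate(samples) if not lcl <= x <= ucl]
```

In practice the limits are established from a known-stable period (Phase I) and then used to monitor ongoing production (Phase II); a flagged point triggers the diagnosis of a special cause, which is exactly where management involvement begins.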

In this post, the function of management in SPC is described. This includes management’s role in establishing quality measures and using control charts to maintain the level of quality. The actions management must take when quality deviates from standards are also described.


Role of Management

Management’s primary responsibility is to establish the production quality level so that it matches customers’ expectations. This requires setting measurable standards for product and service quality. By providing these standards, management demonstrates a commitment to quality and begins to convert that commitment into a culture of quality. Research has shown that a gradual implementation of SPC is more successful than abrupt enforcement (Bushe, 1988), but the implementation of SPC does set the direction. These production quality levels are also required for control charts to be applicable for monitoring and maintaining quality (Rungtusanatham, 2001).

Management must also be involved in establishing budgets and allocating resources in support of statistical process controls. This includes funding new machines and modern technologies that may be required for process improvements.

In addition, management is responsible for approving and sometimes conducting training programs needed by employees to use SPC effectively (Goetsch & Davis, p. 320).

Management also evaluates and approves changes to processes suggested by other departments. In a sense, management acts like a sieve, allowing only promising ideas through to line workers. Besides this, implementing these changes may involve new machinery or personnel changes, which are budgetary issues.


When Production Quality Slips

Management is also involved, or should be involved, in diagnosing problems when production quality falls below the established levels. In the context of manufacturing, machine operators would have the most direct understanding of the problem. It is management’s responsibility to appraise the operators’ findings and approve the budget necessary to repair the machine or replace it.

Another situation that requires managerial intervention is when a supplier’s parts fall below the expected quality level. There are several courses of action, all of which require a manager’s decision.

One option is for the manager to contact the supplier to get an estimate for the time needed for them to resume manufacturing products that are within specifications. Based on this information, the manager may have to delay delivery to his customers or deliver less than what was promised.

A second option is to temporarily require the supplier to provide additional parts with the hope that there will be enough parts that are within specifications to satisfy customer orders.

A third option is to switch suppliers, which requires a manager’s decision. This will entail delays in fulfilling customer orders.

The least desirable option is to provide the customer with substandard parts. This is contrary to the philosophy of total quality management, however.


Conclusion

Statistical process control is a vital methodology for ensuring consistent quality and continuous improvement in production contexts. Management plays a pivotal role in successfully implementing SPC by setting quality standards, allocating financial resources for training, new machinery, etc. Managers are also essential for addressing deviations from quality standards. This could entail working with machine operators to diagnose and resolve such problems or making decisions about supplier relationships. By accepting these responsibilities, managers uphold the agreed-upon product quality standards as well as maintain customer satisfaction.


References

Bushe, G. (1988). Cultural contradictions of statistical process control in American manufacturing organizations. Journal of Management 14(1). https://doi.org/10.1177/014920638801400103

Goetsch, D. L. & Davis, S. B. (2021). Quality management for organizational excellence: Introduction to total quality (9th ed.). Pearson.

Rungtusanatham, M. (2001). Beyond improved quality: the motivational effects of statistical process control. Journal of Operations Management 19(6). https://doi.org/10.1016/S0272-6963(01)00070-5

Quality Function Deployment

Introduction

This post explains how Quality Function Deployment (QFD) could be used in a software company. QFD makes (some) sense for improving existing products, but not for new products, unless that new product is a copy of a competitor’s product. For this post it is assumed that the goal is to improve an existing product.

For this example, imagine we work for a software company called Gaggle dot Com, and one of our products is an online email service called GaggleMail, which is completely not a copy of Google’s Gmail. In this post the Quality Function Deployment (QFD) method is applied to GaggleMail to improve its quality and increase customer satisfaction.


The Problem with QFD’s Math

Goetsch & Davis state “The math used to establish priorities lacks the precision necessary for a Six Sigma company” (p. 289). A “Six Sigma company” is not defined, nor are the actual meanings of the calculations involved in QFD. Take, for example, the value engineering equation (Goetsch & Davis, p. 290):

V = F/C

where V represents value, F is function, and C is cost. What exactly are the units of these three variables, and how exactly are they measured? The purpose of this equation, along with the other calculations in QFD, is to give the appearance of quantitative understanding, and the technical employees involved would immediately reject it as nothing but smoke and mirrors.

This post takes a different approach, using popularity to rank the desired features of a software product, which is similar to the approach taken in Wong et al. (2023). This method may be incorrect, but at least technically minded people can remain engaged.


Step 1: Developing the Set of Customer Needs

Because there are so many parts to an online email service, it is decided that the focus will be on improving the user interface (UI) of GaggleMail.

After the scope is determined, the next step is to choose a cross-functional QFD team with members from development, customer support, graphic design, marketing, sales, and other relevant departments. All the team members should be somehow invested in the product under review. This is a problem at large software companies like Gaggle dot Com – there can be so many products that some employees do not use them all.

This team would prepare an online questionnaire, which can include initial guesses from the team about what the users want or need, along with free-form input for unanticipated needs. Team members who work in customer support will provide the most valuable input. For each of the initial guesses, we wish to determine two pieces of information: importance of improving the feature for the customer, and customer satisfaction with the existing feature.

The number of users to whom this questionnaire is sent depends on the total size of the user base: if there are hundreds of thousands of GaggleMail users, then there would be too many responses to analyze. It would make sense to choose the users carefully. They should be what are called “super users” – users who use GaggleMail heavily and are enthusiastic about the product. A second, separate qualification to receive the questionnaire is that the recipient will not pass information on to competitors and will not enter misleading information that would degrade the quality of the product. This type of security is hard to control except (maybe) in an in-person focus group.

The responses to the questionnaire represent the voice of the customer (VOC).

The QFD team will then sort and prioritize the responses to the questionnaire, including user input that doesn’t fit into the predefined questions. The responses can be written onto three-by-five cards and then arranged into an affinity diagram (Goetsch & Davis, p. 291) or into a tree diagram. An affinity diagram would be best, as the three-by-five cards can either be physically grouped or tallies can be written on them to record the number of people who are interested in a feature. This can then be translated into a table listing the customer needs (WHATs) and the corresponding importance for each need, rated on a 1 to 5 scale – where the numbers (ranks) come from the popularity of the requests for improvements.
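The popularity-based scoring described above can be sketched as follows. Scaling each need’s tally against the most-requested need is one reasonable choice, not the only one; the function name and scaling rule are invented for illustration.

```python
# Hypothetical sketch: turn response tallies into 1-5 importance ratings
# by scaling each need's vote count against the most-requested need.

def importance_ratings(tallies):
    """Map {need: vote count} to {need: importance on a 1-5 scale}."""
    top = max(tallies.values())
    return {
        need: max(1, round(5 * votes / top))  # floor of 1: it was requested at all
        for need, votes in tallies.items()
    }
```

The resulting numbers fill the importance column of the customer needs table; the affinity-diagram grouping happens first, so the tallies are per grouped need rather than per raw response.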

Since the UI of GaggleMail is the thing under consideration, requests for changes to non-UI features (like an overabundance of spam in the inbox and valid emails in the spam folder) can be ignored. This information should not be thrown away, but instead should be saved for non-UI improvements.


Step 2: Planning the Improvement Strategy

The second step is to compare our product against the competition. This can be done by a focus group or by sending a questionnaire to users of a competing product, using the same customer needs matrix from stage 1. This should be done for two prime competitors (Goetsch & Davis, p. 293).

The competitors’ customer satisfaction (CS) on each feature is again ranked 1 to 5.

This information is entered into the Planning Matrix: the first column should be the CS rating of our own product, and the second and third columns are for the CS ratings for the two competitors.

The company should then decide what should be the desired ranking (called level of improvement on p. 293 of Goetsch & Davis) for each customer need. There isn’t an objective way of determining this, but several factors must be considered: the time and money required to implement this feature, how valuable it is to the customer, and the CS rankings of the two competitors for the same feature. These rankings are entered into the Planning Matrix in the fourth column, called “Our Planned CS Rating.”


Step 3: Selecting the Technical Requirements

This step can be called creating the "voice of the company" (Goetsch & Davis, p. 295). In it, technical requirements are generated from the VOC findings. These requirements are descriptions of HOW to implement the desired changes to GaggleMail.


Step 4: Evaluating Interrelationships Between WHATs and HOWs

At this step, the relationships between the customer needs (from step 1) and the technical requirements (from step 3) are evaluated. The interrelationships between the two are placed into a matrix (Goetsch & Davis, p. 297), ranked 0 to 5, where 0 means that there is no relationship between a particular customer need and a particular technical requirement, and 5 means that there is a direct relationship between them.
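For what it is worth, the conventional arithmetic behind this matrix is a weighted sum: each technical requirement (HOW) is scored by summing its relationship strengths to the customer needs (WHATs), weighted by each need’s importance. Whether the resulting numbers mean anything is, as argued above, debatable; the sketch below just shows the mechanics, with invented dictionary names.

```python
# Conventional QFD relationship-matrix scoring: each HOW gets the
# importance-weighted sum of its relationship strengths to the WHATs.

def how_priorities(importance, relationships):
    """importance: {what: 1-5}; relationships: {what: {how: 0-5}}.
    Returns {how: priority score}."""
    scores = {}
    for what, hows in relationships.items():
        for how, strength in hows.items():
            scores[how] = scores.get(how, 0) + importance[what] * strength
    return scores
```

The HOWs with the highest scores are the ones the matrix says to work on first.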


Step 5: Evaluating the Correlation Between the HOWs

The fifth step involves the creation of the Correlation Matrix, which indicates the relationship between each pair of technical requirements. These relationships can be classified as supporting, impeding, or uncorrelated (Goetsch & Davis, p. 298). A Correlation Matrix would be useful if the changes involve multiple teams (in the text’s example this would be the authors, graphic designers, book binders, etc.). In the context of a software company like Gaggle dot Com, these correlations would be better understood if a dependency graph were used. Using such a graph, the dependency between tasks would be obvious, tasks that can be done in parallel can be identified, critical paths can be determined, and so on (Tannenbaum, p. 274-309).
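The dependency-graph alternative suggested above can be sketched with simple topological layering: tasks whose prerequisites are all satisfied form a “wave” that can run in parallel, and the next wave starts when the current one finishes. The task names and the `deps` structure are hypothetical.

```python
# Sketch: group technical tasks into parallel "waves" from a dependency
# graph. deps maps each task to the set of tasks it depends on.

def parallel_waves(deps):
    """Return a list of task sets; every task's prerequisites appear in an
    earlier wave, so each wave can be worked on in parallel."""
    remaining = {task: set(d) for task, d in deps.items()}
    waves = []
    while remaining:
        ready = {task for task, d in remaining.items() if not d}
        if not ready:
            raise ValueError("cyclic dependency")  # no valid ordering exists
        waves.append(ready)
        remaining = {
            task: d - ready for task, d in remaining.items() if task not in ready
        }
    return waves
```

The number of waves is the length of the critical path, and any wave with more than one task is an opportunity for teams to work concurrently.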


Step 6: Setting the Design Targets

In this last stage, targets for improvement (design targets) are specified. As in earlier steps, none of the calculations in this step (Goetsch & Davis, p. 298-303) make any sense.


Conclusion

Now, after all this is done, work can (finally) proceed on making the desired changes. In a company with minimal bureaucracy, development can proceed following the completion of step 1 and step 2, saving weeks or months in a highly competitive industry.

As mentioned in the introduction, QFD would work best for improving existing products. However, there are some situations that are not addressed in Goetsch & Davis.

First, there is no explanation of how to manage cases of conflicting VOC needs (Xiao & Wang, 2024). For example, suppose some customers want the text in GaggleMail to be larger, others want it to be smaller, and some customers want the text to be visible to those who are color blind. In a real software company, the developers and graphic designers would devise a solution that allows all customers to be happy.

A second missing aspect, at least in Goetsch & Davis, is the situation where a competitor has a feature that is not in GaggleMail. It could be addressed in the questionnaire designed in step 1, but what if it goes unnoticed?

Finally, the success of the improvements to GaggleMail is not considered in Goetsch & Davis. This is not unexpected, since actual work only begins after the QFD analysis is complete. At Gaggle dot Com, or really any company, the changes would be evaluated by some of the people who determined the VOC in step 1. Without this, the QFD analysis, as well as the actual work on GaggleMail, can result in what software engineers call “ready-fire-aim.”


References

Goetsch, D. L. & Davis, S. B. (2021). Quality management for organizational excellence: Introduction to total quality (9th ed.). Pearson.

Tannenbaum, P. (2007). Excursions in Modern Mathematics with Mini-Excursions (6th ed.). Pearson.

Wong, J., et al. (2023). New approach for quality function deployment based on social network analysis and interval 2-tuple Pythagorean fuzzy linguistic information. Computers & Industrial Engineering, 183. https://doi.org/10.1016/j.cie.2023.109554

Xiao, J. & Wang, X. (2024). An optimization method for handling incomplete and conflicting opinions in quality function deployment based on consistency and consensus reaching process. Computers & Industrial Engineering, 183. https://doi.org/10.1016/j.cie.2023.109779

Cheerful PDCA, OODA, and DOCA Loops

According to Goetsch & Davis, the Plan-Do-Check-Act (PDCA) cycle is “a kind of generic, basic format for bringing order and logic to the problem-solving process.” It is not as detailed as a step-by-step process but rather is “a simple cycle that is allowed to continue until a solution’s results match the planned outcome” (Goetsch & Davis, 2021, p. 274).
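As a toy illustration, the cycle can be sketched in a few lines of Python. The defect-rate numbers and the improve function are made up for the example, not taken from Goetsch & Davis; the point is only the repeat-until-results-match-plan shape of PDCA.

```python
# Minimal PDCA sketch: repeat the cycle until the result matches the
# planned outcome. The numbers below are invented for the example.
def pdca(start, target, improve, max_cycles=10):
    value = start
    for cycle in range(1, max_cycles + 1):
        planned = target              # Plan: the desired outcome
        value = improve(value)        # Do: apply the change
        if value <= planned:          # Check: compare result to plan
            return cycle, value
        # Act: carry lessons into the next cycle (here, just loop again)
    return None, value

# Example: halve a defect rate each cycle until it meets a 1% target.
cycles, final = pdca(start=8.0, target=1.0, improve=lambda v: v / 2)
print(cycles, final)   # 3 1.0
```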

Similar cyclic processes are also used in military and local-defense contexts.

One of them is the so-called OODA loop (Boyd, 1995), which stands for observe-orient-decide-act. This is a decision-making framework that consists of the following stages:

  • Observe – gather information from the environment
  • Orient – analyze the collected information and determine its relevance to one’s mission
  • Decide – choose a course of action based on that orientation
  • Act – execute the chosen action

So, for example, suppose you see a group of fighters (observe). You analyze the situation to consider whether those fighters are friends or foes (orient). If they’re foes, you decide whether and how to act based on their size, position, and resources and your own size, position, and resources. If you decide to attack, do so with extreme aggression! Wash, rinse, and repeat.
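The friend-or-foe example above can be sketched as a minimal skeleton in Python. The stage functions and the strength comparison are assumptions made for the example, not anything from Boyd’s briefing.

```python
# A minimal OODA-loop skeleton. The stage functions below are illustrative
# assumptions, not taken from Boyd (1995).
def observe(env):
    return env["sighting"]                 # gather information

def orient(sighting):
    # Assess relevance: friend or foe?
    return "foe" if sighting["hostile"] else "friend"

def decide(assessment, own_strength, their_strength):
    # Choose a course of action based on the orientation.
    if assessment == "foe" and own_strength > their_strength:
        return "attack"
    return "withdraw"

def act(decision):
    return f"executing: {decision}"        # carry out the chosen action

env = {"sighting": {"hostile": True}}
assessment = orient(observe(env))
decision = decide(assessment, own_strength=5, their_strength=3)
print(act(decision))   # executing: attack
```

Running the loop faster than the other guy just means getting from observe back to act in fewer (or quicker) iterations than he does.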

The OODA loop is fundamental to all military action. Defeating the enemy requires either running your OODA loop faster than the bad guy’s or obstructing his loop, for example through misdirection.

In principle, the PDCA loop is similar to the OODA loop, except that observation happens earlier in the OODA loop than it does in the PDCA loop.

[Image: Cheerfully executing a DOCA loop, from Reuter’s Training with America’s Militias]

Another loop, performed mostly by local-defense groups and other groups practicing irregular or so-called “4th Generation” warfare (Lind & Thiele, 2015), is the DOCA loop, which stands for Disperse-Orient-Concentrate-Act. The idea is that in the context of irregular warfare, dispersed groups of fighters are hard to kill because they are… dispersed, and they only become combat-effective when they are close together. The stages are as follows:

  • Disperse – spread out
  • Orient – analyze the situation and determine its relevance to the mission
  • Concentrate – move close to each other (or otherwise work in some coordinated manner)
  • Act – execute the previously agreed-upon plan

Because this is a loop, the next stage after the act is to disperse. The neat thing is that the group of fighters is at risk only during the “concentrate” stage, when they are close together.

For example, suppose the fighters’ mission is to destroy an enemy’s railroad line. They start out dispersed. They assess (orient) the location of the railway line and how heavily it is defended. If they decide to proceed with the mission, they converge on the railroad line (concentrate) and sabotage it in some way (act). Then the loop starts over, so they disperse. Again, wash-rinse-repeat.
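The railroad example can be sketched the same way. The stage list and the risk model below simply encode the description above (exposed only while concentrated); they are illustrative, not doctrine.

```python
# A minimal DOCA-loop skeleton; the risk model is an assumption made
# for the example (exposed only while concentrated).
STAGES = ["disperse", "orient", "concentrate", "act"]

def at_risk(stage):
    # Per the post, the group is exposed mainly while close together.
    return stage == "concentrate"

def run_cycle():
    """One pass through the loop; the next pass starts with 'disperse' again."""
    return [(stage, at_risk(stage)) for stage in STAGES]

for stage, exposed in run_cycle():
    print(stage, "exposed" if exposed else "safe")
```

Walking through one cycle makes the selling point obvious: three of the four stages keep the group dispersed and hard to target.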

The OODA loop is the more fundamental of the two and works everywhere, even in business situations (D. Brown Management, 2024). Besides explicitly calling out the “observe” stage, the primary difference between the OODA and DOCA loops is the way they repeat. With OODA, the fighters would return to home base and report their results. With DOCA, the fighters have the option of building on their success and continuing with the mayhem!

Fun stuff!


References

Boyd, J. (1995). The Essence of Winning and Losing. https://web.archive.org/web/20110324054054/http://www.danford.net/boyd/essence.htm

D. Brown Management. (2024). Observe, Orient, Decide, and Act (The OODA Loop). https://dbmteam.com/insights/observe-orient-decide-and-act-the-ooda-loop

Goetsch, D. L. & Davis, S. B. (2021). Quality management for organizational excellence: Introduction to total quality (9th ed.). Pearson.

Lind, W. S. & Thiele, G. A. (2015). 4th Generation Warfare Handbook. Castalia House. https://ia802901.us.archive.org/27/items/4thGenerationWarfareHandbookWilliamS.Lind28129/4th_Generation_Warfare_Handbook_-_William_S._Lind%25281%2529.pdf

Iterating SWOT Analysis

SWOT (Strengths, Weaknesses, Opportunities, and Threats) analysis has applications to both personal and organizational situations. It is extremely useful for strategic decision-making, if properly and thoughtfully applied, with or without “flipcharts” (Goetsch & Davis, 2021, p. 47).

SWOT analysis requires an individual or company to take stock of internal strengths and weaknesses while scanning the environment for both opportunities and threats. In a personal context, it allows one to identify leadership and organizational strengths and to observe a relative lack of those strengths among one’s peers. That gap is an opportunity for advancement.

In a different (non-competitive) situation, an individual may create a position to fulfill a need that no one even knew existed!

In a corporate context, SWOT analysis requires knowledge not only of the company’s strengths and weaknesses but also of the strengths and weaknesses of competitors. (Why oh why don’t managers keep their employees’ resumes on hand?)

The gap between your own company’s strengths and a competitor’s can be translated into a business opportunity. The best opportunity is when your strengths fill in the competitor’s weaknesses. Of course, your own company’s weaknesses are vulnerabilities that competitors will exploit to the utmost, if they are worthy competitors!

Something often overlooked about SWOT analysis, as Swart (2022) notes, is that it is an iterative process: strengths, when acted upon, create new opportunities. This holds for both individuals and corporations. The idea is echoed in a quote attributed to Sun Tzu: “opportunities multiply as they are seized” (Sun Tzu, n.d.). Taking initiative and actively pursuing possibilities are crucial for success in both personal and professional contexts.
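That iterative dynamic can be sketched in a few lines of Python. The strength-to-opportunity mapping below is entirely hypothetical; the point is only that each seized opportunity becomes a strength that unlocks the next one.

```python
# Hypothetical map of which opportunities each strength unlocks.
unlocks = {
    "public speaking": ["lead team briefings"],
    "lead team briefings": ["present to executives"],
    "present to executives": [],
}

def iterate_swot(strengths, rounds=3):
    """Each round, seized opportunities become strengths for the next round."""
    opportunities = []
    frontier = list(strengths)
    for _ in range(rounds):
        new = [opp for s in frontier for opp in unlocks.get(s, [])]
        opportunities.extend(new)
        frontier = new   # a seized opportunity becomes a new strength
    return opportunities

print(iterate_swot(["public speaking"]))
# ['lead team briefings', 'present to executives']
```

Run once, SWOT yields a snapshot; run iteratively, it compounds, which is exactly the “opportunities multiply as they are seized” effect.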


References

Goetsch, D. L. & Davis, S. B. (2021). Quality management for organizational excellence: Introduction to total quality (9th ed.). Pearson.

Sun Tzu. (approx. 5th century BC). The Art of War. https://gutenberg.org/cache/epub/66706/pg66706.txt

Swart, J. (2022, December 26). The Personal SWOT Analysis as a Coaching Tool. https://doi.org/10.13140/RG.2.2.26485.04322