Guest Column | May 27, 2020

How Military Thinking Can Improve Medtech Decision-Making Under Stressful Conditions

By Stacey Largent, ValSource, Inc.


Anyone in the medical device industry can tell you about the number of regulatory requirements and internal procedures they must comply with to produce a safe and effective medical device. Add the complication of stressful conditions, such as the COVID-19 pandemic, and many companies struggle to make both typical day-to-day decisions and those required by the current environment (e.g., how to maintain compliance with an equipment maintenance program with a reduced workforce and/or supplies).

Decisions made within the first hours, days, and even weeks after the start of a crisis are the most critical to ensuring success and preventing realization of risks. Bruce T. Blythe, chairman of the R3 Continuum, once said that decision-making in a crisis is “located somewhere between analysis and intuition.” Intuition equates to making the right decision without necessarily knowing the reason why.

Here we examine decision-making practices used in the military to provide insight into how operators in the medical device sector can reach appropriate conclusions during the COVID-19 pandemic or under other stressful conditions.

Powell’s 40-70 Rule

Colin Powell, retired four-star U.S. Army general — as well as former National Security Advisor, commander of the U.S. Army Forces Command, and chairman of the Joint Chiefs of Staff — has his own interpretation of the paradigm presented by Blythe.

A well-respected military leader, Powell has been forced to make numerous decisions under stressful conditions and has often been quoted about what is now known as the 40-70 Rule: “Use the formula P = 40 to 70, in which P stands for the probability of success and its numbers indicate the percentage of information acquired. Once the information is in the 40 to 70 range, go with your gut.”[1]

So, what does this mean?

Judgment is more important than additional data; if you wait until you are 100 percent sure of success, often it will be too late. This rule equates to finding that point in time when intuition combined with some level of information or data outperforms the level of success obtained by waiting for more knowledge. After estimating the P value for a situation, if it falls within that sweet spot, then intuition should be used to make the correct decision.

Individuals put in charge of critical decisions must be decisive and may not always have all necessary information available to them at the time. If P is less than 40 (i.e., less than 40 percent of the information needed is available), the decision is likely being made too swiftly and will not be well informed; if P exceeds 70, the decision-maker may be viewed as indecisive. If time is of the essence in making a decision, P can be closer to 40, because waiting for additional information may result in deadlines being missed or even the wrong decision being made as a result of “analysis paralysis” (too much information results in overthinking a situation and no decision is made).

Powell has said that, in the military, they are taught to only use about 30 percent of the time allotted for decision making, with the objective of reducing risk. This principle minimizes the amount of time available for a risk to be realized.
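The 40-70 heuristic can be sketched as a simple gate on the estimated share of information in hand. This is an illustrative sketch only; the function name and messages are not part of Powell's formulation:

```python
def powell_40_70(info_pct: float) -> str:
    """Classify a decision point using Powell's 40-70 heuristic.

    info_pct: estimated percentage (0-100) of the relevant
    information that has been gathered so far.
    """
    if info_pct < 40:
        return "too early: the decision would be under-informed"
    if info_pct <= 70:
        return "sweet spot: combine the data in hand with intuition and decide"
    return "too late: waiting this long risks indecision and analysis paralysis"

for p in (25, 55, 85):
    print(f"P = {p}: {powell_40_70(p)}")
```

In practice the thresholds are judgment calls rather than hard limits; under time pressure, as noted above, acting closer to the 40 end of the window is preferable.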

Prospect Theory

In addition to varying amounts of available information, the presence of stressful conditions during decision making can impact the decision. Prospect theory was introduced in 1979 by psychologists Daniel Kahneman and Amos Tversky, and was further evaluated in the military setting by U.S. Army Lt. Col. James Schultz. This theory, as explained by Schultz, suggests that “the decision maker’s reference point determines the domain in which he makes a decision.” In other words, a person’s decision is based on the perception of gains or losses relative to their current environment or situation, rather than on the final outcome. Because gains and losses are valued asymmetrically, decisions tend to be driven more strongly by the prospect of loss than by the prospect of an equivalent gain.

Three different biases are built into prospect theory:

  1. Certainty — put more importance into a decision option that is certain (i.e., choose the option that is an assured “win” even if it results in a smaller gain)
  2. Isolation Effect — focus on elements unique to each option
  3. Loss Aversion — belief that losses result in greater negative reactions than gains result in positive ones[2]

While prospect theory does not predict the choice that will be made, it can indicate the decision-maker’s risk tolerance. Essentially, an individual will be more risk-seeking when in a negative state (i.e., a state of perceived losses) and more risk-averse when in a more positive state (perceived gains).
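The asymmetric valuation of gains and losses is often expressed as a value function relative to the reference point. The sketch below uses the parameter estimates commonly cited from Tversky and Kahneman's 1992 follow-up work (curvature of about 0.88 and a loss-aversion coefficient of about 2.25); these numbers are purely illustrative here:

```python
def value(x: float, alpha: float = 0.88, lam: float = 2.25) -> float:
    """Prospect-theory value of an outcome x relative to the reference point.

    Gains (x >= 0) are valued concavely; losses are valued convexly and
    scaled by the loss-aversion coefficient `lam`, so a loss "hurts" more
    than an equal gain "pleases".
    """
    return x ** alpha if x >= 0 else -lam * ((-x) ** alpha)

gain, loss = value(100), value(-100)
print(f"value of +100: {gain:.1f}")
print(f"value of -100: {loss:.1f}")
```

Because `abs(value(-100))` exceeds `value(100)`, the function reproduces the loss-aversion bias listed above: equal-sized losses carry more weight than gains.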

As a test of the theory, consider the decision to launch World War II’s failed Operation Market Garden. Operation Market Garden was intended to seize key bridges in Holland using the Allied Airborne Army under the command of Supreme Commander Gen. Dwight D. Eisenhower. The operation would utilize principles that had proved effective at Normandy, and its success would increase troop morale. However, Operation Market Garden also had great potential for Allied losses (e.g., a delay in establishing a base at Antwerp).

Eisenhower approved the operation despite having a less risky alternative available and a history of being risk-averse in his military decision-making. If he proceeded with action to improve logistics, it would delay more decisive action against the Germans (ultimately seen as a gain). If he delayed, the Allied military force would become relatively weaker (seen as a loss). With his domain being one of loss, as the opportunity to seize the bridgeheads was slipping away, a change in his reference point and framing likely contributed to his proceeding with the risky operation. Ultimately, Eisenhower chose the option that offered the largest potential gain and the greatest risk (i.e., the option that, if it failed, would favor the enemy).[5]

Recognition Planning Model

Military planners have historically used the military decision-making process (MDMP) to facilitate and expedite decision-making to get ahead of and neutralize threats. Because the MDMP typically requires developing and evaluating three courses of action (COAs), it is time-consuming and is often abbreviated in practice. This is where the recognition-primed decision (RPD) model, presented by Gary A. Klein et al., can be of benefit: it speeds up the decision-making process through quick mental simulations.

The RPD model is a paradigm in which a single probable COA is identified through intuition, drawing on knowledge, training, and experience, in lieu of an analytical process of side-by-side comparison. The shortcomings of the MDMP and the RPD model led Klein and John F. Schmitt to develop the recognition planning model (RPM), which has been shown to increase planning tempo by approximately 20 percent. The basic RPM can be described in four steps:

  • Identify mission — guided by situational information and/or tasking from higher ranks
  • Test/operationalize COA — may identify weaknesses that require evaluating consequences of an alternative COA
  • War game COA — tests whether the COA will be effective against the opposition
  • Develop orders — the orders are then disseminated, executed, and improvised upon as needed[4]
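The four steps above can be sketched as a simple planning loop. Everything in this sketch is a hypothetical placeholder: the course-of-action names, the weakness list, and the war-game check stand in for the intuition-driven judgments the RPM actually relies on:

```python
from dataclasses import dataclass, field

@dataclass
class COA:
    """A candidate course of action with any known weaknesses."""
    name: str
    weaknesses: list = field(default_factory=list)

def identify_mission(situation: str) -> str:
    # Step 1: mission guided by situational information / higher-level tasking
    return f"respond to: {situation}"

def operationalize(coa: COA) -> list:
    # Step 2: flesh out the COA and surface weaknesses (placeholder check)
    return coa.weaknesses

def war_game(coa: COA) -> bool:
    # Step 3: quick mental simulation; does the COA survive the opposition?
    return len(coa.weaknesses) == 0

def plan(situation: str, candidate: COA, fallback: COA) -> str:
    mission = identify_mission(situation)
    coa = candidate
    if operationalize(coa) or not war_game(coa):
        # Weaknesses found: evaluate the consequences of an alternative COA
        coa = fallback
    # Step 4: develop and disseminate orders
    return f"orders: execute '{coa.name}' for mission '{mission}'"

print(plan("supply shortage",
           COA("expedite vendor A", weaknesses=["single source"]),
           COA("dual-source supply")))
```

The key design point the RPM captures is that only one COA is carried forward at a time; an alternative is considered only when testing exposes a flaw, which is what makes it faster than the full three-COA MDMP comparison.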

Risk-Based Thinking And Risk Management Tools

One purpose of risk management and risk-based thinking is to assist informed decisions with respect to risk. While quality-related risk management is typically thought of in the pharmaceutical and medical device industries, its basic principles, along with risk-based thinking, are universally applicable, including in the military. Risk-based thinking requires companies to evaluate risk when developing processes, controls, and continuous improvements.

The previously discussed theories and models all include some element of risk-based thinking. However, the use of risk management tools (both formal and informal) also can help with analysis paralysis when too much information is available. These tools can provide structure and guidance on sorting through what is useful and when information is lacking. They also can help identify which decisions should be prioritized. Ultimately, risk-based thinking (and the optional use of a formal tool) offers a method to think about things more strategically when “unknowns” are greater than “knowns” during a stressful time.
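One informal way such a tool can cut through analysis paralysis is a simple severity-times-likelihood score used to rank pending decisions. The decisions and scores below are hypothetical examples, not recommendations:

```python
# Hypothetical pending decisions, each scored 1-5 for severity of impact
# and likelihood of the associated risk being realized.
decisions = [
    {"item": "defer preventive maintenance",  "severity": 4, "likelihood": 3},
    {"item": "switch cleaning-supply vendor", "severity": 2, "likelihood": 4},
    {"item": "delay document review",         "severity": 1, "likelihood": 2},
]

# Score each decision, then address the highest-risk items first.
for d in decisions:
    d["risk"] = d["severity"] * d["likelihood"]

for d in sorted(decisions, key=lambda d: d["risk"], reverse=True):
    print(f"risk {d['risk']:>2}: {d['item']}")
```

Even an informal ranking like this gives structure when “unknowns” outnumber “knowns”: it makes explicit which decisions deserve immediate attention and which can safely be deferred.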

Final Thoughts

Wartime and military decisions occur under some of the most taxing conditions, but methods have evolved to aid in these decisions, and their principles can be applied universally. While the exact manner of the relationship between stress and decision-making remains unclear, we know stress can narrow the focus of one’s attention. Many hypothesize that stress may not necessarily be detrimental, as it can simplify a crisis by helping the decision-maker focus on critical issues or activities.[3]

Risk-based thinking and tools can help to further focus thoughts for decision-making. It is important to understand what decisions should be made quickly and which ones can be delegated or even left undecided. When insufficient information is available, individuals should rely on intuition, prior experience, and training to make as educated a decision as possible. In many cases, decisions can be reversed should additional information provide further clarity, allowing for course correction.

While the medical device industry is not the military, and producing medical devices is not war, the methods described in this article can be applied in a variety of situations. The use of risk-based thinking and the structured approach of using risk management tools can provide the framework needed to make decisions under duress.

Thank you to Dr. Jim Vesper and Chris Smalley for their input on this article.


  1. General Colin Powell – A Leadership Primer
  2. Kahneman, D., & Tversky, A. (1979). Prospect Theory: An Analysis of Decision under Risk. Econometrica, 47(2), 263-291.
  3. Kowalski-Trakofler, K.M., & Vaught, C. (2003). Judgment and decision making under stress: an overview for emergency managers. International Journal of Emergency Management, 1(3), 278-289.
  4. Ross, Karol & Klein, Gary & Thunholm, Peter & Schmitt, John & Baxter, Holly. (2004). The Recognition-Primed Decision Model.
  5. Schultz, J.V. (1997). A Framework for Military Decision Making under Risks. [Master’s thesis, School of Advanced Airpower Studies, Maxwell Air Force Base, Alabama].

About The Author

Stacey Largent is a senior consultant at ValSource, Inc. She specializes in quality risk management (QRM), aseptic processing, sterility assurance, and supporting the needs of vaccine/biopharmaceutical product development and manufacturing. She holds a B.S. in biomedical engineering from Drexel University and an M. Eng. in biological chemical engineering from Lehigh University. With almost 20 years in the pharmaceutical and medical device industry, she spent over 16 years at Merck and most recently served as head of its global QRM Center of Excellence. Stacey also has experience in evaluating new technologies, tech transfer, leading complex investigations, and method development. She can be reached at