Guest Column | July 24, 2020

Lessons Learned From An Outsourced Validation Project That Came Off The Rails

By Allan Marinelli, Quality Validation 360 Inc.


The pharmaceutical, biopharmaceutical, and medical device industries have been in constant flux in recent years as a result of organizational changes, mergers/acquisitions, and the introduction of new technology systems. Sponsor companies must learn to swiftly adapt to meet the demands of these new market conditions, and this often requires working with consultants while simultaneously improving upon their own internal capabilities.

Recently, I was hired by a sponsor company (the “client”) to support initiatives related to computer system validation (CSV), information technology (IT) validation, and automation system validation, specifically:

  1. to support its existing infrastructure services group, since the company lacked the resources to keep up with increasing demand;
  2. to support the CSVs, infrastructure validations, and automation system validation efforts that were expected to go live within a certain time frame prior to inspection;
  3. to provide advice and recommendations for increasing efficiency and effectiveness while maintaining/sustaining quality directives and quality paradigms, including using science- and risk-based tools;
  4. to always use the most current standards when conducting validation activities and applying statistical approaches.

This article discusses my experiences and perspectives on this engagement, including best practices and lessons learned that other life sciences sponsor companies can apply in their interactions with their consultants.

I was initially hired as a consultant in a QA CSV/IT capacity at one of the client’s U.S. subsidiary companies. My mandates were as follows:

  1. Write/revamp general (IT/CSV, etc.) SOPs, governance plans, and data integrity-related SOPs:
    1. Since there was a lack of instruction on incorporating data integrity doctrines, CFR Part 11 requirements, audit trail reviews, or how to conduct those reviews, I redlined the documents; approximately 90 percent of the reviewed documents were incorrect or did not meet current technology/application standards.
  2. Conduct nine system CSV/IT assessments, each containing 15 pages, by meeting with the respective “owners” of each system in question.

As a result of successfully completing these tasks, the client extended my contract several times beyond the original scope of work. In addition, the client asked me to work at the parent company’s site outside the U.S., after the assigned tasks/projects at the subsidiary were completed.

Unclear SOPs Lead To Inefficiency, Interruptions

In my first week at the parent company, I began reading and understanding the SOPs that were part of the training required to acquire signing authority in the QA CSV/IT role. I noticed that the SOPs were written in two languages and that the English version was positioned toward the end of the combined document. I also observed grammatical and sentence structure errors, along with a lack of logical flow in the technical content with respect to CSV, IT validation, and automation systems validation. Furthermore, the paragraphs describing the purpose or intended use of each SOP were vague at best.

This resulted in a culture in which many of the full-time workers, including many of the most experienced scientists, interrupted management daily to request additional clarity. The SOPs, even though approved by management, were not clearly written to reflect what needed to be completed chronologically, in parallel, or in series.

For example, one of the SOPs for analyzing samples directed adding hydrochloric acid at a certain step, using as much as needed to produce X reaction. However, the SOP did not indicate what would happen if one added too much hydrochloric acid, nor the minimum and maximum allowable amounts to be added at that step. A more specific instruction, such as stating that hydrochloric acid should be added until the pH reaches 4, would keep scientists from continually coming to the management office with questions. The lack of specific direction was causing numerous interruptions daily, reducing the ability of both management and scientists to work efficiently and effectively.

Lessons Learned

Rather than assuming that what was initially written for the research and development (R&D) department many years ago would be “good enough” for analyzing cGMP engineering or manufacturing runs, I recommended a total revamp of the existing SOPs, whether for laboratory or manufacturing applications, before using them in those applications.

However, if time is restricted, such that these SOPs must be adjusted/corrected in parallel or on the fly, then a designated leader should be assigned for each segment of the manufacturing process. This should be someone in a Project/Engineering Manufacturing Management role or other applicable authority (not a QA lead with insufficient manufacturing, validation, or engineering experience) who could discuss updates to the SOPs with the relevant designee or author, daily or two to three times a week. The SOPs would then remain “work in progress” versions until they are ready to be submitted, as quickly as possible, in their finalized state as the next revision.

NOTE: Updates for the next revision may follow anywhere from one week to several months later, depending on the complexity and risk involved in approving the selected SOPs and on the bandwidth available.

I also recommended that the client set the primary-language conundrum aside, along with other non-value-added, unsubstantiated opinions or assumptions. Instead, the company should focus on what was clearly captured in writing and in the daily, weekly, and monthly verbal communications agreed upon by the entire team, including management, as derived from the various brainstorming, educational/training, and writing sessions.

Adhering to this practice and maintaining discipline with respect to the aforementioned recommendations helps the team focus on the agreed list of tasks, which in turn increases the probability that the project will be completed successfully. At the same time, the company can maintain a “baseline quality” that is subsequently improved in the next review/approval iteration cycle by using the system design life cycle approach. In this case, however, changes were made months later, authorized by a single person (an inexperienced QA lead with insufficient overall manufacturing, validation, engineering, and FDA inspection experience) without the agreement of the other management stakeholders. This threw away all the previous lessons learned and tasks that had been agreed upon months before by all stakeholders involved. The QA lead’s only reply was “We do not have time,” which angered the other stakeholders.

Lack Of Alignment Leads To Deviations, Delays

I was initially given authorization to promptly review, provide recommendations on, and approve computer system validation protocols. I recommended the following to avoid unnecessary deviations that could cause additional unforeseen delays in the project:

  1. The Instruction/Action column must be clearly written, including applicable sub-steps or prerequisites, so that completing each action in a step produces the expected output as written.
    a. It was observed in the pre-execution phase that the initial instructions contained only one or two clauses, yet six preliminary actions needed to be performed as part of those clauses or prerequisite steps. As a result, when the user attempted to execute the protocol, the expected output was not reached. This resulted in many failed steps that would have passed had the instructions been correctly written in the first place.
    b. This lack of alignment between the Instruction/Action column and the Expected Result column affected at least 90 percent of that protocol. Therefore, I put together examples of what to do and what not to do in a PowerPoint presentation to train the team and other stakeholders.
    c. It was observed during the execution phase that the Actual Result column lacked the information needed to correspond with the Expected Result column, leading to more questions than answers. It should have been self-evident that any information transcribed into the Actual Result column must be supported by clear, readable screenshots.
    d. This protocol, among others on the list, needed to be rewritten and re-executed based on the points mentioned in (a), (b), and (c), coupled with the lack of continuity between the sequence of steps. A 30-minute training session was highly recommended to avoid additional delays.

However, once it became apparent that many other protocols would have to be rewritten in this fashion, the QA lead told me that, due to time constraints, I should simply sign/approve the next stack of protocols unless the steps truly did not work, even though the protocols did not conform to what I had initially suggested.

Lessons Learned

The remainder of the protocols were executed as outlined by the QA lead, but the protocols that I had initially indicated needed to be rewritten were not rewritten. This led to the generation of unnecessary deviations, and the project subsequently came to a screeching halt.

It is always best practice to write test scripts as coherently as possible, so that all intended actions, including prerequisite steps, expected results, and actual results (with corresponding clear, readable screenshots), are in alignment with the intended uses of the protocol. Otherwise, a number of deviations may need to be written, causing even more delays. Coherent scripts also make it easier for the relevant stakeholders to explain the results to the auditee and/or auditors during an FDA inspection.
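To make the alignment concrete, the following is a minimal sketch of how a test step could be represented and checked programmatically before execution or QA sign-off. It is illustrative only: the Python structure, the field names, and the sample step are my own assumptions, not the client's protocol format or the output of any particular validation tool.

    from dataclasses import dataclass, field
    from typing import List

    @dataclass
    class TestStep:
        """One executable step in a CSV/IT test script."""
        step_id: str
        prerequisites: List[str]        # preliminary actions that must be completed first
        instruction: str                # what the tester actually does
        expected_result: str            # the specific, verifiable output
        actual_result: str = ""         # filled in during execution
        screenshots: List[str] = field(default_factory=list)  # evidence references

    def review_step(step: TestStep) -> List[str]:
        """Return review findings for a single step (empty list means no findings)."""
        findings = []
        if not step.prerequisites:
            findings.append(f"{step.step_id}: no prerequisites listed; confirm none are needed.")
        if not step.expected_result.strip():
            findings.append(f"{step.step_id}: expected result is missing or not verifiable.")
        if step.actual_result and not step.screenshots:
            findings.append(f"{step.step_id}: actual result recorded without supporting screenshots.")
        return findings

    # Hypothetical example step, for illustration only
    step = TestStep(
        step_id="TS-010",
        prerequisites=["User account created", "Test data set loaded"],
        instruction="Log in as the QA reviewer and open audit trail report R-01.",
        expected_result="Report R-01 opens and displays all entries for the test data set.",
        actual_result="Report R-01 displayed all 25 entries.",
        screenshots=["TS-010_screen1.png"],
    )
    for finding in review_step(step) or [f"{step.step_id}: no review findings"]:
        print(finding)

In this sketch, a step whose actual result lacks supporting screenshots, or whose instruction lists no prerequisites, would be flagged before sign-off, which is exactly the kind of misalignment described above.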

Applying Current Statistical Sampling Standards

The client wanted to establish a sampling plan instead of continuing to conduct a manual review of the data output from its enterprise resource planning (ERP) software. Based on its research, it decided to use the ASQ Z1.4-2003 standard.
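For context on what such a plan does, the sketch below shows the standard single-sampling acceptance calculation using a binomial model. It is a generic illustration, not an excerpt from ASQ Z1.4: the sample size n and acceptance number c shown are placeholder values, whereas a real plan would take them from the standard's tables based on lot size, inspection level, and the chosen acceptance quality limit (AQL).

    from math import comb

    def acceptance_probability(n: int, c: int, p: float) -> float:
        """Probability of accepting a lot under a single-sampling plan:
        inspect n records and accept the lot if the number of defectives
        found is <= c, assuming each record is defective independently
        with probability p (binomial model)."""
        return sum(comb(n, k) * p**k * (1 - p)**(n - k) for k in range(c + 1))

    # Hypothetical plan: sample 80 ERP records per lot, accept if <= 2 defects are found.
    n, c = 80, 2
    for p in (0.005, 0.01, 0.025, 0.05):
        print(f"defect rate {p:.1%}: P(accept) = {acceptance_probability(n, c, p):.3f}")

With the placeholder values of n = 80 and c = 2, the loop shows how the probability of accepting a lot falls as the underlying defect rate rises, which is the trade-off any chosen sampling plan has to justify against a full manual review.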

In performing my own research, I discovered that an updated version of the standard had been published in 2018 (ASQ Z1.4-2003 R2018), which I brought to the client's attention. I recommended that the client either (A) prepare a defense showing that the changes do not affect the validity of its current CSV approach, in the event the FDA raises any questions, or (B) purchase the inexpensive current online version for immediate use. However, the client was in a hurry to complete the project; they replied, in essence, that "statistics is statistics, and methods never change" and decided to use the 2003 version of the ASQ standard without preparing a defense.

Lessons Learned

It is always advisable to use current standards when conducting computer system/IT validation, or at a minimum to develop a defense for using an older version, so you are prepared should auditors or FDA inspectors question your approach. If the FDA inspector deems your rationale unacceptable, they may question other aspects of your validation protocol that they did not originally intend to examine.

Conclusion

The lessons learned described in this article led to recommendations to improve the way the company was conducting its business so it could move forward with the smallest possible number of FDA 483 observations and warning letters, as well as with increased project efficiency and effectiveness. Other sponsor companies that want to conduct business in the U.S. should follow the best practices outlined in this article to help ensure the process of obtaining licensure for their facilities goes quickly and smoothly.

About The Author:

Allan Marinelli has over 25 years of global cGMP experience (FDA, EMA, SFDA, KFDA, WHO, etc.). He is currently president/CEO of Quality Validation 360 Inc., providing consultation services to the (bio)pharmaceutical, medical device, and vaccine industries. Marinelli has authored 59 publications, including peer-reviewed papers (Institute of Validation Technology), chapters of PDA books, articles in ASQ journals, and online articles (Life Science Leader, Pharmaceutical Online, Bioprocess Online, Outsourced Pharma, etc.). He has provided comments to ISPE’s Baseline Guides and recently conducted reviews of the new ISPE Good Practice Guide drafts on data integrity by design and equipment reliability. You can contact him at amarinelli360@gmail.com and connect with him on LinkedIn.