A lot of attention has been given to the role that human factors play in eliminating data integrity failures within organizations. The ISPE Cultural Excellence Report (1) should be required reading for every organization that wants to improve its data integrity and quality practices. But I want to introduce another factor contributing to data integrity issues in the organization: process capability.
Process capability – what has changed over the years?
The importance of process capability was highlighted during a review of QC analytical records at a firm where the ability to generate an analytical value "right first time" was better two to three years ago than it is today. Same method, same equipment, and even a few personnel added to the QC Lab. So what changed? Manufacturing output more than doubled over three years, with more line extensions; Manufacturing now produces more batches and more stability samples. Manufacturing capability improved while QC Lab capability did not; in fact, the equipment is now older and more likely to fail than before. Out-of-date methods could not consistently generate values that could be released the first time, so rework (and investigations) began to eat into bench time. A downward spiral of issues and investigations led to slipped schedules and unhappy customers, putting everyone on edge.
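The dynamic described above, demand growing while lab capacity stays flat, can be illustrated with a toy queue model. All numbers below are hypothetical illustrations, not data from the firm in question.

```python
# Toy model of a QC lab backlog: sample demand grows with manufacturing
# output while testing capacity stays flat. Numbers are hypothetical.

def weekly_backlog(demand_per_week, capacity_per_week, weeks):
    """Track the untested-sample backlog week by week."""
    backlog = 0
    history = []
    for week in range(weeks):
        # Backlog grows whenever demand exceeds capacity; it cannot go negative.
        backlog = max(0, backlog + demand_per_week[week] - capacity_per_week)
        history.append(backlog)
    return history

# Demand roughly doubles over 8 weeks; the lab can still test only 50/week.
demand = [40, 45, 50, 55, 60, 70, 75, 80]
print(weekly_backlog(demand, capacity_per_week=50, weeks=8))
# → [0, 0, 0, 5, 15, 35, 60, 90]
```

Once demand passes capacity, the backlog does not plateau; it compounds every week, which is exactly the downward spiral of slipped schedules the paragraph describes.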
Process Capability and personnel pressures
When any process (purchasing, manufacturing, lab, calibration) is not designed to work right the first time, every time, at the output rate customers demand, the result is pressure on the organization. Personnel are tempted to falsify data or take improper shortcuts to "get it out the door" and relieve that pressure. Culture can mitigate the situation by reminding people that honest reporting of data and issues is expected and supported by the organization, but that does not make the pressure go away; it is only a temporary escape valve. Pressure is minimal when repeatable processes meet the demands placed on them.
Process Capability: the QC lab blind spot
Once the root cause (lack of capability) is understood, its application yields some immediate insights. First, labs are trouble spots for data integrity because they are rarely designed to be capable. Yes, admit it. Manufacturing processes are designed to a known quantity of output; QC Labs, as a routine, are not. From a data perspective, a QC Lab is a miniature production process with people, procedure steps, equipment, vessels, inputs, and output (data). It can be modeled just like a manufacturing operation, and its capability could be known. So why is this so seldom done?
- There are many lab methods—several times more than manufacturing processes;
- QC Labs rarely have support specialists as do their Manufacturing counterparts: planners, equipment specialists, validation specialists, process design/engineering, IT, etc. Many QC Labs do all functions internally;
- Site leaders often have no insight into their QC Labs because their career progression did not include a QC Lab rotation.
The result is QC Labs whose capabilities are not matched to the Manufacturing operations they are tasked to support.
Process Capability: manufacturing’s moment of truth
Lack of capability can also impact Manufacturing operations. Processes created under inflexible time constraints (e.g., first-to-file, exclusivity period) are at risk for this same issue. These processes can lack the rigorous design-space development that assures a high probability of right-first-time results. This observation is consistent with the increased data integrity issues seen in generic firms, where the combination of low margins and pressure to file first can result in manufacturing processes that can be validated but are not capable of the right-first-time output that keeps a plant on schedule.
Death of the quality culture
An unintended consequence of non-capable processes is the death of the quality culture. How? Human nature kills it. In developing a quality culture, personnel are told to "stand up and speak up" when there are issues. Great. If the voice is heard and a solution is developed, the issue is closed, and that complaint is no longer heard. But process capability issues are more involved: they require development, validation, regulatory submission, and approval. This takes time and money; therefore, these issues will continue to be reported, week upon week, until the line supervisor says, "Enough! I hear you! Stop reporting that!" And here is the death of "stand up and speak up": the boss just effectively said, "shut up and sit down". When issues exist that will require months to resolve, it is important to provide some means of keeping the issue visible to personnel, such as a workroom board. The issue, and its estimated date of resolution, must be visible so that personnel stop re-reporting it and frustrating line supervisors. To keep "stand up and speak up" alive, the organization must deliver solutions to the issues that hinder capability; otherwise, human nature kills the culture.
Development – what is the root of the problem?
Little has been written about the link between Development/MS&T teams and data integrity in GMP operations, yet poorly developed processes add considerable pressure to GMP operations when they fail the "right first time, every time" standard. There is a direct causal link between development/support practices and GMP conformance. The robustness of your Development team's work has considerable influence over the data integrity pressures your operations face.
It is acknowledged that many methods are registered and that updating them can require considerable resources. This is where efforts like ICH Q14 (2) can help reduce the unnecessary regulatory burden that now impedes industry progress in adopting better analytical technologies. At the same time, an assessment of the time spent on additional testing, investigations, and documentation may conclude that the investment would be repaid in a short period. My former employer once estimated the time to write a deviation report at 40 hours. That is probably a low estimate in today's environment.
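The cost-benefit point can be made concrete with simple arithmetic. The 40-hour deviation-report figure comes from the estimate above; the batch volume, right-first-time rates, and improvement cost below are hypothetical placeholders.

```python
# Rough payback estimate for investing in method/process improvement.
# 40 hours per deviation report is from the text; batch volume, RFT rates,
# and improvement cost are hypothetical placeholders.

hours_per_deviation = 40
batches_per_year = 300
rft_now, rft_improved = 0.80, 0.98   # right-first-time rates, before and after

deviations_now = batches_per_year * (1 - rft_now)         # ~60 per year
deviations_later = batches_per_year * (1 - rft_improved)  # ~6 per year
hours_saved = (deviations_now - deviations_later) * hours_per_deviation

improvement_cost_hours = 1000  # e.g., redevelopment, validation, submission
payback_years = improvement_cost_hours / hours_saved
print(f"Hours saved per year: {hours_saved:.0f}")     # → 2160
print(f"Payback period: {payback_years:.2f} years")   # → 0.46
```

Even with a substantial investment in redevelopment, the avoided deviation workload can repay the effort in well under a year under these assumptions.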
Looking for capability issues
Is lack of capability an issue in your organization? There are questions and metrics that can provide insights to you:
* What is your site’s conformance to schedule? When operations routinely fail to meet the schedule, is a lack of process capability the root issue?
* What is the “right-first-time” percentage for your various processes? Is it near 100% or more like 80%? Each failure involves extra work by someone, and at some point that extra work is better spent on building a better process;
* Do you routinely have backlogs in any part of your process? Instead of people issues, perhaps the issue is your process;
* Does a bottleneck remain even after people have been added? This is a flag that the issue is not people, but rather process;
* If there have been productivity gains in one area of the operation, were supporting operations also improved at the same pace? Or were they considered at all?
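The questions above translate directly into metrics you can compute from routine records. A minimal sketch with hypothetical data, assuming your batch records carry (or can be given) simple `on_time` and `right_first_time` flags:

```python
# Compute right-first-time and schedule-conformance rates from batch records.
# Record fields and values are hypothetical illustrations.

records = [
    {"batch": "A101", "on_time": True,  "right_first_time": True},
    {"batch": "A102", "on_time": True,  "right_first_time": False},
    {"batch": "A103", "on_time": False, "right_first_time": False},
    {"batch": "A104", "on_time": True,  "right_first_time": True},
    {"batch": "A105", "on_time": False, "right_first_time": True},
]

def rate(records, field):
    """Fraction of records where the given flag is True."""
    return sum(r[field] for r in records) / len(records)

rft = rate(records, "right_first_time")
schedule = rate(records, "on_time")
print(f"Right first time: {rft:.0%}, on schedule: {schedule:.0%}")
# → Right first time: 60%, on schedule: 60%
if rft < 0.95:
    print("Low RFT: look at process capability, not just people.")
```

Trending these two numbers over time, per process, is often enough to show whether a backlog is a people problem or a process problem.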
Linking Process Capability to Data Integrity
Based on such experiences, it is worthwhile to add a question when investigating data integrity issues in your organization: Was a process-related deficiency a potential contributing factor to the issue? It is too simple to blame data integrity failures on behaviors while ignoring the source of the pressures that pushed people toward the undesired actions.
Eighty-five percent of the reasons for failure are deficiencies in the systems and process rather than the employee. The role of management is to change the process rather than badgering individuals to do better. – W. Edwards Deming
Process capability has a direct relationship to data integrity pressures. We need to consider process capabilities in all areas of the site to assure that pressures on our people are minimized, and people can focus on making safe, effective medicines that patients expect.
Special thank you to Bob McDowall (McDowall Consulting Ltd) for reviewing this document prior to submission.
- ISPE Cultural Excellence Report, April 2017. Available from ISPE (ispe.org)
- International Council for Harmonisation. ICH Q14: Analytical Procedure Development. Available at www.ich.org/page/quality-guidelines
Mark E Newton, Heartland QA
Mark Newton is an independent consultant who specializes in data integrity, laboratory informatics, computer systems validation, and Quality. He has 35 years of experience in the pharmaceutical industry.
He can be contacted through LinkedIn: linkedin.com/in/mark-newton-3848236