

by Mark Newton, Principal at HeartlandQA

To a careful observer, the realm of Data Integrity currently offers signals that indicate the direction the regulators, and hence the industry, will be moving.  My crystal ball is currently aglow with these future visions, in no particular order:

1. The Data Periodic Review is coming

The subject was introduced in the recent MHRA Final Guidance for GXP Systems (1).  I discussed this topic last October at the ISPE/GAMP Annual Meeting in San Diego in my presentation on Data Integrity metrics.  It is a logical development: GAMP 5 defines a system lifecycle that includes a Periodic Review to assure that systems remain in a state of compliance; the GAMP Guide: Records and Data Integrity defines a Data Lifecycle that runs in parallel to the system lifecycle.

So, if you have two parallel lifecycles, and one has a periodic review, why would you not expect the other (parallel) lifecycle to have a periodic review? 

Only makes sense.

That was the logic behind my presentation point at the ISPE/GAMP 2017 Annual Meeting, and MHRA sealed the deal with their final guidance.

Write it down. Data Periodic Reviews are on the horizon!

2. Data Integrity remediation and governance will find its way into research

Gupta (2) reported that 40% of clinical researchers were aware of at least one instance of misconduct that was not reported. Wow!

So, there is widespread misconduct in research; no real surprise there.  The FDA alone has written over 130 Warning Letters and 483s in the past three years for integrity violations, and GxP labs are only a small fraction of all laboratories when research labs of every type are counted.  Add the fact that research labs have little oversight and no regulatory inspections, both of which make fraud harder to expose.  Taken together, these factors (more labs, less oversight, no inspections) point to a huge issue waiting to be uncovered.

Detection capabilities have improved: inspectors are learning forensic inspection techniques and applying them with great success.

What is missing?

An organization with a vested interest in vetting the research to protect itself, and the willingness to borrow and use forensic techniques from the regulators.  Perhaps NIH, a publisher, or a university medical center.  Once such an organization conducts forensic data evaluations, misconduct will be exposed, and the research data integrity avalanche will start rolling.

Once rolling, it will be big. Very big.  And expensive.  You have been warned.

3. Data Provenance will gain importance

Think beyond our current focus that data be recorded accurately (along with metadata) at the time of collection.

The next logical step is to assure that we have the “chain of data” from the source (raw) data all the way to where that data is used to make a business decision.

This chain is probably documented today in validation/process documents, known only to the validation team and not to the users who make the decisions.  The next step is to put that chain (the provenance) in a format available to the business user at decision time, or to provide that user with a link back to the source data. At a minimum, business users should be aware that they are about to make decisions with data fields that came from a different source.

Blockchain has been suggested as a possible solution for this issue, and it does have potential, since it embeds the transfer history in the chain itself; however, applications would need to extract that history and make it available to the user at the decision point.

Don’t expect people to jump to another app to read the blockchain provenance; it isn’t going to happen.

Time to plan for complete data provenance, available to the business users who make decisions, at the time the data supports a regulated decision.
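To make the "chain of data" idea concrete, provenance can be sketched as a series of hash-linked records, each pointing back to its predecessor, so an application can verify the chain and walk a business decision back to its raw data. This is only an illustrative sketch, not a validated implementation or any specific blockchain product; the record fields and helper names here are my own assumptions.

```python
import hashlib
import json

def record_hash(record: dict) -> str:
    """Deterministic SHA-256 hash of a provenance record."""
    return hashlib.sha256(json.dumps(record, sort_keys=True).encode()).hexdigest()

def add_step(chain: list, actor: str, action: str, data_ref: str) -> list:
    """Append a provenance step, linked to the previous record by its hash."""
    prev = record_hash(chain[-1]) if chain else None
    chain.append({"actor": actor, "action": action,
                  "data_ref": data_ref, "prev_hash": prev})
    return chain

def trace_to_source(chain: list) -> list:
    """Walk from the decision record back to the raw data, verifying each link."""
    for later, earlier in zip(chain[1:], chain):
        if later["prev_hash"] != record_hash(earlier):
            raise ValueError("Provenance chain is broken: hash mismatch")
    return [step["data_ref"] for step in reversed(chain)]

# Hypothetical example: raw instrument file -> LIMS result -> release decision
chain = []
add_step(chain, "HPLC-07", "acquired", "raw/run_123.cdf")
add_step(chain, "LIMS", "calculated assay", "lims/result_456")
add_step(chain, "QA reviewer", "batch release decision", "batch/B-789")

print(trace_to_source(chain))
# -> ['batch/B-789', 'lims/result_456', 'raw/run_123.cdf']
```

The point of the sketch is the last call: at decision time, the user (or the application acting for them) can see the full path back to the source record without jumping to another tool, and any tampering upstream breaks the hash check.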

In all of these, we are drawing conclusions from events already happening around us.  Be wise and start planning now.  Avoiding future change only makes it more expensive—and harder to take.


Mark Newton (retired Associate Sr. Consultant, QA, at Eli Lilly) has 30+ years of experience in pharmaceuticals, first as a laboratory scientist and then as a QA professional, supporting LIMS systems, standalone lab instruments, lab informatics and metrics, validation, quality systems, and data standards.  He has been deeply involved in data integrity and data integrity training for people in QC laboratories, manufacturing, and IT.
– Current Chair of the ISPE Global Documents Committee
– Current co-leader of the ISPE/GAMP Data Integrity Special Interest Group
– Co-author of the “GAMP Guide: Records and Data Integrity,” April 2017
– Co-author of “Harmonizing USP <1058> and GAMP for Analytical Instrument Qualification,” Schuessler, Newton, Smith, Burgess, McDowall, Pharmaceutical Engineering, Jan/Feb 2014
– Co-editor of the GAMP Good Practice Guide “A Risk-Based Approach to Compliant Computerized Laboratory Systems,” Nov. 2012