Because You Needed it Done Yesterday: Keeping Data Contemporaneous

By: Steve Ferrell, Chief Compliance Officer

Time in the lab is a resource as precious as any instrument, test, or person. Many a fine lab technician has felt the need to find more: to time travel, in a sense, by recasting or fitting results into a more convenient window. I'm speaking, of course, about backdating.

Backdating is the original sin in a GxP environment. We are all told in our orientation not to do it, yet it is almost assuredly the first temptation we face. Backdating has many seemingly reasonable justifications: 'no one will know', 'what's the harm in it?', and so on. Here is the thing to confront: there is always harm in it. If you have backdated, one of the following things has occurred…

  1. You weren’t trained at all and didn’t know any better
  2. You knew better but didn’t care
  3. Your training was inadequate
  4. Your CAPA process was not understood or was so onerous that it is, by accidental design, causing team members to circumvent it.

None of those four points is acceptable. Remember that bad process is stupid process. If your mechanism for correcting a dating issue is a painful experience, it is likely to encourage bad behavior.

Maintaining contemporaneous data in the paper lab is almost an act of origami. Pushing papers back and forth in an ordered and timely way is as difficult as it is archaic. While these manual processes strain to be contemporaneous, modern lab software offers a rare opportunity to eliminate that risk and provide e-signed, audit-trailed time and date information for both record creation and, often, much of the record's metadata.
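To make that concrete, here is a minimal sketch (not any particular vendor's implementation; all names are illustrative) of the two mechanisms such software typically relies on: timestamps assigned by the system at write time rather than supplied by the user, and a hash chain over the audit trail so that a backdated or altered entry is detectable on verification.

```python
from datetime import datetime, timezone
import hashlib
import json

def _entry_hash(payload: dict, prev_hash: str) -> str:
    # Hash the entry together with the previous entry's hash,
    # chaining the trail so edits and reordering are detectable.
    body = json.dumps(payload, sort_keys=True) + prev_hash
    return hashlib.sha256(body.encode()).hexdigest()

class AuditTrail:
    """Illustrative append-only audit trail with system-assigned timestamps."""

    def __init__(self):
        self._entries = []

    def record(self, user: str, action: str) -> dict:
        # The timestamp is set here, at write time -- the caller
        # cannot supply (or backdate) it.
        payload = {
            "user": user,
            "action": action,
            "timestamp": datetime.now(timezone.utc).isoformat(),
        }
        prev_hash = self._entries[-1]["hash"] if self._entries else "0" * 64
        entry = {**payload, "hash": _entry_hash(payload, prev_hash)}
        self._entries.append(entry)
        return entry

    def verify(self) -> bool:
        # Recompute the chain; any tampered entry breaks it.
        prev_hash = "0" * 64
        for e in self._entries:
            payload = {k: e[k] for k in ("user", "action", "timestamp")}
            if e["hash"] != _entry_hash(payload, prev_hash):
                return False
            prev_hash = e["hash"]
        return True
```

In a sketch like this, quietly rewriting a date after the fact fails verification, which is exactly the property a contemporaneous record needs.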

But even while automation in the lab can win many of the backdating battles from a workflow perspective, the modern lab can still lose the war when it comes to peripheral – yet critical – quality system processes.

Consider this example: I recently reviewed a lab validation package prepared by a third party; they felt pretty good about their work but wanted a second set of eyes. What I found was arguably the most creative (and egregious) validation package I had ever seen. Sure, it passed the volume test, with lots of screenshots and even a nice summary report, but it was, in effect, a total fabrication and utterly lacking in integrity. Perhaps all of the test steps had passed without incident, but given that the screenshots were captured two months before the validation plan was even approved, it was clear that the only contemporaneous thing about the whole exercise was that they'd managed to zip it all into one file at the same time.

I share that story to illustrate the wider point: fools with tools are still fools. Your lab software is only as good as its configuration, and its configuration only as good as its validation. If any one of those variables is lacking the 'contemporaneous' ingredient, then your data either lacks integrity or, worse, has it but is being devalued by poor peripheral processes.

Time travel is and always has been too risky.