Here’s How to Achieve TMF Excellence

Since becoming a freelance consultant in 2004, I have been involved with around 50 different projects to improve practices for managing clinical trial records (aka the trial master file or TMF). In many of these, the company concerned had chosen an electronic document management system for TMF management (eTMF) and required my assistance to implement the technology as efficiently and effectively as possible. This involves re-engineering processes to align with the functionality of the technology and implementing various change management activities. The projects were much more complex than this, but in a nutshell, that was usually the primary objective (simplified for the purposes of this blog!).

Several months ago, I took a step back and assessed the effectiveness of those projects. In many cases, the changes introduced were not delivering the results that I expected, and it was apparent that the primary reason was the process the company had chosen to implement. I never impose a process on a client….. it is always their choice, and I do my best to optimise the choice they have made.

In addition, last year I conducted research across the industry to try to determine how successful companies were at managing their trial master files and, if they were struggling, what the root cause was. This research included three workshops that I conducted at industry conferences (DIA, IQPC and ExL-Events), involving almost 100 participants in total. My analysis shows:

  • despite efforts to improve processes and adopt best-in-class technology, companies are still not achieving high levels of compliance with regulatory requirements;
  • where some success has been achieved, it has usually required significant ongoing expenditure on quality control activities (i.e. high financial cost); and
  • improvements in TMF completeness and quality are often only achieved through constant monitoring of quality and laborious checks.

Does this sound familiar? Is this your experience too? Have you implemented an eTMF but are finding that it is still a real struggle to achieve any significant improvement in TMF quality or completeness without constantly badgering people to conduct their quality reviews?

The good news is that the research also showed what the solution is….. and VERY clearly! I am very excited to start sharing this with colleagues and helping to turn their situations around. I am confident that I have the solution to this puzzle…. and it does not depend upon spending huge sums of money either! If you want to be part of this, just give me a call and I’ll be only too pleased to let you in on the secret!


Ready for a TMF Masterclass?

After a break of almost two years, we are planning to schedule another TMF Masterclass. Is this something that you or a colleague would benefit from? If so, please read on.

This will be an intensive 1-day, interactive workshop for colleagues who are already familiar with the fundamental principles and requirements for the Trial Master File (TMF). It will take your knowledge to the next level by examining the requirements applicable to an eTMF environment. Perhaps you are about to start considering the implementation of an eTMF solution, or you are struggling to make your eTMF work effectively within your company? If so, this workshop is definitely for you!

We will look at a range of different topics and, where possible, will adjust the content to meet the specific needs of those attending. We’ll likely be covering topics such as:

  • metadata and naming conventions;
  • quality control – when and how to manage quality;
  • record lifecycle management with an eTMF;
  • signature technologies;
  • how to increase levels of compliance;
  • how to differentiate between systems during the procurement process.

Space is limited for this event, so please consult colleagues within your company quickly to reserve your place. The date is likely to be during the last week of March, but we are currently flexible on the timing until we have the required minimum number of attendees. In the first instance, please register your interest in the Masterclass (no financial commitment) and we’ll get back to you.


MHRA Publishes Latest GCP Inspection Findings

The MHRA today released its annual report of GCP inspection findings on its website. It is interesting to note two critical findings for data integrity issues…. something that industry has been talking about a great deal over recent months. It would be wise to conduct a comprehensive review of the MHRA draft guidance on this topic!

In addition, the MHRA cited a critical finding for the management and maintenance of the trial master file (TMF). The finding included the following observations:

  • The TMF was not defined (i.e. there was no quality system record to confirm all the systems that held TMF records, e.g. a regulatory document system).
  • TMFs provided for inspection were incomplete (missing records).
  • There was a lack of a quality control (QC) process for the TMFs.
  • The DIA TMF Reference Model had been implemented (although with modifications) via a table of contents for each trial to identify the location of TMF essential documents in defined sections; however, the structure of the actual TMF electronic folders had not been changed to reflect the model.
  • The TMF table of contents was found to be unreliable, as the location of documents was not accurate.
  • The document date was held only in the file name, and the date recorded in the system was either the upload date or the finalisation date. For this reason, it was not possible to sort documents into document date order (see the sketch after this list).
  • There was a lack of integration with other TMF systems, i.e. there were no links or placeholders directing the inspectors to the relevant repository or system for the relevant documents and data.
  • It was not possible to set the TMF to archive status to prevent further changes.
  • Documents were not consistently held in the TMF.
  • Previous versions of documents were not present.
  • There was extensive duplication of scanned documents.
  • There was evidence that the uploading of documents was not being undertaken in a timely manner.
  • If a document was not “finalised”, the inspectors could not view it.
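
As an aside on the document date finding: if the authored date of each document is captured as a structured metadata field rather than buried in the file name, ordering by document date becomes trivial. Here is a minimal sketch in Python; the field names and sample records are hypothetical, purely for illustration.

```python
# Minimal sketch: storing the document date as structured metadata (rather
# than only in the file name) makes document-date ordering straightforward.
# Field names and sample records are hypothetical.
from datetime import date

tmf_documents = [
    {"file_name": "protocol_2016-03-01.pdf",
     "document_date": date(2016, 3, 1),   # authored/approved date
     "upload_date": date(2016, 9, 14)},   # when it reached the eTMF
    {"file_name": "ethics_approval_2016-01-20.pdf",
     "document_date": date(2016, 1, 20),
     "upload_date": date(2016, 9, 15)},
]

# Sort on the document-date metadata field, not the upload date or file name:
for doc in sorted(tmf_documents, key=lambda d: d["document_date"]):
    print(doc["document_date"].isoformat(), doc["file_name"])
```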

Many of these observations reflect what is now becoming our common industry understanding of MHRA expectations for a TMF…. so they shouldn’t come as a surprise anymore. However, it may be worth passing comment on a few of these topics:

  • I found it interesting to note the comment regarding not being able to view a “finalised” document. My assumption – and this is only conjecture – is that this does not imply a requirement to view draft documents, but simply reflects the fact that a final document had not had its status changed in the eTMF to “finalised” and was therefore not accessible. If, however, this relates to an expectation to view non-final (i.e. draft, unapproved) documents in the eTMF, that would represent a major shift.
  • As I predicted, the MHRA seems to be placing greater focus on archive processes and on how systems comply with the regulatory requirements for an archive. This means being under the control of an archivist and having the necessary security and access controls (amongst many other requirements!).
  • The lack of integration comment is also an interesting one…. I would like to know more about it. For sure, the location of all TMF content must be very clear and it must be readily and directly accessible wherever it may be stored. But this is the first time I’ve come across any kind of expectation that an eTMF should be integrated with sources of document content held elsewhere. Placeholders or a TMF index…. yes (a minimal sketch of such a placeholder follows below).
  • And it was great to see that the TMF Reference Model was in use…. though disappointing that there seems to have been some inconsistency in the way it was adopted.
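
On that point, a placeholder need not involve any technical integration at all: a simple structured record in the eTMF that tells an inspector exactly where the content lives may be all that is required. Below is a minimal sketch of such a placeholder record in Python; all field names and values are hypothetical illustrations, not a prescribed model.

```python
# Minimal sketch of a TMF placeholder record: an entry in the eTMF that
# directs the inspector to the external system holding the actual content.
# All field names and values are hypothetical.
from dataclasses import dataclass

@dataclass
class TMFPlaceholder:
    artifact: str         # e.g. a TMF Reference Model artifact name
    tmf_section: str      # where the placeholder sits in the TMF structure
    external_system: str  # the system of record holding the content
    locator: str          # identifier or path within that system

placeholder = TMFPlaceholder(
    artifact="Safety Reports",
    tmf_section="Zone 2 - Central Trial Documents",
    external_system="PharmacovigilanceDB",
    locator="study-123/safety-reports",
)
print(f"{placeholder.artifact}: held in {placeholder.external_system} "
      f"under {placeholder.locator}")
```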

Go to the MHRA website to view the full report.


What’s in a name?

“A rose by any other name would smell as sweet” is a frequently referenced quote from William Shakespeare’s Romeo and Juliet…. meaning that the name is not as important as the object itself. This blog post is a departure for me. Rather than bring you some great announcement, revelation or insight into some aspect of records management, I’m going to jot down a few thoughts about an issue I’m struggling with a little at the moment. I hope you stick with me on this!

An industry group I support and am involved with is proposing to change its name. It has had the same name since it formed over 35 years ago, and the name was a reasonable reflection of what the group did at that time (though not perfect). However, over time – and especially over the last 10 years or so – the focus of the group has changed. Reflecting the needs of industry, the changing regulatory framework and the way records management has evolved, the group now meets the needs of a much broader spectrum of individuals and the work they do. As a result, the name no longer reflects what the group is about. It is also a sad but true fact that in some quarters, the name is actually an object of fun, embarrassment or ridicule.

Does it matter? I think the answer depends on your perspective.

From the perspective of a member of the group – just as in Romeo and Juliet – the name should not really matter. I join a group because of what the group does and what it stands for…. the name is rather inconsequential. I couldn’t really care less if AIIM, ARMA, IRMS, RQA, etc. changed their names (as most of them actually have done in the past!). What matters is what these organisations do.

But from the perspective of the group itself, the name definitely matters. The name is a major marketing tool, telling prospective members what the group does. It signifies to the stakeholders and bodies that the group is trying to influence and engage with that the group has relevance. It attracts attendees to events, meetings, conferences and training courses by aligning the group with the professional development needs of those individuals. Thus, a name may carry valid historical significance, but if it no longer resonates with most members or with the population that the group is aiming to attract….. it needs to change.

So this blog post comes out of a sense of frustration with those who cling on to a name that represents the past rather than seize the opportunity to take us forward into the next 35 years. For me, the past should be respected, but the future should be prepared for.


EMA Publishes New TMF Guideline for Consultation

The European Medicines Agency (EMA) has published (12-April-2017) a new guideline “on good clinical practice compliance in relation to trial master files” (TMF). This guideline incorporates and updates the guidance issued previously in the EMA draft reflection paper on trial master files, includes guidance from the ICH E6 revision, and provides additional guidance from the EMA based on issues experienced by industry.

I will provide a detailed assessment of the guidance over the next few days, but here are just a few immediate thoughts based on my first read:

  • This is a draft for consultation, and there may therefore be changes in the final guidance document that is published towards the end of the year. So if there are areas of concern in the content, these may change.
  • The draft provides some very helpful additional guidance that is not present in the draft reflection paper, e.g. on the destruction of originals and on the maintenance of multiple electronic systems.
  • The draft also includes some areas that need to be challenged during the consultation process, e.g. the obligation to maintain the provenance of emails conflicts with the statement that only the final email in an email chain need be retained.

The draft guideline can be downloaded from the EMA website or HERE.


MHRA Draft Guidance on GxP Data Integrity….. a Few Thoughts

In July 2016 the Medicines and Healthcare products Regulatory Agency (MHRA) released a definitions and guidance document on the topic of data integrity for GxP regulatory data. The public consultation period closed on 31 October 2016. However, I’ve been hearing increased interest in this topic over recent days, so I thought I would provide my perspective.

I provided an official response to the consultation request and if you are interested you can read it here. In summary, I welcome the MHRA’s effort to provide some clarity on this topic, as well as to promote consistency between the different GxPs. However, I have a big concern about proportionality and am worried that once this guidance becomes final, industry will respond in a knee-jerk fashion and apply disproportionate measures to ensure data integrity and alignment with the guidance. We need to remember that many of the electronic systems in use hold “regulated data” in some shape or form, but for many of these systems, data integrity errors will have little or no impact on the data reported to regulatory agencies or, ultimately, on the safety and well-being of the public. I would not like to see “heroic measures” employed by industry to secure data integrity, or regulatory inspectors expecting perfect systems, when the data within the IT systems really does not justify this approach. Remember, it is the regulatory agencies that are encouraging us to “take a risk-based approach”!

In addition, I thought there were many inconsistencies and ambiguities within the draft document. For example, three different terms are used interchangeably to describe a copy: true copy, certified copy and verified copy. The term “dynamic data” is introduced within the guidance without a definition and is then used in a way that does not give the reader a clear understanding of what is meant. And the term “durable storage” is used without any explanation of exactly what it means.

Finally, I think the requirements described for the long-term retention of audit trails, electronic signatures and other “dynamic data” are in conflict with the requirement to maintain accessibility and usability over an extended period of time (in excess of 25 years for GCP-regulated data). Given that we’re being encouraged to adopt a risk-based approach, I would expect to see some pragmatism in the document to give readers some realistic options for long-term digital preservation, whilst maintaining data integrity.

I look forward to seeing the final published guidance document!


ICH GCP Guidelines (E6) Addendum Now Available

So, after securing approval at the November ICH meeting in Osaka, the revised text of the ICH guidelines for Good Clinical Practice – otherwise known as ICH E6(R2) – has been published on the ICH website….. and is also available to download HERE. My previous blog posting highlighted some of the issues that I thought needed to be addressed. Although many were not resolved by the final approved wording of the guideline (no surprise there!), I was pleased to see some changes since the initial draft was circulated for consultation. I have highlighted these below.

The revised guideline now includes a definition for a “certified copy” and provides for such copies replacing the originals. Section 8.1 states:

“When a copy is used to replace an original document (e.g., source documents, CRF), the copy should fulfill the requirements for certified copies”.

This statement suggests strongly to me that it is acceptable to generate a certified copy (e.g. a digital scan of a paper document) and for the copy (the digital file) to completely replace the original paper version, i.e. the paper original can be destroyed. If the paper original is being retained, then the copy is not actually replacing anything….. it is being filed in addition to the original! You may want to retain originals for other reasons (e.g. originals may have greater evidential weight if relied upon in court) but this is not needed to comply with the guideline here.

The final definition of a certified copy is slightly different from that first proposed but I think the subtle change is extremely important. The original wording proposed was:

“A paper or electronic copy of the original record that has been verified (e.g. by a dated signature) or has been generated through a validated process to produce an exact copy having all of the same attributes and information as the original.”

Two words – “exact” and “all” – are significant and were mentioned in my feedback to ICH during the consultation period. “Exact” is a very precise term, as is “all”. It is often the case that conversion of documents from one format to another results in a copy that is not 100% identical to the original in EVERY respect. But it still contains all of the information, content and structure that is critical from a GCP compliance perspective and to support patient safety and data integrity. So the revised definition takes account of this nuance of document conversion:

“A copy (irrespective of the type of media used) of the original record that has been verified (i.e., by a dated signature or by generation through a validated process) to have the same information, including data that describe the context, content, and structure, as the original.”

So, I would argue that a PDF rendition of an email potentially loses some metadata components related to the email header (and so is not an “exact copy” under the original definition) but retains “the same information” contained within the email, together with its structure and context, and would therefore constitute a certified copy (if produced through a validated process); a minimal sketch of this kind of comparison follows below. And of course the ICH definition allows for certification via use of a validated process, not requiring a dated signature to certify each and every copy! It is worth noting that the current FDA definition of a certified copy differs from the ICH addendum definition, but the FDA has stated that the “draft guidance, when finalized, will represent the current thinking of the Food and Drug Administration (FDA or Agency) on this topic”.
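
To make the “same information” test concrete, here is a minimal sketch of the kind of comparison a validated conversion process might perform for an email and its rendition. The choice of required header fields, and the idea that equivalent attributes could be extracted from the PDF side, are assumptions made purely for illustration; this is not a prescribed method.

```python
# Minimal sketch of a "same information" check between an original email
# and the attributes captured from its rendition. The required fields and
# the extraction of attributes from the rendition are illustrative
# assumptions only.
from email import message_from_string

# Header fields treated here as the "context and structure" to preserve:
REQUIRED_FIELDS = ("From", "To", "Date", "Subject")

def email_attributes(raw_email: str) -> dict:
    """Extract the header fields and body text that the copy must retain."""
    msg = message_from_string(raw_email)
    attrs = {field: msg.get(field) for field in REQUIRED_FIELDS}
    attrs["body"] = msg.get_payload()
    return attrs

def same_information(original: dict, rendition: dict) -> bool:
    """True if the rendition carries the same information as the original."""
    return all(original.get(k) == rendition.get(k)
               for k in (*REQUIRED_FIELDS, "body"))

raw = """From: investigator@site.example
To: sponsor@cro.example
Date: Mon, 07 Nov 2016 09:15:00 +0000
Subject: Protocol deviation report

Please find the deviation summary below.
"""
original_attrs = email_attributes(raw)
# In practice the rendition's attributes would be extracted from the PDF;
# here the same values are reused simply to show a passing check.
print(same_information(original_attrs, dict(original_attrs)))  # True
```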

The second small but significant change that is of interest from a records management perspective is the specific inclusion of the requirement for “systems for archiving”. The original draft addendum included the following statement in section 8.1:

“The sponsor and investigator/institution should maintain a record of the location(s) of their respective essential documents. The storage system (irrespective of the media used) should provide for document identification, search and retrieval.”

The final text states:

“The sponsor and investigator/institution should maintain a record of the location(s) of their respective essential documents including source documents. The storage system used during the trial and for archiving (irrespective of the type of media used) should provide for document identification, version history, search, and retrieval.”

So there are three new concepts that are emphasised here:

  • The TMF inventory or index that is maintained both by the sponsor and by the investigator/institution must include all records considered part of the TMF, including source documents.
  • The issue of long-term retention and preservation of TMF content must be addressed, and the systems selected for archival purposes must meet the same requirements for document identification, version control, search and retrieval as the systems used during the trial.
  • Good version control and the maintenance of an immutable audit trail of version history must be assured throughout the retention period – including in those systems used for archiving. (A minimal sketch of an index entry capturing these concepts follows this list.)
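
To picture these requirements together, here is a minimal sketch of what a single TMF index entry might capture in order to support identification, version history, search and retrieval, during the trial and into the archive. The structure and field names are my own illustration, not a model prescribed by the guideline.

```python
# Minimal sketch of a TMF index entry reflecting the revised section 8.1:
# identification, version history, search and retrieval, both during the
# trial and in the archive. Field names are illustrative only.
from dataclasses import dataclass, field
from datetime import date

@dataclass
class DocumentVersion:
    version: str
    document_date: date
    change_reason: str  # feeds the immutable version-history audit trail

@dataclass
class TMFIndexEntry:
    document_id: str
    title: str
    location: str              # system or archive holding the document
    is_source_document: bool   # 8.1 now explicitly includes source documents
    versions: list = field(default_factory=list)

entry = TMFIndexEntry(
    document_id="TMF-001",
    title="Signed protocol",
    location="eTMF-archive",
    is_source_document=False,
    versions=[
        DocumentVersion("1.0", date(2016, 2, 1), "Initial approved version"),
        DocumentVersion("2.0", date(2016, 8, 15), "Substantial amendment 1"),
    ],
)
print(entry.title, "- latest version:", entry.versions[-1].version)
```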

The GCP addendum includes many other changes, some of which are directly relevant to records management and TMF management and can easily be found by referring to the additional sections in the guideline. I have just highlighted here what I consider to be two of the most significant changes for TMF management, as they relate to concepts that are often misunderstood and variously interpreted across industry.
