EMA Cancels Revision of Reflection Paper on Trial Master Files

On 1st February 2013 the European Medicines Agency (EMA) released for public consultation a draft reflection paper on GCP compliance, including management of trial master files. Since the three-month consultation period ended on 30th April 2013, industry has been waiting for the reflection paper to be finalised.

This week – on 15th June 2015 – EMA announced the following:

“It has been decided that the revised version of the TMF document, based on the comments collected during the public consultation, will be incorporated into a guidance on TMF as part of the work related to the implementation of the new Clinical Trial Regulation (EU) 536/2014. A public consultation on the new guidance will follow in due course.”

So, having waited for over two years since closure of the initial consultation period, industry must now wait still longer for this important piece of guidance to be finalised. In particular, we are keen to see whether EMA has responded favourably to the following criticisms of its first draft:

  • the suggestion that all correspondence should be maintained “in an appropriate section of the file”, rather than classified according to the content of the correspondence;
  • there is no differentiation between “obsolescence” and “deterioration” in relation to digital archiving; this is an important distinction to emphasise because the two problems require different preventative and remedial solutions;
  • it is important that the documentation generated by the CRO in following its internal procedures is retained for the required retention period, and sponsors must consider this documentation part of the TMF;
  • there is no mention of draft documents. It is recommended that the reflection paper, as a guidance document, specifically exclude draft documents from being considered TMF content. This is particularly relevant for eTMF systems, where documents are created, drafted and reviewed within the same application as the final TMF content. It is important that draft documents are not subject to the same scrutiny as final documents;
  • electronic correspondence (emails) should be retained electronically, provided the requirements for eTMF and electronic archiving are considered;
  • the recommendation to use the .pst format rather than PDF is not sound, as .pst is a Microsoft proprietary format and is not suitable for preserving content for the period of time required by regulations. Other e-mail formats, such as Lotus Notes NSF files, would have to be converted to Outlook format before saving. The .pst format is often used informally by staff for saving complete e-mail folders and could lead to irrelevant e-mails being saved in a disorganised manner. Whilst conversion to PDF results in a small loss of metadata related to internet routing, taking a risk-based approach it is recommended that the PDF format is used for the long-term preservation (archiving) of e-mails (a simple conversion sketch follows this list);
  • it is not clear whether the intention is to use the term “electronic signature” rather than “digital signature”. Electronic signatures are typically created during workflows whereas digital signatures may be used with or without workflow;
  • FDA guidance on scanning specifically states that 100% page-by-page QC is not expected. It would be helpful to include a similar statement in this document, i.e. that a risk-based approach should be used to determine the appropriate level of QC required;
  • the requirement to restrict access to archived TMFs to the named archivist only does not take account of the additional protection afforded to electronic records through use of read-only user accounts. If a read-only account is used, there does not appear to be any justification for restricting access to named archivists only;
  • considering that the typical retention period for TMF documents is in excess of 25 years, transportable media such as external drives, pen drives and even CD/DVD are not considered appropriate for this purpose. This conclusion is supported by the Digital Preservation Coalition and digital preservation specialists such as Tessella and Arkivum. These storage media are susceptible to degradation within this time period. In addition, it is likely that the hardware needed to access the content will not be available throughout the retention period (e.g. the unavailability of 3.5” floppy disk drives). It is recommended that electronic archiving be carried out as part of a formal digital preservation strategy; and
  • the statements and conclusions regarding destruction of scanned originals are considered too strong and conflict with prior advice given. Rather than making a recommendation to retain wet-ink originals, it is recommended that the reflection paper simply provide this as an option within a risk-based approach.
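To illustrate the kind of e-mail-to-PDF conversion referred to above, here is a minimal sketch of one possible approach (not anything proposed in the reflection paper): it parses a saved .eml file with Python's standard email library and renders the key headers and plain-text body to PDF using the third-party reportlab package. The file names and the choice of retained headers are purely hypothetical.

# Sketch only: render a saved e-mail (.eml) to PDF for long-term archiving.
# Assumes the third-party "reportlab" package; file names are illustrative.
from email import policy
from email.parser import BytesParser
from reportlab.lib.pagesizes import A4
from reportlab.pdfgen import canvas

def email_to_pdf(eml_path: str, pdf_path: str) -> None:
    """Write the key headers and plain-text body of an e-mail into a PDF."""
    with open(eml_path, "rb") as f:
        msg = BytesParser(policy=policy.default).parse(f)

    # Keep the headers most relevant to the trial record; full internet
    # routing metadata (the Received: headers) is deliberately not preserved.
    lines = [f"{name}: {msg.get(name, '')}" for name in ("From", "To", "Cc", "Date", "Subject")]
    body = msg.get_body(preferencelist=("plain",))
    lines += ["", body.get_content() if body else "(no plain-text body found)"]

    pdf = canvas.Canvas(pdf_path, pagesize=A4)
    _, height = A4
    y = height - 50
    for line in "\n".join(lines).splitlines():
        if y < 50:                         # start a new page when this one is full
            pdf.showPage()
            y = height - 50
        pdf.drawString(50, y, line[:110])  # naive truncation; no word wrapping
        y -= 14
    pdf.save()

# Example (hypothetical file names):
# email_to_pdf("site_correspondence.eml", "site_correspondence.pdf")

A validated archiving tool would of course also need to handle attachments, HTML-only messages and proper word wrapping, all of which are deliberately omitted from this sketch.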

We await with interest publication of the next draft guidance document!


Thoughts from the DIA EU Electronic Document Management Conference

[Also posted on LinkedIn Pulse]
I have just attended the 15th Drug Information Association (DIA) European Electronic Document Management conference in Berlin. Here are a few thoughts and observations that came to mind as I sat through some of the presentations:

Many speakers described issues relating to the conference theme of standardisation and how we could potentially improve interoperability and increase efficiency through use of technology. But it was refreshing to hear Russell Joyce (Heath Barrowcliff Consulting) talk about the importance of information governance…. a topic that is sadly missing from many people’s agendas.

We heard about the functionality needed for regulatory submissions management, but I really have to question the need for “archive and find” functionality for submissions. In general, submissions are a snapshot of source records…. the source records should be stored and managed in their respective “systems of truth”. Yes, we need to identify what goes into each submission and may need to go back and run queries several years after the event. But submissions are a published COPY of original records. So, “archive and find” functionality is needed in the “systems of truth”, i.e. for the source content. This is the content that needs archiving in compliance with relevant regulations….. the submissions can be destroyed as soon as there is no longer a business need for them (usually a relatively short period).

System implementation projects often take several years within our industry, so it was interesting to hear about a project that was delivered in just over two months. Impressive? Yes, but I just couldn’t help wondering how heavily that success depended upon a team of external consultants (e.g. to rewrite applicable SOPs) and the fact that the system was implemented with “out of the box” functionality. It is not often that projects are this straightforward. Having said that, perhaps this is a prompt for us to think hard about why our IT projects have to take quite so long. If we had to deliver in 3, 4 or 5 months, could we? I reckon we probably could!

Another presentation concerned a process harmonisation effort as part of an IT implementation project. It was interesting to see that a specific objective of the project was identification of activities that could be carried out externally by a vendor i.e. an assessment of core competencies to be retained in-house. Perhaps we need to be more open to out-sourcing activities that we “hold dear”?

There was a great presentation on digital preservation, which is much more than electronic archiving. Every organisation needs a digital preservation plan….. not just a retention schedule that says we’ll archive the content electronically for 25 years.

It was interesting to hear that one of the overriding principles when the Reference Models were initially established was that content taxonomy was not considered to provide any competitive advantage and hence we could all openly share our practices and learn from each other. It got me thinking: perhaps recent GCP inspection experience suggests that NOT adopting a reference model increases the risk of inspection findings. On this basis, is content taxonomy now actually providing a competitive advantage to those that adopt the reference models?

My final observation concerned some comments from one of the regulatory agencies who pointed out that our design and implementation of eTMF solutions should be based on our business needs and should facilitate trial conduct and management, rather than to please a GCP inspector….. a statement that I heartily agree with. It was therefore a little disconcerting to subsequently hear the same individual comment that our use of perhaps five distributed and varied IT solutions might result in an inability of the inspector to perform an inspection within the required timeframe and should therefore be avoided. I think there’s a disconnect here! If use of 5 disparate systems is chosen by the sponsor for sound business and quality reasons, the agencies need to adapt their inspection regime to accommodate this type of technology e.g. by use of remote inspection, rather than suggesting we choose an IT infrastructure that is more aligned with their inspection practices.

So, here’s to the 16th DIA EU EDM conference!


How to Implement a Successful eTMF Project

If you are considering implementing an eTMF, or perhaps you are already in the middle of such a project, there is still time to register for our 1-day eTMF workshop being held on Monday 1st December in Berlin, Germany. Technology solutions for managing electronic Trial Master File (eTMF) documents are now established in many sponsors and contract research organisations. However, companies often experience challenges as they implement these solutions, sometimes years after the project has begun! This workshop will provide attendees with an opportunity to discuss some of the more challenging aspects of eTMF implementation. Through instructor-facilitated discussions, guidance will be provided on a wide range of relevant topics, including:

  • Change management
  • Operating with multiple eTMF solutions e.g. sponsor/CRO
  • Managing correspondence and email
  • Evaluating the “health” of your TMF

This workshop, run by the DIA with Karen Roy (Phlexglobal), is held on the day preceding the EU Electronic Document Management conference and can be booked on the DIA website. Registration for the full-day workshop is just €550. With great transport links from across Europe, a great price and a great agenda, this is an opportunity not to be missed! Make sure you get your booking submitted soon.


TMF Benchmarking

A quick blog post today! Have you completed our TMF Benchmarking Survey?

We are running a very brief online survey to identify trends in how companies are defining their TMF and how they interact with CROs with regard to TMF technology. The survey has just SIX simple questions and takes no more than 2-3 minutes to complete. As an incentive for companies to provide their information, every participant will receive a report containing the anonymised output. This should be a useful tool for you to see how your company is positioned in relation to others. We are not asking for information that is company confidential. The survey closes shortly, so don’t delay…. complete the survey HERE.


The World Slowly Wakes to the Reality of TMF Requirements

Perhaps my title is somewhat harsh, but I perceive more than a little frustration in the announcement today from the MHRA that they are revising their definition of critical inspection findings to include cases “where provision of the Trial Master File (TMF) does not comply with Regulation 31A 1-3, as the TMF is not readily available or accessible, or the TMF is incomplete to such an extent that it cannot form the basis of inspection and therefore impedes or obstructs inspectors carrying out their duties in verifying compliance with the regulations”.

It seems an age that consultants like myself and other experts have been telling industry that once a clinical trial is closed out, no matter how well the trial was conducted, the only artifact that remains is the set of trial records….. the trial master file. This forms the basis for GCP inspections, i.e. to demonstrate that all applicable regulations were complied with and that the data generated during the study can be trusted as accurate and complete. So, maintenance of a trial master file is not just a case of filing away a set of documents and ticking them off some sort of inventory list. The trial master file must “tell the story” of the clinical trial….. how the trial was conducted and how decisions were made, for example. And when the time comes for an inspection, the TMF must be a complete and accurate record of the facts.

In amending the definition of critical findings to include the unavailability or incompleteness of the TMF, the MHRA are raising the stakes. TMF management is not a process that should be considered a low-priority administrative task but a critical activity that is central to GCP compliance. And increasingly, TMF vendors are coming to realise that an eTMF is more than just a repository for PDF files; it is a critical tool to assist in trial conduct and trial management. I don’t think it will be too long before other regulatory agencies follow the MHRA’s lead on this.

About Rammell Consulting: We are a specialist records management consulting company, providing a range of consultancy, training and education services to the life sciences/healthcare sector. Topics that we advise on include the trial master file, electronic archiving and digital preservation. Please refer to our website for further information. Our Introduction to Records Management training module takes place on Thursday June 12th in the UK (last few places remaining).


Training for Records Managers & Archivists

It is not often that I use my blog to promote meetings, conferences and the like, but I’m going to make an exception today! For records management professionals working in the life sciences, there are few events that provide training specific to our sector. We can pick up general principles from organisations such as ARMA, AIIM and IRMS, but where do we go for training that is specific to the issues we face in our day-to-day work?

Well, if you have not attended one of the Scientific Archivists Group conferences before, I recommend you take a look. Although the name of the association includes “archivist”, it is actually much broader than that, meeting the needs of all records management professionals working in the life sciences, particularly those operating under GLP, GCP, GMP, etc. The finishing touches are just being made to the agenda of their next conference and bookings are now being taken.

The preliminary agenda can be viewed online – where bookings can also be made – and it includes:

  • electronic archiving solutions
  • archiving electronic mail
  • issues for the archivist when an R&D site closes
  • a non-commercial viewpoint of clinical archiving
  • quality and compliance issues for document digitisation (scanning)
  • cloud computing (case study)
  • health and safety
  • electronic signatures
  • document scanning (case study)

What a varied programme! And for £500 (12% reduction for SAG members) you get:

  • 2-day conference programme including all refreshments
  • 2 nights’ bed & breakfast accommodation in Nice, France
  • complimentary drinks reception
  • conference gala dinner
  • networking opportunities

Personally, I think this is great value for money and would highly recommend attendance.

Staff who are fairly new to the industry and to records management may also want to consider our new 3-day training programme, which covers the full spectrum of records management principles, going back to basics.


Fire Destroys Argentine Banking System Archives, Killing 9

Another Iron Mountain Incident

Reblogged from mediachecker, February 6, 2014

While we are sure it is a very sad coincidence, on the day when Argentina decrees limits on the FX positions banks can hold and the Argentine Central Bank’s reserves accounting is questioned publicly, a massive fire – killing 9 people – has destroyed a warehouse archiving banking system documents. As The Washington Post reports, the fire at the Iron Mountain warehouse (which purportedly had multiple protections against fire, including advanced systems that can detect and quench flames without damaging important documents) took hours to control, and the sprawling building appeared to be ruined.

The cause of the fire wasn’t immediately clear – though we suggest smelling [Cristina] Fernandez’s hands…

We noted yesterday that there are major questions over Argentina’s reserve honesty

While first print is preliminary and subject to revision, the size of recent discrepancies have no precedent. This suggests that the government may…

