How to Apply a Risk-Based Approach to Records Management

The European Medicines Agency (EMA) has recently published its final reflection paper on risk-based quality management in clinical trials (EMA/269011/2013, 18 November 2013). The management of trial records is barely mentioned in the paper, but I took away a few principles that could be applied to clinical records management. Please note that, as always, these are my own personal views and do not necessarily represent the viewpoint of the EMA!

The paper says:

“The current manner in which some elements of a quality system are implemented by sponsors and their agents (CROs etc.) is generally acknowledged to be time-consuming and constitutes a major proportion of the cost of development of medicines. In addition, the ICH GCP guideline was finalised in 1996 when clinical research was largely paper based, but the available technology and the approach to the conduct of clinical trials has evolved considerably in the meantime.”

I liked the fact that the EMA is acknowledging the shortcomings of the ICH guideline with respect to its application in the 21st century. We need to be identifying more opportunities to introduce process efficiencies and avoid the temptation to use regulations as an “excuse” to be overly cautious. The EMA consider industry to be generally risk-averse, stating:

“Current practices in clinical research are not proportionate to risk nor well adapted to achieving the desired goals. The origins of the problem are multifactorial and include: risk aversion – society and its institutions (public and private) is increasingly risk averse, often with little appreciation of the actual or relative risk of different activities, leading to imbalanced or disproportionate risk mitigation”. The reflection paper then goes on to say:

“Since perfection in every aspect of an activity is rarely achievable or can only be achieved by disproportionate allocation of resource, it is necessary to establish a risk based quality management system for which the ultimate principles are reliability of the trial results and the well being and safety of trial subjects. This system is based on identification of trial priorities and mitigation of the significant and serious risks and establishing tolerance limits within which different processes can operate.”

What this is saying to me is that we need to identify and assess the risks involved in developing processes for managing trial master file content but we then need to manage those risks in the light of all of the other risks involved in running a clinical trial, most of which have a much greater impact on the well-being and safety of trial subjects. We tend to put far too much effort and resource into developing high-quality records management processes and systems when perhaps that level of resource is not justified.

Don’t get me wrong, I’m not suggesting we simply “lower the bar” and aim for poor-quality systems and processes, but that we carry out a proper risk assessment when we “do” process re-engineering. And the EMA support this, recommending that we conduct a “Risk assessment requiring knowledge and understanding of what really matters for the establishment of priorities and the identification of risks: what may go wrong? What is the probability (chance/likelihood) of the occurrence of a negative outcome? What, in particular, would be the impact on trial subjects’ rights/well-being/safety and/or on the reliability of the trial results?”.

So let’s take as an example the level of QC required when scanning a paper document, importing the PDF into an eTMF, indexing the electronic file and deciding whether to keep or destroy the original paper document. Is the process we define proportionate to the risk? Are errors likely to result in an adverse effect on subject well-being and safety? Are there opportunities to mitigate against errors? Let’s have a “reality check” and ensure we’re taking a reasonable risk-based approach to records management as well as to trial conduct and study monitoring.
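The “proportionate to risk” idea can be made concrete by varying the QC sampling rate with the risk class of a document. Here is a minimal sketch; the risk classes and sampling percentages are my own invented examples for illustration, not EMA or ICH requirements:

```python
import random

# Illustrative QC sampling rates by document risk class. The classes and
# percentages are invented examples, not regulatory guidance.
QC_SAMPLE_RATES = {
    "high": 1.00,    # e.g. signed consent forms: verify every scan
    "medium": 0.25,  # e.g. monitoring visit reports: check 1 in 4
    "low": 0.05,     # e.g. routine correspondence: spot-check 5%
}

def needs_qc(risk_class, rng=None):
    """Decide whether a scanned document is pulled for QC review."""
    rng = rng or random.Random()
    rate = QC_SAMPLE_RATES.get(risk_class, 1.0)  # unknown class: full QC
    return rng.random() < rate
```

Under this sketch, a high-risk document such as a signed consent form receives 100% verification, while routine correspondence is only spot-checked; the effort spent is driven by the potential impact of an error, not by a blanket rule.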

About Rammell Consulting: We are a specialist records management consulting company, providing a range of consultancy, training and education services to the life sciences/healthcare sector. Please refer to our website for further information.

Posted in Regulations, TMF

UK Government Seeks to Prevent Proposed Amendment to Clinical Trial Regulation

If you’ve been following the ongoing debate regarding the proposed European Clinical Trial Regulation (see previous posts on this site), you will know that the text currently before Parliament proposes (a) that Trial Master File content be held and maintained only in electronic form and (b) that it be retained for an indefinite period. I have corresponded with the UK Government on this issue and the initial responses are promising.

The Rt Hon the Earl Howe, Parliamentary Under-Secretary of State for Quality, confirms that he is concerned about these amendments and that the requirement to hold the TMF in electronic format and retain it indefinitely “would impose a disproportionate burden on researchers”.

The amendments will be examined during negotiations between the European Council of Ministers and the European Parliament, and the Government “will try to prevent this amendment from being in the Regulation”. Departmental officials have already discussed a number of amendments that impose unnecessary burdens, including this amendment, with MEPs, including the lead MEP for the Regulation. Officials are also in contact with the Presidency of the Council (currently Lithuania) and other member states about these amendments.

Action plan: If you are an EU citizen and are concerned about the potential imposition of an obligation to maintain all trial master file records in electronic format for an indefinite period, please write to your MP and to your MEP, making reference to Amendment 223, Article 55, Paragraph 1. MPs, MEPs and Ministers need to hear a consistent message on this topic.

Posted in Compliance, Regulations, TMF

Are You Thinking Big Enough?

I’ve just been reading an interesting article by Bob Kocher, a partner at healthcare investment company Venrock. He was talking about investment in digital healthcare technologies, and much of what he had to say rang true to me about investment in eTMF technologies (and other electronic document management solutions), based on some of the data that NextDocs published in their recent White Paper “The State of Trial Master Files”. His point about healthcare in America was that it is so expensive and so inefficient that if you can’t create something that saves a lot of money and/or dramatically improves outcomes, then you should reconsider whether it’s worth pursuing. After reading the NextDocs White Paper I thought to myself: “This is not cheap technology and yet people are making a significant investment and not getting the benefit.” To follow Kocher’s philosophy, is there any point in investing in an eTMF if you’re not going to do it properly?

To pick up on some of Kocher’s points that have relevance to Trial Master File solutions:

  1. Is the unmet need big enough? There are so many significant problems in our processes and systems that we should be looking to create what Kocher calls “shockingly large benefits,” and not incremental improvements. To put it bluntly, if you have purchased a document management system, then you should use it to author, develop, approve and distribute documents and not just as a document library. If you have purchased a system that permits electronic approval of documents, why are you wasting time and money continuing to collect wet-ink signatures? If you’ve purchased a system with configurable reports, why are you manually tracking activities in spreadsheets and hard-copy tracking forms?
  2. Can you deliver meaningful return on investment (ROI) within a short period of time? There are myriad examples of inefficiencies in our business processes, so system implementation should focus on ideas and technologies that can capture some of those savings rather than add cost. Take courier charges, for example: these can amount to 5% of the cost of a typical phase II clinical trial. An efficient eTMF using digital document exchange could save a company over £200,000 in courier costs alone for a single study.
  3. Whose money are you saving? The downside to eliminating inefficiencies in the existing system is that there may be incumbent individuals or functions currently profiting because of those inefficiencies. It is safe to assume that the incumbents will respond strongly to any disruptive technology that threatens their business model(s). Are you going to let individuals dictate the level of efficiency that is “permitted” within your organization? Change management is often fraught with difficulty but sometimes people need to just be told “Look, this is how we’re now doing things and it’s part of your job.”
  4. Who is your primary customer? For an eTMF, the primary customer is not the study monitor, the statistician, the study manager or even the M.D. The primary customer is the regulatory agency that will inspect your facility and expect to see a compliant set of essential documents. So how come so many organizations with an eTMF rate ‘regulatory compliance’ and ‘inspections’ so low on their list of priority features? If an eTMF is not supporting your inspections, I would question whether you’ve made the right investment decision. In terms of prioritized requirements, the ability to support a regulatory inspection should be Number 1.

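To make the ROI point in item 2 concrete, here is a simple payback-period calculation. The figures are entirely hypothetical and are there only to show the arithmetic:

```python
def payback_months(implementation_cost, monthly_saving):
    """Months until cumulative savings cover the up-front cost."""
    return implementation_cost / monthly_saving

# Hypothetical figures: a £150,000 eTMF rollout saving £20,000 per month
# in courier, printing and manual-tracking costs across active studies
# would pay for itself in seven and a half months.
months = payback_months(150_000, 20_000)
```

If the payback period stretches into years rather than months, Kocher’s test suggests the investment, or the way it is being implemented, deserves a second look.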
Am I being too harsh? Maybe a little. But the underlying message is, I believe, spot on. Invest in EDMS technology wisely. Be bold. Make outrageous decisions. Reap the benefits!


Posted in e-records, Records management practice, Technology, TMF

Document Storage Industry Suffers Another Fire

Despite making significant investments in fire protection facilities, it seems our records can still be at risk. Metrofile, a large commercial document storage vendor based in South Africa, suffered a devastating fire last week at their Westmead, Pinetown facility. The fire started on Friday evening (11th October), with fire services still struggling to put out the blaze 24 hours later.

The building housed millions of documents from government, banking and private companies. This is the second fire at this particular facility in six years and is thought to have been started deliberately. Whilst this will cause huge financial damage to Metrofile, it will be the clients who will suffer most and will have to attempt to reconstruct their files, a process that could take several years. The company pointed out that only one of three warehouses was affected by the fire. In addition to fire destroying many documents, the incident also left some documents blowing across public areas in the vicinity, including bank slips, salary information and insurance documents belonging to clients.

Once again, this demonstrates one of the weaknesses in managing records in hard-copy; the creation and retention of back-up copies is often too costly. Many of the records destroyed in the fire are gone forever; a digital back-up could have preserved them.

Posted in Document Storage, Records management practice

Retention Scheduling? Don’t Forget Data Protection Laws!

Research recently conducted by PwC on behalf of Iron Mountain suggests that more than a third (35%) of small European companies are risking prosecution by “hoarding” data beyond the scope and period required by data protection legislation. When these companies were asked why personal data was being retained, they replied “In case it is needed”.

Apparently, engineering and manufacturing companies are the worst culprits, with 45% holding on to everything. In the financial services sector, 39% keep everything and 9% have no company-wide retention policy! Colin Rooney, of the Technology and Life Sciences Group at Arthur Cox, says: “Today, each organisation is increasingly aware of the possibility of litigation and this creates a reluctance to destroy original documents. That in turn creates spiralling storage costs and a potential breach in data protection law, in retaining personal data for longer than necessary.” These poor record-keeping practices are particularly acute regarding employee, customer and financial information.

Do you need to dust off your copy of the Data Protection Directive and assess your own retention policies against the requirement not to retain personal data for longer than is necessary? Don’t forget, we’re also here to help you ensure you develop and implement compliance policies and procedures.
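As a crude illustration of the kind of check a retention policy review involves, here is a sketch; the record categories and retention periods are invented for the example and are not legal guidance or any company’s actual schedule:

```python
from datetime import date, timedelta

# Invented retention limits (in years) per record category. An example
# schedule only: not legal advice, not a real policy.
RETENTION_YEARS = {"employee": 6, "customer": 7, "financial": 10}

def overdue_for_destruction(category, created, today):
    """True if a record has been kept beyond its scheduled retention period."""
    years = RETENTION_YEARS.get(category)
    if years is None:
        return False  # no schedule defined; flag for review instead
    # Approximate the retention period in days (ignores leap-day edge cases)
    return today > created + timedelta(days=365 * years)
```

Running every record through a check like this is exactly what “keeping it in case it is needed” avoids, and is why that approach accumulates both storage cost and data protection risk.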

Posted in Compliance, Data protection, Records management practice

Is an eTMF Standard Needed?

This is a question that many of us in the clinical document management space are currently asking ourselves following the announcement by CareLex of their intention to develop and launch an industry eTMF standard under the auspices of OASIS. There is already a sub-team within the DIA TMF Reference Model working group looking at developing metadata to support the Reference Model, so the obvious question is “Why is another group necessary?”.

Well, first off, the intention of the DIA TMF Reference Model group is not to develop a standard, and DIA are not in the business of developing or endorsing standards. Our remit within DIA is to develop a conceptual model that can be adopted and adapted by industry to support clinical document management needs. As part of this, the metadata sub-team are developing standard nomenclature for core and recommended document attributes. This is a valuable exercise and will continue, but the difference between these deliverables and a formal technical standard needs to be recognized. The purpose of a technical standard, which is outside the remit of the DIA team, is to facilitate data and document exchange between any two systems that are compliant with the eTMF standard. Such exchange currently requires bespoke (i.e. expensive and time-consuming) programming and consultancy to get two eTMF systems talking to each other. Furthermore, the availability of an eTMF technical standard would facilitate the development of generic eTMF viewers that could be used in a simple browser interface to interrogate and access ANY compliant eTMF system. The OASIS eTMF Standard Technical Committee will be supportive of and complementary to the DIA TMF Reference Model activities.

So why OASIS? Well, in the conversations I’ve had, various alternative standards groups have been mentioned. Each group has advantages and disadvantages, and it becomes a rather subjective exercise to pick one to go with. Here’s a quick summary of some of the points I’ve picked up along the way. OASIS has already developed an authoritative standard for content management interoperability (CMIS) that is used by many of the big software vendors such as EMC, Microsoft and Adobe. OASIS is already supported by many of the major content management vendors. Whilst groups like HL7 and CDISC have experience in data exchange, OASIS has more experience in developing standards for content and document exchange. So far, HL7 has not expressed any interest in hosting or being involved in an eTMF initiative.

OASIS also has a reasonable history at developing standards within a rapid timeframe. The goal is to publish a draft standard within 6-9 months; OASIS Technical Committees have achieved this where other standards groups typically take longer.

Several have commented on the seemingly expensive cost of engagement with OASIS. To participate directly in an OASIS Technical Committee, a person must either be an individual member of OASIS ($310 per year) or their employer must have company membership. Company membership fees vary from $3,350 to $8,400, providing contributor access for all company employees. This compares to a minimum of $1,200 plus a $3,500 joining fee for CDISC for the smallest company (<20 employees), up to $25,000 plus a $30,000 joining fee for companies with >25,000 employees. Similarly, HL7 membership costs from $705 per year for an individual up to $19,635 for larger companies. So from a cost perspective, OASIS appears a reasonable choice. Furthermore, non-OASIS members will still have opportunities to engage in this process via the DIA TMF Reference Model group; the two initiatives are not competing with each other!

So we come back to my original question… is a standard necessary? From my experience working with many different clients, I see one key issue become a stumbling block on almost every TMF project: the impact of the inherent flexibility of the TMF Reference Model. Because it is a reference model and not a standard, each organization has its own interpretation of the model. This is initially seen as great for the organization, as it often means not needing to change the current file plan for TMF content. Inevitably, though, it leads to plenty of man-hours spent mapping file plans to the Reference Model and vice versa. More importantly, the subtle differences between Reference Model implementations result in an inability to easily exchange data and/or content between systems.

Does this mean the TMF Reference Model has no value? Of course not! It still has value in determining the underlying content and structure of the TMF for regulatory and business purposes. It has done perhaps more than any other single thing to improve TMF compliance. Does a standard mean loss of flexibility? Absolutely not! An eTMF standard can still mean Company A or Vendor A displaying TMF content using a different file plan than Company B or Vendor B. However, the primary attributes that identify and describe each document within the eTMF will be the same; the only difference will be how the content is surfaced in the user interface.

As an analogy, there exists a technical standard for electronic calendaring. Both Google Calendar and Microsoft Outlook Calendar comply with the standard for data exchange, but calendar items appear very different in the two systems; field names are sometimes different, and each has different options. However, compliance with the standard enables you to send me a Google meeting request and me to receive that meeting request into my Microsoft Outlook Calendar. An eTMF standard should achieve the same for eTMFs. Wingspan to Veeva. FirstDoc to PhlexEview. NextDocs to Transperfect. Or… view your eTMF archive through a generic web-browser eTMF Viewer add-in! See you on the OASIS Technical Committee or the eTMF Standards LinkedIn Group?
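To illustrate what such an exchange standard would buy, here is a minimal sketch. The attribute names below are hypothetical placeholders invented for this example (only the artifact numbering echoes the TMF Reference Model) and are not taken from any published OASIS specification:

```python
import json

# Hypothetical standardized core attributes for one eTMF document record.
# These field names are illustrative placeholders, not a real standard.
record = {
    "artifact_id": "02.01.01",   # e.g. a TMF Reference Model artifact code
    "title": "Signed Protocol",
    "study_id": "ABC-123",
    "version": "1.0",
    "date_finalized": "2013-10-01",
}

exported = json.dumps(record)    # System A exports to the agreed format
imported = json.loads(exported)  # System B imports the very same record

# System B may display "Document Name" where System A shows "Title", but
# the underlying standardized attribute, and its value, are identical.
```

The two user interfaces remain free to surface the content however they like; it is the shared, machine-readable attribute set underneath that makes Wingspan-to-Veeva style exchange possible without bespoke programming.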

Posted in e-records, Records management practice, Standards, Technology, TMF

Proposed EU Clinical Trial Regulation – Update

Well, my optimism following our meeting with one of the UK MEPs a short while ago was misplaced! It had appeared possible that the intention was to retain only the clinical study report indefinitely and not the whole of the TMF. This is NOT the case.

I have had it confirmed categorically by the rapporteur for the new Regulation that the intent of the proposed wording is to retain the whole of the trial master file – in electronic form only – for an indefinite period. The Committee on the Environment, Public Health and Food Safety (ENVI) are aware that there are concerns from industry on the practicality of these proposals, and this will be on the table as negotiations continue with the European Council. The position of the Committee is that transparency and accountability are paramount, and it is thought the trial master file would be invaluable should problems come to light a long time after the end of a clinical trial.

However, I have yet to hear a convincing argument to support this supposition. Suppose we have a drug that has recently completed its clinical programme, been given marketing approval and is then on the market for 30 years before serious side effects are identified. Given the vast quantities of post-marketing surveillance data that would have been collected over those 30 years, and the huge amount of data available in the clinical study reports and their associated tables and listings (including individual case report forms for adverse events and deaths), I really have to question what additional value records in the trial master file would provide, other than to potentially protect the sponsor against accusations of improper conduct.

I’ve heard various “drug disaster scenarios” used as justification for the indefinite retention period (e.g. Stilbestrol) but these examples are all from the era when toxicology testing was fairly rudimentary compared to today’s regulatory environment. Since the 1950s, 1960s and 1970s, regulations governing pre-clinical safety testing have been significantly enhanced, including the requirement that “observations should be continued through one complete life cycle, i.e. from conception in one generation through conception in the following generation” [CPMP/ICH/386/95]. This does not completely eliminate any risk of teratological issues arising but it certainly makes it inappropriate to use examples such as Stilbestrol, Etretinate and Thalidomide as justification to retain records ad infinitum.

So where do we go from here? Industry really needs to start taking this issue seriously as it will undoubtedly have a significant impact on the conduct of clinical trials. Perhaps the biggest impact will be on academic and investigator-sponsored studies where a high proportion of trial records are still maintained and archived in paper format; these will all need to be maintained in a fully validated, electronic trial master file system (eTMF) and measures put in place to archive the electronic files indefinitely. Relevant industry bodies and companies need to lobby their MEPs and regulatory agencies with urgency to ensure common sense prevails on this issue. Parliament is due to begin debating this issue later this year.

Posted in Uncategorized