I would like you to follow me on my voyage of discovery. Treat it as a story. This is apt because, like all good stories, there is a twist at the end, or "a sting in the tail", that will be revealed. Our first step on this journey will be "21 CFR Part 11".
Code of Federal Regulations - Title 21 - Food and Drugs
The Code of Federal Regulations (CFR) is a codification of the general and permanent rules published in the Federal Register by the Executive departments and agencies of the Federal Government. Title 21 of the CFR is reserved for rules of the Food and Drug Administration. Each title (or volume) of the CFR is revised once each calendar year. A revised Title 21 is issued on approximately April 1st of each year.
Next I want you to click on the link in the text box if you haven't already done so. You will see that you can use this page to search for words. Look at the "Parts 1 - 1499" section and scroll down a bit until you find Part 11. You will find "(11) Electronic records; electronic signatures". This, then, is "21 CFR Part 11", so you now understand what this section is about.
The FDA has issued guidelines that cover electronic systems that gather
data from clinical trials that use electronic signatures in lieu of paper
signatures. By extension, a lot of people within the industry assumed,
and were even advised, that although 21 CFR Part 11 did not explicitly
cover clinical trial reporting, it might do so in the future. This
was a commonly held view that was partly fuelled by the FDA stating that
their guidelines represented their "current thinking" on the matter. This
gave rise to some guessing and preparing for the future and caused a spate
of both vendor and in-house products for clinical reporting with an aim
to make them 21 CFR Part 11 compliant. In 2003 the FDA made clear that
21 CFR Part 11 was limited in scope to systems that use electronic signatures
in lieu of paper signatures and this is clearly nothing to do with clinical
reporting. The main page for linking to this document is as below and from
that page you can access the most recent releases of the documents.
http://www.fda.gov/ora/compliance_ref/part11/
From the above document you can use the first link at the top to go
to this page:
http://www.fda.gov/cder/gmp/index.htm
In the second section titled "Guidances" you can link to "Part 11,
Electronic Records, Electronic Signatures — Scope and Application (final
guidance)" which takes you to this page:
http://www.fda.gov/cder/guidance/5667fnl.htm
If you open that document then there is a link to "1. Narrow Interpretation
of Scope" which takes you to the following section of the document.
http://www.fda.gov/cder/guidance/5667fnl.htm#P181_12930
The text of that section is below:
1. Narrow Interpretation of Scope
We understand that there is some confusion about the scope of part 11. Some have understood the scope of part 11 to be very broad. We believe that some of those broad interpretations could lead to unnecessary controls and costs and could discourage innovation and technological advances without providing added benefit to the public health. As a result, we want to clarify that the Agency intends to interpret the scope of part 11 narrowly. Under the narrow interpretation of the scope of part 11, with respect to records required to be maintained under predicate rules or submitted to FDA, when persons choose to use records in electronic format in place of paper format, part 11 would apply. On the other hand, when persons use computers to generate paper printouts of electronic records, and those paper records meet all the requirements of the applicable predicate rules and persons rely on the paper records to perform their regulated activities, FDA would generally not consider persons to be "using electronic records in lieu of paper records" under §§ 11.2(a) and 11.2(b). In these instances, the use of computer systems in the generation of paper records would not trigger part 11.
The above text should have ended speculation that the FDA's "current thinking" might extend 21 CFR Part 11 beyond electronic systems that store data to cover a broader scope -- perhaps even clinical reporting systems. Indeed, the final sentence in the text above seems to cover clinical reporting systems. Even if not, it is clear that 21 CFR Part 11 does not apply to clinical reporting systems.
Turning to the FDA guidance "Computerized Systems Used in Clinical Trials", we seem to have what we are looking for regarding whether clinical reporting systems should be validated or not. In the "System Dependability" section we have the following text, and it even mentions "validation of software":
SYSTEM DEPENDABILITY
The sponsor should ensure and document that computerized systems conform to the sponsor's established requirements for completeness, accuracy, reliability, and consistent intended performance.

A. Systems documentation should be readily available at the site where clinical trials are conducted. Such documentation should provide an overall description of computerized systems and the relationship of hardware, software, and physical environment.

B. FDA may inspect documentation, possessed by a regulated company, that demonstrates validation of software. The study sponsor is responsible, if requested, for making such documentation available at the time of inspection at the site where software is used. Clinical investigators are not generally responsible for validation unless they originated or modified software.

1. For software purchased off-the-shelf, most of the validation should have been done by the company that wrote the software. The sponsor or contract research organization should have documentation (either original validation documents or on-site vendor audit documents) of this design level validation by the vendor, and should have itself performed functional testing (e.g., by use of test data sets) and researched known software limitations, problems, and defect corrections. In the special case of database and spreadsheet software that is (1) purchased off-the-shelf, (2) designed for and widely used for general purposes, (3) unmodified, and (4) not being used for direct entry of data, the sponsor or contract research organization may not have documentation of design level validation. However, the sponsor or contract research organization should have itself performed functional testing (e.g., by use of test data sets) and researched known software limitations, problems, and defect corrections.

2. Documentation important to demonstrate software validation includes: Written design specification that describes what the software is intended to do and how it is intended to do it; A written test plan based on the design specification, including both structural and functional analysis; and, Test results and an evaluation of how these results demonstrate that the predetermined design specification has been met.

C. Change Control

Written procedures should be in place to ensure that changes to the computerized system such as software upgrades, equipment or component replacement, or new instrumentation will maintain the integrity of the data or the integrity of protocols. The impact of any change to the system should be evaluated and a decision made regarding the need to revalidate. Revalidation should be performed for changes that exceed operational limits or design specifications. All changes to the system should be documented.
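The "functional testing (e.g., by use of test data sets)" mentioned above can be sketched in a few lines. This is purely a minimal illustration of the idea, not anything prescribed by the guidance: the routine under test (here, functions from Python's standard statistics module, standing in for any off-the-shelf statistical tool) is run against a data set small enough that the expected results can be computed by hand.

```python
# Minimal sketch of functional testing with a test data set.
# The "system under test" is Python's statistics module, standing in
# for any off-the-shelf statistical routine; the data set is invented.
import statistics

def functional_test():
    # A data set small enough for the expected results to be worked out by hand.
    test_values = [2.0, 4.0, 4.0, 4.0, 5.0, 5.0, 7.0, 9.0]
    expected_mean = 5.0    # (2 + 4 + 4 + 4 + 5 + 5 + 7 + 9) / 8 = 40 / 8
    expected_sd = 2.0      # population standard deviation, computed by hand
    assert statistics.mean(test_values) == expected_mean
    assert statistics.pstdev(test_values) == expected_sd
    return "pass"

print(functional_test())  # -> pass
```

In a real setting the expected values would come from an independent source (hand calculation, a second package, or published reference data), and the comparison would be documented as part of the test results the guidance asks for.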
Have we found what we are looking for at last regarding the validation
of a clinical reporting system? The answer is "no". It does not
apply. The start of the document makes it clear in the definition of "computerized
system" as follows:
Computerized System means, for the purpose of this guidance, computer hardware, software, and associated documents (e.g., user manual) that create, modify, maintain, archive, retrieve, or transmit in digital form information related to the conduct of a clinical trial.
Clinical reporting systems might transmit in digital form information related to a clinical trial, but not information related to the conduct of a clinical trial. The wording is confusing, so further clarification is needed. We get that clarification from the Introduction, which says:
I. INTRODUCTION
This document addresses issues pertaining to computerized systems used to create, modify, maintain, archive, retrieve, or transmit clinical data intended for submission to the Food and Drug Administration (FDA). These data form the basis for the Agency's decisions regarding the safety and efficacy of new human and animal drugs, biologics, medical devices, and certain food and color additives. As such, these data have broad public health significance and must be of the highest quality and integrity.
The above introduction clears up any possible misunderstandings. It is to do with "clinical data". Clinical reporting systems do not transmit "clinical data".
As for sending the FDA data in CDISC format: if that is the only data transmitted to them about the trial, and the clinical data has gone through a transformation process to get it into CDISC format, then whatever software handles the transformation and transmission of the data would have to be validated in the way described above in the "System Dependability" section. If this were part of a clinical reporting system, then that part of it would have to be validated in that way. It would make a lot of sense to keep that side of things entirely distinct from the clinical reporting process itself, so that tight controls could be kept on it.
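To make the transformation step concrete, here is a minimal sketch of functionally testing such a mapping. The mapping below is entirely hypothetical: the variable names (STUDYID, USUBJID, SEX, AGE) follow CDISC naming conventions, but the mapping itself is invented for illustration and is not a real standard mapping. The point is the testing pattern: transform a known input record and compare against hand-derived expected output.

```python
# Hypothetical raw-to-CDISC-style transformation plus a functional test.
# Variable names follow CDISC conventions; the mapping is illustrative only.

def to_dm(raw, studyid="XYZ-001"):
    """Map one raw demographic record to a simplified DM-style record."""
    return {
        "STUDYID": studyid,
        "USUBJID": f"{studyid}-{raw['subject']}",  # unique subject identifier
        "SEX": raw["sex"].upper()[:1],             # 'Female' -> 'F'
        "AGE": int(raw["age"]),
    }

# Functional test: known input, hand-derived expected output.
raw = {"subject": "0101", "sex": "Female", "age": "34"}
expected = {"STUDYID": "XYZ-001", "USUBJID": "XYZ-001-0101",
            "SEX": "F", "AGE": 34}
assert to_dm(raw) == expected
```

Keeping this transformation code separate from the reporting code, with its own test data sets and documented results, is exactly the kind of tight control suggested above.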
The part of the regulations that might be seen as having something to
do with clinical reporting is 21 CFR 314.126 which has the title
"adequate and well-controlled studies". 21 CFR 314.126(b) lists
the characteristics of an adequate and well-controlled study. For the first
characteristic listed we have:
(1) There is a clear statement of the objectives of the investigation and a summary of the proposed or actual methods of analysis in the protocol for the study and in the report of its results. In addition, the protocol should contain a description of the proposed methods of analysis, and the study report should contain a description of the methods of analysis ultimately used. If the protocol does not contain a description of the proposed methods of analysis, the study report should describe how the methods used were selected.
For the seventh characteristic listed we have:
(7) There is an analysis of the results of the study adequate to assess the effects of the drug. The report of the study should describe the results and the analytic methods used to evaluate them, including any appropriate statistical methods. The analysis should assess, among other things, the comparability of test and control groups with respect to pertinent variables, and the effects of any interim data analyses performed.
In this seventh characteristic, the first sentence, "There is an analysis of the results of the study adequate to assess the effects of the drug.", does not even imply that table values and listings should be correct -- only that the analysis is adequate to assess the effects of the drug. The key probability values will be checked by a statistician in any case.
It would be the easiest thing in the world for the FDA to ask that clinical reporting systems be validated, but they have not done so. For me, the split between computerized systems handling data and computerized systems that just report on data -- requiring the former to be validated and leaving the latter unspecified -- makes a lot of sense. For data handling, you have very little in the way of backup and checks. You have got to make sure the computerized system is doing its job properly or you can't rely on the data. Validating the system most definitely helps. On the other hand, you have every opportunity to check what comes out of a clinical reporting system. You can program independently. You can check stats results independently. I sometimes check stats results using online calculators on the Internet (my favorite site is here).

For data handling systems, if the system is validated and working correctly then there is a high degree of certainty that the data are correct. However, a "validated" clinical reporting system is still capable of producing nonsense figures, depending on how you select the data, for example. Even if you totally trusted the contents of a table you might still want to check that the counts in the table corresponded to manual counts done for corresponding listings -- so-called "consistency checks". "Validation" buys you real benefits for data handling. For clinical reporting systems, "validation" does not bring the same level of benefit and still leaves the system heavily exposed to risk of error. Thinking of a "validated" clinical reporting system as an "error free" clinical reporting system would be a big mistake.
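A "consistency check" of the kind just described can itself be programmed independently. The sketch below uses invented data: it recounts, from the listing records, the value behind one cell of a hypothetical summary table and compares it against the count the table claims.

```python
# Sketch of a consistency check: does a summary-table count match an
# independent count taken from the corresponding listing?
# Both the listing records and the claimed count are invented.

listing = [
    {"subject": "0101", "treatment": "A", "event": "Headache"},
    {"subject": "0102", "treatment": "A", "event": "Nausea"},
    {"subject": "0103", "treatment": "B", "event": "Headache"},
    {"subject": "0104", "treatment": "A", "event": "Headache"},
]

# Count claimed by the summary table for treatment A / Headache.
table_count = 2

# Independent count derived directly from the listing.
listing_count = sum(
    1 for rec in listing
    if rec["treatment"] == "A" and rec["event"] == "Headache"
)

assert listing_count == table_count, (
    f"table says {table_count}, listing has {listing_count}"
)
```

The value of this kind of check is that it is independent of the reporting code that produced the table, which is precisely what validation of the reporting system alone cannot give you.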
Another thing with clinical reporting is that it can be quite fluid over time. Statistical techniques used can go in and out of fashion. Presentation styles can change. Also, code developed by one department can end up getting shared with other departments. "Validated" systems, however, can be rather rigid, due to all the controls they have to go through such as those of the SDLC ("Systems Development Life Cycle") (or SLC in the UK) and such a lack of flexibility does nobody any favors -- trial subjects included.
In section "2.1 Regulatory Background" the following source documents are listed to back up the claim that a clinical reporting system should be validated. This is a useful list to have, and I will check these documents below.
2.1 Regulatory Background
The need for validation of computer systems in clinical research is documented in the ICH regulatory guidelines for Good Clinical Practice (GCP) E6 and Statistical Principles for Clinical Trials (E9). The FDA draft proposal 'Guidance for Industry: Computerized Systems used in Clinical Trials' establishes in further detail specific requirements.
Although a person reading the document above might think that clinical reporting systems need to be validated, its authors are clearly aware that there are no regulations covering this. An extract is below; the highlighting is my own.
2.4 Scope of the Guideline
The guideline addresses all computerised clinical systems used directly or indirectly by a sponsor to capture, process, analyze, and report clinical data, and to manage clinical development programmes, before and after regulatory submission. Although validation of the latter systems is not required by regulatory guidelines, it makes good business sense.
We already know that computerized systems that handle clinical data need to be validated but there is no written evidence to say that it should extend to clinical reporting systems. Reporting systems can be fluid over a period of years. Statistical methods come in and out of fashion. Presentation styles change. Macros written by one department can end up getting shared across other departments. It is important that the analysis is done correctly and accurately, and putting the code used through some sort of careful checking would help, but I consider clinical reporting systems to be too fluid to have a complicated and heavily documented validation system imposed on them as the above document recommends. It could rob the reporting system of its flexibility and in so doing make it less effective. And if a reporting system is made less effective then this could have a negative effect on trial subjects. This would be counter to the principles of "Good Clinical Practice" as we will see.
I will now go into more detail on the documents listed above in the "2.1 Regulatory Background" section not looked at so far, namely "Good Clinical Practice (E6)" and "Statistical Principles for Clinical Trials (E9)" to see if these give any evidence to suggest that clinical reporting systems need to be validated.
Sometimes you see in a job advert for a SAS programming position
that an awareness of GCP is required. "GCP" is short for "Good Clinical
Practice". I have heard it said that clinical reporting systems are covered
by the broad principles of GCP. There are some reasons for this, so let
us look at the definition of GCP.
INTRODUCTION
Good Clinical Practice (GCP) is an international ethical and scientific quality standard for designing, conducting, recording and reporting trials that involve the participation of human subjects. Compliance with this standard provides public assurance that the rights, safety and well-being of trial subjects are protected, consistent with the principles that have their origin in the Declaration of Helsinki, and that the clinical trial data are credible.
In the first sentence of the introduction above I have highlighted "quality standard" and "reporting trials". The second sentence makes it clear that it is to do with the "safety and well-being of trial subjects" and making sure "the clinical trial data are credible". How does "reporting trials" tie in with that second sentence? As for "the clinical trial data are credible", there is no strong link with the reporting of the trial, as reporting a trial does not affect the data. How does "reporting trials" tie in with the "safety and well-being of trial subjects"? Does post-trial statistical analysis affect the "safety and well-being of trial subjects"? For post-trial analysis, I don't see how, as the subjects are no longer on the trial. This "reporting trials" could be more to do with the reporting of serious adverse events that might affect the safety of subjects. But there might be interim reporting on which a decision is made to continue with the trial or not, and the quality standard of this reporting could be of crucial importance for the safety and well-being of the trial subjects. Also, post-trial analysis could have an influence on whether a further study is done, and so would have safety implications for subjects on that later trial. I am having to make a real effort to establish this link, but I think we have something here that might give a clue as to whether clinical reporting systems should be validated or not.
We find another clue in section "2. The Principles of ICH GCP" at the
end which I have highlighted.
2. THE PRINCIPLES OF ICH GCP

2.1 Clinical trials should be conducted in accordance with the ethical principles that have their origin in the Declaration of Helsinki, and that are consistent with GCP and the applicable regulatory requirement(s).

...

2.13 Systems with procedures that assure the quality of every aspect of the trial should be implemented.
In section 2.13 above it should be clear that the reporting of a trial is an aspect of the trial, and it is asking for "systems with procedures that assure the quality" to be in place. For English people, the unqualified word "quality" is unclear, as you can have "poor quality", "medium quality", "high quality" etc. Presumably "quality" in this case means "high quality", "high enough quality" or "of sufficient quality", such that it ensures the "safety and well-being of trial subjects". We have something at last!
Section 6.9 is to do with statistics. It gives us no more clues about
validating clinical reporting systems. That is all we have that might relate
to clinical reporting.
6.9 Statistics
6.9.1 A description of the statistical methods to be employed, including
timing of any planned interim analysis(ses).
In concluding this section, I would say we have found something that relates to clinical reporting. The principles of GCP are mostly concerned with subject safety while on a trial, and because statistical analysis of clinical trials is more a post-trial function, or a delayed interim-trial function, it is more removed from the safety of subjects covered by the principles of GCP. However, clinical reporting might have an influence on whether a trial is continued or a further trial done, and so could affect the safety of subjects. These things point to a literal interpretation of the 13th core principle of GCP, which implicitly covers clinical reporting.
The UK adopted GCP into law in 2004 with their own drafted regulations
entitled "The Medicines for Human Use (Clinical Trials) Regulations 2004"
as follows:
http://www.opsi.gov.uk/si/si2004/20041031.htm
The EU followed up "Directive 2001/20/EC" with another directive in
2005 entitled "Directive 2005/28/EC" that strengthened the legal basis
for using GCP which you can link to below. EU member states had to comply
with this by 29 January 2006. It is not clear how it will be implemented
in the UK.
http://europa.eu.int/eur-lex/lex/LexUriServ/site/en/oj/2005/l_091/l_09120050409en00130019.pdf
What I will do now with the above three documents is see how the laws relate to clinical reporting.
Searching "Directive 2001/20/EC" for the word "reporting" we get the following. I have highlighted "reporting" and "results of the clinical trials are credible".
2. Good clinical practice is a set of internationally recognised ethical and scientific quality requirements which must be observed for designing, conducting, recording and reporting clinical trials that involve the participation of human subjects. Compliance with this good practice provides assurance that the rights, safety and well-being of trial subjects are protected, and that the results of the clinical trials are credible.
In the above, we have further clarification of what is meant by "reporting". "Reporting" will certainly cover the "results of the clinical trials" and lend them credibility or not. A further search on the word "reporting" only finds the word in the "Notification of adverse events" section. I have highlighted the word "reporting".
Article 16
Notification of adverse events

1. The investigator shall report all serious adverse events immediately to the sponsor except for those that the protocol or investigator's brochure identifies as not requiring immediate reporting. The immediate report shall be followed by detailed, written reports. The immediate and follow-up reports shall identify subjects by unique code numbers assigned to the latter.

2. Adverse events and/or laboratory abnormalities identified in the protocol as critical to safety evaluations shall be reported to the sponsor according to the reporting requirements
Since we are interested in clinical reporting systems, rather than the
reporting of adverse events, I will select on the word "reporting" used
only in the sense of reporting clinical trials. A search on the word "reporting"
in the UK regulations gives us the following. It is nothing to do
with clinical trial reporting.
10. All clinical trial information shall be recorded, handled, and stored in a way that allows its accurate reporting, interpretation and verification.
Searching for "reporting" in "Directive 2005/28/EC" gives us this. I
have highlighted the word "reporting":
CHAPTER 1
SUBJECT-MATTER

Article 1

1. This Directive lays down the following provisions to be applied to investigational medicinal products for human use:

(a) the principles of good clinical practice and detailed guidelines in line with those principles, as referred to in Article 1(3) of Directive 2001/20/EC, for the design, conduct and reporting of clinical trials on human subjects involving such products;
Also this. I have highlighted the word "reporting" and also "quality
of every aspect of the trials".
CHAPTER 2
GOOD CLINICAL PRACTICE FOR THE DESIGN, CONDUCT, RECORDING AND REPORTING OF CLINICAL TRIALS

SECTION 1 GOOD CLINICAL PRACTICE

Article 2

1. The rights, safety and well being of the trial subjects shall prevail over the interests of science and society.

2. Each individual involved in conducting a trial shall be qualified by education, training, and experience to perform his tasks.

3. Clinical trials shall be scientifically sound and guided by ethical principles in all their aspects.

4. The necessary procedures to secure the quality of every aspect of the trials shall be complied with.
Going back to the UK regulations again, although they do not
appear to mention clinical reporting, we have this:
PART 4
GOOD CLINICAL PRACTICE AND THE CONDUCT OF CLINICAL TRIALS

Good clinical practice and protection of clinical trial subjects

(a) conduct a clinical trial; or (b) perform the functions of the sponsor of a clinical trial (whether that person is the sponsor or is acting under arrangements made with that sponsor), otherwise than in accordance with the conditions and principles of good clinical practice.
The above makes it clear that the conditions and principles of good clinical practice apply, and the 13th core principle implicitly covers clinical reporting.
What we get out of looking at these EU regulations is that for the EU countries, excepting the UK, we have a link between "reporting" and the "results of the clinical trials are credible". For the UK we have the same as before in that clinical reporting is implicitly covered by the core principle 13 of GCP.
Section VII of this document covers the clinical study report. There is a great deal of important information there, and people should be aware of its contents, but nothing there to say a clinical reporting system needs to be validated. Mention is made in this document of ICH E3, which covers clinical reporting in more detail. This will come next.
The conclusion I have come to, based on the documents I have reviewed above, is that there is no statutory requirement or law stating that a clinical reporting system must be "validated" in the way that a computerized system for handling clinical trial data must be validated. What is required is that the results that come out of a clinical reporting system are accurate and of at least sufficient quality to assure the safety of trial subjects, and that "systems with procedures that assure the quality" are in place. We are at the end of this story of discovery and it is time for me to reveal the "twist at the end of the story", or the "sting in the tail", so here goes: putting such "systems with procedures that assure the quality" in place arguably goes way beyond "validating" a clinical reporting system. If you were reading this wondering whether you should bother "validating" your clinical reporting system or not, then what I have concluded is that what is required of you to "assure the quality" dwarfs the effort of validating a system. Worse than that, since GCP is now law for EU member states, I wonder how many pharmaceutical companies in the EU are complying with the GCP regulations in this regard. I suspect very few, and those that are not are now breaking the law.