Apr 25, 2026
ICH E6(R3) is no longer an announced change

ICH E6(R3) is no longer an announced change. Since July 2025, the guideline has been in effect at the EMA level. Since September 2025, FDA has formally adopted it. The industry no longer has a preparation horizon — it has an active requirement.
For an SMO, this transition raises a direct question: what has effectively changed in the way a site operates, and in the way sponsors evaluate and select it?
The answer is not straightforward. E6(R3) does not simply add new requirements on top of the existing framework — it restructures the logic on which the entire conduct of a clinical trial is built. The shift from a uniform approach, applied identically regardless of trial complexity, to one proportionate to the real risk of each trial, changes the way an SMO builds its processes, documents its activities and demonstrates operational competence to a sponsor.
SMOs that treat E6(R3) as a formal update — a few revised procedures, a few training sessions checked off — risk falling behind in an ecosystem where sponsors and CROs now evaluate site partners based on their demonstrated capacity to operate according to the new logic, not based on compliance declarations.
This article does not present the theory of the guideline. It presents what E6(R3) means in the real operations of an SMO in 2026 — what has changed, where challenges persist and how a competitive SMO positions itself against these requirements.
1. From Uniform Compliance to Risk Proportionality — A Shift in Operational Logic
E6(R2) introduced the concept of a risk-based approach but largely left its application to the sponsor's discretion. In practice, the result was an industry where monitoring, documentation and the operational processes of a site were built according to the same template, regardless of study complexity, the risk profile of the investigational product or the participant population.
E6(R3) fundamentally changes this logic. Risk proportionality is no longer a design option — it is a mandatory principle that runs through the entire lifecycle of a trial, from planning to reporting. The guideline explicitly states that trial processes must be proportionate to the risks to participants and to the importance of the data collected, and that unnecessary complexity must be eliminated. For an SMO, this means that having processes in place is no longer sufficient — those processes must be justified and calibrated.
1.1 What Risk-Based Approach Means in Practice at Site Level
At site level, the risk-based approach requires a concrete change in how trial activities are managed. Identifying the factors critical to trial quality — what E6(R3) explicitly calls "critical to quality factors" — is no longer exclusively the sponsor's responsibility. The investigator and the SMO supporting them must demonstrate that they understand these factors, that their working processes reflect them and that oversight of delegated activities is proportionate to their importance for participant safety and data integrity.
In concrete operational terms: a site that applies the same level of verification and documentation to a routine administrative procedure and to the collection of a primary safety endpoint is not operating in accordance with E6(R3). Differentiation is not optional — it is part of the guideline's core logic.
1.2 How to Build a Proportionate Monitoring Plan — The Difference from the Previous Approach
The monitoring plan is the document where risk proportionality becomes visible and verifiable. According to E6(R3), it must explicitly reflect the monitoring strategy, the methods used, the justification for their selection and the critical to quality factors that are monitored as a priority.
The difference from E6(R2) is substantial. Under the previous framework, the monitoring plan was primarily a sponsor document describing the frequency of site visits and source data verification procedures. Under E6(R3), the monitoring plan becomes a risk management instrument — it must identify where the real risks to data quality and participant safety lie and describe how those risks are managed through a combination of site monitoring and centralised monitoring.
In this context, a sponsor operating according to the logic of E6(R3) will evaluate not only whether the site follows the procedures described in the monitoring plan, but whether the site understands and applies the logic behind that plan — a distinction that becomes directly relevant in the site selection and qualification process.
1.3 Centralised Monitoring as a Primary Instrument, Not an Exception
One of the changes with the greatest operational impact in E6(R3) is the repositioning of centralised monitoring. In E6(R2), centralised monitoring was referenced as a possible alternative in special circumstances. In E6(R3), it is defined as an essential component of the monitoring process — a timely evaluation of accumulated data, conducted by the sponsor's qualified and trained personnel, which can complement and reduce the frequency of site monitoring or be used independently.
The operational implication for an SMO is direct: the quality of data entered into electronic systems and their real-time accuracy are no longer verified exclusively at the monitor's visit. They are evaluated continuously. A site that enters data with systematic delays, that has high query rates or that presents inconsistencies between source records and reported data becomes visible through centralised monitoring before a monitor physically arrives at the site.
This means that the operational performance of an SMO is now transparent and measurable in real time for the sponsor — not only at the moment of the monitoring visit.
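The kind of continuous visibility described above can be illustrated with a minimal sketch. The metrics, field names and thresholds below are assumptions chosen for illustration — E6(R3) does not prescribe specific indicators — but data-entry lag and query rate are typical of what centralised monitoring surfaces.

```python
from datetime import date
from statistics import mean

# Hypothetical visit records: visit date, date the data reached the EDC,
# and open queries. All values and thresholds below are illustrative.
visits = [
    {"visit": date(2026, 3, 2), "entered": date(2026, 3, 4), "queries": 0},
    {"visit": date(2026, 3, 9), "entered": date(2026, 3, 20), "queries": 3},
    {"visit": date(2026, 3, 16), "entered": date(2026, 3, 17), "queries": 1},
]

# Indicators a sponsor's centralised monitoring might track continuously
entry_lag_days = mean((v["entered"] - v["visit"]).days for v in visits)
query_rate = sum(v["queries"] for v in visits) / len(visits)

# Illustrative escalation thresholds, not guideline values
flags = []
if entry_lag_days > 5:
    flags.append("systematic data-entry delay")
if query_rate > 1.0:
    flags.append("elevated query rate")

print(f"mean entry lag: {entry_lag_days:.1f} days; queries/visit: {query_rate:.2f}")
print("flags:", flags or "none")
```

The point of the sketch is that these signals accumulate between visits: a site with a pattern like the second record above is flagged long before a monitor arrives.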
2. Data Governance — A Completely New Section Compared to E6(R2)
Risk proportionality, as established in section 1, cannot be demonstrated without a solid data infrastructure. Data is the operational evidence that a site functions according to the logic of E6(R3) — and E6(R3) introduces for the first time a section dedicated exclusively to Data Governance, completely absent from E6(R2). This is not a reformulation of existing requirements. It is a new framework, with specific requirements for how data is captured, managed, corrected, transferred and retained across the entire lifecycle of a trial.
For an SMO, the impact is direct and verifiable. Sponsors and CROs now evaluate a site's capacity to manage data according to this framework — not as a formal compliance requirement, but as an indicator of operational maturity.
2.1 The Data Lifecycle — What Must Be Documented and How
E6(R3) explicitly defines the elements of the data lifecycle: data capture, relevant metadata including audit trails, review of data and metadata, corrections, data transfer and migration, finalisation of data sets prior to analysis, retention and access, and destruction. Each of these stages must be covered by documented procedures.
The difference from E6(R2) is that under the previous framework, data management requirements were dispersed across different sections of the guideline and treated primarily as the sponsor's responsibility. In E6(R3), the data lifecycle is a unified framework that explicitly involves both the sponsor and the investigator — and by extension, the SMO operating the site.
In practical terms, a site must be able to demonstrate that it has clear procedures for each stage of this lifecycle — not only for data capture and query resolution, but for the entire chain from primary source to archiving. The absence of these procedures, or their existence only on paper without demonstrable application, represents a real operational risk in the context of an inspection or a site evaluation by a sponsor.
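A simple way to picture the "coverage" demanded above is a mapping from each lifecycle stage to a documented procedure. The stage list paraphrases the elements named in the guideline; the SOP identifiers and the coverage shown are hypothetical.

```python
# Lifecycle stages paraphrased from E6(R3); SOP identifiers are hypothetical.
LIFECYCLE_STAGES = [
    "capture", "metadata_and_audit_trail", "review", "corrections",
    "transfer_and_migration", "finalisation", "retention_and_access",
    "destruction",
]

# A site's documented procedures, keyed by the stage they cover (illustrative)
sop_coverage = {
    "capture": "SOP-DM-01",
    "review": "SOP-DM-03",
    "corrections": "SOP-DM-04",
    "retention_and_access": "SOP-ARC-02",
}

# Any stage without a documented procedure is an operational gap
gaps = [stage for stage in LIFECYCLE_STAGES if stage not in sop_coverage]
print("uncovered lifecycle stages:", gaps)
```

A site in the position sketched here covers capture and query resolution but leaves transfer, finalisation, audit-trail management and destruction undocumented — exactly the partial coverage that surfaces during an inspection.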
2.2 Requirements for Computerised Systems at Site Level — What the Sponsor Evaluates
E6(R3) introduces clear requirements for computerised systems used in clinical trials — requirements that directly target the site, not only the sponsor. The guideline specifies that the sponsor must evaluate computerised systems used or operated by the investigator, including clinical practice systems such as electronic health records, to determine whether they are fit for purpose in the context of the trial.
This evaluation must be documented and completed before the system is used in the trial. The factors evaluated include data security, user management, audit trails and backup measures — in other words, precisely the elements that an SMO must have functional and demonstrable before a sponsor selects the site.
The practical implication is clear: an SMO that cannot demonstrate that its computerised systems meet these requirements — or that has not documented this evaluation — creates an obstacle in the site selection process. Not because the sponsor seeks additional bureaucracy, but because E6(R3) imposes this verification on the sponsor as part of its oversight responsibility.
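The evaluation described in this section can be pictured as a simple documented record. The class and field names below are assumptions for the sketch, not a prescribed format; the evaluated factors mirror those listed above, and the rule that the evaluation must be completed before first use is encoded directly.

```python
from dataclasses import dataclass
from datetime import date

@dataclass
class SystemEvaluation:
    """Illustrative record of a documented fit-for-purpose evaluation."""
    system: str
    evaluated_on: date
    data_security: bool
    user_management: bool
    audit_trail: bool
    backup: bool

    def fit_for_purpose(self) -> bool:
        return all((self.data_security, self.user_management,
                    self.audit_trail, self.backup))

    def usable_from(self, first_use: date) -> bool:
        # The evaluation must be documented before the system is used in the trial
        return self.fit_for_purpose() and self.evaluated_on < first_use

# Hypothetical example: an EHR whose backup measures are not yet demonstrated
ehr = SystemEvaluation("site EHR", date(2026, 1, 15),
                       data_security=True, user_management=True,
                       audit_trail=True, backup=False)
print(ehr.usable_from(date(2026, 2, 1)))  # one undemonstrated factor blocks use
```

The design point is that a single undemonstrated factor — here, backup — is enough to make the system not demonstrably fit for purpose, regardless of how long it has been in routine use.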
2.3 Audit Trails and Metadata — What E6(R3) Adds
E6(R3) does not invent the audit trail requirement — this existed in E6(R2). What it introduces is an explicit and detailed framework that transforms the audit trail from a general technical requirement into an active element of data quality management, with specific requirements for interpretability, management and review.
Three aspects are significantly clarified compared to the previous framework.
The first — the explicit prohibition of disabling. E6(R3) directly specifies that audit trails and logs cannot be disabled and cannot be modified except in exceptional circumstances — for example, when a participant's personal information is inadvertently included in the data — and only if the action and its justification are documented. This requirement was implicit in E6(R2). In E6(R3) it is explicitly formulated as a verifiable standard.
The second — the interpretability requirement. E6(R3) specifies that audit trails and logs must be interpretable and must be able to support review. It is not sufficient for the audit trail to exist — it must be usable as an instrument for evaluating trial conduct. An audit trail that is incomplete, fragmented or impossible to correlate with the data in the electronic system does not meet this requirement.
The third — integration into the data lifecycle. E6(R3) positions the audit trail as a metadata element that must be actively managed throughout the entire duration of the trial — not merely passively retained. The review of audit trails must be a planned activity, proportionate to the criticality of the data, not a reactive verification carried out only at the monitor's visit or in the context of an inspection.
For an SMO, the practical implication of these three clarifications is that a sponsor or auditor evaluating compliance with E6(R3) will verify not only whether the audit trail exists, but whether it is continuous, interpretable and integrated into an active data review process.
3. Redefined Roles and Responsibilities — What Concretely Changes for the Investigator and the SMO
If section 1 establishes that the operational logic has changed and section 2 demonstrates that data infrastructure is the evidence of this change, section 3 answers the question that matters most for an SMO: who does what and who is accountable for what under the new E6(R3) framework?
E6(R3) does not reinvent the responsibility structure of clinical trials. The investigator remains the primary responsible party at site level. The sponsor remains responsible for the quality and integrity of the entire trial. What changes is the precision with which delegation, oversight and documentation of these responsibilities are defined — and consequently, the role an SMO plays in demonstrating them.
3.1 Delegation of Activities — Documentation Proportionate to the Importance of the Activity
E6(R3) maintains the principle that the investigator may delegate trial activities to other persons or entities while retaining ultimate responsibility for them. What changes compared to E6(R2) is that the documentation of delegation must be proportionate to the significance of the delegated activities.
The guideline explicitly specifies that in situations where activities are performed as part of routine clinical practice, delegation documentation may not be required. This is a deliberate change from the previous approach — where the tendency was to exhaustively document any delegation regardless of its importance, generating large volumes of documentation with limited operational value.
For an SMO, the implication is twofold. On one hand, significant trial activities — those affecting participant safety, the quality of primary data or clinical decisions — must be rigorously documented and their oversight demonstrable. On the other hand, routine activities that form part of standard clinical practice do not require the same level of formal documentation. An SMO's ability to make this distinction correctly — and to demonstrate it to a sponsor or auditor — is a direct indicator of operational maturity.
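The distinction argued above can be sketched as a simple classification rule. The criteria and activity names below are assumptions for illustration — the guideline states the proportionality principle but leaves the concrete criteria to the site's judgment.

```python
# Illustrative classification only; criteria are assumptions, not guideline text.
def documentation_level(activity: dict) -> str:
    """Decide how formally a delegated activity should be documented."""
    significant = (
        activity["affects_safety"]
        or activity["affects_primary_data"]
        or activity["involves_clinical_decision"]
    )
    if significant:
        return "formal delegation record + documented oversight"
    if activity["routine_clinical_practice"]:
        return "no separate delegation documentation required"
    return "standard delegation log entry"

activities = [
    {"name": "IMP dose administration", "affects_safety": True,
     "affects_primary_data": False, "involves_clinical_decision": True,
     "routine_clinical_practice": False},
    {"name": "routine vital signs", "affects_safety": False,
     "affects_primary_data": False, "involves_clinical_decision": False,
     "routine_clinical_practice": True},
]
for a in activities:
    print(a["name"], "->", documentation_level(a))
```

What matters is not the specific rule but that a rule exists, is written down and can be shown to a sponsor — that is what demonstrating the proportionality distinction looks like in practice.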
3.2 Investigator Oversight — What Concrete Evidence Is Required
E6(R3) is explicit regarding oversight: the investigator must maintain adequate oversight of the persons and entities to whom activities have been delegated, and the level of this oversight must be proportionate to the nature of the delegated activities and their importance for participant safety and data reliability.
This means that an investigator cannot delegate significant trial activities to an SMO and consider their responsibility fulfilled upon signing the delegation log. E6(R3) requires evidence that oversight actually took place — not merely that it was intended.
For an SMO, this creates both a responsibility and an opportunity. The responsibility is that delegated activities must be executed to a standard that allows the investigator to demonstrate that their oversight was real and effective. The opportunity is that an SMO that provides the investigator with the tools and documentation needed to exercise this oversight — clear reports, activity traceability, structured escalation of issues — becomes a strategic partner, not merely a provider of administrative services.
3.3 How the SMO Repositions Itself Relative to the Sponsor in the New Contractual Framework
E6(R3) explicitly clarifies that agreements between sponsor and investigator, between investigator and service providers, and between sponsor and service providers must clearly define the roles, activities and responsibilities of each party. These agreements must be documented prior to the initiation of activities.
The direct implication for an SMO is that its contractual positioning relative to both the sponsor and the investigator must precisely reflect which activities it assumes, which responsibilities it takes on and what level of oversight the investigator maintains over the delegated activities. A vague agreement that describes SMO services in general terms is no longer sufficient in the context of E6(R3).
Furthermore, E6(R3) specifies that the sponsor must maintain adequate oversight of activities transferred to service providers — including activities subsequently subcontracted by those providers. This means that an SMO is visible to the sponsor not only through the investigator, but directly, as an entity executing trial activities with an impact on data quality and participant safety.
This relationship is bilateral. E6(R3) imposes concrete obligations on the sponsor towards the site — to provide the investigator with real-time access to data collected according to the protocol, to avoid exercising exclusive control over data captured in data acquisition systems and to ensure that the investigator can access the data necessary for retention and for decisions related to participant safety. An SMO that understands these sponsor obligations can manage the contractual and operational relationship with greater clarity and can escalate situations in which these obligations are not met.
This knowledge, combined with the capacity to demonstrate to the sponsor — through clear documentation, traceable processes and demonstrable operational quality — that its activities are executed according to the logic of E6(R3), concretely defines the difference between a verifiable site partner and one that merely declares compliance.
4. The Real Implementation Challenges in 2026
The first three sections of this article describe what E6(R3) requires and what these requirements mean for an SMO's operations. This section describes where the industry actually stands relative to these requirements — and where the gap between requirement and practice persists.
Acknowledging this gap is not a sign of weakness. It is the starting point for genuine strategic positioning.
4.1 Where the Gap Between Requirement and Practice Persists
Nearly a year after E6(R3) came into effect at the EMA level, the implementation gap is real and documentable across several operational areas.
The first area is the understanding and application of risk proportionality. The principle has existed in the guideline since 2016, with E6(R2). The reality is that most sites and SMOs continue to operate with standardised processes applied uniformly — not because they are unaware of the requirement, but because building genuinely differentiated processes based on the real risk of each trial requires an analytical and operational capacity that many have not systematically developed. The result is declarative compliance — sites can cite the principle of proportionality but cannot demonstrate how it is reflected in concrete day-to-day operational decisions.
The second area is Data Governance. The introduction of a dedicated section in E6(R3) has clarified the requirements, but implementing them requires more than updating SOPs. It requires that the computerised systems used at site level are evaluated, documented and demonstrated as fit for purpose — which in practice means that many sites are now discovering that systems they have used for years were never formally evaluated and documented. This is not a problem of bad faith — it is a problem of prioritisation and resources that is now manifesting as a concrete operational risk.
The third area is the documentation of delegation and oversight. Although E6(R3) introduces proportionality in this area as well — reducing the volume of formal documentation for routine activities — the reality is that many investigators and SMOs have not yet adjusted how they document delegation. Either they continue to document exhaustively and uniformly, consuming resources without added value, or they have reduced documentation without a clear justification of the proportionality criteria applied. Neither extreme reflects the logic of E6(R3).
4.2 What SMOs That Have Not Adapted Cannot Demonstrate
The implementation gap manifests concretely in what an SMO cannot demonstrate to a sponsor during the site evaluation and selection process.
It cannot demonstrate that it understands and applies risk proportionality. A sponsor evaluating a site according to the logic of E6(R3) will look for evidence that the site's processes are calibrated to the real risk of the study — not that the site has general quality procedures. The absence of this differentiation is visible and relevant in the selection decision.
It cannot demonstrate that its data infrastructure is fit for purpose according to E6(R3). Electronic systems without documented evaluation, incomplete or absent audit trails, data management procedures that do not cover the entire data lifecycle — these are concrete vulnerabilities that become visible either in the selection process or during an inspection.
It cannot demonstrate that the investigator's oversight of delegated activities is real and effective. Without clear traceability tools, without structured reporting and without documented evidence that the investigator exercised oversight — not merely signed a delegation log — an SMO cannot sustain this requirement in front of a demanding auditor or sponsor.
Each of these gaps represents not only a compliance risk — it represents a lost competitive opportunity in an ecosystem where sponsors have access to comparative data on site performance and make study allocation decisions based on that data.
5. From Compliance to Competitiveness — Strategic Positioning in the E6(R3) Ecosystem
The gap described in the previous section is not a permanent reality — it is a window of opportunity for SMOs that act now.
E6(R3) is not a guideline that rewards formal compliance. It is a guideline that makes visible the difference between an SMO that truly operates at the required standard and one that merely declares it does.
This distinction matters more now than ever — not because regulatory authorities are stricter, but because sponsors and CROs have access to evaluation tools and comparative data that make a site's operational performance transparent and measurable. An SMO that cannot demonstrate that its processes are calibrated to the real risk of the study, that its data infrastructure is fit for purpose and that oversight of delegated activities is real and documented, can no longer compensate for these gaps through relationships or general reputation. The data speaks first.
5.1 What an SMO Must Do Differently Starting Now
The first step is not revising SOPs. It is an honest assessment of the gap between what E6(R3) requires and what the SMO can currently demonstrate — not what it declares, but what it can prove with concrete documentation.
This assessment must cover three areas. First: are operational processes genuinely differentiated based on the risk of the study, or are they uniform with a declarative layer of proportionality applied on top? Second: do the computerised systems used at site level have a documented fitness for purpose evaluation, functional audit trails and clear procedures covering the entire data lifecycle? Third: does the documentation of delegation and the evidence of investigator oversight reflect the logic of E6(R3) — proportionate to the importance of the activities, not exhaustive and uniform?
Honest answers to these three questions define the real implementation agenda.
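The three questions above can be sketched as a minimal self-assessment structure. The question wording paraphrases the three areas, and the answers shown are hypothetical — the point is that each area is answered with provable documentation, not a declaration.

```python
# A minimal self-assessment sketch; questions paraphrase the three areas above,
# and the answers are hypothetical.
questions = {
    "risk_proportionality": "Are processes genuinely differentiated by study risk?",
    "data_infrastructure": ("Do systems have a documented fit-for-purpose "
                            "evaluation, functional audit trails and "
                            "full-lifecycle procedures?"),
    "delegation_oversight": ("Is delegation documentation proportionate, and is "
                             "oversight evidenced rather than just a signed log?"),
}

# For each area: can the SMO *prove* it with documentation today?
provable = {
    "risk_proportionality": False,
    "data_infrastructure": True,
    "delegation_oversight": False,
}

agenda = [area for area, ok in provable.items() if not ok]
print("implementation agenda:", agenda)
```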
5.2 How Positioning in the Industry Changes
An SMO that correctly implements the logic of E6(R3) does not merely become compliant — it becomes a credible strategic partner for sponsors and CROs operating at the same standard.
The difference is concrete. In the feasibility and site selection process, an SMO that can demonstrate genuine process proportionality, documented data infrastructure and effective investigator oversight reduces the sponsor's perceived risk and accelerates the study allocation decision. In an ecosystem where sponsors allocate studies based on demonstrated performance — not promises — this capacity is a direct competitive advantage.
5.3 Why 2026 Is the Moment of Differentiation
One year after E6(R3) came into effect, the industry is at an inflection point. Most sites and SMOs have gone through training and updated documents. Few have implemented the shift in operational logic that the guideline truly requires.
This means the differentiation window is open — but it will not remain open indefinitely. SMOs that implement now, with rigour and concrete evidence, will be positioned as first-choice partners for sponsors evaluating sites according to the E6(R3) standard. Those that wait will implement later, under more competitive conditions and with less room for differentiation.
E6(R3) is not a requirement to be checked off. It is a standard that redefines what quality, accountability and partnership mean in global clinical research — for sponsors, CROs, investigators and sites alike. For an SMO, understanding this distinction and acting accordingly is not a strategic option — it is the condition for long-term relevance in an ecosystem that now measures performance, not intention.
📄 Source: ICH Harmonised Guideline — Guideline for Good Clinical Practice E6(R3), final version adopted on 6 January 2025. 🔗 https://database.ich.org/sites/default/files/ICH_E6(R3)_Step4_FinalGuideline_2025_0106.pdf