Original Research | Published: 14 November 2022

Advancing the Real-World Evidence for Medical Devices through Coordinated Registry Networks

https://doi.org/10.1136/bmjsit-2021-000123

This is an open access article distributed in accordance with the Creative Commons Attribution Non Commercial (CC BY-NC 4.0) license.

Abstract

Objectives Generating and using real-world evidence (RWE) is a pragmatic solution for evaluating health technologies. RWE is recognized by regulators, health technology assessors, clinicians, and manufacturers as a valid source of information to support their decision-making. Well-designed registries can provide RWE and become more powerful when linked with electronic health records and administrative databases in coordinated registry networks (CRNs). Our objective was to create a maturity framework for CRNs and registries to guide their development and the prioritization of funding.

Design, setting, and participants We invited 52 stakeholders from diverse backgrounds, including patient advocacy groups and academic, clinical, industry, and regulatory experts, to participate in a Delphi survey. Of those invited, 42 participated in the survey to provide feedback on the maturity framework for CRNs and registries. An expert panel reviewed the responses to refine the framework until the target consensus of 80% was reached. Two rounds of the Delphi survey were distributed via the Qualtrics online platform from July to August 2020 and from October to November 2020.

Main outcome measures Consensus on the maturity framework for CRNs and registries, which consisted of seven domains (unique device identification, efficient data collection, data quality, product life cycle approach, governance and sustainability, quality improvement, and patient-reported outcomes), each presented with five levels of maturity.

Results Of 52 invited experts, 41 (78.8%) responded to round 1; all participants who proceeded to round 2 responded; and consensus was reached for most domains. The expert panel resolved the remaining disagreements, and final consensus estimates ranged from 80.5% to 92.7% across the seven domains.

Conclusions We have developed a robust framework to assess the maturity of any CRN (or registry) to provide reliable RWE. This framework will promote harmonization of approaches to RWE generation across different disciplines and health systems. The domains and their levels may evolve over time as new solutions become available.

Key messages

What is already known about this subject?

  • Several initiatives have been launched on national and international levels by the US Food and Drug Administration and the International Medical Device Regulators Forum to develop a real-world evidence (RWE) framework to provide supportive evidence for regulatory purposes.

  • Registries are a key source of RWE. Building on them, coordinated registry networks (CRNs) have been introduced to describe systems that aim to produce all the necessary evidence for regulators and key stakeholders by obtaining data from multiple sources.

What are the new findings?

  • We developed an innovative and robust framework to assess the maturity of registries and CRNs for device research and surveillance, to address increasing evidentiary needs of stakeholders.

  • We defined the maturity of CRNs by how close they come to providing all the required information in an accessible, thorough, relevant, and reliable form using seven domains: unique device identification, efficient data collection, data quality, product life cycle approach, governance and sustainability, quality improvement, and patient-reported outcomes.

How might these results affect future research or surgical practice?

  • This maturity framework for CRNs and registries will promote harmonization of approaches to RWE generation across different disciplines and health systems.

  • This framework will also help to prioritize investment in systems and processes that are sustainable and that will supply the evidence needed for regulatory and other evaluations requested by stakeholders.

Introduction

Modern healthcare is being transformed by the introduction of unprecedented numbers of new technologies, including many that require invasive procedures to implant medical devices. This creates a major challenge for developing the evidence needed to evaluate these technologies. It is not only the number of new devices but also their complexity and speed of development that compound the challenge. Comprehensive evidence about the short-term and long-term safety and effectiveness of devices is needed to support decision-making by regulators, health technology assessors, clinicians, patients, and other stakeholders if these devices are to be adopted in healthcare systems worldwide. The need for thorough and timely data on implanted devices has been highlighted by problems that have emerged related to certain hip implants,1 2 urogynecological surgical meshes,3 4 breast implants,5 and cardiovascular devices.6 Had better information systems been available during the early dissemination of these technologies, these problems might have been recognized and addressed sooner.

Using real-world evidence (RWE) is a pragmatic solution for the evaluation of many therapeutics and technologies.7 8 RWE can be produced by a variety of study designs using data from routinely collected sources such as electronic health records, registries, and patient-generated data, including patient-reported outcomes (PROs). Registries, as a major source of RWE, have received global attention, especially since the launch of specific initiatives by the International Medical Device Regulators Forum (IMDRF) exploring the possibilities of prospective national data collection in multiple countries, with the potential for subsequent data linkage to provide information from very large numbers of patients longitudinally.9 In the USA, the Food and Drug Administration (FDA) has spearheaded an initiative to develop an RWE framework to provide supportive evidence for regulatory purposes.10 11

Building on registries, the concept of coordinated registry networks (CRNs) has been introduced to describe systems that aim to produce all the necessary evidence for regulators and other stakeholders by obtaining data from multiple sources.12 In theory, it is possible for a single registry to do this, but in practice, that is usually not feasible. The solution is linkage of registries and a diverse range of other datasets to provide the whole spectrum of information needed for device evaluation. In the USA, the development of CRNs has been led by the FDA and the Medical Device Epidemiology Network (MDEpiNet), with the aim of creating national and international partnerships and methodologies for leveraging RWE to evaluate medical devices throughout the total product life cycle (TPLC).13 CRNs not only provide the prospect of robust RWE on the safety and effectiveness of devices but also offer the possibility of nested study designs that can expedite patient recruitment at a lower cost than traditional clinical research.14–16

With growing recognition of the value of CRNs, there is a need to develop a consensus-based framework to evaluate them and the registries on which they are based. Our aim was to develop a robust framework to assess the maturity of registries and CRNs for device research and surveillance, to address increasing evidentiary needs of stakeholders. We defined the maturity of CRNs by how close they come to providing all the required information in an accessible, thorough, relevant, and reliable form. This framework will help to prioritize investment in systems and processes that are sustainable and that will supply the evidence needed for regulatory and other evaluations requested by stakeholders. We describe this framework, along with its evolution through two rounds of Delphi survey responses from a range of key stakeholders.

Methods

The key domains of the framework were developed by the MDEpiNet Coordinating Centre in consultation with the FDA and an expert group of collaborators from patient advocacy groups and academic, clinical, industry, and regulatory settings. The domains of the framework were based on a previous IMDRF report17 that was led by a number of coauthors of this study. There are seven domains:

  1. Promotion of unique device identification.

  2. Improving data collection efficiency.

  3. Advancing data quality for regulatory decision-making.

  4. Considering TPLC research.

  5. Establishing governance and ensuring sustainability.

  6. Leveraging registries as quality systems.

  7. Incorporation of patient-generated data and PROs.

In this study, we used the Delphi method for reaching consensus to develop and refine the framework from our initial design. The Delphi method was established by the RAND Corporation in 1964.18 It was introduced to eliminate the peer influence inherent in traditional survey designs by using an anonymous platform to collect unbiased opinions. The technique employs multiple rounds of questionnaires until a target consensus is reached from a group of diverse participants. This method is used routinely by MDEpiNet to create core minimum data elements for studying various technologies19–22 and is also used by our partner IDEAL (Idea, Development, Exploration, Assessment, Long-term study) framework group to develop guidelines for the evaluation of new surgical techniques and complex therapeutic technologies.23

Using the Delphi method, we aimed to establish a framework that includes five levels of maturity for each of the seven domains. MDEpiNet Executive Operations Committee members, CRN leaders, and a range of stakeholders experienced in the field of device research and surveillance using RWE were selected for the survey.

Two rounds of the Delphi survey were circulated to participants using Weill Cornell Medicine's Qualtrics survey platform (https://weillcornell.az1.qualtrics.com/). Round 1 was carried out in July–August 2020 and round 2 in October–November 2020. First, participants were asked to agree or disagree with each entire domain, as presented, together with its five levels of maturity. If they agreed with the domain as presented, they moved on to the next domain. If they disagreed with any detail of the domain or its five levels, they were presented with follow-up questions, which allowed them to provide free-text comments on the text of the domain or any of its proposed five levels. In this way, consensus was assessed for each of the seven domains as well as their five levels.

The defined aim of the Delphi process was achievement of 80% consensus for all seven domains and their five levels. The lead investigators and registry experts (AS, DM-D, JLC, EWP, PG, and AB) reviewed the survey results and used the comments made by participants to propose changes to the text of the domains and their levels until the target of 80% consensus was reached when next reviewed by the Delphi participants. The responses of round 1 were reviewed to provide clarity on definitions or language without losing the meaning or the structure of the overall model. This guided revision of the framework for round 2 of the survey. Based on the responses from round 2, the framework was further revised to create the final version, which was circulated to the participants for final comments.
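To make the consensus rule concrete, the minimal sketch below tallies per-domain agreement against the 80% target and flags domains that would need revision in a subsequent round. The vote counts and the subset of domains shown are hypothetical, and the code is purely illustrative rather than the analysis procedure used in the study.

```python
# Illustrative sketch only: the vote counts below are hypothetical,
# not the study data. Each participant voted "agree" or "disagree"
# on each domain as presented with its five levels of maturity.

CONSENSUS_TARGET = 0.80  # the 80% agreement target used in the Delphi process

# Hypothetical round of responses: domain -> list of participant votes
responses = {
    "Unique device identification": ["agree"] * 36 + ["disagree"] * 5,
    "Efficient data collection": ["agree"] * 34 + ["disagree"] * 7,
    "Data quality": ["agree"] * 31 + ["disagree"] * 10,
}

def percent_agreement(votes):
    """Share of participants who agreed with the domain as presented."""
    return sum(v == "agree" for v in votes) / len(votes)

for domain, votes in responses.items():
    agreement = percent_agreement(votes)
    status = ("consensus reached" if agreement >= CONSENSUS_TARGET
              else "revise and re-survey")
    print(f"{domain}: {agreement:.1%} ({status})")
```

In the study itself, domains falling short of the target were not simply re-surveyed unchanged; free-text comments guided revision of the wording before the next round, as described above.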

The one domain in which 80% consensus was not achieved was domain 3, data quality. To resolve this, an expert panel (JLC, EWP, and AS) was convened and reviewed all the relevant comments by teleconference on 15 December 2020. The revised version of this domain was then presented to participants and committee members on 29 December 2020 via Zoom videoconference, at which point the target consensus level was achieved.

Results

Fifty-two individuals were identified as experts and were invited to participate. Of the 52 invited, a total of 41 responded to round 1 of the survey (78.8% response rate). Six participants expressed 100% agreement in round 1 and were therefore excluded from round 2. All of the remaining 35 responded to round 2 (100%). There were, therefore, 41 responses in total, which were evaluated for the final model development. The characteristics of the survey participants are summarized in table 1. Twenty-one (51.2%) experts represented a clinical perspective, and 12 of these also had academic roles. Six (14.6%) additional experts primarily represented an academic perspective. There was also representation from the regulatory, industry, and patient stakeholder communities (table 1).

Table 1 Characteristics of the Delphi survey participants (N=41)

In round 1, only the TPLC domain achieved the target consensus (82.9%) (table 2). In round 2, the agreement level for each domain increased, such that all domains reached the 80% target for consensus except the data quality domain, which achieved 75.6% agreement. Where there were disagreements, these tended to be heterogeneous, without any common theme. The one exception was the data quality domain, in which there was disagreement about the level of detail and therefore the burden of auditing requirements. After an expert panel review, the team was able to resolve the disagreement and propose a modified version of the data quality domain to reach the target consensus, which was then agreed on by the participants when the final framework was circulated for comment. Most importantly, each item (levels 1–5) of each domain in the framework received more than 90% agreement in round 2 (figure 1).

Table 2 Delphi survey results: percentage agreement and disagreement on the coordinated registry network maturity framework

Figure 1 Delphi survey results: percentage agreement on levels of maturity in each domain. PRO, patient-reported outcome; TPLC, total product life cycle.

Discussion

Our two rounds of Delphi responses achieved consensus for all seven domains of the framework (with subsequently agreed adjustment for one domain), each with five levels, designed to evaluate the maturity of CRNs and registries to support medical device evaluation.

This framework offers a means of assessing the maturity, or level of development, of registries and CRNs, whatever their origin or type, specifically with respect to their suitability for generating reliable RWE. This is important because registries and CRNs of widely varying design and capability currently exist globally. Not all registries or CRNs were created for the purpose of generating research results. Some CRNs are based on registries started by medical professional societies in response to transformational technologies such as transcatheter valve therapies.24 Others leverage traditional registries developed by specialty societies (eg, the Society for Vascular Surgery (SVS)) and a variety of integrated health systems, used for clinical care, quality improvement, billing, and the collection of national statistics.25 26 Our framework offers an understanding of the capability of any registry or CRN to provide information for relevant stakeholders in a standardized manner. It will promote harmonization of the assessment of maturity across different disciplines and different health systems worldwide. The framework is intended as a dynamic model, with the capacity to evolve over time as new experiences accumulate within the data ecosystem.

An example of a CRN that has achieved maturity in most domains described by our framework is the Vascular Implant Surveillance and Interventional Outcomes Network (VISION) CRN. This CRN efficiently captures device data for patients undergoing vascular surgery and interventional care. It covers 40% of relevant hospitals across the USA, with over 90% patient enrollment and case report form completion. It has created a national repository of linked data sources to obtain long-term outcomes and has published a series of studies documenting the value of these data for quality improvement and for informing regulatory decisions (figure 2).27–30 It also contributes to the International Consortium of Vascular Registries (ICVR) for analyses of device outcomes.31 32 The CRN uses data from the Vascular Quality Initiative of the Society for Vascular Surgery, which was launched in 2003 to improve the quality, safety, and effectiveness of vascular procedures and to reduce their costs.26 As a quality improvement initiative, it conducts regular data audits and provides reports back to participating institutions, including outlier assessments. The VISION CRN has also recently launched a PRO data collection pilot with disease-specific and general health measures.

Figure 2 Vascular Implant Surveillance and Interventional Outcomes Network coordinated registry network. The center hexagon depicts the role of the MDEpiNet Coordinating Centre, and the outer hexagons represent various data partners and data sources. CDRN, Clinical Data Research Network; DUA, data use agreement; MOU, memorandum of understanding; OPC, Objective Performance Criteria; OPG, Objective Performance Goals; PCORI, Patient-Centered Outcomes Research Institute; SPARCS, Statewide Planning and Research Cooperative System; VQI, Vascular Quality Initiative.

It should be emphasized that a CRN or registry does not need to have the highest score in all seven domains to be considered mature and useful. Application of the maturity framework and the levels of maturity that it specifies can help registries and CRNs identify gaps and prioritize investments in data infrastructure and analytical processes, provided that their initial design has the potential for evolution. It is also important to understand that the framework is not intended as a system to produce an overall 'score' for any CRN or registry, but rather to assess each of its domains and then to take an overall perspective. What is important for any registry or CRN is its initial design, which needs to have the capacity for development along the lines set out in the framework. This means involving all the key stakeholders in the initial development phase, including patients, professional societies, and manufacturers.
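To illustrate this domain-by-domain view, the sketch below records a hypothetical maturity level (1–5) for each of the seven domains and highlights where a registry or CRN might prioritize investment. The levels and the planning target shown are invented for illustration; the framework itself does not prescribe a single cut-off or combine the domains into one score.

```python
# Illustrative sketch: maturity is assessed per domain, not summed into one score.
# The levels below are hypothetical examples, not an assessment of any real CRN.

DOMAINS = [
    "Unique device identification",
    "Efficient data collection",
    "Data quality",
    "Total product life cycle approach",
    "Governance and sustainability",
    "Quality improvement",
    "Patient-reported outcomes",
]

# Hypothetical assessment on the five-level scale (1 = least mature).
assessment = {
    "Unique device identification": 4,
    "Efficient data collection": 3,
    "Data quality": 5,
    "Total product life cycle approach": 4,
    "Governance and sustainability": 2,
    "Quality improvement": 3,
    "Patient-reported outcomes": 1,
}

TARGET_LEVEL = 3  # a hypothetical planning target, not part of the framework

for domain in DOMAINS:
    level = assessment[domain]
    note = "" if level >= TARGET_LEVEL else "  <- prioritize investment here"
    print(f"{domain}: level {level}/5{note}")
```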

From an international perspective, there are emerging linked data networks in Europe and Australia that can become CRNs. In the UK, for example, well-established national registries, such as the National Joint Registry, have been linked with routinely collected Hospital Episode Statistics to study the risk of revision due to prosthetic joint infection following primary knee replacement.33 The UK Transcatheter Aortic Valve Implantation registry also has experience of linkage with routinely collected National Health Service data and has reported outcomes of transcatheter aortic valve implantation from 2007 to 2012.34 The UK Clinical Practice Research Datalink, with detailed information on 60 million patients from community-based primary care providers, also provides huge potential for real-world data studies, although specific device identification is challenging.35 In Australia, population-based linked hospital morbidity and mortality data have been used to study age-stratified outcomes of surgical aortic valve replacement.36
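For readers unfamiliar with the mechanics of such registry-to-administrative linkage, the minimal sketch below joins a hypothetical registry extract to a hypothetical administrative dataset on a shared patient identifier to derive a long-term outcome. All field names and records are invented and do not reflect any of the datasets mentioned above; real linkage additionally involves governance agreements, privacy-preserving identifiers, and data quality checks.

```python
# Illustrative sketch of registry-to-administrative data linkage.
# All column names and records are hypothetical.
import pandas as pd

registry = pd.DataFrame({
    "patient_id": [101, 102, 103],
    "device_udi": ["UDI-A", "UDI-B", "UDI-A"],
    "procedure_date": pd.to_datetime(["2020-01-15", "2020-02-03", "2020-03-22"]),
})

claims = pd.DataFrame({
    "patient_id": [101, 103, 104],
    "readmission_date": pd.to_datetime(["2020-04-01", "2020-05-10", "2020-06-30"]),
})

# Left join keeps every registry record and attaches any long-term outcome
# found in the administrative source; unmatched records remain with no outcome.
linked = registry.merge(claims, on="patient_id", how="left")
linked["days_to_readmission"] = (
    linked["readmission_date"] - linked["procedure_date"]
).dt.days
print(linked)
```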

Our framework creates opportunities for harmonization and global collaboration in the development and evolution of registries and CRNs because it offers a shared vision of the qualities required for them to provide reliable and useful RWE. International collaborations offer great potential for rapid acquisition of information about the short-term and long-term safety and effectiveness of novel and established technologies, particularly those that are not commonly used in clinical practice. MDEpiNet has initiated international collaborations that are using registries and administrative datasets for device research and surveillance. Examples include the ICVR37 and the International Consortium of Orthopaedic Registries.38 Applying the CRN maturity framework to these systems will help make global collaborations increasingly robust and useful.

Conclusions

Our maturity framework offers a consistent method for assessing the capacity of registries and CRNs to provide useful and reliable RWE about medical devices. It identifies gaps and guides their future development. It can be applied in any country or health system and, therefore, has value in enabling international collaborations.

Appendix A1: Maturity framework
