Article Text


Advancing the Real-World Evidence for Medical Devices through Coordinated Registry Networks
Art Sedrakyan^1, Danica Marinac-Dabic^2, Bruce Campbell^3, Suvekshya Aryal^1, Courtney E Baird^4, Philip Goodney^5, Jack L Cronenwett^5, Adam W Beck^6, Elizabeth W Paxton^7, Jim Hu^8, Ralph Brindis^9, Kevin Baskin^10, Terrie Cowley^11, Jeffery Levy^12, David S Liebeskind^13, Benjamin K Poulose^14, Charles R Rardin^15, Frederic S Resnic^16, James Tcheng^17, Benjamin Fisher^2, Charles Viviano^2, Vincent Devlin^2, Murray Sheldon^2, Jens Eldrup-Jorgensen^18,19, Jesse A Berlin^20, Joseph Drozda^21, Michael E Matheny^22, Sanket S Dhruva^23, Timothy Feeney^24, Kristi Mitchell^25 and Gregory Pappas^26
  1. Department of Population Health Sciences; Medical Devices Epidemiology Network (MDEpiNet) Coordinating Center, Weill Cornell Medical College, New York, New York, USA
  2. Center for Devices and Radiological Health (CDRH), US Food and Drug Administration, Silver Spring, Maryland, USA
  3. Vascular Surgery, University of Exeter Medical School, Exeter, UK
  4. Health Services, Policy and Practice, Brown University School of Public Health, Providence, Rhode Island, USA
  5. Vascular Surgery, Dartmouth-Hitchcock Medical Center, Lebanon, New Hampshire, USA
  6. Division of Vascular Surgery and Endovascular Therapy, University of Alabama, Birmingham, Alabama, USA
  7. Surgical Outcomes and Analysis, Kaiser Permanente, Harbor City, California, USA
  8. Department of Urology, Weill Cornell Medical College, New York, New York, USA
  9. Philip R. Lee Institute for Health Policy Studies, University of California San Francisco, San Francisco, California, USA
  10. Vascular and Interventional Radiology, Conemaugh Memorial Medical Center, Johnstown, Pennsylvania, USA
  11. The TMJ Association, Milwaukee, Wisconsin, USA
  12. Robotic Surgery, Institute of Surgical Excellence, Philadelphia, Pennsylvania, USA
  13. Department of Neurology, Stroke Center, University of California Los Angeles, Los Angeles, California, USA
  14. Center for Abdominal Core Health, Ohio State University Wexner Medical Center, Columbus, Ohio, USA
  15. Department of Obstetrics and Gynecology, Women and Infants Hospital of Rhode Island, Providence, Rhode Island, USA
  16. Department of Cardiology, Comparative Effectiveness Research Institute, Lahey Hospital and Medical Center, Burlington, Massachusetts, USA
  17. Department of Medicine, Division of Cardiology, Duke University, Durham, North Carolina, USA
  18. Vascular Surgery, Maine Medical Center, Portland, Maine, USA
  19. Surgery, Tufts University School of Medicine, Boston, Massachusetts, USA
  20. Global Epidemiology, Johnson and Johnson Limited, New Brunswick, New Jersey, USA
  21. Outcomes Research, Mercy Health, St. Louis, Missouri, USA
  22. Department of Biomedical Informatics and Department of Medicine, Vanderbilt University Medical Center, Nashville, Tennessee, USA
  23. Department of Medicine, University of California San Francisco School of Medicine, San Francisco, California, USA
  24. Department of Surgery, Boston University, Boston, Massachusetts, USA
  25. Health Equity Outcomes, Avalere Health, Washington, DC, USA
  26. Center for Biologics Evaluation and Research (CBER), US Food and Drug Administration, Silver Spring, Maryland, USA

Correspondence to Dr Art Sedrakyan; ars2013{at}


Objectives Generating and using real-world evidence (RWE) is a pragmatic solution for evaluating health technologies. RWE is recognized by regulators, health technology assessors, clinicians, and manufacturers as a valid source of information to support their decision-making. Well-designed registries can provide RWE and become more powerful when linked with electronic health records and administrative databases in coordinated registry networks (CRNs). Our objective was to create a maturity framework for CRNs and registries, thereby guiding their development and the prioritization of funding.

Design, setting, and participants We invited 52 stakeholders from diverse backgrounds, including patient advocacy groups and academic, clinical, industry, and regulatory experts, to participate in a Delphi survey. Of those invited, 41 participated in the survey to provide feedback on the maturity framework for CRNs and registries. An expert panel reviewed the responses to refine the framework until the target consensus of 80% was reached. Two rounds of the Delphi survey were distributed via the Qualtrics online platform, from July to August 2020 and from October to November 2020.

Main outcome measures Consensus on the maturity framework for CRNs and registries, consisting of seven domains (unique device identification, efficient data collection, data quality, product life cycle approach, governance and sustainability, quality improvement, and patient-reported outcomes), each presented with five levels of maturity.

Results Of 52 invited experts, 41 (78.8%) responded to round 1; all 35 participants invited to round 2 responded; and consensus was reached for most domains. The expert panel resolved the disagreements, and final consensus estimates ranged from 80.5% to 92.7% across the seven domains.

Conclusions We have developed a robust framework to assess the maturity of any CRN (or registry) to provide reliable RWE. This framework will promote harmonization of approaches to RWE generation across different disciplines and health systems. The domains and their levels may evolve over time as new solutions become available.

  • device safety
  • device surveillance
  • health technology
  • real world evidence
  • health care quality, access, and evaluation

Data availability statement

All data relevant to the study are included in the article or uploaded as supplementary information.

This is an open access article distributed in accordance with the Creative Commons Attribution Non Commercial (CC BY-NC 4.0) license, which permits others to distribute, remix, adapt, build upon this work non-commercially, and license their derivative works on different terms, provided the original work is properly cited, appropriate credit is given, any changes made are indicated, and the use is non-commercial. See:


Key messages

What is already known about this subject?

  • Several initiatives have been launched on national and international levels by the US Food and Drug Administration and the International Medical Device Regulators Forum to develop a real-world evidence (RWE) framework to provide supportive evidence for regulatory purposes.

  • Registries are a key source of RWE; building on them, coordinated registry networks (CRNs) have been introduced to describe systems that aim to produce all the necessary evidence for regulators and key stakeholders by obtaining data from multiple sources.

What are the new findings?

  • We developed an innovative and robust framework to assess the maturity of registries and CRNs for device research and surveillance, to address increasing evidentiary needs of stakeholders.

  • We defined the maturity of CRNs by how close they come to providing all the required information in an accessible, thorough, relevant, and reliable form, using seven domains: unique device identification, efficient data collection, data quality, product life cycle approach, governance and sustainability, quality improvement, and patient-reported outcomes.

How might these results affect future research or surgical practice?

  • This maturity framework for CRNs and registries will promote harmonization of approaches to RWE generation across different disciplines and health systems.

  • This framework will also help to prioritize investment in systems and processes that are sustainable and that will supply the evidence needed for regulatory and other evaluations requested by stakeholders.


Modern healthcare is being transformed by the introduction of unprecedented numbers of new technologies, including many that require invasive procedures to implant medical devices. This creates a major challenge for the evidence development needed to evaluate these technologies. It is not only the number of new devices but also their complexity and speed of development that compound the challenge. Comprehensive evidence about the short-term and long-term safety and effectiveness of devices is needed to support decision-making by regulators, health technology assessors, clinicians, patients, and other stakeholders if these devices are to be adopted in healthcare systems worldwide. The need for thorough and timely data on implanted devices has been highlighted by problems that have emerged related to certain hip implants,1 2 urogynecological surgical meshes,3 4 breast implants,5 and cardiovascular devices.6 Had better information systems been available during the early dissemination of these technologies, these problems might have been recognized and addressed sooner.

Using real-world evidence (RWE) is a pragmatic solution for the evaluation of many therapeutics and technologies.7 8 RWE can be produced by a variety of study designs using data from routinely collected sources such as electronic health records, registries, and patient-generated data, including patient-reported outcomes (PROs). Registries, as a major source of RWE, have received global attention, especially since the launch of specific initiatives by the International Medical Device Regulators Forum (IMDRF), exploring the possibilities of prospective national data collection in multiple countries, with the potential for subsequent data linkage to provide information from very large numbers of patients longitudinally.9 In the USA, the Food and Drug Administration (FDA) has spearheaded an initiative to develop an RWE framework to provide supportive evidence for regulatory purposes.10 11

Building on registries, the concept of coordinated registry networks (CRNs) has been introduced to describe systems that aim to produce all the necessary evidence for regulators and other stakeholders by obtaining data from multiple sources.12 In theory, it is possible for a single registry to do this, but in practice, that is usually not feasible. The solution is linkage of registries and a diverse range of other datasets to provide the whole spectrum of information needed for device evaluation. In the USA, the development of CRNs has been led by the FDA and the Medical Device Epidemiology Network (MDEpiNet), with the aim of creating national and international partnerships and methodologies for leveraging RWE to evaluate medical devices throughout the total product life cycle (TPLC).13 CRNs not only provide the prospect of robust RWE on the safety and effectiveness of devices but also offer the possibility of nested study designs that can expedite patient recruitment at a lower cost than traditional clinical research.14–16

With growing recognition of the value of CRNs, there is a need to develop a consensus-based framework to evaluate them and the registries on which they are based. Our aim was to develop a robust framework to assess the maturity of registries and CRNs for device research and surveillance, to address increasing evidentiary needs of stakeholders. We defined the maturity of CRNs by how close they come to providing all the required information in an accessible, thorough, relevant, and reliable form. This framework will help to prioritize investment in systems and processes that are sustainable and that will supply the evidence needed for regulatory and other evaluations requested by stakeholders. We describe this framework, along with its evolution through two rounds of Delphi survey responses from a range of key stakeholders.


The key domains of the framework were developed by the MDEpiNet Coordinating Center in consultation with the FDA and an expert group of collaborators from patient advocacy groups and academic, clinical, industry, and regulatory settings. The domains of the framework were based on a previous IMDRF report17 that was led by a number of coauthors of this study. There are seven domains:

  1. Promotion of unique device identification.

  2. Improving data collection efficiency.

  3. Advancing data quality for regulatory decision-making.

  4. Considering TPLC research.

  5. Establishing governance and ensuring sustainability.

  6. Leveraging registries as quality systems.

  7. Incorporation of patient-generated data and PROs.

In this study, we used the Delphi method for reaching consensus to develop and refine the framework from our initial design. The Delphi method was established by the RAND Corporation in 1964.18 It was introduced to eliminate the peer influence inherent in traditional survey designs by using an anonymous platform to collect unbiased opinions. The technique employs multiple rounds of questionnaires until a target consensus is reached from a group of diverse participants. This method is used routinely by MDEpiNet to create core minimum data elements for studying various technologies19–22 and is also used by our partner IDEAL (Idea, Development, Exploration, Assessment, Long-term study framework) group to develop guidelines for evaluation of new surgical techniques and complex therapeutic technologies.23

Using the Delphi method, we aimed to establish a framework that includes five levels of maturity for each of the seven domains. MDEpiNet Executive Operations Committee members, CRN leaders, and a range of stakeholders experienced in the field of device research and surveillance using RWE were selected for the survey.

Two rounds of the Delphi survey were circulated to participants using Weill Cornell Medicine's Qualtrics survey platform. Round 1 was carried out in July–August 2020 and round 2 in October–November 2020. First, the participants were asked to agree or disagree on each entire domain, as presented, together with its five levels of maturity. If the participants agreed on the domain as it was presented, they moved on to the next domain. However, if they disagreed on any detail of the domain or its five levels, they were presented with follow-up questions. The follow-up questions allowed participants to provide free-text comments on the text of the domain or any of its proposed five levels regarding their disagreement. In this way, consensus was assessed for each of the seven domains as well as their five levels.

The defined aim of the Delphi process was achievement of 80% consensus for all seven domains and their five levels. The lead investigators and registry experts (AS, DM-D, JLC, EWP, PG, and AB) reviewed the survey results and used the comments made by participants to propose changes to the text of the domains and their levels until the 80% consensus target was reached on the next review by the Delphi participants. The responses to round 1 were reviewed to clarify definitions and language without losing the meaning or the structure of the overall model. This guided revision of the framework for round 2 of the survey. Based on the responses from round 2, the framework was further revised to create the final version, which was circulated to the participants for final comments.

The one domain in which 80% consensus was not achieved was domain 3, data quality. To resolve this, an expert panel (JLC, EWP, and AS) was convened, which reviewed all the relevant comments by teleconference on 15 December 2020. The revised version of this domain was then presented to participants and committee members on 29 December 2020 via Zoom videoconference, at which point the target consensus level was achieved.
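As a minimal sketch of the consensus rule described above, the following hypothetical Python fragment computes per-domain agreement and flags any domain that falls below the 80% target for revision in the next round. The function names and vote counts are illustrative, not the authors' actual analysis code; the 31/41 and 38/41 splits simply reproduce the reported 75.6% and 92.7% agreement levels.

```python
# Hypothetical sketch of a Delphi per-domain consensus check.
# Names, data, and structure are illustrative only.

CONSENSUS_TARGET = 0.80  # 80% agreement required for each domain


def domain_agreement(responses):
    """Fraction of participants agreeing with a domain as presented.

    `responses` is a list of booleans: True = agree, False = disagree.
    """
    if not responses:
        raise ValueError("no responses recorded")
    return sum(responses) / len(responses)


def domains_needing_revision(round_results):
    """Return domains whose agreement falls below the consensus target.

    `round_results` maps domain name -> list of agree/disagree booleans.
    """
    return {
        domain: agreement
        for domain, votes in round_results.items()
        if (agreement := domain_agreement(votes)) < CONSENSUS_TARGET
    }


# Illustrative round-2 data: 41 evaluable responses per domain.
round2 = {
    "data quality": [True] * 31 + [False] * 10,             # 75.6% agreement
    "total product life cycle": [True] * 38 + [False] * 3,  # 92.7% agreement
}
print(domains_needing_revision(round2))  # only "data quality" is flagged
```

In this sketch, only the data quality domain would be returned for another revision cycle, mirroring the course of the actual survey.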


Fifty-two individuals were identified as experts and were invited to participate. Of the 52 invited, a total of 41 responded to round 1 of the survey (78.8% response rate). Six participants expressed 100% agreement in round 1 and so were excluded from round 2. All 35 remaining participants responded to round 2 (100%). There was, therefore, a total of 41 responses, which were evaluated for the final model development. The characteristics of the survey participants are summarized in table 1. Twenty-one experts (51.2%) represented a clinical perspective, and 12 of these also had academic roles. Six additional experts (14.6%) primarily represented an academic perspective. There was also representation from the regulatory, industry, and patient stakeholder communities (table 1).

Table 1

Characteristics of the Delphi survey participants (N=41)

In round 1, only the TPLC domain achieved the target consensus (82.9%) (table 2). In round 2, the agreement level for each domain increased, such that all domains reached the 80% target for consensus except the data quality domain, which achieved 75.6% agreement. Where there were disagreements, these tended to be heterogeneous, without any common theme. The one exception was the data quality domain, in which there was disagreement about the level of detail and therefore the burden of auditing requirements. After an expert panel review, the team was able to resolve the disagreement and propose a modified version of the data quality domain to reach the target consensus, which was then agreed on by the participants when the final framework was circulated for comment. Most importantly, each item (levels 1–5) of each domain in the framework received more than 90% agreement in round 2 (figure 1).

Table 2

Delphi survey results: % agreement and disagreement on coordinated registry network maturity framework

Supplemental material

Figure 1

Delphi survey results: percent agreement on levels of maturity in each domain. PRO, patient-reported outcome; TPLC, total product life cycle.


Our two rounds of Delphi responses achieved consensus for all seven domains of the framework (with subsequently agreed adjustment for one domain), each with five levels, designed to evaluate the maturity of CRNs and registries to support medical device evaluation.

This framework offers a means of assessing the maturity, or level of development, of registries and CRNs, whatever their origin or type, specifically with respect to their suitability for generating reliable RWE. This is important because registries and CRNs of widely varying types and capabilities currently exist globally. Not all registries or CRNs were created for the purpose of generating research results. Some CRNs are based on registries started by medical professional societies in response to transformational technologies such as transcatheter valve therapies.24 Others leverage traditional registries developed by specialty societies (eg, the Society for Vascular Surgery (SVS)) and a variety of integrated health systems, used for clinical care, quality improvement, billing, and also for collection of national statistics.25 26 Our framework offers an understanding of the capability of any registry or CRN to provide information for relevant stakeholders in a standardized manner. It will promote harmonization of the assessment of maturity across different disciplines and different health systems worldwide. The framework is intended as a dynamic model, with the capacity to evolve over time as new experiences accumulate within the data ecosystem.

An example of a CRN that has achieved maturity in most domains described by our framework is the Vascular Implant Surveillance and Interventional Outcomes Network (VISION) CRN. This CRN is successful in efficiently capturing device data for patients undergoing vascular surgery and interventional care. The CRN covers 40% of relevant hospitals across the USA, with over 90% patient enrollment and case report form completion. It has created a national repository of linked data sources to obtain long-term outcomes and has published a series of studies documenting the value of these data for quality improvement and for informing regulatory decisions (figure 2).27–30 It also contributes to the International Consortium of Vascular Registries (ICVR) for analyses of device outcomes.31 32 The CRN uses data from the Vascular Quality Initiative of the Society for Vascular Surgery, which was launched in 2003 to improve the quality, safety, and effectiveness and to reduce the costs of vascular procedures.26 As a quality improvement initiative, it conducts regular data audits and provides reports back to participating institutions, including outlier assessments. The VISION CRN has also recently launched a PRO data collection pilot with disease-specific and general health measures.

Figure 2

Vascular Implant Surveillance and Interventional Outcomes Network coordinated registry network. The center hexagon depicts the role of the MDEpiNet Coordinating Center, and outer hexagons represent various data partners and data sources. CDRN, Clinical Data Research Network; DUA, data use agreement; MOU, memorandum of understanding; OPC, Objective Performance Criteria; OPG, Objective Performance Goals; PCORI, Patient-Centered Outcomes Research Institute; SPARCS, Statewide Planning and Research Cooperative System; VQI, Vascular Quality Initiative.

It should be emphasized that a CRN or registry does not need to have the highest score in all seven domains to be considered mature and useful. Application of the maturity framework and the levels of maturity that it specifies can help registries and CRNs identify gaps and prioritize investments in data infrastructure and analytical processes, provided that their initial design has the potential for evolution. It is also important to understand that the framework is not intended as a system to produce an overall 'score' for any CRN or registry, but rather to assess each of its domains and then to take an overall perspective. What is important for any registry or CRN is its initial design, which needs to have the capacity for development along the lines set out in the framework. This means involving all the key stakeholders in the initial development phase, including patients, professional societies, and manufacturers.

From the international perspective, there are emerging linked data networks in Europe and Australia that can become CRNs. In the UK, for example, there are well-established national registries, such as the National Joint Registry, that have been linked with routinely collected Hospital Episode Statistics to study the risk of revision due to prosthetic joint infection following primary knee replacement.33 The UK Transcatheter Aortic Valve Implantation registry also has experience of linkage with routine National Health Service data and has provided outcomes of transcatheter aortic valve implantation from 2007 to 2012.34 The UK Clinical Practice Research Datalink, with detailed information on 60 million patients from primary care providers (community based), also offers huge potential for real-world data studies, although specific device identification is challenging.35 In Australia, population-based linked hospital morbidity and mortality data have been used to study age-stratified outcomes of surgical aortic valve replacement.36

Our framework creates opportunities for harmonization and global collaboration in the development and evolution of registries and CRNs because it offers a shared vision of the qualities required for them to provide reliable and useful RWE. International collaborations offer great potential for rapid acquisition of information about short-term and long-term safety and effectiveness of novel and established technologies, particularly those that are not commonly used in clinical practice. MDEpiNet has initiated international collaborations that are using registries and administrative datasets for device research and surveillance. Examples include the ICVR37 and the International Consortium of Orthopaedics Registries.38 Applying the CRN maturity framework to these systems will help make global collaborations increasingly more robust and useful.


Our maturity framework offers a consistent method for assessing the capacity of registries and CRNs to provide useful and reliable RWE about medical devices. It identifies gaps and guides their future development. It can be applied in any country or health system and, therefore, has value in enabling international collaborations.

Appendix A1: Maturity framework


Ethics statements

Patient consent for publication

Ethics approval

Institutional review board approval was not required for this study as it did not involve human subjects research.


We acknowledge the following collaborators for their thoughtful input and contributions to the Delphi consensus survey: Sameer Ansari, MD, PhD (Northwestern University Feinberg School of Medicine, Chicago, Illinois, USA); Joseph E Bavaria, MD (University of Pennsylvania, PennMedicine, Philadelphia, Pennsylvania, USA); Christian Behrendt, MD (University Heart Centre Hamburg-Eppendorf, Hamburg, Germany); Deborah Fisher, MD (Duke University School of Medicine, Durham, North Carolina, USA); George Gibeily, MD (US Food and Drug Administration (FDA), Silver Spring, Maryland, USA); Peter G Goldschmidt, MD, DrPH (World Development Group, Bethesda, Maryland, USA); Maryam Guiahi, MD (University of Colorado School of Medicine, Colorado, USA); Nobuhiro Handa, MD (Pharmaceutical and Medical Device Agency, Tokyo, Japan); Ted Heise, PhD (MED Institute, West Lafayette, Indiana, USA); Louisa Jorm, PhD (University of New South Wales, Sydney, Australia); Kathleen Kobashi, MD (Virginia Mason Medical Center, Seattle, Washington, USA); Jialin Mao, MD, MS (Weill Cornell Medical College, New York, New York, USA); Jeffery Milsom, MD (Weill Cornell Medical College, New York, New York, USA), Evan Myers, MD (Duke University Medical Center, Durham, North Carolina, USA); Sharon-Lise Normand, PhD (Harvard University School of Medicine), W Benjamin Nowell, PhD (CreakyJoints, Global Healthy Living Foundation, Upper Nyack, New York, USA); Hiroshi Ohtsu, MS (National Center for Global Health and Medicine, Tokyo, Japan); Andrea Pusic, MD (Brigham and Women’s Hospital, Boston, Massachusetts, USA); Mary Beth Ritchey, PhD (Rutgers University, and MedTechEpi, Philadelphia, Pennsylvania, USA); Sandra Siami, MPH (National Evaluation System for health Technology Coordinating Center MDIC, Arlington, Virginia, USA); Kazuhiro Sase, MD, PhD (Juntendo University, Tokyo, Japan); Adnan Siddiqui, MD, PhD (State University of New York, Buffalo, New York, USA); Vinod H Thourani, MD (Piedmont Healthcare, Atlanta, 
Georgia, USA); Martha Velezis (US FDA); Sateria Venable (Fibroid Foundation, Bethesda, Maryland, USA); and Maarit Venermo, MD (Helsinki University Hospital, Helsinki, Finland).



  • Twitter @Artsytwits, @DartmthSurgHSR, @jimhumd, @jpdrozda

  • Contributors Conception or design of the work: AS, CB, and SA.

    Overall content as guarantor: AS. Data acquisition and analysis: AS, DMD, SA, BC, CB, JLC, and EWP. Interpretation of data for the work, critical revision of the work for important intellectual content, and final approval of the version to be published: AS, DMD, BC, SA, CB, PG, JLC, AWB, EWP, JH, RB, KB, TC, JL, DSL, BKP, CRR, FSR, JT, BF, CV, VD, MS, JEJ, JAB, JD, MEM, SSD, TF, KM, and GP. Drafting of the first version of the work: AS, DMD, SA, BC, and CB.

  • Funding This study was funded by the US Food and Drug Administration (FDA) (grant number U01FD006936; AS, Weill Cornell Medical College). Partially supported by the Assistant Secretary for Planning and Evaluation and Patient-Centered Outcomes Research Trust Fund of the US Department of Health and Human Services (Interagency Agreement #750119PE060048), through the FDA grant.

  • Competing interests None declared.

  • Provenance and peer review Commissioned; externally peer reviewed.

  • Supplemental material This content has been supplied by the author(s). It has not been vetted by BMJ Publishing Group Limited (BMJ) and may not have been peer-reviewed. Any opinions or recommendations discussed are solely those of the author(s) and are not endorsed by BMJ. BMJ disclaims all liability and responsibility arising from any reliance placed on the content. Where the content includes any translated material, BMJ does not warrant the accuracy and reliability of the translations (including but not limited to local regulations, clinical guidelines, terminology, drug names and drug dosages), and is not responsible for any error and/or omissions arising from translation and adaptation or otherwise.