On 18 April 2016, leaders from IMI GetReal met at the Takeda offices in London for a live broadcast: “Introduction to the concept of drivers of effectiveness”. The broadcast introduced the concept of drivers of effectiveness and provided an overview of the methods that can be used to identify them early during the drug development plan. A particular focus was placed on predictive modelling techniques. The live broadcast allowed for participation from audience members on site, as well as those watching online, who were invited to submit questions via email and Twitter.

The 18 April broadcast featured input from GetReal’s Work Package 2 participants: Chris Chinn, Head of Real World Investigations, Sanofi; Lucien Abenhaim, MD, PhD & Chairman, Laser Analytica; Clementine Nordon, Pharmacoepidemiologist, Laser Analytica; Helene Karcher, Global Head, Real-World Modelling, Laser Analytica; and Billy Amzal, Global Scientific Vice President, Laser Analytica.

Chris Chinn kicked off the discussion by looking back to GetReal’s first inspiration. At the start, GetReal’s Work Package 2 was examining the gap between the efficacy data generated by clinical trials and the effectiveness of a medicine that emerges over time as it is put into practice. By addressing this “efficacy-effectiveness gap”, the needs of decision makers looking at new medicines can be better met and, in turn, so can the needs of patients who depend on these new medicines.

To investigate how and when effectiveness research can be incorporated into R&D plans, GetReal has been working on two specific aspects: firstly, identifying or developing study designs and analytical tools for pragmatic/effectiveness estimation before launch, and developing the associated standards; secondly, developing a decision-making framework to facilitate assessment and to aid in the design of alternative development strategies that generate evidence of comparative or relative effectiveness.

Towards this end, GetReal has been working on a decision-making “Navigator” framework that aims to:

  • Deepen understanding of what drivers of effectiveness are;
  • Explore possible designs that can be used to address these drivers of effectiveness; and
  • Create a tool that takes users through the above two points in a structured way, providing a framework both for those working in R&D and for those assessing the evidence.

Next, Lucien Abenhaim provided greater detail on which drivers of effectiveness (DoEs) to consider. To develop the innovative study designs and analytical tools needed to better assess the effectiveness of drugs prior to their market authorisation, it is first necessary to better understand what effectiveness is.

Effectiveness can be seen as the impact of drug efficacy when all “interactions” (in the broad sense, e.g. modifiers of effect) are at play. These interaction factors are to be identified at the population level (e.g., age or comorbidities) or in relation to actual drug use (e.g., dose or duration of use).

The purpose of effectiveness research must thus be to assess all possible “interactions” and determine:

  • Which ones are universal;
  • How they are distributed locally;
  • What the magnitude of their impact is; and
  • What their mechanism of action is.

DoEs can thus be seen as “effect modifiers” that account for the difference between the efficacy effect estimate and the effectiveness effect estimate (the “efficacy-effectiveness gap”).
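
To make the idea concrete, here is a minimal, purely illustrative sketch (not drawn from the broadcast material): a driver of effectiveness is expressed as a treatment-by-covariate interaction in a simple outcome model, and an efficacy-effectiveness gap appears when a trial-like subgroup differs from the broader population. All variable names and numbers are invented.

```python
# Illustrative sketch only: a driver of effectiveness modelled as an effect
# modifier (treatment-by-covariate interaction). All names and numbers are
# invented for demonstration; nothing here comes from the GetReal analyses.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(0)
n = 5000

age = rng.uniform(20, 90, n)        # hypothetical driver of effectiveness
treated = rng.integers(0, 2, n)     # 1 = new drug, 0 = comparator

# True data-generating model: the treatment benefit shrinks with age, so age
# "drives" the difference between trial efficacy and real-world effectiveness.
benefit = 2.0 - 0.02 * (age - 50)
outcome = 10 + 0.05 * age + treated * benefit + rng.normal(0, 1, n)

df = pd.DataFrame({"age": age, "treated": treated, "outcome": outcome})

# The treatment-by-age interaction term quantifies the effect modification.
fit = smf.ols("outcome ~ treated * age", data=df).fit()
print(fit.params[["treated", "treated:age"]])

# Efficacy-effectiveness gap in miniature: the average benefit in a trial-like
# subgroup (age <= 65) versus the whole, more heterogeneous population.
print("mean benefit, age <= 65:", round(benefit[age <= 65].mean(), 2))
print("mean benefit, all ages :", round(benefit.mean(), 2))
```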

Clementine Nordon took the discussion forward by addressing two key questions: first, can DoEs be found in the literature, and second, can they be identified from existing observational data before launch? Nordon discussed three possible methods for identifying drivers of effectiveness:

  1. Literature-based approach;
  2. Expert-based approach;
  3. Data-based approach.

Discussing the literature-based approach, Nordon pointed to two successful examples, in Hodgkin’s lymphoma and in schizophrenia. In the case of Hodgkin’s lymphoma, for instance, it was possible to identify a patient’s age as a factor that drives effectiveness: since patients older than 65 or 75 are excluded from trials, this may lead to an efficacy-effectiveness gap. To bring in clinical input, a supplementary expert-based approach was also explored, and interviews with experts to identify drivers of effectiveness proved useful.

Ultimately, data analysis to identify drivers of effectiveness offered a useful and systematic approach. An example was given of patient-level data analysis on two observational datasets in schizophrenia; in the resulting analysis, disease duration was consistently evidenced as a driver of effectiveness, a tangible result reached through systematic means.
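
As an indication of what such a data-based screen might look like in practice, the hypothetical sketch below tests each candidate covariate in turn as a treatment effect modifier; the dataset, column names and one-at-a-time screening logic are assumptions made for illustration, not the analysis actually performed on the two observational datasets.

```python
# Hypothetical sketch of a data-based screen for drivers of effectiveness:
# each candidate covariate is tested in turn as a treatment effect modifier.
# The dataset, column names and one-at-a-time screening logic are assumptions
# for illustration, not the analysis actually run on the schizophrenia data.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(1)
n = 3000
obs = pd.DataFrame({
    "treated": rng.integers(0, 2, n),
    "disease_duration": rng.exponential(5, n),   # candidate driver
    "age": rng.uniform(18, 80, n),
    "comorbidity": rng.integers(0, 2, n),
})
# Simulated outcome: only disease_duration modifies the treatment effect here.
obs["outcome"] = (
    50
    + obs["treated"] * (5 - 0.4 * obs["disease_duration"])
    + rng.normal(0, 3, n)
)

results = []
for cov in ["disease_duration", "age", "comorbidity"]:
    fit = smf.ols(f"outcome ~ treated * {cov}", data=obs).fit()
    results.append({
        "covariate": cov,
        "interaction": fit.params[f"treated:{cov}"],
        "p_value": fit.pvalues[f"treated:{cov}"],
    })

print(pd.DataFrame(results).sort_values("p_value"))
```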

Helene Karcher then addressed the question of how to actually take DoEs into account when designing trials. Laying the groundwork, she noted that there are two ways to improve learning about effectiveness early in clinical development:

  1. More pragmatic design, i.e., any aspect of study design: population, type of randomisation, blinding, monitoring, etc.
  2. Better “analysis tools”, i.e., any aspect of data analysis: statistical or model-based analyses, predictive models, etc.

While many methodological papers were identified that recommend how to make trial features more pragmatic and how to adapt analyses, this has not translated into many actual Phase 2-3 trials with pragmatic elements, owing to scientific and operational hurdles. Such hurdles include operational difficulties in recruiting certain populations and uncertainty about how regulatory bodies will react.

Given such hurdles, the use of “enriched RCTs” combined with modelling is a logical approach to predicting drug effectiveness. Returning to the earlier schizophrenia example, the following approach was outlined as a way of using real-world data to optimise clinical trials (a simplified sketch follows the list):

  1. Study patient characteristics and interplay between characteristics and outcome in a real-life schizophrenia population;
  2. Define the subpopulation of “real-life” patients who are actually eligible for a typical pre-authorisation trial (the “reference RCT population”);
  3. Re-include in this “reference RCT population” a minimal subset of patients who would usually be excluded (broaden the eligibility criteria);
  4. Evaluate how “efficient” each re-inclusion is.
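
The simplified sketch below walks through steps 1-4 on simulated data; the registry, the eligibility rules and the “efficiency” measure (bias reduction per re-included patient) are hypothetical choices made for illustration rather than details of the actual GetReal analysis.

```python
# Simplified, hypothetical sketch of steps 1-4. The simulated registry, the
# eligibility rules and the "efficiency" measure (bias reduction per
# re-included patient) are invented and not taken from the GetReal analysis.
import numpy as np
import pandas as pd

rng = np.random.default_rng(2)
n = 10_000

# Step 1: a real-life (registry-like) population whose characteristics
# interact with the treatment effect.
registry = pd.DataFrame({
    "age": rng.uniform(18, 85, n),
    "substance_use": rng.integers(0, 2, n),
})
registry["treatment_effect"] = (
    5.0 - 0.03 * (registry["age"] - 40) - 1.5 * registry["substance_use"]
)
real_life_effect = registry["treatment_effect"].mean()

# Step 2: the subpopulation eligible for a typical pre-authorisation trial.
eligible = (registry["age"] <= 65) & (registry["substance_use"] == 0)
reference_rct = registry[eligible]

# Steps 3-4: re-include a small fraction of a usually-excluded group and
# measure how much closer the trial estimate gets to the real-life effect.
def reinclusion_efficiency(excluded_mask, fraction=0.10):
    excluded = registry[excluded_mask & ~eligible]
    n_added = int(fraction * len(excluded))
    enriched = pd.concat([reference_rct, excluded.sample(n_added, random_state=0)])
    bias_before = abs(reference_rct["treatment_effect"].mean() - real_life_effect)
    bias_after = abs(enriched["treatment_effect"].mean() - real_life_effect)
    return (bias_before - bias_after) / max(n_added, 1)

for label, mask in [("age > 65", registry["age"] > 65),
                    ("substance use", registry["substance_use"] == 1)]:
    print(label, "-> efficiency per re-included patient:",
          round(reinclusion_efficiency(mask), 6))
```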

In terms of methods, a disease registry was used to guide the addition of patient heterogeneity to standard Phase 3 trials in schizophrenia, and the impact of key trial design changes was assessed. In terms of results, the best choice of enrichment factor for predicting real-life effects was driven by the size of the excluded real-life population and by the change in outcome among patients with that factor. Ultimately, enriching a typical Phase 3 trial with selected factors improved predictions of the investigated drug’s real-life effects.

Billy Amzal concluded the presentations by discussing whether we can anticipate what a drug’s effect will be in real life. Bridging-to-effectiveness modelling for clinical development would theoretically require three models of interactions or effect modifications: an actual-use model, an effect model and a disease model. These in turn must be supplemented with information on drug-specific interactions with DoEs and on country-specific distributions of DoEs.

 

The resulting bridging model tool should ultimately make it possible to predict effectiveness in a given system and population.
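
As a rough illustration of how such a bridging tool might fit together, the toy Monte Carlo sketch below combines an actual-use model, an effect model and a disease model with a country-specific distribution of DoEs to project a real-world effect from a trial-like efficacy figure; every number and functional form is an invented placeholder.

```python
# Toy Monte Carlo sketch of a bridging-to-effectiveness model: an actual-use
# model, an effect model and a disease model are combined with a country-
# specific distribution of drivers of effectiveness (DoEs). Every number and
# functional form below is an invented placeholder, not a figure from the
# broadcast or its case studies.
import numpy as np

rng = np.random.default_rng(3)
n = 100_000

def disease_model(n):
    """Country-specific distribution of the DoEs in the target population."""
    return {
        "age": rng.normal(62, 14, n),
        "severity": rng.uniform(0, 1, n),
    }

def actual_use_model(patients):
    """How the drug is actually used (adherence as a proxy for dose/duration)."""
    noise = rng.normal(0, 0.05, len(patients["severity"]))
    return np.clip(0.95 - 0.2 * patients["severity"] + noise, 0, 1)

def effect_model(patients, adherence, trial_efficacy=0.30):
    """Drug-specific interactions with the DoEs, scaled by actual use."""
    modifier = 1 - 0.004 * (patients["age"] - 55) - 0.3 * patients["severity"]
    return trial_efficacy * np.clip(modifier, 0, None) * adherence

patients = disease_model(n)
adherence = actual_use_model(patients)
effects = effect_model(patients, adherence)

print("trial-like efficacy:     0.30")
print("predicted effectiveness:", round(float(effects.mean()), 3))
```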

With this established, Amzal detailed two case studies, one related to an asthma compound and the other to an oncology drug. Based on these case studies, it was determined that drivers of effectiveness can indeed be identified early, before launch, through literature review and patient-level data analyses. Further, these can be taken into account in pre-authorisation randomised trials, in a controlled manner and without compromising the trials’ success.

The full conclusions and details of the case studies and examples described above are available in the broadcast’s recording and the related presentations. Click here to watch the webinar and to download its slides.

A special thanks to the Takeda team for kindly hosting the IMI GetReal broadcast in their London offices.
