Wednesday, 21 August 2019

Serious Cardiovascular Adverse Events Reported with Intravenous Sedatives: A Retrospective Analysis of the MedWatch Adverse Event Reporting System

Abstract

Background

Serious cardiovascular adverse events (SCAEs) associated with intravenous sedatives remain poorly characterized.

Objective

The objective of this study was to compare SCAE incidence, types, and mortality among intravenous benzodiazepines (i.e., diazepam, lorazepam, and midazolam), dexmedetomidine, and propofol in the USA over 8 years, regardless of the clinical setting in which the sedative was administered.

Methods

The Food and Drug Administration’s MedWatch Adverse Event Reporting System was searched between 2004 and 2011 using the Evidex® platform from Advera Health Analytics, Inc. to identify all reports that included one or more of ten different SCAEs (package insert incidence ≥ 1%) and where an intravenous benzodiazepine, dexmedetomidine, or propofol was the primary suspected drug.
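
As a rough illustration of the denominator used in the results below (not the authors' code): spontaneous-report systems such as MedWatch carry no built-in exposure denominator, so a rate per 10^6 days of sedative exposure needs an outside drug-utilization estimate. A minimal Python sketch with hypothetical numbers:

```python
# Minimal sketch: reporting rate per 10^6 exposure-days from a case count
# and an externally sourced exposure-days denominator.
def rate_per_million_days(n_cases: int, exposure_days: float) -> float:
    """Reported events per 10^6 days of sedative exposure."""
    return n_cases / exposure_days * 1_000_000

# Hypothetical inputs: 268 propofol SCAE reports (the count in the abstract)
# against an assumed 38 million exposure-days; the true denominator would
# come from a drug-utilization source, not from MedWatch itself.
print(round(rate_per_million_days(268, 38_000_000), 1))  # ~7.1
```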

Results

Among the 2326 Food and Drug Administration’s MedWatch Adverse Event Reporting System cases reported, 394 (16.9%) were related to a SCAE. The presence of a SCAE (vs. a non-SCAE) was associated with higher mortality (34% vs. 8%, p < 0.001). The percentage of cases with one or more SCAEs, the case mortality rate (%), and the incidence of each SCAE (per 10^6 days of sedative exposure), respectively, were benzodiazepines (14, 26, 13) [diazepam (13, 23, 31); lorazepam (15, 43, 14); midazolam (14, 20, 11)]; dexmedetomidine (40, 15, 13); and propofol (17, 39, 7). Propofol (vs. either a benzodiazepine or dexmedetomidine) was associated with more total SCAEs (268 vs. 126, p < 0.001) but a lower incidence (per 10^6 days of sedative exposure) of SCAEs (7 vs. 13, p = 0.0001) and of cardiac arrest [6.3 (benzodiazepine) vs. 6.7 (dexmedetomidine) vs. 1.4 (propofol), p < 0.0001].

Conclusions

Serious cardiac adverse events account for nearly one-fifth of intravenous sedative Food and Drug Administration’s MedWatch Adverse Event Reporting System reports. These SCAEs appear to be associated with greater mortality than non-cardiac serious adverse events. Serious cardiac events may be more prevalent with either benzodiazepines or dexmedetomidine than propofol.

Post-authorisation Safety Study of Pioglitazone Use and Safety Endpoints of Interest in Denmark After Direct Healthcare Professional Communication

Abstract

Introduction

A Direct Healthcare Professional Communication (DHPC) sent in Denmark on 11 August 2011 provided information on new pioglitazone labelling and guidance on monitoring treatment effectiveness. We describe pioglitazone use in Denmark after the DHPC, estimate the incidence of heart failure (HF), quantify pioglitazone cessation following a diagnosis of bladder cancer (BC) or uninvestigated macroscopic haematuria, and describe glycated haemoglobin (HbA1c) values.

Methods

This was a cohort study. From Danish population-based registries, cohorts of type 2 diabetes mellitus incident or prevalent users of pioglitazone or insulin in 2011–2015 were created. Patient characteristics, treatment patterns, laboratory results (available for a regional subset of the population), and incidence rates of HF and BC were estimated.
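
With fewer than five events, incidence estimates are necessarily imprecise, as the conclusions below note. A small sketch (illustrative counts, not the study's data) of an incidence rate with an exact Poisson 95% confidence interval shows how wide the interval becomes:

```python
from scipy.stats import chi2

def incidence_rate_ci(events: int, person_years: float, alpha: float = 0.05):
    """Incidence rate per 1000 person-years with an exact Poisson CI."""
    lower = chi2.ppf(alpha / 2, 2 * events) / 2 if events > 0 else 0.0
    upper = chi2.ppf(1 - alpha / 2, 2 * (events + 1)) / 2
    scale = 1000 / person_years
    return events * scale, lower * scale, upper * scale

# Hypothetical: 3 heart-failure cases over 250 person-years of follow-up
# (the paper reports only "< 5" cases, so these inputs are illustrative).
rate, lo, hi = incidence_rate_ci(3, 250)
print(f"{rate:.1f} per 1000 PY (95% CI {lo:.1f}-{hi:.1f})")  # 12.0 (2.5-35.1)
```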

Results

There were 80 pioglitazone and 17,699 insulin incident users, and 140 pioglitazone and 13,183 insulin prevalent users. There were no new BC cases among incident pioglitazone users, and < 5 new BC cases among prevalent pioglitazone users. Pioglitazone was rarely the first-line treatment. History of haematuria was documented in < 5 incident and 11 prevalent pioglitazone users. During follow-up, there were < 5 HF cases among 77 incident pioglitazone users and < 5 among 133 prevalent pioglitazone users without a history of HF. Median HbA1c at index date was 7.8% and 8.8% in the incident pioglitazone and insulin cohorts, and 7.5% and 7.6% in the prevalent pioglitazone and insulin cohorts, respectively. During follow-up of up to 4.4 years, 28.8% of incident and 20.7% of prevalent pioglitazone users discontinued pioglitazone.

Conclusions

Numbers of pioglitazone users in Denmark were low and decreased over time. Risks of BC and HF were low, and the risk estimates were imprecise.

Re-evaluating Safety and Effectiveness of Dabigatran Versus Warfarin in a Nationwide Data Environment: A Prevalent New-User Design Study

Abstract

Introduction

The new-user cohort design is widely used to assess the effects of a new drug, such as dabigatran, but inherently excludes some users due to prior use of the comparator drug, for example warfarin. The prevalent new-user design offers a solution that includes all eligible users of the new drug.

Objective

To evaluate the safety and effectiveness of dabigatran versus warfarin in patients with non-valvular atrial fibrillation (NVAF) using a prevalent new-user design.

Methods

Taiwan National Health Insurance and mortality data from 2011 through 2015 were utilized. From an incident NVAF cohort, we identified dabigatran initiators as either incident or prevalent (switchers from warfarin) new users. Time- and prescription-based exposure sets were formed for dabigatran initiators to account for prior warfarin prescriptions. A comparable warfarin user was matched on the time-conditional propensity score to the dabigatran initiator in each set. The matched patients were followed for clinical outcomes, with a Cox proportional hazards model used to estimate hazard ratios (HRs).
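
For readers unfamiliar with the matching-then-survival-analysis workflow described here, the sketch below shows its general shape on synthetic data: a propensity model for treatment, greedy 1:1 nearest-neighbour matching on the score, and a Cox model on the matched cohort. It deliberately omits the study's time- and prescription-based exposure sets and time-conditional scores, which require the full prescription history; scikit-learn and lifelines are assumed, and all variable names are hypothetical.

```python
import numpy as np
import pandas as pd
from sklearn.linear_model import LogisticRegression
from lifelines import CoxPHFitter

rng = np.random.default_rng(0)

# Synthetic stand-in data: 'treated' = dabigatran initiator; age and chads2
# are hypothetical baseline confounders.
n = 2000
df = pd.DataFrame({"age": rng.normal(74, 8, n), "chads2": rng.integers(0, 7, n)})
p_treat = 1 / (1 + np.exp(-(0.02 * (df["age"] - 74) + 0.1 * df["chads2"])))
df["treated"] = rng.binomial(1, p_treat)
df["time"] = rng.exponential(3, n)       # years of follow-up
df["event"] = rng.binomial(1, 0.1, n)    # e.g. major bleeding

# 1) Propensity score for receiving dabigatran given baseline covariates.
ps_model = LogisticRegression().fit(df[["age", "chads2"]], df["treated"])
df["ps"] = ps_model.predict_proba(df[["age", "chads2"]])[:, 1]

# 2) Greedy 1:1 nearest-neighbour match on the score (with replacement, for
#    brevity); the study instead matches within exposure sets on a
#    time-conditional score.
treated = df[df["treated"] == 1].sort_values("ps")
controls = df[df["treated"] == 0].sort_values("ps")
pairs = pd.merge_asof(treated, controls, on="ps", direction="nearest",
                      suffixes=("_t", "_c"))
matched = pd.concat([
    pairs[["time_t", "event_t"]]
        .rename(columns={"time_t": "time", "event_t": "event"}).assign(treated=1),
    pairs[["time_c", "event_c"]]
        .rename(columns={"time_c": "time", "event_c": "event"}).assign(treated=0),
]).reset_index(drop=True)

# 3) Cox proportional hazards model on the matched cohort; exp(coef) is the HR.
cph = CoxPHFitter().fit(matched, duration_col="time", event_col="event")
print(cph.summary[["exp(coef)", "exp(coef) lower 95%", "exp(coef) upper 95%"]])
```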

Results

There were 10,811 dabigatran initiators, including 22% prevalent new users (switchers), who formed the exposure sets and were matched 1:1 to warfarin users. Dabigatran use was associated with lower risks of intracranial hemorrhage (HR 0.51; 95% confidence interval [CI] 0.39, 0.66) and gastrointestinal bleeding (HR 0.81; 95% CI 0.70, 0.92), compared with warfarin use. These effects were similar between the incident and prevalent new users.

Conclusion

Using a design that includes both incident and prevalent new users of dabigatran, the use of dabigatran is associated with lower major bleeding risk than warfarin use among patients with incident NVAF.

From Clinical Trial Efficacy to Real-Life Effectiveness: Why Conventional Metrics do not Work

Abstract

Background

Randomised, double-blind, clinical trial methodology minimises bias in the measurement of treatment efficacy. However, most phase III trials in non-orphan diseases do not include individuals from the population to whom efficacy findings will be applied in the real world. Thus, a translation process must be used to infer effectiveness for these populations. Current conventional translation processes are not formalised and do not have a clear theoretical or practical base. There is a growing need for accurate translation, both for public health considerations and for supporting the shift towards personalised medicine.

Objective

Our objective was to assess the results of translating efficacy data from two simulated clinical trials for two drugs to population effectiveness in three populations, using conventional methods.

Methods

We simulated three populations, two drugs with different efficacies and two trials with different sampling protocols.

Results

With few exceptions, current translation methods do not result in accurate population effectiveness predictions. The reason for this failure is the non-linearity of the translation method. One of the consequences of this inaccuracy is that pharmacoeconomic and postmarketing surveillance studies based on direct use of clinical trial efficacy metrics are flawed.
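
A toy simulation (not the authors' model) makes the non-linearity concrete: even when a drug's relative effect, here a fixed odds ratio, is identical everywhere, the absolute effectiveness measured in a trial sample does not carry over unchanged to a target population with a different baseline-risk mix, because risk is a non-linear function of its determinants.

```python
import numpy as np

rng = np.random.default_rng(1)

def risk(baseline_logit, treated, log_or=-1.0):
    """Event risk under a logistic model with a fixed treatment odds ratio."""
    return 1 / (1 + np.exp(-(baseline_logit + log_or * treated)))

# Trial sample: narrow eligibility, moderate baseline risk.
trial = rng.normal(-1.0, 0.3, 50_000)
# Target population: wider, lower-risk case mix than the trial.
population = rng.normal(-2.0, 1.0, 50_000)

# Absolute risk reduction ("effectiveness") observed in the trial.
arr_trial = np.mean(risk(trial, 0) - risk(trial, 1))

# Naive translation: assume the same absolute reduction applies everywhere.
naive_population_arr = arr_trial

# What the population would actually experience under the same odds ratio.
true_population_arr = np.mean(risk(population, 0) - risk(population, 1))

print(f"trial ARR: {arr_trial:.3f}")
print(f"naively translated ARR: {naive_population_arr:.3f}")
print(f"true population ARR: {true_population_arr:.3f}")
```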

Conclusion

There is a clear need to develop and validate functional and relevant translation approaches for the translation of clinical trial efficacy to the real-world setting.

Patterns of High-Dose and Long-Term Proton Pump Inhibitor Use: A Cross-Sectional Study in Six South Australian Residential Aged Care Services

Abstract

Aim

While proton pump inhibitors (PPIs) are generally considered safe and well tolerated, frail older people who take PPIs long term may be susceptible to adverse events. This study characterized PPI use and determined factors associated with high-dose use among older adults in residential aged care services (RACSs).

Methods

A cross-sectional study of 383 residents of six South Australian RACSs within the same organization was conducted. Clinical, diagnostic, and medication data were collected by study nurses. The proportions of residents who took a PPI for > 8 weeks and without documented indications were calculated. Factors associated with high-dose PPI use compared to standard/low doses were identified using age- and sex-adjusted logistic regression models.
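
As an illustration of how the odds ratios reported below are produced (synthetic data and hypothetical column names, not the study dataset): fit an age- and sex-adjusted logistic model and exponentiate the coefficient of interest.

```python
import numpy as np
import pandas as pd
import statsmodels.api as sm

rng = np.random.default_rng(2)

# Synthetic resident-level data; high_dose and mrci are hypothetical columns.
n = 383
df = pd.DataFrame({
    "age": rng.normal(87, 7, n),
    "male": rng.binomial(1, 0.3, n),
    "mrci": rng.normal(40, 15, n),   # Medication Regimen Complexity Index score
})
logit_p = -2 + 0.02 * (df["mrci"] - 40)
df["high_dose"] = rng.binomial(1, 1 / (1 + np.exp(-logit_p)))

# Age- and sex-adjusted logistic regression for high-dose PPI use.
X = sm.add_constant(df[["mrci", "age", "male"]])
fit = sm.Logit(df["high_dose"], X).fit(disp=0)

# Odds ratio per one-point MRCI increase, with 95% CI.
or_mrci = np.exp(fit.params["mrci"])
ci = np.exp(fit.conf_int().loc["mrci"])
print(f"OR {or_mrci:.2f} (95% CI {ci[0]:.2f}-{ci[1]:.2f}) per MRCI point")
```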

Results

In total, 196 (51%) residents received a PPI, with 45 (23%) prescribed a high dose. Overall, 173 (88%) PPI users had documented clinical indications or received medications that can increase bleeding risk. Three-quarters of PPI users with gastroesophageal reflux disease or dyspepsia had received a PPI for > 8 weeks. High-dose PPI use was associated with increasing medication regimen complexity [odds ratio (OR) 1.02; 95% confidence interval (CI) 1.01–1.04 per one-point increase in Medication Regimen Complexity Index score] and a greater number of medications prescribed for regular use (OR 1.11; 95% CI 1.01–1.21 per additional medication).

Conclusions

Half of all residents received a PPI, of whom the majority had documented clinical indications or received medications that may increase bleeding risk. There remains an opportunity to review the continuing need for treatment and consider “step-down” approaches for high-dose PPI users.

PEARL: A Non-interventional Study of Real-World Alirocumab Use in German Clinical Practice

Abstract

Background

Several lipid guidelines recommend that proprotein convertase subtilisin/kexin type 9 inhibitors should be considered for patients with atherosclerotic cardiovascular disease who are inadequately treated with maximally tolerated lipid-lowering treatment.

Objectives

The PEARL study assessed the efficacy and safety of the proprotein convertase subtilisin/kexin type 9 inhibitor alirocumab in patients with hypercholesterolemia in a real-world setting.

Methods

PEARL was an open, prospective, multicenter, non-interventional study conducted in Germany. Patients (n = 619) for whom treating physicians decided to use alirocumab 75 or 150 mg every 2 weeks according to German guidelines (low-density lipoprotein cholesterol > 1.8/2.6 mmol/L [> 70/100 mg/dL], depending on cardiovascular risk, despite maximally tolerated statin therapy with/without other non-alirocumab lipid-lowering therapy) were enrolled and followed for 24 weeks. Physicians could adjust the alirocumab dose based on their clinical judgment. The primary efficacy endpoint was low-density lipoprotein cholesterol reduction from baseline (prior to alirocumab therapy) to week 24.
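
For reference, the two unit systems quoted above are related by a factor of roughly 38.67 mg/dL per mmol/L for LDL cholesterol. A quick sketch of the conversion and of a crude percent change from baseline; note that the −48.6% reported below is a least-squares (model-adjusted) mean, so it will not equal the crude figure.

```python
LDL_MGDL_PER_MMOLL = 38.67  # conversion factor for LDL cholesterol

def mmoll_to_mgdl(ldl_mmoll: float) -> float:
    return ldl_mmoll * LDL_MGDL_PER_MMOLL

def pct_change(baseline: float, follow_up: float) -> float:
    return (follow_up - baseline) / baseline * 100

# Guideline thresholds from the abstract.
print(round(mmoll_to_mgdl(1.8)))   # ~70 mg/dL
print(round(mmoll_to_mgdl(2.6)))   # ~101 mg/dL (quoted as >100 mg/dL)

# Crude change from the reported means; the paper's -48.6% is a
# least-squares mean, so it differs slightly from this figure.
print(round(pct_change(180.5, 89.8), 1))  # -50.2
```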

Results

Overall, 72.8% of patients reported complete or partial statin intolerance. Mean low-density lipoprotein cholesterol was 4.7 mmol/L (180.5 mg/dL) and 2.3 mmol/L (89.8 mg/dL) at baseline and week 24, respectively. Least-squares mean percentage change from baseline to week 24 in low-density lipoprotein cholesterol was − 48.6%. Initial alirocumab dose was 75 mg in 72.9% of patients and 150 mg in 24.5% of patients; 19.6% of patients received an alirocumab dose increase (75 to 150 mg) and 1.6% of patients received a dose decrease. Adverse events were reported in 10.3% of patients, with myalgia being the most common.

Conclusions

In a real-world setting in Germany, alirocumab was used in patients who had high baseline low-density lipoprotein cholesterol levels with/without statin intolerance. Efficacy and safety were consistent with findings observed in the ODYSSEY Phase III program.

A Pilot Study to Assess the Feasibility of a Web-Based Survey to Examine Patient-Reported Symptoms and Satisfaction in Patients with Ankylosing Spondylitis Receiving Secukinumab

Abstract

Purpose

This real-world study evaluated the feasibility of assessing patient-reported symptom improvement and treatment satisfaction using a web-based survey among patients with ankylosing spondylitis (AS) treated with secukinumab.

Methods

This cross-sectional, web-based survey collected data on demographics, symptoms, treatment history, and treatment satisfaction from US patients with AS who were receiving secukinumab at the time of survey participation. Patients reported AS symptoms experienced before and after secukinumab initiation, time to symptom improvement, and satisfaction with secukinumab treatment.

Results

Of 2755 patients screened, 200 with AS were included in the analysis. The mean (SD) age of patients was 34.4 (10.6) years; 86.5% were biologic-experienced. Most (74.0%) reported overall improvement (“a little,” “moderately,” or “much better”) in AS symptoms since secukinumab initiation compared with before secukinumab initiation; a similar trend was observed for all the individual symptoms analyzed (pain disrupting sleep, fatigue, morning stiffness, pain and stiffness in the lower back or neck, sore areas other than joints, and ankle or heel pain [indicating enthesitis]). In all, 41.9% of patients reported overall symptom improvement within 4 weeks of secukinumab treatment. Most patients expressed overall satisfaction (“very,” “mostly,” or “somewhat satisfied”) with secukinumab regarding symptom improvement (99.0%), speed of symptom improvement (97.0%), frequency and method of administration (96.0% and 91.5%, respectively), ease of use (93.5%), patient support services (97.0%), and side effects, if any (93.0%).

Conclusion

Most patients reported overall symptom improvement and satisfaction with treatment. Our study indicates that patient-reported perspectives may be feasibly collected using a web-based survey to provide insights into treatment experience and satisfaction.

Real World Evidence: Can We Really Expect It to Have Much Influence?

Prevalence of Psychotropic Polypharmacy and Associated Healthcare Resource Utilization during Initial Phase of Care among Adults with Cancer in the USA

Abstract

Background

The use of psychotropic medications is not uncommon among patients with newly diagnosed cancer. However, the impact of psychotropic polypharmacy on healthcare utilization during the initial phase of cancer care is largely unknown.

Methods

We used a claims database to identify adults with incident breast, prostate, lung, and colorectal cancers diagnosed during 2011–12. Psychotropic polypharmacy was defined as concurrent use of two or more psychotropic medication classes for at least 90 days. A multivariable logistic regression was performed to identify significant predictors of psychotropic polypharmacy. Multivariable Poisson and negative binomial regressions were used to assess the associations between psychotropic polypharmacy and healthcare utilization.
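
The exposure definition (two or more psychotropic classes used concurrently for at least 90 days) has to be operationalised from dispensing claims. One possible operationalisation, with hypothetical field names and illustrative records, counts the calendar days covered by at least two distinct classes; whether those 90 days must be consecutive or cumulative is an analytic choice not stated in the abstract.

```python
import pandas as pd

# Hypothetical dispensing records; in the study these come from a claims
# database during the year after the incident cancer diagnosis.
rx = pd.DataFrame({
    "patient_id": [1, 1, 1, 2],
    "drug_class": ["SSRI", "benzodiazepine", "SSRI", "SSRI"],
    "start": pd.to_datetime(["2011-01-01", "2011-02-01", "2011-04-01", "2011-01-01"]),
    "days_supply": [90, 120, 90, 90],
})
rx["end"] = rx["start"] + pd.to_timedelta(rx["days_supply"] - 1, unit="D")

def psychotropic_polypharmacy(disp: pd.DataFrame, min_days: int = 90) -> bool:
    """True if >= min_days (cumulative) are covered by >= 2 distinct classes."""
    covered = []
    for _, row in disp.iterrows():
        days = pd.date_range(row["start"], row["end"], freq="D")
        covered.append(pd.DataFrame({"day": days, "drug_class": row["drug_class"]}))
    daily = pd.concat(covered).drop_duplicates()
    classes_per_day = daily.groupby("day")["drug_class"].nunique()
    return int((classes_per_day >= 2).sum()) >= min_days

flags = rx.groupby("patient_id").apply(psychotropic_polypharmacy)
print(flags)  # patient 1: True, patient 2: False
```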

Results

Among 5604 patients included in the study, 52.6% had breast cancer, 30.6% had prostate cancer, 11.4% had colorectal cancer, and 5.5% had lung cancer. During the year following incident cancer diagnosis, psychotropic polypharmacy was reported in 7.4% of patients, with the highest prevalence among patients with lung cancer (14.4%). Compared with patients without psychotropic polypharmacy during the initial phase of care, patients with newly diagnosed cancer with psychotropic polypharmacy had a 30% higher rate of physician office visits, an 18% higher rate of hospitalization, and a 30% higher rate of outpatient visits. The rate of emergency room visits was similar between the two groups.

Conclusion

Psychotropic polypharmacy during the initial phase of cancer care was associated with significantly increased healthcare resource utilization, and the proportion of patients receiving psychotropic polypharmacy differed by type of cancer.

Impact

Findings emphasize the importance of evidence-based psychotropic prescribing and close surveillance of events causing increased healthcare utilization among patients with cancer receiving psychotropic polypharmacy.

Adverse Events Associated with Risperidone Use in Pediatric Patients: A Retrospective Biobank Study

Abstract

Background

Although risperidone is increasingly used for behavioral indications in children, the associated adverse events (AEs) are not well defined in this population.

Objective

We determined the incidence of and risk factors for AEs among children treated with risperidone at our institution, an academic medical center with inpatient, outpatient, generalist, and specialist pediatric care.

Methods

The study included children aged ≤ 18 years with ≥ 4 weeks of risperidone exposure. Data were obtained using de-identified electronic health records. AEs were defined as any untoward event attributed to risperidone reported by the patient, parent/guardian, or physician or detected following a laboratory investigation. Associations between AEs and clinical variables were determined using univariate and multivariate analyses.
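
A minimal sketch of the univariate step for one binary factor (illustrative cell counts: the 110/261 AE margin matches the abstract, but the split by self-injurious behavior is invented): build the 2 × 2 table and obtain a crude odds ratio and Fisher exact p value. The adjusted odds ratios reported below come from a multivariable logistic model that additionally conditions on the other clinical variables.

```python
import numpy as np
from scipy.stats import fisher_exact

# Hypothetical 2x2 counts (exposure = self-injurious behavior, outcome = any AE).
#                 AE    no AE
table = np.array([[30,   25],     # self-injurious behavior documented
                  [80,  236]])    # not documented

odds_ratio, p_value = fisher_exact(table)
print(f"crude OR {odds_ratio:.1f}, Fisher exact p = {p_value:.4f}")
# Multivariable adjustment (not shown) would yield the aORs quoted below.
```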

Results

The study cohort included 371 individuals (median age 7.8 years [interquartile range 5.9–10.2]; 271 [73.0%] male). The two most common primary diagnoses were attention-deficit/hyperactivity disorder (160 [43.1%]) and autism (102 [27.5%]). The most frequent indications for risperidone were aggression (166 [44.7%]) and behavioral problems (114 [30.7%]). Altogether, 110 (29.6%) individuals had 156 AEs. Weight gain (32 [20.5%]) and extrapyramidal symptoms (23 [14.7%]) were the most common AEs. Aggression, irritability, and self-injurious behavior were positively associated with AEs, and concomitant analgesics and antibiotics were negatively associated. In multivariate analysis, associations remained significant for self-injurious behavior (adjusted odds ratio [aOR] 3.1; 95% confidence interval [CI] 1.7–5.4) and concomitant antibiotics (aOR 0.2; 95% CI 0.1–0.9).

Conclusions

Nearly one in three children treated with risperidone for ≥ 1 month experienced one or more AEs. Particular vigilance is warranted for children with self-injurious behavior.
