Azacitidine increased median overall survival by 3.8 months vs commonly used AML treatments (10.4 vs 6.5 months; P = .1009).
Azacitidine safety in patients age ≥65 years with AML (>30% blasts) was consistent with its known safety profile in other trials.
This multicenter, randomized, open-label, phase 3 trial evaluated azacitidine efficacy and safety vs conventional care regimens (CCRs) in 488 patients age ≥65 years with newly diagnosed acute myeloid leukemia (AML) with >30% bone marrow blasts. Before randomization, a CCR (standard induction chemotherapy, low-dose ara-c, or supportive care only) was preselected for each patient. Patients then were assigned 1:1 to azacitidine (n = 241) or CCR (n = 247). Patients assigned to CCR received their preselected treatment. Median overall survival (OS) was increased with azacitidine vs CCR: 10.4 months (95% confidence interval [CI], 8.0-12.7 months) vs 6.5 months (95% CI, 5.0-8.6 months), respectively (hazard ratio [HR], 0.85; 95% CI, 0.69-1.03; stratified log-rank P = .1009). One-year survival rates with azacitidine and CCR were 46.5% and 34.2%, respectively (difference, 12.3%; 95% CI, 3.5%-21.0%). A prespecified analysis censoring patients who received AML treatment after discontinuing study drug showed median OS with azacitidine vs CCR was 12.1 months (95% CI, 9.2-14.2 months) vs 6.9 months (95% CI, 5.1-9.6 months; HR, 0.76; 95% CI, 0.60-0.96; stratified log-rank P = .0190). Univariate analysis showed favorable trends for azacitidine compared with CCR across all subgroups defined by baseline demographic and disease features. Adverse events were consistent with the well-established safety profile of azacitidine. Azacitidine may be an important treatment option for this difficult-to-treat AML population. This trial was registered at www.clinicaltrials.gov as #NCT01074047.
Acute myeloid leukemia (AML) is an aggressive malignancy with poor prognosis. In the United States, an estimated 20 830 new cases are anticipated in 2015 and more than 10 000 people will die of the disease.1 Older patients, who are at greatest risk of developing AML, have especially poor survival2 because of a variety of host- and disease-related adverse prognostic risk factors, such as history of myelodysplastic syndromes (MDSs), unfavorable karyotypes, poor performance status, and comorbidities, which can limit treatment options. As a result, many older patients receive only palliative care.3 Patients age ≥65 years with AML have a median overall survival (OS) of only 2 to 8 months.4-8
There is no universally accepted standard approach to treating AML in older patients. Commonly used therapeutic options include best supportive care (BSC) alone, standard induction chemotherapy (IC), and low-dose ara-c (LDAC). The National Comprehensive Cancer Network (NCCN) guidelines recommend IC for patients with AML age ≥60 years with more favorable prognostic features,9 but many older patients with AML do not meet these criteria. Older patients considered eligible for IC can have relatively high rates of complete remission (CR); however, no clear survival benefit has been established for IC over less intensive treatment options.5,10,11 Those patients considered ineligible for IC because of advanced age or poor performance status are commonly treated with LDAC or BSC alone, which are associated with a median OS of only 5 and 2 months, respectively.3,4,12
NCCN treatment recommendations for older patients with newly diagnosed AML also include the hypomethylating agents azacitidine and decitabine.9 Decitabine was recently approved in Europe for treatment of patients age ≥65 years with newly diagnosed AML (≥20% bone marrow [BM] blasts) who are not considered candidates for standard IC. Azacitidine was shown to prolong OS compared with conventional care regimens (CCRs) in the subset of older patients with 20% to 30% BM blasts in the phase 3 AZA-001 trial.13 Similarly, azacitidine treatment has been associated with an encouraging median OS of approximately 9 to 10 months in patients with AML who participated in the Austrian Azacitidine Registry14,15 or in a French compassionate use program.16 Reported here are results of the international phase 3 AZA-AML-001 study, the first prospective, randomized clinical study to evaluate the efficacy and safety of azacitidine compared with CCR (doctor’s choice of BSC only, LDAC, or standard IC) in patients age ≥65 years with newly diagnosed AML and >30% BM blasts.
Patients and methods
This multicenter, randomized, open-label, parallel-group study conducted in 18 countries was approved by the relevant institutional review boards or independent ethics committees and was conducted according to the Declaration of Helsinki. All patients provided written informed consent. Authors had access to all study data, and analyses were performed by Celgene Corporation, Summit, NJ.
Eligible patients were age ≥65 years with newly diagnosed, histologically confirmed de novo or secondary AML with >30% BM blasts who were not considered eligible for hematopoietic stem cell transplantation, with intermediate- or poor-risk cytogenetics (NCCN 2009 criteria17), Eastern Cooperative Oncology Group performance status (ECOG PS) score ≤2, and white blood cell count ≤15 × 109/L. Exclusion criteria included acute promyelocytic leukemia with t(15;17)(q22;q12) and AML with inv(16)(p13.1q22) or t(16;16)(p13.1;q22), t(8;21)(q22;q22), or t(9;22)(q34;q11.2); AML arising from previous hematologic disorders other than MDS (eg, myeloproliferative neoplasms); other malignancies; or uncontrolled systemic infection. Patients could not have received prior decitabine, azacitidine, or cytarabine treatment; prior AML therapy (except hydroxyurea, which was allowed up to 2 weeks before the screening hematology sample was taken); or any experimental drug within 4 weeks of starting study treatment.
Study design and treatment
Before randomization, investigators determined which protocol-designated CCR (BSC, LDAC, or IC) was most appropriate for each patient on the basis of age, ECOG PS, comorbidities, and regional guidelines and/or institutional practice. A central, stratified, and permuted block randomization method and interactive voice response system were used to randomly assign patients 1:1 to receive azacitidine or CCR. Randomization was stratified by preselected CCR (BSC, LDAC, or IC), ECOG PS (0-1 or 2), and cytogenetic risk (intermediate or poor). Patients assigned to CCR received their preselected treatment.
Azacitidine 75 mg/m2 per day was administered subcutaneously for 7 consecutive days per 28-day treatment cycle for at least 6 cycles. CCRs were as follows: BSC only (blood product transfusions and antibiotics, with granulocyte colony-stimulating factor for neutropenic infection); subcutaneous LDAC (20 mg twice per day for 10 days per 28-day treatment cycle for at least 4 cycles); or IC (cytarabine 100-200 mg/m2 per day by continuous intravenous infusion for 7 days, plus 3 days of either daunorubicin 45-60 mg/m2 per day or idarubicin 9-12 mg/m2 per day) for 1 cycle, followed by up to 2 consolidation cycles (ie, the same anthracycline regimen as used at induction and the same cytarabine dose used for induction but administered for 3 to 7 days) for those achieving CR or partial response (PR). Reinduction was not allowed. Azacitidine and LDAC dosing could be reduced or delayed as needed until the blood count recovered. All study participants could receive BSC, including transient use of hydroxyurea (hydroxyurea was not allowed within 72 hours before or after azacitidine administration).
Patients were scheduled to visit study sites once per week during the first 2 treatment cycles, then every other week thereafter. BM aspirates, BM biopsies, and peripheral blood smears were collected, and cytogenetic testing was performed, at screening and then on a schedule that varied by treatment group: in the IC group, within 7 days before each treatment cycle; in the azacitidine and LDAC groups, within 7 days before initiation of every second cycle beginning at cycle 3; and in the BSC group, on day 1 of every third cycle beginning at cycle 4 (a BSC cycle was defined as 28 days). Central review of peripheral blood, BM samples, and cytogenetics was conducted by a pathologist and cytogeneticist blinded to treatment. AML classification for each patient was determined by local investigators at study entry.
Efficacy end points
The primary end point was OS, defined as time from randomization to death as a result of any cause. Living patients were censored upon study discontinuation (loss to follow-up, withdrawal of consent) or at the end of poststudy follow-up. Patients who discontinued randomized treatment could receive subsequent AML therapy during study follow-up, with the choice of therapy at the investigator’s discretion. A predefined sensitivity analysis evaluated OS in the azacitidine and CCR arms, censoring patients when they began subsequent AML therapy.
Secondary end points included estimated 1-year survival rate and OS in patient subgroups defined by baseline demographic and disease characteristics: age (<75 years vs ≥75 years), gender (male vs female), race (white vs Asian), geographic region (North America and Australia/Western Europe and Israel/Eastern Europe/Asia), ECOG PS (0-1 vs 2), baseline cytogenetic risk (intermediate vs poor), World Health Organization classification of AML (AML with recurrent genetic abnormalities vs AML with myelodysplasia-related changes [AML-MRC] vs AML with therapy-related myeloid neoplasms vs AML not otherwise specified), white blood cell count (≤5 × 109/L vs >5 × 109/L), BM blasts (≤50% vs >50%), and prior history of MDS.
Hematologic responses of CR, morphologic CR with incomplete blood count recovery (CRi), and PR were defined by International Working Group criteria18 and centrally adjudicated by an independent review committee blinded to treatment assignment. Stable disease was defined as not meeting criteria for any other treatment response (ie, CR, CRi, PR, disease progression, or treatment failure [early death]). Rates of event-free survival (events were progressive disease, relapse after CR or CRi, and death) and relapse-free survival were assessed. Red blood cell (RBC) and platelet transfusion independence (TI), defined as no transfusions for 56 consecutive days on study, was assessed for patients who were transfusion-dependent at baseline (≥1 transfusion in the 56 days before study start) and for patients who were TI at baseline and remained TI.
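The 56-day TI criterion amounts to a search for a sufficiently long gap between transfusions during on-study follow-up. The sketch below is illustrative only; the function name and the representation of transfusions as study-day integers are assumptions, not the trial’s actual statistical programming:

```python
def is_transfusion_independent(transfusion_days, last_study_day, window=56):
    """Return True if the patient had >= `window` consecutive
    transfusion-free days on study (the TI definition in the text).

    transfusion_days: study-day numbers on which a transfusion was given
    last_study_day: final day of on-study observation
    """
    # Boundaries of transfusion-free intervals: study start, each
    # transfusion day, and the end of on-study follow-up.
    points = [0] + sorted(transfusion_days) + [last_study_day]
    gaps = [b - a for a, b in zip(points, points[1:])]
    return max(gaps) >= window
```

For example, a patient transfused on days 10 and 30 who remained on study through day 120 would qualify (90 transfusion-free days), whereas a patient transfused on days 10, 60, and 110 of a 140-day observation period would not.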
Prospective exploratory end points included OS comparisons within the CCR preselection subgroups. Post hoc multivariate efficacy analyses examined treatment effects on OS adjusted for selected baseline demographic and disease covariates and adjusted for the use of subsequent AML therapy. A separate post hoc analysis evaluated the effect of response (CR) on OS in the azacitidine and CCR groups.
Health-related quality of life (HRQoL) was assessed by using the European Organisation for Research and Treatment of Cancer Core Quality of Life Questionnaire (EORTC QLQ-C30), which was to be completed on day 1 of cycle 1 (baseline), every other cycle thereafter, and at the final study visit. HRQoL end points were changes from baseline scores in the Fatigue domain (primary) and in the Physical Functioning, Global Health Status, and Dyspnea domains (secondary) of the QLQ-C30. A 10-point change from baseline was prespecified as the minimally important difference. The HRQoL-evaluable population included patients with a baseline and at least 1 postbaseline assessment. HRQoL was evaluated through cycle 9 because of subsequent small cohort sizes.
The safety population included all patients who received at least 1 dose of study drug and had at least 1 safety assessment thereafter. Treatment-emergent adverse events (TEAEs) were defined as new or worsening AEs from the time of first dose (or from randomization for BSC only) to the end of the safety follow-up period: 28 days after the last dose of azacitidine or LDAC, 70 days after the last dose of IC, or the day of discontinuation and/or study closure for BSC. To account for differences in treatment exposure, the TEAE incidence rate per 100 patient-years (ie, 100 × [number of patients with a given TEAE/total patient-years of treatment exposure]) was evaluated. The number of patients requiring hospitalization as a result of a TEAE, the rate of hospitalization events for a TEAE per patient-year of drug exposure, and the number of days hospitalized for TEAEs per patient-year of drug exposure were also assessed.
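The exposure-adjusted incidence rate defined above is a direct calculation. The helper below is a hypothetical illustration (the event count of 47 is invented; the 174.9 patient-years of azacitidine exposure is the figure reported later in the article):

```python
def teae_rate_per_100_patient_years(n_patients_with_event, total_patient_years):
    """Exposure-adjusted TEAE incidence rate, per the definition in the text:
    100 x (patients with a given TEAE / total patient-years of exposure)."""
    return 100.0 * n_patients_with_event / total_patient_years

# Hypothetical example: 47 patients with a given TEAE over the reported
# 174.9 patient-years of azacitidine exposure -> about 26.9 per 100 patient-years.
rate = teae_rate_per_100_patient_years(47, 174.9)
```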
The planned sample size was 480 patients (240 per treatment arm), calculated on the assumption of a median OS of 10.5 months in the azacitidine arm and 7.5 months in the CCR arm (a 40% improvement), with a 1% dropout rate from each treatment arm. This design required 374 deaths to demonstrate a statistically significant OS difference at a 1-sided significance level of 0.025 with at least 90% power to detect a constant hazard ratio (HR) of 0.71.
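For context, the number of deaths required for a log-rank comparison is commonly approximated with Schoenfeld’s formula. The sketch below is an illustration, not the trial’s actual calculation; the unadjusted formula gives a slightly smaller count than the protocol’s 374 deaths, which presumably reflects additional design considerations:

```python
from math import ceil, log
from statistics import NormalDist

def required_events(hr, alpha=0.025, power=0.90, alloc=0.5):
    """Schoenfeld approximation: number of deaths needed for a
    one-sided log-rank test to detect hazard ratio `hr` with the
    given power, under 1:1 allocation (alloc = 0.5)."""
    z = NormalDist().inv_cdf
    n = (z(1 - alpha) + z(power)) ** 2 / (alloc * (1 - alloc) * log(hr) ** 2)
    return ceil(n)

# HR = 0.71, one-sided alpha = 0.025, 90% power -> roughly 360 deaths.
events = required_events(0.71)
```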
Efficacy analyses were performed for the intention-to-treat population. Survival distribution functions for each treatment arm, including 1-year survival probability, were estimated by the Kaplan-Meier method. A stratified log-rank test, with ECOG PS and cytogenetic risk as stratification factors, compared OS between azacitidine and CCR. A stratified Cox proportional hazards model was used to generate HRs and 95% confidence intervals (CIs). Time-to-event secondary end points were analyzed by using the same methods but without stratification. Secondary analyses were not controlled for multiplicity. Exploratory analyses comparing azacitidine with individual CCRs within treatment preselection groups (IC, LDAC, or BSC) were not powered to detect statistical differences between treatments, and results should be interpreted with caution. All reported log-rank or Fisher’s exact test P values for secondary and exploratory end points are nominal.
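As a reference point for the survival estimates reported below, the Kaplan-Meier product-limit method can be sketched in a few lines. This is illustrative code under the usual conventions (deaths precede censorings at tied times), not the study’s analysis software:

```python
from itertools import groupby

def kaplan_meier(times, events):
    """Kaplan-Meier (product-limit) survival estimate.

    times: follow-up time per patient (eg, months)
    events: 1 = death observed, 0 = censored
    Returns (time, S(t)) steps at each death time.
    """
    data = sorted(zip(times, events))
    at_risk, surv, steps = len(data), 1.0, []
    for t, group in groupby(data, key=lambda x: x[0]):
        group = list(group)
        deaths = sum(e for _, e in group)
        if deaths:
            surv *= 1.0 - deaths / at_risk   # product-limit update
            steps.append((t, surv))
        at_risk -= len(group)                # deaths and censorings leave the risk set
    return steps
```

With 5 patients followed for 1, 2, 2, 3, and 4 months (censored at 2 and 4), the estimate steps down to 0.8, 0.6, and 0.3 at months 1, 2, and 3.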
In post hoc analyses, OS with azacitidine vs CCR was estimated by using Cox proportional hazards models to adjust for variables that were preselected on the basis of their known potential to influence outcomes because of confounding and/or heterogeneity. These Cox models were adjusted for (1) selected baseline demographic and disease covariates known to influence prognosis (eg, cytogenetic risk), (2) covariates for subsequent therapy (time-varying; yes or no) and treatment-by-subsequent-therapy (time-varying) interaction, and (3) all covariates in models (1) and (2). HRs, 95% CIs, and P values were estimated from these Cox models.
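Treating subsequent AML therapy as a time-varying covariate is typically implemented by splitting each patient’s follow-up into counting-process (start, stop) intervals at the date subsequent therapy begins. A minimal sketch, in which the function name and row layout are assumptions for illustration:

```python
def split_at_subsequent_therapy(patient_id, os_time, death, therapy_start=None):
    """Build counting-process rows (id, start, stop, event, on_subsequent_rx)
    for a Cox model with subsequent AML therapy as a time-varying covariate."""
    if therapy_start is None or therapy_start >= os_time:
        # Patient never started subsequent therapy during follow-up.
        return [(patient_id, 0.0, os_time, death, 0)]
    # Covariate switches from 0 to 1 at therapy start; the death event
    # (if any) is attributed only to the final interval.
    return [(patient_id, 0.0, therapy_start, 0, 0),
            (patient_id, therapy_start, os_time, death, 1)]
```

A patient who died at 10.4 months and began subsequent therapy at 6.0 months would contribute two rows: an event-free interval (0, 6.0) with the covariate at 0, and an interval (6.0, 10.4) ending in death with the covariate at 1.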
The study was conducted between October 2010 and January 2014; the database was locked in March 2014. A total of 488 patients were randomly assigned at 98 clinical sites. Before random assignment, LDAC was preselected for most patients (64%), whereas BSC and IC were each preselected for 18% of patients (supplemental Figure 1 available at the Blood Web site). Overall, 241 patients were randomly assigned to receive azacitidine and 247 to receive CCR. In the CCR arm, 45 patients were assigned to BSC, 158 to LDAC, and 44 to IC.
The most common reasons for early discontinuation were AEs (azacitidine, n = 89 [37%]; CCR, n = 66 [28%]) and death (azacitidine, n = 53 [22%]; CCR, n = 58 [24%]) (supplemental Figure 2).
Baseline demographic and disease characteristics were generally balanced between treatment arms (Table 1). More than half of all patients (54%) were age ≥75 years, median BM blasts at baseline were 72%, and 35% of patients had poor-risk cytogenetics. Patients received a median of 6 (range, 1-28) azacitidine treatment cycles, 2 (range, 1-3) IC cycles, and 4 (range, 1-25) LDAC cycles, and the median exposure to BSC only was 65 (range, 6-535) days. In the azacitidine and LDAC groups, 52.5% and 35.9% of patients, respectively, received 6 or more treatment cycles, and 32.2% and 17.6% received 12 or more treatment cycles. Cumulative patient-years of study drug exposure were 174.9 for azacitidine, 82.9 for LDAC, 14.1 for IC, and 9.6 (ie, time on study) for BSC.
The median duration of follow-up was 24.4 months. By study end, 394 deaths (80.7%) had occurred (azacitidine, n = 193 [80.1%]; CCR, n = 201 [81.4%]). Median OS for patients receiving azacitidine and CCR was 10.4 and 6.5 months, respectively; stratified HR was 0.85 (95% CI, 0.69-1.03; P = .1009) (Figure 1A).
A total of 69 patients (28.6%) in the azacitidine group and 75 patients (30.4%) in the CCR group received subsequent AML therapy after discontinuing randomized study treatment. The most common subsequent therapies received in the azacitidine and CCR groups, either alone or in combination, included a cytarabine-based regimen (16.6% and 11.3%, respectively), azacitidine (4.6% and 13.0%, respectively), and/or decitabine (0.8% for each group). In the prespecified sensitivity analysis, 67 azacitidine-treated patients (the start date of subsequent treatment was not available for 2 azacitidine-treated patients) and 75 CCR-treated patients were censored at the time they began subsequent therapy. Median OS in the azacitidine arm was 12.1 vs 6.9 months in the CCR arm (stratified HR, 0.76; 95% CI, 0.60-0.96; P = .0190) (Figure 1B). Post hoc Cox regression analyses supported results of the sensitivity analysis; when adjusted for use of subsequent AML therapy as a time-dependent variable, azacitidine improved OS compared with CCRs (HR, 0.75; 95% CI, 0.59-0.94; P = .0130) (Table 2). Baseline covariates retained in the post hoc multivariate Cox regression model were cytogenetic risk, ECOG PS, percentage of BM blasts, geographic area, age, investigator preselection of CCR, and World Health Organization AML classification (Table 2). When adjusted for these factors, the HR for OS with azacitidine vs CCR was 0.80 (95% CI, 0.66-0.99; P = .0355). Furthermore, when subsequent AML therapy as a time-dependent variable and baseline covariates were included in the same multivariate model, the HR for azacitidine relative to CCR was 0.69 (95% CI, 0.54-0.88; P = .0027) (Table 2).
One-year survival rates in the azacitidine and CCR groups were 46.5% and 34.2%, respectively, a difference of 12.3% (95% CI, 3.5%-21.0%). In the sensitivity analysis that censored patients who received subsequent AML therapy, 1-year survival proportions were 50.7% in the azacitidine arm and 37.7% in the CCR arm, a difference of 13.0% (95% CI, 3.3%-22.7%).
Univariate OS analyses showed favorable trends for azacitidine compared with CCR across all subgroups (Figure 2). Median OS in patients with poor-risk cytogenetics was 6.4 months in the azacitidine arm and 3.2 months in the CCR arm (HR, 0.68; 95% CI, 0.50-0.94; P = .0185). Median OS in patients with AML-MRC who received azacitidine was 12.7 months and 6.3 months for patients who received CCR (HR, 0.69; 95% CI, 0.48-0.98; P = .0357).
In preplanned exploratory analyses, OS in patients who had been preselected to receive BSC alone but were randomly assigned to azacitidine was improved compared with that for patients who received BSC only (5.8 vs 3.7 months, respectively; P = .0288) (Table 3 and supplemental Figure 3). Patients who were preselected to receive LDAC treatment but were randomly assigned to receive azacitidine showed a 4.8-month increase in median OS compared with those who received LDAC (11.2 vs 6.4 months, respectively; P = .4270). Those preselected to receive IC but who received azacitidine had a median OS similar to that of patients who received IC (13.3 and 12.2 months, respectively). Estimated 1-year survival rates within preselected groups ranged from 30.3% to 55.8% in the azacitidine groups and from 18.6% to 50.9% in the CCR groups.
Overall response (CR + CRi) rates were comparable in the azacitidine (27.8%) and CCR (25.1%) arms (P = .5384) (Table 4). Within the CCR arm, overall response rates were 0% (BSC), 25.9% (LDAC), and 47.7% (IC). In the azacitidine and CCR treatment arms, 29.5% and 23.9% of patients, respectively, had stable disease as their best response during treatment. Higher proportions of patients who were transfusion dependent at baseline in the azacitidine treatment arm attained RBC TI (38.5% vs 27.6% in the CCR arm) or platelet TI (40.6% vs 29.3%). Total numbers of patients in the azacitidine and CCR groups who remained RBC TI or became RBC TI on treatment were 105 (43.6%; 95% CI, 37.2%-50.1%) and 76 (30.8%; 95% CI, 25.1%-36.9%), respectively, and who remained or became platelet TI were 142 (58.9%; 95% CI, 52.4%-65.2%) and 106 (42.9%; 95% CI, 36.7%-49.3%), respectively.
Post hoc analysis to explore the potential influence of CR on OS showed that when patients who attained CR (19.5% in the azacitidine group and 21.9% in the CCR group) were excluded from analysis, median OS in the azacitidine and CCR groups was 6.9 months (95% CI, 5.1-8.9 months) vs 4.2 months (95% CI, 3.2-5.1 months; HR, 0.77; 95% CI, 0.62-0.95; stratified log-rank P = .0170). The estimated 1-year survival rates for patients who did not achieve CR were 33.8% in the azacitidine arm and 20.4% in the CCR arm, a difference of 13.4% (95% CI, 4.5%-22.4%).
The population that was evaluable for HRQoL initially comprised 291 patients (azacitidine, 157; CCR, 134). This patient subset decreased over time in both groups, but at a faster rate in the CCR arm after cycle 3, and there was large variation in QLQ-C30 responses within each treatment group. Change from baseline scores for primary and secondary domains of the QLQ-C30 generally improved over 9 treatment cycles in both arms (supplemental Table 1). No HRQoL detriment was seen with azacitidine or CCR at the group level during treatment. The few changes that met the minimally important difference threshold were Fatigue (cycles 7 and 9) and Global Health Status/QoL (cycle 9) in the CCR group.
The safety population comprised 471 patients (azacitidine, 236; CCR, 235); 5 patients randomly assigned to azacitidine and 7 patients randomly assigned to CCR did not receive study treatment, and 5 patients in the CCR arm had no post-dose safety assessment. Most patients experienced a TEAE during the study (99.2% in the azacitidine arm and 100% in the CCR arm). TEAEs leading to study drug dose reduction occurred in 3.4%, 1.3%, and 4.8% of patients in the azacitidine, LDAC, and IC arms, respectively; TEAEs leading to dose interruption occurred in 49.2%, 39.9%, and 9.5% of patients, respectively. Drug-related TEAEs leading to study discontinuation occurred in 22 patients (9.3%) in the azacitidine arm, 20 patients (13.1%) in the LDAC arm, and 5 patients (11.9%) in the IC arm; those that occurred in >1 patient in the azacitidine and LDAC arms, respectively, included pneumonia (3.0% and 2.0%), febrile neutropenia (1.3% in each arm), pyrexia (0% and 1.3%), and sepsis (0% and 1.3%).
Among the most frequent drug-related TEAEs in the azacitidine, LDAC, and IC groups, respectively, were nausea (27.1%, 22.2%, and 42.9%), neutropenia (19.9%, 22.9%, and 31.0%), and thrombocytopenia (17.4%, 22.2%, and 21.4%) (supplemental Table 2). Accounting for patient-years of treatment exposure, incidence rates of anemia, febrile neutropenia, neutropenia, and thrombocytopenia were substantially lower in the azacitidine arm than in the LDAC and IC arms (supplemental Table 3). Grades 3 and 4 hematologic TEAEs occurred with approximately equal frequency in the azacitidine and LDAC groups (Table 5). In both the azacitidine and LDAC groups, hematologic TEAEs (any grade) decreased in frequency as treatment continued (supplemental Table 4). The most frequent serious TEAEs occurred with similar frequency in the azacitidine, LDAC, and IC arms and included febrile neutropenia (25.0%, 24.8%, and 24.3%, respectively), pneumonia (20.3%, 19.0%, and 14.9%), and pyrexia (10.6%, 10.5%, and 8.9%) (supplemental Table 5). Notably, 30-day mortality rates in the azacitidine and CCR arms were 6.6% and 10.1%, respectively, and 60-day mortality rates were 16.2% and 18.2%.
In the azacitidine and CCR arms, 165 patients (69.9%) and 157 patients (66.8%), respectively, were hospitalized for a TEAE. Rates of hospitalization for TEAEs per patient-year of drug exposure were 1.96 and 2.39 in the azacitidine and CCR arms, respectively (relative risk, 0.82; 95% CI, 0.70-0.96; P = .0083). Time spent in the hospital for TEAEs was 28.5 and 38.3 days per patient-year of drug exposure in the azacitidine and CCR arms, respectively (relative risk, 0.74; 95% CI, 0.71-0.78; P < .0001).
This randomized trial in older patients (median age, 75 years) with newly diagnosed AML compared subcutaneous azacitidine with commonly used treatments for AML (LDAC, intensive chemotherapy, or BSC alone) and showed that azacitidine was associated with a clinically meaningful improvement in median OS (10.4 vs 6.5 months) and 1-year survival (46.5% vs 34.2%) vs CCR, although the primary end point was not met (stratified HR, 0.85; 95% CI, 0.69-1.03; P = .1009), a result influenced by convergence of the Kaplan-Meier curves at approximately 22 months. Such convergence is not unexpected in a disease with no cure and a patient population with poor OS.
Azacitidine was generally well tolerated; more than half of azacitidine-treated patients received 6 or more treatment cycles, and one-third received 12 or more cycles. TEAEs were similar in type and frequency to those reported for azacitidine in patients with MDS, and similar reductions in hematologic toxicity were observed as azacitidine treatment continued.19-21 Rates and days of hospitalization for TEAEs by study drug exposure were lower in the azacitidine arm compared with CCR. During treatment, azacitidine and CCR were associated with general improvement in HRQoL, with improvement exceeding the threshold for a minimally important difference with CCR at cycles 7 and 9 for Fatigue and at cycle 9 for Global Health Status. Importantly, there was no meaningful HRQoL deterioration in the primary or secondary QLQ-C30 domains during treatment in either arm. Interpreting HRQoL outcomes is challenging because the population evaluable for HRQoL was smaller than the intention-to-treat population. Consequently, there is a risk that the protection of randomization was lost as the evaluable groups became smaller in number, which could introduce bias to the results.
Because subsequent therapy is a known confounding factor in survival studies,22 a sensitivity analysis censoring patients who received subsequent antileukemic treatment was prospectively included in the study design. In this analysis, median OS with azacitidine was extended by more than 5 months vs CCR (12.1 vs 6.9 months; stratified HR, 0.76; 95% CI, 0.60-0.96; P = .0190). This finding is supported by results of the post hoc Cox model that controlled for use of subsequent treatment, showing a 25% reduced risk of death in the azacitidine arm. In the multivariate Cox analysis adjusting for baseline disease and demographic characteristics as well as subsequent treatment, azacitidine reduced risk of death by 31% vs CCR (HR, 0.69; 95% CI, 0.54-0.88; P = .0027).
Univariate subgroup analyses showed that the overall HR for OS was within the 95% CIs for the HRs for each subgroup comparison. Although univariate analysis suggested improved OS in some azacitidine-treated demographic cohorts (eg, females), these characteristics dropped out of the multivariate Cox models, which retained only the most significant influences on OS as covariates, including cytogenetic risk, ECOG PS, and BM blast count. Remarkably, azacitidine was superior to CCR in subgroups with biologically poor risk factors. IC and LDAC provide no OS benefit in older patients with AML and poor cytogenetics5,10,12,23; median OS with these treatments in this population was approximately 2 to 3 months.12 In patients with poor-risk cytogenetics in this study, median OS with azacitidine (6.4 months) was twofold higher than that of patients who received CCR (3.2 months) and was nominally significant vs CCR in both univariate and multivariate analyses. As the only AML treatment yet shown to improve OS in older patients with poor-risk cytogenetics, azacitidine may be the treatment of choice for these patients. Azacitidine was also associated with improved OS in patients with AML-MRC, consistent with findings that azacitidine significantly improves OS in patients with higher-risk MDSs.21
The individual CCR preselection groups likely reflect 3 different patient populations with regard to performance status, frailty, and comorbidities, which play an important part in selection of treatment of older patients with AML. As might be expected, patients preselected to receive IC had the best OS, followed by patients in the LDAC preselection group, and patients preselected to receive BSC only had the worst OS outcomes. Although the study was not powered to demonstrate significant differences for comparisons within preselection groups, such comparisons allow assessment of treatment effects in patients with generally similar prognoses and clinical features because preselection of the preferred CCR occurred before randomization. Patients preselected to receive IC who received azacitidine or IC had comparable OS (13.3 vs 12.2 months, respectively) and 1-year survival (55.8% vs 50.9%). Thus, low-intensity azacitidine treatment may benefit older patients with AML who, although they are eligible for IC, choose to forego intensive therapy. In the LDAC preselection group, azacitidine demonstrated an improvement in median OS of 4.8 months (11.2 vs 6.4 months; P = .4270) and a 1-year survival advantage of 14.5% vs LDAC. Notably, in this study, patients received a median of 4 LDAC treatment cycles, which is double the reported median LDAC exposure in most large clinical trials.4,6,12 A survival improvement with azacitidine was also seen in patients preselected to receive BSC alone (5.8 vs 3.7 months).
Azacitidine has demonstrated an OS benefit in patients with higher-risk MDS in the absence of CR.24 Similarly, in these older patients with AML, a nominally significant increase in OS was seen in azacitidine-treated patients who did not achieve CR (6.9 vs 4.2 months with CCR; P = .0170).
Decitabine was evaluated in a trial of 485 patients age ≥65 years with AML and ≥20% BM blasts randomly assigned to receive either decitabine or physicians’ choice of treatment (LDAC or BSC).4 The primary analysis of that trial demonstrated a nonsignificant survival increase with decitabine (7.7 vs 5.0 months), whereas a later unplanned analysis at ∼3 years, with additional deaths, yielded a nominally significant P value for the unchanged OS outcomes. Although cross-trial comparisons are problematic because of differences in populations, comparator arms, and LDAC regimens, the 10.4-month median OS in the azacitidine arm of this trial is encouraging.
Combination treatment regimens may further improve outcomes for older patients with AML. Results of early trials of azacitidine in combination with lenalidomide,25,26 panobinostat,27 or sorafenib28 as first-line or salvage therapy in older patients with AML are promising, with overall response rates of approximately 30% to 40%. Larger studies are needed to confirm these findings.
In conclusion, results of this study suggest that azacitidine may provide an important additional treatment option for older patients with newly diagnosed AML.
Contribution: H. Dombret, J.F.S., R.M.S., M.D.M., and H. Döhner contributed to study design; H. Dombret, J.F.S., A.B., A.W., D.S., J.H.J., R.K., J.C., A.C.S., A.C., C.R., I.S., T.B.d.C., H.K.A.-A., G.M., J.F., R.N., and H. Döhner enrolled study participants, clinically managed study participants, and collected and reviewed data; H.M., S.S., L.M.L., and C.L.B. analyzed the data and L.M.L. provided statistical support; and all authors participated in manuscript development and revision. The primary author (H. Dombret) is responsible for manuscript content and gave approval to submit the manuscript.
Conflict-of-interest disclosure: H. Dombret was a consultant for, received honoraria from, and was a member of the speakers bureau and advisory committee for Celgene and received honoraria from and was a member of the speakers bureau for Janssen-Cilag; J.F.S. was a consultant for, received honoraria from, and was a member of the speakers bureau for Celgene; A.W. received honoraria from and was a member of the speakers bureau for Celgene; D.S. received speaker’s fees and honoraria for advisory boards and was a consultant for Celgene and Janssen-Cilag; R.K. received honoraria and was a member of the advisory committee for Celgene; J.D.C. received honoraria from Celgene; A.C.S. was a member of the advisory committee for Celgene; A.C. was a consultant and a member of the speakers bureau for Celgene; C.R. was a member of the advisory committee for and received research funding from Celgene, received research funding from Chugai, and was a member of the advisory committee for Sunesis; I.S. was a consultant and received honoraria from Celgene, was a consultant for and received honoraria from Janssen, and received honoraria from Novartis; T.B.d.C. received speaker’s fees and honoraria for advisory boards and was a consultant for Celgene; H.K.A.-A. received honoraria and research funding from Celgene; G.M. was a consultant and member of the speakers bureau for Novartis, was a consultant and member of the speakers bureau for Bristol-Myers Squibb, and was a consultant for Pfizer and Ariad; J.F. was a consultant for Celgene; R.M.S. was a consultant for Agios, AbbVie, Amgen, Celator, Celgene, and Roche; M.D.M. received honoraria from Celgene; H.M. was employed by Celgene; S.S., L.M.L., and C.L.B. were employed by Celgene and have equity ownership; H. Döhner was a consultant for Celgene. The remaining authors declare no competing financial interests.
Correspondence: Hervé Dombret, Hopital Saint Louis, Institut Universitaire d’Hematologie, University Paris Diderot, 1 avenue Claude Vellefaux, 75010 Paris, France.
Editorial assistance was provided by Brian Kaiser and Sheila Truten of Medical Communication Company, Inc., funded by Celgene Corporation.
The online version of this article contains a data supplement.
There is an Inside Blood Commentary on this article in this issue.
The publication costs of this article were defrayed in part by page charge payment. Therefore, and solely to indicate this fact, this article is hereby marked “advertisement” in accordance with 18 USC section 1734.
- Submitted January 13, 2015.
- Accepted May 7, 2015.
- © 2015 by The American Society of Hematology