Background: Anemia is a common and debilitating complication of chronic kidney disease (CKD), arising primarily from reduced erythropoietin production and iron deficiency. Iron supplementation remains a cornerstone of anemia management in CKD patients. While both oral and intravenous (IV) iron therapies are widely used, their comparative effectiveness remains of clinical interest, particularly with respect to hemoglobin improvement, safety profile, and treatment compliance.
Materials and Methods: A prospective, randomized study was conducted involving 120 adult CKD patients with hemoglobin <10 g/dL and serum ferritin <300 ng/mL. Participants were randomly allocated into two groups (n=60 each). Group A received oral ferrous sulfate (325 mg twice daily) for 12 weeks, while Group B received IV ferric carboxymaltose (1000 mg total dose divided over two sessions). Baseline and post-treatment values for hemoglobin, serum ferritin, transferrin saturation (TSAT), and adverse events were recorded and analyzed, with p<0.05 considered statistically significant.
Results: At the end of 12 weeks, Group B showed a significantly greater increase in mean hemoglobin (from 8.1 ± 0.5 to 10.6 ± 0.7 g/dL) than Group A (from 8.2 ± 0.6 to 9.3 ± 0.6 g/dL), p<0.001. Serum ferritin rose markedly in the IV group (from 120 ± 30 to 340 ± 50 ng/mL) compared with the oral group (from 115 ± 28 to 180 ± 40 ng/mL), p<0.01. TSAT improvement was also significantly greater in Group B (from 18% to 32%) than in Group A (from 17% to 24%), p=0.002. Gastrointestinal side effects were more frequent in Group A (26%), whereas Group B reported only infusion-related reactions (10%).
Conclusion: IV iron supplementation was more effective than oral iron in improving hemoglobin levels and iron stores among CKD patients, with better tolerability and fewer treatment interruptions. These findings support the preferential use of IV iron, particularly in moderate to severe anemia or when rapid correction is warranted.
Anemia is a prevalent and significant complication in patients with chronic kidney disease (CKD), affecting nearly 90% of those in advanced stages [1]. It contributes to fatigue, reduced quality of life, increased hospitalization, and worsened cardiovascular outcomes [2]. The pathogenesis of anemia in CKD is multifactorial, primarily driven by decreased erythropoietin synthesis by the diseased kidneys, iron deficiency, and chronic inflammation [3,4]. Iron deficiency, in particular, may arise from poor dietary intake, impaired gastrointestinal absorption, ongoing blood loss, and increased iron utilization with erythropoiesis-stimulating agent (ESA) therapy [5].
Iron supplementation plays a crucial role in the correction of anemia in CKD, both in pre-dialysis and dialysis patients [6]. Oral iron therapy, especially ferrous sulfate, is commonly used due to its low cost and ease of administration. However, its use is often limited by gastrointestinal side effects, poor absorption in uremic conditions, and inadequate response in the presence of inflammation [7,8]. Intravenous (IV) iron formulations, including ferric carboxymaltose and iron sucrose, have shown superior efficacy in replenishing iron stores and improving hemoglobin levels, particularly in patients who are unresponsive or intolerant to oral therapy [9,10].
Clinical guidelines such as those from KDIGO (Kidney Disease: Improving Global Outcomes) recommend IV iron as the preferred route in patients with stage 5 CKD or when rapid iron repletion is needed [11]. Despite these recommendations, concerns about potential adverse effects of IV iron, such as hypersensitivity reactions and oxidative stress, have led to variability in clinical practice [12]. Moreover, the comparative effectiveness of oral versus IV iron supplementation in non-dialysis CKD patients remains a subject of debate, particularly in resource-limited settings where oral iron remains the mainstay of treatment [13,14].
Several studies have attempted to compare the two approaches, but differences in study populations, iron formulations, dosing protocols, and outcome measures limit the generalizability of their findings [15]. Therefore, there is a continuing need for well-structured comparative studies to guide optimal iron supplementation strategies in CKD-related anemia. This study was undertaken to evaluate and compare the efficacy and safety of oral versus intravenous iron therapy in the management of anemia among CKD patients.
Study Design and Setting: This prospective, randomized, open-label study was conducted over a 12-week period at a tertiary care nephrology centre.
Study Population: A total of 120 adult patients (aged 18–70 years) diagnosed with chronic kidney disease (stages 3–5), not on dialysis, and presenting with anemia (hemoglobin <10 g/dL and serum ferritin <300 ng/mL), were enrolled. Patients were excluded if they had active infection, recent blood transfusion, malignancy, hypersensitivity to iron preparations, or were already receiving erythropoiesis-stimulating agents or other iron therapy.
Randomization and Intervention: Participants were randomly assigned into two equal groups (n = 60 per group) using a computer-generated randomization list. Group A received oral ferrous sulfate 325 mg twice daily for 12 weeks, while Group B received intravenous ferric carboxymaltose 1000 mg total dose, divided over two infusion sessions.
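The study describes the allocation sequence only as computer-generated. Purely as an illustration, the short Python sketch below shows one common way such a 1:1 sequence can be produced (permuted blocks of four); the block size, seed, and group labels are assumptions for the example, not details reported in the study.

```python
# Illustrative sketch only: one way a computer-generated randomization list
# could be produced (permuted blocks of 4). The actual list used in the
# study is not described beyond being computer-generated.
import random

def permuted_block_list(n_per_group=60, block_size=4, seed=2024):
    """Return a 1:1 two-arm allocation sequence using permuted blocks."""
    rng = random.Random(seed)
    assignments = []
    n_blocks = (2 * n_per_group) // block_size
    for _ in range(n_blocks):
        block = ["A (oral iron)"] * (block_size // 2) + ["B (IV iron)"] * (block_size // 2)
        rng.shuffle(block)           # shuffle within each block to keep arms balanced
        assignments.extend(block)
    return assignments

allocation = permuted_block_list()
print(allocation[:8])  # first eight allocations
```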
Baseline and Follow-up Investigations: At baseline, complete blood count, serum ferritin, transferrin saturation (TSAT), serum iron, and renal function tests were performed. Follow-up assessments were repeated at the end of the 12-week study period. Any adverse events during the treatment phase were documented and categorized based on severity.
Outcome Measures: The primary outcome was the change in hemoglobin concentration from baseline to 12 weeks. Secondary outcomes included changes in serum ferritin and TSAT, as well as the frequency and type of adverse effects in each group.
Statistical Analysis: Data were entered and analyzed using SPSS version 25. Descriptive statistics were used to summarize demographic data. Continuous variables were expressed as mean ± standard deviation and compared using paired and unpaired t-tests. Categorical variables were analyzed using the Chi-square test. A p-value of less than 0.05 was considered statistically significant.
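The analysis was performed in SPSS; for readers who prefer open-source tooling, the Python sketch below illustrates the same kinds of comparisons (paired and unpaired t-tests, Chi-square) using scipy.stats. The per-patient arrays and 2x2 counts are placeholders for illustration, not study data.

```python
# Minimal sketch of the reported comparisons using scipy.stats rather than SPSS.
# All arrays below are hypothetical placeholders, not the study's raw data.
import numpy as np
from scipy import stats

# Hypothetical per-patient hemoglobin values (g/dL) for one group
hb_baseline = np.array([8.1, 7.9, 8.4, 8.0, 8.3])
hb_week12 = np.array([10.5, 10.2, 10.9, 10.4, 10.8])

# Within-group change from baseline: paired t-test
t_paired, p_paired = stats.ttest_rel(hb_week12, hb_baseline)

# Between-group comparison of 12-week values: unpaired t-test
hb_week12_other = np.array([9.2, 9.5, 9.1, 9.4, 9.3])
t_unpaired, p_unpaired = stats.ttest_ind(hb_week12, hb_week12_other)

# Categorical outcome (patients with vs. without any adverse event): Chi-square
# test on a 2x2 table; counts are assumptions loosely based on Table 3.
table = np.array([[15, 45],   # oral iron group: events, no events
                  [4, 56]])   # IV iron group: events, no events
chi2, p_chi2, dof, expected = stats.chi2_contingency(table)

print(p_paired, p_unpaired, p_chi2)
```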
A total of 120 patients were included in the study, with 60 patients in each group. The demographic and baseline clinical characteristics were comparable between the two groups (Table 1). The mean age of the participants was 54.6 ± 8.7 years in the oral iron group and 55.2 ± 9.1 years in the intravenous iron group. Both groups had a similar distribution of male and female patients.
After 12 weeks of treatment, the intravenous iron group showed significantly greater improvement in hematological and iron parameters compared to the oral iron group (Table 2). The mean hemoglobin level increased from 8.1 ± 0.5 g/dL to 10.6 ± 0.7 g/dL in Group B, whereas Group A showed an increase from 8.2 ± 0.6 g/dL to 9.3 ± 0.6 g/dL (p < 0.001). Similarly, serum ferritin levels rose markedly in the IV group (from 120 ± 30 ng/mL to 340 ± 50 ng/mL) as opposed to the oral group (from 115 ± 28 ng/mL to 180 ± 40 ng/mL; p < 0.01). Transferrin saturation improved significantly in Group B (from 18% to 32%) compared to Group A (from 17% to 24%; p = 0.002).
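As a quick consistency check (not part of the study's analysis), the reported between-group hemoglobin difference at 12 weeks can be reproduced from the summary statistics in Table 2 alone. The sketch below assumes an unpaired (Welch's) t-test on those means and standard deviations.

```python
# Reproducing the between-group hemoglobin comparison at 12 weeks from the
# summary statistics in Table 2 (assumption: an unpaired Welch's t-test).
from scipy import stats

t_stat, p_value = stats.ttest_ind_from_stats(
    mean1=9.3, std1=0.6, nobs1=60,    # Group A (oral iron), week 12
    mean2=10.6, std2=0.7, nobs2=60,   # Group B (IV iron), week 12
    equal_var=False,                   # Welch's correction for unequal variances
)
print(f"t = {t_stat:.2f}, p = {p_value:.2e}")  # p falls far below 0.001, consistent with the report
```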
Adverse events were reported more frequently in the oral iron group (Table 3), mainly gastrointestinal complaints such as nausea (15%), constipation (7%), and abdominal discomfort (4%). In contrast, the IV group reported only mild infusion-related reactions (headache and dizziness in 5%, transient rash in 2%).
Table 1: Baseline Demographic and Clinical Characteristics

| Parameter | Oral Iron Group (n=60) | IV Iron Group (n=60) |
|---|---|---|
| Age (years) | 54.6 ± 8.7 | 55.2 ± 9.1 |
| Male:Female ratio | 35:25 | 33:27 |
| Hemoglobin (g/dL) | 8.2 ± 0.6 | 8.1 ± 0.5 |
| Serum Ferritin (ng/mL) | 115 ± 28 | 120 ± 30 |
| TSAT (%) | 17 | 18 |
Table 2: Comparison of Hematological Parameters after 12 Weeks

| Parameter | Oral Iron Group (n=60) | IV Iron Group (n=60) | p-value |
|---|---|---|---|
| Hemoglobin (g/dL) | 9.3 ± 0.6 | 10.6 ± 0.7 | <0.001 |
| Serum Ferritin (ng/mL) | 180 ± 40 | 340 ± 50 | <0.01 |
| TSAT (%) | 24 | 32 | 0.002 |
Table 3: Frequency of Adverse Events Reported During Treatment

| Adverse Event | Oral Iron Group (n=60) | IV Iron Group (n=60) |
|---|---|---|
| Nausea | 9 (15%) | 0 (0%) |
| Constipation | 4 (7%) | 0 (0%) |
| Abdominal Discomfort | 2 (4%) | 0 (0%) |
| Headache and Dizziness | 0 (0%) | 3 (5%) |
| Rash | 0 (0%) | 1 (2%) |
As seen in Table 2, intravenous iron therapy led to significantly greater improvements in hemoglobin and iron stores. Table 3 further shows fewer and milder adverse effects in the IV group, supporting its better tolerability and safety profile.
The current study aimed to evaluate and compare the therapeutic efficacy and safety of oral versus intravenous iron supplementation in anemic patients with chronic kidney disease (CKD). Our findings demonstrated a significantly greater improvement in hemoglobin concentration, serum ferritin levels, and transferrin saturation in the group receiving intravenous ferric carboxymaltose compared to those receiving oral ferrous sulfate. These results are consistent with existing literature and support the superiority of intravenous iron in managing anemia among CKD patients.
Anemia in CKD is multifactorial, with iron deficiency and decreased erythropoietin production being key contributors [1]. Oral iron therapy, while accessible and inexpensive, is often hampered by poor gastrointestinal absorption, limited bioavailability in the context of inflammation, and frequent gastrointestinal side effects [2,3]. In contrast, intravenous iron bypasses the gastrointestinal tract, delivering higher bioavailable iron and allowing rapid repletion of iron stores [4,5].
Our findings align with those of the FIND-CKD trial, which reported significantly higher increases in hemoglobin and ferritin levels in patients treated with IV ferric carboxymaltose than those on oral iron [6]. Another randomized controlled trial by Macdougall et al. similarly showed that IV iron resulted in faster and more effective correction of anemia in CKD patients [7]. Furthermore, we observed a significant rise in transferrin saturation in the IV group, indicating improved functional iron availability for erythropoiesis—a trend supported by previous reports [8,9].
The tolerability profile also favoured intravenous iron. Patients in the oral iron group reported a higher incidence of nausea, constipation, and abdominal discomfort, consistent with previous reports highlighting poor adherence to, and discontinuation of, oral iron because of gastrointestinal side effects [10]. In contrast, adverse reactions to IV iron were mild and infrequent, in keeping with reports that contemporary formulations such as ferric carboxymaltose have an improved safety profile [11,12].
Clinical practice guidelines, including those from KDIGO and the National Kidney Foundation, now recommend intravenous iron as the preferred method in patients with advanced CKD or those unresponsive to oral therapy [13,14]. However, in lower-resource settings, the cost and logistics of IV iron may still limit its use. Despite this, the long-term cost-effectiveness of IV iron—by reducing the need for ESA therapy and hospital admissions—has been recognized in several health economic evaluations [15].
Nevertheless, some limitations should be acknowledged. The study was conducted over a 12-week period, and longer-term outcomes such as sustained hemoglobin levels, ESA requirement, or cardiovascular effects were not assessed. Additionally, patients with active infections or on dialysis were excluded, limiting the generalizability to the broader CKD population.
In conclusion, our study reinforces the growing evidence that intravenous iron therapy offers superior hematologic outcomes and better tolerability than oral iron in managing CKD-associated anemia. Tailoring treatment based on individual patient profiles, availability, and economic considerations is essential for optimizing anemia management in CKD.