Iron deficiency is a common laboratory finding and can be associated with numerous medical conditions, in addition to dietary insufficiency. It represents the most common cause of anaemia worldwide and is acknowledged as a significant cause of morbidity and mortality.1 Accurate investigation of iron status is essential, not only for diagnosis but also for evaluating response to treatment. The gold standard for measuring iron status is assessment of bone marrow iron stores using Perls' stain.2 However, performing such an invasive test in every case of suspected iron deficiency would be excessive and unnecessary, in addition to incurring a high cost per test. As such, iron status has traditionally been measured using biochemical parameters, primarily ferritin and percentage transferrin saturation (%TSAT).3 These tests are easily automated and have a low cost, making them ideal for high-throughput testing. Whilst this strategy has been undeniably valuable in diagnosing and monitoring iron deficiency, neither test is without its limitations: %TSAT is subject to diurnal variation,4 whilst ferritin, an acute phase protein, is raised in infection and inflammation.5,6 Diagnosis of iron deficiency in patients presenting with a microcytic, hypochromic anaemia in the presence of other comorbidities therefore remains challenging. In light of the continuing need for a robust means of measuring iron status, two new haematological parameters have been included in National Institute for Health and Clinical Excellence (NICE) and British Committee for Standards in Haematology (BCSH) guidelines as suitable alternatives to biochemical measurement of body iron.7,8 These are percentage hypochromic mature red cells (%HYPO) and reticulocyte haemoglobin content (CHr).
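For context, %TSAT is conventionally derived as serum iron expressed as a percentage of total iron-binding capacity (TIBC) — a standard laboratory formula, not one specific to this study. A minimal sketch (function name and example values are illustrative):

```python
def percent_tsat(serum_iron_umol_l: float, tibc_umol_l: float) -> float:
    """Percentage transferrin saturation: serum iron as a percentage of
    total iron-binding capacity (TIBC), both in the same units."""
    if tibc_umol_l <= 0:
        raise ValueError("TIBC must be positive")
    return 100.0 * serum_iron_umol_l / tibc_umol_l

# Illustrative values: serum iron 10 umol/L, TIBC 80 umol/L -> 12.5 %TSAT
print(percent_tsat(10, 80))  # 12.5
```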
%HYPO measures the percentage of mature red cells with a cellular haemoglobin content of less than 28 pg, on the principle that iron-restricted erythropoiesis increases the proportion of hypochromic red cells, producing a raised %HYPO result. CHr assesses iron status by measuring the haemoglobin content of peripheral blood reticulocytes; this is reduced when iron is unavailable, so CHr results decrease in iron deficiency. Both parameters are affected by thalassaemia,9 but as this is the only identified comorbidity which influences results, these parameters may fill the need for robust iron status measurement. %HYPO is recommended as the test of choice in both guidelines, but a 6-hour analysis window restricts its use to samples collected in close proximity to testing facilities. Community monitoring of iron deficiency is a strategy used to relieve overbooked hospital outpatient clinics but, unfortunately, peripheral blood samples collected in the community are unlikely to be analysed within this 6-hour window. Consequently, %HYPO is unsuitable for the vast majority of patients requiring outpatient evaluation of iron status. No analysis window is stated for CHr in either guideline. This study had two aims: firstly, to determine whether the analysis window of %HYPO could be prolonged to make the test available to community clinicians, and secondly, to determine the analysis window for CHr. By repeatedly testing samples over a 24-hour period and comparing results to baseline, we determined sample stability and identified the most suitable parameter to implement in a high-throughput medical laboratory for community monitoring of iron status.
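The %HYPO definition above reduces to a simple per-cell threshold rule: count mature red cells whose haemoglobin content falls below 28 pg and express that count as a percentage of all cells measured. A minimal sketch of that calculation (function name and sample values are hypothetical, not analyser output):

```python
def percent_hypo(cell_hb_pg, threshold_pg=28.0):
    """%HYPO: percentage of mature red cells whose cellular haemoglobin
    content (pg) falls below the 28 pg hypochromia threshold."""
    cells = list(cell_hb_pg)
    if not cells:
        raise ValueError("no cells measured")
    hypo = sum(1 for hb in cells if hb < threshold_pg)
    return 100.0 * hypo / len(cells)

# Illustrative per-cell Hb values (pg): two of four fall below 28 pg
print(percent_hypo([30.1, 27.4, 29.0, 25.2]))  # 50.0
```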