While U.S. treatment guidelines recommend universal treatment of HIV, the benefits of such a policy may not outweigh the costs in resource-poor areas around the globe. Three researchers from Johns Hopkins University published a paper in Clinical Infectious Diseases arguing that the limited scope and inconsistent availability of antiretrovirals (ARVs), along with diminished laboratory monitoring capacity, raise ethical questions about applying U.S. treatment standards to impoverished countries at this time.

Recommending early treatment for HIV in the United States is intended both to improve the health outlook of people living with the virus and to curb transmission to others. The increasing clarity of these benefits in recent years has been accompanied by a steady decline in the toxicity of newer ARVs. The Hopkins researchers posit that early use of ARVs in resource-poor settings still clearly benefits prevention efforts as well as people with HIV. However, they argue that because the ARVs available in those settings are older, people beginning therapy with high CD4 counts may put themselves at risk of side effects and toxicity. In addition, less effective therapies and inadequate or nonexistent laboratory monitoring may lead people to develop drug resistance, potentially limiting future treatment options.

“Universal treatment should be the long-term goal, both for the benefit of those infected and for its effect on slowing and ultimately stopping the epidemic,” the authors write. “But in the enthusiasm for the benefits of ART, we must not lose sight of the glaring disparities among nations that stand as obstacles to the ethical implementation of ART.”
