In March 1985, the U.S. Food and Drug Administration (FDA) licensed the first HIV antibody test: an enzyme-linked immunosorbent assay, or ELISA, manufactured by Abbott.

For members of high-risk groups, the four years preceding this milestone in the epidemic had become increasingly fraught and perilous. AIDS-related complications were claiming a rapidly escalating number of lives, and many lived in constant fear of a great unknown.

HIV had been identified as the cause of AIDS in 1983. But if someone had no AIDS-defining disease symptoms—checking for Kaposi’s sarcoma skin lesions had become a widespread undertaking for gay men—even the various tests to detect immune disorders still failed to provide clear answers.

This is not to say that the HIV test—which took about a week to provide results and required a blood draw—offered immediate relief to such anxiety, since the antiretroviral revolution that made HIV relatively manageable wouldn’t begin for another decade.

Although the test could reassure those whose results were negative, a positive result was still considered a death sentence. Many preferred not to know.

“Whether or not to get tested was a torturous decision, as most people felt there was nothing to do medically if you knew you were positive,” recalls Laura Pinsky, LCSW, who co-founded the world’s first college campus testing site at Columbia University in 1985, and who still directs the Gay Health Advocacy Project (GHAP), as the program is known. (This reporter was an HIV test counselor at GHAP during college.)

Noting that Bactrim was already being used as prophylaxis against Pneumocystis pneumonia (PCP)—a particularly insidious, AIDS-defining killer—Pinsky says, “Although we at GHAP saw [getting an HIV test] as very much a personal decision, we were attacked for even suggesting that there was some benefit to knowing your HIV status and CD4 cell count and starting Bactrim, or an alternative, early on.”

On the brighter side, by midsummer 1985, U.S. health officials expressed relative confidence that HIV antibody testing was highly effective at screening the virus out of the nation’s supply of transfusion blood. This was a profound accomplishment, considering that an estimated 9,000 people had previously contracted HIV from transfusions.

Additionally, an estimated 5,300 hemophiliacs had contracted HIV from infusions of clotting factor. But changes in the processing of clotting factor had already addressed this issue earlier in the 1980s.

By comparison, fewer than 10 Americans have contracted HIV from a transfusion since the nucleic acid test, which looks for HIV’s genetic material rather than antibodies against the virus, was first used to screen the supply in 1999.

Thanks to the protection provided by this advanced testing technique (not to mention political pressure), in December 2014 the FDA ended its controversial lifetime ban on donating blood for any man who has had sex with a man since 1977. Under the new policy, men can donate blood if they’ve not had sexual contact with another man within the last year.

Throughout the 30 years since the advent of the HIV test, researchers have pushed the capabilities of screening technologies, allowing for earlier detection of the virus after infection as well as greater ease of use.

The first rapid test was approved in 1992, followed two years later by the first oral fluid test. Urine tests, as well as home tests requiring samples mailed to labs, hit the scene in 1996. In 2002, a rapid test using a finger prick instead of a blood draw became available, and a year later it was granted what is known as a Clinical Laboratory Improvement Amendments (CLIA) waiver, allowing for its use outside of testing laboratories and inside venues such as doctor’s offices, community-based organizations and testing vans. The first rapid oral fluid home test arrived in 2012.

Health agencies have increasingly advocated expanded HIV testing during the past decade. In 2006, the Centers for Disease Control and Prevention (CDC) recommended routine HIV screening in U.S. health care settings, which the World Health Organization and the Joint United Nations Programme on HIV/AIDS began pushing for on a global scale in 2007.

In 2013, the United States Preventive Services Task Force gave routine HIV screening for Americans ages 15 to 65 an “A” rating, its highest recommendation level. Under Obamacare, this nod means that Medicaid, Medicare and new private health plans should cover HIV testing at no cost to the consumer.

Testing Chart
This graphic contains information provided by the CDC and illustrates key concepts of its new testing approach in laboratories.

Scientists seeking to improve HIV assays have been up against a formidable challenge because the virus can be difficult to detect when it is most infectious: during the first 90 days or so after infection, a phase also known as acute infection. During this period, viral load reaches a peak it will not hit again until the latter stages of AIDS, when untreated virus once more rises to highly infectious levels.

After a person gets HIV, it takes two to eight weeks before the body produces detectable antibodies, which are disease-fighting molecules. This gap is known as the “window period,” during which false negative results are possible, since the oldest HIV tests only look for antibodies. Getting tested during the window period and failing to retest later can have significant public health consequences.

“Nearly 50 percent of forward transmission of HIV occurs during the earliest phases of infection, missed by most currently used antibody tests,” says Demetre Daskalakis, MD, MPH, assistant health commissioner at the New York City Bureau of HIV/AIDS Prevention.

Research suggests that, after people test positive for HIV, they typically take significant steps to reduce the risk of transmitting the virus. Plus, antiretroviral treatment that fully suppresses the virus appears likely to all but eliminate the chance of transmission to others.

“In this new era of HIV prevention, with more options than ever before, testing remains one of our most important allies,” says Michele Owen, PhD, acting associate director for laboratory diagnostics in the CDC’s Division of HIV/AIDS Prevention. “For those who have HIV, testing is the gateway to treatment and care, and for those who do not have HIV, testing can help link them with important prevention services so they can remain HIV-free.”

While manufacturers of antibody tests have steadily improved them over time, shaving down the window period in particular, it wasn’t until 2010 that a test could identify an infection before detectable antibodies arise. This now-common test detects what’s known as the p24 antigen, a protein component of HIV that spurs development of antibodies against the virus.

Consequently, the newer tests that look for p24 have a shorter window period, about three weeks. In addition, because the antigen disappears again after a few weeks, only to return at the onset of AIDS, its presence can indicate that someone is newly infected.

In August 2013, the first rapid test that can detect both HIV antibodies and p24, the Alere Determine, gained FDA approval. And in December 2014, after what James Krellenstein, a founding member of the Prevention of HIV Action Group of ACT UP New York, characterizes as a “long fight,” the FDA finally granted a CLIA waiver to this “fourth-generation” test.

“The clinical and public health ramifications of this delay were so significant that in October 2014 a very senior researcher at the Division of HIV/AIDS Prevention at the CDC pulled me aside at an HIV conference and, in private, strongly urged me and ACT UP to do something to expedite the process,” Krellenstein says.

“It is imperative that new, fourth-generation testing technology that detects viral antigen be adopted widely to detect early HIV and open the door to care linkage,” says New York’s Daskalakis. “Whether by awareness of acute infection and behavior change or treatment to suppress HIV, detection of [acute HIV infection] and using newer platforms for testing may allow us to better interrupt networks of transmission.”

Then there’s the nucleic acid test, which looks for HIV’s genetic material and can detect an infection even earlier than a fourth-generation test, at about seven to 12 days after exposure. It was approved for HIV testing in 2006. However, its use is limited, in part because of a high price tag and a long turnaround time.

Given the collective power of these new and ever-evolving tools, the CDC revised its HIV testing guidelines in June 2014 in an attempt to cut the typical window period by as much as three to four weeks.

The first step in the suggested protocol is a fourth-generation test for both HIV antibodies and p24. A negative result means an HIV-negative diagnosis (which can still be a false negative if someone was infected less than three weeks before). If the result is positive, next comes an antibody test that can distinguish between HIV-1 and HIV-2, which are the two types of the virus. (HIV-1 is predominant worldwide, while HIV-2 is found mostly in West Africa.)

This antibody test provides faster results than the previously recommended Western Blot assay, which arrived in 1987. If the result of the HIV-1/HIV-2 test is positive, it reveals which type of virus is present. If the result is negative or indeterminate, the subsequent step is a nucleic acid test. A negative result of that test indicates that the fourth-generation test’s positive result was false, so the diagnosis is HIV negative (again with the window-period caveat). A positive result indicates an acute HIV infection.
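For readers who think in flowcharts, the recommended sequence amounts to a simple decision tree. The sketch below is purely illustrative—the function and result labels are invented for this example and do not correspond to any actual CDC or laboratory software:

```python
def hiv_testing_sequence(ag_ab_result, differentiation_result=None, nat_result=None):
    """Illustrative sketch of the CDC's 2014 laboratory testing sequence.

    Each argument is "positive" or "negative" (the differentiation test
    may also be "indeterminate"). All names here are hypothetical.
    """
    # Step 1: fourth-generation antigen/antibody combination test
    if ag_ab_result == "negative":
        return "HIV negative (retest later if recent exposure is possible)"

    # Step 2: HIV-1/HIV-2 antibody differentiation test
    if differentiation_result == "positive":
        return "HIV positive (type per differentiation result)"

    # Step 3: nucleic acid test resolves a negative or indeterminate
    # differentiation result
    if nat_result == "positive":
        return "Acute HIV infection"
    return "HIV negative (initial antigen/antibody result was a false positive)"
```

Note that each step only runs if the previous one came back positive, which is why a lab following the algorithm rarely needs the expensive nucleic acid test.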

It’s important to note that no one who gets an HIV test should assume that the complete CDC protocol will be followed. Home tests, for example, still only look for antibodies. Also, there is only one FDA-approved nucleic acid test for HIV, which isn’t available in most public health labs because of various technical and logistical obstacles.

To address this problem, the CDC is testing a program that by 2016 might allow labs that don’t have easy access to nucleic acid tests to farm out their blood samples.

Meanwhile, researchers are developing nucleic acid tests that can be conducted at the testing site and could produce results in just an hour—and perhaps much more cheaply.

The CDC is also teaming up with others in the public health field and with players in the biotech industry to bring new tests to market that may lower the cost of HIV testing in other ways, and also to develop tests that can screen for additional pathogens, such as the hepatitis C virus (HCV), along with HIV.

“It’s important to think beyond the laboratory,” Michele Owen of the CDC says. “We also need to be able to reach people at the moment they receive health care with tests that can detect early HIV infection. The CDC is actively involved both in the field and in the lab to identify new testing methods and to develop new technologies to afford the most advanced HIV testing possible.”

HIV Testing Milestones

1981: First AIDS cases identified
1983: HIV discovered
1985: HIV antibody test (ELISA)
1987: Western Blot HIV test
1992: Rapid test
1994: Oral fluid test
1996: Home test (mailed to labs)
2002: Finger prick rapid test
2004: Rapid oral test
2010: Antigen/antibody test
2012: Rapid oral fluid home test
2013: Rapid antigen/antibody test