I picked up The New York Times a few months ago and saw a rather large article touting a promising anti-HIV therapy, the mineral selenium. Many treatment newsletters gave the work on selenium a few lines; many didn’t cover it at all. Most of us who work on AIDS research issues are familiar with this sort of stuff -- a steady flow of extremely preliminary investigations, most of which turn out to be dead ends.

So why the Times fanfare? Because it was simply a slow news day. A bored and careless reporter from the Associated Press, who had read the selenium paper in an obscure scientific journal, decided to write about it.

The Times article is one of the more egregious examples of the baseless inflation of a scientific footnote into a cure du jour. It’s fairly easy to spot one of these whoppers, though, if you know a few things to look for.

The selenium headline was “New Theory Links HIV to Depletion of a Mineral.” Be on the look-out for the word “theoretical” in discussions of AIDS treatments. It often means that what is written about is no more than someone’s bright idea -- hardly the basis for a treatment decision.

Another phrase that should raise a flag is “in vitro,” which means “in the test tube.” Gasoline kills HIV in vitro, but you wouldn’t want to be giving Texaco Unleaded to people with HIV, or anyone, for that matter. Generally, to reliably evaluate an agent, it must undergo a long research process culminating in human clinical trials.

Most of the time, the information you hear about both approved and experimental therapies reaches you through a long chain of interpretations. The principal investigator of a trial presents his or her results, which are then interpreted by a reporter, whose article is read by your doctor or handed to him or her by a drug company’s sales representative. Finally, the doctor talks to you about it. Like the party game Telephone, what you eventually hear may be very different from the actual findings of the study.

A good example of this is the controversial Concorde study of AZT. In brief, the study found that starting AZT early in the course of disease doesn’t prolong life any more than starting it later. Headline writers, however, decided that this meant that AZT doesn’t do any good at all. In fact, AZT has a demonstrated ability to prolong life as compared to taking nothing at all. Concorde did not address this issue, and the conclusions some reporters reached were not the conclusions of the study itself.

Mistakes such as this often arise from the media’s tendency to exaggerate. More insidious is the way people, from reporters to activists to drug companies, put their own spin, or interpretation, on data. Again, Concorde is a good example. The “AZT is poison” crowd declared the drug completely useless. The “it’s never too early” crowd tried to depict the trial design as flawed. Both spins were contrary to the facts. AZT maker Burroughs Wellcome and some AIDS researchers attacked the study’s conclusions themselves. The Journal of the Physicians Association for AIDS Care published an article that vigorously attacked Concorde, with Joep Lange of the World Health Organization maintaining that early intervention is still the best treatment strategy: “It just makes more sense to hit the virus when there’s not a lot of it around.” He doesn’t seem to care that Concorde provided powerful evidence that early intervention with AZT doesn’t make people live longer.

Confused? You ought to be. Your best bet is to read the original studies themselves and draw your own conclusions. If you want some pointers on how to understand a study, read the American Foundation for AIDS Research Treatment Directory article “A Consumer’s Guide to Clinical Trial Results.”

A healthy dose of skepticism of headlines -- and a high regard for evidence -- will take you a long way. Remember what Jack Webb from the old Dragnet series used to say: “Just the facts, ma’am. Just the facts.”