BBC

UNDERSTANDING TESTS FOR HIV

In 1983, the virus that came to be called the human immunodeficiency virus (HIV) was discovered in human blood samples. Two years later, researchers developed a test to detect HIV in the blood. At that time, the best use of the test was to screen people who wanted to donate blood so that blood transfusions and the blood supply would be free of HIV.
Medical researchers, however, were reluctant to use the screening test to identify people with HIV infection. First, although the test was one of the most accurate tests of its kind, occasional errors in its results created a lot of anxiety. Someone whose test result was positive might not have HIV, while someone whose test result was negative couldn’t be certain that HIV was not present. Second, at that time, the public was not ready to accept people with HIV infection, and the newspapers were full of stories of discrimination against these people. This public response was partly due to fear and partly to our uncertainty about how the virus was transmitted. The third reason, and the most important, was that physicians had little to offer anyone who tested positive. As a result, recommendations from the medical profession and from others concerned with the epidemic about whether to get tested for HIV were ambiguous. That is, no one knew whether testing for HIV made sense or not.
Since those early days, the accuracy of the test has improved substantially, public understanding has progressed somewhat, and medical research has made enormous strides in the treatment of HIV infection. Recommendations have changed accordingly. The purpose of this article is to describe the tests themselves and their accuracy, make recommendations about who should be tested, discuss the confidentiality of testing, and help interpret the results.

