Employers in the United States have attempted to address issues perceived to be associated with drug and alcohol use by instituting drug-testing programs. Proponents of drug and alcohol testing contend that screening employees advances safety and productivity within the workplace. A large body of data suggests a correlation between employees' substance abuse and the costs it might impose on the company in the forms of lower productivity, increased absenteeism, and an increase in workplace accidents (Carpenter, 2007). To combat this dilemma, employers have implemented a variety of systems and programs devised to reduce and deter employee substance abuse. The purpose of this paper is to present the legal, moral, and practical implications of implementing mandatory workplace drug testing.
Early reports of substance abuse in the United States military began soon after the Civil War, beginning with morphine. This does not include the millions of opium pills that were distributed to those wounded on the battlefield. During the Vietnam War, it was reported that widespread use of heroin and marijuana had taken hold in the US military as service members attempted to cope with the stresses of conflict. In 1981, the USS Nimitz suffered 14 deaths and 48 injuries after a military jet slammed onto the Nimitz's flight deck (Wilson, 1981). The total cost of property damage was an estimated $150 million. Autopsies revealed that six of the men who were killed tested positive for marijuana. The aforementioned events prompted the Department of Defense to enact new regulations and policies regarding the penalties for drug use.
In the United States, the cost of drug abuse within the workplace was estimated to be around $46.9 billion (NIDA, n.d.). By 1986, that figure had risen to $100 billion.
The Drug-Free Workplace Act of 1988 and the Anti-Drug Abuse Act of 1988 were pieces of legislation implemented in order to expand drug testing to federal contractors and to reduce the amount of drug use within the workplace. In 2014, it was estimated that about 10.6 percent of full-time employees and 13.2 percent of part-time employees reported using illicit drugs within the past month (National Survey on Drug Use and Health [NSDUH], 2014). The rationale behind workplace drug testing is simple and straightforward: by raising the expected costs of drug use on the job, employers aim to deter the consumption of illicit substances among prospective and current employees.
Despite the evident benefits drug testing provides for employers, it also has inadvertent drawbacks, such as discrimination. An employer has the power to subject its employees to a drug test to maintain a 'clean' workplace, but how the testing process is administered, and to whom, is not always determined — specifically, which individuals or groups are subjected to such a process. Perhaps the greatest concern is that systemic discrimination within a workplace might specifically target individuals who suffer from disabilities, such as individuals with diseases like HIV and AIDS, as well as minorities. The opposition, however, would argue that the drug testing process only shows that the individual being tested had ingested the substance at some time, but does not reveal when.
Employers who have successfully implemented drug-free workplace programs have reported satisfying experiences, including improvements in employee productivity and decreases in employee turnover, downtime, and theft (National Institute on Drug Abuse, 2018). In addition, employers who have enacted drug-free workplace programs have reported improved health status among their employees and their family members (Substance Abuse and Mental Health Services, 2008).