Source:  http://findarticles.com/p/articles/mi_m1316/is_n5_v25/ai_13786316

Just say nonsense - Nancy Reagan's drug education programs

Washington Monthly, May 1993, by Jeff Elliott

"I can't get caught in the middle here--I've got my job," the man whispers into the telephone, careful not to be overheard. "But I've seen data that this stuff doesn't make any difference at all." The "stuff' is an educational product sold by this guy's company. Is he peddling video games? Dumbed down textbooks? Nope: a Just Say No anti-drug curriculum taught in schools. Tums out that the effectiveness of the mantra of Nancy Reagan's first ladyship may be about as durable as a caffeine buzz.

In the past 10 years, drug education has become a $2 billion industry, with companies selling expensive teachers' seminars and anti-drug texts to schools across the country. At least $1.25 billion of that comes from federal block grants that schools can use for counseling, social work, and other "anti-drug" activities and paraphernalia--including posters, bumper stickers, t-shirts, and coffee mugs. At least an additional $750 million comes from local and state governments and corporate gifts. Money well spent if it keeps our children from inhaling, right? That's the catch: There's little evidence that these dollars do anything to keep kids from using drugs. It's a new alphabet soup, all acronyms for messages of abstinence: STAR, DARE, ALERT, and dozens more. The nation's schools have bought a bill of goods so large and so ostensibly worthy that it's difficult to acknowledge disappointing results. And the money, now that it's flowing, is subject to the oldest fact of bureaucratic life: Once a government subsidy starts, it's virtually impossible to shut down.

Since the early eighties, Just Say No courses--classroom sessions, usually about a dozen, held over a few months in sixth, seventh, or eighth grade, sometimes supplemented by community and high school events--have quietly become commonplace. These programs have produced exorbitant claims of success: "More than 25 million kids will be impacted by the highly successful DARE program," reads one glowing press release. "DARE's impact on reducing drug use among young people is well-documented by numerous studies."

Drug bust

That, of course, sounds encouraging; the only problem is that it's not true. Perhaps the best that can be said of the Just Say No ideology is that it makes parents and teachers feel better. A 1988 analysis published in the Journal of Drug Education found that although the sessions probably didn't keep kids away from drugs, they did reassure "parents that the schools are at least trying to control substance abuse among students." Meanwhile, a new University of Michigan study finds that, after years of decline, marijuana, cocaine, and LSD use is actually rising among eighth graders.

Consider Project SMART: Founded in 1981, SMART was one of the earliest Just Say No programs taught in junior high schools. The approach seemed valid: Once a week for 12 weeks, kids would be taught to resist peer pressure, interpret ads for booze or cigarettes, and practice saying no through role playing. To evaluate how well the program worked, the children were tested one and two years later. The first follow-up found children still doing well, at least in resisting cigarettes. But at the end of the second year, almost all positive effects had faded, and kids in the study were now smoking cigarettes and marijuana to about the same degree as those who hadn't been in SMART.

Worse, the evaluators lost track of more than 50 percent of the kids before the follow-ups were completed. As a result, one reviewer of SMART says, "Even the modest effects reported by the investigators are debatable."

But through the magic of statistics, these small gains have been translated into tremendous victories in the war on drugs. In an evaluation of one program, ALERT, 1.4 percent of the kids who had been in the class had tried marijuana within 15 months of completing the course, compared to 3.7 percent of the kids who hadn't participated--a difference of 2.3 percentage points. ALERT researchers, however, claimed a heartening 60 percent drop in marijuana use. Technically, they were right: 2.3 is about 60 percent of 3.7. And doesn't a 60 percent improvement sound so much better than 2 percentage points?

A better indicator of ALERT's success comes from a study by the Rand Corporation and the Conrad N. Hilton Foundation. The evaluators found that the program had the strongest effects on the kids who needed it least. Children who had puffed on just one or two cigarettes came out of the program with a strong distaste for l'air de Marlboro. While ALERT delayed many kids from trying marijuana and had modest effects on their use of alcohol, the benefits disappeared the next year. According to Dr. Phyllis Ellickson, the project's principal investigator, this shows that, "If you make even a little dent, it's hard to maintain it."

Cooking the numbers like this is not unusual. "Generally, the agencies that are funding the programs are also the ones paying for the evaluation," says Dr. Joel Moskowitz, principal investigator for the Western Consortium for Public Health, a nonprofit attached to the University of California at Berkeley and at Los Angeles. "There's a tendency by the evaluators to report back what they think the agency wants to hear rather than what's really going on. Nobody wants to be the bearer of bad news. It makes the program providers unhappy, and it makes the funding agencies unhappy. It puts enormous pressure on the evaluator to massage the data to produce positive results."