Wednesday, July 7, 2010

Crisis-Point Interventions, Part I

I've been working on a new post for five days now, and it keeps squirming out of my grasp. The fact is that the data on the topic is a lot more ambiguous than I wish it were. My goal was to use last Friday's Times article on Chicago's new program to reduce gun violence against grade-schoolers (actual article; my summary) as a jumping-off point for a discussion of crisis-point interventions versus early interventions and prevention programs. The Chicago program takes high schoolers who are currently in gangs or have been in the past, and tries to get them safely out of gangs and out of the way of former gang contacts seeking retaliation; why not, I wondered, target younger kids and try to prevent them from joining gangs in the first place? It's harder than I expected to make that argument.

When you're talking about a program like this, the big question is cost-effectiveness. You only have so much money—six million dollars in federal stimulus funding, in this case—and you want to use it to have the biggest impact you can. By my admittedly rough estimate, the Chicago program is costing around $6,000 per student per year. (See my calculations.) To put that number in perspective, that's about 60% of the total cost of sending these kids to school every year.[1] At that rate, the program will cost a total of around nine million dollars when it's expanded to cover 1,500 students next year.

Based on the data in the article, it's impossible to be sure what the program's real impact is—there were 40 fewer students shot this year than last year, a reduction of about 16%, but there's no way of knowing whether that drop is due to the program, especially since the article gives no information on how much the number of shootings varies from year to year.[2] Even if we assume that the reduction in shootings was due entirely to the advocate program, though, that doesn't mean the program was cost-effective—after all, there might be another program that could prevent more shootings for the same amount of money.
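To show why the missing variance information matters, here's a purely illustrative back-of-the-envelope sketch. It assumes yearly shooting counts behave like independent Poisson events—an assumption the article gives us no basis for, and one that real-world gang violence almost certainly violates—just to show what a naive estimate of year-to-year noise would look like:

```python
import math

# Figures from the article: 258 shootings in 2008-09, about 40 fewer this year.
last_year = 258
drop = 40

# Illustrative assumption only: if annual shooting counts were
# Poisson-distributed, the standard deviation of the yearly count
# would be roughly the square root of the mean.
sd = math.sqrt(last_year)
print(round(sd, 1))         # year-to-year noise, in shootings (~16)
print(round(drop / sd, 1))  # the observed drop, measured in units of that noise
```

Under that (strong, unverified) assumption the drop looks larger than noise alone would typically produce; but if shootings cluster—retaliation cycles, hot summers, policing changes—the true year-to-year variance could be much higher, which is exactly why the article's silence on variance is a problem.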

I assumed, naively, that the motivations for the Chicago program were sentimental: the city was devoting all its resources to the neediest few, rather than stepping back, practicing a little triage, and fixing the problem at its source. Turns out, I may be the sentimental one. The current wisdom on how to reduce youth gang activity is decidedly cool-headed—possibly even cold-headed.

A US Dept. of Justice manual on youth gangs from August of 2000 provides a table of "selected gang program evaluations."[3] How did they select them? Who knows, and there could be bias in the selection, but the patterns are striking nonetheless. Of 14 prevention-based programs listed, one had its evaluation discontinued for unspecified reasons, six had negligible, indeterminable, or even detrimental impacts, three had small or marginal positive impacts, two had some or moderate positive impacts, and two had unqualifiedly positive results. On the other hand, of the seven suppression-based programs—i.e., programs to reduce gang crime through curfews, prosecution, and other police activity—four were deemed all-around successes, and none was completely ineffective.

The subtitle of this blog is "Ideology-free analysis of current issues in education and schooling," so I present this data, which leads to conclusions I don't at all like, and I'm not about to explain it away—but I do intend to discuss it further. Suppression is a pessimistic way of fighting gangs; it assumes that induction of poor, inner-city kids into gangs is inevitable—or at least too intractable to be worth preventing—and that the state's involvement should not extend beyond law enforcement. I'm not a pessimist, and I don't like it.

But I am sick of working on the same blog post for five days, so this one's going up, and the next one will pick up the discussion where this leaves off. In the meantime, if you're in New York, may I suggest a swim?

  1. ^Illinois average per-pupil spending for public education in the 2006-07 school year, the last year for which I have data, was $9,586 (NCES); even if it has crept past $10,000 in the years since, spending in poor, inner-city districts is likely to be below the state average, so $10,000 seems a fair ballpark for the total cost.

  2. ^If I'd been in charge of implementing this program, I'd have randomized program enrollment among a slightly larger pool of at-risk students, so that I'd have actual experimental data with a control group. Of course, that might have reduced the program's effectiveness, but I doubt the impact would have been significant. As implemented, the program enrolled the 210 students at the highest risk of getting shot; to create a control group, they would have had to take, say, the 400 highest-risk students and enroll a randomly selected 210 of them, so some of the highest-risk kids would have ended up losing their place to slightly lower-risk kids… but I think that effect would have been negligible. We don't know how they calculated student risk, but it can't have been an exact measure. (One local Chicago-area media-commentary blogger refers to "the questionable statistical analysis behind identifying the [high-risk] kids," but I haven't yet obtained any details on how the analysis worked or why that blogger thinks it's questionable.)

    I'm going to write a whole post on this issue of randomized program design soon, so if you found the above discussion confusing, come back in a week or two.
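The design sketched in that footnote can be spelled out in a few lines. Everything here is hypothetical—we don't know how Chicago actually scored student risk, and the names and numbers are mine, matching the 400-student pool and 210 slots I suggested above:

```python
import random

def randomized_enrollment(risk_scores, pool_size=400, slots=210, seed=0):
    """Illustrative design: take the pool_size highest-risk students,
    then randomly enroll slots of them; the rest form the control group.
    risk_scores maps student IDs to a (hypothetical) risk score."""
    ranked = sorted(risk_scores, key=risk_scores.get, reverse=True)
    pool = ranked[:pool_size]
    rng = random.Random(seed)  # fixed seed so the assignment is reproducible
    treatment = set(rng.sample(pool, slots))
    control = [s for s in pool if s not in treatment]
    return sorted(treatment), control

# Toy data: 1,000 students with made-up risk scores.
scores = {f"student_{i}": random.Random(i).random() for i in range(1000)}
treatment, control = randomized_enrollment(scores)
print(len(treatment), len(control))  # 210 enrolled, 190 in the control group
```

The point of the fixed pool is that treatment and control kids are drawn from the same risk range, so any difference in shooting rates between the two groups can be attributed to the program rather than to who was riskier to begin with.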

  3. ^ U.S. Department of Justice, Office of Justice Programs, Office of Juvenile Justice and Delinquency Prevention, "Youth Gang Programs and Strategies."

Summary of Friday's Article

back to main post

The discussion I'm opening up in this post is a huge one, and the article's just a jumping-off point, so its details are not crucial, but here's the gist.

In response to last year's record incidence of gunshot wounds among schoolchildren, Chicago is implementing a new student-safety program for its most disadvantaged public schoolers. Using federal stimulus money, the city has assigned adult mentors, called advocates, to the 210 highest-risk students (highest risk of gunshot wounds, that is, not of poor educational outcomes as the term usually implies, but I assume there's a strong correlation there). The advocates are on call 24 hours a day to lend a hand in any way needed. The idea is for them to form close emotional bonds with their charges and provide emotional, practical, and academic support.

Initial data on the program are promising: according to the article, school attendance is up for the 210 kids in the program; they're getting into less trouble than they did last year; and only three of them were shot (none fatally, in case you were worried). Citywide, gunshot wounds among public schoolers were down 16% from the 258 recorded in the 2008-2009 school year. Of course, we don't know how many of those 210 would have been shot without the program, nor how much of that 16% decrease is due to it—but those numbers certainly sound impressive. Chicagoans are so happy with the program that the city is planning to expand it to cover 1,500 students next year.

The article is uncritical and heavily anecdotal: it's largely the touching tale of one former gang member and his beloved advocate. Not to give the Times too hard a time, though—there's a lot to be said for anecdotes, and it's worth remembering, once in a while, the actual people living these lives.

My Questionable Calculation of the Cost of Chicago's Advocate Program

back to main post

The program pays advocates $12 per hour of one-on-one time spent with kids. If we assume that the average advocate spends ten hours per week with each of their charges[1], that works out to $12 × 10 × 52 = $6,240 per kid per year. This number doesn't cover the program's administrative costs, and we don't know whether the advocates work with kids through the summer or only during the school year—though it seems logical that they would stay active over the summer. I have a request in through the Chicago Public Schools website for more information on this.

  1. ^This is a very rough estimate. The article gives no figure for average hours per kid per week (or any equivalent metric), but each advocate was assigned a limit of four students, so my 10-hour estimate assumes a 40-hour work week for a full-load advocate. That's actually a pretty short work week for this kind of job, but a lot of advocates have second jobs or attend school on the side. We're not on firm ground here, but given the tremendous drama of these kids' lives, I figure that any advocate who's committed to the job is liable to spend a lot of time with their charges. Ten hours a week seems reasonable, if not conservative.
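The arithmetic above, spelled out. The $12 hourly rate comes from the article; the ten-hours-per-week figure and the year-round (52-week) schedule are my assumptions, as discussed:

```python
RATE = 12             # dollars per hour of one-on-one time (from the article)
HOURS_PER_WEEK = 10   # assumed: a 40-hour week split across 4 students
WEEKS = 52            # assumed: advocates stay active through the summer

cost_per_kid = RATE * HOURS_PER_WEEK * WEEKS
print(cost_per_kid)          # 6240 dollars per kid per year

# Projected total once the program expands to 1,500 students:
print(cost_per_kid * 1500)   # 9360000, i.e., roughly nine million dollars
```

If advocates only work during the school year (say, 40 weeks), the per-kid figure drops to $4,800, which is why the summer question matters for the cost estimate.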
