In The New York Times, Amy Harmon answers some letters resulting from her article on Justin Canha:
Q: I had a concern with the lack of follow-up in the system to determine the effectiveness of various interventions in the special education program. Apparently federal funding does not require this. But that is not an excuse for ignoring the need to measure impacts, in a program that uses scarce resources to help young people deal with their challenges. There is a saying in the business world: “You get what you measure.” Since there seems to be no attempt to measure, what is society getting? Or what are the challenged young adults getting from the choices that are being made for them without a systematic effort to determine results?
A: I think that’s a very valid question, one I tried to raise in the story without being too heavy-handed about it. This year, the Department of Education required states for the first time to report how many of their special education students had either worked or attended post-secondary school for at least 90 days in the year after they graduated. But there is no federal-level effort to track the relative success of the type of transition program I profiled, and few, if any, states do it either. Gathering such data is complicated, especially because of the wide range of students with disabilities served by public schools. But it seems crucial to know. Paul Wehman, director of the Rehabilitation Research and Training Center at Virginia Commonwealth University, is in the middle of conducting what I believe is the first controlled study of a community-based transition program, comparing it to the standard transition program in a local high school.