Wednesday, April 24, 2013

Evaluating Counseling Programs

By: Patrick O'Connor, Ph.D.


The last twenty years of education reform indicate that our gut feelings about our work are no longer enough to satisfy bosses, policymakers, and funding sources.  In addition to knowing we make a difference, we have to show we make a difference, and that requires data.

The good news is that we come to this challenge well prepared.  Almost every school counselor has had a course in statistics, and while t-tests and z-scores may be a blur, three key ideas from Statistics 101 are crystal clear: there is such a thing as bad data; not all data involves numbers; and statistics can be skewed to tell just one side of the story. Keeping these three things in mind, we can pursue client-centered data knowing which questions we want to answer and how to go about answering them.

There is such a thing as bad data, and many school counseling programs use poor data simply to keep their bosses happy.  A classic example is a counseling office that gives the principal the total number of student visits to the counseling office in a year.  This is helpful at a basic level (10 total visits in a year clearly suggests something's wrong), but a richer approach comes from asking two additional questions: which service was the student seeking on that particular visit, and had the student been to the counseling office before?  This information can strengthen the numbers the administration wants ("450 of our 525 students visited the counseling center this year") and give counselors some idea of which programs are popular and which are underused ("Only 40 of those visits were related to career counseling, and those were made by the same 14 students.")  Suddenly, this is data with a richer purpose.
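For offices that keep their sign-in log in a spreadsheet or database, here is a minimal sketch of how those two extra questions turn a raw visit count into the richer numbers described above. The field names and records are hypothetical and not tied to any particular system.

    from collections import Counter

    # Hypothetical visit log: one row per counseling-office sign-in, recording who
    # came and which service they were seeking. Field names are illustrative only.
    visits = [
        {"student": "S001", "service": "career counseling"},
        {"student": "S001", "service": "career counseling"},
        {"student": "S002", "service": "scheduling"},
        {"student": "S003", "service": "personal counseling"},
        {"student": "S002", "service": "scheduling"},
    ]

    total_visits = len(visits)
    unique_students = len({v["student"] for v in visits})      # "450 of our 525 students"
    visits_by_service = Counter(v["service"] for v in visits)  # popular vs. underused programs
    students_by_service = {
        service: len({v["student"] for v in visits if v["service"] == service})
        for service in visits_by_service
    }

    print(f"{unique_students} students made {total_visits} visits this year")
    for service, count in visits_by_service.items():
        print(f"{service}: {count} visits from {students_by_service[service]} students")

Even a tally this simple answers both the administration's question (how many students we reach) and the program question (which services are carrying the load).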

Not all data involves numbers.  Another source of bad data is the one many counseling offices use to evaluate the strength of their college counseling services.  Known as "the list," it has counselors devoting all kinds of time to tracking students down to find out where they were accepted to college and where they plan to attend.

These numbers may soothe the principal and the public ("See how many students are going to State U!"), but they don't really tell us whether students feel the counseling office helped them make a good college decision. A quick look at a college counseling survey at http://www.surveymonkey.com/s.aspx?sm=ONl3pPcBpZcC7OpTFFUH_2fQ_3d_3d shows a different approach to evaluation.  Not only do the answers provide ideas for improving counseling services; the survey also gives respondents the opportunity to provide data in writing.  It's easy to pull these comments together into a sense of the majority view, and the written responses offer greater depth than a scale of 1 to 10. (And if you're on a budget, you can create your own 10-question survey for free at www.surveymonkey.com.)

Data can be skewed.  Stats class showed us examples of surveys that didn't tell the whole story, and we can't let that happen in our own work. If you haven't surveyed students and parents before, make your first attempt in the fall, when you can measure current attitudes toward counseling services.  That data can suggest ways to tweak counseling services during the school year, so that the end-of-year data can show progress in meeting students' needs, even if there is still room for growth in meeting all of them.
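If your surveys include rating-scale questions, one simple way to show fall-to-spring progress is to compare the average rating on each question. The sketch below uses made-up questions and numbers purely to illustrate the comparison; it is not drawn from any real survey.

    # Hypothetical average ratings (1-5 scale) for the same questions asked in the
    # fall and again at the end of the year; the labels and numbers are invented.
    fall   = {"I know who my counselor is": 3.1, "I got help planning for college": 2.4}
    spring = {"I know who my counselor is": 4.2, "I got help planning for college": 3.0}

    for question, baseline in fall.items():
        change = spring[question] - baseline
        print(f"{question}: {baseline:.1f} -> {spring[question]:.1f} ({change:+.1f})")

Reporting the change alongside the spring number keeps the story honest: it shows growth without hiding where needs are still unmet.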
Counselors effectively soothe student fears about testing all the time; isn't it time we overcame our profession's anxieties about data-driven evaluation?  It's one of the best student-centered ways to keep count of the services that count most.

