Wednesday, September 19, 2007

More Thoughts on Assessment

I can't sleep. It may have something to do with regular insomnia. Or it may have to do with the banging coming from over the neighbor's way (sounds like a door banging in the wind, maybe?). Or whatever.

One of the things that struck me today listening to the assessment head talk was that there's a crucial piece missing.

The assessment guy's been collecting data for X number of years, and has found that our students don't write as well as we think they should.

The assessment guy has also realized that the method of assessment is rather flawed, so he went around and asked people on campus if they thought that despite the flaws, our students don't write as well as we think they should. And, yes, pretty much everyone on campus says that our students don't write as well as we think they should.

Now, assessment guy wants to say two things: 1) We need a better method of assessment because the last one isn't well designed. 2) The data from the last one is actually valid enough that I should keep my job despite the fact that I designed the flawed methodology. But we can't actually use the data because I need to collect more for another X number of years.

I'd like to fire the assessment guy (which means he'd go back to teaching, not go on unemployment), but as we in higher ed know, administrators can never be fired for incompetency or actually expected to accomplish what's promised.

Let's accept, for the moment, that the interpretation of the data, though imperfect, is broadly right: our students don't write as well as they should.

The next question is why?

Is the faculty incompetent?
Is the faculty basically competent, but doing something wrong which could be changed for the better?
Are there other factors which aren't helping students write well?

Because the way assessment works, we measure students' work at the end, their "outcomes," and then decide if we're doing an acceptable job or not. And since it's the school's responsibility to do the job, and as far as writing's concerned, it's the faculty's responsibility to foster good writing, then something needs to happen with the faculty.

There are difficulties with this approach, of course, since education isn't just something that faculty do to students. But the school can't control what students do, so measuring what students do doesn't make sense for assessment. (This is like blaming plastic widget makers for imperfections in the plastic widgets they turn out without recognizing that the plastic going in--over which the widget makers have no control--is of poor quality or messed up with extraneous stuff.)

Assessment doesn't tell us about how what we do could be different, or how what we do is different (or not) from what our peer institutions do. Assessment just measures outcomes.

Ah, but we should all do best practices! Except, on an institutional level, we know we aren't, at least when it comes to class size.

As far as what faculty do in their classrooms? We're all doing process work, the sort of work that composition specialists tell us helps students learn to write better. We all teach using brainstorming techniques, drafting, draft revisions, etc.

How do we do better? I don't know. One thing I do know is that when I have students do brainstorming or drafting in my upper level classes, they tell me they usually just write one draft and turn it in. So something's not "taking" between their first year writing course and whatever other writing they do in college.


  1. My theories about why it doesn't "take" -

    1) Because it doesn't have to. Writing instructors expect a level of quality that isn't necessary in, say, their sociology class or their chemistry class. (I chose those two disciplines at random, by the way.) This is the case at my institution. I'll have students in the first year, teach them writing, and they'll seem to really have gotten it, and then I'll see them again two years later and all of it is gone - because they didn't need to practice it except for with me.

    2) It's important to remember that more people are going to college than ever before. In a former time (even 10 or 20 years ago), many of these people would never have come to college. These people may never become good writers. Why? Because maybe they're not really equipped to do so and never will be.

    3) Assessment-based K-12 education is in large part to blame. You can't expect students to do high-level critical thinking and writing well after years of filling in bubbles and hitting marks on rubrics.

  2. The question he went around asking is dumb: of course people will say that students don't write well enough. It's a trope about writing; for centuries in American education, people have complained about it.

    It would be much better to go around asking questions that would encourage faculty in all those places on campus to figure out what their expectations are, where they teach students to meet those expectations, and what the faculty think of the work in those disciplines. His approach isn't going to get there.

  3. I think Dr. Crazy (as usual) makes some good points... they lose their abilities to write when they don't practice them.

    Of course, the other side of the equation is that, even when explicitly instructed, they don't display the abilities they should after the basic comp course. I'm not sure what the cause of this is -- but, I suppose there is a difference between performance of the ability in a comp class and application of that ability in other contexts.

    We've recently done some writing assessment at my college -- and we found out that the students who should be able to communicate --because they've passed a class with that goal -- can't do so.

    What to do about it, I don't know...

  4. I think Dr. Crazy says some smart things, in particular. I was also thinking, though: isn't writing well something that generally needs to be learned earlier than college for it to stick? Not to shift the blame here, but I don't think it's entirely in the control of faculty members. A writer is formed much before that. Their *aptitude* for writing, in a sense, is formed much before that.

  5. To the contrary, the widget makers do not turn defective plastic into defective widgets. They reject the plastic at the inspection station, and if the vendor continues to deliver defective plastic, the widget maker finds a different vendor.

    Vendors often obtain ISO 9001 quality management listing, or something similar, as a signal that their workforce understands what to do in case the production line starts producing defective plastic. Widget makers can also obtain such certification. It works a little bit differently in industry than it does in education, where the vendors (school districts) and widget manufacturers (universities) have more control over their own accreditation.

    And that, dear readers, is part of why widget manufacturers don't fret about a lack of "access" for vendors of substandard plastic.

  6. Anonymous, 11:59 AM

    I would be a much happier camper at my uni if the people who run the composition program would delineate in some detail exactly what skills and knowledge we instructors have a right to demand of incoming freshmen comp students--and exactly what outcome is expected at the end of the sequence.

    Not knowing what is considered to be a bare minimum knowledge/skill set dulls my fangs and makes some of my effort ineffective. When I tell a student, "You should have learned X before you enrolled in college," I would like to know for certain that I'm not just blowing around so much hot air, that my program directors have actually told me that the student should already know X, and that I can, indeed, toss a deficient paper back at a student and demand that s/he make an effort to learn X before trying again. I'm always willing to help out in office hours, of course.

    Most of our expectations CAN be defined and quantified, but they aren't. I wonder how it is in other departments.

    And, yes, college teachers have been complaining about students' deficient writing skills for decades, but this trend does not automatically indicate that college students' writing ability has remained essentially static during the same period. In fact, I rather suspect that it hasn't.

  7. Here's why it doesn't work: grades.

    (I blogged about this today, as it happens.)

    Students aren't trying to write something for you. They're trying to get a grade.

    They're trying to get that grade with the least effort possible because, you know, that's the scam. That's how the system works. (They aren't actually the widget: they're the lineworker, *making* the widget.)

    How do we fix it? This is what I think: eliminate grades. (Stop trying to turn a profit on the widgets.)

    Make them remake the widget until it's right. (I won't take that widget until it's a good widget!) You don't leave this classroom until I *like* your essay on Eliot! Here's what I don't like, now go do it again.

    You'd get some good writing soon, I bet.

    Or fewer students, one.

  8. Coming to this late, but in terms of the writing skills "taking", I like to remind those who complain that a student can't learn everything about writing in 15 weeks. The equivalent would be for a student to pass as a biologist after only one semester of freshman level biology. Not gonna happen.

    I'm a cranky comp-rhet person today.