YOUR SAY

Hast thou slain the Jabberwock?

Computer marking risks the death of learning

Mary-Ellen Betts

’Twas brillig, and the slithy toves
Did gyre and gimble in the wabe:
All mimsy were the borogoves,
And the mome raths outgrabe.

Lewis Carroll’s Jabberwocky is an obvious candidate to submit for marking by a computer. Would the computer understand the humour, the inventiveness, the sheer brilliance of this piece of nonsense? Perhaps not, but it would recognise the accuracy of the syntax.

The Australian Curriculum, Assessment and Reporting Authority (ACARA) is about to give Australian educators exactly what many have been asking for — a quick turnaround from NAPLAN on student results.

This will be achieved by programming computers to mark student writing. We are promised that there will be solid research into the practice and transparent sharing of the findings before the computers take over in 2017. Computers can and do mark student writing already.

When I went to work in New York City in 2003 I was appalled at the New York State English Language Arts (ELA) testing program. Our NSW Basic Skills Test was a paragon of virtue in comparison. Would you believe the ELA test was not written by the US Education Department? Neither was it based on NYC standards! School results were published on the department’s website in a ranked list and, worst of all, the student writing was marked by computers. Unbelievable!

It was high-stakes testing and it was corrosive. Sound familiar?

On my return in 2006 I believed that there was nothing in the New York City system that should be replicated in an Australian school. Now we have NAPLAN and the My School website. Teachers have learnt to live with both. We will probably learn to live with computer marking. But let’s reflect on what we have already lost to this method of testing.

The writing component of the NSW Basic Skills Test was marked by teachers and involved some of the best professional development available.

Literacy consultants across the state were trained in analysing student writing against common criteria. Consultants went back to their districts and worked with teams of teachers from a number of schools.

Over time, a huge number of teachers across many schools were trained in analysing and moderating student writing.

It was powerful professional learning. That doesn’t happen now. Teachers are paid to be markers and are trained by an external provider. Markers go back year after year. I have been advised that it is a much smoother process than when it was run by the department. And hey, the teachers are being paid. What’s the gripe? We have lost the professional learning.

Last week, I sat with teachers to moderate student writing. A computer could probably have sorted the writing in less than 10 minutes. It took us half the day but we learnt a great deal more than any computer analysis could have given us; we established clear goals for each group of writers.

The marking of student writing by a computer threatens our professionalism. It reduces student writing to a series of equations. Our students are much more than that.

The NAPLAN markers will tell you that it is possible to identify the work of different coaching colleges when marking student writing. When we know what the computer has been programmed to look for we will be asking our students to learn how to join a set of dots, not how to write.

I do not look forward to discussing “purpose and audience” with a class of eight-year-olds before their NAPLAN test. I want assessment processes that help me improve their learning, not just measure it.

I know a computer cannot give me that.

Mary-Ellen Betts is a retired deputy principal.