Machine Scoring in the Assessment of Writing

 

Gettysburg Address Flunks Robograder
Do you want your writing graded by a machine?

Computer-graded writing continues to spread nationally, but at what cost? Many colleges already use machines to score placement-test writing, and the Common Core standards assessments arriving in 2014 will likely expand the use of robograding. Yet, "while [robograders] may promise consistency, they distort the very nature of writing as a complex and content-rich interaction between people" (CCCC Writing Assessment: A Position Statement).

The Conference on College Composition and Communication (CCCC), a constituent group of NCTE, supports human assessment because assessment that isolates students and forbids discussion and feedback from others conflicts with what we know about language use. We write for social purposes, so it follows that only a human reader can accurately judge whether a piece of writing communicates.

"If a student's first writing experience at an institution is to a machine, this sends a message: writing at this institution is not valued as human communication­ -- and this in turn reduces the validity of the assessment." (CCCC Position Statement on Teaching, Learning, and Assessing Writing in Digital Environments)

"E-Rater doesn’t care if you say the War of 1812 started in 1945," notes Les Perelman in The New York Times. Despite what the grading companies say, their products are far from perfect. As NCTE member Les Perelman found, computer-grading machines are fallible. On a scale of 1 to 6, the Gettysburg Address earned a 2. Perelman also found several ways to game the machine. Machines may grasp how to grade great grammar; they cannot enjoy the emotions of evocative essays or see sparks of subtle smarts.

Computers grade student writing through rigid algorithms that do not account for the many forms good writing can take. For example, Perelman found the machines overvalue magniloquence. And they downgrade sentences starting with "and" or "but." But with so much riding on large-scale testing, students can hardly afford not to write for the machines known as e-Raters or robograders.
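To see why a rigid rubric misjudges good prose, consider the toy scorer sketched below. This is not e-Rater's actual algorithm, which is proprietary; it is a hypothetical illustration of what scoring by countable surface features looks like: big words and length raise the score, sentence-initial "and" or "but" lowers it, and the truth of any statement never enters the calculation.

```python
import re

def toy_score(essay: str) -> int:
    """Score an essay from 1 to 6 using only surface features (a hypothetical rubric)."""
    words = re.findall(r"[A-Za-z']+", essay)
    sentences = [s.strip() for s in re.split(r"[.!?]+", essay) if s.strip()]
    if not words or not sentences:
        return 1

    long_words = sum(1 for w in words if len(w) >= 8)        # rewards magniloquence
    avg_sentence_len = len(words) / len(sentences)           # rewards sprawling sentences
    conjunction_starts = sum(
        1 for s in sentences if s.split()[0].lower() in ("and", "but")
    )                                                        # penalized, however effective

    score = 1
    score += min(2, long_words // 5)           # more big words, higher score
    score += 1 if len(words) > 300 else 0      # longer essays score higher
    score += 1 if avg_sentence_len > 20 else 0
    score -= conjunction_starts                # "And"/"But" openers lose points
    return max(1, min(6, score))               # factual accuracy never enters the calculation
```

Run on the Gettysburg Address, which is short, plain-spoken, and opens a sentence with "But," a rubric like this punishes exactly the qualities that make the speech great.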

The movement against robograding must start at the top. If colleges and universities adopt computer grading, high schools will begin preparing their students to write for machines. As Crispin Sartwell notes, writing to computers could reduce all writing to templates.

Anne Herrington and Charles Moran took an extensive look at computer graders and noted several problems, among them the logic of scale that machine scoring invites: if a computer can grade thousands of papers at once, why not put all those students in one class? Why not have one teacher record an online lecture for thousands of students? Gone would be the demand for teachers and the special expertise they offer.

Computers may be cheaper and faster than humans, but they lack the comprehension required to grade student writing accurately.
