Contrasting State-of-the-Art Automated Scoring of Essays

Authored by: Mark D. Shermis, Ben Hamner

Handbook of Automated Essay Evaluation

Print publication date: April 2013
Online publication date: July 2013

Print ISBN: 9781848729957
eBook ISBN: 9780203122761
Adobe ISBN: 9781136334801

DOI: 10.4324/9780203122761.ch19


Abstract

With the push to develop innovative assessments that can accommodate the higher-order thinking and performances associated with the Common Core Standards, there is a need to systematically evaluate the benefits and features of automated essay evaluation (AEE). While the developers of AEE engines have published an impressive body of literature suggesting that the measurement technology can produce reliable and valid essay scores (when compared with trained human raters; Attali & Burstein, 2006; Shermis, Burstein, Higgins, & Zechner, 2010), comparisons across the multiple platforms have been informal, have involved less-than-ideal sample essays, and have often relied on an incomplete criterion set.

