Bedömarvariation: Balansen mellan teknisk och hermeneutisk rationalitet vid bedömning av skrivprov [Rater variation: The balance between technical and hermeneutic rationality in the assessment of writing tests]
Borgström, Eric (Örebro University, School of Humanities, Education and Social Sciences)
Ledin, Per (Örebro University, School of Humanities, Education and Social Sciences)
2014 (Swedish). In: Språk och stil, ISSN 1101-1165, E-ISSN 2002-4010, no. 24, p. 133-165. Article in journal (Refereed). Published.
Abstract [en]

It is well known from studies of inter-rater reliability that assessments of writing tests vary. In order to discuss this rater variation, we proceed from two research questions: 1. How can rater variation be understood from a professional, i.e. teacher, perspective? 2. What characterises Swedish (mother-tongue) teachers’ assessments of writing tests? The first question is addressed in a meta-study of previous research, and the second in a study of 14 Swedish teachers’ ratings of texts from a national written composition test in upper secondary school. The results show that teachers of the same subject assess more consistently, i.e. show less rater variation, than other groups. It is also clear that writing tests are notoriously difficult to rate: the correlation coefficients very rarely reach the desirable 0.7, a value at which roughly 50 % of the variance can be explained by shared norms. Another main result concerns criteria and tools for assessment. Such tools should be grounded in teachers’ professional expertise, in their expectations for different levels of performance. Our study reveals several situations where teachers’ professional expertise clashes with the assessment criteria. The article concludes that valid assessment of high-stakes tests must accommodate both a technical rationality, i.e. the grading should be predictable from rater to rater, and a hermeneutic rationality, i.e. the grading must be based on teachers’ professional judgment.
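
The 0.7 threshold discussed above follows from the coefficient of determination: r^2 = 0.7^2 ≈ 0.49, so roughly half of one rater's score variance is predictable from the other's. A minimal illustrative sketch in Python (the rater scores and variable names below are hypothetical examples, not data from the study):

# Illustrative sketch: inter-rater correlation and shared variance.
# The score lists are hypothetical, not data from the article.
from statistics import correlation  # available in Python 3.10+

rater_a = [3, 4, 2, 5, 3, 4, 1, 5, 2, 4]   # hypothetical grades from rater A
rater_b = [3, 5, 2, 4, 4, 4, 2, 5, 1, 3]   # hypothetical grades from rater B

r = correlation(rater_a, rater_b)           # Pearson inter-rater correlation
print(f"r = {r:.2f}, shared variance r^2 = {r**2:.2f}")
# At r = 0.7, r^2 = 0.49: about 50 % of the variance is explained by shared norms.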

Place, publisher, year, edition, pages
Uppsala: Adolf Noreen-Sällskapet för Svensk Språk- och Stilforskning, 2014. No. 24, p. 133-165.
Keywords [en]
inter-rater reliability, interpretative community, writing assessment, Swedish national writing tests, assessment criteria, true score
National Category
Specific Languages
Research subject
Swedish Language
Identifiers
URN: urn:nbn:se:oru:diva-42199
Scopus ID: 2-s2.0-84922876854
OAI: oai:DiVA.org:oru-42199
DiVA id: diva2:782942
Available from: 2015-01-23. Created: 2015-01-23. Last updated: 2023-04-26. Bibliographically approved.

Open Access in DiVA

fulltext (579 kB), 1517 downloads
File information
File name: FULLTEXT01.pdf
File size: 579 kB
Checksum (SHA-512): dad741304a54ba302ed155c6e90cd9f7d99613ec5c34449d268dc9ead075234bec5cb6f9c39cf9ddc51bee89e863202fbc505a4c045ab97c67a2339f6b98b33d
Type: fulltext
Mimetype: application/pdf

Scopus

Authority records

Ledin, Per

Search in DiVA

By author/editor
Borgström, Eric; Ledin, Per
By organisation
School of Humanities, Education and Social Sciences
In the same journal
Språk och stil
Specific Languages

Total: 1517 downloads
The number of downloads is the sum of all downloads of full texts. It may include, e.g., previous versions that are no longer available.

Total: 3936 hits