RATER AGREEMENT AND THE MEASUREMENT OF
RELIABILITY IN EVALUATIONS OF ONLINE COURSE DESIGN USING
THE QUALITY MATTERS RUBRIC(TM)
Open Access
Author:
Zimmerman, Whitney Alicia
Graduate Program:
Educational Psychology
Degree:
Master of Science
Document Type:
Master's Thesis
Date of Defense:
None
Committee Members:
Jonna Marie Kulikowich, Thesis Advisor/Co-Advisor; Lawrence Christopher Ragan, Thesis Advisor/Co-Advisor
This study examined agreement between raters with diverse backgrounds in Quality Matters Program(TM) online course reviews. The Quality Matters Program uses a unique peer-review process with both quantitative and qualitative components. Quantitative analyses compared rater agreement on the 40 specific standards used to evaluate the design quality of online courses. Suggestions for measuring the reliability of the Quality Matters Program course review process in future studies are proposed.