Publication Date



Supporting users in interpreting assessment results is an important but underexposed aspect of validity. This study investigated how the score reports from the pupil-monitoring Computer Program LOVS can be redesigned to support users in interpreting pupils' test results. In several rounds of consultation and design with users and experts, applying design principles from the literature, alternative report designs were created and field-tested. No clear differences were found in users' interpretation accuracy between the original and the redesigned reports. However, users' perceptions of the redesigned reports were predominantly positive. The authors emphasise the need to involve experts and users in the design process to ensure the validity of reports.


Institute for Learning Sciences and Teacher Education

Document Type

Journal Article

Access Rights

Admin Only

Access may be restricted.