Content Validity and Inter-Rater Reliability of an Instrument for Assessing Problem-Solving Ability


Aini Azkiyatu Ulfah
Kartono Kartono
Endang Susilaningsih

Abstract

A common problem in practice is that instruments for assessing problem-solving ability are used without having been tested. The aim of this research is to establish the content validity and inter-rater reliability of an instrument for evaluating problem-solving ability that has been developed. The study used a quantitative descriptive method with three expert judges: an expert in research and evaluation, a mathematics education expert, and a mathematics teacher. The instrument was developed in the form of an expert observation sheet covering three aspects of assessment, namely content eligibility, construction, and language, each rated on four categories: very relevant, relevant, quite relevant, and highly irrelevant. The data were analyzed using Aiken's V formula to determine the level of instrument validity, and the level of consistency among raters was determined using Intraclass Correlation Coefficient (ICC) analysis with SPSS version 23.0. The results of the content validity analysis show that all items scored above 0.3, meaning that all aspects assessed by the experts are valid. The inter-rater reliability test using the ICC yielded a value of 0.516, indicating a moderate level of consistency among the raters across all aspects of the instrument. Thus, the problem-solving ability instrument, having been tested for validity and reliability, can be used by educators to determine the level of students' problem-solving ability appropriately.
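For reference, Aiken's V for a single item rated by n experts on a c-category scale is commonly computed as

V = \frac{\sum_{i=1}^{n} s_i}{n\,(c-1)}, \qquad s_i = r_i - l_0,

where r_i is the score given by rater i and l_0 is the lowest rating category. As an illustrative sketch under the design described above (three raters, a four-category scale), and not using data from this study: if the three raters score an item 4, 3, and 4 on a 1-4 scale, then \sum s_i = 3 + 2 + 3 = 8 and V = 8 / (3 \times 3) \approx 0.89, which would exceed the 0.3 criterion applied here.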
