Simple (unweighted Cohen's) kappa is a measure of inter-rater reliability that quantifies the agreement between two raters beyond what would be expected by chance. It is particularly useful in studies where categorical data are collected, such as yes/no responses or disease presence/absence. It is computed as kappa = (p_o - p_e) / (1 - p_e), where p_o is the observed proportion of agreement and p_e is the agreement expected by chance from the raters' marginal distributions. The value of kappa ranges from -1 to 1, where 1 indicates perfect agreement, 0 indicates no agreement beyond chance, and negative values suggest agreement worse than chance.
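As a minimal sketch of the computation, the following function estimates observed agreement directly from paired ratings and chance agreement from each rater's marginal category frequencies; the rating data below are hypothetical, invented for illustration.

```python
from collections import Counter

def cohens_kappa(rater_a, rater_b):
    """Simple (unweighted) Cohen's kappa for two raters on categorical data."""
    assert len(rater_a) == len(rater_b), "raters must score the same items"
    n = len(rater_a)
    # Observed agreement: proportion of items given identical ratings
    p_o = sum(a == b for a, b in zip(rater_a, rater_b)) / n
    # Chance-expected agreement from each rater's marginal distribution
    count_a = Counter(rater_a)
    count_b = Counter(rater_b)
    p_e = sum(count_a[c] * count_b[c] for c in count_a) / (n * n)
    return (p_o - p_e) / (1 - p_e)

# Hypothetical example: two raters judging disease presence in 10 cases
a = ["yes", "yes", "no", "no", "yes", "no", "yes", "no", "no", "yes"]
b = ["yes", "no", "no", "no", "yes", "no", "yes", "no", "yes", "yes"]
print(round(cohens_kappa(a, b), 3))  # 8/10 observed agreement, 0.5 by chance
```

Here p_o = 0.8 and p_e = 0.5, giving kappa = 0.6, i.e. agreement well above chance but short of perfect.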