To examine interrater reliability, the PALS data (both total and average intensity levels of PA) recorded by two investigators were compared. Agreement between the raters was quantified using the average proportion of agreement and the average modified Cohen kappa. An intraclass correlation coefficient (ICC) with a 95% CI was also examined.
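As a rough illustration of the agreement statistics above, the sketch below computes the proportion of agreement and a standard (unweighted) Cohen kappa for two raters' category codes; the "modified" kappa used in the study and the ICC (which would typically come from a dedicated statistical package) may differ from this simplified version, and the intensity codes shown are hypothetical.

```python
from collections import Counter

def proportion_agreement(a, b):
    # Fraction of observations on which both raters assign the same category.
    return sum(x == y for x, y in zip(a, b)) / len(a)

def cohens_kappa(a, b):
    # Standard (unweighted) Cohen kappa: chance-corrected agreement.
    # The modified kappa reported in the study may be computed differently.
    n = len(a)
    po = proportion_agreement(a, b)
    ca, cb = Counter(a), Counter(b)
    # Expected agreement by chance, from each rater's marginal frequencies.
    pe = sum(ca[k] * cb.get(k, 0) for k in ca) / (n * n)
    return (po - pe) / (1 - pe)

# Hypothetical PA intensity codes (e.g., 1 = sedentary ... 5 = vigorous).
rater1 = [1, 2, 2, 3, 4, 5, 3, 2, 1, 4]
rater2 = [1, 2, 3, 3, 4, 5, 3, 2, 2, 4]

print(proportion_agreement(rater1, rater2))       # → 0.8
print(round(cohens_kappa(rater1, rater2), 3))     # → 0.744
```

The kappa corrects the raw agreement for the agreement expected by chance alone, which is why it is lower than the simple proportion of agreement.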
Pearson correlation coefficients were used to determine criterion validity evidence for the PALS data by comparing total intensity levels of PA from the PALS with total activity counts from Actical accelerometers. IBM SPSS Statistics ver. 20.0 (IBM Co., Armonk, NY, USA) was used to conduct the analyses.
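The criterion-validity comparison amounts to correlating two per-participant totals. A minimal sketch of the Pearson coefficient, using made-up PALS totals and Actical activity counts (the study itself ran this in SPSS):

```python
import math

def pearson_r(x, y):
    # Pearson product-moment correlation: covariance divided by the
    # product of the standard deviations.
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = math.sqrt(sum((a - mx) ** 2 for a in x))
    sy = math.sqrt(sum((b - my) ** 2 for b in y))
    return cov / (sx * sy)

# Hypothetical per-participant totals (not from the study).
pals_total = [12, 18, 9, 22, 15, 30]          # total PA intensity level
actical_counts = [10400, 15800, 8900, 20100, 13900, 27600]  # activity counts

print(round(pearson_r(pals_total, actical_counts), 3))
```

A coefficient near 1 would indicate that higher PALS intensity totals track higher accelerometer counts, which is the criterion-validity evidence sought here.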