As noted above, the NCS-A household survey included adolescents who resided in households identified in the NCS-R. Selection of NCS-R households is described in detail elsewhere6 and will not be repeated here other than to note that the households were selected with a three-stage clustered area probability sampling design representative of households in the continental US. The age and sex of each household member were recorded, allowing us to target households with adolescents. The NCS-A school sample, in comparison, was selected from a comprehensive government list of all licensed schools in the country. Although school-based samples miss adolescents who have dropped out of school, the NCS-A also included youth from the NCS-R household sample. A representative sample of middle schools, junior high schools, and high schools in the NCS-R counties was selected from the government list with probabilities proportional to the size of the student body in the grades relevant to the target sample of youths ages 13–17. All accredited schools were eligible, including private and residential schools. In cases where several small schools operated in the same geographic area, those schools were combined into a cluster that was treated as a single school for purposes of sampling.
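Probability-proportional-to-size (PPS) selection of this kind is a standard survey sampling technique. The sketch below shows one common variant (systematic PPS with cumulative sizes) purely for illustration; the school names, enrollment counts, and sample size are hypothetical and do not represent the actual NCS-A sampling frame or software.

```python
# Minimal sketch of systematic probability-proportional-to-size (PPS) selection.
# All inputs below are hypothetical illustrations, not NCS-A data.
import random

def systematic_pps_sample(units, sizes, n, seed=None):
    """Select n units with probability proportional to size (systematic PPS).

    Units whose size exceeds the sampling interval may be selected more than
    once (i.e., they are effectively certainty selections).
    """
    rng = random.Random(seed)
    total = sum(sizes)
    interval = total / n                              # sampling interval on the size scale
    start = rng.uniform(0, interval)                  # random start within the first interval
    selection_points = [start + i * interval for i in range(n)]

    chosen = []
    cumulative = 0.0
    point_idx = 0
    for unit, size in zip(units, sizes):
        cumulative += size                            # walk the cumulative size scale
        while point_idx < n and selection_points[point_idx] <= cumulative:
            chosen.append(unit)                       # this unit covers the next selection point
            point_idx += 1
    return chosen

# Hypothetical frame: schools with enrollment in the grades serving ages 13-17.
schools = ["School A", "School B", "School C", "School D", "School E"]
enrollments = [120, 950, 430, 300, 610]
print(systematic_pps_sample(schools, enrollments, n=2, seed=1))
```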
School recruitment consisted of contacting individual school principals, with the district's approval, to obtain rosters from which student families could be contacted for study participation. Schools were provided $200 as a token of appreciation for this cooperation; toward the end of the recruitment period, when additional schools were needed to complete the study, the payment was increased to $300. Within each school, a random sample of 40–50 eligible students was selected. We began with a target sample of 289 schools initially contacted for participation, of which 81 agreed; with the replacement recruitment described below, a total of 320 schools ultimately participated in the survey. The primary reason for refusal was reluctance to release student information, as some schools had policies against doing so. An additional problem was that our recruitment took place shortly after the Columbine shooting incident, at which time schools around the country were inundated with requests from local colleges to carry out studies of students in area schools.
Districts that required formal research proposals usually granted our request eventually, but sometimes with the stipulation that student information would be released only after written parental consent had been obtained. We generally rejected schools of the latter type because active parental consent has been shown in previous research to result in very low response rates. In cases where no replacement schools were readily available, however, we agreed to this requirement; this occurred in roughly 15% of schools. As shown below, the response rate was dramatically lower in this subsample, which we referred to as blinded schools because we were blinded to the identities of the sampled students until after the school principals had obtained signed consent.
Because of the low initial school-level response rate and the often protracted recruitment time frame, we recruited multiple replacement schools for some refusal schools. Replacement schools were selected to match the initial refusal schools on school size, geographic area, and demographic characteristics. The fact that we ended up with 320 schools rather than the original 289 reflects this expansion of recruitment. When multiple replacement schools were included in the sample for one original school, the numbers of interviews targeted in the replacement schools were set so that they summed to the number targeted for the original school.
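The text states only that the replacement-school targets were made to add up to the original school's target; it does not specify how the total was divided. The sketch below shows one plausible allocation rule (proportional to enrollment, with largest-remainder rounding so the parts sum exactly to the total); the rule, function name, and numbers are assumptions for illustration only.

```python
# Hedged sketch: split one original school's interview target across its
# replacement schools so the allocations sum exactly to the original target.
# The proportional-to-enrollment rule and largest-remainder rounding are
# illustrative assumptions, not the documented NCS-A procedure.
def allocate_target(total_interviews, replacement_enrollments):
    weights = [e / sum(replacement_enrollments) for e in replacement_enrollments]
    raw = [total_interviews * w for w in weights]     # exact proportional shares
    alloc = [int(x) for x in raw]                     # round each share down
    # Hand out the leftover interviews to the largest fractional remainders.
    order = sorted(range(len(raw)), key=lambda i: raw[i] - alloc[i], reverse=True)
    for i in order[: total_interviews - sum(alloc)]:
        alloc[i] += 1
    return alloc

# Hypothetical example: a 45-interview target split across three replacement schools.
print(allocate_target(45, [400, 250, 150]))  # allocations sum to 45
```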