Keeping proposals anonymous is of major methodological importance to reduce the influence of "loud-speaking" experts and to facilitate the influence of their "silent-speaking" peers [31,38]. The expert panel members were aware of the panel's composition, but the anonymity of individual proposals was maintained until stage 3.
Stage 1. The expert panel received a predesigned worksheet by email. Each expert was asked to propose the ten most important variables to be routinely documented for shared research and benchmarking within each of five predefined sections:
1. Fixed system variables
Variables relating to system characteristics: how the service is organised, its operational capacities and its integration with the EMS with which it operates.
2. Event operational descriptors
Variables documenting the context of a mission (dispatch) or episodes of use (for services with advisory functions).
3. Patient descriptors
Variables documenting information related to the patient's profile, e.g., age, gender, co-morbidity and type of medical complaint.
4. Process mapping
Variables recording what happened to the patient and how the episodes of care proceeded.
5. Outcome and quality indicators
Variables relating to patient and/or mission outcomes, as well as measures of quality.
An optional sixth section was provided for proposed variables that did not fit into any of the predefined sections. The experts were informed that a subsidiary aim of the process was to establish a core data set that was easy to collect routinely and did not require excessive database alterations.
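Purely as an illustration, a minimal sketch of how such a worksheet could be structured is given below. The section names follow the article, but the layout and the example entry are assumptions, not the actual worksheet format.

```python
# Hypothetical sketch of a stage 1 worksheet: five predefined sections
# plus the optional sixth, each holding the proposed variable names.
# Section names follow the article; the dict layout is an assumption.
PREDEFINED_SECTIONS = [
    "Fixed system variables",
    "Event operational descriptors",
    "Patient descriptors",
    "Process mapping",
    "Outcome and quality indicators",
]
OPTIONAL_SECTION = "Other"  # for proposals fitting no predefined section

# One expert's worksheet: section name -> list of proposed variable names.
worksheet: dict[str, list[str]] = {s: [] for s in PREDEFINED_SECTIONS}
worksheet[OPTIONAL_SECTION] = []

worksheet["Patient descriptors"].append("patient age")  # hypothetical entry
```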
The project organisers recognised the challenge posed by several previously published and implemented reports on how to document and report data in various parts of emergency care [24,32,37,39]. These templates (e.g., the Utstein templates for cardiac arrest, major trauma and the reporting of advanced airway management) contain some common variables with slightly different definitions. The expert panel was supplied with these published templates and asked to make the new variables compatible with the existing template variables, if feasible.
The proposed variables were returned to the project group by email and systematised. Differently worded variables with identical meaning were combined, taking care not to distort the expert panel's proposals; no proposed variable was deleted. Within each section, the variables were ranked according to how many expert panel members had proposed them.
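As an illustration only, the following minimal sketch shows how this frequency-based ranking could be computed; the variable names are hypothetical and not taken from the actual worksheets.

```python
from collections import Counter

# Each expert's worksheet for one section: a list of proposed variables.
# The variable names below are hypothetical examples.
worksheets = [
    ["response time", "patient age", "dispatch criteria"],  # expert 1
    ["response time", "mission type", "patient age"],       # expert 2
    ["dispatch criteria", "response time", "crew level"],   # expert 3
]

# Count how many times each variable was proposed across the panel.
counts = Counter(v for sheet in worksheets for v in sheet)

# Rank variables by proposal frequency (most frequently proposed first).
for rank, (variable, n) in enumerate(counts.most_common(), start=1):
    print(f"{rank}. {variable} (proposed {n} times)")
```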
Stage 2. The revised worksheet containing the aggregated results from stage 1 was sent to the expert panel. The panel members were then asked to rank the ten most important variables in each section from 10 (most important) to 1 (least important). Variables that received no ranking were removed from the list. The results of this ranking provided the basis for the consensus meeting.
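A similarly hedged sketch of the stage 2 aggregation follows. The variable names and scores are again hypothetical, and summing the individual scores is an assumed aggregation rule, as the article does not specify how the rankings were combined.

```python
# Each expert assigns scores from 10 (most important) down to 1 to at
# most ten variables per section; unranked variables receive no score.
# Hypothetical scores; summing them is an assumed aggregation rule.
expert_rankings = [
    {"response time": 10, "patient age": 9, "dispatch criteria": 8},
    {"response time": 9, "mission type": 10},
    {"dispatch criteria": 10, "response time": 8},
]

totals: dict[str, int] = {}
for ranking in expert_rankings:
    for variable, score in ranking.items():
        totals[variable] = totals.get(variable, 0) + score

# Variables never ranked by any expert do not appear in `totals`,
# mirroring their removal from the list before the consensus meeting.
for variable, total in sorted(totals.items(), key=lambda kv: kv[1], reverse=True):
    print(f"{variable}: {total}")
```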
Stage 3. The expert panel gathered in Stavanger, Norway, and agreed upon a core data set during a 2-day meeting. At this meeting, moderators (DL and HML) led the experts through the results from stage 2. The experts were divided into two groups, each discussing portions of the preliminary data set. The groups subsequently presented their discussions in plenum, where the variables were debated and agreed upon. The variables were decided upon on day 1 and defined and categorised on day 2. Some variables were not given final definitions during the meeting, and the expert panel authorised the project group to propose definitions for these remaining variables before stage 4.
Stage 4. Based on stage 3, the final data set, including definitions, was prepared by the project group and submitted to the expert panel for final approval. At this point, no additional variables were accepted, but minor changes to answer categories and definitions were allowed.
Consensus was defined as agreement on the proposed variables among the experts attending the consensus meeting. Furthermore, we informed the experts during stage 4 that no response would be interpreted as agreement with the final core data set.