So we’ve got our hands on an unreleased consultative report from the National Council for Curriculum and Assessment (NCCA). The report in question is part of the NCCA’s current attempt to ‘review and redevelop’ the primary school curriculum.
Normally I would be hesitant to review a draft report like this, as the document could be improved as it goes through various drafts. However, as you’ll see as you read through this piece, the research that this report is based upon is flawed to such a degree that no improvement in presentation could make up for it. Even so, it should be kept in mind that the NCCA may have intended to improve certain aspects of the report prior to publication.
In 2020 the NCCA released a consultation document titled ‘Draft Primary Curriculum Framework.’ That document laid out the NCCA’s newly proposed curriculum for primary school students. Following a two-year consultation, during which the NCCA held online surveys, focus groups, and bilateral meetings with interest groups, the NCCA began to pull together the feedback it had gathered into a document titled ‘Consultation Report on the Draft Primary Curriculum Framework.’ This is the document we’ll be reviewing today.
I’m going to be focusing purely on the quantitative elements of the report. This is partly because, as we’re all already aware, it’s trivially easy to bias qualitative work to support whatever narrative you want, and partly because the report lacks the detail necessary to tell whether the qualitative sections are as flawed as the quantitative ones, which is an issue in itself.
The quantitative section of the report consists of several surveys that were put out by the NCCA over a two-year period. There were four surveys in total, split across two phases. Each phase consisted of two surveys, one designed for educators and one designed for parents. The survey design was exceptionally awkward, with participants routinely being asked what they thought about a section of the Draft Primary Curriculum Framework without being shown the text of that section. Due to this, participants would have had to open the Draft Primary Curriculum Framework in a separate window, or on a separate device, and find the appropriate section, before knowing what the question was actually asking them to comment on. That is humorously poorly designed, but it is also potentially problematic as increasing the difficulty of completing your survey will increase the likelihood that only people particularly interested in an area, i.e. activists, bother to complete the survey.
Section 2.2 of the consultative report notes that, following an evaluation of the response rate for Phase 1, which was “lower-than-expected,” the survey was modified, and the surveys’ hosting platform was changed “to improve accessibility and user experience.” As such, responses from Phase 1 are not directly comparable to responses given during Phase 2. It’s unclear why the issue of the document’s text not being displayed alongside the questions was not rectified at this point.
The report does not detail exactly what changes were made to the surveys between the two phases, which is in itself a rather large issue, but we can see from some of the graphs contained within the consultative report that questions were materially changed between the phases. One question, for instance, asked participants to select from one of three possible answers in Phase 1, and from one of four possible answers in Phase 2. It should go without saying that directly amending the text of either questions or answers will change how people respond to the survey. An examination of the limited data made available in the consultative report shows just that: participants gave notably different answers to questions between the phases.
It appears that there was a substantial drop-off in participation over the course of the surveys, a decline likely exacerbated by their poor design. Whilst the report notes, in Section 2.2, that 320 ‘educators’ finished the online questionnaire in Phase 2, Section 3.4 notes that 957 ‘educators’ started the survey. The graphs contained in later pages of the report show that this figure had declined to 517 by the third section, and to 348 by the sixth. Responses to the survey by ‘parents’ saw a similar decline, with 2,614 beginning the Phase 2 survey but only 930 completing it.
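For anyone who wants to check the scale of that drop-off, the implied completion rates are easy to work out from the counts quoted above. A minimal sketch in Python, using only the Phase 2 figures the report itself provides:

```python
# Participation figures for Phase 2, as quoted in the consultative report.
phase2 = {
    "educators": {"started": 957, "completed": 320},
    "parents": {"started": 2614, "completed": 930},
}

# Roughly a third of those who started each survey actually finished it.
for group, counts in phase2.items():
    rate = counts["completed"] / counts["started"] * 100
    print(f"{group}: {counts['completed']}/{counts['started']} "
          f"completed ({rate:.0f}%)")
# educators: 320/957 completed (33%)
# parents: 930/2614 completed (36%)
```

A completion rate of around a third is, on its own, a strong signal that respondents found the surveys burdensome.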
The report does not detail how many of the Phase 1 respondents completed the questionnaire; it merely notes how many people responded to it. As such, we cannot compare the completion rate of the online questionnaires in Phase 1 against Phase 2. In a similar vein, three of the graphs drawn from responses to Phase 1 do not detail how many responses the graphed question received; all graphs drawn from Phase 2 note these numbers. Whilst this may be accidental, it could also be a deliberate attempt to obfuscate how many people completed the Phase 1 questionnaires.
The most fundamental issue with the NCCA report is that it made no attempt, at all, to verify that those who filled out the online surveys, which were aimed explicitly at parents and educators, were actually either parents or educators. This is so fundamental a flaw that it cannot be worked around – it is simply factually incorrect for the NCCA to say that they can confirm any particular number of educators or parents engaged with the survey component of their consultation. A certain number of people filled out the survey, and the NCCA blindly hopes that all those people were either educators or parents. I’m afraid I must burst their bubble on that one, as I’m neither a parent nor an educator and I personally filled out both of their Phase 2 surveys in order to archive the questions they were asking.
It should be noted that the online surveys prominently state, on each page, that the survey is anonymous. Presenting a public-facing notification that the NCCA would not be attempting to verify respondents’ identities would only have increased the likelihood of fraudulent responses.
That is a particular problem when we consider the next issue with the report – the consultative report does not detail how individuals were made aware of the online questionnaire, or who the NCCA enlisted to help them acquire participants. The lack of information in this regard is concerning as, given that the subject in question is a topic of interest to various activist groups, selection bias and brigading are major concerns.
Normally one could analyse responses after the fact to attempt to correct for a biased sample, albeit only in a limited fashion, but the total lack of verification conducted by the NCCA, and the near-total absence of demographic information requested, make that impossible here. Strikingly, there was a roughly 1,550% increase in the number of parents who responded to the Phase 2 online questionnaire (2,614) when compared to the Phase 1 online questionnaire (158). Responses from educators also increased, from 208 to 957. The report provides no explanation for how the NCCA sourced so many additional respondents for Phase 2, but the marked increase could, barring other explanations, indicate the surveys came to the attention of an interested party.
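To put that jump in concrete terms, the percentage increases can be computed directly from the respondent counts the report provides. A quick sketch in Python; the counts are the report’s, the helper function is mine:

```python
def pct_increase(before: int, after: int) -> float:
    """Percentage increase from one phase's respondent count to the next."""
    return (after - before) / before * 100

# Respondent counts quoted in the report (Phase 1 vs Phase 2).
print(f"parents:   {pct_increase(158, 2614):.0f}%")  # 1554%
print(f"educators: {pct_increase(208, 957):.0f}%")   # 360%
```

An unexplained fifteen-fold jump in parent respondents between phases is exactly the kind of anomaly that a report describing its own methodology ought to address.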
Taken together, the lack of verification, combined with the marked but unexplained increase in respondents to the online survey in Phase 2, means that there exists a substantial, but unquantifiable, risk that the survey has been tampered with or influenced in some fashion.
Altogether the research backing this consultative report cannot be improved upon, or even salvaged, and that means the NCCA have spent two years, and God knows how much money, putting together research results which aren’t usable. Or rather, aren’t usable unless you happen to agree with the results the NCCA have gathered and want something to give your views a very flimsy appearance of widespread support. Fixing the issues highlighted above would require the NCCA to begin their research on this entirely anew, starting with a total redesign of their survey. On the plus side, it certainly wouldn’t take anyone competent two years to produce something better.