Such is the corrupted nature of our education system that we always knew the ‘independent’ national standards monitoring project would be no such thing. It is disappointing, though, that a group of respected New Zealand educationists (mainly in mathematics) should be associated with such a biased research design and should acquiesce in not consulting teacher organisations. (126 schools have been selected at random and invited to participate in a three-year longitudinal study. The consent forms from schools were to be returned by 21 May.)
The contract process was, of course, structured to produce a result self-serving to the system, starting with the way the ministry set out the contract requirements. The scope suggested for the research was grossly circumscribed and pointed to an expectation that a group of schools be selected for long-term special attention. The wording read innocuously, in the formal language the ministry uses on such occasions, with the word ‘independent’ given pride of place. The subtext, though, communicated that the ministry wanted a pitch that looked straightforward but guaranteed the ‘right’ result. In effect, the whole process was value-laden.
The Dunedin research design follows the well-trodden path of the official research used in England following the introduction of national testing. (It also follows the path of most quantitative research.) In England, that path had the official research producing fairly positive results while other research designs, undertaken at the same time, produced very different results. Ten years later, none of the research produced positive results. But that didn’t matter. The bureaucracies had had their way.
We do know the research group is a Dunedin-based limited company, in this case funded by the ministry. I do not know whether they have had many other research contracts, and if so, whether any of them were from the ministry. We can presume, I suppose, that the ministry will be a major source of income for the foreseeable future. These, however, are honourable people with good intentions; the problem is the system.
It is hard to believe, though, that when the research group was working on the design they weren’t aware to some degree of the two elephants in the room: the Hawthorne effect and the blinkered research focus.
Exploitation of the Hawthorne effect provides the foundation and much of the framework for quantitative research. Quantitative researchers talk constantly about ways to reduce the Hawthorne effect, but do little about it because their careers, status, and futures are based on their research producing outcomes attractive to the bureaucracies and the government.
The Hawthorne effect is based on the idea that a new development in education, irrespective of the nature of the new development, usually has initial positive effects on student achievement. That is because the groups involved are made to feel special as a result of the focus on them, leading to feelings of solidarity, enthusiasm, and competition with other groups.
The crucial point, however, is that little of the initial improvement is sustained.
Whether through guileful naivety or straightforward slipperiness, the Hawthorne effect can be manipulated in all sorts of ways. For instance, as I explained in a posting (‘The Hattie series, Part 2’), if you want an accurate assessment of the influence on learning of a reduction in class sizes, all you have to do is wait for the reduction in class size to occur throughout the system, then apply the research in a general way. The influence on learning indicated will be small but realistic. But if you select a group of schools within the larger grouping for special attention, and the monitoring is sustained and structured, then you will, inevitably, get an initial burst of enthusiasm and performance that comes from pride, group feeling, and status being at stake. A lot of the improvement, though, even at this initial stage, is false because of what is at stake. Nor does it matter that the research undertakings and results have a degree of anonymity; the feeling that a lot is at stake is still present.
A statement of caution about the influence of the Hawthorne effect on findings is of no value at all. The Hawthorne effect is embedded in research such as the Dunedin project and cannot be fenced off as a percentage. When people are asked to report on themselves or the group they are part of, the Hawthorne effect comes seriously into play; and when the stakes are high, as they are in this kind of research, the figures will not be accurate.
I am willing to believe that those involved in the monitoring project have not consciously taken the Hawthorne effect on board, but deep in their bones they must have known they were going to deliver results that would please their paymaster. And no doubt they have responded to the cues of ministry approval as they felt their way forward.
In a recent posting (‘How corrupted is our education system?’) I wrote that ‘the basic ideals of a western-style education system are based on the consistency with which the welfare of the child is at the centre of decision making; the ethicality of decision making; and the degree of freedom the main participants have to express their ideas freely and influence that decision making.’
‘Our education system’, I said, ‘was out of balance, lacking sufficient checks and balances, creating an environment in which certain participants feel free to act blithely, unscrupulously, autocratically, and without due consideration for some other participants, namely teachers and children.’
At the very least, the teacher organisations should have been consulted and the considerable issue of the Hawthorne effect thoroughly accounted for.
I went on to say that ‘much number-bound research is akin to lying’ and that ‘the Hawthorne effect and other research influences are much talked about – also much relied on to produce career-enhancing results’; and that with ‘high-stakes situations abounding, most numbers produced about anything in education are, at the minimum, distortions.’
Then there is the blinkered scope.
The research is designed to point to the nuts and bolts of literacy and numeracy and the particulars of national standards. There is little direct attention to the way national standards: establish high-stakes situations; increase competition between schools; increase bureaucratic control over schools and reduce school initiative; encourage hierarchical school organisational structures; are found attractive by teachers and principals of certain personality types and alienate others; lead to an emphasis on standardised assessment tools; lead to an emphasis on numbers to the detriment of teacher judgement; lead to a stigmatising and labelling of children; confuse and concern less confident children; become curriculum ends; formalise, distort, and narrow the curriculum; reduce the importance of the affective and cognitively complex in learning; encourage atomistic learning; lower the status of curriculum areas not subject to national standards; provide children with the perception that learning is about narrow, instrumental procedures; and industrialise education.
National standards in education are not an island. They are having, and will continue to have, a tumultuous effect on primary school education. Ignoring the wider effects on education is a form of severe dishonesty, authoritarian in its ethical basis.
In the previous posting I suggested one way out of the dilemma could be by putting effect against effect:
‘The fundamental, bottom line demand by the teacher organisations should be that a control group of schools be set up. That control group would use individually developed school standards and indicators based on the curriculum levels and school-based moderation; they would also report to parents on the basis of the curriculum level most appropriate to a child and the progress that child had made. (All of which is where schools are, or heading to now. If such a control group is not permitted then the monitoring project should be decisively opposed.)’
However, the best way to reduce the Hawthorne effect would be to recast the research design entirely; and the best way to do that would be to dispense with the special group of schools selected for a longitudinal study. At any particular time, all schools should be up for grabs in relation to stages of the implementation of the policy, or a particular issue on which information is sought.
The present research design is another kick in the teeth for teachers and children. I appeal to all those who have the power to do something about it, to do their best to rid the system of this offensive development. It just will not do. I also appeal to the members of the Dunedin group: you are a lot better than this – try again. I do not blame you: I blame the system. Its increasing authoritarianism inevitably affects the fairness and honesty of its functioning.
This is an opportunity for all of us to mount a telling challenge.