David Williams
Christchurch Newsroom, January 11, 2024
On November 10 last year, the day a systems glitch robbed some students of the chance to complete their NCEA level one English exam online, Auckland mum Liz (not her real name) wrote to the New Zealand Qualifications Authority.
Hers was a general complaint, rather than one particular to the glitch, about what she called an inequitable situation. Some students had the disadvantage of using pen and paper to complete exams, she said, while others were able to use a computer.
Students with messy handwriting – a common affliction among a generation taught on keyboards rather than biros – were likely to be penalised if markers could not read their papers, Liz told the authority (NZQA).
Her son, a Year 11 student, had been among them in the year’s internal assessments.
“We all know how much easier it is to write and edit an essay on a computer, where you can easily draft, correct errors and re-write passages,” Liz wrote.
Nothing on ‘Paper’
If NZQA gave no allowances to those taking paper exams, she asked how it could justify giving 40% of students the “unfair advantage” of using a computer.
She asked for statistics to see if they backed her hunch. NZQA duly provided them, and Liz passed them to Newsroom.
To her mind, the evidence is clear. However, an educational expert has crunched the numbers and has a different view, with some nuances.
Gavin Brown, a Professor of Learning, Development and Professional Practice at the University of Auckland, analysed NZQA’s data for NCEA English for levels one, two and three in 2020 and 2022. (Data from 2021 was excluded because of “unexpected events,” which led to grades in Auckland, Northland and Waikato being labelled “paper” in its system.)
“The difference that I see is very small. The overall difference in performance is about one-tenth of a standard deviation,” Brown says.
He analysed the data to gauge the “effect size,” using the proportion of the group at each grade point. This was 0.113 in favour of digital-based exam results.
“Everybody who uses effect sizes using Cohen’s d would say anything less than 0.2 is trivial.”
The effect size for aggregated grades from 2020 and 2022, comparing paper to digital, was even smaller at 0.101.
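For readers who want a sense of the arithmetic, the sketch below shows how a Cohen’s-d-style effect size can be calculated from grade proportions alone. It is a rough illustration, not Brown’s exact method or NZQA’s full data: the 0-to-3 grade coding and the pooled standard deviation are assumed conventions, only the “not achieved” and “excellence” shares come from the 2020 level one figures quoted later in this piece, and the middle two proportions are invented so each distribution sums to one.

```python
# Sketch: a Cohen's-d-style effect size computed from the proportion of
# students at each grade, with grades coded as ordinal values 0-3.
# The coding and the example proportions are illustrative assumptions,
# not NZQA's data or Brown's exact method.

GRADE_VALUES = [0, 1, 2, 3]  # not achieved, achieved, merit, excellence

def mean_and_var(proportions):
    """Mean and variance of a grade distribution given its proportions."""
    mean = sum(p * g for p, g in zip(proportions, GRADE_VALUES))
    var = sum(p * (g - mean) ** 2 for p, g in zip(proportions, GRADE_VALUES))
    return mean, var

def cohens_d(props_a, props_b):
    """Standardised mean difference (group A minus group B), pooled SD."""
    mean_a, var_a = mean_and_var(props_a)
    mean_b, var_b = mean_and_var(props_b)
    pooled_sd = ((var_a + var_b) / 2) ** 0.5
    return (mean_a - mean_b) / pooled_sd

# Illustrative grade proportions (each list sums to 1.0). The first and last
# entries are the 2020 level one "not achieved" and "excellence" shares quoted
# in the article; the middle two are made up for the example.
digital = [0.144, 0.356, 0.404, 0.096]
paper   = [0.192, 0.380, 0.344, 0.084]

# Prints a small value (about 0.15 with these made-up middle proportions),
# well under the 0.2 threshold Brown cites as trivial.
print(f"effect size: {cohens_d(digital, paper):.3f}")
```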
Brown, who is also Director of the university’s Quantitative Data Analysis and Research Unit, says: “If I was NZQA, I would be saying: ‘Administering our tests on digital does not harm students’ performance’.”
(A 2020 psychometric analysis by NZQA found: “There was no conclusive evidence of a difference in achievement between these two groups of students that could be attributed to the examination format – digital or paper-based”).
Consistent differences
There are observed differences between grades, Brown says, such as 19.2% of paper-based students not achieving level one in 2020, compared with 14.4% online.
Also, 9.6% of online exam-takers achieved excellence at level one that year, compared with 8.4% of those on paper.
Liz says the grade difference is consistent: students are more likely to fail on paper, while a higher proportion of online exam-takers feature at each of the passing grades of achieved, merit and excellence.
Brown says that these are individual data points, and because the “not achieved” group is so large, there is wide variability between grades.
“The analysis that I have done has looked at what the overall effect is on the whole distribution,” he says.
He agrees that digital students did better but notes the different sample sizes: roughly 80,000 used digital across both years, while 212,000 took paper exams.
“You have to consider the possibility that these 80,000 students who did the digital version had background factors that the data you have sent me don’t speak to,” he says.
“This tiny difference in favour of digital may reflect that the students who did the digital exam had prior experience, had the opportunity to practise and learn, and had good equipment and good internet speed, and were good with computers. We just don’t know. There should be no advantage or disadvantage due to the exam mode,” he says.
The clarity factor
Another factor potentially accounting for the difference, the academic says, is the clarity of typed answers versus handwritten ones.
“So it could be that in the longer questions where students have to write longer answers, then it may be easier to get a higher score when people can read the writing easily.”
Brown adds: “Legibility matters.”
NZQA maintains that students are not penalised for poor handwriting. Efforts to decipher what is written can lead to exams being viewed by three or more markers.
The tiny advantage for digital may be due to factors other than the testing mode. It could be a socio-economic effect, a practice effect, or legibility, he says.
“There are so many things that this data does not say.”
NZQA Deputy Chief Executive of Assessment Jann Marshall says that it is incorrect to assume differences in attainment are necessarily related to the testing mode. Many factors could be at play, she says, “including which schools and which students are engaging in paper versus digital assessment.”
“A significant part is played by teaching, learning and exam preparation.”
NZQA’s Official Information Act response to Auckland parent Liz, from Chief Executive Dr Grant Klinkum, said that “a number” of schools teach students on devices “but still choose to enter their students for the paper version of the assessment”.
Digital version scores
Many students prefer completing exams digitally, Marshall says, because they prefer typing to handwriting and find it easier to edit their work.
Marshall, who formerly held a leadership position at the Ministry of Education, says that NZQA has encouraged all schools to adopt digital assessment.
“We have dedicated staff working with schools to provide one-on-one support and training, along with tools for schools and students to test their readiness for digital assessment.
“We have previously worked with Network for Learning to offer schools free readiness checks, providing reassurance of whether school networks are ready to support digital assessment.”
Marshall makes a distinction between last year’s exam glitch and a pure comparison between paper and online tests.
In November, some students were blocked from logging in to three exams, including Level One English, because the digital platform reached capacity. Marshall confirms that students who were “significantly disadvantaged” may be eligible for a derived grade.
“This differs from students who were not disadvantaged, did not plan to attempt the assessment digitally or chose to complete the assessment on paper,” she says.
In 2019, Brown wrote a journal article about the costs and obstacles of developing large-scale, computer-based testing.
“As you move to digital, you have to assure that there is equity in society for students’ opportunities to have the right kind of machines, and the right kind of practice, and preparation for being assessed digitally,” he tells Newsroom.
“To NZQA’s credit, they are going about this very cautiously.”
Marshall says that since 2016, the authority has been encouraging schools to shift towards digital assessments. This year, NCEA level one exams are being offered “digital first,” meaning schools have to opt out of digital exams, instead of opting in.
The question of equality
Schools with “more socio-economic barriers” have lower participation in digital assessments, she says, and students at those schools “attempt less external assessment.”
“To address this, we have prioritised these schools when offering support for digital assessment. Additionally, we have created the option for schools to offer digital assessments drawing on support from more experienced schools.”
This leaves the question: Don’t those statements confirm the inequity Auckland mum Liz suspected?
Her 16-year-old son, who types at 120 words a minute, is nervously awaiting his results. In several paper exams, he says, he wished he could have deleted answers and re-written them.
He accidentally used bullet points in English – “I might get marked down for that.”
“There were times I had to go off the page and into the margins because there was not enough room to write. In a few of my exams, I took my ruler and ruled extra lines to give me more space – dividing each line into two – so I wouldn’t run out of room.
“So that took a lot of time, and made my handwriting squashed and messier.”
He is sure that he would have written better essays and higher-quality answers if he had been using a keyboard.
Liz concludes: “It is so incredibly obvious that students using computers have an advantage.”
NCEA results will be released online on January 17.
David Williams is an investigative writer for Newsroom.