As Chair of the Measurement Advisory Group that provides advice to ACARA on NAPLAN technical issues regarding assessment and measurement, I reject the conclusions of the recently released Perelman review of NAPLAN comparability.
In my view, Professor Perelman’s report is under-researched and lacks the technical understanding required to analyse NAPLAN tests and results.
Professor Perelman’s expertise to make these claims is questionable as he is not recognised through the peer-reviewed academic literature as an expert in the field of computer adaptive assessment.
It is untenable to claim, without any evidence, that reading and numeracy content cannot be assessed in a comparable manner. There are numerous examples of international assessments, similar to NAPLAN, that have clearly demonstrated comparability between online and paper versions of a test.
With the correct assessment design and post-test analysis, the Measurement Advisory Group (MAG) has concluded that online and paper NAPLAN results are comparable and can be placed on the common NAPLAN assessment scale.
The Perelman report’s technical comments about scaling and branching suggest a misunderstanding of NAPLAN procedures and analysis, and have no sound basis.
NAPLAN technical reports showing the statistical and psychometric methodology that led to the results are routinely released on the NAP website.
As with previous years, the MAG has reviewed and signed off on the technical work undertaken before the release of the 2018 NAPLAN results and has endorsed the NAPLAN 2018 comparability analysis.
Professor Ray Adams
Honorary Senior Fellow
Measurement Advisory Group Chair
The University of Melbourne