Since the publication of the 2012 Annual National Assessment (ANA) results on 3 December 2012, a number of concerns have been raised about the validity of the results. Some of these concerns relate to the comparison of results from 2011 and 2012, as discussed by Professor Servaas van der Berg and Nicholas Spaull in the Mail & Guardian last Friday [http://mg.co.za/article/2012-12-07-00-vast-improvements-in-national-pupil-test-results-highly-implausible]. The Department of Basic Education (DBE) might be celebrating improvements that are not verifiable or an accurate reflection of the state of South Africa’s education system.
1. Have the 2011 and 2012 ANAs been set at the same difficulty level to ensure comparability?
Minister Motshekga claims that following the release of the 2011 ANA results, “a national strategy to improve literacy and numeracy achievement in all schools was implemented.” She credits this strategy for the large improvements noted in the 2012 ANA results. In particular, she highlighted the improvement in Grade 3 Literacy results:
“In Grade 3, the national average performance in Literacy, stands at 52% as compared to 35% in 2011, registering an improvement of 17% from 2011.
I must say this is extremely encouraging and should give South Africans great hope that at this rate, we will reach, or even surpass, the targets we have set for ourselves. This is a big margin to achieve in a year by any standards.”
However, the large improvements in the ANA results may not be a consequence of the DBE’s intervention. Rather, it seems possible that they are due to the DBE’s failure to ensure that the 2011 and 2012 ANA tested learners at the same difficulty level – a possibility acknowledged by the DBE on page 13 of its report. If the 2011 and 2012 ANA were not equated for difficulty, then it is impossible for the DBE to draw a valid comparison between them.
2. The improvements in the ANA results from 2011 to 2012 are internationally implausible
Improvements of this magnitude over the course of a single year are unheard of internationally. Van der Berg and Spaull argue that an increase of 17 percentage points in Literacy from 2011 to 2012 is simply not possible. Even if the 2012 ANA results were accurate, they would suggest South Africa has improved its education system more in a single year than Colombia did in the 12 years from 1995 to 2007. During this period Colombia was the fastest-improving country out of the 67 countries tested in the Trends in International Mathematics and Science Study. It would also be a larger improvement than Russia achieved from 2001 to 2006, during which Russia experienced the largest improvement in learner achievement out of the 28 countries tested in the Progress in International Reading Literacy Study. It would make South Africa the leading country in education improvement and an international anomaly.
3. Evidence from the Western Cape suggests the improvements in the ANA results from 2011 to 2012 are implausible
Local comparisons also bring into question the accuracy of the 2012 ANA results. The Western Cape conducts annual Systemic Tests of Grade 3 and Grade 6 learners. These tests are standardised to ensure that they are of the same difficulty each year. They are also marked centrally, not by the schools. The Systemic Test results in the Western Cape from 2011 to 2012 reveal almost no improvement in learner performance; however, the 2012 ANA results show an improvement of 14 percentage points from 2011.
4. Were the ANAs subject to external verification?
Compounding these concerns is the DBE’s failure to reveal whether the 2012 ANA results were subjected to external, independent verification. Without verification it is impossible to determine the integrity of the assessment or the results. In 2011 the ANA results were verified by the Human Sciences Research Council.
5. The lack of disaggregated data, at this point, makes it difficult to draw real conclusions
It is imperative that the DBE release disaggregated results. National averages can be misleading and mask inequalities that exist between former ‘model-C’ schools and township or rural schools, attended mainly by working-class learners. Disaggregated results will reveal the differences between schools, provinces and districts. They will allow citizens to see whether there are any real improvements in former black schools.
6. The culture of rapid results improvement fostered by the public, media and government is not helpful
Improving schools and learner performance will take time. There is no ‘quick fix’ for the challenges and systemic inequalities that affect our education system. The purpose of the ANA results must not be subverted for short-term political gains. Expecting year-on-year improvements on the order seen this year is untenable. The ANA is an important tool that, if implemented properly, will allow the DBE to measure educational progress and refine interventions. The DBE deserves credit for introducing this measure, but the problems raised need to be addressed for the ANAs to retain their value.
For comment please contact
Yoliswa Dwane (EE Chairperson) on 072 342 7747/021 387 0022
Doron Isaacs (EE Deputy General Secretary) on 082 850 2111