SSLC district rankings, a game?
District rankings tend to generate intense pressures to 'perform', which lead to multiple preparatory exams.
DHNS

“Udupi tops”, “Gadag slides to bottom” are two of the headlines for this year’s SSLC results in Karnataka. Every year, a key media highlight is the ranking of districts from top to bottom, based on the percentage of students from each district who pass the SSLC examinations.

The education head of Gadag is quoted as saying, woefully, “In 2012-13 Gadag stood at 18th place and this year we expected to be within the 10th place”. Udupi, meanwhile, is delighted at having moved up from ninth rank last year to first this year. The business of ranking leaves one wondering how all districts can hope to improve their ranks: if some districts ‘go up’, others have to ‘go down’.

The district pass percentage figure by itself does not give any analytical insight into what needs to be done. Two out of three students from Gadag have passed; what needs to be done for the remaining one-third? More resources for students? More teacher training? More testing? Or less? The pass percentage reveals no answer to these questions. But what does it hide?

Take Bangalore South district, which is ranked 29th out of 34 places. A total of 139 schools with 100 per cent results are from here; Bangalore South is, in fact, in third place in the state on this statistic. This suggests that the average pass percentage can hide a stark variation in the results.

There are some ‘high performing’ schools and some ‘very poorly performing’ schools, and the average of these has placed Bangalore South in the ‘not too bad’ 29th position, which is quite misleading: the same average of around two-thirds can come from schools that each pass two-thirds of their students, or from a mix of schools with near-100 per cent and near-30 per cent results. So at the least, disaggregating results by management (private, aided and government schools) is necessary.

District rankings tend to generate intense pressures among districts to “perform”, which lead to multiple preparatory examinations, the creation of guides and answer keys, extra coaching classes, and copying that is either encouraged or ignored.

A significant part of teacher energies is devoted to “Mission 40”, in which the focus is on drilling students to give the “right answer” to ‘likely questions’ so that they get the minimum 40 per cent pass marks; this may not be associated with any actual learning. Udupi credits its “Mission 40” for its number one position.

While the SSLC pass percentage is only a proxy for school quality, it can give some idea of performance, especially in identifying outliers (schools with very high or very low pass percentages). So what analyses should be done to drive school improvement, as well as policy and programmatic correction?

Analysing performance by school, by subject, by medium of instruction, and across time can help identify ‘schools with challenges’, but this needs to be done at the block level, the lowest tier in the high school system. If a school has a pass percentage much lower than the average for its block, there is a need to investigate; a rough sketch of such a check appears below.
It is important to guard against a premature conclusion that ‘the teacher is not doing her job’, and instead begin a mutually respectful dialogue with the teachers and the school administration to assess and understand the causes for the quality of education.
The key here is the keenness to understand, through ‘investigation’, what can help the schools, and to provide the required support, in terms of student resources, teacher training or school infrastructure, rather than ‘punishment’.
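
To make this concrete, here is a minimal sketch of such a block-level check. It assumes a hypothetical file, sslc_results.csv, with one row per school and columns school, block and pass_percent; the 15-point threshold is an illustrative choice, not a prescribed cut-off.

    # A rough, illustrative check: flag schools whose SSLC pass percentage
    # falls well below the average for their block.
    # Assumes a hypothetical file "sslc_results.csv" with columns:
    #   school, block, pass_percent
    import pandas as pd

    results = pd.read_csv("sslc_results.csv")

    # Average pass percentage of each school's block
    block_avg = results.groupby("block")["pass_percent"].transform("mean")

    # Flag schools more than 15 percentage points below their block average
    # (15 is an illustrative threshold, not a prescribed cut-off)
    flagged = results[results["pass_percent"] < block_avg - 15]

    for row in flagged.itertuples():
        print(f"{row.school} ({row.block}): {row.pass_percent}% "
              f"vs block average {block_avg[row.Index]:.1f}%")

Such a list is only a starting point for the dialogue described above, not a verdict on any school.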

Analyse own performance

Schools should also analyse their own performance. The assessment data is ‘computerised’, but is usually available only with the state examination board; schools do not hold the data of their individual students. Since the data is held in digital form, it is eminently possible to provide student-wise marks to every school in a simple spreadsheet format.

Teachers need to be trained and empowered to use spreadsheets (software available on any computer) to analyse marks by subject and across years, and to examine the ranges of student marks, so as to identify and understand patterns. Similarly, block-level analyses and data visualisation of schools by subject, medium of instruction, and across years can be done using spreadsheets; a small sketch of what such an analysis could look like follows.
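
Assuming the examination board supplies each school a hypothetical file, student_marks.csv, with columns year, student, subject and marks (the file and column names are assumptions for this sketch), the same subject-wise and year-wise analysis could be scripted as follows:

    # A minimal sketch of school-level analysis of student-wise marks.
    # Assumes a hypothetical file "student_marks.csv" with columns:
    #   year, student, subject, marks
    import pandas as pd

    marks = pd.read_csv("student_marks.csv")

    # Average marks by subject and year, to spot subjects slipping over time
    trend = marks.pivot_table(index="subject", columns="year",
                              values="marks", aggfunc="mean")
    print(trend.round(1))

    # Spread of marks in the latest year: ranges, not just averages
    latest = marks[marks["year"] == marks["year"].max()]
    print(latest.groupby("subject")["marks"].describe()[["min", "mean", "max"]])

The same tables can, of course, be built directly in any spreadsheet program using pivot tables; the point is the analysis, not the tool.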

In the absence of such deeper analyses, district ranking seems simply a game played to identify ‘top ranking districts’ and feel good about them, while condemning ‘bottom districts’ as no good, effectively obscuring the need to study the real challenges of school improvement and to provide the resources for tackling them.

Ranking students is in itself problematic, yet it can be justified as the ‘objective’ basis for deciding admission to higher education. In the case of districts, there is no such compulsion. Education authorities, non-governmental actors, academic institutions and the media would be well-placed to ignore this meaningless statistic and, instead, collaborate creatively at the micro-local level to improve the quality of education.

(The writer is the Director of IT for Change (www.ItforChange.net), an NGO working with government high schools across Karnataka)

(Published 21 May 2015, 23:15 IST)