The Primary Cause of Structural Unemployment in South Africa: Poor Education Standards and Policy

South Africa’s unemployment is a different creature from the unemployment dominating US and developed-world headlines at the moment. We don’t have a cyclical lack of demand (although demand isn’t as robust as I’d like). We have massive, unmanaged structural unemployment in large sectors of the economy.

I say “in large sectors of the economy” because it isn’t true to say that we have universal unemployment. In fact, a feature of structural unemployment is that it is usually not uniform throughout the economy (unlike cyclical unemployment, which often is). I don’t know any actuaries or engineers who are unemployed for more than a brief period between jobs, and usually the jobs start and end back to back. There will be other examples too.

Unemployment is driven by education

Interestingly, 75% of our unemployed are “unskilled”. (I heard this on the radio, so I can’t vouch for the number or its source, but it does map to my previous analysis of census data showing unemployment by education level attained.)

  • The unemployment rate for those with less than “matric with university exemption” is between 30% and 40%.
  • The unemployment rate for those with “matric with university exemption” is 23%.
  • The unemployment rate for those with better than “matric with university exemption” is on average below 10%.
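The quoted 75% figure is easy to sanity-check against these rates. A minimal sketch: the labour-force shares below are my own assumptions, purely for illustration; only the unemployment rates echo the census-based figures above.

```python
# Hypothetical labour-force shares (my assumption) combined with the
# census-based unemployment rates quoted above, to see whether roughly
# three-quarters of the unemployed end up in the low-skilled group.

groups = {
    # education level: (share of labour force [assumed], unemployment rate)
    "below matric exemption": (0.60, 0.35),
    "matric with exemption":  (0.25, 0.23),
    "post-matric":            (0.15, 0.10),
}

# Aggregate unemployment is the share-weighted average of group rates.
aggregate = sum(share * rate for share, rate in groups.values())

# Fraction of all unemployed people who come from the low-skilled group.
share_lo, rate_lo = groups["below matric exemption"]
unskilled_share_of_unemployed = (share_lo * rate_lo) / aggregate

print(f"aggregate unemployment rate: {aggregate:.4f}")
print(f"low-skilled share of the unemployed: {unskilled_share_of_unemployed:.4f}")
```

With these made-up shares, roughly three-quarters of the unemployed are low-skilled, consistent with the radio figure.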
Supply and Demand for Labour

Economic growth isn’t the only solution to unemployment; in fact it’s not even necessarily a solution.  Prior periods of strong economic growth added jobs only very slowly. We have massive, structural unemployment in this country. We are making some of the right noises with our government’s new jobs plan and jobs fund.

Education in South Africa is not performing as needed

However, given the obvious relationship to education, why don’t we take the problems of our education system seriously?

  • Standards have dropped (although enrollment has increased, which is also critically important for improving average education attainment)
  • While enrollment has increased, the percentage of GDP spent on education has dropped from 6.4% in 1994/1995 to 5.3% in 2006/2007. This is partly explained by an increase in government spending in total related to social programmes. The real expenditure per learner has increased by approximately 5% per year from 2000 to 2007. We still spend a high proportion of our budget on education, but with poor outcomes.
  • We don’t seem to participate in as many international standardised tests as before (reflecting a lack of appreciation for wanting to know what our standards are actually like)
  • Standards are too low so that passing isn’t sufficient for further education / job placement
  • “Literacy” measures have improved, although “literacy” here is measured simply as having completed 7 grades of education. Why is it so hard to appreciate the need to use measures (or at least also use measures) independent of the system to judge the success of the system?
  • The Department of Education’s 2009 Macro report uses only 1999 and 2003 data from TIMSS. It shows our results are very low, but have not decreased by much. We didn’t participate in 2007 (probably because it’s embarrassing coming bottom of the class), which is a shame. 2003 data is really not appropriate for evaluating the performance of our education system in 2011. (It looks like we are at least signed up to participate in 2011, which is great news. We’ll have to wait till December 2012 to get the results though.)
  • The Quality Improvement, Development, Support and Upliftment Programme has budget allocated per province to spend on improving education. While the Free State managed to spend 99% of its budget and the Western Cape 96%, the Eastern Cape spent only 51%, Limpopo 36% and KZN an impressive 11%. The report is also home to possibly the worst table I have ever seen – near impossible to read, with mixed number formats and mixed decimal separators.

What does this table even mean? Except that provinces are generally not even spending the budget allocated to improve underperforming schools. How is it even possible that the Eastern Cape has so little budget allocated to this problem when they are one of the biggest culprits when it comes to poor school infrastructure and poor education results?


International indicators of performance

TIMSS 1999: Rock-bottom in Maths and Science. Six provinces’ results went backwards in Maths and Science. The DoE’s report indicates that our syllabus has the least overlap with the TIMSS assessment framework – this is offered as an excuse for poor performance (which it is), but without even considering that the lack of overlap may be a problem in itself. Should our education system really be going about things in a different way to established international norms? Maybe, but I think that needs to be proved, and the default position should be international standards unless otherwise proved.

PIRLS 2006: Rock-bottom in reading, although this is against primarily developed countries (and only above Kuwait and Qatar in terms of gender inequality of reading education) (Chapter 1 and full report). Curiously, other measures, including our own Department of Education and SACMEQ studies, seem to show girls outperforming boys rather than the other way around.

South Africa’s SACMEQ scores didn’t change significantly from 2000 to 2007 – whereas only one peer country had a reading score decline, two had mathematics score declines, and 14 countries’ scores increased across maths and reading. The percentage of students not sharing a textbook declined from 45.5% in 2000 to 45.0% in 2007. We are not making progress.






The link between enrollment and attainment levels is clear, but clearer still is that South Africa is massively underperforming. Curiously, the DoE’s report seems to suggest that a reason for our low attainment is high enrollment, whereas the graph above shows that high enrollment is associated with high attainment for those enrolled. My guess at the cause of this strong correlation is that countries that take education seriously focus on both enrollment and attainment.




Some South African problems with matric pass rate measurement

The pass rate of matric exams is a function of the quality of education (higher quality means a higher pass rate), standards (higher standards mean a lower pass rate) and selection of learners who write the exam (restricting it to only relatively strong students artificially increases the pass rate). We should not celebrate an increase in the pass rate unless we have shown that the standard of the exam has not been decreased and that learners weren’t discouraged from writing it. I would suggest that the standard of the test is more likely to vary from one year to the next than that education quality changed miraculously for learners who have had 12 years (or more…) of education to get there.
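The selection effect in particular is worth making concrete. A sketch with entirely hypothetical cohort sizes and per-group pass probabilities: holding teaching quality and the exam standard fixed, simply screening out weaker candidates lifts the headline pass rate.

```python
# Hypothetical cohort: (label, number of learners, probability of passing).
# Teaching quality and the exam standard are identical in both scenarios;
# only who is allowed to write changes.

cohort = [
    ("strong", 200_000, 0.85),
    ("weak",   400_000, 0.45),
]

def pass_rate(groups):
    """Expected pass rate among those who actually write the exam."""
    writers = sum(n for _, n, _ in groups)
    passers = sum(n * p for _, n, p in groups)
    return passers / writers

everyone = pass_rate(cohort)                                   # all learners write
screened = pass_rate([g for g in cohort if g[0] == "strong"])  # weak learners held back

print(f"pass rate when everyone writes:      {everyone:.4f}")
print(f"pass rate with weak learners screened: {screened:.4f}")
```

Nothing about the schooling improved between the two scenarios, yet the measured pass rate jumps – which is why a rising pass rate alone proves nothing.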

For example, this graph worries me no end.

As the number of candidates writing the exam goes up, so the pass rate comes down. This reflects progress in keeping learners in school on the one hand, but also likely over-eager promotion of students at lower grades. As the number of candidates reduces (as steps are taken to ensure non-trivial promotion) the pass rate trends back up again. From this graph it's hard to know whether the pass rate tells us anything at all.

I’m not going to show the real/nominal expenditure per capita figures because the glaring errors in the “2006 real expenditure per capita” column suggest to me that the creator of the table needs to go back to school.
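Checking a “real expenditure” column is one line of arithmetic, which makes errors in one all the more inexcusable. A sketch with made-up figures (the spend amounts and CPI index values below are my assumptions, not the table’s data):

```python
# Deflating nominal per-learner spend to real (base-year) terms:
#   real = nominal / (CPI_year / CPI_base)
# All figures are hypothetical, chosen only to illustrate the formula.

nominal_spend = {2000: 4_000, 2003: 4_900, 2006: 6_100}  # rand per learner (assumed)
cpi_index     = {2000: 100.0, 2003: 118.0, 2006: 136.0}  # CPI, base 2000 (assumed)

BASE = 2000
real_spend = {
    year: nominal_spend[year] / (cpi_index[year] / cpi_index[BASE])
    for year in nominal_spend
}

for year in sorted(real_spend):
    print(f"{year}: nominal R{nominal_spend[year]:,} -> "
          f"real ({BASE} rand) R{real_spend[year]:,.0f}")
```

A real-terms column that fails this check against its own CPI assumptions is simply wrong, whatever the underlying data.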


This post has been more of a collection of observations and thoughts and extracts from other research. In summary:

  1. Education is key to solving unemployment and driving growth. Unless we recognise that education is the most important long-term solution to structural unemployment, we can’t take the problem seriously. Skilled citizens are by and large employed. Unskilled citizens are unemployed at 30% to 40% levels.
  2. Government spends significant budget on education, but with poor outcomes. Some budget available is not being spent at all.
  3. Measures of literacy and matric pass rates are fundamentally flawed. We need to place increased focus on outcomes compared to international standards if our measures are to be useful.
  4. We need to participate more, and more regularly, in international comparatives, no matter how embarrassing the current results are.
  5. Current performance against international (and other African country) performance is poor and not improving.
  6. The quality of the Department of Education’s own research, and its ability to recognise the problems in our education system, needs to be improved.





Published by David Kirk

The opinions expressed on this site are those of the author and other commenters and are not necessarily those of his employer or any other organisation. David Kirk runs Milliman’s actuarial consulting practice in Africa. He is an actuary and is the creator of New Business Margin on Revenue. He specialises in risk and capital management, regulatory change and insurance strategy. He also has extensive experience in embedded value reporting, insurance-related IFRS and share option valuation.
