
Learn To Code, It’s Harder Than You Think

December 04, 2015

TL;DR: All the evidence shows that programming requires a high level of aptitude that only a small percentage of the population possess. The current fad for short learn-to-code courses is selling people a lie and will do nothing to help the skills shortage for professional programmers.

This post is written from a UK perspective. I recognise that things may be very different elsewhere, especially concerning the social standing of software developers.

It’s a common theme in the media that there is a shortage of skilled programmers (‘programmers’, ‘coders’, ‘software developers’, all these terms mean the same thing and I shall use them interchangeably). There is much hand-wringing over this coding skills gap. The narrative is that we are failing to produce candidates for the “high quality jobs of tomorrow”. For example, this from The Telegraph:

“Estimates from the Science Council suggest that the ICT workforce will grow by 39 per cent by 2030, and a 2013 report from O2 stated that around 745,000 additional workers with digital skills would be needed to meet demand between now and 2017.

Furthermore, research by City & Guilds conducted last year revealed that three quarters of employers in the IT, Digital and Information Services Sector said that their industry was facing a skills gap, while 47 per cent of employers surveyed said that the education system wasn’t meeting the needs of business.”

Most commentators see the problem as being a lack of suitable training. Not enough programmers are being produced from our educational institutions. For example, here is Yvette Cooper, a senior Labour party politician, in The Guardian:

“The sons and daughters of miners should all be learning coding. We have such huge advantages because of the world wide web being invented as a result of British ingenuity. We also have the English language but what are we doing as a country to make sure we are at the heart of the next technology revolution? Why are we not doing more to have coding colleges and technical, vocational education alongside university education?”

There is also a common belief in the media that there are high barriers to entry to learning to code. This from the Guardian is typical:

“It’s the must-have skill-set of the 21st century, yet unless you’re rich enough to afford the training, or fortunate enough to be attending the right school, the barriers to learning can be high.”

So the consensus seems to be that high barriers to entry and a lack of accessible training mean that only a rich and well educated elite have access to these highly paid jobs. The implication is that there is a large population of people for whom programming would be a suitable career if only they could access the education and training that is currently closed to them.

In response, there are now a number of initiatives to encourage people to take up programming. The UK government created ‘Year of Code’ in 2014:

[Image: Year of Code campaign]

The message is “start coding this year, it’s easier than you think.” Indeed the executive director of Year of Code, Lottie Dexter, said in a Newsnight interview that people can “pick it up in a day”. Code.org, a “non-profit dedicated to expanding participation in computer science education”, says on its website, “Code.org aims to help demystify that coding is difficult”.

[Image: Hour of Code]

So is it really that easy to learn how to code and get these high paying jobs? Is it really true that anyone can learn to code? Is it possible to take people off the streets, give them a quick course, and produce professional programmers?

What about more traditional formal education? Can we learn anything about training programmers from universities?

Given the skills shortage, one would expect graduates from computer science courses to have very high employment rates. However, it seems that is not the case. The Higher Education Statistics Agency found that computer science graduates have “the unwelcome honour of the lowest employment rate of all graduates.” Why is this? Anecdotally, there seems to be a mismatch between the skills the students graduate with and those that employers expect them to have. Or, more bluntly, after three years of computer science education they can’t code. A comment on this article by an anonymous university lecturer offers some interesting insights:

“Every year it's the same - no more than a third of them [CS students] are showing the sort of ability I would want in anyone doing a coding job. One-third of them are so poor at programming that one would be surprised to hear they had spent more than a couple of weeks supposedly learning about it, never mind half-way through a degree in it. If you really test them on decent programming skills, you get a huge failure rate. In this country it's thought bad to fail students, so mostly we find ways of getting them through even though they don't really have the skills.”

Other research points to similar results. There seems to be a ‘double hump’ in the outcome of any programming course between those who can code and those who can’t.

“In particular, most people can't learn to program: between 30% and 60% of every university computer science department's intake fail the first programming course.”

Remember, we are talking about degree-level computing courses. These are students who have been accepted by universities to study computer science. They must be self-selecting to a certain extent. If the failure rate for programming courses is so high amongst undergraduates, it will surely be even higher amongst the general population - the kinds of candidates that the short ‘learn to code’ courses are attempting to attract.

Let’s look at the problem from the other end of the pipeline. Let’s take successful professional software developers and ask them how they learnt to code. One would expect from the headlines above that they had all been to expensive, exclusive coding schools. But here again that seems not to be the case. Here are the results of the 2015 Stack Overflow developers survey. Note that this was a global survey, but I think the results are relevant to the UK too:

[Image: Stack Overflow 2015 Developer Survey results]

Only a third have a computer science or related degree and nearly 42%, the largest group, are self taught. I have done my own small and highly unscientific research on this matter. I run a monthly meet-up for .NET developers here in Brighton, and a quick run around the table produced an even more pronounced majority for the self-taught. For fun, I also did a quick Twitter poll:

[Image: Twitter poll on being self-taught]

76% say they are self taught. Also interesting were the comments around the poll. This was typical:

[Image: tweet about being self-taught]

Even programmers with CS degrees insist that they are largely self taught. Others complained that it was a hard question to answer since the rate of change in the industry means that you never stop learning. So even if you did at some point have formal training, you can’t rely on that for a successful career. Any formal course will be just a small element of the continual learning that defines the career of a programmer.

We are left with a very strange and unexpected situation. Formal education for programmers seems not to work very well, and yet the majority of successful programmers are self taught. On the one hand we have people who don’t need any guided education to give them a successful career; they are perfectly capable of learning their trade from the vast sea of online resources available to anyone who wants to use them. On the other hand we have people who seem unable to learn to code even with years of formal training.

This rather gives the lie to the barriers-to-entry argument. If the majority of current professional software developers are self taught, how can there be barriers to entry? Anyone with access to the internet can learn to code, if they have the aptitude for it.

The evidence points to an obvious conclusion: there are two populations, one that finds programming a relatively painless and indeed enjoyable thing to learn, and another that can’t learn it no matter how good the teaching. The elephant in the room, the thing that Yvette Cooper and the ‘Year of Code’ and ‘Hour of Code’ people seem unwilling to admit, is that programming is a very high-aptitude task. It is not one that ‘anyone can learn’, and it is not easy; or rather, it is easy, but only if you have the aptitude for it. The harsh fact is that most people will find it impossible to reach any significant standard.

If we accept that programming requires a high level of aptitude, it’s fun to compare some of the hype around the ‘learn to code’ movement with more established high-aptitude professions. Just replace ‘coder’ or ‘coding’ with ‘doctor’, ‘engineer’, ‘architect’ or ‘mathematician’:

  • “You can pick up Maths in a day.”
  • “Start surgery this year, it’s easier than you think!”
  • “skyscraper.org aims to help demystify that architecture is difficult.”
  • “The sons and daughters of miners should all be learning to be lawyers.”

My friend Andrew Cherry put it very well:

[Image: tweet from Andrew Cherry]

Answer: only one, software development. You want to be a doctor? Go to medical school for seven years.

Accepting that aptitude is important for a successful career in programming, we can approach the ‘shortage’ problem from a different angle. We can ask how we can persuade talented people to choose programming rather than other high-aptitude professions. The problem is that these individuals have a great deal of choice in their career path and, as I’m going to explain, programming has a number of negative social and career attributes which make them unlikely to choose it.

There’s no doubt that software development is a very attractive career. It’s well paid, mobile, and the work itself is challenging and rewarding. But it has an image problem. I first encountered this at university in the 1990s. I did a social science degree (yes, I’m one of those self-taught programmers). Socially, we arts students looked down on the people studying computer science; they were the least cool students on the campus: mostly guys, with poor dress sense. If anyone considered them at all it was with a sense of pity and loathing. When, towards the end of my degree, I told my then girlfriend, another social science student, that I might choose a career in programming, she exclaimed, “Oh no, what a waste. Why would you want to do that?” If you did a pop quiz at any middle-class gathering in the UK and asked people to compare software development with, say, medicine, law, architecture or even something like accountancy, I can guarantee that they would rate software development as having the lower social status. Even within business, or at least within more traditional businesses, software development is seen as a relatively menial, middle-brow occupation suitable for juniors and for those ill-qualified for middle management. Perversely, all these courses saying ‘learn to code, it’s easy’ just reinforce the perception that software development is not a serious career.

There’s another problem with software development, the flip side of the low barriers to entry mentioned above: there is no well-established entry route into the profession. Try Googling ‘how to become a doctor’ or ‘how to become a lawyer’, for example:

[Image: Google search results for ‘how to become a doctor’]

There is a well-established series of steps to a recognised professional qualification. If you complete the steps, you become a recognised member of one of these professions. I’m not saying it’s easy to qualify as a doctor, but there’s little doubt about how to go about it. Now Google ‘how to become a software developer’. The results, like this one for example, are full of vague platitudes: ‘learn a programming language’, ‘contribute to an open source project’, ‘go to a local programming group’. There is no clear career path, and no guarantee about when, or if, you will be considered a professional and get access to those high-paying jobs of the future.

[Image: mocked-up search results] Yes, I made this up, but it makes the point. :)

Now take a high-aptitude individual who has done well at school and finds demanding intellectual tasks relatively straightforward, and offer them a choice. On the one hand, here is a career, medicine say: you follow these clearly enumerated steps, which are demanding, but you are good at passing exams, and at the end you will have a high-status, high-paying job. Or how about this career: go away and learn some stuff by yourself, we’re not sure exactly what; try to get a junior, low-status job; keep learning more stuff, which you can work out somehow, and work your way up. There are no guarantees of a well-paying job at the end of it. Oh, and by the way, the whole world will think you are a bit of a social pariah while you are about it. Which would you choose?

So could software development follow the example of the older professions and establish a professional qualification with high barriers to entry? There are attempts to do this. The British Computer Society (BCS) calls itself ‘the chartered institute for IT’ and seeks to establish professional qualifications and standards. The problem is that it’s comprehensively ignored by the software industry. Even if you could get the industry to take a professional body seriously, how would you test people to see if they qualified? What would be on the exam? There are very few established practices in programming, and as soon as one seems to gain some traction it gets undermined by the incredibly rapid pace of change. Take object-oriented programming, for example. In the 2000s it seemed to be establishing itself as the default technique for enterprise programming, but now many people, including myself, see it as a twenty-year diversion and largely a mistake. Could programming standards and qualifications keep up with current practice? Not quickly enough, I suspect.

However, my main point in this post has been to establish that programming is a high-aptitude task, one that only some people are capable of doing with any degree of success. If the main point of a professional qualification is to filter out people who can’t code, does it really matter if what is being tested is out of date, or irrelevant to current industry practice? Maybe our tentative qualification would involve the completion of a reasonably serious program in LISP? A kind of Glass Bead Game for programmers? The point would be to find out whether they can code; they can learn what the current fads are later. The problem remains, though: how to get the industry to recognise the qualification.

In the meantime we should stop selling people a lie. Programming is not easy; it is hard. You can’t learn to code, certainly not to the standard needed for one of those well-paid jobs of the future, in just a few weeks. The majority of the population cannot learn to code at all, no matter how much training they receive. I doubt very much that the plethora of quick learn-to-code courses will have any impact on the skills shortage, or on the problems of low pay and unemployment among the unskilled. Let’s stop pretending that there are artificial barriers to entry, and accept that the main barrier to anyone taking up programming is their natural aptitude for it. Instead, let’s work on improving the social status of the software industry – I think this is in any case slowly happening – and on encouraging talented young people to consider it as a viable alternative to the other top professions.
