Computer programming and the trouble with collective nostalgia

Lord Puttnam said something very interesting at an E-Learning Foundation Conference. Speaking as a former film producer, he said that up until about ten years ago, to be a successful cinematographer you had to be able to take a camera apart and put it back together. Now none of those skills is required: you need a whole different set of skills in order to find employment in that occupation.

I believe a similar thing is true in the realm of “digital education”. Almost nobody needs a grasp of computer programming, and even fewer need to know how computers actually work.

You don't need to know how it works

Now, if you’re citing computational thinking (to use the Royal Society’s term in Shut Down or Restart?) as a reason to study Computer Science, that’s different:

“Computational thinking” offers insightful ways to view how information operates in many natural and engineered systems.

I’m not entirely convinced that Computer Science is the only, or even the best, way of teaching computational thinking, but at least it’s a sound reason – and an honest reason – for suggesting it. How many pupils will need to use coding in their future employment? How many people in the games industry itself (one of the loudest voices in this debate) need to have coding skills? Almost none of them.

Also, given that succeeding as a programmer requires high-level mathematical skills too, telling kids that lots of doors will open for them if they take Computer Science or something similar is, I would suggest, somewhat naive at best. In fact, I’m almost certain that universities would be better off insisting that students come to them as a tabula rasa as far as “proper” computer programming is concerned.

Don’t get me wrong: I’m all in favour of encouraging young people to explore computer science options in education and employment, but not by misleading them, even if unintentionally.

There is currently a sort of collective nostalgia for the time when you had to do real programming. Just about every conference I go to includes a presentation that contains a photo of the BBC Micro. Using the BBC Micro, and programming with it, was fun in a way – but only because there weren’t that many alternatives in school. In fact, at around that time someone drew my attention to the Atari ST. With its graphical interface and WYSIWYG applications, it was the obvious choice for doing productive and creative work. The BBC word processor of the day required you to type a code in the margin in order to make words bold or underlined, and you couldn’t see what the document looked like until you printed it out. In what sense of the word could that be described as “fun”? It was a monumental waste of time.

My recollection of coding at the time is one of spending ages copying lines of code from a magazine article into the computer, only to have it tell you, when you typed “RUN”, that there was an error on Line 1210. Or of writing a program yourself only to realise, with horror, that you’d forgotten to put a line number in somewhere near the beginning, meaning that you had to delete everything back to that point. Not fun at all.

As for understanding how computers work, in the sense of being able to take them apart and put them together again, it’s a completely unnecessary skill, at least for 95% of the population. It may be fun, but only if you’re that way inclined. I have known people who like taking car engines apart, making their own candles and refitting their own kitchens. I don’t think I have suffered in any way at all by having no interest in developing any of those skills.

One thing I did like doing was making and editing films as a hobby. The editing was a skilled and labour-intensive job, and completely different from digital editing. I used to have a lot of fun doing it, and would spend hours at it – but I wouldn’t dream of suggesting that kids should be able to edit “real” film as part of a modern video-making course.

Admittedly, the early days of computing in schools were satisfying and fun in the way that pioneering and undergrowth-clearing activities can be, but I’d never want to inflict any of it on people these days. I regard the nostalgic views expressed about computers and computing in the same way as I regard the way people speak about the last world war: nothing they say will ever convince me that it was an experience anyone should have foisted upon them.