Something I’ve spent quite a bit of time thinking about during my years as a student, then lab assistant (AKA demonstrator), and finally occasional lecturer with the computer science department in NUI Maynooth, is how students should be introduced to computer science and programming. I’ve seen all sorts of tactics tried over the past 14 years. The absolute worst tactic I’ve seen is the abandonment of programming from the first year computer science programme altogether. Another disaster in my opinion was the introduction of objects before the introduction of basic constructs like conditional statements and loops; the confusion that caused was monumental. I have been involved with final year undergraduate projects for much of my time with the department and have seen first-hand the effects of some of the different approaches. No one seems to be able to agree on how best to start computer science students programming, but something no one can argue with is that any system that results in final year honours students being unable to program is fundamentally flawed.

[tags]computer programming, education[/tags]

I’ve watched the programming abilities of final year students plummet to positively frightening levels over the years. I put it down to a poor introduction to the basic principles. The language used is peripheral; it’s the principles that matter, and they are universal. Starting students off with high-level libraries, flashy GUIs and object orientation before they even know what an if statement is, is nothing short of madness in my view. Not teaching programming at all to first years is even worse. It is simply not fair on students. Some people are not made to program. It’s an inconvenient truth that no matter how good your teaching methods are, not everyone is wired to program, and programming is a vital and integral part of computer science. You can’t do computer science without the ability to create sets of computer-readable instructions to command computers. In other words, programming is vital to computer science, and not everyone can program.

Imagine you are a first year and you have some subject choices to make. You try computer science to see how it fits and you get on fine for the six-week period you get to change your mind. In fact, you get on fine for the entire first year, because you only cover some vague theories and some history. You don’t implement anything real. You come back and start your second year and are then introduced to programming. You find out two things: firstly, that it is not for you, and secondly, that you can’t be a computer scientist without the ability to program. How hacked off would you feel that your department chose to ‘protect’ you from the realities of computer science until well after the period where you can easily change your subject choices? Personally, I’d be right browned off!

We can now take it for granted that we need to introduce computer science first years to programming; the question of how remains. I see a lot of that debate focusing on the choice of language. Should we teach C or Java? What about .Net, there are a lot of jobs in that? How about JavaScript, the web is cool after all? If the choice of language is your starting point then you’re on the highway to failure already. You need to focus your entire course around the core principles. What is a program? What is a compiler? What is a grammar? Why do we need grammars? What are variables, what are functions, and why do we need them? How do we control the flow of a program? What are objects? Why would we need them? What’s the difference between passing arguments by reference and by value? Why should I care about any of this? This is not an exhaustive list, but I hope it illustrates my point. To make any of this make sense you obviously need to give students some practical experience, and to do that you need to teach them a language. Your aim should not be to teach them all the details of the particular language you choose, but instead to use the language to teach the principles. What really matters is that whatever language you choose can be used to illustrate the core concepts with the minimum of fuss and complexity.
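To make that concrete, here is the sort of minimal, library-free example I have in mind. It’s only a sketch, written in Java because that happens to be the language my own course used, and the names in it are mine rather than anything from a real syllabus. The point is that a handful of lines is enough to show a variable, a function, a loop and a conditional without a GUI or a pre-made library in sight.

<pre>
// A minimal, library-free program that exercises the core principles:
// variables, a function with an argument, a loop, and a conditional.
public class Basics {

    // A function: it takes an argument, does some work, and returns a result.
    static int sumUpTo(int limit) {
        int total = 0;                      // a variable holding our running total
        for (int i = 1; i <= limit; i++) {  // a loop controlling the flow of the program
            total = total + i;
        }
        return total;
    }

    public static void main(String[] args) {
        int limit = 10;
        int answer = sumUpTo(limit);

        if (answer > 50) {                  // a conditional statement
            System.out.println("Sum of 1.." + limit + " is " + answer + " (more than 50)");
        } else {
            System.out.println("Sum of 1.." + limit + " is " + answer);
        }
    }
}
</pre>

Every line in that little program is there to illustrate a principle, not a library or a framework, and that’s exactly the balance a first course should aim for.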

Just about any language can be used this way. I was taught programming in this way and it worked. I find it easy to move to different languages, and always have, because I fundamentally understand the principles. As it happens, the language used in my course was Java. Did I see any silly GUIs? Nope. Did I see many pre-made libraries? Almost none. I was not taught Java, I was taught the principles of computer programming through Java. Could you do this with C++? Yes. How about Perl? Sure. JavaScript? Mostly, but the section on objects would get ‘interesting’, and not in a good way. I could keep listing languages, and the vast majority of mainstream languages would come out as suitable candidates.
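To show what I mean by using the language to teach a principle rather than teaching the language itself, here is another small sketch, again in Java and again with made-up names. It’s the kind of thing a lecturer could put up to get at two of the questions I listed above: what an object actually is, and what the difference is between passing by value and passing by reference. (Java always passes arguments by value, but for objects the value passed is a reference, which is precisely the distinction worth drawing out.)

<pre>
// A small teaching sketch: argument passing and a trivial object in Java.
public class Passing {

    // A trivial object: some state plus behaviour that acts on that state.
    static class Counter {
        int count = 0;
        void increment() { count++; }
    }

    static void bumpPrimitive(int n) {
        n = n + 1;          // changes only the local copy of the argument
    }

    static void bumpCounter(Counter c) {
        c.increment();      // changes the object that the reference points at
    }

    public static void main(String[] args) {
        int number = 5;
        bumpPrimitive(number);
        System.out.println(number);        // still 5 - only a copy was changed

        Counter counter = new Counter();
        bumpCounter(counter);
        System.out.println(counter.count); // 1 - the shared object was changed
    }
}
</pre>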

It’s for this reason that I find reports like this one stupid. They give a list of reasons why Java makes a bad first language, but they all miss the point. What the article actually shows is that when the specific language becomes the focus of the course, rather than the core principles, things will go downhill. In other words, if you use Java badly then it is a bad first language. The same is true of ALL languages! The debate needs to shift away from arguing about what language is used, and towards arguing about how best to use any language to get the core principles across.