Something I’ve spent quite a bit of time thinking about during my years as a student, then lab assistant (AKA demonstrator), and finally occasional lecturer with the computer science department in NUI Maynooth, is how students should be introduced to computer science and programming. I’ve seen all sorts of tactics tried over the past 14 years. The absolute worst tactic I’ve seen is the abandonment of programming in the first year computer science programme altogether. Another disaster, in my opinion, was the introduction of objects before basic constructs like conditional statements and loops; the confusion that caused was monumental. I have been involved with final year undergraduate projects for much of my time with the department and have seen first-hand the effects of some of the different approaches. No one seems to be able to agree on how best to start computer science students programming, but something no one can argue with is that any system that results in final year honours students being unable to program is fundamentally flawed.
I’ve watched the programming abilities of final year students plummet to positively frightening levels over the years. I put it down to a poor introduction to the basic principles. The language used is peripheral; it’s the principles that matter. They are universal. Starting students off with high-level libraries, flashy GUIs and object orientation before they even know what an if statement is strikes me as completely wrong-headed. Not teaching programming at all to first years is even worse. It is simply not fair on students. Some people are not made to program. It’s an inconvenient truth that no matter how good your teaching methods are, not everyone is wired to program, and programming is a vital and integral part of computer science. You can’t do computer science without the ability to create sets of computer-readable instructions to command computers. In other words, programming is vital to computer science, and not everyone can program.
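To be concrete about what I mean by core constructs: something like the sketch below (the class name and numbers are just for illustration) is where a first year should start. Nothing but variables, a conditional, and a loop. The `class` wrapper is only there because Java insists on it; it isn’t the lesson.

```java
// A first-program-sized sketch: the core constructs on their own,
// before any objects, libraries, or GUIs enter the picture.
public class BasicConstructs {

    // A conditional statement: decide between two outcomes.
    static String parity(int n) {
        if (n % 2 == 0) {
            return "even";
        } else {
            return "odd";
        }
    }

    public static void main(String[] args) {
        // A counted loop driving the conditional above.
        for (int n = 1; n <= 4; n++) {
            System.out.println(n + " is " + parity(n));
        }
    }
}
```

Once a student genuinely understands why this loop prints what it does, objects and libraries have something to stand on; taught the other way around, neither sticks.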
Imagine you are a first year and you have some subject choices to make. You try computer science to see how it fits and you get on fine for the six-week period you get to change your mind. In fact, you get on fine for the entire first year because you only cover some vague theories and some history. You don’t implement anything real. You come back and start your second year and are then introduced to programming. You find out two things: firstly, that it is not for you, and secondly, that you can’t be a computer scientist without the ability to program. How hacked off would you feel that your department chose to ‘protect’ you from the realities of computer science until well after the period where you can easily change your subject choices? Personally I’d be right browned off!
It’s for this reason that I find reports like this one stupid. They give a list of reasons why Java makes a bad first language, but none of them holds up. What the article actually shows is that when the specific language becomes the focus of the course, rather than the core principles, things go downhill. In other words, if you use Java badly then it is a bad first language. The same is true of ALL languages! The debate needs to shift away from arguing about which language is used, and towards how best to use any language to get the core principles across.