Teaching Programming – Why the Choice of First Language is Irrelevant
Filed Under Computers & Tech, Software Development on January 8, 2008 at 9:37 pm
Something I’ve spent quite a bit of time thinking about during my years as a student, then lab assistant (AKA demonstrator), and finally occasional lecturer with the computer science department in NUI Maynooth, is how students should be introduced to computer science and programming. I’ve seen all sorts of tactics tried over the past 14 years. The absolute worst tactic I’ve seen is the abandonment of programming in the first-year computer science program altogether. Another disaster in my opinion was the introduction of objects before the introduction of basic constructs like conditional statements and loops; the confusion that caused was monumental. I have been involved with final-year undergraduate projects for much of my time with the department and have seen first-hand the effects of some of the different approaches. No one seems to be able to agree on how best to start computer science students programming, but something no one can argue with is that any system that results in final-year honours students being unable to program is fundamentally flawed.
I’ve watched the programming abilities of final-year students plummet to positively frightening levels over the years. I put it down to a poor introduction to the basic principles. The language used is peripheral; it’s the principles that matter. They are universal. Starting students off with high-level libraries, flashy GUIs and object orientation before they even know what an if statement is strikes me as nothing short of ridiculous. Not teaching programming at all to first years is even worse. It is simply not fair on students. Some people are not made to program. It’s an inconvenient truth that no matter how good your teaching methods are, not everyone is wired to program, and programming is a vital and integral part of computer science. You can’t do computer science without the ability to create sets of computer-readable instructions that command computers. In other words, programming is vital to computer science, and not everyone can program.
Imagine you are a first year and you have some subject choices to make. You try computer science to see how it fits and you get on fine for the six-week period in which you can change your mind. In fact, you get on fine for the entire first year, because you only cover some vague theories and some history. You don’t implement anything real. You come back and start your second year and are then introduced to programming. You find out two things: firstly, that it is not for you, and secondly, that you can’t be a computer scientist without the ability to program. How hacked off would you feel that your department chose to ‘protect’ you from the realities of computer science until well after the period where you can easily change your subject choices? Personally I’d be right browned off!
We can now take for granted that we need to introduce computer science first years to programming; the question of how remains. I see a lot of that debate focusing on the choice of language. Should we teach C or Java? What about .Net, there are a lot of jobs in that? How about JavaScript, the web is cool after all? If the choice of language is your starting point then you’re on the highway to failure already. You need to focus your entire course around the core principles. What is a program? What is a compiler? What is a grammar? Why do we need grammars? What are variables, what are functions, and why do we need them? How do we control the flow of a program? What are objects? Why would we need them? What’s the difference between passing arguments by reference and by value? Why should I care about any of this? This is not an exhaustive list, but I hope it illustrates my point. To make any of this make sense you obviously need to give students some practical experience, and to do that you need to teach them a language. Your aim should not be to teach them all the details of the particular language you choose, but instead to use the language to teach the principles. What really matters is that whatever language you choose can be used to illustrate the core concepts with a minimum of fuss and complexity.
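By way of illustration only, here is a minimal sketch (in Java, though any mainstream language would do equally well); the class and method names are made up for the example. It contains nothing beyond the bare constructs the questions above are getting at: a variable, a function, a conditional, and a loop.

// Illustrative only: a variable, a function, a conditional and a loop,
// with no libraries, no GUI, and no objects beyond the class Java forces on us.
public class FirstPrinciples {

    // A function (method) that takes an argument and returns a value.
    static int sumUpTo(int n) {
        int total = 0;                 // a variable
        for (int i = 1; i <= n; i++) { // a loop controlling the flow of the program
            total = total + i;
        }
        return total;
    }

    public static void main(String[] args) {
        int limit = 5;
        if (limit > 0) {               // a conditional statement
            System.out.println("Sum of 1 to " + limit + " is " + sumUpTo(limit));
        } else {
            System.out.println("Nothing to sum.");
        }
    }
}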
Just about any language can be used this way. I was taught programming in this way and it worked. I find it easy to move to different languages, and always have, because I fundamentally understand the principles. As it happens, the language used in my course was Java. Did I see any silly GUIs? Nope. Did I see many pre-made libraries? Almost none. I was not taught Java, I was taught the principles of computer programming through Java. Could you do this with C++? Yes. How about Perl? Sure. JavaScript? Mostly, but the section on objects would get ‘interesting’, and not in a good way. I could keep listing languages and the vast majority of mainstream languages would come out as suitable candidates.
It’s for this reason that I find reports like this one stupid. They give a list of reasons why Java makes a bad first language, but they all miss the point. What the article actually shows is that when the specific language becomes the focus of the course, rather than the core principles, things will go downhill. In other words, if you use Java badly then it is a bad first language. The same is true of ALL languages! The debate needs to shift away from arguing about what language is used, and towards arguing about how best to use any language to get the core principles across.
Surely you would admit that there are some pretty fundamental differences between the programming experience when working with, say, Java vs x86asm vs Ada vs LISP?
I don’t think anyone’s trying to say we don’t need to teach those basic constructs you list. What they’re saying is that you need to use a programming language as part of that teaching process, and that given that languages differ pretty widely, those differences can have an impact on that process. Hence they need to be taken into account.
Incidentally, re “Did I see any silly GUIs? Nope. Did I see many pre-made libraries? Almost none.”, I agree that stuff often needs to be toned down when first intro’ing programming. However, out here in the real world, we use silly GUIs and pre-made libraries pretty extensively. If your coursework doesn’t at least pass over them at some point, you’re liable to produce the kind of muppet who tries to re-invent every wheel he sees, and thinks that’s a good thing. NUIM unfortunately tends to do that – produce folks who’ve never used IDEs, source control, debuggers, test-driven development, think they have to implement things like dynamic lists and crypto from scratch, etc. If you’re going to teach someone programming, you might as well teach them software development at some point too.
Not re-inventing the wheel is very important. But that comes after learning the basics. To really understand a linked list you need to implement your own (see the sketch below). Ideally a few different ways, each implementation better than the one before. Once you actually understand what’s going on, of course you use something like java.util.Vector. Likewise, IDEs and source control are very important. However, you need to be able to work from first principles first. I’ve seen students who, after a year of Java programming, didn’t know what javac was! The only way you can appreciate an IDE is to first do without one. Then you understand just how much it is helping you and just what it does. Similarly, introducing first years to source control is too much too soon. You have four years to turn raw recruits into proper computer scientists or software engineers.
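Purely as an illustration (an example made up for this comment, not code from any course, and the names are invented), a first from-scratch linked list might be as bare as this:

// A deliberately bare-bones singly-linked list of ints, the sort of thing a
// student would write before ever touching java.util. Illustrative only.
public class IntList {

    // Each node holds one value and a reference to the next node.
    private static class Node {
        int value;
        Node next;
        Node(int value) { this.value = value; }
    }

    private Node head;

    // Add a value to the front of the list.
    public void addFirst(int value) {
        Node node = new Node(value);
        node.next = head;
        head = node;
    }

    // Walk the list to see whether a value is present.
    public boolean contains(int value) {
        for (Node n = head; n != null; n = n.next) {
            if (n.value == value) {
                return true;
            }
        }
        return false;
    }
}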
As for the GUI, it is important. Just like CGI is important, just like HCI is important. However, students need to really get the basics down first and GUIs are not the basics, not even nearly. The time to introduce people to one of the many GUI toolkits is not when they are still struggling to understand the difference between static and final, it’s when all that simple stuff is second nature, when the concepts that underlie GUIs like objects and threads are properly understood.
I’m talking about laying the foundations here, not building the house or furnishing it!
Finally, the criticisms against Java in the article I linked were all about how Java was misused by throwing in GUIs and libraries far too soon. The raw basics first, then move on.
Bart.
I think “irrelevant” is putting it a bit strong, but then I always like strong opinions, more interesting to read. There are a lot of things first year CS needs to communicate to students:
1) Problem Solving (this is where most students fail, in my opinion)
2) Writing Code (few fail here, it’s rare I’ve seen a student who can articulate a solution but can’t generate code to do it).
3) Software Engineering (the bigger picture stuff, i.e. methods, classes, the whole abstract thinking piece)
My opinion is that there are only four significant concepts in programming that first years really need to master. Everything else (i.e. everything from the lambda calculus Y-combinator stuff through to JUnit system integration testing) should wait until the basics are nailed.
These four concepts are CRAP. That is to say, they are:
C – Concatenation, the notion that statements are sequential, e.g. a=4; b=2; a++; b=a; What does b equal?
R – Repetition, the notion that pieces of a program can repeat. Loops of all sorts basically
A – Alternation, the notion that programs do one thing OR the other; IF statements, basically
P – Parentheses, the idea that pieces can be grouped together and re-used. (This is the abstract part).
Now, I’m sure that these 4 concepts are familiar to you already. They are after all the foundation of Regular Expressions.
if ($s =~ m!(a+(b|c))*d!) {…} and the like
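Purely as an illustration, roughly the same pattern expressed with Java’s java.util.regex (the test strings are made up for the example) shows the same four ideas at work:

// Concatenation (symbols in sequence), repetition (+ and *),
// alternation (|), and grouping parentheses, via java.util.regex.
import java.util.regex.Pattern;

public class CrapRegexDemo {
    public static void main(String[] args) {
        Pattern p = Pattern.compile("(a+(b|c))*d");
        System.out.println(p.matcher("aabacd").matches()); // true
        System.out.println(p.matcher("abx").matches());    // false
    }
}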
For me, I’d be happy if every student possessed a full understanding of these concepts at the end of first year.
The other stuff is kinda what splits the scientists from the engineers and I think they should be taught once that split has been made.
Or maybe not, I’m just throwing ideas out there 🙂
Thanks for the four concepts, Des. CRAP, I like it!
To be honest this post was about the CS100 style “principles of computer programming” courses which introduce students to programming rather than a post about what students should learn in their entire first year or what programming they should learn in their entire degree.
Everything people have mentioned is important, but all of it needs to be set on a solid foundation of CRAP 🙂 Mind you, I think CRAP is not quite inclusive enough. The concepts of IO and exception handling are also pretty fundamental, as are the concepts of object orientation, but they are definitely less fundamental than CRAP, and if you don’t know CRAP you can’t possibly get OO.
I worry when I hear about people introducing people to programming by getting them to write GUI apps and use libraries. The linked article implied that Java was bad because people use it like that. The problem there of course isn’t Java, it’s forgetting to teach CRAP first!
I really like this whole CRAP idea …. will have to use it more often in conversation 🙂
Bart.
Yeah, it’s ‘interesting’ to see programming dropped from the first year curriculum of a computer science course, or one imperative programming language being rubbished over another. I’ve a particular problem with the former of those two, when I know so many of the people who agreed to this move away from the norm; the mind boggles…
It’s always amazed me what a difference you can make to a student’s (or even a group’s) understanding over even an hour of a well-thought-out tutorial – often, to be honest, effectively doing the course lecturer’s job for them. Unfortunately, all of this apparently pales into insignificance when it comes down to pure and simple numbers – the modern university is a degree factory, and the numbers must keep going up, up, up! But then, myself, Des and Bart know this first-hand all too well…
I was a bit surprised to read Dave Cahill’s take on IDEs, GUIs etc. While quite valid (and certainly true of past years that have gone through), if you take away the ability to program at all, such “advanced” material will simply be beyond them. I remember the day I asked, “What happens if programming moves to second year?!?” The reply? “Well, the second year courses go to third year.” “And the third year courses?!?” “Oh, they go to fourth year.” Before I could ask about the fourth year courses, the person I’d asked very pointedly changed the subject…
Well, what I was getting at was more in terms of their overall education – as in, once you’ve taught them basic programming, you should teach them software development, which NUIM doesn’t seem to do much of. I’m not suggesting we teach that kind of stuff first – as you say, it wouldn’t be too useful without the underpinnings. Programming by itself however, is of relatively limited use unless you’re going to teach computer science or software development, or indeed something else, on top of it.
As a first year CSSE, let me say the CRAP concept is a good course foundation.
I feel let down by the major changes made to CSSE this year (for the uninformed, it’s now the same thing as first-year science, and the proper stuff only begins in second year).
The core concepts should be explained before any programming even begins. I’m aware that there are changes planned for next year.
I agree the language isn’t important, the ideas are important: calling interrupt 10h in 8086 asm, printf, or System.out.println all do the same thing. Once you grasp the concepts you realize that a language is a tool: you use it to accomplish certain things, and the tool must fit the purpose.
Think of a cordless drill: once you grasp the idea that you’re controlling a motor that rotates a drill bit, and you understand how not to injure yourself with it, you can easily understand how to use any other type of drilling tool – a pillar drill, a centre lathe, etc.
Bart, Des & Peter, I totally agree.
Peter taught me CS100 (Programming) in first year along with Jackie, and now it’s gone. Whilst I already knew 99% of what was being taught I still found it helpful, as did all the venture management students and others who were in it – it was a great course and set up students for the coming years.
There’s a departmental seminar where they decide what’s going on next year (or rather, what direction the CSSE course is taking in the coming years). That’s coming up at the end of this month if you want to raise some concerns again – having been involved in the department for some time at various levels, perhaps they’ll entertain your comments.
Danny
http://www.codinghorror.com/blog/archives/001035.html
An interesting commentary on this topic
You’re right Bart, you don’t just hop in a car and drive off; you need to learn the basics first: how to steer, how to use the clutch, the rules of the road. There are about five things you need to know: variables, control statements, loops, methods/functions, and objects. All popular languages have them, and once you know them it won’t take you long to pick up a new language (depending on who’s teaching it; some are great programmers but crap teachers). BTW, the best programming teacher I had was Des, he made it fun and practical (I still have the code for the Currency Converter and Notepad somewhere).
That’s a nice article Des, but I’m struck by the fact that he keeps using the term “computer science” where I really think he means “software engineering”. Computer Science degrees are not really geared at producing developers; software engineering courses should be (but in Maynooth clearly aren’t).