Inevitably, I find myself involved in discussions about what makes a good Computer Science curriculum, and there tend to be differing opinions as to whether a more theoretical curriculum or a more practical curriculum is better.
My university, The University of Pennsylvania, leaned very heavily to the theoretical side. Out of the 15 required computer science courses, only one (Operating Systems) required any significant programming after freshman year. Sure, there are electives and the senior project, but very few electives involve any programming either. In my mind, it's embarrassing for students to graduate without really knowing how to write code. Like so many other universities, Penn feels that theoretical skills are more important, since practical skills quickly become outdated.
I happen to disagree somewhat, but let me start out by stating that you should maximize the following things in designing a curriculum:
- Long-term value (7+ years after graduation)
- Short-term value (present day to 7 years after graduation)
- Student interest
Long-term value:
Theory certainly holds long-term value - after all, languages change, but theory stays true. Does that make theory inherently uber-useful, though? Most engineering students, even if they start off in a technical position, eventually move on to less technical roles where they won't need to know the details of P vs. NP. So what does hold long-term value? Learning to push yourself, stretch your mind, and critically analyze problems. Both theoretical work and practical work can do that.
Short-term value: This is an easier question to answer: projects. Projects teach design skills, planning skills, teamwork, critical thinking, and more - skills you will need throughout your life. Furthermore, they can be a way of actually applying ideas from theoretical courses. Use projects to reinforce your short-term skills (security, C, C++, user interface design, embedded systems, etc.) and you will get both short-term and long-term benefits.
Student interest: Not only do students learn more in courses they're interested in, but they are also more likely to independently seek out knowledge beyond the course. Teach to a student's interests and they'll learn more in the long run than you could possibly teach them in that course. Many students (no, not all) feel a much greater sense of accomplishment creating an actual GUI application than working out some proof. In fact, Unlocking the Clubhouse argues that women, in particular, tend to be more engaged when they're doing GUI development. GUI development with .NET (or Cocoa) is incredibly easy. If you just teach students a little bit, there's a good chance they'll continue to write more and more apps because the skill comes in handy.
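To give a sense of just how little it takes, here is a minimal sketch of a complete .NET GUI program in C# using Windows Forms. The class name, window title, and button text are all made up for illustration:

```csharp
// A minimal sketch of a first GUI app, not a polished program -
// the names and text here are placeholders.
using System;
using System.Windows.Forms;

class FirstApp
{
    [STAThread]
    static void Main()
    {
        // One window containing one button that fills it.
        var form = new Form { Text = "My First App" };
        var button = new Button { Text = "Click me!", Dock = DockStyle.Fill };

        // Show a message box whenever the button is clicked.
        button.Click += (sender, e) => MessageBox.Show("Hello, world!");

        form.Controls.Add(button);

        // Display the window and start the event loop.
        Application.Run(form);
    }
}
```

That's roughly the whole lesson: a window, a control, and an event handler. A student who gets this far can keep experimenting on their own.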
So how would I design a computer science curriculum?
First year: Programming languages - probably Java - but at some point do some .NET GUI development. Not for the whole semester - just enough to plant "seeds" so students can start doing development on their own.
Second year: A course on software engineering. Additionally, start teaching things like security, algorithms, and architecture. Every course should include a project that requires programming. This year should have more theory than the first.
Third year: Shift a bit more toward theory. Require some sort of "junior project" where students work individually for an entire semester on a project of their choosing.
Fourth year: Full-year senior project, plus theory courses.
Oh, and all students should have to learn how to write well. Require writing courses; any kind will do.