Monday, July 13, 2009

How To Teach Software Development

  1. Introduction
  2. Developers
    Quality Control
    Motivation
    Execution
  3. Businesses
  4. Students
  5. Schools

Education is broken. Education about software development is even more broken. That is my sad observation of the industry. I have come to see good developers who emerge from supposedly great educations as survivors, more than anything. Do they get a head start from their education, or do they overcome it?

This is the first part in a series on software education. I want to open a discussion here. Please comment if you have thoughts. Blog about it yourself. Write about how you disagree with me. Write more if you don't. We have a troubled industry, and we care enough to do something about it. We harp on bad developers the way people used to point at freak shows, but we only hurt ourselves by not improving the situation. We have to deal with their bad code. We are the twenty percent, and by definition we can't talk to the eighty percent, so we need to improve the ratio that comes out of the factory, because we can't reach them once they are set loose on the world. Fix this problem at its source, with me, please.

For Students This Means...

You're paying for what you aren't getting. Either you really care about the field you're spending all this money to be indoctrinated into, or you expect to be honestly prepared for a career you think is lucrative. Neither expectation is being met.

For Schools This Means...

You aren't producing the impressive minds and individuals that make a school stand out.

For Businesses This Means...

You front the cost of poor performance while your hires overcome a lacking education, so consider this problem a fiscal one.

For Good Developers This Means...

You have to put up with these poor saps.

4 comments:

Jason said...

I hear you loud and clear on this one. I currently have one more year of "survival" left before graduation, and I'm lucky to have it paid for by a scholarship or I'd be even more angry. My friends here feel similarly.

I think a hands-on/apprenticeship style of learning is essential. It's remarkable that at no point during my university education has a teacher or assistant watched me as I coded something and given advice, but I suppose it's tough to do with such a high student to teacher ratio.

The skills acquired during a degree program are incidental until they take precedence over the actual piece of paper in the eyes of recruiters. The problem is that it's a lot easier to judge whether someone has a piece of paper than it is to judge their skills.

There are many people who realize that these problems exist, but how do we increase the ratio? Go door-to-door handing out copies of Robert Martin's Clean Code?

rgz said...

I have developed PHP webapps for roughly 4 years, and general ASP/WinForms apps for 2 years.

And I can tell you that I don't know TDD. I get the theory; I just don't know how to approach the subject.

And outside of development done in Visual Studio, I never bother with version control. I administer and program no less than 3 websites and still don't use svn/git/etc.; my idea of backups is tarballs.
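For anyone in the same position, the step up from tarballs is smaller than it looks. A minimal sketch of local git as a backup replacement (the directory and file names here are hypothetical, and no server is required):

```shell
set -e
# Hypothetical site directory for illustration.
rm -rf /tmp/demo-site && mkdir -p /tmp/demo-site && cd /tmp/demo-site

git init -q .                            # one-time setup per project
git config user.name "demo"              # identity for commits (local only)
git config user.email "demo@example.com"

echo "<?php echo 'hi'; ?>" > index.php
git add index.php
git commit -q -m "Initial import of site"

# Every later change becomes a cheap, diffable snapshot
# instead of another tarball:
echo "<?php echo 'hello'; ?>" > index.php
git commit -q -am "Tweak greeting"

git log --oneline                        # full history of snapshots
```

Unlike tarballs, each snapshot records what changed and why, and any previous state can be restored with `git checkout`.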

These days, 70% of development is done on the production system; it used to be the other way around, back when I set up a development environment.

This is all pretty usual for a php junkie but I know it's far from ideal.

I know nothing about deployment.

The truth is, as soon as you teach students how to write Hello World applications, you should teach them proper development discipline.

Chris said...

I'm going into my 4th year of surviving a computer science program right now. Whatever ability I come out with will be fully in spite of my degree and not because of it.

I skip class so I can program on my own time, because that's the only way I'll learn how to program.

I could rant at length about it, but I'll just mention a few problems and possible solutions I see:

- University administration repeatedly hires professors who can't teach. I think this is for two reasons: they're more interested in the research aspect, and they don't know how to tell who can teach and who can't.

I've had fantastic teachers and terrible teachers, and it was always evident by the end of the first class which was which. The quality of teaching would improve a lot if students had any choice in their professors. I think maybe by this age, students are mature enough to pick the "tough, yet fair" prof over the "A+ but you won't learn anything" prof.

- Professors would love to teach better, engage their students more, etc etc, but the fact is they don't have the time to prepare elaborate lesson plans, so they teach the same old selection sort garbage for decades in a row.

Maybe it would help here to actually do less prep and let students run with it. It's much more motivating to pick a project you really like, or to contribute to an open source project. There are certainly some initiatives in this area already, but they're the exception rather than the norm. I don't think it's ever too early to start writing useful, relevant code rather than yet another AVL tree implementation.

- My school only teaches Java, because the profs are familiar with C from their undergrad days and Java is similar to C, but with better marketing. This leads to awkward situations like trying to explain why "Hello World" is five lines of keywords, and desperately trying to pretend hash tables don't exist until 2nd year because the syntax for using them is so convoluted.

Some professors think that the language is not too important since it's really about the underlying language-agnostic basic concepts, but I firmly believe that it is extremely important: a modern scripting language like Python or Ruby (or your favorite super-high-level language here) would keep the unimportant details out of the way so you could more clearly see the basic concepts. It would also stress that OO is only one successful paradigm, and not the Holy Grail of programming.

Anyway, just wanted to let you know that this is a real problem and you're not the only one.

Tennessee Leeuwenburg said...

I think that people undervalue their undergraduate courses, not realising just how much foundation they provide. I used to think of myself as a university survivor rather than as someone who had been given a great education, but I have since revised my opinion. Once I started sitting on interview panels, I was able to ask prospective hires some questions about IT. It was clear that most of them did not, in fact, come equipped with the basic knowledge that should have been imparted during their undergraduate degree.

If those applicants had demonstrated a sound knowledge of their undergraduate course content, they would almost certainly have won the day.

The university can only do so much -- hire professors from the field of applicants, do their best with staff/student ratios etc. I can't think of an obvious sweeping reform which would ease the burden of competition for scarce resources in education.

The most important thing, I think, that may be missing is a vibrant student culture which is central to the university experience. I'd love to see a kind of "20% time" subject at university, where students are expected to pursue something worthwhile in their own time, and produce some assessable content on their own motivation.

Something that I find missing personally, however, is more support for IT master-classes past university. I would love, for example, to be able to go on a one or two-month intensive learning course with the best of the best.

I write here about programming, how to program better, things I think are neat and are related to programming. I might write other things at my personal website.

I am happily employed by the excellent Caktus Group, located in beautiful and friendly Carrboro, NC, where I work with Python, Django, and Javascript.
