The WSJ Business Technology blog has a post about “Where the Next Generation of Techies Won’t Come From.” Aside from offending my grammatical sensibilities (you know, ending a sentence with a preposition), the post interests me for a couple of reasons.
The crux of the post refers to statistics published by the Computing Research Association showing that the number of students receiving computer science degrees from American universities is at its lowest point in the last ten years. According to the WSJ post:
The number of new students enrolling in computer-science programs at these schools today is only half of what it was in 2000 — 15,958 then compared to 7,915 now.
The number of students enrolling in computer-science programs dropped when the dot-com bubble burst in 2001. You might expect enrollment to shoot back up now that the “Web 2.0” renaissance is minting a new round of techie millionaires.
This is the interesting part. It stands to reason that the lure of Web 2.0 riches would put butts in the seats of computer science classrooms, but statistics don’t lie. That’s not happening. Why? Because we are a DIY nation.
As Americans, we like to do it ourselves, i.e., DIY. This probably has a little to do with the Pilgrims and the pioneer spirit, but if you look around, you can tell we value independence and self-sufficiency. This is why stores like Home Depot and Lowe’s flourish. If you read Engadget or Gizmodo, you’ve seen all manner of DIY electronics. We’re a curious bunch, and we like to rely on our own skills whenever possible.
Couple this spirit with some of the points made by Zed in his rant, echoed frequently on uncov (wow, I miss that blog), and you get what I’ve called the “lazyweb.” Why would an enterprising young college student waste time in computer science classes when it’s so easy to learn and execute on your own? Frameworks like Rails have made it so simple to build nice web apps that the barriers to entry have been removed from the world of development. What you get is a generation of DIY developers who don’t see the need for computer science courses.
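To be fair to the DIY crowd, the barrier really is that low. Here’s a rough, hypothetical sketch (plain Ruby using the standard library’s WEBrick server, no Rails at all) of how little it takes to get something answering HTTP requests; Rails scaffolding gets you a full CRUD app with a database for not much more effort.

    # Hypothetical sketch: a bare-bones web app in plain Ruby using the
    # standard library's WEBrick server (on newer Rubies you may need to
    # install the webrick gem first).
    require 'webrick'

    server = WEBrick::HTTPServer.new(Port: 8080)
    server.mount_proc '/' do |_req, res|
      res.body = 'Hello, Web 2.0'      # the entire "app"
    end
    trap('INT') { server.shutdown }    # Ctrl-C shuts it down cleanly
    server.start

Run that, hit http://localhost:8080, and you have a “web app” without ever hearing the words “big-O notation.”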
Pretty much anyone can design a Rails app and run it on Amazon Web Services for cheap. Is this a good thing? I took a few CS classes back in the day; remember Pascal? CS always reminded me of Calculus. A working program did not guarantee you an A; the professor often critiqued your code writing and algorithm design. That lesson is kind of a big deal, because poorly designed code tends to fall over when stressed.
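To make that concrete, here’s a made-up Ruby example of the sort of thing those professors would flag. Both versions “work,” but the first re-scans the whole array for every element and will fall over on a large input, while the second makes one pass using a Set.

    # Hypothetical example: two "working" ways to find duplicate entries.
    require 'set'

    # Correct, but O(n^2): calls items.count(x) for every element.
    def duplicates_naive(items)
      items.select { |x| items.count(x) > 1 }.uniq
    end

    # Also correct, but roughly O(n): one pass, remembering what we've seen.
    def duplicates_fast(items)
      seen = Set.new
      dups = Set.new
      items.each { |x| dups << x unless seen.add?(x) }
      dups.to_a
    end

On a ten-item array nobody notices the difference; on a million rows pulled from a database, one of these pegs the CPU and the other doesn’t. That’s exactly the kind of judgment a CS course beats into you.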
Anyway, I’m no expert, but it seems like a disturbing trend. If anything, the WSJ’s logic should hold: more money in Web 2.0 should mean a rise in students seeking CS degrees. The fact that it doesn’t suggests two things: 1) there are way too many DIY developers out there, because there’s no shortage of Web 2.0 apps, and 2) the overall quality of developers (or people who claim to be developers) is falling.
What do you think? Do people need CS degrees to really be good at development? Is easy development on Rails driving down the overall quality of software development? Sound off in the comments.