I wholeheartedly agree with this article. Better than teaching any given language is teaching how to *think* about problems. If you can picture how computers work, understand what they can and can't do, and apply their strengths to a problem or data set, you can grab any old language's cookbook and bash out a solution. The algorithm is more important than the code.
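To make that concrete with a throwaway example of my own (not from the article, and any language would do — Python here): the same question, "does this list contain a duplicate?", answered two ways. The syntax is interchangeable between languages; the algorithmic choice is what actually matters.

```python
# Same problem, two algorithms: does a list contain a duplicate?

def has_duplicate_naive(items):
    """Compare every pair: O(n^2). Fine for tiny inputs, painful at scale."""
    for i in range(len(items)):
        for j in range(i + 1, len(items)):
            if items[i] == items[j]:
                return True
    return False

def has_duplicate_set(items):
    """Track what we've seen in a set: O(n). Same answer, different thinking."""
    seen = set()
    for item in items:
        if item in seen:
            return True
        seen.add(item)
    return False

print(has_duplicate_naive([3, 1, 4, 1, 5]))  # True
print(has_duplicate_set([3, 1, 4, 1, 5]))    # True
```

Someone who thinks in algorithms spots the second version immediately, whatever language they happen to be writing in.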
I've met people who have been involved in writing software and firmware, and yet they simply can't design, analyse, or optimise an algorithm. I don't know what they missed out on in their education to end up at that point, but it seems so odd to me that you can get by slotting snippets together without ever really thinking through the overall architecture. It makes for some pretty slow problem solving, and when a bug crops up that stems from the design as a whole rather than from any single snippet, people who work like this have a *really* hard time killing it.
Bravo for this article; it expresses my thoughts on the situation precisely.