We don't bite newbies here... much
Interesting. When I read mandog's post my first interpretation was "CS doesn't teach people how to program" and I agreed with it.
Because my definition of "programming" is getting your code written, thoroughly tested, and working properly.
Knowing when to break things into a module, how much code you can write before you need to test it, and what makes a good variable name (and why): these are the things I think of as "programming". And since I haven't seen anyone try to teach them outside of "The Practice of Programming" by Kernighan and Pike and "Large Scale C++ Software Design" by Lakos, I tend to think of them as things you learn through experience.
Picking algorithms, mapping out data structures, choosing an overall program architecture that works with everything else. I think of all these things as "Software Engineering". In a way they're easier to learn. Most are things that people can measure easily (e.g. testing the performance of various sorting algorithms). Much of it has a strong scientific basis and there are a million books about it. It's covered in every CS curriculum.
I don't think you can be good at developing software without both. The two roles are like architects and carpenters: one can tell you what to build, the other how to build it, and you aren't going to get a good building without both.
I've got a Computer Engineering degree (not quite CS - but the only relevant difference was that I had EE classes instead of electives, and a few extra semesters of math) and I must say that it didn't teach me anything about programming. Professors lectured on data structures, analysis of algorithms, language syntax, etc. But nobody ever tried to sit a class down and say "this is how you write code".
Nobody ever mentioned that you shouldn't write 2000 lines of code and only then see if it compiles. And I saw CS students just about to graduate do exactly that. People in their last semester, with good grades, who didn't know how to break a C program up into more than one .c file and still get it to compile, much less use multiple files to make the code more reusable.
This is a very real problem. Did anyone else go to a school where the CS department didn't offer a class, or even a single lecture, on debugging? Debugging isn't a big research topic. People don't write theses about whether debuggers are more efficient than print() statements. It won't make a professor's career. But you can't be a professional programmer without it.
You can't be a good professional without the theory of algorithms, the different development cycles, a good gut feel for what happens when someone writes an O(n^2) algorithm, and so on. CS is where you learn that. I'm just not sure that it's "programming".