PerlMonks  

Re: Worst blog post ever on teaching programming

by TedPride (Priest)
on Apr 03, 2006 at 04:59 UTC ( #540853 )


in reply to Problematic post on teaching programming

I don't know. Some student problems are the fault of the teachers, but on the other hand, many students are just lazy and don't work hard enough on courses they consider "too difficult". And the coursework for a programming degree IS quite difficult, when you consider that many colleges, even community colleges, require up through Calculus 3 and Linear Algebra. What's that all about? Why should programmers have to slog through Calculus 3? Anything after Calculus 1 is basically useless in terms of your career, and I'd think it would be far more useful to spend the same amount of time learning several new programming languages and/or operating systems.

Bottom line though, I mostly agree with him - introductory programming courses are a good way to weed out the people who aren't committed to getting a programming degree. What sort of person fails an introductory programming course? The sort of person who should be majoring in something else. Assuming your textbook is reasonably adequate, you have no excuse for failure.


Re^2: Worst blog post ever on teaching programming
by Anonymous Monk on Apr 03, 2006 at 16:15 UTC
    They all assume you'll be doing weird, theoretical work, preferably in CS grad school.

    You see, you might need calculus for non-linear optimization work, and you might need linear algebra and field theory if you decide to go into cryptography research. So everyone has to take them.

    And if you decide to just do a job doing boring, practical things like writing programs that work, instead of clever, abstract things like proving neat, theoretical boundaries on toy problems for systems that can never actually exist in the real world, you're derided for not being clever and abstract and academic enough.

    I don't know how many young kids I've seen wander out of a CS degree thinking a Turing Machine is Something Important(TM), as applied in the real world. They think that constants are irrelevant, because they learned order notation, but didn't learn quite enough.

    In reality, all those boring little constants in front of your abstract little order notation symbols mean the difference between "highly profitable" and "completely worthless". In the real world, people need constant-factor speedups: the difference between running in twenty minutes and running in an hour can make or break a program. In the real world, you worry about the scalability of your algorithm only after it meets the initial performance requirements to begin with. If the hardware to make the problem fast is too expensive to do what the business requires, it's a no go, no matter how much nicer your algorithm scales "towards infinity".
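The point about constants can be made concrete with a little arithmetic. Here's a sketch in Perl (the cost models and the constant 500 are invented for illustration, not taken from any real benchmark) showing an asymptotically "worse" O(n^2) algorithm beating an O(n log n) one with a large constant factor until n gets fairly big:

```perl
#!/usr/bin/perl
use strict;
use warnings;

# Hypothetical cost models: a "slick" O(n log n) algorithm with a large
# constant factor versus a "naive" O(n^2) algorithm with a tiny one.
sub slick_cost { my $n = shift; return 500 * $n * log($n) / log(2) }
sub naive_cost { my $n = shift; return $n * $n }

# The asymptotically worse algorithm wins until the crossover point.
for my $n (100, 1_000, 10_000, 100_000) {
    my $winner = naive_cost($n) < slick_cost($n)
               ? 'naive O(n^2)' : 'slick O(n log n)';
    printf "n = %6d: %s is cheaper\n", $n, $winner;
}
# With these (made-up) constants, naive wins at n = 100 and 1_000,
# slick only takes over around n = 10_000.
```

If your data set never grows past the crossover point, the "worse" algorithm is simply the better program.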

    I don't know how many people have tried to use "Turing completeness" as a way to explain what a computer can or can't do, and gibber on and on about halting problems and so forth. It's far simpler than that: any computer you find in the real world will have finite memory, finite run time, and a finite amount of cash available to construct and run it. No computer can do more than a finite state machine can, if only for reasons of economics. But finite state machines are boring, and we don't have a neat paradox for them, so I'm left listening to boring undergrads drone on and on about "undecidability" as if it's a real world problem...
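For what a finite state machine actually looks like in code, here's a minimal table-driven one in Perl (my own toy example, not anything from the thread): it accepts binary strings whose value is divisible by 3, using nothing but a lookup table and one state variable.

```perl
use strict;
use warnings;

# A table-driven finite state machine: accepts binary strings whose
# value is divisible by 3. The state is the value-so-far mod 3, and
# reading bit b moves state s to (2*s + b) mod 3. No infinite tape,
# no halting problem -- just a lookup table, which is all real
# hardware ever gives you anyway.
my %next = (
    0 => { 0 => 0, 1 => 1 },
    1 => { 0 => 2, 1 => 0 },
    2 => { 0 => 1, 1 => 2 },
);

sub divisible_by_3 {
    my ($bits) = @_;
    my $state = 0;
    $state = $next{$state}{$_} for split //, $bits;
    return $state == 0;
}

print divisible_by_3('110')  ? "110 accepted\n"  : "110 rejected\n";   # 6 -> accepted
print divisible_by_3('1011') ? "1011 accepted\n" : "1011 rejected\n";  # 11 -> rejected
```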

    When they get to the real world, they'll learn that no one else cares about CS theory. No one else cares about whether P=NP. They just want the billing system to run, the accounting ledgers to add up, and the reports to look pretty, with colourful graphs that show the wiggly line going upward. If you do what the rest of the world needs, you get paid; if you don't, you don't.

    Universities are largely in the business of training grad students to become professors; any other education they provide is mostly just incidental.
    --
    Ytrew

      Exactly.

      The most succinct encapsulation I've seen of your oh-so-eloquent dissertation above is something I read in someone's tag line somewhere. From memory it went something like:

      In theory, theory is enough. In practice, it isn't!

      Examine what is said, not who speaks -- Silence betokens consent -- Love the truth but pardon error.
      Lingua non convalesco, consenesco et abolesco. -- Rule 1 has a caveat! -- Who broke the cabal?
      "Science is about questioning the status quo. Questioning authority".
      In the absence of evidence, opinion is indistinguishable from prejudice.
        The quote I heard was:
        In theory, theory is the same as practice. In practice, it's not.
        --
        Ytrew
      Understanding the concept of a Turing machine is actually quite important in real-world programming. A little digression -

      Nearly everyone on this site will be able to point to an experience in their careers where they wrote something in Perl and it took them a couple of days. It turned out to be really, really useful, so the PHB had it rewritten in Java. That took 15 people 6 months, and it still doesn't work right.

      Why is that? Perl isn't inherently a better language than Java. In fact, there are many things Java has better support for than Perl. However, Java projects generally take longer than the equivalent Perl projects and generally require more people.

      My feeling is that Perl programmers tend to be more capable than Java programmers, precisely because we tend to have a stronger grasp of the fundamentals. Things like a Turing machine. In fact, I once implemented a Turing machine in production code because it was the correct and cost-effective solution to the requirements. It's easy to deride the theoreticals, but they're extremely useful, just not as presented. You actually have to think about how to apply them. :-p


      My criteria for good software:
      1. Does it work?
      2. Can someone else come in, make a change, and be reasonably certain no bugs were introduced?
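The thread never shows what that production Turing machine looked like, but for the record, here's what a minimal Turing machine interpreter can look like in Perl (entirely my own sketch - the rule table below just increments a binary number). The tape is a hash keyed by position: unbounded in principle, finite in practice, which is exactly the finite-resources point raised above.

```perl
use strict;
use warnings;

# A minimal Turing machine interpreter. Rules map "state,symbol" to
# [symbol_to_write, head_move (-1 left / +1 right / 0 stay), next_state].
# This machine increments a binary number; the head starts on its
# rightmost bit, '_' is the blank symbol.
my %rules = (
    'carry,1' => [ '0', -1, 'carry' ],   # 1 + carry = 0, propagate carry
    'carry,0' => [ '1',  0, 'halt'  ],   # 0 + carry = 1, done
    'carry,_' => [ '1',  0, 'halt'  ],   # ran off the left end: new digit
);

sub run_tm {
    my ($input) = @_;
    my @cells = split //, $input;
    my %tape  = map { $_ => $cells[$_] } 0 .. $#cells;   # tape as a hash
    my ($pos, $state) = ($#cells, 'carry');
    while ($state ne 'halt') {
        my $sym = $tape{$pos} // '_';
        my ($write, $move, $next) = @{ $rules{"$state,$sym"} };
        $tape{$pos} = $write;
        $pos       += $move;
        $state      = $next;
    }
    my ($lo, $hi) = (sort { $a <=> $b } keys %tape)[0, -1];
    return join '', map { $tape{$_} // '_' } $lo .. $hi;
}

print run_tm('1011'), "\n";   # 1011 + 1 = 1100
```

Whether one calls this "a Turing machine" or "a table-driven interpreter" is, of course, exactly the argument that follows.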
        I once implemented a Turing machine in production code because it was the correct and cost-effective solution

        I think it is all too easy to forget how much practical experience it requires to be able to reach that determination.

        Just as theory without practice--real-world practical application--is just so much hot air, so you can practice as much as you like, but without the theory to back you up and allow you to choose the right starting point, the likely outcome is that you will become very good at doing the wrong thing.

        From previous discussion, I think that you are likely in tune with the theory and practice of 'balance in all things'.

        In programming as in life, balance is everything, and imbalance--the over-concentration on one aspect to the exclusion of others--is the source of most woes.


        Examine what is said, not who speaks -- Silence betokens consent -- Love the truth but pardon error.
        Lingua non convalesco, consenesco et abolesco. -- Rule 1 has a caveat! -- Who broke the cabal?
        "Science is about questioning the status quo. Questioning authority".
        In the absence of evidence, opinion is indistinguishable from prejudice.
        In fact, I once implemented a Turing machine in production code because it was the correct and cost-effective solution to the requirements.

        Really? Please explain where you got the infinitely long tape, and how your software made markings on it. If you didn't do that, then you didn't make Turing's machine; and any computing device with an infinite datastore that we can conceive of is computationally equivalent to a Turing machine.

        Turing machines are just a theoretical device for discussions of computational equivalence; you can't "implement" one in any sense of the word. You can create a state machine with an associated finite datastore, but we tend to call those devices "computers"; the hardware already does that for us.

        There's no sense of the word in which I can find it meaningful to claim one has "implemented" a Turing machine; it's a thought experiment, not a device you can actually build. --
        Ytrew

      When they get to the real world, they'll learn that no one else cares about CS theory. No one else cares about whether P=NP. They just want the billing system to run, the accounting ledgers to add up, and the reports to look pretty, with colourful graphs that show the wiggly line going upward. If you do what the rest of the world needs, you get paid; if you don't, you don't.

      True.

      Of course sometimes knowing about P=NP, big O notation, etc. is exactly what you need to get the job done.

      I've encountered my fair share of fresh CS graduates who think they know everything and are terrible at their job. The thing is, I've also encountered my fair share of non-graduates who've been working in the industry for years and think they know everything and are terrible at their job. I think a lot of this has to do with the person, rather than whether they come from an academic or industry background.

      Universities are largely in the business of training grad students to become professors; any other education they provide is mostly just incidental.

      Back when I was at uni I remember being taught tons of purely "academic" content. People I knew who were working in industry told me I'd never use it in the real world. Silly things like object orientation, virtual machines and garbage collection.

      I certainly don't think a university education provides you with all of the skills needed to do the job. I'm actually glad that they don't, since I don't think universities should be in the job of just vocational education. They do provide a bunch of useful skills though. IMHO as ever ;-)

        Of course sometimes knowing about P=NP, big O notation, etc. is exactly what you need to get the job done.

        When? Academic learning isn't at all bad, but those two examples are a very poor choice. Measuring computational complexity well requires far more than order notation (and perhaps more than graduate-level computational complexity theory). Performance modeling remains a poorly understood and active field of research, and the gains are being made quite slowly.

        So in practice, the concept of "P vs NP" and order notation are largely useless. "Polynomial space/time" explodes quadratically for a polynomial of degree as low as two! The choice of P vs NP boils down to "too slow to be workable" versus "really too slow to be workable". The exception, of course, is when the constants are nice and N is small enough to be workable: but that's exactly what order notation and most computational complexity theory ignore in the first place!

        Personally, I found that while academic learning is interesting, it's rarely useful. It's nice that you can write your own compiler, but your job will involve producing graphs and reports, not writing compilers. And when and if some of that deep, complex academic learning is required, your company will just hire a PhD: so unless you're willing to give your life to CS theory, there's no great benefit to a mere undergrad degree. Perhaps that's why there are so many open source languages: people desperate to find an excuse to write their own compiler, now that they've spent thousands of dollars learning how!

        One guy I worked with was so desperate to do something "academic" with his job that he wrote his own recursive descent parser ... for a configuration language that he invented himself ... for an EBCDIC to ASCII translator ... which only needed a very limited set of options ... and which never actually changed. But hey, he got to be all "academic"; and now I've got a tonne of painfully useless code to untangle if I ever have to maintain his over-engineered monstrosity.

        Back in school, I took a lot of courses in things like multidimensional calculus, vector algebra, and group theory. None of it is terribly useful for producing billing reports and the other assorted drudge work that actually pays the bills. In some sense, I understand why my co-worker decided to waste company funds on his weird design; but I certainly can't condone it.

        In any case, I've been left with a distaste for breathless undergrads, and people who think that "more complicated is better", or people who think "new is better": most of the time, the boring, obvious encoding is the most maintainable encoding, and when it's not, you can at least understand what was done, and slot in your clever little algorithm where it's needed.

        --
        Ytrew

Re^2: Worst blog post ever on teaching programming
by Popcorn Dave (Abbot) on Apr 03, 2006 at 19:47 UTC
    I don't know. Some student problems are the fault of the teachers, but on the other hand, many students are just lazy and don't work hard enough on courses they consider "too difficult".

    You're 100% correct there!

    When I took my first Perl course a few years back, one of the people in the class was balking at figuring out a simple program to calculate the volume of a planet, given the variables needed to do the calculation. And this was one of the first assignments!

    Fortunately, that was an isolated incident and that person dropped from the course within a couple of meetings.

    Useless trivia: In the 2004 Las Vegas phone book there are approximately 28 pages of ads for massage, but almost 200 for lawyers.
