PerlMonks  

Re^3: On Quality

by adrianh (Chancellor)
on Jul 26, 2005 at 21:39 UTC [id://478377]


in reply to Re^2: On Quality
in thread On Quality

You Aren't Going to Need It: Unless, of course, you are.

Well, then you do need it, don't you, and it isn't a situation where YAGNI applies :-)

The whole point of coding it beforehand is that later you won't have time to fiddle with coding and testing; whereas right now, you do.

Right now I have a set of features to implement. Some the customer needs now. Some the customer needs later. YAGNI is all about doing stuff the customer needs now first. That way I end up spending time coding and testing features the customer actually needs, rather than spending time coding and testing features that we don't need until some indeterminate date in the future - by which time the requirements may have changed anyway.

Once And Only Once: Is rarely the simplest thing to do, because it requires abstraction. Abstracting away from the business requirements is one more thing that can go wrong.

Removing duplication abstracts away from business requirements? Usually the opposite in my experience.

Having, say, a global variable or class default that has to be tracked down is often harder than just calling the functions explicitly, without implicit values being passed around.

Sorry, I just don't follow what you're getting at here. Removing duplication invariably makes things clearer in my experience, so I think we must be talking about different things.
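To make concrete the sort of duplication removal I mean, here's a made-up sketch (the VAT rate, field names, and function names are all invented for illustration):

```perl
use strict;
use warnings;

# Before: the tax rule is written out inline, so a rate change
# has to be hunted down everywhere the expression appears.
sub invoice_total_dup {
    my (@items) = @_;
    my $total = 0;
    $total += $_->{price} + $_->{price} * 0.175 for @items;
    return $total;
}

# After: the rule lives in exactly one named place. Nothing got
# more abstract - the business rule simply gained a name.
sub with_vat { my ($price) = @_; return $price * 1.175 }

sub invoice_total {
    my (@items) = @_;
    my $total = 0;
    $total += with_vat( $_->{price} ) for @items;
    return $total;
}
```

Which version is clearer when the VAT rate changes next year?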

Worse still, once you move your code around, from, say, a nested if cascade to a hash table lookup with coderefs, in a month the business requirements will require that you move to a different model of abstraction to handle a new problem. Now you've got to go back to the old code again... which is needless repetition of coder time.

I don't see how moving from if/then/else statements to a table lookup is removing duplication?
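For clarity, here's a sketch of the transformation I take you to be describing (the command names are made up). It's a restructuring, but no duplication disappears:

```perl
use strict;
use warnings;

# Before: dispatch on a command name via an if/elsif cascade.
sub run_cascade {
    my ( $cmd, @args ) = @_;
    if    ( $cmd eq 'add' ) { return $args[0] + $args[1] }
    elsif ( $cmd eq 'sub' ) { return $args[0] - $args[1] }
    elsif ( $cmd eq 'mul' ) { return $args[0] * $args[1] }
    else                    { die "unknown command '$cmd'\n" }
}

# After: a hash of coderefs. Adding a command is one line,
# and the dispatch logic itself never changes.
my %dispatch = (
    add => sub { $_[0] + $_[1] },
    sub => sub { $_[0] - $_[1] },
    mul => sub { $_[0] * $_[1] },
);

sub run_table {
    my ( $cmd, @args ) = @_;
    my $handler = $dispatch{$cmd} or die "unknown command '$cmd'\n";
    return $handler->(@args);
}

print run_table( 'add', 2, 3 ), "\n";    # 5
```

Both versions contain exactly one copy of each rule, which is why I wouldn't call this duplication removal.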

A decent search and replace, applied intelligently (or better still, a language-specific refactoring browser), can make mass changes at a single stroke: without the added confusion of implicit values lurking in the background.

In my experience people waste far more time dealing with bugs related to duplication in code than they would save by avoiding refactoring the duplication out.

Refactor Mercilessly: This is a pipe dream. Coder time is very, very expensive. This is one of those comfortable academic ideas that doesn't make much business sense: doing a lot of work on a program that, if done correctly, will result in a product that does exactly the same thing as it did before, and if done wrong might ruin the product.

It's not a pipe dream since lots of people do it with a fair amount of success.

Short term expense, long term profit. If you are refactoring mercilessly it's a very short term expense since it's a background task that you're doing all of the time. Refactoring only gets expensive if you let the code get messy. Clean the kitchen after every meal, not once a month.

(which reminds me - I need to do the washing up :-)

Test First: This is a great concept. It doesn't work in most business settings, though.

I know lots of business coders, myself included, who'll disagree with you there :-)

In order to test, you need to know exactly what you're testing, and how. To properly test a section of the program, you need to define what that section does and how it does it (all preconditions, postconditions, side effects, etc.).

That is, as it were, the point. You use the tests you write to define the requirements for the code that you write. It works really well.
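A minimal test-first sketch, using Test::More (parse_money and its behaviour are made-up examples, not anything from the thread):

```perl
use strict;
use warnings;
use Test::More tests => 2;

# Written *before* the implementation: these tests define the
# requirements for parse_money().
is( parse_money('$1,234.56'), 1234.56, 'parses dollars and cents' );
is( parse_money('$0.99'),     0.99,    'parses sub-dollar amounts' );

# The implementation then only has to make the tests pass.
sub parse_money {
    my ($s) = @_;
    $s =~ tr/$,//d;    # strip the dollar sign and thousands commas
    return $s + 0;
}
```

The tests are the specification: "what does this section do" is answered by running them.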

Tight Feedback: is costly. If you have the money to burn, it may or may not be profitable.

Is it more expensive to find your code fails tests now or next week when QA gets it? Is it more expensive to know the users hate a feature now or after the manuals are printed? Is it more expensive to know that the code you're writing now duplicates a feature Bob wrote last week as you start, or two weeks later in the code review?

Tight feedback loops save money. In spades.

Introspection: is good when it works, and a complete waste of time when it doesn't pay off. Best reserved for people with the actual power to change things. Usually, the real problem is: "we don't have enough money/resources/manpower to solve this problem correctly"

Often resources aren't the problem. It's resources being badly applied, usually because of foolish project management practices and overcomplicated development methodologies. Give me a well organised group of 12 developers over a badly organised 120 any day of the week.

Transparency: again, this typically generates better code, at a cost of time (and money). Worth it in most cases, but management may be hard to persuade.

I have to admit I've got to that stubborn age when I'm going to dig out the people in charge and shake them until they listen :-)

Writing good code is a trivial exercise: any half-decent coder can learn to do it.

I have to disagree. Writing good robust code is damn difficult. I've been a professional programmer for more than half my life now and I'm still finding new and better ways to do it.

If it's so damn trivial why do so many people bugger it up on such a regular basis?
