http://www.perlmonks.org?node_id=132589

So there I was, sitting across from this "DBA" who had managed to impress me with the depth of his knowledge regarding MS SQL Server, Oracle, Sybase and Postgres. He chatted on about their various histories, performance tuning, strengths and weaknesses and the little quirks you should know when dealing with each. Now, I can't say that I am much of a database expert, but this guy clearly knew his stuff. Then I asked him about normalization:

Uh, well, I know about the first, second and third normal orders (sic), but I can't say I've done a lot of that stuff.

Now, this wasn't a job interview. It was just someone I happened to casually meet and we were chatting about this stuff. If this was an actual interview, that last comment would have sunk him.

In chatting with our CTO, he tells me that this is something he has seen quite a bit in older DBAs. He claims that they worked in a time when space and CPU time were at a premium and therefore they didn't focus as much on "esoteric" concepts such as normalization. Newer DBAs, he claims, have the luxury of throwing lots of high-performance hardware at a problem and are therefore more likely to be able to appreciate the benefits of normalization. Frankly, I don't know enough about the topic to weigh in on that debate, but I do know that poorly normalized databases make my life miserable as a programmer.

In another example, I was working on a program written before I was born (as of this writing, I am 34 -- no prizes for guessing the language). This program was an atrocious mess. The only subroutines that were used were ones that had been defined externally to the program. For flow control, a mass of goto statements had been used. I was complaining about this to another programmer (who was pushing 70) and he didn't understand my point. goto, in his opinion, was a wonderful tool to be used liberally.

Personally, I tend to mistrust younger programmers, as many of them are hotshots who skipped college (confession time: I only have a two-year degree). They often don't grasp that business considerations carry as much priority as (if not more than) IT considerations. They sometimes think that using "$x=8;$y=$x+++ ++$x;print$y" in production code to set $y to 18 is okay because it's kewl¹. And testing? Testing is for wimps.
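
For the curious, here is that snippet unpacked. The 18 falls out of perl's left-to-right evaluation and the double side effect on $x, which is exactly why it doesn't belong in production code (a sketch of the behaviour I've observed, nothing guaranteed):

    #!/usr/bin/perl
    use strict;
    use warnings;

    my $x = 8;
    # Left operand: $x++ yields the old value (8), then $x becomes 9.
    # Right operand: ++$x bumps $x to 10 and yields 10.
    my $y = $x++ + ++$x;
    print "$y\n";    # prints 18 on the perls I've tried

    # The honest version just says what it means:
    my $z = 18;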

Older programmers, by contrast, are usually more experienced and have more patience, but quite often they are a ways behind the technology curve.

I don't mean to generalize too much. I've known older programmers who are always abreast of the latest technologies and younger programmers who truly care about putting out good production code. Unfortunately, while we search for a new DBA, the generalization seems to hold. Are my expectations too high? Is this frequently a problem with DBAs but less so with programmers or sys admins?

Cheers,
Ovid

1. We had an IS director who used 404 errors to serve images out of a database. He thought this was so kewl, he changed our corporate intranet to use 404s instead of proper URLs. It worked silently and no one ever knew ... until the Web server crashed horribly and lost a lot of configuration information. It took quite a while to figure out what the hell was going on.
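
For anyone wondering how such a hack is typically wired up: you point Apache's 404 handler at a CGI that digs the blob out of the database. A rough sketch with invented script, table and connection names, not his actual setup:

    #!/usr/bin/perl
    # httpd.conf would carry something like:  ErrorDocument 404 /cgi-bin/notfound.pl
    use strict;
    use warnings;
    use DBI;

    # Apache sets REDIRECT_URL when it invokes an ErrorDocument handler.
    my $path = $ENV{REDIRECT_URL} || '/';

    # Hypothetical schema: images(path, mime_type, data)
    my $dbh = DBI->connect('dbi:Oracle:intranet', 'www', 'secret', { RaiseError => 1 });
    my ($mime, $data) = $dbh->selectrow_array(
        'SELECT mime_type, data FROM images WHERE path = ?', undef, $path);

    binmode STDOUT;
    if (defined $data) {
        print "Status: 200 OK\r\nContent-Type: $mime\r\n\r\n", $data;
    }
    else {
        print "Status: 404 Not Found\r\nContent-Type: text/plain\r\n\r\nnot found\n";
    }

The fragility is built in: every image on the site now depends on the error handler and the database being healthy, and nothing in the URL space even hints that this is happening.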

Join the Perlmonks Setiathome Group or just click on the link and check out our stats.

Re: (OT) Old blood versus new blood
by bmcatt (Friar) on Dec 17, 2001 at 22:57 UTC
    To step away from talking about normalization (which I'm sure some people will talk about), I think there's the (correctly identified) larger issue of different reactions to different techniques, etc.

    Consider the current "fad" (and I say that without meaning to imply that it's wrong) of "don't optimize until you need to". Well, years ago, when memory and cycles were tight, the answer to "When do I need to?" was usually "from the beginning." These days, cheaper memory, broader I/O, and faster CPUs mean that we tend to write for readability and optimize only when we absolutely can't avoid it.

    I think there needs to be a happy medium between "old blood" and "new blood". More experienced developers (and designers, engineers, admins, etc.) bring the knowledge of what else to think about and how to think through a problem. Less experienced (but probably more current) developers, etc., bring newer approaches. I think the major conflicts arise when the two camps are unwilling to see the benefits that the other camp brings.

    I can't even begin to count the number of times that I've seen someone start tearing off on an OO implementation of something without thinking through what they're doing, only to need to stop in the middle and completely redo their object design. (Heck, I've done it myself when I discovered I overlooked something.) Someone more experienced in OO design, on the other hand, would make sure they got closer to "right" before starting to implement.

    The flip side, though, is that someone who's steeped in OO doesn't have the same understanding that functional-oriented programming isn't evil and, when done properly, provides many if not most of the benefits of OO. I was doing cohesive and functionally complete "modules" in assembler and C (and lots of other languages that aren't that relevant these days :-) before I ever heard the word "object". It just made sense to make the pieces of a program distinct bits so that tweaks to one didn't automatically ripple through the entire structure.
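
    What I mean, in the local dialect: a package with a narrow interface and its data kept to itself gives you most of the encapsulation payoff without a single object. A toy sketch (names invented for the example, nothing from a real project):

        package Counter;     # a cohesive, non-OO "module"
        use strict;
        use warnings;

        # File-scoped lexical: code in other files can only reach it through the subs below.
        my %count;

        sub record    { my ($key) = @_; $count{$key}++; return; }
        sub total_for { my ($key) = @_; return $count{$key} || 0; }

        package main;
        Counter::record('widgets');
        Counter::record('widgets');
        print Counter::total_for('widgets'), "\n";    # 2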

    To point back a little to your DBA - having not done much normalization, he may not have understood the reasons that it makes a difference to a developer, architect, etc. It would have been a perfect opportunity to teach him why normalization is important. Show it from the perspective of:

    • scalability and performance - "Well, sure, we can blaze through with full-table scans, but if we had things properly indexed and normalized, we could get by with much faster index lookups and, instead of being able to support 20 simultaneous hits, we can scale up to 2000."
    • maintainability - "Take a look at how much code needs to be written to support this. If it were normalized, we could cut it down to this much, which, as you can imagine, becomes much easier to maintain and much less likely to have lots of bugs in it." (See the sketch after this list.)
    • Pick your own perspective here
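
    To make the maintainability point concrete, here's a tiny hypothetical (table and column names made up): with the customer address repeated on every order row, a change of address is a multi-row UPDATE you have to remember to do everywhere; normalized, it's one row in one table.

        #!/usr/bin/perl
        use strict;
        use warnings;
        use DBI;

        my $dbh = DBI->connect('dbi:Pg:dbname=shop', 'app', 'secret', { RaiseError => 1 });

        # Denormalized: orders(order_id, cust_name, cust_address, item, ...)
        # Every order row repeats the address, so a change of address touches them all:
        $dbh->do('UPDATE orders SET cust_address = ? WHERE cust_name = ?',
                 undef, '12 New St', 'Acme Corp');

        # Normalized: customers(cust_id, name, address) plus orders(order_id, cust_id, item, ...)
        # The same change is one row, and an index on orders.cust_id keeps the join cheap:
        $dbh->do('UPDATE customers SET address = ? WHERE cust_id = ?',
                 undef, '12 New St', 42);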

    I firmly believe that it's important to use any opportunity to instruct. Otherwise, how will the new blood come to understand that the old blood has valuable lessons to teach (and is also willing to learn)?

    So, in summary, yeah, there's lots of "new blood" that don't respect (or pay attention to) the more experienced folks. But there's enough of the opposite that everyone winds up losing (imho).

Re: (OT) Old blood versus new blood
by VSarkiss (Monsignor) on Dec 17, 2001 at 23:11 UTC

    I've found that the definition of what a DBA does varies from shop to shop. To be precise, however, administration of a database (particularly a complicated beast like DB2 or Oracle on MVS, things of that sort) is not the same as administration of the data model or the data within the database. I've done both tasks, and the degree of overlap varies from place to place.

    Nonetheless, I would contend that normalization is really part of a data architect or data modeler's job, not a DBA's. Generally the DBA must contend with resource management: space, CPU, permissions, and so on. OTOH the data modeler has to contend with business rules, data integrity, and "turning data to information". The best results come from the two of them working together (or by one person doing both), but they're not really the same task.

    As to the larger issue, well, I regularly work with people who are younger than the first program I ever wrote. Often I end up teaching them about new technology like XML or Perl (;-). I've always said the issue is not age, but how willing a person is to learn.

Re: (OT) Old blood versus new blood
by Rex(Wrecks) (Curate) on Dec 17, 2001 at 23:11 UTC
    Hmm, this is an old "fossils vs. young pups" debate. Unfortunately, the generalizations are often true.

    The only real answer I have ever found is to take each person for who they are and what skills they actually have. Good interviewing skills on your part go a long way to helping this.

    I would also like to address the "hotshots" observation. I have seen this by-product as well; however, it is usually very easy to weed out in an interview. I generally have a mistrust of people who are fresh out of school with a shiny new BS in CIS (confession time: I only have an AS). Most of these "developers" have almost no clue how software is developed in the real world (see (OT) Where is programming headed? for a sample of this).

    In many cases the self-taught, undereducated person is a much better resource, as they have already made many of the real-world discoveries.

    All that said, I think the key to it all is finding the balance of curiosity-driven skills and the education that is required. It's this balance that makes a good programmer: all the education in the world is useless without the practical skills found in the real world, and all the practical skills turn into ever so much spaghetti code without some educational enforcement.

    I understand and believe there are exceptions to the rules, but generally I interview with the goal of finding this balance. Someone right out of school CAN have this balance, but it will generally only be the person who has done things OUTSIDE the curriculum.

    "Nothing is sure but death and taxes" I say combine the two and its death to all taxes!
Re: (OT) Old blood versus new blood
by perrin (Chancellor) on Dec 18, 2001 at 00:43 UTC
    I think the bottom line in your quest for a new DBA is that it's hard to find people who have a broad understanding of the issues surrounding any given technology. I often meet DBAs who can't design a schema, because they've spent all of their time tweaking tablespace allocations, implementing backup tools, etc. Many people in IT have a very narrow world view, and I think it's particularly bad for DBAs. In my current job, I have discovered that I am often the only person on the team who bothered to read the manuals for tools we're using. I make suggestions that come straight from page 36, and people think I'm some kind of Oracle guru. It's a bit disturbing, especially since they still won't read the manuals.

    Anyway, I think it's much better to hire people who have a general understanding of at least the technologies they are responsible for. I would ask a DBA candidate questions about normalization, deadlocks, isolation levels, scaling strategies, and other fundamental database issues before I got into how to tune SQL Thingy v.90210 on VMS.

Re: (OT) Old blood versus new blood
by dws (Chancellor) on Dec 18, 2001 at 02:45 UTC
    Now, I can't say that I am much of a database expert, but this guy clearly knew his stuff. Then I asked him about normalization ...

    As mentioned, the problem with the label "DBA" is the definition of the DBA role depends a lot on the organizational context. I don't think that it's entirely reasonable to take the position that every DBA should know the details of normalization. That depends on where you draw a line.

    I've worked in shops where the DBA worked in Operations, but the developer doing schema design worked in R&D. The DBA worried about the physical upkeep of the system, while the developer worried about conceptual integrity. I've also worked in shops where "DBA" meant "everything to do with that icky database thing, which nobody else wants to touch for fear of their mortal soul." The trick is to be clear about where you're drawing the line between logical and physical design.

    In chatting with our CTO, he tells me that this is something he has seen quite a bit in older DBAs. He claims that they worked in a time when space and CPU time were at a premium and therefore they didn't focus as much on "esoteric" concepts such as normalization.

    This generalizes past the DBA. A lot of people get stuck in whatever was hot when they first learned. And some of the people who manage to move past that still fall back on what once was when they're under pressure. And some people have been around through enough fad cycles to realize that what's hot today probably won't be in a few years. That's one of the benefits of experience.

Re: (OT) Old blood versus new blood
by poqui (Deacon) on Dec 17, 2001 at 22:33 UTC
    Re: the relative importance of Normalization...
    It also depends upon which application you are working on.
    For instance, normalization is wonderful and important in OLTP, but it loses its importance in Data Warehouses, where the extra joining required to find related data is frequently too expensive.
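
    To illustrate the trade-off (with made-up table names): the normalized OLTP shape needs three joins to answer a simple sales question, which is exactly the work a warehouse-style denormalized fact table avoids.

        use strict;
        use warnings;

        # Normalized (OLTP) shape: three joins to answer "revenue by region since a date".
        my $oltp_sql = q{
            SELECT r.region_name, SUM(oi.qty * oi.unit_price)
              FROM order_items oi
              JOIN orders    o ON o.order_id  = oi.order_id
              JOIN customers c ON c.cust_id   = o.cust_id
              JOIN regions   r ON r.region_id = c.region_id
             WHERE o.order_date >= ?
             GROUP BY r.region_name
        };

        # Warehouse shape: the fact table already carries region_name, so one pass does it.
        my $dw_sql = q{
            SELECT region_name, SUM(revenue)
              FROM sales_fact
             WHERE order_date >= ?
             GROUP BY region_name
        };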

    But, that said, I kind of agree with your CTO; I have seen that same general trend, but it's not *that* strong.

    "That that is, is... for what is that but that? and is but is?" Shakespeare, Twelfth Night, Act IV, Scene 2

    "Yet THAT which is not neither is nor is not That which is!" Frater Perdurabo (pseud. Aleister Crowley), Liber CCCXXXIII, The Book of Lies
Re: (OT) Old blood versus new blood
by mpeppler (Vicar) on Dec 18, 2001 at 00:34 UTC
    To add my two cents...
    A DBA who doesn't know (much) about normalization isn't (necessarily) a bad DBA - in general a DBA is more concerned with space management, index tuning, backups and similar tasks. Also keep in mind that while the guy (or gal) might not really know the exact definitions of the 1st, 2nd and 3rd normal forms, s/he will very likely understand them intuitively - you can't be a good DBA without getting exposed to normalization issues in one way or another.

    In several shops where I've worked, the DBA(s) often didn't really know the data model. They knew which tables tended to grow fast, which tables tended to cause locking problems, etc., and how to intervene to solve those specific problems.

    Michael