PerlMonks  

Re^2: Nobody Expects the Agile Imposition (Part VI): Architecture

by BrowserUk (Pope)
on Jan 25, 2011 at 13:11 UTC ( #884110=note )


in reply to Re: Nobody Expects the Agile Imposition (Part VI): Architecture
in thread Nobody Expects the Agile Imposition (Part VI): Architecture

IBM who saved itself by making the transition from type-writers to computers and business IT design in the late 80's and 90's

Oh dear!

Truth be told, a large part of NT is actually borrowing from *nix

Oh dear, oh dear.

If you are going to write authoritatively about history, it would really be better if you actually knew something about it.


Examine what is said, not who speaks -- Silence betokens consent -- Love the truth but pardon error.
"Science is about questioning the status quo. Questioning authority".
In the absence of evidence, opinion is indistinguishable from prejudice.


Re^3: Nobody Expects the Agile Imposition (Part VI): Architecture
by Corion (Pope) on Jan 25, 2011 at 13:17 UTC

    Actually, IBM restructured itself (and thereby, I guess, saved itself) by moving from a hardware+OS vendor (OS/360) to a consulting company (see also its purchase of PwC Consulting), so that part isn't that far-fetched.

      I was working for them when the transition occurred. But it certainly wasn't from typewriters to computers.

        That comment was a rhetorical oversimplification ("typewriters to computers") meant to encompass not just the 1980s/90s transition into software and systems consulting, but also a much earlier transition - from mechanical office equipment, punch card machines, and tabulating equipment to being one of the early vendors of mainframe computers (in the 60's).

        IBM wasn't always a computer company. It was started in the 1910's as a consolidation of companies making all manner of mechanical equipment - punch card and tabulating machines (mechanical, not electronic) to be sure, but also meat grinders and scales. By the 1930's they were selling typewriters as well.

        In the 1960's there was a massive restructuring and redirection as the company became increasingly concerned that its mechanical tabulating equipment was going to be overshadowed by the newer electronic computing equipment. Although they had been involved in electronic computers long before that, it was not their core business until after the 1960's restructuring.

        For those outside of the computing world, IBM's reputation in the 70's and 80's came from its Selectric typewriter. According to Wikipedia, at its peak the IBM Selectric had a whopping 75% of the typewriter market.

        I think we take IBM's involvement in computers so for granted these days that we fail to realize that it had to make a major internal strategic transition to get there. Not every company has been so successful, even when it created the opportunity in its own backyard. Xerox muddled around with the graphical software and operating system ideas that eventually resurfaced in the Apple Macintosh because, even though image production (the Xerox copier) was a core product line, it just couldn't see the market for a personal computer with pretty pictures and a mouse. According to the book I read on the history of their research center (PARC), they weren't even all that sure there was much of a market for a personal computer. They also feared it would distract attention from other product lines that were important to them at the time.

        As for IBM, by encouraging the mainframe business they killed off their mechanical tabulating business. By being early vendors of the personal computer they killed off their typewriter business and the typewriter market as a whole. Two core product lines from the 1940's - gone. It may be an oversimplification, but as a summary of IBM's history of reinvention, I don't think "typewriters to computers" is all that wrong.

Re^3: Nobody Expects the Agile Imposition (Part VI): Architecture
by ELISHEVA (Prior) on Jan 25, 2011 at 15:09 UTC

    The comment about NT and *nix - that was based on my memory of press reports at the time it was being developed. If I recall correctly they originally wanted to do a green field system and then found that they had to borrow certain parts of the *nix architecture - what exactly I don't remember. I know many of the developers came from DEC, but the few things I'm finding on the web focus on the VMS influence. Business press reports on technology often get it wrong, so I might be remembering someone reporting the DEC hirings and just assuming it was DEC UNIX rather than VMS that ended up in NT.

      I also got the early history of NT wrong going from memory. That won't stop me from continuing though and I'm sure BrowserUk will correct me if I'm wrong again. :-)

      I tried some googling just now but couldn't find any citations to support a couple of anecdotes I remember reading about years ago. I would have read these in a book, not the press, but can't remember which one. One was that Dave Cutler was a very "passionate" guy - so much so that on one occasion he punched a hole in the wall when one of his junior programmers disobeyed him and made a "safe" last-minute change that broke an OS release. Apparently, the hole he punched in the wall was later cut out, framed, and mounted on the wall of Dave's office.

      I also remember reading that he was pretty passionate about not liking Unix very much. One quote I remember was that Win32 will never have Unix-like signals because Dave shouted "signals are a crock". BTW, I'm not a fan of signals either, and they certainly don't mix very well with threads. Another anecdote I remember was Dave responding angrily when one of his team said "Unix did it this way": Dave responded with something like "Just because Unix did it this way, it doesn't mean it's right". Anyway, the overall impression I got was that Windows NT was strongly influenced by the (non-Unix) Digital operating systems that Dave and the other Digital guys hired by Gates had worked on.

      Apart from the history and the anecdotes, the Win32 API "feels" very different to Unix -- for example, compare and contrast the many complex parameters and sub parameters of the Win32 CreateProcess call with Unix fork and exec. Generally, Unix system calls have far fewer parameters than Win32 ones.

        one quote I remember was that Win32 will never have Unix-like signals because Dave shouted "signals are a crock".

        I've also heard that one. And from memory, it was related to me by someone (who claimed to be) in the meeting at the time - which is about as good a definition of hearsay as I can think of - but it did reinforce my long-term impression that, if anything, NT went out of its way to avoid borrowing anything at all from Unix.

        There are obviously many things that every OS has to have in common. And many more that are so obvious that when independent teams set out to solve the same problem, similar solutions are bound to arise.

        Hence, virtual memory, developed by IBM (as Virtual Storage) for OS/VS and MVS, is an integral part of any modern OS, including *nix and NT, but it would be churlish to say that they borrowed (often written as "stole") the concept from IBM.

        BTW, I'm not a fan of signals either and they certainly don't mix very well with threads.

        One analogy I read about signals: imagine what would happen if your car just instantly stopped, frozen in place - mid-corner or wherever - when your mobile phone rang. Because that's what signals do to your code. And that kind of speaks to the fundamental difference between *nix and NT.

        for example, compare and contrast the many complex parameters and sub parameters of the Win32 CreateProcess call with Unix fork and exec.

        The NT kernel was written from the ground up to be multitasking. The entire kernel was written to be reentrant. That is, all of the state associated with a given process (actually, thread) is maintained within the auspices of that entity, or keyed by a handle specific to that entity. It is effectively object oriented, albeit the restrictions of C mean that it doesn't use object syntax at the API level, so you have to pass the object reference (handle) to the relevant APIs manually.

        This shows up (at the kernel API level) in such things as the pids, tids and fids, which are available but are mostly not used at the API level, where opaque handles are used instead.

        Another example is memory buffers. In *nix, these are routinely allocated by the kernel and passed back to the application program. The Win32 API forces the application program to allocate the memory for the buffer and pass its address into the kernel.

        The advantages of this approach really show up when writing code that will live in a dynamically linked library, where the same code may be executing concurrently in the auspices of many threads and many processes.


