http://www.perlmonks.org?node_id=645261

Since my previous meditation seemed to upset some people, here's more serious pondering.

This is not the first node on the merits of Object-Oriented Programming (or Object-Oriented Paradigm), and certainly not the last. Interested monks can read The world is not object oriented, Damian Conway's ten rules for when to use OO, and Coding styles: OOP vs. Subs for starters.

I dislike the object-oriented approach

Before proclaiming my dislike, I ought to explain what I refer to when I say object-oriented, since it has many different interpretations: I mean the approach to programming that touts objects and classes, encapsulation of state and behaviour, inheritance, and message passing between objects as the primary means of structuring a program.

I'm sidestepping the issues of how these are implemented, what kind of message-passing or inheritance mechanism is used, and certainly the syntax involved.

However, it's not the components of the approach that are icky; it's the mindset taken as a whole. In particular, I claim that

Although dealt with here separately, these are all aspects of the same thing.

Separation of concerns

An expression popularized by E. Dijkstra, separation of concerns means decomposing the problem into smaller conceptual pieces, disentangling it so that you can attack one part of the problem at a time without worrying about the rest just yet. It's a general problem-solving strategy, but it has particular importance in programming.

If you forget for a while (separation of concerns in action) that source code or program text is executable on a machine, a program really is a document from one human to another describing how to solve the given problem. The solution is written in a formal language to be as explicit and precise as needed, though unfortunately often in terms of an (abstract) machine. It's information.

In order to understand the steps taken to solve the problem or to understand the solution, there has to be a way to encode this information in such a way that you don't have to think about several different things at a time, but that you can concentrate on understanding one component of the solution, then move to the next. If you are doing statistical analysis on a set of data, you certainly need not and should not think about computing the correlation coefficient or plotting the values on the screen when you are retrieving the data from, say, a database. They are different concerns entirely.

Since computer programs have no real physical limitations (of the kind material objects have), it's absolutely essential to prevent yourself from making a mess of it. You, the programmer, have to actively resist the temptation to just connect parts of the program criss-cross because it seems convenient at the time. All current programming paradigms try to assist and even enforce separation of concerns, including object-oriented programming. The strategy is to contain different parts of the solution in separate units, modules, of which more below.

Modularity is extremely good

Ignoring trivial programs, modularity is necessary to both understand a problem and find the solution. Modularity at the source code level means building your solution from blocks or units that ideally solve that one part of the problem: do only one thing, but do it well. The whole solution is then built by connecting the different modules together.

Although there is often a correspondence between concerns and modules, the mapping is not bijective. Often one concern needs several modules, due to the magnitude of the problem, and sometimes one module can address several concerns. Bearing in mind again that the program text can be executed by a computer, which can then find solutions to instances of the problem, there may also be engineering reasons why dividing modules into smaller submodules is necessary.

An important aspect of modules is that ideally they are replaceable and reusable. Replaceable: if you find a better way to solve a particular subproblem, you can replace an existing module without having to modify the rest of the program. Reusable: once you have solved a subproblem that is general enough, that is, is bound to recur, you can reuse the module in the next problem. More about these below when we discuss interfaces.

A fundamental building block people often overlook is procedural abstraction: procedures or functions. They can and should be considered modules in and of themselves, because when written correctly, they are replaceable and reusable, and they abstract one small part of the solution behind a good name. (This applies to any programming language that has procedures or functions, not just functional programming languages.)

Another important aspect of modules (including functions) is parameterization. That is, parts of the module's behaviour can be abstracted into parameters to that module, parameters that can depend on the problem instance. Not only does this add reusability, but in all modern programming languages you can also give modules as parameters to other modules (in one way or another), which enables you to separate concerns into (almost) orthogonal categories.
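
For instance, a minimal sketch in plain Perl (reduce_list and the combiners below are made up for illustration): the same piece of code serves entirely different purposes depending on the function it is handed.

    use strict;
    use warnings;

    # reduce_list is parameterized by the combining step itself, not just
    # by the data it starts from.
    sub reduce_list {
        my ($combine, $initial, @values) = @_;
        my $accumulator = $initial;
        $accumulator = $combine->($accumulator, $_) for @values;
        return $accumulator;
    }

    my $sum     = reduce_list(sub { $_[0] + $_[1] }, 0, 1 .. 5);   # 15
    my $longest = reduce_list(
        sub { length $_[1] > length $_[0] ? $_[1] : $_[0] },
        '',
        qw(foo quux ab),
    );                                                             # 'quux'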

When you have separate, decoupled modules, becoming certain that they work correctly, or, if you are bent that way, proving that they are correct, becomes much easier. You have fewer cases to consider, fewer interactions between different parts, and simply less to take into account.

Decoupling interface from implementation is vital

The interface to the module is arguably even more important than the implementation behind it. This is a commonly repeated phrase in the Perl community.

The interface to the module means the connection points that you can use in combining the module with other modules. The interface abstracts functionality away; ideally you need absolutely no knowledge of how the module works, as long as you know what it does. A bad interface depends directly on implementation details, so that when they change, the changes propagate to the program outside the module. A good interface seals the insides of the module from the outside world. An excellent interface is transparent: like superb quality hi-fi speakers, you cannot hear the speakers when you listen to music; similarly an excellent interface is concise yet complete, and deals directly with the right abstractions.

This decoupling is important not only when making changes to modules (when you can concentrate on changing exactly one part of the program), but also in helping separate what from how. This cannot be stressed enough, though you are probably tired of hearing it. It makes it conceptually easier to understand what the module does: it can only receive and send data through the interface, and what is not there cannot add to the complexity.
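
As a small sketch of the idea, assume a hypothetical WordTally module: its interface is two subroutines, and the hash behind them is nobody else's business.

    package WordTally;
    use strict;
    use warnings;

    # The interface: add_text() and count_of().  Callers never learn how
    # the tally is stored.
    my %tally;

    sub add_text {
        my ($text) = @_;
        $tally{ lc $_ }++ for $text =~ /(\w+)/g;
        return;
    }

    sub count_of {
        my ($word) = @_;
        return $tally{ lc $word } || 0;
    }

    package main;
    WordTally::add_text('the quick brown fox jumps over the lazy dog');
    print WordTally::count_of('The'), "\n";   # 2

The hash could later become a DBM file or a database table; as long as add_text and count_of keep doing what they promise, no caller needs to change.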

On the need to abstract

Abstracting is always necessary when programming, because the computer, by its discrete design, can model the real world only very poorly (and this is often not even what is wanted). You need to find a way to express the key concepts and their relations of the problem in a way that yields itself to encoding the concepts and relations in a programming language.

However, an equally important part of abstracting is to be able to distill the key concepts from the problem in the first place. Abstractions are necessary in reducing complexity, but used poorly, they will instead add complexity. The simpler way should always be preferred, and information that is not needed should be completely ignored.

A good example, as told by E. Dijkstra (in one of his EWDs, forgive me the lack of link), is dealing with synchronization issues in a multitasking environment: several threads or processes that share resources. Even if you have knowledge of the relative running times between instructions, so that you could perhaps time access to the shared resources based on this, the problem becomes much easier if you forget about the time information and focus only on sequential access to the resource -- a process may access a resource before or after another process, and that's it. (Although this is taken for granted nowadays in the problem domain, the situation was different in the 1960s. According to Dijkstra, many people objected to throwing away valuable information.)

This is actually hard to do in general in my experience. It takes much practice and thinking to find the right concepts, but this is a topic for another day.

The problems in OOP

Object-oriented programming deals with all of the issues above in the following way:

A more detailed description of object-oriented design and principles is beyond the scope of this meditation.

Now, my beef with OOP is the following:

  1. Its anthropomorphic terminology encourages sloppy, operational thinking.
  2. Modelling every concept as a class adds conceptual overhead.
  3. The class is advertised as the only means of abstraction you need.

In other words, what might otherwise be a good and universal way to decompose and model a problem, if suboptimal in some instances, is botched by three rather irritating things.

Anthropomorphic terminology -- sloppy thinking

First, anthropomorphic terminology: we have entities and objects ("that guy" and "this guy") who send messages to each other ("this guy sends the packet to this one"). While this is not a requirement in OOP, it is inherent to the way you are supposed to think about problems: in terms of objects and messages.

Not only is this inelegant, there are two serious problems it creates:

  1. It guides you towards finding entities where there are none.
  2. It guides you towards operational thinking.

In other words, both ingredients in the recipe for sloppy thinking.

Trying to find entities where there are none is perhaps excusable. After all, arguably it is often easier to see the problem as a bag of co-operating entities. Perhaps this can be attributed to us being prone to seeing ourselves in everything (which ranges from mistaking a tree branch in a dark forest for a human being to projecting our desires on an amoeba).

Trying to model the system or problem as a set of entities is particularly useful in some domains, such as the frequently cited graphical user interface. Sometimes there is simply a natural fit.

However, this is a sad price to pay, because anthropomorphic terminology leads to operational thinking. By operational thinking, I mean trying to understand a program in terms of how it is executed on a computer. Usually this involves mentally keeping track of variables and their values, following one statement after another, checking what it does to the data, and doing case analysis with if-then-else blocks. You are knee-deep in problems once you start trying to understand loops operationally (does it terminate? will the loop counter always stay inside the bounds?).

An even more complicating factor is that OOP guides you towards thinking about interactions between classes, or rather objects, operationally, that is, what will happen at runtime. Due to inheritance, the number of cases simply explodes, as instead of having an object of one type, in its place could equally well be another, similar object, whose class inherits from the class of the first object. Does this matter at the point where the object is used? Maybe. Good luck with trying to check that operationally. Not only that, but there is a proliferation of small, unrelated internal states: the states of the objects. It's simply too much to hold in one's head, even if you can only access the internal state through the interface.

Perhaps there is a way to reason about the correctness of object-oriented programs in a non-operational way, but I have yet to see it.

Conceptual overhead

Object-oriented programming includes an idea that concepts in the problem domain, be they entities or relations, should be modelled directly as classes. So far so good, but this is a tricky path to travel.

Although you theoretically can map almost any entity to a class, there are many cases where you simply should not. Consider keeping track of balls that have ended up in the lake on an amateur golf course. According to object-oriented thinking, the natural way to model this problem is to create a class for the golf ball, and another "modelling" the lake, a container class for the lost golf balls. Instantiations of the golf ball class, golf ball objects, would be added to the container, and then simply asking the container how many balls it contains gives the answer.

However, if there is nothing more we need to know about the golf balls or the lake, the entire concept can be modelled with a single integer. If there is no need to distinguish one golf ball from another, they can all be considered identical and treated identically.
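
To make this concrete, here is roughly what the two models look like side by side (the method names are only illustrative):

    use strict;
    use warnings;

    # The object-oriented caricature: a class per concept.
    package GolfBall;
    sub new { return bless {}, shift }

    package Lake;
    sub new      { return bless { balls => [] }, shift }
    sub add_ball { my ($self, $ball) = @_; push @{ $self->{balls} }, $ball }
    sub count    { my ($self) = @_; return scalar @{ $self->{balls} } }

    package main;
    my $lake = Lake->new;
    $lake->add_ball( GolfBall->new ) for 1 .. 3;
    print $lake->count, " balls in the lake\n";

    # The same information as one integer.
    my $balls_in_lake = 0;
    $balls_in_lake++ for 1 .. 3;
    print "$balls_in_lake balls in the lake\n";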

The example is trivial and also a caricature, yet this kind of thinking seems pervasive among OOP proponents. Note in particular that I am not talking about "implementation overhead" or any notion of the former being slower to execute than the latter. I am talking about conceptual overhead, which is strongly related to separation of concerns and abstraction, particularly finding the key concepts. Having "concrete" golf balls in a "concrete" container object gives perhaps a fuzzy feeling of having created something easy to grasp, but this is an illusion.

The golf balls in this example are identical, and can be treated identically. In fact, there is nothing to distinguish one golf ball from another. We should definitely never introduce such distinctions in the source code, which is a conceptual model of the problem (forget about execution again). As it is a model, and there is supposed to be some goal in making the model in the first place, excluding unnecessary detail will not only make the model simpler and more elegant, it will also make solving the problem much, much easier.

You may snicker at the silly example here, but in my experience extraneous classes are incredibly common. (I'm no better, to be honest, but I try to learn.)

The only way

Limiting yourself to OOP alone may seem like a mistake of your own making, but object-oriented thinking generally advertises that the class is the only means of abstraction you need. Let us assume for a moment that it is.

While the class is a useful tool in modelling entities, and even in modelling relations, there are frequently cases in my programming where a full-blown object is simply unnecessary: a simple function would do. This is considered bad habit in OOP, since all your abstractions are supposed to take the form of classes.

A real example: I have a small framework that does statistical tests on data. The similarity between two data objects (which are not objects in the object-oriented sense, but can be, for example, numbers or arrays of numbers) depends on the data I am analyzing, and sometimes I want to use different similarity metrics on the same data.

The object-oriented way would be to create a class encapsulating the idea of similarity measure, perhaps into a superclass called SimilarityMetric, which has a public method that, given two data objects (say, numbers), returns their similarity. The class SimilarityMetric is abstract in the sense that by default it defines no similarity metric; it is the responsibility of inherited classes to refine the similarity concept. For instance, you might have a simple class called AbsoluteDifference that would base the similarity on the absolute difference of two numbers. Then, you would instantiate an object from a concrete class, inherited from SimilarityMetric and sharing the same interface, and give it as a parameter to the framework.

The more sensible way, if only your programming language has the means (and Perl does!), is to simply give a function to the framework. While this may seem ad hoc, it need not be. As long as the framework demands that the function accepts certain kinds of input parameters and returns, say, a numeric value as a result -- that is, defines the interface that the function must have -- then this model is equally well modularized. It is also conceptually considerably simpler, and it solves another problem: polymorphism.
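
Sketched in Perl with made-up names (analyse() stands in for whatever the framework actually does), it could look something like this:

    use strict;
    use warnings;

    # The framework only requires that $similarity takes two data objects
    # and returns a number.
    sub analyse {
        my ($similarity, @data) = @_;
        my @scores;
        for my $i (0 .. $#data) {
            for my $j ($i + 1 .. $#data) {
                push @scores, [ $i, $j, $similarity->($data[$i], $data[$j]) ];
            }
        }
        return @scores;
    }

    # Two interchangeable metrics; no SimilarityMetric hierarchy needed.
    my $absolute_difference = sub { abs($_[0] - $_[1]) };
    my $squared_difference  = sub { ($_[0] - $_[1]) ** 2 };

    my @by_abs     = analyse($absolute_difference, 4, -2, 7);
    my @by_squared = analyse($squared_difference,  4, -2, 7);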

The point deserves stressing: the function is a module in this case, because it models a single concept (similarity in a particular domain), and it is replaceable (since there is a defined interface). It also operates at a higher level of abstraction than a class hierarchy.

In the object-oriented way, if you want your classes to support any type of data, you need to either use generics (which is a topic in and of itself) or create some sort of "interfacing" class that your classes use and that is able to encapsulate the data objects -- in some way.

However, in the functional way, you can supply the similarity function when you supply the framework with the data. If the framework is oblivious to the type of data it analyzes (for instance, requiring only that each data object has an identifier), and only uses the results of the similarity metric in the analysis, then there is no need for generics or polymorphism in the first place: the question never arises, because your function is by construction specialized to handle the input data.

Another way to model the problem in an object-oriented way, with inspiration from the functional model, is to encapsulate the similarity metric with the data. That is, the data container would provide means to compute the similarity between two data objects it contains. However, the same problem presents itself: how do we parameterize the similarity metric? We could define different classes that would accept a data object container and define a similarity metric -- but this is not any better than the first object-oriented model. There is a serious problem: the data container is now coupled with the similarity metric. These are two entirely different concepts and should not be mixed in this way.

If my experience is any guide, there are more cases where a simple function does better than a full-blown class (or worse yet, a class hierarchy).

But I still use objects and classes

Despite my dislike, I often use objects and classes, because in many cases they are a natural fit. There are indeed many problem domains where the most elegant way is to use classes. For example, suppose you have tabulated data of some sort that you wish to print. There are different output formats, say HTML tables and CSV, but they all share the same basic interface; namely, you print a header listing the names of the columns, and then you print rows one at a time.

(Even though the CSV format, informal as it is, defines no header, this can simply be just another row.)

Object-oriented programming is a natural fit here. Say you create an abstract class called Tabulator, and concrete classes HTMLTabulator and CSVTabulator. The implementation details of the concrete classes differ, mainly in what kind of formatting they do, and HTMLTabulator should probably support setting attributes on rows and columns in some way (such as alignment, width, or column or row spanning), or perhaps it uses templates; but once configuration is done, there is no difference. You can simply give either as a parameter to any function or class that needs to print your tabular data, and if you later need more output formats, simply derive another class from Tabulator.
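
A rough sketch of that design in plain Perl, stripped down to the bare interface (print_table() and the formatting details are only illustrative):

    use strict;
    use warnings;

    package Tabulator;
    sub new    { return bless {}, shift }
    sub header { die "subclass must implement header()" }
    sub row    { die "subclass must implement row()" }

    package CSVTabulator;
    our @ISA = ('Tabulator');
    sub header { my ($self, @cols) = @_; return join(',', @cols) . "\n" }
    sub row    { my ($self, @vals) = @_; return join(',', @vals) . "\n" }

    package HTMLTabulator;
    our @ISA = ('Tabulator');
    sub header {
        my ($self, @cols) = @_;
        return '<tr>' . join('', map { "<th>$_</th>" } @cols) . "</tr>\n";
    }
    sub row {
        my ($self, @vals) = @_;
        return '<tr>' . join('', map { "<td>$_</td>" } @vals) . "</tr>\n";
    }

    package main;
    # Code that prints tabular data takes any Tabulator and never cares which.
    sub print_table {
        my ($tabulator, $columns, @rows) = @_;
        print $tabulator->header(@$columns);
        print $tabulator->row(@$_) for @rows;
    }

    print_table( CSVTabulator->new, [qw(name score)], [qw(alice 10)], [qw(bob 7)] );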

Naturally you can solve this with the functional approach, but at least I cannot think of a way that would not be structurally (very) similar to the object-oriented approach and still be better.

To recap: I dislike object-oriented programming in general, but it has its uses. One just needs to be careful, and have more than one tool in the toolbox.

Perl is good

The good part about Perl is that my modules can be simply collections of procedures, or collections of higher-order functions, or classes in the object-oriented sense. I can pass both functions and objects to and from other functions and objects. I can use whichever paradigm I need at any moment.
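
As a small illustration of that freedom (run_filter and UppercaseFilter are made-up names), the same consumer can accept either a plain code reference or an object, and the caller picks whichever models the problem best:

    use strict;
    use warnings;

    package UppercaseFilter;
    sub new   { return bless {}, shift }
    sub apply { my ($self, $text) = @_; return uc $text }

    package main;
    # A filter may be a code ref or any object with an apply() method.
    sub run_filter {
        my ($filter, $text) = @_;
        return ref $filter eq 'CODE' ? $filter->($text) : $filter->apply($text);
    }

    print run_filter( sub { scalar reverse $_[0] }, 'perl' ), "\n";   # lrep
    print run_filter( UppercaseFilter->new,         'perl' ), "\n";   # PERL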

Of course, with all the power comes responsibility. You need much experience to be able to decide which way works best for any given problem, and I do not claim to have that experience yet. However, I stand fully behind the claim that problems have natural solutions in different paradigmatic approaches. Sometimes object-oriented programming does the trick, sometimes functional, sometimes a completely different approach.

However, I'm slightly worried about the approach taken in Perl 6: everything is an object starting from fundamental types of data such as numeric constants. Sometimes it is useful to think in terms of objects, but often I want a number to be just a number, nothing more. Once again, this is a topic that has meditations already and that deserves another meditation of its own. The situation is not as black and white as this short paragraph makes it seem.


Pardon me all the references to E. Dijkstra, but he was simply an excellent and clear-thinking chap. I also apologize for the possible uses of anthropomorphic terminology in the above. It's difficult to avoid when talking about OOP.

--
print "Just Another Perl Adept\n";


Re: I dislike object-oriented programming in general
by bradenshep (Beadle) on Oct 17, 2007 at 03:57 UTC
    I'm glad my school does not have an "Everything is made of Java" mindset. There are FP classes here, and we use C as well as C++ (among others), so even our imperative programming is not all object-oriented.

    There should be charities for those who have never even glimpsed outside of the box. Adopt-a-Drone, or the Functions are Data Foundation. It's a terrible illness and it destroys their quality of life.

    I would use something like the golf ball example as an interview question, especially with students or fresh grads. Can they, even with some prodding, realize that a GolfBall class is unnecessary in that example? If not, I don't really think I want someone with that much of a conceptual mental block working for me.
      But the burning question is what should GolfBall inherit from.

        Ball and SportsEquipment; Ball inherits from Sphere, which inherits from PlatonicSolid, which inherits from...

        It's classes all the way down!

Re: I dislike object-oriented programming in general
by stvn (Monsignor) on Oct 17, 2007 at 20:25 UTC

    Well, you make several good points, but as with anything good in life (sex, drugs, rock-n-roll, you name it), it is best when it is in moderation. OOP is just one of the paradigms you should be using in your code. Anyone who tells you different is usually a first year CS student who just finished their first "Intro to Java" class and can be safely ignored. Also, first class functions != functional programming; there is a very long history of first class functions/blocks in OOP (see Smalltalk). FP is a much deeper mindset which goes beyond just "functions as data".

    You should really look into some of the newer thinking around OO, and even some of the older, but less mainstream, thinking as well.

    For instance, roles (and the original concept of Traits) go a long way towards helping modular decomposition in OO not be so "entity" centric. You might be interested in this talk I just gave on Moose::Role at the Pittsburgh Perl Workshop this weekend. Towards the end of the slides it gives a number of examples of how roles can provide features that a class "does" where an "isa" relationship just wouldn't make any sense.

    You should also look into some of the more multi-paradigm languages like Scala and OCaml, both of which provide an excellent hybrid of OO and functional paradigms.

    And lastly, OOP != Java/C#/C++; there are some really nice OO systems out there in which modeling is not so "entity" centered. Take CLOS for instance: it uses generic functions and classes, so that behavior is very clearly separated from state. There are many Scheme OO systems which expand on the CLOS concepts too. There is also prototype-based OO, which also leads to very different modeling approaches.

    However, I'm slightly worried about the approach taken in Perl 6 ...

    Fear not, it might be OO under the hood, but this won't stop you from ignoring its OO-ness as much as you want. And as for efficiency, let the compiler writers worry about that :)

    -stvn
      You forgot to mention Self and its more popular cousin Javascript. JS is more OO than Ruby, but it doesn't have the concept of a class.

      My criteria for good software:
      1. Does it work?
      2. Can someone else come in, make a change, and be reasonably certain no bugs were introduced?
        I'm curious. In your view, on what basis is JS more OO than Ruby? If anything I'd say that it was the other way around. (Note that both are far more OO than Perl.)

        For example in Ruby if you sort an array of numbers, it defaults to sorting them as numbers. In JavaScript it defaults to sorting them alphabetically, no matter what their types are.

        For another example, in Ruby I can trivially add a new method to Integer, and then call my new method on the integer 5. In JS it works, but much less consistently. This works:

        Number.prototype.foo = function () {alert("Hi!")}; x = 5; x.foo();
        But this doesn't (at least not in Firefox):
        Number.prototype.foo = function () {alert("Hi!")}; 5.foo();
        That isn't what I expect from a completely object-oriented language!

        Furthermore the OO hooks in Ruby are much more pervasive than in JavaScript. For instance I once tried to write a fairly efficient factorial function in Ruby. I found that it was more efficient than the routine to convert big integers to strings! So I replaced the routine to convert big integers to strings.

        I wouldn't even dream of messing around in the internals like that in JavaScript.

      And lastly, OOP != Java/C#/C++

      I deliberately tried to steer away from any particular programming language, but if you received the impression that I'm talking about those three, my apologies. The concerns I have about OOP go beyond single versus multiple inheritance, static versus "dynamic" typing, calling methods versus sending messages, etc. The point was not to compare programming languages, but to explain why the means of abstraction and combination in pure object-oriented thinking do not appeal to me in general.

      FP is a much deeper mindset which goes beyond just "functions as data".

      I should know, having been an FP fanatic... My "functions are modules" argument doesn't mean first-class functions and doesn't even presuppose functional programming. What I call "passing modules as arguments" is meant to be a generalization of what you can do in different programming languages. It might be implemented as being able to pass function pointers (C); or function names (ALGOL); or function references (Pascal). It might be implemented as passing closures (e.g. Scheme, Haskell, Perl, and too many languages to list); or passing objects; or doing something exotic.

      The point is that you can parameterize what code does as well as which state it starts the computation from ("non-module" parameters such as numbers and strings).

      And as for efficiency, let the compiler writers worry about that

      But it's not even a concern for me... To me, a programming language is foremost a notation with which and in which to express ideas, usually algorithms. That we have machines that can use text written in the notation to do something is just a bonus. (Rather a nice bonus, I must say.) This stance is partially hypocritical, but I can live with it.

      If I am worried about pervasive OO thinking in Perl 6, it's because frequently I don't want to think in terms of objects. There are no "efficiency" worries -- I already know there are efficient implementations for message-passing, delegation, virtual function tables, and what-have-you that goes with implementing these things. Just take a look at C++ or OCaml.

      The concept of roles resembles Objective-C protocols, though with being able to define not only which functions the implementing class needs to provide but also some common functions that all classes implementing the role "inherit". However, this would again be a much more useful technique to think about if there was no mandatory link to objects and classes! (That's just me.)

      I'll install Moose::Role some rainy day, I promise.

      --
      print "Just Another Perl Adept\n";

        I deliberately tried to steer away from any particular programming language, but if you received the impression that I'm talking about those three, my apologies.

        Well, it just seemed to me (and I may have read in between the lines too heavily and I apologize in advance if that is so), that much of what you were talking about were problems with particular OO implementations and the more idiomatic usage of said OO implementations. For instance, in CLOS programming it is not uncommon to have many plain vanilla functions (and macros) along with the classes and generic functions. The same can be said of much Javascript and C++ programming as well. It is only in languages like Java, which do not allow vanilla functions/subroutines to exist, that the "pure OO" approach tends to win out (mostly because there is no other choice).

        The point was not to compare programming languages, but to explain why the means of abstraction and combination in pure object-oriented thinking do not appeal to me in general.

        Well, I think it is very hard to discuss abstract OO thought without at some point coming back down to the language level. Every OO system has its own set of rules and therefore has its own set of limitations, and some systems contradict or are in direct conflict with one another. A "pure OO" system which is not tied to any language would need to be defined before it can be discussed. As for discussing the merits of the parts of OO like abstraction, polymorphism, encapsulation, modularization, etc etc etc, that discussion too will eventually need to come back down to a particular implementation for all the same reasons.

        My point is basically that there is no such thing as "pure object-oriented thinking" which is 100% language agnostic.

        My "functions are modules" argument doesn't mean first class functions or doesn't even presuppose functional programming.

        I am a little confused by what you mean when you say "functions as modules"; this makes me think of the Standard ML module system and functors in particular. A functor being a function which takes a module as an argument and returns a new module. Is this what you are referring to? If not then I am totally confused by your use of the word "module". Please expand.

        If I am worried about pervasive OO thinking in Perl 6, it's because frequently I don't want to think in terms of objects.

        This is exactly my point: you won't have to think in objects if you don't want to. This is a stated design goal of Perl 6.

        The concept of roles resembles Objective-C protocols, though with being able to define not only which functions the implementing class needs to provide but also some common functions that all classes implementing the role "inherit".

        No, that is totally wrong actually. Obj-C protocols are pretty much the same as Java interfaces, and therefore are about as use{ful,less}. Roles (optionally) provide an implementation as well as just abstract interfaces. This makes them more akin to mix-ins or multiple inheritance. However, unlike mixins or MI, roles are actually composed into the class (in perl we do this with symbol table operations to copy from the role package into the class the role is being composed into) which means there is no inheritance relationship between the role and the class. There is also a strict set of rules by which roles are composed into classes, which makes for highly predictable behavior, whereas MI and mixins are much more difficult to predict behavior-wise.
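
        To make that concrete, here is a minimal Moose::Role sketch (Comparable and Number are made-up names): the role requires one method from the consuming class and supplies another, and composing it simply copies equals() into the class, with no parent/child relationship involved.

            package Comparable;
            use Moose::Role;

            requires 'compare';            # the consuming class must supply this

            sub equals {                   # ...and gets this method for free
                my ($self, $other) = @_;
                return $self->compare($other) == 0;
            }

            package Number;
            use Moose;

            has value => (is => 'ro', isa => 'Num');

            sub compare {
                my ($self, $other) = @_;
                return $self->value <=> $other->value;
            }

            with 'Comparable';             # composition, not inheritance

            package main;
            my $x = Number->new(value => 3);
            my $y = Number->new(value => 3);
            print $x->equals($y) ? "same\n" : "different\n";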

        However, this would again be a much more useful technique to think about if there was no mandatory link to objects and classes!

        Well actually, they don't have to be linked to objects and classes at all. Roles are very similar to the OCaml/SML module systems in that they can be used to compose a non-OO module just as easily as they can be used to compose classes. In fact, I have one specific usage of roles which uses pure roles as "function groups" which can be composed together (following the rules of role composition) and are never actually composed into classes/objects.

        I'll install Moose::Role some rainy day, I promise.

        You won't like it because it comes with the entire Moose object system. And that is deep down OO to the core, complete with metaclasses and all that fun OO-over-abstraction you seem to really dislike. In fact it is a meta-circular object system so it is actually OO that is implemented in OO. Which I am sure just makes you cringe :)

        So, in conclusion, I think you have many good points, but your anti-OO stance seems to me to be somewhat reactionary to the pro-OO zealots. All good programmers know that silver bullets don't exist and in the end you just need good tools to get the job done. Those tools may be OO-based, they may be full of FP madness, or they may be a crusty CPAN module from the mid-90s like FindBin that sucks horribly but (for the most part) works when you need it to, so you just use it and move on with your life. The moment you exclude any of those tools on some philosophical, moral or religious basis, you are really just adding more work for yourself.

        -stvn

        Regarding Perl 6, giving you access to everything via objects/classes does not mean requiring you to write everything as objects/classes. I like OO because it provides a simple and effective way to encapsulate data, even provides for encapsulating code, and provides convenient namespaces to avoid name collisions (encapsulating names).1

        Sure, many (perhaps most) authors trying to teach OO certainly make too big of a deal of inheritance (and don't teach the pitfalls of overuse of it). And I can see your concern about people trying to make too many or the wrong classes trying to identify the "objects" they are dealing with.

        But I don't think "the Larrys" are newbie OO fanboys so I don't expect them to make stupid OO design mistakes. And I don't see the downside to having a unifying "framework" for providing convenient access to methods and attributes of internal components. And the Larrys certainly don't appear to have drunk the "OO koolaid" of the Java designers, trying to deny coders the ability to design in paradigms other than OO. Quite the opposite.

        1 Yes, I know about the problems with name collisions in the face of inheritance. I don't use that type of inheritance much and you shouldn't have to either.

        - tye        

Re: I dislike object-oriented programming in general
by Tabari (Monk) on Oct 17, 2007 at 14:37 UTC
    No programming paradigm can aspire to completeness.
    All four styles: procedural, OO , declarative and functional should have their place.
    Bad feelings towards one of them are perhaps not the best starting point, but often OO evangelists say that OO is not a silver bullet and then continue as if they had forgotten it (a figure of speech which may be compared to paralipsis).
    I don't think that good design should model the world, it needs to serve other purposes, including the ones you mention. The design patterns are a good example.
    Tabari
      I don't know why, but I've always felt uncomfortable speaking of OOP as a programming paradigm (it was taught that way to me); rather it always seemed to me that perhaps OOD is a manner of abstraction that may apply to more than one programming paradigm.

      The three main programming paradigms would obviously be "imperative", "logical" and "functional". Am I correct in believing that all three paradigms could host OOD practices ?

      If that's the case, then we can say that the manner of abstraction is somewhat orthogonal to the operational semantics of the language:

      • O'Haskell is an object-oriented abstraction library for Haskell.
      • CLOS is an all-encompassing object system for Lisp.
      • LogTalk and OL(P) are examples of object-oriented abstraction libraries for Prolog.

      Languages usually aspire to one paradigm (exceptions include languages like OCaml which implement both "imperative" and "functional" semantics).

      Within languages, programmers often find ways to express other paradigms within the paradigm of the host language:

      Anyone have any thoughts on this?

      -David

        The lazy answer is that these are all Turing complete programming languages, and can thus (skipping a few implicational steps) emulate each other. So yes, you are correct that given any Turing complete programming language, we can do object-oriented programming in it. Since all implementations of programming languages run on a computer, ultimately what you are doing is functional/imperative/logical/object-oriented programming in machine language.

        However, it's a separate thing to ask if the syntax and semantics of a programming language make one paradigm easier or harder than another. I find programming languages such as Scheme much easier to work with exactly due to minimalistic core features and closures: if I want objects, I'll just wrap the methods in a closure. Doing the reverse in, say, Java -- that is, using classes and objects to emulate closures -- entails creating a new class definition for each closure you use, then instantiating objects from them.
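
        For example, a closure-based "object" in Perl might look like the sketch below (make_counter is made up for illustration): the lexical $count is the private state and the returned code refs are the interface.

            use strict;
            use warnings;

            sub make_counter {
                my ($count) = @_;
                $count = 0 unless defined $count;
                return {
                    increment => sub { return ++$count },
                    value     => sub { return $count },
                };
            }

            my $counter = make_counter(10);
            $counter->{increment}->();
            print $counter->{value}->(), "\n";    # 11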

        Both are possible approaches, and as has probably been discussed in the monastery many times (sorry, I can't search right now), closures and classes/objects are just about equivalent. Now, many "functional" programming languages support objects (since it's really easy to do with closures) and many "object-oriented" programming languages support closures. Arguing which one is better is a source of much heat but usually little light.

        As for me, I'm prone to pick a programming language that has closures rather than one that only has objects, because I tend to use closures more.

        --
        print "Just Another Perl Adept\n";

        I think you have a point. Logical programming describes what the solution looks like; functional, a set of transformations from initial conditions to the solution; imperative, a sequence of steps from here to there. OO describes... what? How a bunch of things interact in a system that, if you're lucky, gets you where you want? It's a useful way of breaking down problems in the imperative paradigm, but not really a new mode of thought.
Re: I dislike object-oriented programming in general
by blazar (Canon) on Oct 17, 2007 at 15:12 UTC
    However, I'm slightly worried about the approach taken in Perl 6: everything is an object starting from fundamental types of data such as numeric constants. Sometimes it is useful to think in terms of objects, but often I want a number to be just a number, nothing more.

    I personally believe you should not be worried, not even slightly. When you want a number to be just a number, you're free to think of it as such except for some situation in which you may need to call a method on it. But then you can even consider that as a little bit of funky syntax.

Re: I dislike object-oriented programming in general
by toma (Vicar) on Oct 18, 2007 at 08:57 UTC
    Don't bother to read my rambling reply unless you are an OO skeptic...

    You can eschew OO jargon and consider objects to be user-defined data types, methods to be language extensions or function libraries, and inheritance to be a curiosity.

    OO hype can be annoying and counterproductive.

    From a user/programmer point of view, it is interesting to notice bugs that occur in applications due to over-fondness for objects. But cataloging this menagerie is unlikely to dampen the current enthusiasm for objects.

    Even before the first software object, there was a business in selling integrated circuits (ICs) made from semiconductors. These worked wonderfully, and you could design circuits with them without understanding the IC guts. Software developers became jealous, and proposed selling 'software ICs' with no exposed guts. This terminology only appealed to electrical engineers, and to broaden the appeal the name was changed to 'object oriented programming'. Nauseating terminology was introduced, and non-believers were declared incompetent to comment on the subject. New books were written, courses were attended, conferences were established, and money changed hands. Software productivity ground to a halt as developers retooled, and quality suffered for years.

    I think the industry has mostly recovered from the OO setback, and now there is possibly some benefit from OO. As Abigail observed, OO enables the creation of supportable spaghetti code. If programmers are going to write spaghetti, supportable spaghetti is better.

    Last I checked, the overall semiconductor industry revenue is still larger than the overall software industry revenue. The irony is that semiconductor designers use languages like Verilog, VHDL, and SPICE, which are not particularly OO, to design products that are actually tangible objects.

    Electrical engineering is fun, perhaps you should check it out! In that domain, objects tend to be physical objects, the abstractions are less annoying, and there is still a decent living to be made.

    It should work perfectly the first time! - toma
      ... hype can be annoying and counterproductive.

      ++ The ellipsis above can be replaced by almost any magic bullet, holy grail, one true way or dogma and still be true, but it is rarely more so than in your original formulation.

      That said, XML and relational databases come close.


      Examine what is said, not who speaks -- Silence betokens consent -- Love the truth but pardon error.
      "Science is about questioning the status quo. Questioning authority".
      In the absence of evidence, opinion is indistinguishable from prejudice.
        Being a relative newcomer to both XML and relational databases I too find them annoying.
        In my case this is because I do not as yet fully understand their mechanics and as a result I am irritated at not being able to do what I would like to do.
        However I doubt that that is the reason for your feelings.
        I would be interested to know why you feel XML and relational databases to be "annoying and counterproductive".
Re: I dislike object-oriented programming in general
by MonkOfAnotherSect (Sexton) on Oct 19, 2007 at 01:37 UTC
    I suspect (and please don't take offense if I'm wrong - none is intended) that you are rebelling against formal Ivory Tower-ist training in Java. If that's the case, then you'll find many people that aren't mad about either Java or ivory-towerist OO ;-)

    Sensible OO-all-the-way-down ("SOOATWD") languages that are built on objects allow users to perform (at least) procedural programming without worrying about classes and objects. If you want, say, to operate on numbers in a loop in Python or Ruby without needing to think about "3" being an object under the covers, you can (Ruby uses methods on its numbers, but code can still be fundamentally procedural in nature). Check out the Python and Ruby versions of Chapters 2 and 3 of the Perl Cookbook v1, for instance, to see how non-threatening numbers as objects can be.

    In SOOATWD languages, your modules can be simply collections of procedures, or collections of higher-order functions, or, yes, classes. You can pass both functions and objects to and from other functions and objects. You can use whichever paradigm you need at any moment. Seriously. You don't have to think of "message passing". You can just call methods or member functions with parameters instead. Or whatever other terminology or world view makes you and others happiest.

    Part of the problem of OO equating to Inviolable Public Interface is Java's need for an object's variables to have accessor methods -- obj.getFoo(); obj.setFoo(x) -- since you may want to change the underlying implementation and if you don't use accessor methods then code which references those variables directly is screwed. This is far less of a problem with idiomatic code in SOOATWD languages which have properties, where underlying implementation can be changed and the changes remain invisible to any code which references those variables.

    Perl 6 won't be Java or Smalltalk ;-)

      He could also be rebelling against the (IMHO) pervasive silliness that infects parts of CPAN: glance through these for some prime examples. When you find yourself doing something like return DoStuffer->new->doStuff(), you're almost certainly doing it wrong, or at least much more painfully than necessary.
Re: I dislike object-oriented programming in general
by doom (Deacon) on Oct 20, 2007 at 19:52 UTC
    A few corrections:

    I don't think the right phrase is "anthropomorphic terminology"; I think what you're really complaining about is the over-use of "metaphors" as the only way of understanding an abstraction. There are people who still think that "identify the nouns" is a good principle in OOP design, but that's kind of silly. If you look at the typical "objects" we really use, they tend to be totally made-up entities like "database handles" and "statement handles" and so on.

    Some of the things you're complaining about are already understood to be problems within the OOP crowd: it's understood that you should avoid over-reliance on inheritance. Slogans recommending "aggregation over inheritance" are pretty common (as is the point that "inheritance breaks encapsulation").

    A few comments:

    Myself, I tentatively suggest that inheritance should usually be reserved for fixing problems in the original design. You should code to allow sub-classing, but find other ways to share common code.

    An "object class" should just be thought of as a bunch of routines that need to share some data. There are a lot of ways of doing that (e.g. closures) and I largely just use OOP because I think it's familiar to more people, i.e. I use OOP for social reasons more so than technical reasons.

    However there is a technical advantage of OOP: the ability to generate multiple "objects" (i.e. data namespaces) all of the same "class" (i.e. using the same method namespace [1]) within a single perl process. (But I wouldn't be surprised to learn that the "functional" programming world has its own way of doing this).

    I don't think that OOP is tremendously useful for polymorphism, by the way, I think plug-in architectures work better, (ala DBI/DBD).

    [1] Looking at this again, I see I'm oversimplifying by ignoring class data... I almost never use class data, myself.

      I tentatively suggest that inheritance should usually be reserved for fixing problems in the original design.

      Well, the first project in my current job was adding features to a semi-complex web application -- written in PHP. The previous developer, no doubt a good programmer otherwise, apparently felt too energetic, i.e. not lazy enough, when he wrote the original source files, because there is considerable overlap in functionality. Quite often this is because he used copy-paste to implement features on pages that lacked them. Needless to say, when I was asked to change the way some summary fields in reports are computed, I had to first manually read through all the files and discover the five or six places where the same copy-pasted computation took place.

      Now, the job is nice, and I've actually had fun refactoring this. My very first inclination was to abstract the common code to functions, then pass these functions around, akin to higher-level programming. However, although PHP supports lambda expressions through eval, this quickly turned out to be infeasible. Instead, I implemented a couple of shallow class hierarchies and abstracted most common functionality to (abstract) base classes. It's not beautiful or elegant, but it is much cleaner than the original -- plus adding new features is considerably easier now.

      Arguably this is not refactoring the design that much; just implementing the design in a bit better way. However, it's a good example where knowing object-oriented programming (inheritance too!) saved the day.

      --
      print "Just Another Perl Adept\n";

        This does sound like an example of using inheritance to fix someone else's design, but I was actually thinking about using it in the other direction: if there's some code that doesn't quite do what you need, it's sometimes very convenient to create a mutant variant by subclassing it... but if the original author was inheritance happy, you find yourself dealing with unwieldy chains of subclasses of subclasses, where in order to understand what the class at the bottom does, you need to learn about all the parents all the way up the chain.

        In your example, it sounds like I probably would've used "aggregation", i.e. moved the common operations to methods in a new class, where the original code needs to create a "calculator handler object" (or somesuch) to access them. The advantage is that the new code is much more independent of the existing code (it has real "encapsulation").

Re: I dislike object-oriented programming in general
by Anonymous Monk on Oct 22, 2007 at 12:13 UTC

    "However, this is a sad price to pay, because anthropomorphic terminology leads to operational thinking. By operational thinking, I mean trying to understand a program in terms of how it is executed on a computer. Usually this involves keeping mentally track of variables and their values, following one statement after another checking what it does to data, and doing case analysis with if-then-else blocks. You are knee-deep in problems once you start trying to understand loops operationally (does it terminate? will the loop counter be always inside the bounds?)."

    It is sometimes argued that the inability of people to keep track of variables is one reason that methods should be kept short. Some have argued that the number of local variables should be reduced, others that they should be eliminated altogether. However, people have difficulty thinking mathematically, which is possibly why it took so long (in terms of the history of computing) for purely functional languages to arise. Both object-oriented and functional programming may be regarded as approaches to making code understandable: OO by intuitive understanding about manipulating objects, functional by mathematical proof.

    So when you suggest we abandon operational thinking, what do we put in its place? Traditionally, operational thinking is what it has meant to understand how a program works.

    This goes to the heart of what good software design is about, because if we are to design code to be read, more than to be executed, then we must design it to be understood. So if that understanding is reached other than by operational thinking, it will colour how we write.

    I don't have an account here, and I seem to be showing up in preview as vroom, whoever that is.
Re: I dislike object-oriented programming in general
by Anonymous Monk on Oct 23, 2007 at 01:50 UTC
    I think I can say that I'm an "OO proponent", so I'll try to address some of the issues raised here. W.r.t. "sloppy thinking", the author wrote: "Perhaps there is a way to reason about the correctness of object-oriented programs in a non-operational way, but I have yet to see it." Have you read about Design by Contract? Bertrand Meyer's OOSC book is a classic in that area. It is a non-operational way of thinking about OO.

    Regarding "Conceptual overhead", I think it comes down to a question of appropriate design. I grant that OO is mainstream and on the mainstream there is a tendency towards overengineering (usually borne out of insecurity from the develpers), but a good team of experienced and knowledgeable programmers can help avoid this problem.

    Lastly, as for OO being advertised as "The only way", I do agree with you. I hope that the future will be strongly multi-paradigm and my current favorite language reflects this: www.scala-lang.org. Check it out.


    PS: I don't have an account here, but I'm this guy.