On Flyweights... (with sneaky segue to data modeling)

by herveus (Prior)
on Dec 18, 2002 at 14:00 UTC ( [id://220840] )

Howdy!

I've been following the discussion on Inside Out objects with considerable interest. The subthread about Flyweights and the Flyweight design pattern further sucked me in.

I'm a database guy, so that perspective colors how this concept maps into my world. I haven't noticed the database angle mentioned yet... so here goes!

Let us consider a small relational model:

Entity       Primary Key     Attributes
Blazon       Blazon_ID       Blazon (which may be long)

Registration Registration_ID Date_Registered
                             Owner_Name_ID
                             Blazon_ID
                             How_Used
                             Date_Released

Name         Name_ID         Name (which may be longish)
                             ...other attributes
Note that two of the Registration attributes (Owner_Name_ID and Blazon_ID) are foreign keys referencing the Name and Blazon entities.
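
For concreteness, here is a minimal sketch of that model as DDL issued through DBI. The table and column names mirror the entities above; the SQLite DSN and the TEXT column types are just assumptions for illustration.

    use strict;
    use warnings;
    use DBI;

    # Hypothetical connection; any DBD would do.
    my $dbh = DBI->connect('dbi:SQLite:dbname=armory.db', '', '',
                           { RaiseError => 1 });

    $dbh->do(q{
        CREATE TABLE blazon (
            blazon_id INTEGER PRIMARY KEY,
            blazon    TEXT NOT NULL              -- may be long
        )
    });
    $dbh->do(q{
        CREATE TABLE name (
            name_id INTEGER PRIMARY KEY,
            name    TEXT NOT NULL                -- may be longish
        )
    });
    $dbh->do(q{
        CREATE TABLE registration (
            registration_id INTEGER PRIMARY KEY,
            date_registered TEXT,
            owner_name_id   INTEGER NOT NULL REFERENCES name(name_id),
            blazon_id       INTEGER NOT NULL REFERENCES blazon(blazon_id),
            how_used        TEXT,
            date_released   TEXT
        )
    });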

One possible object interpretation of this data model can be based around the Flyweight pattern. A Perl implementation could use a blessed scalar for the object, where the value of that scalar is the Blazon_ID, Name_ID, or Registration_ID. The Blazon class implements the Blazon table; it could have a real RDBMS backend, or it could use some other persistence mechanism. Similarly for the Name and Registration entities.

For nuts-and-bolts access, object methods expose the attributes/columns of an individual object (select/update on a single row), while class methods provide entity/table-level access (insert/delete/select). The class can provide caching as appropriate.
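
Sketched in Perl (a hedged illustration of the last two paragraphs rather than production code: the DSN, the cache hashes, and the method names are all my own assumptions), a Blazon class along those lines might look like this:

    package Blazon;
    use strict;
    use warnings;
    use DBI;

    # Class-level data: one shared connection and two caches keyed by Blazon_ID.
    my $dbh = DBI->connect('dbi:SQLite:dbname=armory.db', '', '',
                           { RaiseError => 1 });
    my %instance;   # Blazon_ID => the one shared (flyweight) object
    my %text;       # Blazon_ID => cached blazon string

    # The object is a blessed scalar whose value is the Blazon_ID.
    sub new {
        my ($class, $id) = @_;
        return $instance{$id} ||= bless \(my $copy = $id), $class;
    }

    sub id { ${ $_[0] } }

    # Object method: fetch (and cache) this row's blazon text.
    sub blazon {
        my $self = shift;
        my $id   = $$self;
        ($text{$id}) = $dbh->selectrow_array(
            'SELECT blazon FROM blazon WHERE blazon_id = ?', undef, $id)
            unless exists $text{$id};
        return $text{$id};
    }

    # Object method: update this row, keeping the cache honest.
    sub set_blazon {
        my ($self, $new_text) = @_;
        $dbh->do('UPDATE blazon SET blazon = ? WHERE blazon_id = ?',
                 undef, $new_text, $$self);
        $text{$$self} = $new_text;
        return $self;
    }

    # Class method: insert a new row and hand back its flyweight.
    sub insert {
        my ($class, $new_text) = @_;
        $dbh->do('INSERT INTO blazon (blazon) VALUES (?)', undef, $new_text);
        my $id = $dbh->last_insert_id(undef, undef, 'blazon', 'blazon_id');
        $text{$id} = $new_text;
        return $class->new($id);
    }

    1;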

...so where am I going with this?

If one looks at one's object model through a "relational data model (normalized)" lens, many entities/tables are strong candidates for this approach (flyweight and/or inside-out).

A normalized data model tries really hard to follow the dictum "each datum is recorded in exactly one place". Multiple references to a specific datum need to be made by a "reference" of some sort, be it a Perl reference or a foreign key relationship. It can be helpful to have the key value be content-free: an arbitrary, often numeric, surrogate key.

In the example above, one could have each class (Blazon, Name, Registration) overload stringify to produce sensible output (the Blazon, Name, or a string compounded of the attributes).
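
A small, self-contained sketch of that stringification idea, using the Name entity; the in-memory %text store and the sample data are just stand-ins for a real backend:

    package Name;
    use strict;
    use warnings;
    use overload
        '""'     => \&as_string,   # "$name" and print $name give the text
        fallback => 1;

    my %text;   # Name_ID => name string (hypothetical in-memory store)

    sub new {
        my ($class, $id, $name) = @_;
        $text{$id} = $name if defined $name;
        return bless \(my $copy = $id), $class;
    }

    sub as_string { $text{ ${ $_[0] } } }

    package main;
    my $owner = Name->new(42, 'John of Anytown');
    print "Registered to $owner\n";   # prints the name, not Name=SCALAR(0x...)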

OK. I'll admit that I've changed subjects on the fly (and revised my title about here). I'm advocating that relational data modeling be kept in mind when putting together a package that has complex data. Many cases will only have a single entity class, but many others will have multiple, interrelated entities. If you have a data model that clearly describes your entities and their relationships, you can then make informed choices that may leave you with a denormalized implementation, but the key is "informed".

yours,
Michael

Replies are listed 'Best First'.
Re: On Flyweights... (with sneaky segue to data modeling)
by rdfield (Priest) on Dec 19, 2002 at 11:01 UTC
    ...which leads to inefficient SQL and poor system performance. This pattern results in code of the form:
    $family->new_address($house_number, $postcode);

    # ...somewhere deep inside the family object...
    foreach my $person (@{ $self->{members} }) {
        $person->_new_address($house_number, $postcode);
    }
    Not the best example in the world, but I hope you get the general idea: SQL that would normally be constructed with joins becomes decomposed and inefficient. And before anyone jumps up and down pointing out that they'd never implement code this way, I'll say this: most of the Monks who frequent and contribute to the Monastery are, in my view, (at least) above-average programmers. In fact, Monks who've been here a while (even just lurking) will have picked up enough ideas, tips and tricks to stand head and shoulders above their cow-workers (or cow-students), and would indeed never implement decomposed SQL like the example. But if such object patterns are implemented, others will use them to write exactly that sort of decomposed database access. I know; I've seen it happen at almost every code shop I've ever worked in.

    I don't believe that there is a definitive way around this problem, but if one treats the tables as the classes and the rows as objects, and groups the SQL according to the needs of the business logic into "aggregated" methods, there will be much less scope for less able coders to write decomposed methods. I realise that's a lot of forethought before coding begins (sorry about that), but it pays off in the end when building scalable, performant systems.
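
    For instance (a hedged sketch, with the person table and its family_id/house_number/postcode columns made up for illustration), an aggregated new_address method can push the whole change into one UPDATE instead of one round trip per member:

        package Family;
        use strict;
        use warnings;

        sub new {
            my ($class, $dbh, $family_id) = @_;
            return bless { dbh => $dbh, family_id => $family_id }, $class;
        }

        # One statement for the whole family, instead of N per-person updates.
        sub new_address {
            my ($self, $house_number, $postcode) = @_;
            $self->{dbh}->do(
                'UPDATE person SET house_number = ?, postcode = ?
                  WHERE family_id = ?',
                undef, $house_number, $postcode, $self->{family_id},
            );
            return $self;
        }

        1;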

    rdfield

      Howdy!

      Implementation of the normalized data model does not demand that consequence. If you are using a single RDBMS as your back-end store, you can craft code that takes advantage of SQL joins. On the other hand, one also needs to consider the load to be supported: failing to exploit the efficiencies the RDBMS might offer can be outweighed by the gains from not depending on a single back-end. One could conceivably have different entities stored using different mechanisms; DBI can make that practical.
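
      As a hedged example of what "taking advantage of SQL joins" might look like here (the caches and the DSN are my own assumptions, and the schema follows the model above): one joined SELECT can prime the Registration, Name, and Blazon caches in a single round trip.

          use strict;
          use warnings;
          use DBI;

          my $dbh = DBI->connect('dbi:SQLite:dbname=armory.db', '', '',
                                 { RaiseError => 1 });

          # Class-level caches (assumed to live in the respective classes).
          my (%registration, %name_text, %blazon_text);

          my $sth = $dbh->prepare(q{
              SELECT r.registration_id, r.date_registered, r.how_used,
                     n.name_id,   n.name,
                     b.blazon_id, b.blazon
                FROM registration r
                JOIN name   n ON n.name_id   = r.owner_name_id
                JOIN blazon b ON b.blazon_id = r.blazon_id
          });
          $sth->execute;

          while (my $row = $sth->fetchrow_hashref) {
              $registration{ $row->{registration_id} } = {
                  date_registered => $row->{date_registered},
                  how_used        => $row->{how_used},
                  owner_name_id   => $row->{name_id},
                  blazon_id       => $row->{blazon_id},
              };
              $name_text{   $row->{name_id}   } = $row->{name};
              $blazon_text{ $row->{blazon_id} } = $row->{blazon};
          }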

      My main focus is on the data model itself. For many problems, a good data model is critical and will make the implementation choices clearer. Certainly, the design does not stop there; one has to consider whether the persistence mechanism(s) are homogeneous or heterogeneous.

      Class::DBI has appealed to me as a very handy layer to put between a collection of objects and the data store. Your mileage may vary.
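
      A rough sketch of the kind of mapping Class::DBI gives you, assuming the table layout above (the DSN is made up, and the exact setup calls may differ between Class::DBI versions, so check its docs):

          package Armory::DBI;
          use strict;
          use warnings;
          use base 'Class::DBI';

          # Hypothetical DSN; every table class below inherits this connection.
          __PACKAGE__->set_db('Main', 'dbi:SQLite:dbname=armory.db', '', '');

          package Blazon;
          use base 'Armory::DBI';
          __PACKAGE__->table('blazon');
          __PACKAGE__->columns(All => qw/blazon_id blazon/);   # first column is the key

          package Name;
          use base 'Armory::DBI';
          __PACKAGE__->table('name');
          __PACKAGE__->columns(All => qw/name_id name/);

          package Registration;
          use base 'Armory::DBI';
          __PACKAGE__->table('registration');
          __PACKAGE__->columns(All => qw/registration_id date_registered
                                         owner_name_id blazon_id how_used date_released/);
          __PACKAGE__->has_a(owner_name_id => 'Name');    # inflate foreign keys to objects
          __PACKAGE__->has_a(blazon_id     => 'Blazon');

          package main;
          # e.g. my $reg = Registration->retrieve(42);
          #      print $reg->blazon_id->blazon;   # follows the foreign key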

      yours,
      Michael

      I agree that a one-to-one mapping between objects and tables is not always a good idea (not to say it's always a bad idea either - sometimes there is a one-to-one correspondence between tables and business objects).

      However, I don't think that was the point herveus was making. We're not talking about the process of mapping relational databases to objects.

      herveus was saying that there can be useful insights in looking at class/object hierarchies in the context of database design and normalisation.

      In particular, refactoring an object hierarchy with lots of objects with duplicate state, to one with fewer objects that share state (aka flyweight pattern) is basically normalisation under another name.

      That's how I read it anyway :-)


      Update: Judging by the reply it looks like I was misinterpreting herveus, hence stricken text.

Re: On Flyweights... (with sneaky segue to data modeling)
by adrianh (Chancellor) on Dec 18, 2002 at 14:25 UTC

    ++ (when tomorrow's votes come in ;-)

    Never thought about it that way before. It does kind of make sense to look at applying the flyweight pattern as a sort of class-normalisation process. Neat.

    Adrian adds it to his Big Bag O'Analogies, and wanders off to read his C.J.Date again.

Re: On Flyweights... (with sneaky segue to data modeling)
by diotalevi (Canon) on Dec 18, 2002 at 17:47 UTC

    This is exactly why I posted Sparse arrays? last night. I was thinking of using sparse arrays as object storage, using object IDs (just an incrementing integer maintained by my database) as lookups into an array. It looks like that particular storage method might actually be more wasteful than needed, but it was worth a shot when operating in this sort of environment.
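
    A minimal sketch of that idea (the package and array names are hypothetical): the incrementing integer ID doubles as the object's value and as the index into per-attribute arrays.

        package Blazon;
        use strict;
        use warnings;

        # One array per attribute; the object's ID is the index.  Perl arrays
        # are not truly sparse, so big gaps in the ID sequence cost memory --
        # the "more wasteful than needed" concern above.
        my @blazon_text;
        my $next_id = 0;   # stand-in for the database-maintained counter

        sub new {
            my ($class, $text) = @_;
            my $id = $next_id++;
            $blazon_text[$id] = $text;
            return bless \(my $copy = $id), $class;
        }

        sub blazon { $blazon_text[ ${ $_[0] } ] }

        1;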


    Fun Fun Fun in the Fluffy Chair
