
Re: Reflections on the design of a pure-Moose web app...

by sundialsvc4 (Abbot)
on Mar 23, 2009 at 13:31 UTC ( #752585 )

in reply to Reflections on the design of a pure-Moose web app...

Sure... the first post was “a bit of a late night ramble,” so let me explain a little bit.

All of the database access in the application does go through an ORM tool: DBIx::Class. But the term “object-relational mapper” is, in this case, something of a misnomer: DBIx::Class is merely the tool that a Thing uses to avoid writing erroneous SQL queries by hand.
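As a sketch of that division of labour: a Thing's data-access code can delegate all SQL generation to DBIx::Class. The schema class and names below (`MyApp::Schema`, the `Seminar` source, `$seminar_id`) are illustrative, not taken from the original application.

```perl
use strict;
use warnings;
use MyApp::Schema;   # hypothetical DBIx::Class schema class

my $schema = MyApp::Schema->connect('dbi:SQLite:app.db');

my $seminar_id = 42;   # e.g. taken from the incoming request

# DBIx::Class builds and runs the SELECT; no hand-written SQL anywhere:
my $seminar_row = $schema->resultset('Seminar')->find($seminar_id);
```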

An object ... a Thing ... is not “a database table,” nor “a query,” nor “a view.” It is ... a Thing. You can instantiate it at any time, and, because it does the Storage role (MooseX::Storage) and is somehow directly or indirectly associated with a session, you can then keep it as long as you like. Each web-page request, or each AJAX request or what-have-you, can get access to “any Thing,” just as it was, without thinking about it. (The application actually uses a separate SQL database just for session and Thing persistence.)
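A minimal sketch of such a Thing, assuming MooseX::Storage is what supplies the serialization (the class and attribute names here are invented for illustration):

```perl
package Thing;
use Moose;
use MooseX::Storage;

# One line of role composition buys freeze/thaw (here, to/from JSON):
with Storage( format => 'JSON' );

has id   => ( is => 'ro', isa => 'Int' );
has name => ( is => 'rw', isa => 'Str' );

package main;

# End of request: freeze the object into the session store...
my $json = Thing->new( id => 7, name => 'demo' )->freeze;

# ...next request: thaw it back, "just as it was".
my $thing = Thing->thaw($json);
```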

In the application I've just finished, for example, a Seminar is something that you can sign up for. The underlying database has several different tables that describe different kinds of seminar-offerings, but conceptually “a Seminar is a Thing that People can sign up for.” It is not “conceptually, a database table or tables.” It is a well-formed, persistent Perl (Moose...) object, with methods to do any task and to answer any question that you may have with regard to it, anywhere in the application.

When you are dealing with a particular Seminar, the user may or may not ask to look at “the complete brochure.” If he does, we ask the Seminar object for its brochure information ... which the object retrieves (and remembers in its own properties) if it has not already done so. Some volatile information, like attendee counts, is always retrieved from the SQL database on request, but most of the information is retained in the object itself. Information that you have not yet referred to (like detailed brochure information) is probably not yet retrieved and may never be ... but if and when you do, the Thing knows it doesn't yet have the information you seek, and the Thing knows how to get it. To the code making the request, “It Just Works.™”
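Moose's lazy attributes map directly onto that retrieve-on-first-demand behaviour. A hedged sketch, in which the `_schema` attribute, result-source names, and columns are all hypothetical:

```perl
package Seminar;
use Moose;

has id      => ( is => 'ro', isa => 'Int', required => 1 );
has _schema => ( is => 'ro', required => 1 );   # a DBIx::Class schema

# Fetched once, on first use, then retained in the object:
has brochure => (
    is      => 'ro',
    isa     => 'HashRef',
    lazy    => 1,
    builder => '_build_brochure',
);

sub _build_brochure {
    my $self = shift;
    my $row  = $self->_schema->resultset('Brochure')
                    ->find( { seminar_id => $self->id } );
    return { $row->get_columns };
}

# Volatile data is deliberately NOT cached; always a fresh query:
sub attendee_count {
    my $self = shift;
    return $self->_schema->resultset('Signup')
                ->search( { seminar_id => $self->id } )->count;
}
```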

The “shopping cart” is probably a better overall example. What is a shopping-cart, really? Well, it's a container in which you store things that you might eventually buy ... or it might be a container that you leave on Aisle 7, having walked out of the store. But it is a Thing that contains, among other things, a collection of other Things, and that can do things to that collection and/or answer questions about them. So, a Shopping-Cart, like all Things, is just a persistent, storable Perl object. And you work with it as a Perl object. Like all Things, it automagically serializes and de-serializes itself such that “it's always there when you need it.” When a sale is completed, the shopping-cart is responsible for “becoming a Real Boy” by creating appropriate entries in the SQL database.
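The “becoming a Real Boy” step might look like this, again with invented names: the cart lives as a serialized Moose object until checkout, when it finally creates normalized rows.

```perl
package ShoppingCart;
use Moose;
use MooseX::Storage;

with Storage( format => 'JSON' );   # persists between requests

has items => (                      # hashrefs describing each item
    traits  => ['Array'],
    is      => 'ro',
    isa     => 'ArrayRef',
    default => sub { [] },
    handles => { add_item => 'push', item_count => 'count' },
);

# Only at checkout do rows reach the production SQL database:
sub finalize {
    my ( $self, $schema, $user_id ) = @_;   # $schema: a DBIx::Class schema
    my $order = $schema->resultset('Order')->create( { user_id => $user_id } );
    $order->create_related( order_items => { seminar_id => $_->{seminar_id} } )
        for @{ $self->items };
    return $order;
}
```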

I believe that the real “win” of this approach is that you are neither modeling an application that revolves around its database, nor an application that revolves around being a web-site. Both the registration application and the internal staff applications will use the same set of objects (and their descendants). The same coding techniques can be used in any context. And if the SQL database structure changes (as it inevitably must do from time to time), those changes don't ripple throughout the code.

The ORM, along with the database(s), has been kicked out of its center-of-the-universe position, and the web-framework has been reduced to being “just the engine that drives one-of-many possible user interfaces.” The central precept of this universe is: persistent Perl objects, built in Moose, which I call Things.

Does this help?

Re^2: Reflections on the design of a pure-Moose web app...
by perrin (Chancellor) on Mar 23, 2009 at 15:22 UTC

    So it sounds like the thing you're doing that's unusual is to store objects as JSON in a denormalized sessions table instead of normalized database tables. That can be good for some things. In general, I don't recommend it. Here are some reasons why:

    • Databases work hard to provide a sane model for concurrent access. You break that by using denormalized storage. It's usually ok if the cached objects are all related to a single user or read-only, but the chances for lost updates and other problems become high for shared read/write data. They can also get stale and miss updates in the underlying tables.
    • Serialized Perl objects aren't easy to search. If you have no normalized storage of your shopping cart and you want to find all the carts that contain seminar 75, you have to walk through all carts in the system.
    • One reason that people often design from the database up is because the database is nearly always the bottleneck, and the schema design has the largest effect on your application's scalability and performance. Trying to hide or bypass the database tables can lead to designs that don't take advantage of the details of your database's implementation.
    • Large sessions often perform badly. They can easily end up needing to serialize/deserialize lots of objects that aren't even used by the current request.
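    The searchability point in particular is easy to see concretely. A sketch, with hypothetical table and class names, contrasting one normalized query against walking every serialized blob (`ShoppingCart` here assumed to use MooseX::Storage, so it has `thaw`):

    ```perl
    # Normalized rows: "which carts contain seminar 75?" is one indexed query.
    my @cart_ids = $schema->resultset('CartItem')
                          ->search( { seminar_id => 75 } )
                          ->get_column('cart_id')->all;

    # Serialized JSON blobs: no index can help; every cart must be thawed
    # and inspected in Perl:
    my @hits;
    for my $session ( $schema->resultset('Session')->all ) {
        my $cart = ShoppingCart->thaw( $session->cart_json );
        push @hits, $session->id
            if grep { $_->{seminar_id} == 75 } @{ $cart->items };
    }
    ```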

    I don't mean to rain on your parade. Just pointing out some messes I've made before with sessions and caching of database objects. Apologies if I've misunderstood what you're doing.

      The application is built on top of a fully normalized and very well-constructed database. (That I didn't even have to construct! It was already there!)

      The advantage of the approach that I took in this application is that “almost all of the web application” did not have to be concerned with the vagaries of the structure of that database. The “Things” know how (and when...) to retrieve information from it, and to post information back to it. Having done so, they can retain the information, and make not-yet-permanent contemplated changes to it, without “hitting” the production database at all.

      I posit that such an approach provides for excellent separation-of-concerns within the application, and that it strictly avoids the problem of having the software design be subtly linked-to, hence dependent upon, the schema of that production database. Unstructured information of a reasonable size (i.e. a Perl object) can be stored separately for a reasonable time, in an altogether separate data-store, and then posted to the database only when, and if, necessary. Information is retrieved only on-demand, yet only the object need be aware of when such a “demand” has actually taken place.

      So... am I disputing any of your points? Absolutely not. Instead, I am just here to say: “Hey! Looky! This approach worked really well for me!” It is a very noticeable improvement that is going to stay at the top of my tool-bag for a while.

        To perrin's point about concurrent access to the serialized JSON, you might want to take a look at KiokuDB; the BDB backend provides full ACIDy transactional goodness and works even more transparently than MooseX::Storage does.
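        For reference, the KiokuDB usage in question is roughly this (the directory path is invented); a plain Moose object goes in and comes back without needing any Storage role on the class:

        ```perl
        use KiokuDB;

        # The BDB backend gives transactional object storage:
        my $dir = KiokuDB->connect( 'bdb:dir=data/kioku', create => 1 );

        {
            my $scope = $dir->new_scope;      # keeps the live-object graph alive
            my $id    = $dir->store($cart);   # $cart: any Moose (or plain) object

            # Later, typically in another request:
            my $cart_again = $dir->lookup($id);
        }
        ```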

Re^2: Reflections on the design of a pure-Moose web app...
by BrowserUk (Pope) on Mar 23, 2009 at 13:42 UTC

      The distinct advantage that I found in using Moose was the ease with which the job could be accomplished reliably. The concept of “roles,” in particular, allows for a lot of useful functionality – in this case, particularly serialization – to be leveraged very easily and reliably.
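      For readers unfamiliar with roles, the composition mechanism itself is just this; the role and class here are invented examples, not from the application:

      ```perl
      package Describable;
      use Moose::Role;

      requires 'name';    # the consuming class must provide this
      sub describe { 'a Thing called ' . shift->name }

      package Widget;
      use Moose;
      with 'Describable';   # one line buys the role's behaviour, checked at compose time
      has name => ( is => 'ro', isa => 'Str' );

      # Widget->new( name => 'gear' )->describe now works.
      ```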

      Since Moose is built on regular Perl 5.8, clearly anything that can be done in it can also be done without it, but this is a better mouse-trap.

Re^2: Reflections on the design of a pure-Moose web app...
by zby (Vicar) on Mar 23, 2009 at 13:46 UTC
    Isn't ORM a method for getting 'persistent Perl objects'?

      No, it is a way to take Objects and a Relational storage of some kind and Map them to one another. The (very well known) impedance mismatch between the O and the R makes truly generalized object persistence difficult.

        And why would one want to do that if not for having persistent objects?
