http://www.perlmonks.org?node_id=271671


in reply to Re: Re: Re: Clustered Perl Applications?
in thread Clustered Perl Applications?

Sure, REST sounds good, and Perl structures can easily be converted to XML using XML::Dumper.

But why is it parsed so much faster? As you said, it's also XML.

In particular, I noticed the lack of documentation for REST!

But that might also be down to the frequent use of the word "rest" and the fact that Google is case-insensitive. :)

Are there any examples/tutorials for REST with Apache/mod_perl, or even plain Perl?

Replies are listed 'Best First'.
Re^5: Clustered Perl Applications?
by adrianh (Chancellor) on Jul 05, 2003 at 22:00 UTC
    But why is it parsed so much faster? As you said, it's also XML.

    It's faster because you're removing a layer of indirection.

    In SOAP you have an HTTP request/response that contains your SOAP envelope which contains your data/methods.

    With a REST model you just have the HTTP request. What you're doing is defined by the URI and the HTTP method.

    REST is a different way of modelling your distributed applications. You break your application down into URI-addressable resources, each accessible with a common set of methods (traditionally HTTP's GET, PUT, etc.).

    So rather than sending a SOAP request to your central server with a next-job-id message and getting back a SOAP response containing your job-id XML, you would just GET http://example.com/next-job-id/ and receive your job-id XML as the response body.

    All you need to write REST applications in Perl is LWP. If you want to learn more about REST, I'd browse the list of resources on the RESTwiki. It takes some effort to get into the mindset, but it can lead to some elegant solutions.


    Update: and, of course, the data doesn't have to be XML. Use whatever seems appropriate or most efficient.
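The GET-style call described above can be sketched in a few lines of LWP. The endpoint URL and the `<job-id>` response format are assumptions for illustration only:

```perl
#!/usr/bin/perl
use strict;
use warnings;
use LWP::UserAgent;

# Hypothetical resource URI -- substitute your own server.
my $ua  = LWP::UserAgent->new( timeout => 10 );
my $res = $ua->get('http://example.com/next-job-id/');

if ( $res->is_success ) {
    # Assume the server answers with something like <job-id>42</job-id>.
    my ($job_id) = $res->content =~ m{<job-id>(\d+)</job-id>};
    print defined $job_id ? "next job: $job_id\n" : "no job id in response\n";
}
else {
    warn 'GET failed: ', $res->status_line, "\n";
}
```

No SOAP toolkit, no envelope: the URI names the resource and plain HTTP carries the answer.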

Re: Re: Re: Re: Re: Clustered Perl Applications?
by perrin (Chancellor) on Jul 05, 2003 at 21:17 UTC
    Don't worry about the acronym "REST" and just think about calling URIs on remote machines with LWP (or maybe HTTP::GHTTP for speed). You pass in some data, which could just be a big chunk of Storable if you don't want the overhead of XML, and get back some data, which again could be done with Storable. You implement the remote calls by writing mod_perl handlers (or whatever you like that handles HTTP requests and is fast).

    However, I don't really understand why you're passing around lots of data in the first place. I would implement this sort of thing by having all data in/out go through MySQL tables, and just use these remote calls to trigger operations, not to pass data.
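The server side of such a call can be as small as a plain CGI script; a mod_perl handler does the same work (faster) via the request object instead of STDOUT. The `next_job_id` function here is a hypothetical stand-in for a real id source such as the MySQL tables suggested above:

```perl
#!/usr/bin/perl
# Minimal sketch of a next-job-id resource as a plain CGI script.
# A mod_perl handler would produce the same response via $r.
use strict;
use warnings;

# Hypothetical stand-in for a real id source,
# e.g. an auto_increment column in a MySQL table.
sub next_job_id { return 42 }

my $id = next_job_id();
print "Content-Type: text/xml\r\n\r\n";
print "<job-id>$id</job-id>\n";
```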

      I'm passing around lots of data because I have a few stages of processing, and I'm processing a few terabytes on small machines.

      I have to send chunks of structured data of about 10 KB to 1 MB.

      I'm really beginning to like the easy, lightweight idea of REST.
      Even if I have to use POST.
        I still think you could simply fetch the data in chunks from MySQL and store the result there, avoiding the need to pass it around in your control protocol.

        You will have to use POST to pass any significant amount of data. That shouldn't be a problem. The HTTP modules handle POST just fine.
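Putting the two suggestions together, POST for bulk data and Storable instead of XML, a client call might look like the sketch below. The URL is hypothetical, and the server is assumed to reply with Storable data as well:

```perl
#!/usr/bin/perl
use strict;
use warnings;
use Storable qw(freeze thaw);
use LWP::UserAgent;

# An arbitrary Perl structure standing in for one 10 KB - 1 MB chunk.
my $chunk  = { job => 17, rows => [ [ 1, 'a' ], [ 2, 'b' ] ] };
my $frozen = freeze($chunk);

# POST the frozen bytes to a hypothetical processing resource.
my $ua  = LWP::UserAgent->new;
my $res = $ua->post(
    'http://example.com/process-chunk/',
    'Content-Type' => 'application/octet-stream',
    Content        => $frozen,
);

# Thaw the response back into a structure (eval guards against
# a body that isn't Storable data).
my $result = $res->is_success ? eval { thaw( $res->content ) } : undef;
```

Freezing and thawing skips the XML serialisation overhead entirely, at the cost of tying both ends to Perl.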