punkish has asked for the wisdom of the Perl Monks concerning the following question:

The title is ambiguous. Here goes...

I am developing a complicated web site with both static and dynamic components like so --

http://mystatic/
http://mystatic/mydynamic/

All URIs under /mystatic are static HTML, except those under /mystatic/mydynamic/, which are powered by Perl (CGI::App).

I am developing this on my laptop, and I also have a public-facing web server with the http://mystatic content already installed and running (no mydynamic content on the web server yet).

I would also like to share my work with my team (my testers) via a staging server, which would be accessible at http://mystaging/mystatic/mydynamic/

Obviously, this entails setting up and maintaining three different Perl instances, which, along with the different URLs (the web server has DNS names, while the staging server has directory aliases), makes for a royal pain in the ass from a maintenance perspective.

What suggestions do you have?

One thought I've had is to do away with the staging server and instead stage my work in progress right on the public web server, but under a URI that is accessible only from an internal range of IP addresses. For example: put a "coming soon" page under http://mystatic/mydynamic/comingsoon.html, put the work in progress under http://mystatic/mydynamic_dev/, and restrict access to mydynamic_dev using Apache's Order/Allow/Deny directives. That way I have to deal with only two Perl installations. Of course, if I bugger up, I mean, if I experiment with a Perl module and mess something up, there is the likelihood that everything will go kerplunk, but with care, it could work.
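A minimal sketch of that restriction in Apache 2.2 syntax, assuming an illustrative internal range of 192.168.0.0/16 and an illustrative document-root path (your paths and range will differ):

```apache
# Only the internal network may reach the work-in-progress tree
<Directory /var/www/mystatic/mydynamic_dev>
    Order deny,allow
    Deny from all
    Allow from 192.168.0.0/16
</Directory>
```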

What would monks do?


when small people start casting long shadows, it is time to go to bed

Replies are listed 'Best First'.
Re: advice sought on perl web dev setup
by hesco (Deacon) on May 04, 2010 at 07:49 UTC
    I have certainly done the deal where I stage and deploy at distinct vhosts on the same server. But it's not ideal.

    Having to deploy to both a staging server and a production server actually offers an opportunity to strengthen your Makefile.PL and other deployment tools.

    On my desktop, where my sandbox has grown to over 620 MB over the past few years, it is easy to overlook dependencies. Some of my early releases of new modules got plenty of red marks from the smoke testers because my Makefile.PL failed to list all the code I was relying on; I was never conscious it was underneath what I was doing.
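    The kind of declaration the smoke testers are checking for looks like this in Makefile.PL (the distribution name and module list here are illustrative, not from this thread):

    ```perl
    use ExtUtils::MakeMaker;

    WriteMakefile(
        NAME      => 'My::App',    # hypothetical distribution
        VERSION   => '0.01',
        PREREQ_PM => {
            # Every non-core module the code loads belongs here,
            # even ones that "just happen" to be in your sandbox.
            'CGI::Application' => '0',
            'Template'         => '0',
        },
    );
    ```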

    Cloud providers like Rackspace Cloud or AWS let you spin up a fresh server in a matter of moments (and for pennies an hour) where you get to build everything from scratch. It's an excellent environment in which to work out the details of your deployment code, and automating those builds helps to flush out the bugs.

    CGI::App can have you monitoring CPAN builds for what seems like days, but even that can be nearly automated if you configure cpan to follow dependencies.
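    Following dependencies automatically is a one-time setting in the interactive cpan shell (these are CPAN.pm's standard config options):

    ```
    cpan> o conf prerequisites_policy follow
    cpan> o conf commit
    ```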

    And every now and then, you can take your staging server, shut it down and spin up a new instance with which to test everything again. If you are careful to copy your /etc/shadow and other key config files from /etc, you can migrate your testbed from one staging server to another, without losing your team in the process.

    And every time you do so, you will flush out issues you manually worked through last time but will automate this time for the next time. After a few iterations of that, you'll have pretty strong deployment code.

    -- Hugh

    if( $lal && $lol ) { $life++; }
    if( $insurance->rationing() ) { $people->die(); }
Re: advice sought on perl web dev setup
by mr_mischief (Monsignor) on May 04, 2010 at 07:58 UTC

    My laptop has a similar setup of web server, languages, shell, etc. to both my hosting servers. My development desktop is almost exactly like my hosting servers, except it also has all my development tools and X on it (and PySol and Gweled, and even more browsers than my laptop). Then I have a testing server here in my office that I don't develop on and just test. I can completely wipe it and configure it exactly like my hosting servers or a client's third-party or in-house server.

    Nothing but simple text changes goes live on a client's site until it works on at least two other systems. I find tar, gzip, and scp are very helpful tools for getting the files exactly where I need them. I suggest just biting the bullet and having multiple systems.

    YMMV, but I find that having clients call me and complain about lost business is a lot more hassle than an extra round of tests.

    Since you're already working with allowing and denying IP ranges, you can always deny anything outside the ranges you need for the actual, in-place directory until you're ready to launch it publicly. You could also use URL rewriting or redirection, possibly keyed to the visitor's IP as well.
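    One way the rewriting idea might look with Apache's mod_rewrite, assuming an illustrative office range of 10.0.0.0/8 and the OP's "coming soon" page:

    ```apache
    RewriteEngine On
    # Visitors outside the office range see the placeholder instead
    RewriteCond %{REMOTE_ADDR} !^10\.
    RewriteRule ^/mystatic/mydynamic/ /mystatic/mydynamic/comingsoon.html [R=302,L]
    ```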

    Update: fixed a typo

Re: advice sought on perl web dev setup
by ww (Archbishop) on May 04, 2010 at 11:55 UTC
    My solution would probably depend in some measure on the sensitivity of the material to go into /mydynamic and the relationship between the pages in mystatic and those in /mydynamic (e.g., is there a subordination relationship between multiple static pages and one or more dynamic pages).

    If the sensitivity is low and the relationship depends only on links from a very few static pages, it may even be practical to rely on (gaghhh! I know no one here ever expected this from me) the truly lousy option of "security by obscurity."

    The "few" consideration comes in here: Create a mystatic^test.htm based on each parent mystatic.htm, adding a link to the relevant child-generating CGI::App script, and deploy it and the CGI::App script (obviously, after careful testing on your laptop) to the public-facing server.

    Potential problems: all those you've listed, all those associated with any attempt at security by obscurity; and the need for cleanup later.

    But if the risks of penetration (associated with having artistes of evil or the random set of folks who make just the right typo access your new material) are truly low, this might present a lower cost solution than putting a pure dev-server on line.

Re: advice sought on perl web dev setup
by Your Mother (Archbishop) on May 04, 2010 at 20:30 UTC
    What would monks do?

    I'm not being flippant: I would use Catalyst. One of its great strengths is flexibility in deployment. You can have as many copies of an app running side by side under different web/app roots as you like if you set it up right; all "setting it up right" means is that you use uri_for() to construct URIs in templates so they move with the app's context, and you change the underlying config file (and possibly the engine/server) for each copy. You're only limited by the RAM you've got versus what each copy needs (averages I've seen are 80-150 MB).
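    A rough pure-Perl sketch of why uri_for()-style construction helps (this is not the Catalyst API itself; the app root and path segments here are illustrative):

    ```perl
    #!/usr/bin/perl
    use strict;
    use warnings;

    # The app root comes from per-deployment config, so the same code can
    # run at /mystatic/mydynamic on one box and at
    # /mystaging/mystatic/mydynamic on another.
    my $app_root = '/mystatic/mydynamic';

    sub uri_for {
        my @parts = @_;
        my $path = join '/', $app_root, @parts;
        $path =~ s{/+}{/}g;    # collapse duplicate slashes
        return $path;
    }

    print uri_for('search', 'results'), "\n";
    ```

    Because no template ever hard-codes the root, moving the app to a different URI is a config change, not an edit of every link.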

Re: advice sought on perl web dev setup
by camenix (Acolyte) on May 04, 2010 at 11:58 UTC
    I had similar questions; here is a list of what I thought.
    1. Choose/write a framework that supports your approach.
      Not suited for a "complicated web site".
      As a designer.
    2. Use make/deploy/build/test tools to do it.
      Always a good solution, but it needs hard work.
      As a coder.
    3. Use Apache's rewrite module.
      It's always a fast and easy way, with some limits.
      As an administrator.
    4. Use the Greasemonkey add-on for Firefox.
      It works, but this method can only be used on your personal laptop.
      Deploying scripts to several machines/users would be a nightmare.
      As a user.
    These methods or roles can be mixed. //Add designer/coder/administrator/user