
Writing/Debugging locally and then uploading

by coolmichael (Deacon)
on Jun 28, 2001 at 08:22 UTC (#92178=perlquestion)
coolmichael has asked for the wisdom of the Perl Monks concerning the following question:

I have a server running Linux and Apache. I plan to work on some simple CGI scripts, but I don't want to upload them to the server every time I make a change just to debug them. I've also got a box running Windows 2000, and I've been thinking about installing Apache on it so I can write, run, and debug the scripts locally before uploading them to the server for good.

How does everyone else deal with this? Are there any pitfalls I need to be aware of, besides making sure all the modules are installed in both places?

Sorry if this has been answered before, but I couldn't think of a way to describe this question for a search engine. Editors, if this is better suited somewhere else, like meditations, please let me know and I'll happily move it.

the blue haired monk.


Replies are listed 'Best First'.
Re: Writing/Debugging locally and then uploading
by clemburg (Curate) on Jun 28, 2001 at 13:11 UTC

    Easy. Treat your CGI programs the way you would build an app in a lower-level language like C: use a Makefile!

    Basically, all you need to do is to create Makefile entries for:

    • testing your program (locally) (target test)
    • debugging your program (target debug)
    • uploading your program to the Linux server and running an LWP script to test your program on the Linux server (target publish)

    Then you just write your script, add it to the Makefile entries, and say:

    make test

    If all is OK, you say:

    make publish

    Else you just say:

    make debug

    Ah, you are on Windows. Of course then you use "nmake" instead of "make".
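    A minimal sketch of such a Makefile. The script name, test files, host, and paths below are all made-up illustrations, not anything from the thread (and recipe lines must be indented with a tab):

    ```make
    # Hypothetical names: adjust SCRIPT, HOST, and DEST for your own setup.
    PERL   = perl
    SCRIPT = myscript.pl
    HOST   = user@example.com
    DEST   = /var/www/cgi-bin

    test:
    	$(PERL) -cw $(SCRIPT)     # compile check with warnings enabled
    	$(PERL) t/local.t         # run a local test script

    debug:
    	$(PERL) -d $(SCRIPT)      # step through in the Perl debugger

    publish: test
    	scp $(SCRIPT) $(HOST):$(DEST)/
    	$(PERL) t/remote.t        # LWP script hitting the live URL
    ```

    Under ActivePerl on Windows the same targets would be driven with nmake instead of make, modulo nmake's syntax differences.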

    Christian Lemburg
    Brainbench MVP for Perl

Re: Writing/Debugging locally and then uploading
by toadi (Chaplain) on Jun 28, 2001 at 15:19 UTC
    clemburg gave a nice suggestion. But what I also do is use Samba. With it you can mount your Linux dirs on your Windows box...

    So I don't have to copy it each time to test it. Easy as hell.
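    For reference, such a Samba share might look like the following sketch; the share name, path, and user are made up, not toadi's actual config:

    ```
    # /etc/samba/smb.conf on the Linux box -- hypothetical share
    [cgi-bin]
        path = /var/www/cgi-bin
        valid users = mike
        read only = no
    ```

    On the Windows side you would then map it to a drive letter, e.g. `net use P: \\linuxbox\cgi-bin`, and edit the files in place.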

    My opinions may have changed,
    but not the fact that I am right

      Try WebDrive; it lets you map a drive letter to an FTP site, meaning you can edit files as if they were local. Saved me hours for sure.
Re: Writing/Debugging locally and then uploading
by RatArsed (Monk) on Jun 28, 2001 at 15:27 UTC
    My approach verges on the insane: I use Apache on my desktop and keep a shell window open to the folder containing the script under development. I then have two virtual hosts on the production server, one for testing and one for live, and test in the test virtual host before going live.

    With this approach, you get rid of most of the bugs before the code leaves your development machine. Testing on the test virtual host then checks machine specifics (module versions, file paths, etc.) without endangering the production system. If all goes to plan, the production server is the next to get the script, and you can quickly sanity-check it there.

    As you can see, I go for the paranoid approach :o)
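    The two virtual hosts on the production server might be declared along these lines (an Apache 1.3-style sketch; the hostnames and paths are made up):

    ```apache
    # httpd.conf -- hypothetical hostnames and paths
    NameVirtualHost *

    <VirtualHost *>
        ServerName   test.example.com
        DocumentRoot /var/www/test
        ScriptAlias  /cgi-bin/ /var/www/test/cgi-bin/
    </VirtualHost>

    <VirtualHost *>
        ServerName   www.example.com
        DocumentRoot /var/www/live
        ScriptAlias  /cgi-bin/ /var/www/live/cgi-bin/
    </VirtualHost>
    ```

    The test host gets the new script first; only after it behaves there does the same file land in the live host's cgi-bin.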


Re: Writing/Debugging locally and then uploading
by dvergin (Monsignor) on Jun 28, 2001 at 09:43 UTC
    My setup is like yours. Here's what I do: I keep a web browser, a telnet client, and my editor (CodeWright, with an integrated FTP client) open on my Windoze box.

    I code a while. Then hit upload to fire off my work to the server. Then I request the appropriate page in my browser.

    If there are any problems, I run this shell script (as root -- so it can write to the error log) on the server using the telnet client:

    if [ "$#" -eq 1 ]; then
        lines=$1
    else
        lines=50
    fi
    echo
    echo
    echo "----------printing $lines lines----------"
    log_file="/usr/local/apache/logs/"
    tail -n$lines $log_file
    echo "#####################################" >> $log_file
    date >> $log_file
    echo "#####################################" >> $log_file
    And then back to my editor to fix or build some more.

    The lines in the shell script that append to the log_file mark it and time/date stamp it each time I dump the tail. This way, I can easily see which error entries are new with the most recent page load attempt. Very handy.

    The rhythm of this entire cycle has become second nature to me. There is a 2-5 second delay with each save, but I don't even notice that any more. And the differences between the server's environment and my box's are big enough that there would surely be surprises even if I tested first on my local machine. It's just not worth it.

    Everybody's different but this works for me. HTH

Re: Writing/Debugging locally and then uploading
by TheoPetersen (Priest) on Jun 28, 2001 at 16:26 UTC
    It sounds like you're on the right track, though changing operating systems is inevitably going to add some difficulties here and there. But I do roughly the same thing.

    The key to making it work for me was to arrange the scripts, templates, and modules so that everything was under one root directory that I could upload via rsync. That way I didn't mistakenly upload some pieces and not others, and lose time debugging that kind of problem.
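    As a sketch of that single-root upload (the directory names and remote host here are assumptions, not the poster's actual layout):

    ```shell
    # Everything lives under one root; mirror the whole tree in one go.
    # The remote form would look something like (hypothetical host/path):
    #   rsync -avz site/ user@example.com:/var/www/site/
    # Local demonstration of the same mirroring:
    mkdir -p site/cgi-bin site/templates
    echo '#!/usr/bin/perl' > site/cgi-bin/app.pl
    echo '<html></html>'   > site/templates/page.tmpl
    rsync -a site/ staged/        # -a preserves permissions and timestamps
    ls staged/cgi-bin/app.pl staged/templates/page.tmpl
    ```

    Because rsync mirrors the tree as a unit, a script and the templates it depends on can never get out of step on the server.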

    If the site you are building is live, I'd suggest uploading to some kind of staging area, then cutting over to the live site after a quick test. Changing environments often reveals things that even diligent local testing won't find.
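    One common way to do that cutover (a sketch with made-up paths; the poster doesn't name a mechanism) is to upload each version into its own staging directory and flip a symlink that the live DocumentRoot points at:

    ```shell
    # Hypothetical layout: releases/<version> holds each upload, and
    # "live" is the symlink the web server's DocumentRoot points at.
    mkdir -p releases/v1 releases/v2
    echo "old" > releases/v1/index.html
    echo "new" > releases/v2/index.html

    ln -sfn releases/v1 live      # current live site
    # ... test releases/v2 in the staging area, then cut over:
    ln -sfn releases/v2 live      # near-atomic switch, easy to roll back
    cat live/index.html
    ```

    Rolling back is just pointing the symlink at the previous release again.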
