
Simulating a Two-Domain Website on the Localhost

by shlomif (Beadle)
on Oct 31, 2005 at 17:18 UTC ( #504365=CUFP )

The Problem

My homesite spreads across two domains: http://www.shlomifish.org/ and http://vipe.technion.ac.il/~shlomif/ . Some pages are found on one domain and some on the other, with cross-links between the two places. Why this is so, I'll explain later on; meanwhile, let's get to my problem.

The site is a static HTML one. I build it from its source on my local machine, and then update the live copies on both servers using rsync. I have both domains served on my localhost web-server under two different aliases, so I can test them there.

The problem is that the pages contain cross-links to the other domain, and as a result those links point to the real pages on the Internet instead of the ones I've compiled and placed on my hard disk.

The Solution

So what do we do? After putting up with this behaviour for a long time, I decided that I'd like the links to one domain to point to its local alias instead of its full web address. I needed a way to serve the files specially, rewriting their text on the fly. I investigated various Apache modules - mod_rewrite, etc. - but was told they don't do that. The best I could do with Apache alone was to write an output filter, but that proved incredibly complicated.

So, instead I opted to write two Perl CGI scripts that serve the files from the directory tree that contains them, modified by two simple transformations: "http://www.shlomifish.org/" to "http://localhost/sites/hp/shlomif/" and "http://vipe.technion.ac.il/~shlomif/" to "http://localhost/sites/hp/vipe/".
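At its core, the rewrite is just a pair of global substitutions on the HTML text. Here is a minimal, self-contained sketch of that idea (the domain-to-alias mapping is the one used by the full module below; the `localize_links` helper name is just for illustration):

```perl
#!/usr/bin/perl

use strict;
use warnings;

# Rewrite absolute links to the two live domains so that they
# point at the localhost aliases instead.
sub localize_links {
    my ($text) = @_;

    $text =~ s{http://www\.shlomifish\.org/}{http://localhost/sites/hp/shlomif/}g;
    $text =~ s{http://vipe\.technion\.ac\.il/~shlomif/}{http://localhost/sites/hp/vipe/}g;

    return $text;
}

my $html = q{<a href="http://www.shlomifish.org/open-source/">My software</a>};
print localize_links($html), "\n";
# Prints: <a href="http://localhost/sites/hp/shlomif/open-source/">My software</a>
```

Since only `text/html` responses should be touched, the real script applies this after checking the MIME type of the file being served.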

I ended up writing it as a module and two very simple files. The module is:

package ShlomifServe;

use strict;
use warnings;

use CGI;
use IO::All;
use MIME::Types;

sub serve
{
    my (%args) = (@_);
    my $dir_to_serve = $args{'dir_to_serve'};
    my $cgi = CGI->new();
    my $mimetypes = MIME::Types->new();
    my $path = $cgi->path_info();

    if (grep { ($_ eq ".") || ($_ eq "..") } (split /\//, $path))
    {
        print $cgi->header();
        print "<html><body>You suck! Don't use .. or . as path components</body></html>";
        exit(0);
    }

    if ($path =~ m{/$})
    {
        if (-f $dir_to_serve.$path."index.html")
        {
            $path .= "index.html";
        }
        else
        {
            opendir D, $dir_to_serve.$path;
            my @files = (grep { $_ ne "." } readdir(D));
            closedir(D);

            print $cgi->header();
            my $title = "Listing for " . CGI::escapeHTML($path);
            my $files_string = join("",
                map {
                    my $fn = CGI::escapeHTML($_) .
                        ((-d $dir_to_serve.$path."/".$_) ? "/" : "");
                    qq{<li><a href="$fn">$fn</a></li>\n}
                } @files
            );
            print <<"EOF";
<html>
<head>
<title>$title</title>
</head>
<body style="background-color:white">
<h1>$title</h1>
<ol>
$files_string
</ol>
</body>
</html>
EOF
            exit(0);
        }
    }

    my $file_full_path = $dir_to_serve.$path;
    my $text = io()->file($file_full_path)->slurp();
    my $mime_type = $mimetypes->mimeTypeOf($file_full_path);

    if ($mime_type eq "text/html")
    {
        $text =~ s!http://www\.shlomifish\.org/!http://localhost/sites/hp/shlomif/!g;
        $text =~ s!http://vipe\.technion\.ac\.il/\~shlomif/!http://localhost/sites/hp/vipe/!g;
    }

    print "Content-Type: $mime_type\n\n";
    print $text;
    exit(0);
}

1;

And the files are:

#!/usr/bin/perl

use strict;
use warnings;

use lib '/var/www/perl/shlomi';
use ShlomifServe;

ShlomifServe::serve(
    'dir_to_serve' => "/home/shlomi/Docs/homepage/homepage/trunk/dest/t2-homepage"
);


#!/usr/bin/perl

use strict;
use warnings;

use lib '/var/www/perl/shlomi';
use ShlomifServe;

ShlomifServe::serve(
    'dir_to_serve' => "/home/shlomi/Docs/homepage/homepage/trunk/dest/vipe-homepage"
);
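To wire the two scripts into Apache, one way is a pair of ScriptAlias directives, so that each localhost alias is served through its rewriting script and the rest of the URL arrives as PATH_INFO. This is an illustrative sketch, not my actual configuration - the script filenames (serve-t2.pl, serve-vipe.pl) are hypothetical:

```apache
# Hypothetical httpd.conf excerpt - adjust the paths and script
# names to your own setup. The trailing slash after the script
# path makes the remainder of the URL available as PATH_INFO,
# which is what $cgi->path_info() reads in the module.
ScriptAlias /sites/hp/shlomif/ /var/www/perl/shlomi/serve-t2.pl/
ScriptAlias /sites/hp/vipe/ /var/www/perl/shlomi/serve-vipe.pl/
```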

The IO::All and MIME::Types modules were a great help.

The code is freely available under the MIT X11 License, but will require some tweaking by anyone who wishes to use it.

Running it as plain CGI was somewhat sluggish, but running it under mod_perl made it very responsive on my computer. Using this script, the original HTML, which needs to be uploaded to the server, remains intact, while I can browse the two local domain sites at ease, as served by the script.
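For the mod_perl setup, a common approach at the time was to run unmodified CGI scripts under Apache::Registry (mod_perl 1.x), which compiles each script once and reuses it across requests instead of forking a new interpreter every time. A hedged sketch, with illustrative paths:

```apache
# Hypothetical mod_perl 1.x configuration; the /perl/ location and
# directory path are illustrative. Apache::Registry caches the
# compiled CGI scripts, which is what removes the per-request
# startup cost.
PerlModule Apache::Registry
Alias /perl/ /var/www/perl/shlomi/
<Location /perl>
    SetHandler perl-script
    PerlHandler Apache::Registry
    Options +ExecCGI
    PerlSendHeader On
</Location>
```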

Appendix: Why the Two Domains?

In my last workplace before I started my B.Sc. studies in the Technion, I was allowed to host my personal site on the company's web-site under the /~shlomi/ alias. Later on, when I joined the Technion and stopped working at that company, I placed it under my account on the Technion's undergraduates' server (t2), so the address changed accordingly. It was still plain HTML pages, without much style and without a consistent look and feel.

After a while, I also got an account on vipe.technion.ac.il, a server maintained by the students. I started setting up some pages there, because I had a very limited quota on t2. From then on, I had links from one place to the other.

Shortly before the end of my undergraduate studies, knowing that my account on t2 was going to be terminated, I realized that I needed a more permanent location for the homepage. So I started hosting it on a different server: first at an address that ended up causing me a lot of DNS problems, and then on www.shlomifish.org (a domain I bought). I changed all the links I could (such as the one in my E-mail signatures) to point to the new place. Eventually, Google and other search engines adapted.

Meanwhile, however, many resources still remained under vipe.technion.ac.il, so I kept them there so the links wouldn't break. The fact that my homesite spans two domains affected the feature set of the technologies I created to maintain my site, like Latemp or HTML-Widgets-NavMenu.
