PerlMonks  

A simple web server with HTTP::Daemon and threads

by pg (Canon)
on Oct 30, 2003 at 21:36 UTC [id://303419]

I was reading some of my old posts, and came across this thread HTTP::Daemon not working in threads?.

The entire thread leads people to believe HTTP::Daemon does not work with threads, including my own reply at the time, but that is wrong.

The problem with the code in the original post is that it didn't detach its threads. With the huge memory leaks in 5.8 threads, that is fatal. It can be avoided by calling detach, which tells Perl explicitly that you will not join the thread later.

I simplified my own web server, which is HTTP::Daemon + threads, and put it here. My original code has been running for months with no problems.

use HTTP::Daemon;
use threads;

my $d = HTTP::Daemon->new(
    LocalAddr => $ARGV[0],
    LocalPort => 80,
    Listen    => 20
) || die;

print "Web Server started!\n";
print "Server Address: ", $d->sockhost(), "\n";
print "Server Port: ",    $d->sockport(), "\n";

while (my $c = $d->accept) {
    threads->create(\&process_one_req, $c)->detach();
}

sub process_one_req {
    my $c = shift;
    my $r = $c->get_request;
    if ($r) {
        if ($r->method eq "GET") {
            my $path = $r->url->path();
            $c->send_file_response($path); # or do whatever you want here
        }
    }
    $c->close;
    undef($c);
}

Replies are listed 'Best First'.
Re: A simple web server with HTTP::Daemon and threads
by zentara (Archbishop) on Oct 31, 2003 at 22:36 UTC
    I don't know if it is because the root isn't set for the HTTP server, but I needed to strip the leading slash off of filenames to get this to work when the server is serving files out of its working directory. Otherwise I would get "file not found". Example: /index.html needed to be index.html.

    if ($r->method eq "GET") {
        my $path = $r->url->path();
        $path = substr($path, 1); # strip leading slash
        print "$path\n";
        $c->send_file_response($path); # or do whatever you want here
    }

      A leading slash might be a problem, or it might not. Three things could affect this (actually, the second and third could be considered one point):

      • How you organize and store your web content on your local drive, and how you map those URLs to the actual storage location.
      • In your HTML files, how you specify links: full path or relative path.
      • How users specify the link in their browser: full or relative?

      In general, when you use a path with a leading slash directly, without any mapping, the system understands it as an absolute path starting from the root directory, which might not be what you want.

      A good design practice is to have a mapping function that maps the URL/URI to the actual storage location. This can largely ease your maintenance effort: if you move your stuff around in the future, this is the single point that requires change. Or at least have a constant specifying the base path.
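As a sketch of that mapping idea, a small helper could resolve URL paths against a base directory while also refusing path components that would escape it. The document root and function name below are hypothetical, not from the thread:

```perl
#!/usr/bin/perl
use strict;
use warnings;
use File::Spec;

# Hypothetical document root -- adjust to your own layout.
my $DOC_ROOT = '/var/www/htdocs';

# Map a request URL path to a file under $DOC_ROOT.
# Dropping '' , '.' and '..' components means a request can never
# climb above the root (this discards '..' rather than resolving it).
sub map_url_to_path {
    my ($url_path) = @_;
    $url_path = '/index.html' if $url_path eq '/';
    my @parts = grep { $_ ne '' && $_ ne '.' && $_ ne '..' }
                split m{/}, $url_path;
    return File::Spec->catfile($DOC_ROOT, @parts);
}

print map_url_to_path('/index.html'), "\n";  # /var/www/htdocs/index.html
print map_url_to_path('/img/logo.png'), "\n"; # /var/www/htdocs/img/logo.png
```

In the server above, `$c->send_file_response(map_url_to_path($r->url->path))` would then be the single point to change if the content moves.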

      Hello fellows, I am really not a threads nor an OO expert, but I am still wondering what is wrong with the code below.

      My intention is to write an HTTP proxy in Perl. Once this is working, I am going to use it to cache/serve PNG map tiles from e.g. www.openstreetmap.org.

      Why is it so slow?
      Why does it stop running after a few requests?

      #!/usr/bin/perl
      use HTTP::Daemon;
      use LWP::UserAgent;
      use threads;

      my $proxy = HTTP::Daemon->new(
          LocalPort => 3128,
          Listen    => 20,
          Reuse     => 1
      ) || die;

      while (my $conn = $proxy->accept) {
          threads->create(\&process_one_req, $conn)->detach();
      }

      sub process_one_req {
          my $conn    = shift;
          my $request = $conn->get_request;
          my $ua      = LWP::UserAgent->new;
          print $request->uri, "\n";
          my $response = $ua->simple_request($request);
          $conn->send_response($response);
          $conn->close;
          undef($conn);
          undef($ua);
      } # end sub process_one_req
        To get the code to work reliably on Windows 7 (ActiveState Perl v5.12.4), I had to add a sleep of at least 15 ms after the thread creation. I use 50 ms for good measure:
        use Time::HiRes qw( usleep );
        ...
        while (my $c = $d->accept) {
            threads->create(\&process_one_req, $c)->detach();
            usleep(50_000);
        }
        Update: after attempting AJAX POSTs with some jQuery/JavaScript, the Perl web server code was not reliable. I seemed to lose AJAX messages. (As a workaround, I installed an AJAX error handler in the JavaScript code, which performs the same AJAX POST again (well, using setTimeout)... that worked, since I can tolerate multi-second delays -- the app updates graphs that are generated only every 4 seconds.)

        Update: after playing around with this some more, I was able to get reliable operation, and handle multiple browsers and multiple requests per connection/browser if I closed the client connection/socket in the server, and the daemon/server socket in the client:

        use HTTP::Daemon;
        use threads;

        my $d = HTTP::Daemon->new(
            LocalAddr => $ARGV[0],
            LocalPort => 80,
            Reuse     => 1,
            Listen    => 20
        ) || die;

        print "Web Server started, server address: ", $d->sockhost(),
              ", server port: ", $d->sockport(), "\n";

        while (my $c = $d->accept) {
            threads->create(\&process_client_requests, $c)->detach;
            $c->close; # close client socket in server
        }

        sub process_client_requests {
            my $c = shift;
            $c->daemon->close; # close server socket in client
            while (my $r = $c->get_request) {
                if ($r->method eq "GET") {
                    my $path = $r->url->path();
                    $c->send_file_response($path) or die $!; # or do whatever you want here
                } else {
                    print "unknown method " . $r->method . "\n";
                }
            }
            $c->close;
        }
Re: A simple web server with HTTP::Daemon and threads
by nikos (Scribe) on Feb 07, 2010 at 12:48 UTC
    Hi there,

    I'm trying to implement a very simple HTTP server for my own client. I tried the example provided. Unfortunately, it is still leaking as badly as 100 megabytes per 100-200 requests. It's also very slow at processing requests, even when it returns the very same file every time. I'm stuck. Please help! Any ideas what to change?

    I'm trying to run it on Windows and FreeBSD. On the Windows platform it hits the stuck-threads bug described here: HTTP::Daemon not working in threads?.

    I'm using apache2(worker-mpm)/mod_perl2/perl(threads) right now to run my server-side script. Its performance is not what I expect even after numerous optimizations. That's why I was trying to implement a simple server to process these requests. But right now it's 10(ten) times slower than the current system.

    Thank you!

      It's hard to expect that an HTTP server rewritten in Perl will have better performance than Apache. Are you sure that the problem is in Apache and not in your scripts? If your script often returns the same data, you can improve performance by placing a reverse proxy in front of Apache; nginx is widely used for this purpose.
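As an illustration of that reverse-proxy suggestion, a minimal nginx configuration sketch might look like the following. The backend address, cache path, and cache lifetime are assumptions for the example, not details from this thread:

```nginx
# Minimal sketch: nginx caching reverse proxy in front of Apache,
# assuming Apache listens on 127.0.0.1:8080.
http {
    proxy_cache_path /var/cache/nginx keys_zone=appcache:10m;

    server {
        listen 80;

        location / {
            proxy_pass http://127.0.0.1:8080;
            proxy_cache appcache;
            proxy_cache_valid 200 1m; # serve cached 200 responses for up to a minute
        }
    }
}
```

With a setup like this, repeated requests for the same data are answered from nginx's cache and never reach Apache or the Perl script.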

        I was experimenting with threads in Perl, and Apache/mod_perl seemed to be overkill for such a simple request/response script - that's why I tried to implement it purely in Perl. Of course, the main problem is DB requests. Still, apache/mod_perl takes a tremendous amount of memory. Anyway, I kept using apache/mod_perl and got rid of DB requests entirely: I stopped using mysql to keep my data and switched to memcachedb (not memcached, but memcachedb). I still have the problem with memory leaks when using threads in Perl. Luckily, I don't experiment with it anymore. Thank you!

Node Type: CUFP [id://303419]
Approved by davido
Front-paged by jeffa