
DBD::SQLite - Can't Connect to file - Solved

by wjw (Priest)
on Dec 29, 2018 at 02:20 UTC ( [id://1227802] )

wjw has asked for the wisdom of the Perl Monks concerning the following question:

I recently ran into an issue I have not seen before.

Scenario:

I am running a simple script on a Raspberry Pi which reads data from some sensors attached to an Arduino; the readings arrive at the RPi over USB serial as JSON. The script uses Device::SerialPort::Arduino to read the data, decodes the JSON, and pumps the resulting values into my SQLite3 database. I am switching from an RPi2 to an RPi3 and have moved everything from one to the other. Oddly enough, my Perl script is the only thing that is failing. The environment has been thoroughly checked, including the apache2 setup, file locations, permissions, mods enabled, Perl packages installed, etc.

The Specific Issue

My script reaches the line where the execute on a query is to take place, and I get an error which says "DBD::SQLite::st execute failed: unable to open database file at ./input_handler.pl line 68." Line 68 is the execute portion of the SQLite interaction. All the code (not much) is included below. What I don't get, and have not seen before, is that opening the database succeeds just fine, but the execute directive then fails with this kind of error.

Any input appreciated... Thanks in advance

Updated to correct a couple of typos/misquotes

#!/usr/bin/perl -w
use Modern::Perl;
use Device::SerialPort::Arduino;
use DBI;
#use Proc::Daemon;
use DateTime;
use JSON;

my $dbfile = "/var/www/data/weather_data.sqlt";
#use Data::Dumper;
#my $dt = DateTime->now();

my ($key, $val, $q, $sth, $rec_cnt, $char, $json);
my $debug = 0;    # set to 1 if debug print statements are to be displayed
my $delay = 29;   # must be set to less than the arduino rpt_ms/1000
##################################################################################
# Set up the serial port
if ($debug) { say "Setting up serial port\n"; }
my $dev = Device::SerialPort->new("/dev/ttyACM0")
    || warn "Can't open /dev/ttyACM0 $!";
my $profile = 0;               # set to '1' when running NYTProfile. Limits run length...
$dev->baudrate(115200);        # you may change this value
$dev->databits(8);             # but not this and the two following
$dev->parity("none");
$dev->stopbits(1);
$dev->read_char_time(0);       # don't wait for each character
$dev->read_const_time(1500);   # 1.5 seconds per unfulfilled "read" call
##################################################################################
while (1) {
    # Poll to see if any data is coming in
    if ($debug) { say "Entering polling...." }
    my $char = $dev->lookfor();
    if ($char) {
        if ($debug == 1) { say "Raw input is -> $char"; }
        $rec_cnt++;
        next if $rec_cnt == 1;   # The first incoming line tends to be two lines combined; skip it.
        my $line = $char;
        chomp($line);
        $json = decode_json($line);
        process_data(\$json);    # now process the completed incoming json object, inserting it into the DB
    }
    if ($debug == 1) {
        say "return after $delay seconds of sleeping\n";
    }
    # The arduino sends data every 30 seconds (currently). Let the Perl script
    # give up its process resources until we need it. Change here if changed
    # in the Arduino!!
    sleep($delay);
}
##################################################################################
sub process_data {
    if ($debug) { say "Processing polled data...." }
    my $json_line = shift(@_);
    my $dt = DateTime->now();
    $dt->set_time_zone('America/Chicago');
    my $sth;
    my $dbh = DBI->connect("dbi:SQLite:dbname=$dbfile", "", "");
    if ( exists($json->{'bbl'}) and not exists($json->{'ra'}) ) {
        $q = "insert into weather_data values (?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?)";
        $sth = $dbh->prepare($q);
        $sth->execute(undef, $dt->datetime(), $dt->ymd(), $dt->hms(),
            $json->{'wd'},  $json->{'wv'},  $json->{'tF'},  $json->{'tC'},
            $json->{'bp'},  $json->{'rh'},  $json->{'li'},  $json->{'ov'},
            $json->{'lux'}, $json->{'bbl'}, $json->{'irl'}, 'null', 'null');
    }
    elsif ( exists($json->{'ra'}) and exists($json->{'bbl'}) ) {
        $q = "insert into weather_data values (?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?)";
        $sth = $dbh->prepare($q);
        $sth->execute(undef, $dt->datetime(), $dt->ymd(), $dt->hms(),
            $json->{'wd'},  $json->{'wv'},  $json->{'tF'},  $json->{'tC'},
            $json->{'bp'},  $json->{'rh'},  $json->{'li'},  $json->{'ov'},
            $json->{'lux'}, $json->{'bbl'}, $json->{'irl'}, $json->{'ra'}, $json->{'rr'});
    }
    elsif ( exists($json->{'ra'}) and not exists($json->{'bbl'}) ) {
        $q = "insert into weather_data values (?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?)";
        $sth = $dbh->prepare($q);
        $sth->execute(undef, $dt->datetime(), $dt->ymd(), $dt->hms(),
            $json->{'wd'}, $json->{'wv'}, $json->{'tF'}, $json->{'tC'},
            $json->{'bp'}, $json->{'rh'}, $json->{'li'}, $json->{'ov'},
            'null', 'null', 'null', $json->{'ra'}, $json->{'rr'});
    }
    elsif ( not exists($json->{'ra'}) and not exists($json->{'bbl'}) ) {
        $q = "insert into weather_data values (?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?)";
        $sth = $dbh->prepare($q);
        $sth->execute(undef, $dt->datetime(), $dt->ymd(), $dt->hms(),
            $json->{'wd'}, $json->{'wv'}, $json->{'tF'}, $json->{'tC'},
            $json->{'bp'}, $json->{'rh'}, $json->{'li'}, $json->{'ov'},
            'null', 'null', 'null', 'null', 'null');
    }
    $sth->finish;
    $dbh->disconnect;
}

...the majority is always wrong, and always the last to know about it...

A solution is nothing more than a clearly stated problem...

Replies are listed 'Best First'.
Re: DBD::SQLite - Can't Connect to file...
by bliako (Monsignor) on Dec 29, 2018 at 09:30 UTC

    File permissions, *directory* permissions, and ownership (the files must be writable by the user the inserting process runs as, e.g. chown www-data /var/www/data) are the usual suspects.

      Thanks for those reminders. I have in fact checked to ensure that the permissions are correct. The process running inserts is owned by www-data.

      I have tested an insert both by hand and via phpLiteAdmin to ensure the database can take inserts as well.

      Again, thanks for the reminders.

      Update: I was wrong. Permissions/ownership on the files under /var/www were correct; however, permissions on the directories themselves were incorrect. Such a basic mistake to make! Ah well, doing things by the light of midnight oil is my excuse... .
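      (Editor's aside, a minimal sketch not from the thread: SQLite needs write permission on the directory containing the database, not just on the .sqlt file itself, because it creates a journal file alongside the database during writes. A quick check along these lines, assuming the path from the original post:)

```perl
use strict;
use warnings;
use File::Basename qw(dirname);

my $dbfile = "/var/www/data/weather_data.sqlt";   # path from the original post
my $dir    = dirname($dbfile);

# SQLite creates weather_data.sqlt-journal (or -wal) in $dir on every write,
# so the directory itself must be writable and searchable by the process user.
warn "$dbfile is not writable by this user\n"                 unless -w $dbfile;
warn "$dir is not writable (journal file can't be created)\n" unless -w $dir;
warn "$dir is not searchable (directory is missing +x)\n"     unless -x $dir;
```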

      Thanks again for the help!

      ...the majority is always wrong, and always the last to know about it...

      A solution is nothing more than a clearly stated problem...

Re: DBD::SQLite - Can't Connect to file - Solved
by roboticus (Chancellor) on Dec 30, 2018 at 18:43 UTC

    wjw:

    It sounds like you've already got your SQLite issues sorted out. But looking at your code, I thought I'd make a couple of suggestions regarding your program's structure, as well as the interaction of your process_data subroutine and the database.

    Before diving in, though, I'll apologize in advance: this node got long and I rambled on a bit. I've hit a time deadline, so I can't finish editing/tuning my response, and the disorganized mess (i.e. the Appendix) isn't where I wanted it to be. Also, *none* of the code has been tested yet, so ping me if there are any problems/questions/etc. I'll be sure to check back this evening when I get back.

    First of all, the readability of your process_data subroutine isn't very good: there's a lot of redundant code that makes it hard to see what's actually happening. Since your insert statement is always the same, rather than having multiple copies of it and the associated prepare calls, just move them above your if statements:

    sub process_data {
        if ($debug) { say "Processing polled data...." }
        my $json_line = shift(@_);
        my $dt = DateTime->now();
        $dt->set_time_zone('America/Chicago');
        my $sth;
        my $dbh = DBI->connect("dbi:SQLite:dbname=$dbfile", "", "");
        $q = "insert into weather_data values (?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?)";
        $sth = $dbh->prepare($q);
        if ( exists($json->{'bbl'}) and not exists($json->{'ra'}) ) {
            $sth->execute(undef, $dt->datetime(), $dt->ymd(), $dt->hms(),
                $json->{'wd'},  $json->{'wv'},  $json->{'tF'},  $json->{'tC'},
                $json->{'bp'},  $json->{'rh'},  $json->{'li'},  $json->{'ov'},
                $json->{'lux'}, $json->{'bbl'}, $json->{'irl'}, 'null', 'null');
        }
        elsif ( exists($json->{'ra'}) and exists($json->{'bbl'}) ) {
            $sth->execute(undef, $dt->datetime(), $dt->ymd(), $dt->hms(),
                $json->{'wd'},  $json->{'wv'},  $json->{'tF'},  $json->{'tC'},
                $json->{'bp'},  $json->{'rh'},  $json->{'li'},  $json->{'ov'},
                $json->{'lux'}, $json->{'bbl'}, $json->{'irl'}, $json->{'ra'}, $json->{'rr'});
        }
        elsif ( exists($json->{'ra'}) and not exists($json->{'bbl'}) ) {
            $sth->execute(undef, $dt->datetime(), $dt->ymd(), $dt->hms(),
                $json->{'wd'}, $json->{'wv'}, $json->{'tF'}, $json->{'tC'},
                $json->{'bp'}, $json->{'rh'}, $json->{'li'}, $json->{'ov'},
                'null', 'null', 'null', $json->{'ra'}, $json->{'rr'});
        }
        elsif ( not exists($json->{'ra'}) and not exists($json->{'bbl'}) ) {
            $sth->execute(undef, $dt->datetime(), $dt->ymd(), $dt->hms(),
                $json->{'wd'}, $json->{'wv'}, $json->{'tF'}, $json->{'tC'},
                $json->{'bp'}, $json->{'rh'}, $json->{'li'}, $json->{'ov'},
                'null', 'null', 'null', 'null', 'null');
        }
        $sth->finish;
        $dbh->disconnect;
    }

    OK, that's a bit better. Now it's a little easier to see what's going on. Looking at your logic, you've got two sets of variables whose NULL case you're trying to handle. If the 'ra' data element is missing, then you want the 'ra' and 'rr' elements to be 'null'. Similarly, if the 'bbl' element is missing, you want to set 'bbl', 'irl' and 'lux' to 'null'. If you just edit the $json structure, you can fix up the data in your if statements and then execute the statement once afterward, further simplifying your code to:

    sub process_data {
        if ($debug) { say "Processing polled data...." }
        my $json_line = shift(@_);
        my $dt = DateTime->now();
        $dt->set_time_zone('America/Chicago');
        my $sth;
        my $dbh = DBI->connect("dbi:SQLite:dbname=$dbfile", "", "");
        $q = "insert into weather_data values (?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?)";
        $sth = $dbh->prepare($q);
        if ( not exists($json->{'ra'}) ) {
            $json->{ra} = $json->{rr} = 'null';
        }
        if ( not exists($json->{'bbl'}) ) {
            $json->{bbl} = $json->{irl} = $json->{lux} = 'null';
        }
        $sth->execute(undef, $dt->datetime(), $dt->ymd(), $dt->hms(),
            $json->{'wd'},  $json->{'wv'},  $json->{'tF'},  $json->{'tC'},
            $json->{'bp'},  $json->{'rh'},  $json->{'li'},  $json->{'ov'},
            $json->{'lux'}, $json->{'bbl'}, $json->{'irl'}, $json->{'ra'}, $json->{'rr'});
        $sth->finish;
        $dbh->disconnect;
    }

    That's certainly easier to understand. Now let's take a look at performance/efficiency: The way you have things right now, for each record you're connecting to the database, preparing your statement, editing your data, executing the insert statement, and then closing the database connection. In typical database applications, you'd want to reuse your database connection and insert statement to reduce the amount of work your system has to perform. For your current application, it may not be a concern, but in the general case of database wrangling, you can often gain a good bit of performance by reusing connections and statement handles. So I'd restructure your code a little:

    # Up above your main loop
    my $DBH;
    my $STH = setup_database();

    while (1) {
        . . . SNIP . . .
    }

    $DBH->disconnect;

    sub setup_database {
        $DBH = DBI->connect("dbi:SQLite:dbname=$dbfile", "", "");
        my $q = "insert into weather_data values (?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?)";
        return $DBH->prepare($q);
    }

    sub process_data {
        if ($debug) { say "Processing polled data...." }
        my $json_line = shift(@_);
        my $dt = DateTime->now();
        $dt->set_time_zone('America/Chicago');
        if ( not exists($json->{'ra'}) ) {
            $json->{ra} = $json->{rr} = 'null';
        }
        if ( not exists($json->{'bbl'}) ) {
            $json->{bbl} = $json->{irl} = $json->{lux} = 'null';
        }
        $STH->execute(undef, $dt->datetime(), $dt->ymd(), $dt->hms(),
            $json->{'wd'},  $json->{'wv'},  $json->{'tF'},  $json->{'tC'},
            $json->{'bp'},  $json->{'rh'},  $json->{'li'},  $json->{'ov'},
            $json->{'lux'}, $json->{'bbl'}, $json->{'irl'}, $json->{'ra'}, $json->{'rr'});
    }

    (Note: you rarely need to call $STH->finish(). In fact, I can't recall a case where I ever needed it, and I've done a *LOT* of database wrangling with Perl/DBI, so I removed it. You also don't really have to call $DBH->disconnect if it's at the end of your program. But there are many cases where you may wish to disconnect from the database, so I left that in, though I do it after the end of the loop.)
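    One more tweak worth considering (my addition, and like the rest of this node untested): turn on DBI's RaiseError attribute so a failing connect or execute dies with the underlying SQLite message, instead of quietly returning undef:

```perl
use DBI;

my $dbfile = "/var/www/data/weather_data.sqlt";

# With RaiseError set, DBI throws an exception on any failed call, so an
# "unable to open database file" error would stop the script at the exact
# statement that failed instead of being easy to miss.
my $dbh = DBI->connect(
    "dbi:SQLite:dbname=$dbfile", "", "",
    { RaiseError => 1, PrintError => 0, AutoCommit => 1 },
) or die "Can't connect: $DBI::errstr";
```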

    One big concern I would have with your code is the insert statement. If your project grows, eventually you'll have multiple programs interacting with the database. As such, you may find yourself modifying the table structures and fixing up various parts of the program. However, your insert statement uses the default column set of your weather_data table. If you ever alter the order of columns in your table, you may get subtle bugs in your system.

    In fact, you've already encountered this issue to a minor extent: you're always passing undef as the first parameter of your execute statement because the first column of your weather_data table is a value you're not getting in your data. It's considered better practice to explicitly list the columns you're going to pull out of, or insert into, your database. That way you can change column order, add columns, and delete columns with fewer chances for errors. So the first thing I'd do is change your insert statement to something like the following, and delete the first parameter from your execute statement:

    my $q = "insert into weather_data("                     .
            "  reading_DTM, reading_YMD, reading_HMS, wd, " .
            "  wv,          tF,          tC,          bp, " .
            "  rh,          li,          ov,          lux," .
            "  bbl,         irl,         ra,          rr "  .
            ") "            .
            "values ("      .
            "  ?, ?, ?, ?," .
            "  ?, ?, ?, ?," .
            "  ?, ?, ?, ?," .
            "  ?, ?, ?, ?"  .
            ")";

    You'll note that I've split the insert statement into multiple lines and added a lot of whitespace. (I do it this way to make things line up, as it helps me catch errors more easily. In fact, I'd normally add \n before the closing quote on each line, so if I happen to print $q during debugging, it'll look the way I expect it to.) I also just guessed the column names, and I'm certainly wrong, so you'll have to edit the names appropriately.

    With those changes, I'd be pretty happy with the code.

    Here's where I ran out of time, so the rest of my response is a few other bits and random thoughts that occurred to me while editing, that I couldn't integrate cleanly into my reply, but didn't want to delete, so I'm just leaving them as appendices...


    Appendix 1: Possible issue with editing the $json data in place

    One objection you might have to editing the $json structure is that you might reuse it later in your code, and the edits could be harmful to that other code. In that case, you might want to make a copy of the data before editing it:

    sub process_data {
        my $data = shift;         # Fetch the reference to the $json data
        my $json = { %$data };    # Make a copy of it
        . . . the rest of the subroutine . . .
    }

    The my $json line might be a bit much for a beginner in Perl, but it's telling perl to make a new hash reference and then copy the keys and values from the original into it:

    my $json =   # create a new scalar
        {        # and we'll make it a new hash reference containing:
            %        # the list of key/value elements in the hash
            $data    # in the $data hash reference
        };

    Copying it this way lets you use the rest of the subroutine essentially unmodified. But since using hash references adds that extra '->' for each member access, and you're making a copy of it anyway, why not copy it into a local hash instead and save a little typing? You could do it like this:

    sub process_data {
        if ($debug) { say "Processing polled data...." }
        my $data = shift(@_);
        my %JSON = %$data;    ### Copy the data into a hash ###
        my $dt = DateTime->now();
        $dt->set_time_zone('America/Chicago');
        if ( not exists($JSON{ra}) ) {
            $JSON{ra} = $JSON{rr} = 'null';
        }
        if ( not exists($JSON{bbl}) ) {
            $JSON{bbl} = $JSON{irl} = $JSON{lux} = 'null';
        }
        $STH->execute($dt->datetime(), $dt->ymd(), $dt->hms(),
            $JSON{'wd'},  $JSON{'wv'},  $JSON{'tF'},  $JSON{'tC'},
            $JSON{'bp'},  $JSON{'rh'},  $JSON{'li'},  $JSON{'ov'},
            $JSON{'lux'}, $JSON{'bbl'}, $JSON{'irl'}, $JSON{'ra'}, $JSON{'rr'});
    }

    You'll notice that we got rid of a good few uses of '->' by using the local hash.

    Hash Slicing

    If you really want to avoid typing, there's an intermediate technique called "hash slicing" that lets you access multiple entries in a hash at the same time. So rather than listing each entry in the hash like this:

    $STH->execute($dt->datetime(), $dt->ymd(), $dt->hms(),
        $JSON{'wd'},  $JSON{'wv'},  $JSON{'tF'},  $JSON{'tC'},
        $JSON{'bp'},  $JSON{'rh'},  $JSON{'li'},  $JSON{'ov'},
        $JSON{'lux'}, $JSON{'bbl'}, $JSON{'irl'}, $JSON{'ra'}, $JSON{'rr'});

    you could tell perl to give you a list of values from the hash like this:

    $STH->execute($dt->datetime(), $dt->ymd(), $dt->hms(),
        @JSON{ 'wd',  'wv',  'tF',  'tC',
               'bp',  'rh',  'li',  'ov',
               'lux', 'bbl', 'irl', 'ra', 'rr' });

    ...roboticus

    When your only tool is a hammer, all problems look like your thumb.

      Thanks so much for taking the time to review my code and offer up excellent suggestions. I apologize for not responding sooner. There have been some health issues that have kept me away for a while. I will be making use of these suggestions, of that you can be certain!

      I do want to explain (excuse) the bit about connecting and disconnecting from the DB each time. I too generally leave the connection open and reuse it. My concern has been power usage and power interruption. This system is powered by a solar panel/battery setup in an area where there can be some fairly extreme weather and intermittent supply of light. I have some protection on the Pi which shuts it down gracefully when voltage drops low enough in the battery. And yet I have had some instances where power suddenly drops out. My concern was that the DB file might get corrupted were it being accessed at the point at which power fails. Thus, I open and close that DB file the way that I do. I may be wasting my time on that, but it seemed reasonable to me.
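      (Editor's aside, a sketch not from the thread: SQLite itself has durability settings aimed at exactly this power-loss worry, which could be used alongside, or instead of, opening and closing the file per insert. WAL journaling is designed to keep the main database file consistent even if power dies mid-write:)

```perl
use DBI;

my $dbfile = "/var/www/data/weather_data.sqlt";
my $dbh = DBI->connect("dbi:SQLite:dbname=$dbfile", "", "", { RaiseError => 1 });

# WAL journaling keeps the main database file consistent across a power loss
# mid-write, and synchronous=FULL makes SQLite fsync at every commit.
$dbh->do("PRAGMA journal_mode = WAL");
$dbh->do("PRAGMA synchronous = FULL");
```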

      Power consumption is pretty high on a RPi3, and there is no good on-board way of reducing that. There are however some add-ons (which I have yet to order) which will act as 'wake-on' devices. I may use one of these to address battery life in the field. Those too, literally shut down the Pi and restart it on some detected event. I figured that when I implement that, the DB being shut down is one less worry.

      My long term intent is to use a MySQL (Maria) DB, instead of the SQLite DB. But that won't happen until I get LoRa enabled Arduinos doing the data gathering and communication. At that point, the Pi will be at a location where standard power is available and consumption will not be a concern. Going to be a while though... Lots of learning to do first.

      At any rate, I wanted you to know that your efforts are greatly appreciated. This to me is what Perl Monks is about. I never fail to learn something here...

      ...the majority is always wrong, and always the last to know about it...

      A solution is nothing more than a clearly stated problem...

        This system is powered by a solar panel/battery setup in an area where there can be some fairly extreme weather and intermittent supply of light. I have some protection on the Pi which shuts it down gracefully when voltage drops low enough in the battery. And yet I have had some instances where power suddenly drops out.

        That sounds like an electrical problem. Perhaps simply a loose connection or corrosion on contacts? Extreme weather really sounds like corrosion.

        Or maybe your battery is so worn out that its output voltage suddenly drops rapidly, much faster than you expect. What kind of battery do you use?

        • Li-Ion / Li-Polymer cells self-destruct within three to five years, no matter what you do. And they absolutely do not like being over-charged or deeply discharged, so lithium cells often come with built-in protection circuits that simply cut power if you abuse the cell too much.
        • Lead-acid batteries release hydrogen and oxygen while charging, especially when over-charged (more than 2.4 V/cell), so you need to refill them with distilled water from time to time. They don't like deep discharging (below 1.8 V/cell), but otherwise they are robust and reliable.
        • Maintenance-free lead-acid batteries aren't. Once they have released too much hydrogen and oxygen, they are broken, as you can't refill the water. The gel-based ones have the advantage that you can use them in any orientation.
        • NiCd is old technology; the cells don't like being charged before they are completely discharged (memory effect). In a solar panel + battery setup, a NiCd battery is clearly the wrong choice.
        • NiMH is an improvement over NiCd and has nearly no memory effect, but it can't hold as much energy as Li-Ion / Li-Polymer cells and isn't as robust as lead-acid batteries.
        • LiFePO4 is more robust than Li-Ion/Li-Polymer and does not self-destruct as fast, but it has a slightly lower voltage and less nominal capacity. Still, expect to find protection circuits on LiFePO4 cells.

        My concern was that the DB file might get corrupted were it being accessed at the point at which power fails. Thus, I open and close that DB file the way that I do. I may be wasting my time on that, but it seemed reasonable to me.

        Sounds sane. There are other options, such as using a robust filesystem that can handle unexpected blackouts and guarantee that any change to a file is done atomically.

        For another option, that can be used in combination, see below.


        Power consumption is pretty high on a RPi3, and there is no good on-board way of reducing that. There are however some add-ons (which I have yet to order) which will act as 'wake-on' devices. I may use one of these to address battery life in the field. Those too, literally shut down the Pi and restart it on some detected event. I figured that when I implement that, the DB being shut down is one less worry.

        The Raspi3 is documented to need 5 V at a maximum of 2 A. One problem of the Raspi in general is the micro-USB connector used for the power supply. It was originally spec'd for 0.5 A, though the connector can carry 2.5 A to 3 A before burning. The contacts are tiny, and the wires in many USB cables are very thin, so you lose significant voltage along the cable.

        The Raspberry Pi foundation recommends "a good-quality power supply that can supply at least 2A at 5V for the Model 3B". Their official power supply delivers up to 2.5 A at 5.1 V, allowing a drop of 0.1 V at cable and connectors.

        A better way to supply the Raspi is the GPIO connector (at the 5V lines), using short, thick wires or a shield with short, wide traces from the voltage regulator to the pins.

        If unexpected blackouts may happen in the solar panel + battery system, consider using a local buffer battery connected directly to the Raspi. There are some ready-to-use solutions; see for example this article from the German c't magazine. You probably don't need a huge battery; it just has to reliably deliver sufficient power for a clean shutdown. Ideally, the battery can power the Raspi for a few minutes so a short blackout does not cause an immediate shutdown. The Raspi should be able to detect the remaining battery charge, so it can shut down before the battery is completely discharged.


        My long term intent is to use a MySQL (Maria) DB, instead of the SQLite DB.

        I would recommend using PostgreSQL. Here is why: Re^2: What is the best Perl library to use to connect and access MSSQL2008 Database from winXP PC?, Re^2: Perl and Database, Re^2: What is your favourite Linux or cross-platform database?. PostgreSQL is quite easy to set up and run, and it comes with excellent documentation. You may also want to install PgAdmin III, perhaps on a different computer.

        Alexander

        --
        Today I will gladly share my knowledge and experience, for there are no sweeter words than "I told you so". ;-)

        wjw:

        No worries on response time--real life often happens to get in the way of things.

        It sounds like an interesting project. I expect you already know, but just in case you don't, you might want to look over some of stevieb's stuff--he does a bunch of RPi/Perl stuff, including a book and some cpan modules.

        ...roboticus

        When your only tool is a hammer, all problems look like your thumb.

Re: DBD::SQLite - Can't Connect to file...
by Anonymous Monk on Dec 29, 2018 at 08:33 UTC
    A brute-force solution would be to run your code under strace and look at lines related to any kind of file I/O under /var/www/data/. My first guess would be the disk being full, but there are still a lot of other options.
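    (Editor's aside: a lighter-weight companion to strace, since the failure happens inside DBI anyway, is DBI's built-in tracing, which is a standard DBI feature that logs every handle method call and error. A sketch, assuming the script from the question:)

```perl
use DBI;

# Level 2 logs each DBI method call with driver-level detail; the optional
# second argument redirects the trace output from STDERR into a file.
DBI->trace(2, "/tmp/dbi_trace.log");

# ... then run the normal connect/prepare/execute; the failing execute and
# its "unable to open database file" error show up in the trace log.
```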

Node Type: perlquestion [id://1227802]
Approved by stevieb