Anonymous Monk has asked for the wisdom of the Perl Monks concerning the following question:

I have a directory with large text files generated by the server log, with lines like the one above. The files are named by date, e.g. 07072002.txt - 07112002.txt. Can I read one of the files (say 07112002.txt) line by line and place its contents, separated by commas, into a table in a database like SQL7? Basically the program will have to go to the specific location, read all the files found, create a table for each file, and place the file's contents into the correct table cells. Is it possible? Can anyone give me some help getting started on this? Thank you!!!

Replies are listed 'Best First'.
Re: Database
by stephen (Priest) on Jul 16, 2002 at 16:12 UTC
    Absolutely, Perl is an excellent tool for this job.

    If I were you, I'd have a look at the standard modules Text::CSV and DBI. Text::CSV is a module that provides easy parsing for comma-separated files. DBI is the DataBase Interface module that gives Perl easy access to databases.
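    Putting the two together, a minimal sketch might look like the following (untested; the DSN, table name, and column names are made up, so adjust them to your setup):

        use strict;
        use warnings;
        use Text::CSV;
        use DBI;

        my $csv = Text::CSV->new( { binary => 1 } );
        my $dbh = DBI->connect( 'dbi:ODBC:mydsn', 'user', 'password',
                                { RaiseError => 1 } );
        my $sth = $dbh->prepare(
            'INSERT INTO logs (col1, col2, col3) VALUES (?, ?, ?)'
        );

        open my $fh, '<', '07112002.txt' or die "Can't open file: $!";
        while ( my $row = $csv->getline($fh) ) {
            $sth->execute(@$row);   # one insert per CSV line
        }
        close $fh;
        $dbh->disconnect;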

    If you would like someone to write the script for you, I'm sure that many monks would be happy to do so for a reasonable fee-- including myself.


Re: Import CSV to Database
by DamnDirtyApe (Curate) on Jul 16, 2002 at 17:23 UTC

    Take a look at the DBD::AnyData database driver for the Perl DBI. It has an ad_convert function which is specifically for translating between its many supported database formats, which include DBI-supported DBMSes and CSV format.

    If all that's a little overwhelming, you could also use Perl to simply generate the SQL, then run the generated code against the database. Something like the following: (untested)

        foreach ( @lines_from_text_file ) {
            chomp;
            my @fields = split /,/;
            printf( "insert into my_table values\n"
                  . "( %s, %s, %s, %s, %s, %s )\n", @fields );
        }

    This doesn't take into consideration the quoting of the appropriate fields, but you get the general idea.
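    One way to sidestep the quoting problem entirely is to let DBI do the quoting via placeholders, rather than generating SQL text yourself. An untested sketch, assuming the same six-field lines and a made-up DSN:

        use DBI;

        my $dbh = DBI->connect( 'dbi:ODBC:mydsn', 'user', 'password',
                                { RaiseError => 1 } );
        my $sth = $dbh->prepare(
            'INSERT INTO my_table VALUES (?, ?, ?, ?, ?, ?)'
        );
        foreach ( @lines_from_text_file ) {
            chomp;
            my @fields = split /,/;
            $sth->execute(@fields);   # DBI quotes each value for us
        }

    Preparing the statement once and executing it per line is also noticeably faster than sending a fresh INSERT statement for every row.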

    D a m n D i r t y A p e
Re: Database
by VSarkiss (Monsignor) on Jul 16, 2002 at 16:35 UTC

    Yes, you can do this with Perl, but if you have "well-behaved" text files, you may not have to write a program at all.

    If by "SQL7" you mean SQL Server 7, look at the data import tools that come with it (look under "DTS"). Importing CSV files into the database can be done very easily with those tools. Even if you meant some other platform, it will probably still have similar tools.

    Where you may need to write a program is if the files are not "clean" (by that I mean they don't quote embedded commas and such). In that case you'll have to write your own program, using some of the modules stephen has outlined above.

Re: Database
by screamingeagle (Curate) on Jul 16, 2002 at 16:57 UTC
    i'm not acquainted with the file structure you're using, but if every log file has the same data format, i would recommend against creating a new table for each file you process... that would be redundant. if you know the format beforehand, create a single table using the Enterprise Manager in SQL 7, and let your perl script insert data into that table for all the log files you process...
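    Following that advice, a sketch that loops over every log file in the directory and inserts into one shared table might look like this (untested; the directory path, DSN, table, and the assumption of two comma-separated fields per line are all placeholders):

        use strict;
        use warnings;
        use DBI;

        my $dir = 'C:/logs';
        my $dbh = DBI->connect( 'dbi:ODBC:mydsn', 'user', 'password',
                                { RaiseError => 1 } );
        my $sth = $dbh->prepare(
            'INSERT INTO server_log (source_file, field1, field2) VALUES (?, ?, ?)'
        );

        opendir my $dh, $dir or die "Can't open $dir: $!";
        for my $file ( grep { /^\d{8}\.txt$/ } readdir $dh ) {
            open my $fh, '<', "$dir/$file" or die "Can't open $file: $!";
            while ( my $line = <$fh> ) {
                chomp $line;
                my @fields = split /,/, $line;
                $sth->execute( $file, @fields );  # tag each row with its source file
            }
            close $fh;
        }
        closedir $dh;

    Storing the source filename in a column gives you everything a per-file table would, while keeping all queries against one table.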
Re: Import CSV to Database
by flyboy (Novice) on Jul 18, 2002 at 16:57 UTC
    This code is from the Perl Cookbook. It returns the list of fields parsed from one CSV line.

        sub parse_csv {
            my $text = shift;
            my @new  = ();
            push( @new, $+ ) while $text =~ m{
                "([^\"\\]*(?:\\.[^\"\\]*)*)",?   # quoted field, possibly with escapes
              | ([^,]+),?                        # unquoted field
              | ,                                # empty field
            }gx;
            push( @new, undef ) if substr( $text, -1, 1 ) eq ',';
            return @new;
        }

        # Use like so:
        my ( $field1, $field2, $field3 ) = parse_csv($csv_line);