
Merging text file

by gon770 (Novice)
on Sep 05, 2012 at 21:56 UTC ( #991955=perlquestion )
gon770 has asked for the wisdom of the Perl Monks concerning the following question:

Hello, PerlMonks!

I am trying to write a script that will merge text files into a CSV file (comma-delimited).

FILE A: List of names separated by newline(\b)
FILE B: List of grades separated by newline(\b)



The script will create FILE C, which looks like the example below:

So far I have
use warnings;
use strict;
use utf8;

local $/ = undef;
open FILE1, "myfile" or die "Couldn't open file: $!";
binmode FILE1;
$string1 = <FILE1>;
close FILE1;

open FILE2, "myfile" or die "Couldn't open file: $!";
binmode FILE2;
$string2 = <FILE2>;
close FILE2;

# and I got stuck here. I am thinking of using a "for" loop,
# but couldn't find details. Can anybody help me?

Replies are listed 'Best First'.
Re: Merging text file
by nemesdani (Friar) on Sep 05, 2012 at 22:09 UTC
    A newline is a \n, not a \b.
    If your files are text files, why are you using a binmode read?
    You are opening the same file twice.
    About your core problem: if your files have the same number of lines, you could simply use a while loop, for example: while (defined(my $line1 = <FILE1>))
    Then you read a line from the second file the same way, join them, or concatenate them, then write out the resulting $joinedline to a third file. (Which you already opened for writing.)
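    A minimal sketch of that line-by-line approach (file names and sample data are invented for the example; the names and grades are borrowed from the paste example elsewhere in this thread):

    ```perl
    use strict;
    use warnings;

    # write sample inputs so the sketch is self-contained
    open my $w, '>', 'file_a' or die $!;
    print {$w} "DAVID KIM\nCHARLIE DOE\nAARON PITTS\n";
    close $w;
    open $w, '>', 'file_b' or die $!;
    print {$w} "94\n75\n87\n";
    close $w;

    # the actual merge: read both files in lock-step
    open my $fh1, '<', 'file_a' or die "file_a: $!";
    open my $fh2, '<', 'file_b' or die "file_b: $!";
    open my $out, '>', 'file_c' or die "file_c: $!";

    while (defined(my $line1 = <$fh1>)) {
        defined(my $line2 = <$fh2>) or last;   # stop if file_b runs short
        chomp($line1, $line2);
        print {$out} "$line1,$line2\n";
    }

    close $_ for $fh1, $fh2, $out;
    ```

    After running this, file_c contains one "name,grade" pair per line.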

    I'm too lazy to be proud of being impatient.
      Sorry for my mistake in the code. Thank you so much for the answer :) Have a good day
Re: Merging text file
by 2teez (Priest) on Sep 06, 2012 at 05:06 UTC

    Please allow me to add to what has been said.
    You might want to avoid using barewords as filehandles, and also consider using the 3-argument form of the open function.
    In merging these two text files, a hash can really come in handy, in a form such as:

    ... @hash_data{@first_data} = @second_data; ...
    The code below shows an example:
    use warnings;
    use strict;

    my %std_data;
    my $std_names  = 'names.txt';
    my $std_grades = 'grades.txt';

    ## call subroutine read_file, passing each file name
    @std_data{ @{ read_file($std_names) } } = @{ read_file($std_grades) };
    print $_, q{,}, $std_data{$_}, $/ for sort { $a cmp $b } keys %std_data;

    sub read_file {
        my ($file) = @_;
        my $data = [];
        open my $fh, '<', $file or die "can't open file: $!";
        while ( defined( my $line = <$fh> ) ) {
            chomp $line;
            push @{$data}, $line;
        }
        close $fh or die "can't close file: $!";
        return $data;
    }
    NOTE: Instead of the read_file function used here, modules like Tie::File or File::Slurp could be used to get the lines of the files into arrays more effectively; which is best depends on how large your text files are, which I don't know.
    Hope this helps.
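      Since Tie::File is mentioned above: it is a core module that presents a file's lines as an array without slurping the whole file into memory. A hedged sketch of the pairing (file names and sample data are invented for the example) could be:

      ```perl
      use strict;
      use warnings;
      use Tie::File;

      # write sample inputs so the sketch is self-contained
      open my $w, '>', 'names.txt' or die $!;
      print {$w} "DAVID KIM\nCHARLIE DOE\n";
      close $w;
      open $w, '>', 'grades.txt' or die $!;
      print {$w} "94\n75\n";
      close $w;

      # each tied array is a view of the file's lines (no newlines in the elements)
      tie my @names,  'Tie::File', 'names.txt'  or die "names.txt: $!";
      tie my @grades, 'Tie::File', 'grades.txt' or die "grades.txt: $!";

      my @rows;
      for my $i (0 .. $#names) {
          push @rows, "$names[$i],$grades[$i]";
      }
      print "$_\n" for @rows;

      untie @names;
      untie @grades;
      ```

      Note that this keeps the names in file order, unlike the hash-slice version, which sorts its keys.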

      The larger the files are, the more likely that slurping them into arrays will be a bad idea. Since we have no reason to think that's necessary, and his example keeps them in the same order as fileA (which a hash would lose), it makes more sense to read them line-by-line:

      #!/usr/bin/perl
      use Modern::Perl;

      open my $fa, '<', 'file1.txt' or die $!;
      open my $fb, '<', 'file2.txt' or die $!;

      while (<$fa>) {
          chomp;
          print "$_,", scalar <$fb>;
      }

      Of course, since he wants a CSV file, he may need to watch out for commas or quoting in his data. If it has none of that, he's safe. If it does, he may be better off building the output file with something like Text::CSV.
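      As a rough illustration of why quoting matters (this hand-rolled helper is only a sketch of the usual CSV rules, not the real Text::CSV API; use the module in real code):

      ```perl
      use strict;
      use warnings;

      # minimal CSV field escaping in the usual RFC 4180 style:
      # a field containing a comma, quote, or newline is wrapped in
      # double quotes, with embedded quotes doubled
      sub csv_field {
          my ($field) = @_;
          if ($field =~ /[",\n]/) {
              $field =~ s/"/""/g;     # double any embedded quotes
              return qq{"$field"};
          }
          return $field;
      }

      # a name containing a comma comes out quoted: "DOE, JOHN",94
      print join(',', map { csv_field($_) } 'DOE, JOHN', '94'), "\n";
      ```

      Plain fields pass through untouched, so output stays identical to the simple join for clean data.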

      Aaron B.
      Available for small or large Perl jobs; see my home node.

      Thank you so much for the answers :) Have a good day
Re: Merging text file
by roboticus (Chancellor) on Sep 06, 2012 at 12:43 UTC


    If you're on a *nix machine, or have cygwin on windows available, you could use the paste command, like this:

    $ paste -d, file_a file_b
    DAVID KIM,94
    CHARLIE DOE,75
    AARON PITTS,87


    When your only tool is a hammer, all problems look like your thumb.

      Thank you so much for the answer :) Have a good day
Re: Merging text file
by aitap (Deacon) on Sep 06, 2012 at 17:40 UTC
    You don't need to read the whole file into memory; the program will not work if the files become very big. Using Tie::File (or even DBD::CSV) to work with files is good practice. Anyway, this is an example of how to work with your files line-by-line:
    use warnings;
    use strict;

    # three-argument form of open is safer;
    # scalar variables as filehandles are more modern than barewords
    open my $first,  "<", "first.txt"  or die "first.txt: $!\n";
    open my $second, "<", "second.txt" or die "second.txt: $!\n";

    while (! eof $first && ! eof $second) {   # while both filehandles can be read
        chomp (my $fline = <$first>);         # read the line and chomp in one step
        chomp (my $sline = <$second>);
        print "${fline},${sline}\n";
    }

    close $first;
    close $second;
    (only the syntax has been tested).
    Sorry if my advice was wrong.

Node Type: perlquestion [id://991955]
Approved by Ratazong