PerlMonks  

Re^3: Splitting the records into multiple worksheets

by NetWallah (Canon)
on Feb 20, 2021 at 04:54 UTC ( [id://11128590] )


in reply to Re^2: Splitting the records into multiple worksheets
in thread Splitting the records into multiple worksheets

After creating the raw data table in SQLite, you could use SQL to generate your pivoted data.

The SQL "GROUP BY" clause can do the pivot functions to summarize.

The result can then easily export into a csv file suitable for import into Excel.
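
A minimal sketch of that approach, using DBI with SQLite and a hypothetical raw_data table with an app_name column (adjust the names to whatever you actually load):

    use DBI;

    # Summarise a hypothetical raw_data table with GROUP BY,
    # then write the result out as CSV for Excel.
    my $dbh = DBI->connect( "dbi:SQLite:dbname=raw.db", "", "", { RaiseError => 1 } );

    my $rows = $dbh->selectall_arrayref(
        "SELECT app_name, COUNT(*) AS logins
           FROM raw_data
          GROUP BY app_name
          ORDER BY logins DESC"
    );

    open my $fh, '>', 'summary.csv' or die "Cannot open summary.csv: $!";
    print {$fh} qq{"app_name","logins"\n};
    print {$fh} qq{"$_->[0]",$_->[1]\n} for @$rows;
    close $fh;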

                "Avoid strange women and temporary variables."

Replies are listed 'Best First'.
Re^4: Splitting the records into multiple worksheets
by chandantul (Scribe) on Feb 20, 2021 at 05:29 UTC

    Hello, do you have any specific example of pulling the raw data into a DB? Please check below how I am pulling the raw data and, if possible, let me know the next step: how I should import it into CSV and then into Excel.

    my $urlstringlog = $apiurllog . $sortor . $filter . $target . $appID . $date2;
    $strExcelFilename = "C:/" . "Analysis-Details-automatic-bk" . $date . ".xlsx";

    # Page through the API, collecting every response
    do {
        # Run get the users
        run_api_call($urlstringlog);
        @responsetext = parse_json( $client->responseContent() );
        push @responsetextall, @responsetext;
        $linkheader = $client->responseHeader("link");
        if ( $linkheader =~ m/next/ ) {
            ( my $link1 = $linkheader ) =~ s/ .*? (self)/$1/gx;
            ( my $link2 = $link1 )      =~ s/self/$1/g;
            print "Post Pagination: $link2";
            $urlstringlog = $link2;
        }
    } while ( $linkheader =~ m/next/ );

    # Walk the collected records and pull out the fields of interest
    for my $i ( 0 .. $#responsetextall ) {
        $responsetextall[$i] =~ s/]\[/,/g;
        for my $j ( 0 .. $#{ $responsetextall[$i] } ) {
            @responsests1        = $responsetextall[$i][$j]{tat};
            $responseid          = $responsetextall[$i][$j]{act}{id};
            $responsdisp         = $responsetextall[$i][$j]{act}{Name};
            $responsalter        = $responsetextall[$i][$j]{act}{Id};
            $responseclientipadd = $responsetextall[$i][$j]{cln}{ipAd};
            $responseappsdebug   = $responsetextall[$i][$j]{deb}{Data};
            for my $m ( 0 .. $#responsests1 ) {
                for my $n ( 0 .. $#{ $responsests1[$m] } ) {
                    $responseapps2id = $responsests1[$m][$n]{id};
                    $responseapps2   = $responsests1[$m][$n]{ty};
                    $responseapps1   = $responsests1[$m][$n]{Name};
                    if ( $responseapps2 eq 'AppInstance' ) {
                        @responseapps3 = $responsests1[$m][$n]{displayName};
                        $responseapps5 = $responseapps3[0];
                        print "Its Matches Appinstances";
                        $responseapps4 = $responsests1[$m][$n]{Name};
                        if ( $responseapps2 eq 'AppUser' ) {
                            $responseapps3id = $responsests1[$m][$n]{aId};
                        }
                    }
                    push @responseapps3id, $responseapps3id;
                    push @responseapps5,   $responseapps4;
                    push @responseapps7,   $responseapps7;
                }
            }
            push @responsalter,        $responsalter;
            push @responseclientipadd, $responseclientipadd;
        }
    }
      Do you have any specific example of pulling the raw data into a DB?

      I've not had cause to use SQLite as I usually have another DB available, but this is the sort of thing you need. Untested, and without error checking when creating connections, opening files, etc. Hopefully it will give you a start.

      use DBI;    # DBD::SQLite is pulled in automatically by the DSN below

      # Connect to your database (SQLite is a local file, so no host, port or credentials)
      my $dbh = DBI->connect("dbi:SQLite:dbname=$db_name", "", "", { RaiseError => 1 });

      # Create a temporary table
      $dbh->do("CREATE TEMPORARY TABLE IF NOT EXISTS ExcelData (name VARCHAR(80), data INT)");

      # Populate the temporary table from your data source
      my $insert = $dbh->prepare("INSERT INTO ExcelData (name, data) VALUES (?, ?)");
      foreach my $row (@responsetext) {
          my ($name, $data) = split / +/, $row;
          $insert->execute($name, $data);    # placeholders handle the quoting
      }

      # Query the data sorted how you want it and output to a CSV file
      open my $fh, '>', 'myfile.csv' or die "Cannot open myfile.csv: $!";
      my $query = $dbh->prepare("SELECT name, data FROM ExcelData WHERE data > 20 ORDER BY data");
      $query->execute;
      while (my ($n, $d) = $query->fetchrow_array) {
          print $fh qq["$n",$d\n];
      }
      close $fh;

        Those are very good suggestions, but I will have to build the .csv file with around 2,000,000 rows of data in order to compare it against another Excel file. I will not have any server on which to install a database. Can I create a temporary .db file on my system, save the data there, and then fetch it back out as .csv? How can I handle the .xlsx row limit when I have 2,000,000 rows of records? Can I create dynamic worksheets by checking the number of records in the array?
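
        For illustration, a minimal sketch of the worksheet-rollover idea, assuming Excel::Writer::XLSX and a hypothetical in-memory @rows array of array refs (one per record); the file and sheet names are placeholders:

        use Excel::Writer::XLSX;

        my $max_rows  = 1_048_576;     # hard per-worksheet row limit in the xlsx format
        my $workbook  = Excel::Writer::XLSX->new('Analysis-Details.xlsx');
        my $sheet_num = 0;
        my ( $worksheet, $row );

        for my $record (@rows) {       # each element is an array ref of column values
            if ( !defined $worksheet or $row >= $max_rows ) {
                $worksheet = $workbook->add_worksheet( 'Data' . ++$sheet_num );
                $row       = 0;        # start a fresh sheet once the previous one is full
            }
            $worksheet->write_row( $row++, 0, $record );
        }
        $workbook->close;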
