
Automation for pulling huge data from greenplum database in perl.

by sarf13 (Beadle)
on Sep 21, 2012 at 19:09 UTC ( #994979=perlquestion: print w/replies, xml ) Need Help??
sarf13 has asked for the wisdom of the Perl Monks concerning the following question:

Hi monks,

I hope you are all doing well. I have to write an automation script in Perl that pulls data from Greenplum, processes the pulled data, and generates the report as an Excel sheet.
Currently I embed the SQL in a shell script and generate a CSV file. Once the CSV file is generated, we format it manually and produce the report as an Excel sheet. All of this runs on a Unix server. Generating a CSV file of around 28 lakh (roughly 3 million) rows takes 12 to 15 minutes.

Now, a few things I want to discuss here:
1. How efficient would a Perl script be for this task?
2. If I wrote the whole thing in Perl (pulling data, processing, and generating the report), would it take more time, or should I split it half into shell and half into Perl?
3. For generating the Excel report I am planning to use the Spreadsheet::WriteExcel module. Is there another module on CPAN that is more suitable?
4. Is there any other approach or technology I should consider?
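Since Greenplum speaks the PostgreSQL wire protocol, one plausible approach is to do all three steps in a single Perl script with DBI and DBD::Pg, streaming rows straight into the spreadsheet instead of going through an intermediate CSV. This is only a sketch: the DSN, credentials, table, and column names below are placeholders, not anything from the thread.

```perl
#!/usr/bin/env perl
# Hypothetical sketch: stream rows from Greenplum (Postgres wire protocol)
# directly into an .xlsx file. DSN, login, and the query are placeholders.
use strict;
use warnings;
use DBI;
use Excel::Writer::XLSX;

my $dbh = DBI->connect(
    'dbi:Pg:dbname=reports;host=gp-master;port=5432',   # assumed DSN
    'report_user', 'secret',
    { RaiseError => 1, AutoCommit => 1 },
);

my $workbook  = Excel::Writer::XLSX->new('report.xlsx');
my $worksheet = $workbook->add_worksheet();

my $sth = $dbh->prepare('SELECT id, name, amount FROM sales');  # placeholder query
$sth->execute;

# Header row from the statement handle's column names,
# then one spreadsheet row per fetched database row.
$worksheet->write_row( 0, 0, $sth->{NAME} );
my $row = 1;
while ( my $fields = $sth->fetchrow_arrayref ) {
    $worksheet->write_row( $row++, 0, $fields );
}

$dbh->disconnect;
$workbook->close;
```

Fetching row by row keeps memory flat even at a few million rows; the manual CSV-formatting step disappears entirely.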

Thanks in advance for your kind advice and feedback


Replies are listed 'Best First'.
Re: Automation for pulling huge data from greenplum database in perl.
by roboticus (Chancellor) on Sep 21, 2012 at 21:21 UTC


    It would be hard to say whether it would be faster or slower since you've given such a high-level description. However, I find that shell programming is *quite* a bit harder than perl. (In fact, one of the reasons I began coding in perl was to avoid shell scripts.)

    If your spreadsheets are going to be large, you might consider Excel::Writer::XLSX instead of Spreadsheet::WriteExcel. The chief advantage is that it uses the .xlsx format which allows for more rows and columns in your spreadsheets.

    What you're wanting to do looks like a perfect task for perl. In fact, just a couple of days ago I posted a script to create a spreadsheet from a database query.


    When your only tool is a hammer, all problems look like your thumb.

      Hi roboticus, thanks for your reply.

      I agree with your point that shell scripting is a bit tough, but it is our best current option.

      You also mentioned the Excel::Writer::XLSX module; I haven't done any R&D on it yet. As far as I know, Spreadsheet::WriteExcel generates Excel files for Office 2003, which can accommodate fewer than 3 lakh records per sheet. Let's see how it works.


        Excel::Writer::XLSX is written by the same John McNamara (he's here frequently) responsible for Spreadsheet::WriteExcel, and both modules are quite handy. If you need Office 2003 compatibility, then you're probably better off with Spreadsheet::WriteExcel. But you'll have to keep the row limits in mind (not really all *that* big a deal, you can just start a new worksheet once you hit a certain row number).
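        The worksheet-rollover workaround might look like the sketch below. The legacy .xls format allows 65,536 rows per worksheet; `@rows` here is stand-in sample data, not anything from the original post.

```perl
#!/usr/bin/env perl
# Hypothetical sketch: roll over to a fresh worksheet when the .xls
# per-sheet row limit (65,536) is reached. @rows is stand-in data.
use strict;
use warnings;
use Spreadsheet::WriteExcel;

my $MAX_ROWS = 65_536;                 # hard row limit per .xls worksheet
my @rows     = ( [ 1, 'a' ], [ 2, 'b' ] );   # replace with fetched records

my $workbook  = Spreadsheet::WriteExcel->new('report.xls');
my $sheet_no  = 0;
my $row       = 0;
my $worksheet = $workbook->add_worksheet("data_$sheet_no");

for my $fields (@rows) {
    if ( $row >= $MAX_ROWS ) {         # sheet full: start a new one
        $worksheet = $workbook->add_worksheet( 'data_' . ++$sheet_no );
        $row = 0;
    }
    $worksheet->write_row( $row++, 0, $fields );
}
$workbook->close;
```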



Front-paged by bulk88