http://www.perlmonks.org?node_id=994979

sarf13 has asked for the wisdom of the Perl Monks concerning the following question:

Hi monks,

I hope you are all doing well. I have to write an automation script in Perl that pulls data from Greenplum, processes the pulled data, and generates a report as an Excel sheet.
Currently I embed the SQL in a shell script and generate a CSV file. Once the CSV file is generated, we format it manually and produce the report as an Excel sheet. All of this is done on a Unix server. Generating a CSV file of around 28 lakh rows (about 3 million) takes 12 to 15 minutes.
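
For reference, here is a rough sketch of what I think the Perl equivalent of the current shell/SQL step might look like. This is only a sketch: I am assuming Greenplum will accept a DBD::Pg connection (it is PostgreSQL-based), and the DSN, table, and column names below are made up.

#!/usr/bin/perl
use strict;
use warnings;
use DBI;
use Text::CSV;

# Greenplum speaks the PostgreSQL wire protocol, so DBD::Pg is the
# driver I would try first; the connection details are placeholders.
my $dbh = DBI->connect(
    'dbi:Pg:dbname=mydb;host=gp-host;port=5432',
    'myuser', 'mypassword',
    { RaiseError => 1, AutoCommit => 1 },
);

my $csv = Text::CSV->new( { binary => 1, eol => "\n" } );
open my $fh, '>', 'report.csv' or die "Cannot open report.csv: $!";

# Stream rows one at a time rather than pulling ~3 million rows into memory.
my $sth = $dbh->prepare('SELECT col1, col2, col3 FROM my_table');
$sth->execute;
while ( my $row = $sth->fetchrow_arrayref ) {
    $csv->print( $fh, $row );
}

close $fh or die "Cannot close report.csv: $!";
$dbh->disconnect;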

Now, a few things I want to discuss here are:
1. How efficient would a Perl script be for this task?
2. If I write the whole thing in Perl (pulling the data, processing it, and generating the report), will it take more time, or should I split it half into shell and half into Perl?
3. For generating the Excel report I am planning to use the Spreadsheet::WriteExcel module. Is there any other module on CPAN that is more suitable than this one? (See the sketch after this list.)
4. Is there any other approach or technology through which I can proceed?
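
To make point 3 concrete, this is roughly what I have in mind with Spreadsheet::WriteExcel. Again, only a sketch: the file name, sheet name, and data are placeholders standing in for the real report.

#!/usr/bin/perl
use strict;
use warnings;
use Spreadsheet::WriteExcel;

# Create the workbook (.xls, i.e. Excel 97-2003) and one worksheet.
my $workbook  = Spreadsheet::WriteExcel->new('report.xls');
my $worksheet = $workbook->add_worksheet('Report');

# Bold format for the header row.
my $header_fmt = $workbook->add_format( bold => 1 );
$worksheet->write_row( 0, 0, [ 'Col1', 'Col2', 'Col3' ], $header_fmt );

# In the real script these rows would come from the database or the CSV.
my @rows = (
    [ 'a', 1, 2.5 ],
    [ 'b', 2, 3.5 ],
);
my $row_num = 1;
$worksheet->write_row( $row_num++, 0, $_ ) for @rows;

$workbook->close;

One thing I noticed is that the .xls format is limited to 65,536 rows per worksheet, so with around 3 million rows I would probably need Excel::Writer::XLSX (which allows 1,048,576 rows per sheet) or have to split the data across worksheets.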

Thanks in advance for your kind advice and feedback.