http://www.perlmonks.org?node_id=1000991


in reply to Download references list in pdf format with script

I'd strongly recommend that you split your task into three separate processes:

  1. Finding the URLs that match the references.

    You are probably better off using one of the search engine APIs for this bit.

    Then use a human being to review the search results and pick out the appropriate URLs.

  2. Downloading the PDFs.

    Once you have your URLs, there is no real advantage to using Perl rather than (say) wget to do the downloading.

    Though Perl is ideally suited to driving that process: invoking wget, checking for success, retrying the failures, etc.

  3. Processing the PDFs into your database.

    Once you have the PDFs, whether you use Perl or your DB's bulk uploader to populate the DB will depend very much on exactly what information you are going to store in the DB, and where you will be getting that information from.
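
For step 2, a minimal sketch of that wget-driving loop. The file names (urls.txt, failed.txt), the retry count, and the pdfs/ target directory are my assumptions, not anything from the original question:

```perl
#!/usr/bin/perl
# Drive wget from Perl: download each URL, check the exit status,
# retry failures a few times, and record anything that never worked.
use strict;
use warnings;

my $MAX_TRIES = 3;

# Assumed input: urls.txt, one URL per line.
open my $in, '<', 'urls.txt' or die "urls.txt: $!";
chomp( my @urls = grep { /\S/ } <$in> );
close $in;

my @failed;
for my $url (@urls) {
    my $ok = 0;
    for my $try ( 1 .. $MAX_TRIES ) {
        # wget exits 0 on success; -q = quiet, -P = target directory.
        if ( system( 'wget', '-q', '-P', 'pdfs', $url ) == 0 ) {
            $ok = 1;
            last;
        }
        warn "attempt $try failed for $url\n";
    }
    push @failed, $url unless $ok;
}

# Anything left over can be re-run later, or checked by hand.
if (@failed) {
    open my $out, '>', 'failed.txt' or die "failed.txt: $!";
    print {$out} "$_\n" for @failed;
    close $out;
}
```

Checking `system`'s return value is the whole point here; wget's non-zero exit codes distinguish network errors from server errors, so you could branch on them if you wanted smarter retries.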
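
And for step 3, if the Perl route turns out to fit, a sketch using DBI. The SQLite backend, table name, and columns are illustrative assumptions; adapt them to whatever information you actually decide to store:

```perl
#!/usr/bin/perl
# Load downloaded PDFs into a database with DBI. The schema below
# (a SQLite file, one table, PDF stored as a BLOB) is an assumption
# for illustration only.
use strict;
use warnings;
use DBI;
use File::Basename qw(basename);

my $dbh = DBI->connect( 'dbi:SQLite:dbname=refs.db', '', '',
    { RaiseError => 1, AutoCommit => 0 } );

$dbh->do(<<'SQL');
CREATE TABLE IF NOT EXISTS papers (
    id       INTEGER PRIMARY KEY,
    filename TEXT NOT NULL,
    pdf      BLOB
)
SQL

my $sth = $dbh->prepare(
    'INSERT INTO papers (filename, pdf) VALUES (?, ?)' );

# Assumes the downloads landed in pdfs/ (as in the wget step).
for my $path ( glob 'pdfs/*.pdf' ) {
    open my $fh, '<:raw', $path or die "$path: $!";
    my $blob = do { local $/; <$fh> };
    close $fh;
    $sth->execute( basename($path), $blob );
}

$dbh->commit;
```

If you only need metadata rather than the PDFs themselves, drop the BLOB column and your DB's bulk loader fed from a flat file may well be simpler.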

