betmatt has asked for the wisdom of the Perl Monks concerning the following question:

I am looking to scrape pages from the web, process them, and put the information into databases. I know that Perl does a great job with that; in fact, I have used Perl for this purpose before. I want to stick with Perl's regular-expression capabilities, which I believe to be superior to Python's. However, I was wondering whether I should consider using Perl and Python together for a wider set of tasks. I don't need to do complex machine learning while scraping the web; I just want to find comparable products that are for sale. That could be on Twitter, the websites of small businesses, eBay, trade websites, and Facebook. Basically, because of the range of data sources, I might need to use Perl and Python together. Does anyone here have experience with the sort of thing I want to do, and do they agree that using both languages might be a good idea?

Replies are listed 'Best First'.
Re: Web Scraping
by Fletch (Chancellor) on Jul 12, 2019 at 15:12 UTC

    Break your pipeline into chunks and define a suitable interface for the inputs and outputs at each stage; then it doesn't matter how any particular phase of the pipeline is implemented, so long as it produces correctly formatted output. Somewhat handwave-y, but based on your example you have two or maybe three steps:

    • Scrape a source for some sort of raw data
    • Munge that raw data into a standard form
    • Store those standard format results to your DB

    Working backwards, you might figure out a JSON representation of your stored DB format. The munging stage would need to produce that from the raw data. Likewise, have the munge stage expect a common raw input format, and then it doesn't matter what's doing the actual scraping (e.g. if you find a Python Twitter source that works best for something, you can have it feed results alongside something producing results with WWW::Mechanize from a site that Perl easily scrapes). You could also have a scraper produce the canonical DB format directly, if that's appropriate.
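    To make that concrete, a munge stage along these lines might look like the following minimal sketch. It reads one raw JSON record per line on STDIN and emits the canonical record on STDOUT, so it can sit in the middle of a shell pipeline regardless of what language the scraper is written in. The field names (`title`, `price`, `source`) are made up for illustration; JSON::PP ships with core Perl.

```perl
#!/usr/bin/perl
use strict;
use warnings;
use JSON::PP;

my $json = JSON::PP->new->canonical;

# Normalise one raw record (a hashref from any scraper) into the
# canonical shape the DB stage expects. Field names are illustrative.
sub munge {
    my ($raw) = @_;
    return {
        source => $raw->{source} // 'unknown',
        title  => $raw->{title},
        # integer cents, so the store stage never has to parse price strings
        cents  => int( 100 * ( $raw->{price} // 0 ) + 0.5 ),
    };
}

# Filter mode: one raw JSON record per line in, one canonical record out.
while ( my $line = <STDIN> ) {
    print $json->encode( munge( $json->decode($line) ) ), "\n";
}
```

    Because every scraper only has to emit that one-line-per-record raw format, swapping a Perl scraper for a Python one never touches the munge or store stages.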

    The cake is a lie.

      I go with this.

      This is unix: simple programs taking input, doing simple processing, and passing output to the next stage of the pipeline. Avoid behemoths. That does not mean just creating "scripts" (whose internal functions cannot be reused by other scripts) - that's really, really bad. Generalise your algorithms and make them freely available by creating C libraries or Perl modules. Then create pipeline scripts that utilise (i.e. call functions from) your libraries, and preferably publish those libraries.
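      A minimal sketch of that library-plus-thin-script split, using an inline package to stand in for a real `My/Scraper.pm` file; the module name, function, and regex are all made up for illustration:

```perl
use strict;
use warnings;

# Reusable logic lives in a module (normally its own My/Scraper.pm file),
# so other scripts can call it too. Names here are illustrative.
package My::Scraper;

use Exporter 'import';
our @EXPORT_OK = qw(extract_prices);

# Pull "$12.34"-style prices out of a chunk of scraped text.
sub extract_prices {
    my ($text) = @_;
    return $text =~ /\$(\d+(?:\.\d{2})?)/g;
}

package main;
My::Scraper->import('extract_prices');

# The "script" part stays a thin wrapper around the library function.
print join( ",", extract_prices('Widget $9.99, Gadget $24.00') ), "\n";
# prints 9.99,24.00
```

      The pipeline script stays a few lines of glue, and the regex knowledge lives in one published, testable place.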

Re: Web Scraping
by marto (Cardinal) on Jul 12, 2019 at 14:51 UTC

    I have no idea why you'd want to use both languages; you don't make any case for this. You'll likely get blocked, or be in something of an arms race with many of these commercial sites, as this sort of thing tends to violate their terms of service.

      You have given the reason in your answer. Python might be good for one thing; Perl could be good for another, for the reasons you give. It is the sort of challenge and arms race that I am sure you would love to be involved in.

        I do have experience using perl to scrape lots of things, having done so for years. I don't believe there's anything Python brings to the party that perl can't already do.

        I think marto's use of the phrase "arms race" referred to the commercial sites you seem to want to interact with and their terms of service, and believe me, that's a tar-baby you don't want to start a fight with!


        Give a man a fish:  <%-{-{-{-<

Re: Web Scraping
by Anonymous Monk on Jul 13, 2019 at 02:03 UTC
    Hi, have you asked this on any other forums? I'd like to see what they said.