PerlMonks  

Re: Script to scrape data

by hippo (Archbishop)
on Dec 02, 2024 at 09:42 UTC


in reply to Script to scrap data

    my $fecha_tramite_min   = '24/11/2024';
    my $fecha_tramite_max   = '27/11/2024';
    my $fecha_matricula_min = '01/01/1900';
    my $fecha_matricula_max = '31/12/2000';
    # ...
    if (   $fecha_tramite   ge $fecha_tramite_min
        && $fecha_tramite   le $fecha_tramite_max
        && $fecha_matricula ge $fecha_matricula_min
        && $fecha_matricula le $fecha_matricula_max
        && $prov_matriculacion eq $prov_matriculacion_filtro) {

At least one of the problems is that you are comparing dates lexically. This cannot work where the least significant part of the date (the day of the month) comes first: e.g. with your current code, any $fecha_tramite falling on the 25th or 26th of any month in any year will match, which is presumably not what you want.
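To see the failure mode concretely, here is a minimal sketch using your two filter boundaries; the December 2023 date is made up for illustration:

```perl
use strict;
use warnings;

# String comparison looks at characters left to right, so with
# dd/mm/yyyy strings the *day* field dominates the result.
my $fecha_tramite = '25/12/2023';    # clearly before the range below

if ($fecha_tramite ge '24/11/2024' && $fecha_tramite le '27/11/2024') {
    print "matches\n";               # this branch runs, wrongly
}
else {
    print "no match\n";
}
```

'25' sorts between '24' and '27', so the comparison never even reaches the month or year.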

There are any number of ways to solve this. I would reach for Time::Piece, convert all the dates into Time::Piece objects and then use the less-than and greater-than operators (< and >) to compare them.
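A minimal sketch of that approach, assuming the dd/mm/yyyy format from your snippet (the variable names follow your code; the sample transaction date is made up):

```perl
use strict;
use warnings;
use Time::Piece;

# Parse dd/mm/yyyy strings into Time::Piece objects so comparisons
# respect actual chronology rather than string order.
my $fmt = '%d/%m/%Y';
my $fecha_tramite_min = Time::Piece->strptime('24/11/2024', $fmt);
my $fecha_tramite_max = Time::Piece->strptime('27/11/2024', $fmt);

my $fecha_tramite = Time::Piece->strptime('25/12/2023', $fmt);

# Time::Piece overloads the numeric comparison operators.
if ($fecha_tramite >= $fecha_tramite_min && $fecha_tramite <= $fecha_tramite_max) {
    print "in range\n";
}
else {
    print "out of range\n";          # correct: Dec 2023 precedes the range
}
```

The same lexical comparison that wrongly matched this date above is rejected here, because the objects compare by their underlying epoch times.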

Looking at the big picture, ensure that you have the right to extract these data sets in the first place. And if you do, consider searching for an API which would be less brittle than parsing HTML for the results.


🦛
