Thanks for the pointer, Corion.
Here's a simple example to compare my hypothetical API with Web::Scraper:
Example: Get the 5 most recent tweets from someone's Twitter page
The idea is that the demo script should take a twitter nickname as a command-line argument, and print the time, author, and text body of the last five tweets from that person's Twitter page to STDOUT.
The implementation with Web::Scraper:
use strict;
use warnings;
use POSIX qw(strftime);
use URI;
use Web::Scraper;

my $tweets_url = "http://twitter.com/%s";
my $tweets_query = scraper {
    process 'li[data-item-type="tweet"]', 'tweets[]' => scraper {
        process '*[data-name]', 'name' => '@data-name';
        process '*[data-time]', 'time' => '@data-time';
        process '.content p',   'text' => 'TEXT';
    };
};
my $tweets = $tweets_query->scrape(
    URI->new(sprintf $tweets_url, $ARGV[0]));

for my $tweet (@{$tweets->{tweets}}[0..4]) {
    last if !$tweet;
    my $date = strftime('%b %d', localtime $tweet->{time});
    print "\n$tweet->{name} tweeted on $date:\n $tweet->{text}\n";
}
The implementation with my proposed API:
use strict;
use warnings;
use POSIX qw(strftime);
use My::Query qw(register_query);

register_query 'recent_tweets' => {
    url   => "http://twitter.com/%s",
    items => '//li[@data-item-type="tweet"]',
    parse => { 'name' => '//@data-name',
               'time' => '//@data-time',
               'text' => '//*[@class="content"]/p' },
};

my $it = recent_tweets( $ARGV[0] );
for (0..4) {
    my $tweet = $it->() or last;
    my $date = strftime('%b %d', localtime $tweet->{time});
    print "\n$tweet->{name} tweeted on $date:\n $tweet->{text}\n";
}
(I haven't used the "return rows as objects" thing here, as I've decided it should be optional and it doesn't gain us anything in simple cases like this.)
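For illustration, here's roughly what I imagine the opt-in object mode looking like. The `class` option and the generated accessors are entirely hypothetical at this point, since My::Query is only a proposal:

```perl
# Hypothetical: ask My::Query to bless each row into a class
# with read-only accessors named after the parse fields.
register_query 'recent_tweets' => {
    url   => "http://twitter.com/%s",
    items => '//li[@data-item-type="tweet"]',
    parse => { 'name' => '//@data-name',
               'time' => '//@data-time',
               'text' => '//*[@class="content"]/p' },
    class => 'My::Tweet',   # opt in to rows-as-objects
};

my $it = recent_tweets( $ARGV[0] );
while ( my $tweet = $it->() ) {
    # $tweet->name etc. instead of $tweet->{name}
    printf "%s: %s\n", $tweet->name, $tweet->text;
}
```

In a simple case like the demo it buys nothing, but accessors would at least catch typos in field names, which silent hash lookups don't.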
Sample output (same for both implementations):
$ ./recent-tweets.pl TimToady
Larry Wall tweeted on Mar 15:
@anocelot Lemme guess, only the first word is different...
Larry Wall tweeted on Mar 13:
I need to ask the Guinness folks what the current world record is for number of invitations to connect on LinkedIn ignored.
Larry Wall tweeted on Feb 14:
@genespeth \o/
Larry Wall tweeted on Feb 12:
Let us not forget that the perfect is also the enemy of the bad and the ugly.
Larry Wall tweeted on Feb 03:
Wow. Just...wow. #sb48
Both get the job done, and it's certainly not a night-and-day difference, but I do prefer my API even in simple cases like this, because...
- ...of its declarative nature: instead of saying "perform these operations, and store these values along the way", you say "I want a list of items/rows as output, each with these fields, and here are the parsing rules for extracting them". That feels more elegant to me, though of course that's subjective.
- ...it doesn't make you jump through hoops to keep all information about the query (including how to construct the URL) in one place.
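To illustrate the hoops I mean: with Web::Scraper, nothing ties the URL template to the extraction rules, so keeping them together is something you arrange yourself, e.g. by bundling both in a hash (the names here are my own, not anything Web::Scraper provides):

```perl
# Manual bundling of URL template and scraper, since
# Web::Scraper itself has no place to keep the URL.
my %recent_tweets = (
    url     => "http://twitter.com/%s",
    scraper => $tweets_query,   # the scraper from above
);
my $result = $recent_tweets{scraper}->scrape(
    URI->new(sprintf $recent_tweets{url}, $ARGV[0]));
```

whereas register_query makes that bundling part of the API itself.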
What do you think?