Going through the links pointing to external sites to scan for spam links is also interesting, hence this is indeed desirable.
I currently lack the time to do it myself, and database access is somewhat scarce, but the relevant DB schema is (roughly):
create table node (
    node_id integer not null unique primary key,
    type_nodetype integer not null references node,
    author_user integer not null references node
);

create table user (
    user_id integer not null references node
);

create table note (
    note_id integer not null unique primary key references node(node_id),
    doctext text not null default ''
);
And real, working SQL to query these tables is (also at Replies with outbound links, but that node is accessible to gods only):
select node_id, doctext
from node
left join document on node_id = document_id
where lastedit > date_sub( current_date(), interval 10 day )
and type_nodetype = 11 -- note
and doctext like '%http://%'
order by lastedit desc
This SQL should be refined to also catch https:// links, and then some Perl code needs to be written to verify that the text is an actual link.
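A rough sketch of that Perl verification step might look like the following. The regex is a deliberate approximation, not a full RFC 3986 URL parser, and `extract_links` is a made-up helper name; a CPAN module such as URI::Find would be more robust if available:

    #!/usr/bin/perl
    use strict;
    use warnings;

    # Hypothetical helper: return all http(s) URLs found in a chunk of
    # node text. The character class is a rough sketch that stops at
    # whitespace, angle brackets and quotes.
    sub extract_links {
        my ($doctext) = @_;
        return $doctext =~ m{(https?://[^\s<>"']+)}g;
    }

    my $text = q{<p>See <a href="https://example.com/foo">foo</a></p>};
    print "$_\n" for extract_links($text);

On the SQL side, MySQL's `regexp` operator could replace the `like` clause with something along the lines of `doctext regexp 'https?://'` to catch both schemes in one condition.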
Test cases for text with links would, for example, be:
<p>It's right [https://cpants.cpanauthors.org/release/GWHAYWOOD/sendma
Negative test cases would be:
Ideally, we will be able to refine this code later to highlight outbound links that are not on the whitelist of Perlmonks links.
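The whitelist check could later be sketched roughly as below. The domain list here is a placeholder for illustration only, not the actual Perlmonks whitelist:

    use strict;
    use warnings;

    # Placeholder whitelist of hosts; the real list would live elsewhere.
    my %whitelist = map { $_ => 1 } qw(
        perlmonks.org www.perlmonks.org metacpan.org
    );

    # Return true if the URL's host is on the whitelist.
    sub is_whitelisted {
        my ($url) = @_;
        my ($host) = $url =~ m{^https?://([^/:]+)}i;
        return defined $host && $whitelist{ lc $host };
    }

    my @outbound = grep { !is_whitelisted($_) } @links_from_node;
    # @outbound now holds the links worth highlighting for review

Keeping the whitelist as a hash keyed by host makes the lookup O(1) per link, which should be cheap enough even when sweeping ten days of notes.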