In almost any situation, there are those who prefer maxims to thinking; ignore them. With specific regard to the odd web scrape, two things commonly cause problems:
- The need to handle nesting.
- The need to survive arbitrary changes in the source.
The first can be handled by tight expression bounding or by using something like Text::DelimMatch. You manage the second by vigilance: patch it when the source changes.
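Text::DelimMatch is a Perl module; as a rough sketch of the same two ideas in Python (a stack-based balanced-delimiter scan for nesting, plus a tightly bounded non-greedy regex), something like this would do. The function name and the sample markup are illustrative, not from any library:

```python
import re

def match_delimited(text, open_d="(", close_d=")"):
    """Return the first balanced open..close span in text, or None.
    A crude analogue of what Text::DelimMatch does: track nesting
    depth with a counter instead of trying to do it in one regex."""
    start = text.find(open_d)
    if start == -1:
        return None
    depth = 0
    for i in range(start, len(text)):
        if text[i] == open_d:
            depth += 1
        elif text[i] == close_d:
            depth -= 1
            if depth == 0:
                return text[start:i + 1]
    return None  # delimiters were unbalanced

# Tight expression bounding: anchor the pattern on markup that
# brackets exactly the record you want, and keep the capture non-greedy
# so it cannot run past the closing tag.
row = re.search(r'<td class="price">(.*?)</td>',
                '<td class="price">$9.99</td>')
```

Here `match_delimited("a(b(c)d)e")` yields the whole nested span `(b(c)d)`, and the bounded regex captures `$9.99` without swallowing the rest of the page.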
As for those who knee-jerk instead of thinking: since many of them have no experience writing parsers (formal or ad hoc), they fail to see that the regex approach is itself a form of parsing, just without the overhead of dealing with things that don't matter.
"Never try to teach a pig to sing...it wastes your time and it annoys the pig."