I know SFA about NLP, but I'd imagine that Perl would
be quite suitable: it has excellent text segmentation
support, which I'd guess would be useful. (Although I'd
think semantic structure would be more important than
text chunking... argh. Too much image processing in
brain.)
As far as Perl's "suitability" goes: Perl's really
good for hashing out proof-of-concept code and dealing
with text, which would (likely) make it good for some
aspects of AI programming. By the same token, Lisp is
good for, say, genetic programming (since its code/data
separation is even thinner than Perl's), Prolog's good
for planning/reasoning (because its logical syntax
makes encoding knowledge easier), and C's good for
search-heavy and general number-crunching work (neural
nets, anyone?), because, well, you can write some bloody
fast C code.
In my (limited) experience, a lot of AI boils down
to number crunching, which Perl isn't so good at.
(Not that it can't crunch numbers, but it can't crunch
them fast enough. And let's face it: you're going to
be a happier camper if your program takes one second
to run than if it takes five minutes, especially when
you're tweaking parameters.) I'd be most likely to
write a prototype in Perl (provided Lisp, Prolog, and
Haskell are unsuited for the job), make sure that I'm
on the right track, then recode it in C for the efficiency
win. As always, YMMV.
--
:wq