Re: Extract data from website and transfer it to Outlook
by punkish (Priest) on Apr 28, 2006 at 02:38 UTC
> I have no experience with constructing Perl code... I think only Perl can save me

Why do you think so? I am curious. If you have no experience with Perl, what makes you think it can save you? From my experience, taking your statements at face value, you will probably make your life miserable trying to accomplish this task with Perl. Why not do it in some other programming language that you might already know?
That said, yes, Perl definitely can assist you in this task. But Outlook fields are a mess: all kinds of esoteric information is stored in them, and you have to make sure your fields map onto them properly.
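One way to sidestep Outlook's internals entirely is to write a CSV file that Outlook's import wizard understands and let it do the field mapping for you. A minimal sketch, assuming you already have the scraped records in hand; the sample data is made up, and you should check the exact column names against a CSV exported from your own copy of Outlook:

```perl
#!/usr/bin/perl
use strict;
use warnings;

# Column headers that Outlook's contact-import wizard recognizes.
# (An assumption -- export a contact from your Outlook first and
# copy its header row to be safe.)
my @headers = ('First Name', 'Last Name', 'E-mail Address', 'Business Phone');

# Hypothetical records scraped from the website.
my @contacts = (
    [ 'Jane', 'Doe', 'jane@example.com', '555-0100' ],
    [ 'John', 'Roe', 'john@example.com', '555-0101' ],
);

open my $fh, '>', 'contacts.csv' or die "contacts.csv: $!";
print {$fh} join(',', @headers), "\n";
for my $c (@contacts) {
    # Quote every field so embedded commas don't break the columns.
    print {$fh} join(',', map { '"' . $_ . '"' } @$c), "\n";
}
close $fh;
```

Then File &gt; Import in Outlook, point it at contacts.csv, and map the columns once in the wizard.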
Is the website that holds the information under your control? It is probably powered by a database. Could you wrangle access to that database? If so, your job will be much easier. If not, read on...
Do the following... instead of mucking around with Perl right away, launch MS Word (assuming you have a reasonably modern copy of it). Open your website's URL in Word and suck the entire site down, traversing each link until you have all the information. Now you will have all the info on your 'puter in one big mongo file.
Save that file as text and scan through it looking for patterns. See if you can get Excel to parse all that crap into columns.
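If the dump turns out to be regular enough, Perl itself is handy for that pattern-scanning step. A sketch under an assumed format (the "Name: ... Email: ..." lines are hypothetical; substitute whatever pattern your own dump actually shows):

```perl
#!/usr/bin/perl
use strict;
use warnings;

# Hypothetical dump; in practice, read this from the saved text file.
my $dump = <<'END';
Name: Jane Doe   Email: jane@example.com
junk line with no contact info
Name: John Roe   Email: john@example.com
END

# Pull out name/email pairs and emit tab-separated columns,
# which Excel will split straight into cells on paste or import.
my @rows;
for my $line (split /\n/, $dump) {
    if ($line =~ /Name:\s*(.+?)\s+Email:\s*(\S+)/) {
        push @rows, "$1\t$2";
    }
}
print "$_\n" for @rows;
```

Lines that don't match the pattern simply fall through, which is usually what you want with a messy dump.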
Actually, you can also try opening the URL directly in Excel. It supports web queries that try to parse tabular info out of the HTML (if that is applicable to you).
Truly, Perl may be the most inappropriate tool for you given that you are "currently looking up how to even start a Perl script."
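For what it's worth, should you decide to start anyway, the canonical skeleton of a Perl script is tiny. A minimal sketch:

```perl
#!/usr/bin/perl
use strict;      # catch undeclared variables and other common slips
use warnings;    # warn about dubious constructs at runtime

my $greeting = 'Hello from Perl';
print "$greeting\n";
```

Save it, run it with `perl scriptname.pl`, and build up from there.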
Good luck. You will need it.
when small people start casting long shadows, it is time to go to bed