Re: how do i parse multiple files and write all of the results into single file

by Grygonos (Chaplain)
on Jul 08, 2005 at 15:36 UTC



in reply to how do i parse multiple files and write all of the results into single file

You need to show some effort. Based on the two questions you have posted recently, I cannot tell that you have ever written a line of Perl in your life. Show us what you have tried.
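
Even a rough first attempt would give us something to work with. As a sketch only (the output name combined.txt and the /pattern/ match are placeholders, and it assumes plain-text input files named on the command line), that attempt might look like:

    #!/usr/bin/perl
    use strict;
    use warnings;

    # Open one output file; everything extracted from every input ends up here.
    # ("combined.txt" and /pattern/ are placeholders only.)
    open my $out, '>', 'combined.txt' or die "Cannot open combined.txt: $!";

    for my $file (@ARGV) {
        open my $in, '<', $file or die "Cannot open $file: $!";
        while (my $line = <$in>) {
            # Replace this with whatever "parse" means for your data.
            print $out $line if $line =~ /pattern/;
        }
        close $in;
    }

    close $out;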

Also, a plea to other monks: stop doing people's work for them. This site (to me, anyhow) is about learning and self-betterment where Perl is concerned. Answering someone's question in that fashion when they have shown zero effort does the OP no good.


Replies are listed 'Best First'.
Re^2: how do i parse multiple files and write all of the results into single file
by my_perl (Initiate) on Jul 11, 2005 at 10:01 UTC
    I am a real beginner, and I really appreciate this forum, because people answer your questions without any prejudice, whatever level you are at. It is hard for me to show you my code, because I am trying to do something for work, and I don't think that would be proper. I would like to thank one more time all the people who put in their time and answered my questions without judging. As you said, this site means one thing to you; to them it obviously means another, and they are willing to help.
