PerlMonks
s///

by THuG (Beadle)
on Aug 08, 2000 at 17:04 UTC [id://26823]



in reply to RE: Length Field
in thread Parsing and \G and /g and Stupid User Tricks

Yeah,
If I'm having to read in the entire file anyway, then I might as well break it into separate lines. I imagine s/($RID)/\n$1/g would do the trick (is that valid?). Then I don't have to hunt for the next record if the current one is FUBAR.

-Travis
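The substitution Travis asks about is valid. A minimal sketch of it, assuming a literal "RID" marker stands in for the real $RID pattern (and some placeholder record data), since the actual record format isn't shown in the thread:

```perl
use strict;
use warnings;

# Hypothetical record-id pattern; substitute the real $RID here.
my $RID = qr/RID/;

# Placeholder data: three records run together with no separators.
my $data = "RIDaaaRIDbbbRIDccc";

# Insert a newline before each record id so every record sits on its own line.
( my $lined = $data ) =~ s/($RID)/\n$1/g;

# A leading blank line appears because the first id is at the very start
# of the string, so drop empty fields when splitting.
my @records = grep { length } split /\n/, $lined;
```

One caveat: if the raw data can already contain newlines inside a record, split on the inserted marker lines rather than trusting every "\n" to be a record boundary.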

Replies are listed 'Best First'.
RE: s///
by jlistf (Monk) on Aug 08, 2000 at 17:17 UTC
    you can do this without reading in the entire file all at once. start at a $RID code, read until you hit the next $RID code. what you just read will be one full record from the file. test it, etc. then continue. the only problem will be figuring out how to stop reading once you hit a rid code. you could do some combination of seek and read to read in some data, find a $RID code and seek backwards through the file to the beginning of the code. something like:
    $currpos = 0;
    while ( seek( $dpf, $currpos, 0 ) && read( $dpf, $input, 80 ) ) {
        # read up to 80 characters starting at $currpos
        $input =~ m/($RID)(.*?)(?=$RID)/g;  # grab a code and its data; pos() stops at the next id
        # test $1 and $2 for errors
        $currpos += pos( $input );          # file offset of the beginning of the next id
    }
    i think that'll do it... i might be missing something though.

    jeff
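A runnable version of jeff's seek/read loop, assuming a literal "RID" marker in place of the real $RID pattern and a throwaway temp file standing in for the data file (both are illustrations, not part of the original thread):

```perl
use strict;
use warnings;
use File::Temp qw(tempfile);

# Hypothetical record-id marker; substitute the real $RID pattern.
my $RID = qr/RID/;

# Build a small stand-in data file: records run together with no separators.
my ( $fh, $path ) = tempfile();
print {$fh} "RIDfirst-recordRIDsecond-recordRIDthird";
close $fh;

open my $dpf, '<', $path or die "open: $!";

my @records;
my $input;
my $currpos = 0;
while ( seek( $dpf, $currpos, 0 ) && read( $dpf, $input, 80 ) ) {
    # One full record runs from one id up to (not including) the next id.
    # The lookahead leaves pos() at the start of the next record.
    last unless $input =~ m/($RID)(.*?)(?=$RID)/gs;
    push @records, "$1$2";       # test $1 and $2 for errors here
    $currpos += pos($input);     # file offset of the next record id
}
close $dpf;
unlink $path;
```

Note the limitation jeff alludes to: the final record has no trailing id, so the lookahead never matches it and it needs a separate end-of-file pass. A record longer than the 80-byte read window would also need the buffer grown until a second id appears.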
