PerlMonks  

Re: Line noise

by liverpole (Monsignor)
on Sep 22, 2005 at 21:25 UTC [id://494364]



in reply to Line noise

How in the world did you do this?  I have no idea where to start parsing!

Well, the message is very well hidden, in any event!  I tried to de-obfuscate it first, but finally gave up and ran it. (++ tomorrow, when I get more votes to cast!)

Replies are listed 'Best First'.
Re^2: Line noise (use B::Deparse)
by grinder (Bishop) on Sep 23, 2005 at 08:18 UTC
    I have no idea where to start parsing

    You may not know it, but it's very easy to tap into perl's own parser, and find out what it thinks of a given piece of code:

    % perl -MO=Deparse chester
    chester syntax OK
    $+ + s/\\\s
    q\\+//s
    + s/$/$$/s / s///s
    . s[{q`!^/s.`^q\s][. s]s
    . s/\d*/do { 'JAPH' }/es
    + 'es)/$'
    + s/$_\$/$_$_/s
    + s///s
    + print($_);

    (whitespace added for clarity).

    Once it's been through the wringer it becomes clear what is happening (well, to me at least). As it turns out, it looks like chester threw a couple of diversions into the mix.

    It used to be that a particularly clever obfu would cause B::Deparse to crash, which would prevent such casual analysis. As a result, for better or worse, B::Deparse has a lot more smarts these days and it's just about impossible to trip it up. Much kudos is earned by managing to do so.
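    To make the mechanism concrete, here is a minimal sketch of the same technique on a trivial one-liner (the array name and strings are illustrative, not from the thread). B::Deparse ships with the perl core, so nothing extra needs installing: perl compiles the code, then the Deparse backend walks the resulting op tree and prints equivalent source on stdout, with the "syntax OK" notice going to stderr.

    ```shell
    # Compile (but don't run) the code, then deparse perl's op tree
    # back into source. The array and strings are made up for the demo.
    perl -MO=Deparse -e 'my @monks = ("liverpole", "grinder"); print "$_\n" for @monks;'
    ```

    On an obfu you replace the -e CODE part with the script's filename, which is exactly what the chester invocation above does.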

    • another intruder with the mooring in the heart of the Perl
