PerlMonks  

Re: Including code without packaging?

by mikfire (Deacon)
on May 16, 2001 at 00:54 UTC


in reply to Including code without packaging?

I could be slightly annoying and point out Config::IniFiles already exists. Then again, "rolling your own" can frequently be fun.

As to your exact question, I would make a package anyway. It only adds a few lines to the file and I can almost promise you this code will grow. More functions will be added, maybe some different variables to control how errors are handled, etc. If you start with a package now, it will save you some hassle later.
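A minimal sketch of what that starting point might look like — a hand-rolled config reader wrapped in a package from day one (the module name, sub name, and file format here are illustrative, not from the original post):

```perl
# MyConfig.pm - a minimal hand-rolled "key = value" config reader,
# wrapped in a package so it can grow into a real module later.
package MyConfig;

use strict;
use warnings;

use Exporter 'import';
our @EXPORT_OK = qw(read_config);

# Parse simple "key = value" lines into a hash reference,
# skipping blank lines and "#" comments.
sub read_config {
    my ($file) = @_;
    open my $fh, '<', $file or die "Cannot open $file: $!";
    my %config;
    while ( my $line = <$fh> ) {
        chomp $line;
        next if $line =~ /^\s*(?:#|$)/;
        my ( $key, $value ) = $line =~ /^\s*(\S+)\s*=\s*(.*?)\s*$/;
        $config{$key} = $value if defined $key;
    }
    close $fh;
    return \%config;
}

1;
```

A caller would then write `use MyConfig qw(read_config); my $cfg = read_config('app.ini');` — and when error-handling options or new accessors are needed later, they slot into the existing package instead of forcing a restructure.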

Of course, ymmv
mikfire

Re: Re: Including code without packaging?
by traveler (Parson) on May 16, 2001 at 01:12 UTC
    When I check CPAN for this, I can only access the README; the module itself gives a "Not Found". Is this a CPAN problem, or has the module been removed?
      Go here and click on the Latest Release link. That worked for me.
      mikfire
Re: Re: Including code without packaging?
by Madams (Pilgrim) on May 18, 2001 at 07:04 UTC
    I agree with your viewpoint. I personally have a module named MyUtilities where I just stuff algos I use over and over. One day a REAL module may just spontaneously jump out of it. Meanwhile I use it like a junk drawer for code... :)
    _________________
    madams@scc.net
