PerlMonks

Meditations


If you've discovered something amazing about Perl that you just need to share with everyone, this is the right place.

This section is also used for non-question discussions about Perl, and for any discussions that are not specifically programming related. For example, if you want to share or discuss opinions on hacker culture, the job market, or Perl 6 development, this is the place. (Note, however, that discussions about the PerlMonks web site belong in PerlMonks Discussion.)

Meditations is sometimes used as a sounding-board — a place to post initial drafts of perl tutorials, code modules, book reviews, articles, quizzes, etc. — so that the author can benefit from the collective insight of the monks before publishing the finished item to its proper place (be it Tutorials, Cool Uses for Perl, Reviews, or whatever). If you do this, it is generally considered appropriate to prefix your node title with "RFC:" (for "request for comments").

User Meditations
search.cpan.org, metacpan and PAUSE all broken in different ways?
3 direct replies — Read more / Contribute
by Sixes
on Jul 19, 2014 at 14:25

    Starting with PAUSE, I have uploaded several new modules to PAUSE. Each time, I get a message that says:

    This distribution name can only be used by users with permission for the package <My::Package>, which you do not have.

    The packages are in new namespaces. As I understand it, simply uploading a new module should allocate that namespace to me on a "first-come" basis. But it isn't.

    This doesn't seem to matter to search.cpan.org, when it's working: it still indexes the module so that it can be found and downloaded via the cpan utility.

    However, that doesn't seem to apply to metacpan. It uses 02packages.details.txt, which isn't being updated, presumably because of the PAUSE issue. Thus my modules are not appearing in metacpan's search. Metacpan's help says:

    MetaCPAN uses the PAUSE generated 02packages.details.txt file. If it's not in there, then the module author will need to fix this,

    Does anyone know if it's fixable? I have mailed modules@perl.org a couple of times but have received no response.

The problem with "The Problem with Threads"
3 direct replies — Read more / Contribute
by BrowserUk
on Jul 18, 2014 at 07:26

    This started life as a reply to Re^2: Which 'Perl6'? (And where?), but it seems too important to bury down there in a long-dead thread, as a reply to an author I promised to resist and who probably will not respond. So I'm putting it here to see what interest, if any, it arouses.


    1. Is concurrency appropriate? There are two basic motivations ... and 2) to speed things up. In the latter case, if the problem being tackled is really IO bound, turning to concurrency probably won't help.

      That is way too simplistic a view. If the problem is IO bound to a single, local hard disk, and is uncacheable, then concurrency may not help.

      But change any of the four defining elements of that criterion; and it might -- even: probably will -- be helped by well-written asynchronicity. E.g.

      1. If the IO data is, or can be, spread across multiple local physical drives; concurrency can speed overall throughput by overlapping requests to different units.
      2. If the disks are remote -- as in SAN, NAS, cloud etc. -- then again, overlapping requests can increase throughput by utilising buffering and waiting time for processing.
      3. If the drives aren't harddisks, but SSDs; or SSD buffered HDs; or PCI connected virtual drives; then overlapping several fast read requests with each slower write request can more fully utilise the available bandwidth and improve throughput.
      4. If the IO involved displays temporal locality of reference -- that is, if the nature of the processing is such that a subset of the data has multiple references over a short period of time, even if that subset changes over the longer term -- then suspending the IO for new references until re-references to existing cached data play out comes about naturally if fine-grained concurrency is used.

      And if some or all of the IO in your IO bound processing is to the network, or network attached devices; or the intranet; or the internet; or the cloud; -- eg. webserving; webcrawling; webscraping; collaborative datasets; email; SMS; customer facing; ....... -- then both:

      • Preventing IO from freezing your processing;
      • And allowing threads of execution whose IO has completed to continue as soon as a core is available -- i.e. not also have to wait for any particular core to become available;

      is mandatory for effective utilisation of modern hardware and networks; even for IO-bound processing.

      Only kernel(OS) threading provides the required combination of facilities. Cooperative multitasking (aka. 'green threads'; aka. Win95 tech) simply does not scale beyond the single core/single thread hardware of the last century.
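      The overlapping-requests argument above can be sketched with Perl's kernel threads. Below is a minimal worker-pool example using the core threads and Thread::Queue modules; the doubling is just a stand-in for the blocking IO a real program would do in each worker:

```perl
#!/usr/bin/perl
use strict;
use warnings;
use threads;
use Thread::Queue;

# A minimal worker pool: four kernel threads drain a shared queue, so
# one slow item delays only its own thread while the others proceed.
my $work    = Thread::Queue->new(1 .. 20);   # stand-ins for IO requests
my $results = Thread::Queue->new;

my @workers = map {
    threads->create(sub {
        while (defined(my $item = $work->dequeue_nb)) {
            $results->enqueue($item * 2);    # real code: read/fetch here
        }
    });
} 1 .. 4;

$_->join for @workers;
$results->end;

my $total = 0;
$total += $_ while defined($_ = $results->dequeue_nb);
print "$total\n";    # 420
```

      Because the queue is drained with non-blocking dequeues after the work is pre-loaded, no locks or shared variables are needed at all; the queue is the only synchronisation point.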

    2. The Problem with Threads.

      The problem with "The Problem with Threads", is that it is just so much academic hot air divorced from the realities of the real world.

      Only mathematicians and computer scientists demand total determinacy; and throw their arms up in refusal to work if they don't get it.

      The rest of the world -- you, me, mothers and toddlers, doctors, lawyers, spacemen, dustmen, pilots, builders, shippers, movers & shakers, factory workers, engineers, tinkers, tailors, soldiers, sailors, rich & poor men, beggars and thieves; all have to live in the real -- asynchronous -- world, where shit happens.

      Deliveries are late; machines break down; people are sick; power-outs and system-downs occur; the inconvenient realities of life have to be accepted, lived with and dealt with.

      The problem is not that threading is hard; the problem is that people keep on saying that "threading is hard"; and then stopping there.

      Man is very adept at dealing with hard and complex tasks

      Imagine all the places you'd never have been; all the things you'd never have done; if the once-widespread belief that we would suffocate if we attempted to travel at over 30mph had prevailed.

      Too trivial an example for you? Ok. Think about heart transplantation. Think about the problems of disconnecting and reconnecting the (fragile, living) large bore pipes supplying and removing the pumped liquid; the wires carrying electrical control signals; the small bore pipes carrying the lubricants needed to keep the pump alive and removing the waste. Now think about the complexities of doing a pump change whilst keeping the engine running; the passengers comfortable and the 'life force' intact. And all the while contending with all the other problems of compatibility; rejection; infection; compounded diagnosis.

      Circa 5,000 heart transplants occurred last year. Mankind is good at doing difficult things.

      Asynchronicity and non-determinism are 'solved problems' in almost every other walk of life

      From multiple checkouts in supermarkets; to holding patterns in the skies above airport hubs; to off & on ramps on motorways; to holding tanks in petro-chemical plants; to waiting areas in airports and doctors and dentists surgeries; to carousels in baggage claims and production lines; distribution warehouses in supply chains; roundabouts and filter-in-turn; {Add the first 10 things that spring to your mind here! }.

      One day in the near future a non-indoctrinated mathematician is going to invent a symbol for an asynchronous queue.

      She'll give it a nice, technical sounding name like "Temporally Lax Composer", which will quickly become lost behind the cute acronym and new era of deterministic, asynchronous composability will ensue.

      And the academic world will rejoice, proclaim her a genius of our time, and no doubt award her a Nobel prize. (That'd be nice!)

      And suddenly the mathematicians will realise that a process or system of processes can be deterministic, without the requirement for every stage of the process (equation) to occur in temporal lockstep.

      'Safety' is the laudable imperative of the modern era.

      As in code-safety and thread-safety, but also every other kind of predictable, potentially preventable danger.

      Like piety, chastity & sobriety from bygone eras, it is hard to argue against; but the world is full (and getting fuller) of sexually promiscuous atheists who enjoy a drink; that hold down jobs, raise kids and perform charitable works. The world didn't fall apart with the wane of the religious, moral and sobriety campaigns of the past.

      In an ideal world, all corners would be rounded; flat surfaces 'soft-touch'; voltages would be low; gases non-toxic; hot water wouldn't scald; radiant elements wouldn't sear; microwaves would be confined to lead-lined bunkers; there'd be no naked flames; and every home would be fire-proof, flood-proof, hurricane-proof, tornado-proof, earthquake-proof, tsunami-proof and pestilence-proof.

      Meanwhile in the real-world, walk around your own house and see all the dangers that lurk for the unsupervised, uneducated, unwary, careless or stupid and ask yourself why do they persist? Practicality and economics.

      Theoreticians love theoretical problems; and eschew practical solutions.

      When considering concurrency, mathematicians love to invent ever so slightly more (theoretically) efficient solutions to the 'classical' problems.

      E.g. The Dining Philosophers. In a nutshell: how can 6 fil..Phillo.. guys eat their dinners using 5 forks without one or more of them starving? They'll combine locks and syncs, barriers and signals, mutexes and spinlocks and semaphores, trying to claw back some tiny percentage of a quasilinear factor.

      Why? Buy another bloody fork; or use a spoon; or eat with your damn fingers.

      The problem is said to represent the situation where you have 6 computers that need to concurrently use the scarce resource of 5 tape machines. But that's dumb!

      It's not a resource problem but a capital expenditure problem. Buy another damn tape machine and save yourself 10 times its cost by avoiding having to code and maintain a complex solution. Better still, buy two extra tape machines; cos as sure as eggs is eggs, it'll be the year-end accounting run, or the Black Friday consumer spending peak, when one of those tape machines defies the 3-sigma MTBF and breaks.
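      The pragmatic fix can even be shown in a few lines of Perl: rather than elaborate per-fork protocols, a single counting semaphore that seats at most N-1 diners guarantees that someone can always pick up both forks, so deadlock is impossible. A minimal sketch using the core threads, threads::shared and Thread::Semaphore modules:

```perl
#!/usr/bin/perl
use strict;
use warnings;
use threads;
use threads::shared;
use Thread::Semaphore;

# Seat at most N-1 philosophers at once: with 5 forks and only 4
# diners, at least one diner can always acquire both forks.
my $n     = 5;
my $seats = Thread::Semaphore->new($n - 1);
my @forks = map { Thread::Semaphore->new(1) } 1 .. $n;
my $meals :shared = 0;

my @diners = map {
    my $i = $_;
    threads->create(sub {
        $seats->down;                    # wait for one of the N-1 seats
        $forks[$i]->down;                # left fork
        $forks[($i + 1) % $n]->down;     # right fork
        { lock($meals); $meals++; }      # "eat"
        $forks[($i + 1) % $n]->up;
        $forks[$i]->up;
        $seats->up;
    });
} 0 .. $n - 1;

$_->join for @diners;
print "$meals\n";    # 5: every philosopher ate exactly once
```

      No lock ordering, no retries, no starvation analysis; just one cheap extra constraint.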

      Threading can be complex, but there are solutions to all of the problems, all around us, in the everyday, unreliable, non-deterministic operations of modern life.

      And the simplest solution to many of them is to avoid creating problems in the first place. Don't synchronise (unless you absolutely have to). Don't lock (unless it is absolutely unavoidable). Don't share (unless avoiding doing so creates greater problems).

      But equally, don't throw the baby out with the bath water. Flames are dangerous; but oh so very useful.

    3. Futures et al are the future. There are much simpler, safer, higher level ways to do concurrency. I haven't tried Paul Evans' Futures, but they look the part.

      And therein lies the very crux of the problem. Most of those decrying threads, and those offering alternatives to them, either haven't tried them -- because they read that they were hard -- or did try them on the wrong problems, and/or using the wrong techniques, and without taking the time to become familiar with and understand their requirements and limitations.

      Futures neither remove the complexity nor solve the problems; they just bury them under the covers forcing everyone to rely upon the efficacy of their implementation and the competence of the implementors.

      And the people making the decisions are taking advice from those thread-shy novices with silver bullets and employing those with proven track records of being completely useless at implementing threaded solutions.

      The blind taking advice from the dumb and employing the incompetent.
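      For concreteness, here is a minimal sketch of the style in question, using Paul Evans' CPAN Future module. The composition below is illustrative of typical usage, not a claim about any particular codebase:

```perl
use strict;
use warnings;
use Future;

# A Future is a placeholder for a result that may not exist yet.
# Composition (wait_all, then) hides the synchronisation; it does
# not eliminate it.
my $x = Future->new;
my $y = Future->new;

my $sum = Future->wait_all($x, $y)->then(sub {
    my ($fx, $fy) = @_;
    return Future->done($fx->get + $fy->get);
});

# Elsewhere -- an event loop, another thread -- the results arrive:
$x->done(2);
$y->done(3);

print $sum->get, "\n";    # 5
```

      Whether that hiding is a simplification or merely a relocation of the complexity is exactly the disagreement being aired here.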

    4. Perl 5 "threads" are very heavy. This sometimes introduces additional complexity.

      The "heaviness" of P5 threading is a misnomer. The threads aren't heavy; the implementation of shared memory is heavy. And that could easily be fixed. If there was any interest. If there wasn't an institutionalised prejudicial barrier preventing anyone even suggesting change to improve the threading support; much less supporting those with the knowledge and ideas to take them forward.

      They've basically stagnated for the past 8 or more years because p5p won't allow change.


    With the rise and rise of 'Social' network sites: 'Computers are making people easier to use everyday'
    Examine what is said, not who speaks -- Silence betokens consent -- Love the truth but pardon error.
    "Science is about questioning the status quo. Questioning authority".
    In the absence of evidence, opinion is indistinguishable from prejudice.
Selling swimsuits to a drowning man
8 direct replies — Read more / Contribute
by sundialsvc4
on Jul 16, 2014 at 07:19

    I like to go to “networking events” now and again (if the price is cheap), but sometimes it is depressing to watch the hawkers.

    The things that they are selling are little TLAs (three-letter acronyms) to attach after your name.   And, apparently, they’re not cheap.   But one comment that was announced to the crowd, by someone who I’m pretty-much sure was a “shill,” went something like this:

    “The very best thing I did for my career was to become a “SCRUM Master Level II”!!   I had to dig to find the money, but you know, you have to ‘invest in yourself to get ahead.’   You don’t even have to know how to program.”

    (Blink ...)

    And then there was this other guy – let’s be kind and say he didn’t exactly look like Chuck Norris – who nevertheless styled himself a “third-degree black belt master” about some sort of management-theory or another . . .

    So, is “the blind leading the blind” an accepted business-practice these days?   Even though the person who runs the shop where I get my car serviced might not be holding a grease-gun when he introduces himself to me, I do want to know that he has held one in the fairly recent past.   And, while I know that jobs like “car servicing” require a tremendous amount of intellectual study these days, at the end of the evening it’s still all about:   a machine.   You can’t abstract-away the experience of having actually done this and replace it with a little piece of very-expensive paper.   You can’t learn to swim by reading a book about it.

    Now, I, likewise, do not wish to here “abstract away” the value of intellectual study, or employee training, or personal self-education.   That’s not my point.   But I feel like I was watching swimsuits being sold to drowning people.   A little friendly-conversation around that room showed me that the length of the courses was usually “over the weekend,” and the prices of those weekends were high – $1000 to $3300 (USD).   Ouch.   If I had thought that I could just buy a bottle of “SCRUM sauce” and pour it over my career (with or without saying, “Shazam!”), I might at one time have been fairly-easily convinced to do so.   Maybe I’m just too old and gray to believe it now.   I do not see a credible value-proposition here, despite the intense sales pressure.

    So ... can I now ask the Monks for what is your perspective here?   Whether you, yourself, bought a Golden Ticket, or (I think more-likely) hired one, or didn’t, what were your experiences and insights into this matter?

CPAN down again... (search.cpan.org)
4 direct replies — Read more / Contribute
by cLive ;-)
on Jul 14, 2014 at 08:29

    I don't know what the full back story is behind this, but I've noticed that CPAN is going down a lot lately.

    As the first port of call for anyone searching Perl documentation, I think this reflects really badly on the community - I mean, if I wasn't sold on Perl, started looking into it and hit errors on the language's main (edit: "main module documentation") web site, I'd be thinking something along the lines of, "If they can't keep up a web site, how can we rely on Perl as a technology?".

    Is there no way that maybe the error pages could be updated if the server situation is not easily fixable? Surely it would be better to display more than just "504 Gateway Time-out"? Maybe something more descriptive, in a formatted HTML page that links to other Perl resources (PerlMonks, perl.com, perldocs.net, etc.).

    It's a little embarrassing when I'm excitedly discussing a cool aspect of Perl and then try to send that person the CPAN page, only to hit an error message. What can be done to fix this?

    (edited title for clarity)

Perl talk topics bank on Github
No replies — Read more | Post response
by perlfan
on Jul 13, 2014 at 12:12
    After YAPC, I had a thought that it'd be cool to start a bank of Perl talk topics that would be of interest for various audiences.

    The primary use I can think of is that someone who likes to give Perl talks and wants to give a talk (about anything) may consult this list to find something that strikes their fancy.

    Perl Talk Topics Bank on Github

    Another use: someone who would give a talk if they knew that a topic they felt comfortable speaking about was of interest to other people may be encouraged to step up.

    The immediate problem being solved here is a local one - at Houston.pm we're constantly asking the mailing list for topics or speakers to volunteer. Even though it is a local topics bank (act locally, right?), I'd like to get ideas from anyone willing to contribute.

    Eventually, I'd like to make this a list that is appropriate for any event or meeting where people talk about Perl and the interesting things we're doing with it.

    If you're uncomfortable with forking, editing, and issuing a pull request, please just create an issue with the topics you'd like to add.

    Hope to see you contribute to the topics list.

    Thank you!

Nobody Expects the Agile Imposition (Part VII): Metrics
8 direct replies — Read more / Contribute
by eyepopslikeamosquito
on Jul 13, 2014 at 04:31

    Not everything that can be counted counts and not everything that counts can be counted

    -- William Bruce Cameron

    What's measured improves

    -- Peter Drucker

    Three recent events got me thinking about software metrics again:

    • Management use individual KPIs to reward high performers in Sales and other departments. They are contemplating doing the same for Software Developers.
    • Performance Appraisals often seem subjective. Would metrics make them more objective? Or do more harm than good?
    • Larry Maccherone was in town recently promoting his company's approach to Agile metrics.

    I'm interested to learn:

    • How does your company reward Software Developers? Are the rewards team-based, individual-based, department-based, whole-company based? How well does it work?
    • Do you have Performance Appraisals? Do they use metrics? Do your Software Developers/Teams have KPIs?
    • Do you use metrics to improve your Software Development Process?

    I've done a bit of basic research on these topics, which I present below.

    Software Metric Gaming

    Key performance indicators can also lead to perverse incentives and unintended consequences as a result of employees working to the specific measurements at the expense of the actual quality or value of their work. For example, measuring the productivity of a software development team in terms of source lines of code encourages copy and paste code and over-engineered design, leading to bloated code bases that are particularly difficult to maintain, understand and modify.

    -- Performance Indicator (wikipedia)

    "Thank you for calling Amazon.com, may I help you?" Then -- Click! You're cut off. That's annoying. You just waited 10 minutes to get through to a human and you mysteriously got disconnected right away. Or is it mysterious? According to Mike Daisey, Amazon rated their customer service representatives based on the number of calls taken per hour. The best way to get your performance rating up was to hang up on customers, thus increasing the number of calls you can take every hour.

    Software organizations tend to reward programmers who (a) write lots of code and (b) fix lots of bugs. The best way to get ahead in an organization like this is to check in lots of buggy code and fix it all, rather than taking the extra time to get it right in the first place. When you try to fix this problem by penalizing programmers for creating bugs, you create a perverse incentive for them to hide their bugs or not tell the testers about new code they wrote in hopes that fewer bugs will be found. You can't win.

    Don't take my word for it, read Austin's book and you'll understand why this measurement dysfunction is inevitable when you can't completely supervise workers (which is almost always).

    -- Joel Spolsky on Measurement

    The anecdotes above are just the tip of the iceberg. I've heard many stories over the years of harmful gaming of metrics. It is clear that you should not introduce metrics lightly. It seems best to either:

    • Define metrics that cannot be effectively gamed; or
    • Win people's trust that metrics are being used solely to improve company performance and will not be used against anyone.
    Suggestions on how to achieve this are welcome.

    Performance Appraisals

    At a recent Agile metrics panel discussion, I was a bit surprised that everyone agreed that their teams had some "rock stars" and some "bad apples". And that "everyone knew who they were". And that you didn't need metrics to know!

    That's been my experience too. I've found that by being an active member of the team, you don't need to rely on numbers, you can simply observe how they perform day to day. Combine with regular one-on-ones plus 360-reviews from their peers and customers and it is obvious who the high performers are and who needs improvement.

    Though I personally feel confident with this process, I admit that it is subjective. I have seen cases where two different team leads have given markedly different scores to the same individual. Of course, these scores are at different times and for different projects. Still, personality compatibility (or conflict) between the team lead and team member can make a significant difference to the review score. It does seem unfair and subjective. Can metrics be used to make the performance appraisal process more objective? My feeling is that it would do more harm than good, as indicated in the "Software Metric Gaming" section above. What do you think?

    Software Development Process Metrics

    Lean-Agile City runs on folklore, intuition, and anecdotes

    -- Larry Maccherone (slide 2 of "The Impact of Agile Quantified")

    It's exceptionally difficult to measure software developer productivity, for all sorts of famous reasons. And it's even harder to perform anything resembling a valid scientific experiment in software development. You can't have the same team do the same project twice; a bunch of stuff changes the second time around. You can't have two teams do the same project; it's too hard to control all the variables, and it's prohibitively expensive to try it in any case. The same team doing two different projects in a row isn't an experiment either. About the best you can do is gather statistical data across a lot of teams doing a lot of projects, and try to identify similarities, and perform some regressions, and hope you find some meaningful correlations.

    But where does the data come from? Companies aren't going to give you their internal data, if they even keep that kind of thing around. Most don't; they cover up their schedule failures and they move on, ever optimistic.

    -- Good Agile, Bad Agile by Steve Yegge

    As pointed out by Yegge above, software metrics are indeed a slippery problem. Especially problematic is getting your hands on a high quality, statistically significant data set.

    The findings in this document were extracted by looking at non-attributable data from 9,629 teams

    -- The Impact of Agile Quantified by Larry Maccherone

    Larry Maccherone was able to solve Yegge's dataset problem by mining non-attributable data from many different teams, in many different organisations, from many different countries. While I found Larry's results interesting and useful, this remains a slippery problem because each team is different and unique.

    Each project's ecosystem is unique. In principle, it should be impossible to say anything concrete and substantive about all teams' ecosystems. It is. Only the people on the team can deduce and decide what will work in that particular environment and tune the environment to support them.

    -- Communicating, cooperating teams by Alistair Cockburn

    By all means learn from Maccherone's overall results. But also think for yourself. Reason about whether each statistical correlation applies to your team's specific context. And Larry strongly cautions against leaping to conclusions about root causes.

    Correlation does not necessarily mean Causation

    The findings in this document are extracted by looking for correlation between “decisions” or behaviors (keeping teams stable, setting your team sizes to between 5 and 9, keeping your Work in Process (WiP) low, etc.) and outcomes as measured by the dimensions of the SDPI. As long as the correlations meet certain statistical requirements we report them here. However, correlation does not necessarily mean causation. For example, just because we show that teams with low average WiP have 1/4 as many defects as teams with high WiP, doesn’t necessarily mean that if you lower your WiP, you’ll reduce your defect density to 1/4 of what it is now. The effect may be partially or wholly related to some other underlying mechanism.

    -- The Impact of Agile Quantified by Larry Maccherone

    "Best Practices"

    There are no best practices. Only good practices in context.

    -- Seven Deadly Sins of Agile Measurement by Larry Maccherone

    I've long found the "Best Practice" meme puzzling. After all, it is impossible to prove that you have truly found the "best" practice. So I welcomed Maccherone's opening piece of advice that the best you can hope for in a complex, empirical process, such as Software Development, is a good process for a given context. Which you should always be seeking to improve.

    A common example of "context" is business and economic drivers. If your business demands very high quality, for example, your "best practice" may well be four-week iterations, while if higher productivity is more important than quality, your "best practice" may be one-week sprints instead (see the "Impact of Agile Quantified Summary of Results" section below for iteration-length metrics).

    Team vs Individual Metrics

    From the blog cited by Athanasius:

    (From US baseball): In short, players play to the metrics their management values, even at the cost of the team.

    Yes, Larry Maccherone mentioned a similar anecdote from US basketball, where a star player had a very high individual scoring percentage ... yet statistics showed that the team actually won more often when the star player was not playing! Larry felt this was because he often took low-percentage shots to boost his individual score rather than pass to a player in a better position to score.

    Finding the Right Metrics

    More interesting quotes from this blog:

    The same happens in workplaces. Measure YouTube views? Your employees will strive for more and more views. Measure downloads of a product? You’ll get more of that. But if your actual goal is to boost sales or acquire members, better measures might be return-on-investment (ROI), on-site conversion, or retention. Do people who download the product keep using it, or share it with others? If not, all the downloads in the world won’t help your business.

    In the business world, we talk about the difference between vanity metrics and meaningful metrics. Vanity metrics are like dandelions – they might look pretty, but to most of us, they're weeds, using up resources, and doing nothing for your property value. Vanity metrics for your organization might include website visitors per month, Twitter followers, Facebook fans, and media impressions. Here's the thing: if these numbers go up, it might drive up sales of your product. But can you prove it? If yes, great. Measure away. But if you can't, they aren't valuable.

    Good metrics have three key attributes: their data are consistent, cheap, and quick to collect. A simple rule of thumb: if you can't measure results within a week for free (and if you can't replicate the process), then you’re prioritizing the wrong ones.

    Good data scientists know that analyzing the data is the easy part. The hard part is deciding what data matters.

$class = ref($class) if ref($class) Redux
3 direct replies — Read more / Contribute
by boftx
on Jul 04, 2014 at 00:51

    It has taken me a while to come around to the position that using code such as that in the title (or similar) is really cargo-cult programming. Given that this seems to be the case, I was somewhat dismayed to see that Moo automatically includes such code when it generates a constructor.

    I initially got around this by using before new to detect the presence of a ref instead of a string and issuing an appropriate croak message. (tip of the hat to the folks on #moose for that!)

    However, several others on #moose, notably some of the authors of Moo, said that was not a wise practice; and indeed, I found that the code was position-dependent in relation to where BUILDARGS is defined. Specifically, the code for before new must come after BUILDARGS is defined, or else the return value from that gets blown away.

    After asking a couple of questions, and getting the expected response of "Well, write it yourself!" I decided to do just that. I have a working patch to Moo that disables the code in question, replacing it with an exception.

    My question at this point is this: Is the ability to force a constructor to be a "class only" method a viable feature to have in Moo (and, by extension, Moose)?

    The solution I arrived at after reading the code for Moo and a few other MooX modules led me to use an option passed into Moo's import method. This was mainly due to the fact that other than a significant refactoring of Method::Generate::Constructor it does not seem feasible to alter the code being generated.

    While I am waiting for a review of my proposed patch, I'd like to hear the thoughts of my fellow monks if this is a desirable option to have.
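    For anyone who hasn't met the idiom in the title, here is a minimal plain-Perl sketch contrasting it with the class-only constructor behaviour being proposed. My::Widget and its attribute are hypothetical, purely for illustration:

```perl
#!/usr/bin/perl
use strict;
use warnings;
use Carp;

package My::Widget;    # hypothetical class, for illustration only

# The idiom from the title: an instance call is silently turned
# into a class call, so $obj->new quietly "works".
sub new_cargo_cult {
    my $class = shift;
    $class = ref($class) if ref($class);
    return bless {@_}, $class;
}

# The stricter alternative under discussion: a class-only
# constructor that refuses to be called on a reference.
sub new {
    my $class = shift;
    Carp::croak("new() is a class-only method") if ref $class;
    return bless {@_}, $class;
}

package main;

my $w  = My::Widget->new(size => 3);
my $w2 = $w->new_cargo_cult(size => 4);    # silently allowed
my $ok = eval { $w->new(size => 4); 1 };   # croaks

print ref($w2), "\n";                      # My::Widget
print $ok ? "allowed" : "croaked", "\n";   # croaked
```

    The question in the post is essentially whether Moo's generated constructor should offer the second behaviour as an option instead of always baking in the first.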

    You must always remember that the primary goal is to drain the swamp even when you are hip-deep in alligators.
RFC: Converted Fiscal5253 from Classic OO Perl to Moo
1 direct reply — Read more / Contribute
by boftx
on Jul 02, 2014 at 22:53

    I have just finished converting DateTimeX::Fiscal::Fiscal5253 from classic OO Perl to Moo. I would greatly appreciate any comments that result from taking a look at the changes. The test suite required almost no changes (mainly tightening up) so I am fairly confident that I have it close to being correct.

    You can find the Moo branch here on GitHub: https://github.com/boftx/DateTimeX-Fiscal-Fiscal5253/tree/moo-delegation (The current release is in the master branch as one might suppose.)

    I would like to get some feedback before I release even as a devel version since I am fairly certain it will pass the CPAN testers without a problem as it is passing a make disttest on my platform.

    Update: Of special interest is that I had to place a modifier on "new" in order to force "new" to be a class-only method, that is, to ensure that one could not call it as $obj->new. This works as desired, but what is interesting is that the call to `before new` must come after the code for BUILDARGS, or else the arg list returned from BUILDARGS gets blown up somehow before the object is instantiated.
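    A minimal sketch of that ordering (the class body is simplified and hypothetical; only the relative order of the two blocks is the point):

```perl
package DateTimeX::Fiscal::Fiscal5253;   # simplified illustration
use Moo;
use Carp;

has year => ( is => 'ro', default => sub { 2014 } );

# 1. Define BUILDARGS first ...
sub BUILDARGS {
    my ( $class, %args ) = @_;
    return { %args };
}

# 2. ... then install the class-only guard. Installing the modifier
#    before BUILDARGS exists is what clobbered its returned arg list.
before new => sub {
    croak 'new() is a class method' if ref $_[0];
};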

    You must always remember that the primary goal is to drain the swamp even when you are hip-deep in alligators.
why Perl is good, it works
1 direct reply — Read more / Contribute
by zentara
on Jul 01, 2014 at 20:59
    Well, just to give a glimpse into what complexities we are
    being faced with, with ever increasingly complex and
    interdependent libraries, I would like to relate to
    you my day.

    So I'm going to try and make my own Gtk3 theme. A
    noble effort.

    So I hear that every release breaks the internal
    css engine ( yes, CSS !!!) ... the bane of us all

    So in order to get a theme going I figured I would get
    the latest version of Gtk+ 3.13. Ok, after figuring out how
    to install it, I find the themes don't work as
    discussed in the forums, so I decide to backtrack
    to the latest stable version 3.12

    ... lo and behold, the themes worked as advertised

    ... So I get gtkparasite too, to really play with
    the settings

    Now, of course I wanted to share my new great theme
    with the world, so I fire up my trusty gftp, and
    for some reason involving the use of ftps, the
    program would hang at "receiving files".

    Now, where to find a ftps gui program that worked?
    I eventually settle on FileZilla as the only
    recommended GUI, but it came down as a binary
    Wx file

    I wanted source code, so I finally delve
    down 5 layers of menus and find the FileZilla
    source code

    I download it, try to build it and it fails with
    an error: it needs wxWidgets

    no problemo, I get the latest version of wxWidgets,
    and it installs fine

    I go back to FileZilla, try to compile, and it says
    "sorry, you have version 3.0 of Wx, and we need version
    2.8.12"

    No problem, I get Wx-2.8.12 and try to compile:
    It fails with <gtk.h> cursor.h not found

    ...exasperated, I search for commandline solutions
    to do whole directory uploads via ftps

    Everyone recommends lftp, so I googled and followed
    everyone's ftps lftp set commands, but nothing
    worked ... lftp's ftps would fail with "unknown
    protocol" every time

    I finally deduced, after a git download of lftp's
    repository and the required git clone of libtool,
    that my Perl script using Net::FTPSSL was the only
    thing that worked

    What really tore at me, was when the libtool output
    lines were flashing by on the screen, they said
    Libtool: Doing nothing

    There must be an intelligence behind it all. :-)

    Perl just does it right, it's my only way to
    do ftps ... the C programs be damned. :-)

    My understanding now is that my lftp, which
    comes stock with my Slackware, was built for
    SSL and not TLS encryption, and my ftp
    server uses AUTH TLS
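    For the record, a Net::FTPSSL session for explicit AUTH TLS looks roughly like this (host, credentials and paths are placeholders; the API calls are from Net::FTPSSL's documentation, so verify against the installed version):

```perl
use strict;
use warnings;
use Net::FTPSSL;

# EXP_CRYPT selects explicit encryption (AUTH TLS); all names below
# other than the Net::FTPSSL calls themselves are made up for the example.
my $ftps = Net::FTPSSL->new( 'ftp.example.com',
    Encryption => EXP_CRYPT,
    Port       => 21,
) or die "connect failed: $Net::FTPSSL::ERRSTR";

$ftps->login( 'user', 'secret' ) or die $ftps->last_message;
$ftps->cwd('/themes')            or die $ftps->last_message;
$ftps->put($_) for glob 'mytheme/*.css';   # upload the theme files
$ftps->quit;
```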

    So I spent 5 hours searching for complicated
    software fixes, because sites are using TLS, and
    other software comes with OpenSSL only

    But Perl worked, I am proud to report. Out of all
    the mucked up software out there, only Perl
    would connect, and let me do what I need to do

    Finally, this whole diatribe just goes to show
    the value of Tk. Simple to install, no collections
    of libs required, and the Tk::Zinc canvas is the only
    decent, documented GUI canvas available to
    Perl

    Working with software is the game, a maze to get
    thru, and a prize waits at the end

    I was looking for an Angry Birds-sized new app;
    how about a software maze ... you need to show
    intelligence and ingenuity to get to some prize
    at the center of the software maze

    Work thru the javascript, the css, the various
    maps that can only be seen with software that
    is difficult to install

    I give the idea free to the world, to stimulate
    hackers and students everywhere.

    I mean it's a great idea, but I don't want to do it. ;-)



    I'm not really a human, but I play one on earth.
    Old Perl Programmer Haiku ................... flash japh
Advice needed
5 direct replies — Read more / Contribute
by baxy77bax
on Jul 01, 2014 at 10:56
    Hi,

    As in every discussion, there is always an out-of-framework post, not particularly related to the other posts, but a person asking has to start somewhere, so there you have it. This is that type of post.

    So I finally decided it is time to move on. I am not young enough to change jobs and start all over with my '90s programming skills (as someone nicely phrased it in response to one of my previous posts here), and not old enough, or should I say not arrogant enough, to steal someone else's position in the company I currently work at. So I decided to start my own business. Since I've been in system administration and search engines for a while now and know a thing or two about them, I decided to frame my business idea around them. However, for that I need a financial injection and/or access to a machine (hopefully not too expensive) where I can set up my Internet business.

    Since this is not a typical internet sales framework, I cannot just set up a web site and start selling. I need a full unix/linux environment with ca. 100 GB of disk space, 4-8 cores, 4-8 GB of RAM, and an Internet domain. Is there any service you know of that rents something like that?

    Second, I have never programmed a money-transfer app and have no idea how to do it right now, so is there any Perl module that can be used for something like that? Or how do people set this up so that a client can safely make a money transfer? Essentially, I need some basic first-hand advice on how to start an online business, and on how to register it, even if I am looking at a donation-based system like here at perlmonks. I have been googling a lot about the subject, but the advice is mostly of this sort:

    1. Have an Idea.
    2. Register a company
    3. Make a website
    4. Make a gazillion dollars

    Well, maybe it is just me, but this is so vague that I cannot even begin to rationalize what the author meant, or what he/she was thinking when writing it. So if there is anyone here who has some experience with this and is willing to share, I would gladly take some advice. (Positive or negative, both are welcome.)

    cheers,

    baxy

[RFC: Cool Uses for Perl] A weighted randomized word generator for conlangers
1 direct reply — Read more / Contribute
by flowdy
on Jun 20, 2014 at 18:32
    Hi,

    A decade or so ago, I was a conlanger. I liked to construct a language spoken by a virtual people whom, contrary to J.R.R. Tolkien, I never got around to shaping culturally. Obviously I had plenty of spare mind capacity for such things, and they called me quixotic. No more. Now I prefer programming, and hobbies, and a real life, too. I remember favouring the grammar work over developing a lexicon comprehensive enough to make the language speakable. Inflection, word order, and features I liked in other languages were far more interesting. The best part, I found, were its 35 cases: actually two case systems interacting like cogwheels.

    I just programmed a word generator as a kind of tribute to that young nerd's passion. It respects how words can be shaped in a language and how they cannot. In English, for instance, there is no word like quirge, but it still seems more English than, er, dampfschiff, which is, by the way, believe it or not, actually a word in a spoken language. These unwritten principles are characteristic of every language. Despite being unwritten, you can approximate them with probabilistic shares and Perl's rand().

    Even if you are not interested in conlanging, you might want to study this example of how curried closures can lead to efficient code, i.e. code of which I am quite sure it's efficient, until someone proves me wrong. For those who do not know yet: closures are anonymous subroutines that cling, for their own lifetime, to any lexical variables they use, which would otherwise have been garbage-collected on going out of their respective scopes. Higher order is when you pass these subroutine references to other functions, which call them back when appropriate. I learned that once from the Higher Order Perl book by Mark J. Dominus (available online, but if you can afford it, consider buying it).
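    The closure idea above can be sketched in a few lines (a minimal illustration with made-up weights, not the author's actual generator): make_picker() curries a weight table into a closure that draws one item per call, with probability proportional to its weight.

```perl
use strict;
use warnings;

sub make_picker {
    my %weight = @_;
    my $total  = 0;
    $total += $_ for values %weight;
    # The returned closure keeps %weight and $total alive for its lifetime.
    return sub {
        my $r = rand($total);
        for my $item ( sort keys %weight ) {    # sorted for stable order
            return $item if ( $r -= $weight{$item} ) < 0;
        }
    };
}

my $pick_onset = make_picker( t => 5, k => 3, qu => 1 );
my $pick_vowel = make_picker( a => 4, i => 2, o => 1 );

print $pick_onset->(), $pick_vowel->(), "\n" for 1 .. 5;
```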

    Update: Corrected name of the Higher Order Perl author. Further confused Currying and Higher Order. Will have to re-read myself the very book I cited.

    -- flowdy

Perl Job Marketability Question - very important for me!
6 direct replies — Read more / Contribute
by o2bwise
on Jun 13, 2014 at 17:55

    Hi,

    I got laid off a couple of months ago and am in the midst of the whole job search process. With that in mind, one thing I want a good sense of is my marketability.

    I have been programming in Perl for close to 15 years now, but I think my use of the language has been pretty limited in scope.

    Now, my last job included an application called CA eHealth. I did a ton of Perl programming involving this application. (In fact, I largely lost my job due to my employer’s decision to replace this application.)

    There is one particular kind of scripting that I just love, but I am frankly unsure if it is sufficiently marketable. What I love doing is having Perl read data from a number of sources (databases, flat files) and produce desired data. I just LOVE that kind of coding. I love the whole portion in between those two points. Data aggregation, coming up with one or more nested hash data structures, looping through the hashes and writing out data as needed. Sometimes needing to do math and/or stats as part of the data aggregation. Mining through lines of text and db queries, and so on.
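    For readers less familiar with Perl, the workflow described above looks something like this (a hypothetical sketch with made-up field names): roll flat records up into a nested hash, then walk the hash and report.

```perl
use strict;
use warnings;

# Aggregate flat CSV-style records into a nested hash keyed by
# host and metric, then compute an average per leaf.
my %stats;
while ( my $line = <DATA> ) {
    chomp $line;
    my ( $host, $metric, $value ) = split /,/, $line;
    $stats{$host}{$metric}{sum}   += $value;
    $stats{$host}{$metric}{count} += 1;
}

for my $host ( sort keys %stats ) {
    for my $metric ( sort keys %{ $stats{$host} } ) {
        my $s = $stats{$host}{$metric};
        printf "%s %s avg=%.1f\n", $host, $metric, $s->{sum} / $s->{count};
    }
}

__DATA__
web1,cpu,40
web1,cpu,60
web2,mem,70
```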

    My question is: is the kind of work I describe above, the thing I love to do, marketable? I honestly do not know, but I sure hope so.


    Tony

Bring back the smartmatch operator (but with sane semantics this time)!
6 direct replies — Read more / Contribute
by smls
on Jun 10, 2014 at 12:52

    No I'm not kidding, please hear me out... :)

    History of smartmatch in Perl

    Smartmatching was invented for Perl 6 where it turned out to be a very useful and well-loved1 feature, but the attempt to backport it to Perl 5.10 in 2009 did not turn out so great (and it was consequently deprecated again in Perl 5.18). Among the Perl 6 community the commonly accepted explanation for that is, from what I heard:

    • Perl lacks a strong & fine-grained type system (which in Perl 6, where it exists, adds much sanity to the concept of dynamic dispatch)
    • Perl lacks composable related features like Junctions (which allow the Perl 6 smartmatching rules to be simpler and less arbitrary than the rules that would be needed to facilitate the same set of use-cases in Perl)

    These limitations are hard to circumvent, but I don't think that means Perl should have no smartmatching at all, it just means it should have less ambitious / more focused smartmatching.

    I wasn't around at the time, but it looks to me as if the Perl 5.10+ smartmatching was designed with these goals:

    1. Support all use-cases that the Perl 6 smartmatch operator supports
    2. Use it as an opportunity to sneak useful new comparison/searching operations into the core, without having to invent separate operator names for them, and without having to justify them individually

    ...and was thus doomed to failure.

    Some later proposals for re-designing the smartmatch operator (like this 2011 post by brian d foy) tend to avoid mistake no. 1, but still fall into the second trap.

    If there are comparison/searching operations that are deemed worthy of being added to Perl (say, "deep comparison" of two arrays, or checking whether an array contains a given scalar), then each of them should get its own operator. That's the normal Perl way: One operator per type of operation (that's why we have both == and eq for example).

    How smartmatch should be designed

    Smartmatching explicitly breaks with the conventional "one meaning per operator" rule by dynamically deciding what operation to perform based on its arguments. This means it should be carefully designed around use-cases where you actually need to dynamically decide what operation to perform. Operations that you would likely never want to mix-and-match, have no business being part of the smartmatch operator, even if they would be useful to have in core by themselves.

    So, what are those use-cases where you actually need dynamic smartmatching? I can think of two major ones:

    1. When you want to avoid writing out  ($_ <operator> ...)  in a given/when construct, for the purpose of brevity/elegance:

      use v6;
      given $username {
          when 'root'             { dostuff }
          when /^guest\d*$/       { die "You're not allowed to do stuff." }
          when any(<http apache>) { authenticate :web;   dostuff }
          default                 { authenticate :local; dostuff }
      }

      Of course it is only elegant when the meaning is self-evident without consulting a manual, so this use-case only makes sense for commonly used & unambiguous comparison operations.

    2. When you want your code to test things against a "filter/pattern/rule" that is passed in from the outside, and you don't want to restrict it to just one way of filtering (e.g. only by string comparison, or only by regex, or only by callback etc.)

      For example, consider the Perl 6 built-in function dir, which lists the contents of a directory in the filesystem. It takes an optional 'test' argument, against which it promises to smartmatch each filename and only return the matching ones. Since smartmatch is built into the language, Perl 6 programmers need no further documentation to understand that parameter; they know they can use anything that would be valid as the right-hand-side argument of ~~ as the test, for example:

      use v6;
      dir '/some/directory', test => /\.txt$/;             # a regex
      dir '/some/directory', test => none('.', '..');      # a junction²
      dir '/some/directory', test => &validate_filename;   # a coderef

      The result is a very flexible but still elegant and predictable API that is easy to imitate in your own functions/modules that want to allow their users to "match" or filter stuff: Just use smartmatch as your filter implementation!

    We can make new Perl 5 smartmatching rules useful for those use-cases, while still keeping them sane and predictable, by adhering to these two principles:

    1. Decide what operation to perform, based on the type of the right-hand-side argument (and nothing else!)
      (Put another way, this means that  LHS ~~ RHS  can always be expressed in words as the question "Does LHS fit the constraint/template defined by RHS?")

    2. Blindly coerce the left-hand-side argument to the type that the chosen operation requires, just as normal Perl operators like eq also coerce their arguments.
      (So, for example, @foo ~~ /foo/ would be the same as @foo =~ /foo/, even though that may not be useful, rather than doing anything special just because it's an array!)

    Sensible smartmatch rules

    With that in mind, we can start to think about the kind of right-hand-side "things" that it should be possible to smartmatch against.

    The following are no-brainers imo:

    if RHS is an...                (example)            then  LHS ~~ RHS  should do...
    undefined scalar               $x ~~ undef          !defined(LHS)
    simple scalar                  $x ~~ 'foo'          LHS eq RHS
    regex (literal or reference)   $x ~~ /foo/          LHS =~ RHS
    code reference                 $x ~~ sub { ... }    RHS->(LHS)
    an object that overloads ~~    $x ~~ $object        call the overload method, with LHS as argument

    The 'simple scalar' case is not as elegant as one might wish; ideally it would dynamically decide between string and numeric comparison like it does in Perl 6, but I don't think that is possible to do safely in Perl (its type system being what it is), so we need to take what we can get.

    The following two rules also tend to be pretty useful in Perl 6, and it might make sense to add them to our hypothetical new Perl smartmatch, but I'm unsure about them because range literals and typename barewords are not usually treated as first-class "things" in Perl, so it might feel strange:

    if RHS is a...    (example)                      then  LHS ~~ RHS  should do...
    bareword          $node ~~ XML::LibXML::Node     ref(LHS) eq "RHS"
    range literal     $age ~~ 0..17                  interpret LHS as a number, and check if it is within the range

    Lastly, the lack of junctions in Perl could be partially remedied by interpreting an array/list on the right-hand-side like an any() junction:

    if RHS is an...    (example)                       then  LHS ~~ RHS  should do...
    array or list      $switch ~~ qw(yes true on 1)    (grep { LHS ~~ $_ } RHS) >= 1

    Of course, a better solution would be to add junctions to Perl together with re-adding smartmatch... :)
    (Perl6::Junctions already exists on CPAN, but it relies on at least one awful hack due to the fact that it is non-core).

    Anyway, the above rules would be more or less a subset of both Perl 6 smartmatching and the deprecated Perl 5.10+ smartmatching, but without the craziness of the latter.

    And that's it: all cases not handled by these rules should generate a runtime error.
    I don't think any other special cases need to be added; in particular, all the arbitrary behaviors that Perl 5.10+ smartmatching added for when one or both arguments were arrays/hashes only served to confuse people and made the operator "not safe to use" in practice. Let's not repeat that mistake.
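    The proposed dispatch rules are simple enough to sketch as a plain function (a rough illustration of the idea, not a polished implementation; the operation is chosen from the type of the RHS and nothing else):

```perl
use strict;
use warnings;
use Scalar::Util qw(blessed);
use overload ();

sub smartmatch {
    my ( $lhs, $rhs ) = @_;
    return !defined($lhs)        if !defined $rhs;           # undef rule
    return scalar( $lhs =~ $rhs ) if ref $rhs eq 'Regexp';   # regex rule
    return $rhs->($lhs)          if ref $rhs eq 'CODE';      # coderef rule
    return ( grep { smartmatch( $lhs, $_ ) } @$rhs ) >= 1
                                 if ref $rhs eq 'ARRAY';     # any() rule
    if ( blessed $rhs and my $op = overload::Method( $rhs, '~~' ) ) {
        return $op->( $rhs, $lhs );                          # overload rule
    }
    return $lhs eq $rhs          if !ref $rhs;               # simple scalar
    die 'No smartmatch rule for RHS of type ' . ref $rhs;
}

print smartmatch( 'root',   'root' )                ? "y" : "n";  # y
print smartmatch( 'guest7', qr/^guest\d*$/ )        ? "y" : "n";  # y
print smartmatch( 'on',     [qw(yes true on 1)] )   ? "y" : "n";  # y
```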

    PS: In case you want to get a "feel" for what this kind of smart-matching is like in practice, check out Toby Inkster's match::simple module which implements very similar rules to what is discussed here (but suffers from some unavoidable limitations due to the fact that it is not in core).
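    For the curious, match::simple usage looks roughly like this (the match() function and its semantics are taken from the module's documentation, so verify against the installed POD):

```perl
use strict;
use warnings;
use match::simple qw(match);

print match( 'root',   'root' )                ? "y" : "n";  # string equality
print match( 'guest7', qr/^guest\d*$/ )        ? "y" : "n";  # regex match
print match( 'on',     [qw(yes true on 1)] )   ? "y" : "n";  # any-of arrayref
```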

    ---

    1) Among the small but passionate fan base of Perl 6 :)
    2) This particular junction is in fact used as the default when the 'test' argument is omitted.
RFC - early draft of Menu::Simple
2 direct replies — Read more / Contribute
by RonW
on Jun 06, 2014 at 13:13

    A while back, to simplify making menus for the occasional Tk-driven Perl tool, I started Menu::Simple as an alternative to the -menuitems option in Tk::Menu::Items. It has significantly helped others add their own Tk GUIs to Perl tools. (None of us here do much GUI coding, so anything that makes it easier helps us.)

    Recently, while adding a Tk GUI to yet another tool I created using Perl, it occurred to me that Menu::Simple might be worth sharing.

    It still needs more documentation, but I do have a simple example that shows the basic usage. (It also needs some clean up. Over time, I have changed and enhanced it. Just haven't thought about sharing outside my team at work.)

    So, here is the module and usage example. (Warning: It currently uses the 'current_sub' feature, so at least Perl 5.16 is needed.)

    Update: Determined that Perl 5.16 is currently needed. Will try to make it work with earlier versions.

    Update 2: After clean-ups, of course, would this be worth submitting to CPAN? Maybe change the name to Tk::Menu::Simple?

    Update 3: Removed requirement for Perl 5.16, but not sure how old a Perl it will work with.

    Update 4: Added a draft of documentation. Also, changed name to Menu::Builder. (I am planning to contact the maintainer of the Tk namespace on CPAN about having this included as Tk::Menu::Builder.)

at continue, last
7 direct replies — Read more / Contribute
by djerius
on Jun 06, 2014 at 00:53
    Update The imprecise nature of the language in this post has muddied the waters. The point of the post was to enumerate some ways of ensuring a single path for cleanup from a conditional clause, not all of them practical. My thanks to the respondents who persevered in spite of the muddiness of the water.

    On to the original post...

    I need to ensure that a variable is decremented regardless of how an if clause is exited, which might be in the middle of the block. I can code it this way:

    if ( $condition1 ) {{
        ...
        --$loop, last if $condition2;   # bail!
        ...
        --$loop;
    }}
    The repeated decrements are not DRY. This one is better:
    if ( $condition1 ) {
        {
            ...
            last if $condition2;
            ...
        }
        --$loop;
    }
    Along the way I stumbled upon these monstrosities:
    for ( ; $condition1 ; $loop--, last ) {
        ...
        next if $condition2;
        ...
    }
    Even worse:
    while ( $condition1 ) {
        ...
        next if $condition2;
        ...
    }
    continue {
        --$loop;
        last;
    }
    A last in a continue? Who knew?
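    One more way to guarantee a single cleanup path, not mentioned above: a scope guard, whose DESTROY fires exactly once however the block is left. This is a hypothetical pure-core sketch; CPAN modules like Scope::Guard do it more robustly.

```perl
use strict;
use warnings;

package My::Guard;
sub new     { my ( $class, $cb ) = @_; bless { cb => $cb }, $class }
sub DESTROY { $_[0]{cb}->() }

package main;
my $loop = 3;
my ( $condition1, $condition2 ) = ( 1, 1 );

if ($condition1) {{                  # double braces: inner bare block
    my $guard = My::Guard->new( sub { --$loop } );
    last if $condition2;             # early bail; guard still fires
    # ... more work ...
}}
print $loop, "\n";   # 2
```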

As of 2014-07-24 06:36 GMT