Re^4: Tk-804.036 build failure

by ibm1620 (Hermit)
on Jun 29, 2023 at 22:11 UTC [id://11153217]


in reply to Re^3: Tk-804.036 build failure
in thread Tk-804.036 build failure

Thank you, marto. I ran prove as recommended:
    chap@Retsina:...rk/1688073614.30084/Tk-804.036$ pwd
    /Users/chap/.cpanm/work/1688073614.30084/Tk-804.036
    chap@Retsina:...rk/1688073614.30084/Tk-804.036$ prove -v PNG/t/basic.t
    PNG/t/basic.t ..
    1..5
    ok 1 - use Tk::PNG;
    Failed 4/5 subtests

    Test Summary Report
    -------------------
    PNG/t/basic.t (Wstat: 11 Tests: 1 Failed: 0)
      Non-zero wait status: 11
      Parse errors: Bad plan. You planned 5 tests but ran 1.
    Files=1, Tests=1, 0 wallclock secs ( 0.03 usr 0.01 sys + 0.16 cusr + 0.02 csys = 0.22 CPU)
    Result: FAIL
I am by now well out of my depth, so I can't dig any further into this :-). I'm inclined not to suspect the clang compiler, since the same test fails under either compiler.
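For what it's worth, the Wstat in that summary is the raw wait status of the test process, and a wait status of 11 with no failed assertions means the process was killed by signal 11 (SIGSEGV) before it could run the remaining tests. A minimal sketch for decoding it, assuming a stock perl and the usual macOS/Linux signal numbering:

    perl -MConfig -e '
        my $wstat = 11;                             # Wstat reported by prove
        my @sig   = split " ", $Config{sig_name};   # signal names indexed by number
        printf "killed by signal %d (SIG%s)\n", $wstat & 127, $sig[$wstat & 127];
    '

On such a platform this prints "killed by signal 11 (SIGSEGV)", which matches the "Bad plan" parse error: the script planned 5 tests but crashed after the first one.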

I did install perl-5.30.3 on M1 macOS 13.4.1 and successfully ran cpanm Tk.

To OP: is installing a more recent version of Perl an option?

Replies are listed 'Best First'.
Re^5: Tk-804.036 build failure
by marto (Cardinal) on Jun 30, 2023 at 07:44 UTC

    Perhaps this would help to investigate the segfault:

        perl -d:Trace /usr/bin/prove -v PNG/t/basic.t

    Requires Devel::Trace.

      Thanks for this. Unfortunately, it wasn't evident from the output where the failure occurred; there was no mention of a segfault.
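
      One possible explanation, and a sketch of a next step (untested on that machine): prove runs each test file in a separate perl process, so -d:Trace applied to prove traces only the harness itself, not PNG/t/basic.t. Tracing the test script directly, from the Tk-804.036 build directory, should show the last Perl statement executed before the crash:

          perl -Mblib -d:Trace PNG/t/basic.t 2>trace.out   # -Mblib picks up the freshly built Tk from blib/
          tail -n 20 trace.out                             # Devel::Trace writes each executed line to STDERR

      If the crash happens inside the XS/C code, the trace will simply stop at the Perl line that triggered it (likely the line that creates the PNG Photo), which at least narrows things down.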
