
local::lib & multiple architectures

by mikeman (Acolyte)
on Dec 05, 2011 at 09:37 UTC ( #941809=perlquestion )
mikeman has asked for the wisdom of the Perl Monks concerning the following question:


I use local::lib on Linux to create a self-contained library for an application. The development environment and main web servers are x86_64. I now have a requirement to install the same application on an i386 architecture.

Is it safe for the self-contained library to contain both i386 and x86_64 installations of the same modules? Or would I be better off creating an alternative directory structure for the i386 modules?

Re: local::lib & multiple architectures
by moritz (Cardinal) on Dec 05, 2011 at 10:18 UTC

    I haven't tested it, but it should be safe to install both i386 and x86_64 modules to the same local::lib dir.

    In my installation (using perlbrew instead of local::lib), pure perl modules are directly under lib/$version/, but XS modules are under lib/$version/x86_64-linux/, so there should be no conflict.

      /me nods ... but, on the other hand ...

      Many packages in Perl are actually compiled in place, and as such they can be quite dependent upon their surroundings: whether a particular external DLL or .so exists, exactly which one was linked in, and so on. So what I think I’d do instead is make a build on an actual, representative i386 system, and then clone that to the i386 systems as needed. I’d feel a whole lot more confident about that.

        Many packages in Perl are actually compiled in-place, and as such, they could be quite dependent upon their surroundings.

        Compiled in place? You mean they overwrite themselves during compilation? I've never encountered that.

        If not, could you elaborate on what you mean by "compiled in-place"? Are you talking about dynamic linking?

Re: local::lib & multiple architectures
by tospo (Hermit) on Dec 06, 2011 at 10:49 UTC
    I use local::lib in exactly that way to maintain a repository on a central server for 64-bit and 32-bit architectures, and it works fine because the architecture-specific stuff ends up in architecture-specific directories, as moritz pointed out. However, it is also true that you want to compile on a machine that is as close as possible to the one the modules will run on. In particular, you want the same compiler to be used. At work, we have one machine to compile 64-bit software and one for 32-bit, both running the same version of the OS we use in production.
Re: local::lib & multiple architectures
by sundialsvc4 (Abbot) on Dec 06, 2011 at 13:16 UTC

    I finally thought of a succinct way to say it: ./configure

    We’ve all seen “the miracle of automake” at work, where we see a bunch of messages like this one:
    Checking to see if libxyzzy is installed ... yes.

    ... and what is actually happening here is that environment-specific decisions are being made on the fly about exactly how to build and then link this source-code package together, so that, in the end, It Just Works™ Everywhere. So “the same C-language software,” when compiled on this machine vs. that one, might be of altogether different construction, and have an altogether different set of library dependencies, even though the resulting .dll or .so file name is exactly the same.

    Everybody, as far as I know, uses this or something like it. It all “just happens,” courtesy of CPAN, but you don’t necessarily know what each individual package designer decided to do, and you don’t want to have to care. Therefore, build on a “typical target” where automake can “see” whatever it needs to see, and you don’t have to care exactly what it chooses to react to.

Re: local::lib & multiple architectures
by chrestomanci (Priest) on Dec 06, 2011 at 21:03 UTC

    I used local::lib on a cross-platform project about 6 months ago, where the contents of that lib directory were checked into subversion along with the main project source code, scripts, etc.

    In the main it worked well, but there were occasions where Perl packages that compiled binary XS code would break things quite comprehensively for other platforms, while appearing to work fine on the platform I compiled and tested on.

    In a typical scenario, during development (on Linux x86_64) I would decide that I needed package X from CPAN, so I would start the CPAN tool and install it into the local lib directory of my subversion checkout. CPAN would follow dependencies and also install package Y, which was an XS module. The install would appear to work fine. Once I finished my development and all the unit tests passed, I checked the code back into the trunk.

    Shortly after, another developer who works on Windows (i686) would update to the latest head and discover that everything was broken, because the binary portions of package Y were missing.

    In the best case, he would be forced to stop what he was doing and start CPAN to re-install package Y. On one occasion we managed to break things so badly that local::lib would not work at all under Windows, and we had to revert the whole change and start again.

    The workaround we came up with was to create three branches from trunk (n+1, where n is the number of supported platforms). We would then do the CPAN install into the local lib on each supported platform, using a VM with a clean checkout of that platform's branch. All the branches were then merged into the staging branch, which contained the binary XS code for all platforms. That was then unit tested on all platforms before being merged back into the trunk.

Approved by Corion
Front-paged by Corion