I won’t reiterate the back-and-forth from other threads. I assume the reader hasn’t been living under a rock and is aware that i386 support is due to go away in
19.10, and of the issues of supporting 32-bit binaries such as games.
There have been several calls for the community to step up and offer i386 support. I am interested to know what would be involved, and whether it would be viable. Is anyone with a deep knowledge of apt able to advise here? My particular questions are:
- How many libraries are we talking about?
- Can we package i386 libraries with the same name as their amd64 equivalents and supply them in a different repository?
- Would the i386 library updates have to be kept in perfect synchronisation with the amd64 repo?
- Could the testing infrastructure for the amd64 libraries be used by the community for i386?
- Would this have to be delivered as a PPA, or would it be “properly” hosted by Canonical in the same way as universe/multiverse?
- Is it viable to “steal” the prebuilt binaries from Debian Sid?
- What sort of man-hour commitment would be needed to build/support, say, 120 libraries?
- Are these efforts likely to be enough to persuade 32-bit software vendors like Valve or GOG to continue supporting Ubuntu as a target distribution?
- How do we handle edge cases like the proprietary NVIDIA drivers providing OpenGL libraries?
I suppose those questions crystallise to: (1) is it possible; (2) is it viable; (3) is it worthwhile? I’d respectfully request that this thread focus only on the practical issues of a community-supported i386 repository. Please discuss the rights and wrongs of the decision to drop i386 elsewhere.
Technically it’s all pretty straightforward. You could bootstrap an i386-only archive from the current packages in eoan and, from that point on, build whatever new packages land in eoan, but for i386 only.
I don’t know exactly how many packages actually “need” to be in there, but it’s possible to see what other distros do here; hand-wavy numbers range from ~300 to ~1000 depending on the coverage you’re after. It could be as low as ~100, but then you risk that some 32-bit applications won’t work.
If community maintained it would effectively be like any other repository or ‘ppa’ style archive. It could just follow packages that land in Ubuntu, build them and land them soon after or relatively in sync.
Someone needs to maintain that though, and test it. Who is going to do that? Would users trust it? Would it be allowed to use the Ubuntu trademark? Would it be installed by default? Is it worth having if it’s not installed by default?
If it were hosted by Canonical then users (and customers) would have an expectation of support, and we’re right back where we started. If hosted by the community, who is going to cover the cost of bandwidth (and maintenance) of that repo? Currently Canonical covers that cost, clearly. Repositories are expensive things to run, when nobody is paying at the point of use. Eoan i386 binaries + source is ~150GB in size today. Push out some updates and you have the cost of thousands (millions) of machines hitting that service. There’s a reason we have mirrors and CDNs.
Possible? Maybe. Viable? Unlikely. Worthwhile? Who knows.
I understand Debian will still have an i386 PPA. Wouldn’t it just be possible to add that PPA to our “sources”?
Short answer: No.
Long answer: In general, you cannot just install an arbitrary package from the Debian repositories and expect it to work perfectly. Ubuntu maintains its own patch sets, so compatibility is not always guaranteed. In addition to that, apt doesn’t allow you to install different versions of the same package for different architectures, so if the versions in Debian and Ubuntu don’t match, you’re out of luck.
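To make the version-matching constraint concrete, here is a trivial sketch (not apt’s real code, and the version strings are hypothetical): apt’s multiarch rules require a “Multi-Arch: same” library to be installed at exactly the same version for every architecture, so an amd64 copy from Ubuntu and an i386 copy from Debian almost never line up.

```shell
# Illustrative only: apt refuses to co-install a "Multi-Arch: same" library
# unless the amd64 and i386 copies carry exactly the same version string.
amd64_version="2.31-0ubuntu9"      # what Ubuntu might ship (hypothetical)
i386_version="2.31-13+deb10u1"     # what Debian might ship (hypothetical)

if [ "$amd64_version" = "$i386_version" ]; then
    echo "co-installable"
else
    echo "version mismatch: apt will refuse the i386 package"
fi
```

Ubuntu applies its own patch sets and version suffixes (`-0ubuntu9` vs. Debian’s `-13+deb10u1`), so the mismatch branch is the common case.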
Could i386 libraries from that PPA be provided to Ubuntu users via an (unconfined) snap?
I have a similar question: can 32-bit libraries from core18 be used by regular unconfined apps?
If the aim is to provide 32-bit libraries for amd64 systems then the “better” approach would be to package lib32-* versions of the needed libraries.
No extra repository or architecture is needed, just more/different packages, and those packages can be built and updated alongside their 64-bit counterparts by the same infrastructure.
This is already done for a number of packages in the Ubuntu repos, and it’s the approach taken by other distros, so it works.
If I can find time next week I might put together a PoC PPA…
My only concern with that strategy relates to config files. If a lib32 package falls behind its amd64 version, a change could break some config files. Of course, this is a library-specific issue, but something to be aware of.
I have yet to see a library which won’t build with -m32, so I suspect there is little danger of a 32-bit library “getting behind” its 64-bit counterpart.
Indeed, if the lib32-$library packages are part of the same debian/control file then they can’t get out of sync and will both be packaged at exactly the same time.
The GCC packages already do this (and GCC is hardly a trivial bit of software) so it’s a “solved problem”.
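For illustration, here is a sketch of how a hypothetical lib32- package could sit in the same debian/control file as its 64-bit sibling (every name below is made up, not an existing source package):

```
Source: libexample
Section: libs

Package: libexample1
Architecture: amd64 arm64
Depends: ${shlibs:Depends}, ${misc:Depends}
Description: example shared library (native build)

Package: lib32example1
Architecture: amd64
Depends: libc6-i386, ${misc:Depends}
Description: example shared library (32-bit build for 64-bit hosts)
```

Because both binary packages come from a single source package, one upload builds and versions them together, which is exactly the no-skew property being discussed.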
What I don’t understand is why the netboot image is being dropped. Netboot only contains basic libraries and is required for compiling/fixing patches from the community. Can anyone shed some light on this?
The i386 kernel is insecure; how do you netboot if the i386 kernel is dropped?