Statement on 32-bit i386 packages for Ubuntu 19.10 and 20.04 LTS

After reading all the toxic comments and hateful replies on the Internet in the last couple of days, I just wanted to put this out:

Thank you, Ubuntu Developers, for your effort! You are doing an amazing job! (with and without 32-bit support, btw).

And to cheer up the mood, take a minute for Ricky Gervais: https://www.youtube.com/watch?v=L3dxMGzt5mU

8 Likes

This is not the correct attitude.
Whether Wine can or cannot be run on Ubuntu doesn’t make Ubuntu any less good. That’s a Wine problem, not an Ubuntu problem.

I’d recommend Ubuntu to anyone who cares to listen and try. Most would dual boot. I won’t ask them to stop playing Windows games in Windows or to hate Windows. All I’d want is for these new users to try Ubuntu and get attached to it, without any demands. Let them start liking/loving Linux and Ubuntu without any restrictions. Some would stay, and some would still play Windows games in Windows.

1 Like

I personally have the i386 arch removed from both my work and home machines (even “apt update” runs faster). Of course, games and Wine are super important and should work. But it would be cool to see some fresh polls/stats on how many users rely on 32-bit apps and how many just want to play games.
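For reference, dropping the architecture is only a couple of commands on an amd64 install (a sketch, not a recommendation; dpkg will refuse to remove the architecture while any i386 packages are still installed):

```
# List anything still installed for i386 (the 4th column of dpkg -l is the architecture)
dpkg -l | awk '$4 == "i386" {print $2}'

# Once those packages are gone, drop the foreign architecture and refresh the indexes
sudo dpkg --remove-architecture i386
sudo apt update   # fewer package lists to fetch from here on
```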

The year 2038 problem, the progressively increasing difficulty of supporting 32-bit, security issues, etc. will not disappear, so this should be addressed. I like Snap, and it should be a good solution for this.

2 Likes

And when it comes to taking that last step and running a couple of old Windows programs, they’d have found out that those things used to be possible on Ubuntu, that they are still possible on other distros, but that Ubuntu doesn’t consider their need worthy of the effort.

What sort of message does that send to new users, and even to existing users that get their perfectly functional setups wiped out from under them?

I’m glad the decision has been reversed, and that Ubuntu has committed to working with other projects like Wine to find a longer-term solution.

But that should have been done in the first place. That it wasn’t, that they thought they could just dictate their will to others, gives me serious reason to doubt whether I can in good conscience recommend this distro to others anymore.

4 Likes

People should come to Linux, and to Ubuntu, with an open mind, not just to run away from some other distro. Those who run away from one OS would run away again from the next OS in time. Ubuntu is an established, high-quality distro.

Wine, on the other hand, advertises Windows apps, so it is of no interest to me. And it should be of no interest to those who really want to use Linux.

1 Like

I’d like to address a few points from the announcement:

community process to determine which 32-bit packages are needed […] add to that list post-release if we miss something

Sounds like a lot of work, TBH, and a lot of hassle, even if it’s mostly the community (= unpaid volunteers) doing it. Definitely more than just leaving the i386 buildds running.

use container technology to address the ultimate end of life of 32-bit libraries […] Snaps and LXD enable us both to have complete 32-bit environments, and bundled libraries

Again, why? Why drop something that took years to design & implement, but most importantly already exists and works well – only to replace it with an as yet vague idea of a solution that’s at best over-engineered and at worst does less, less well?
By all means, isolate legacy software as much as is feasible, but I’m not keen on bundling everything and the kitchen sink with applications. I much prefer the traditional way.

There is real risk to anybody who is running a body of software that gets little testing.

How is this a problem for the 32-bit repository but not, say, for 64-bit universe or random PPAs?

The facts are that most 32-bit x86 packages are hardly used at all. That means fewer eyeballs, and more bugs.

One would hope that any bugs don’t hide in the automatically generated packages but in the source code of the software contained therein, source code that is largely shared with the other architectures. In my experience, targeting a multitude of architectures makes for better code, as cut corners don’t tend to be very portable.

Software continues to grow in size at the high end, making it very difficult to even build new applications in 32-bit environments.

(New) software that can’t be built in 32-bit isn’t going to require it or be a required dependency, is it? Besides, nobody is asking you to build something that cannot be built. For now, Debian manages just fine, despite having ~10 release architectures, a much larger fully supported repository, and being non-profit.

Don’t get me wrong: it’s your OS, you should be free to take it in any direction you see fit, and you shouldn’t even have to explain yourselves. But you did – and as it stands your arguments aren’t very convincing.

2 Likes

What I don’t understand is why the netboot image is being dropped. Netboot only contains basic libraries and is required for compiling/fixing patches from the community. Can anyone shed some light on this?

1 Like

Further, Valve have posted an update.

https://steamcommunity.com/app/221410/discussions/0/1640915206447625383/

6 Likes

You’re just ignoring the arguments on one side. Saying that the argument is “not worthy” doesn’t correctly represent the arguments that were made, and saying it’s not possible to run it on Ubuntu simply ignores the path that was presented to actually enable users to continue to use them. It’s not fair to put things that way, and it doesn’t represent the reality.

By the way, other distros have been walking the same path, and some have also done the same (distros minor in popularity, like OpenMandriva). The fact that the others haven’t looked more into doing something like this says more about those distros’ disregard for security and stability risks than it says anything about Ubuntu. When maintenance lags at the upstream projects and QA is almost nil because nobody does it, those risks increase a lot.

1 Like

It doesn’t work well and it’s not maintained correctly often on upstream projects, and there’s lack of QA

Would you mind elaborating? I.e., in what way does it not work well? Your upstream in this case is Debian; they still maintain i386 (the dpkg arch, not “would run on an actual i386”) as a first-class arch, and you don’t hear “Debian” and “not maintained correctly” in the same sentence very often. As long as the i386 packages are built from the same source as the amd64 ones, they stay current, and most fixes trickle down to them as well. The exposure is limited to issues that only manifest in the i386 version of a package. Those are much better odds than amd64 universe and PPA packages, which may not receive any updates at all and may be of questionable quality to begin with.

The fact is, current amd64 systems can run 32-bit code natively, removing support on the OS level is a regression in hardware support. I’d be against it in principle even if there weren’t any major use-cases – and there are plenty. (One could even make the case for re-introducing i386 as an installable architecture, if not for 32-bit/hybrid Atom boards and netbooks then for lighter [= more] VMs, but I digress.) The point is, 32-bit code is not legacy, hardware that cannot run 64-bit code is – that’s a big difference. Even if it were legacy, it looks like many people still want or even need to run 32-bit software, and why shouldn’t they? When choosing an OS, they ask “Does it run my stuff, and does it run it well?”, not “Is it pure 64-bit?” – and for once, they got it right.

IMHO, running both 32-bit and 64-bit code on amd64 at the same time should be fully supported, as seamlessly as possible – Windows does that very well – for the foreseeable future, and multiarch is, as of now, the best way to do that. Compiling the 32-bit packages to target amd64 CPUs is an option, as is selectively dropping non-essential packages that won’t compile and would be too much effort to fix.
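For context, the multiarch setup described above is already only a couple of commands on current Debian/Ubuntu (a minimal sketch, assuming the i386 package indexes are still published for the release in use; the package names are only illustrative):

```
# Allow i386 packages to be installed alongside amd64 ones
sudo dpkg --add-architecture i386
sudo apt update

# 32-bit libraries then co-install next to their 64-bit counterparts
sudo apt install libc6:i386 libstdc++6:i386
```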
Bundles (Flatpak/Snap/AppImage) currently negate all the advantages of centralised package management, and containers/VMs aren’t there yet either: too much overhead, too much complexity to manage, too fragile for the desktop, not at all well integrated, let alone seamless.
It’s highly likely that a better solution than multiarch will emerge, maybe in conjunction with a paradigm shift to containerise everything transparently (cf. Qubes), who knows, but it isn’t even on the horizon yet, so why the rush?

2 Likes

I for one am glad to see support for running 32-bit on amd64.
While I do like the thought of moving towards a pure 64-bit environment, I don’t think the time is right.
I wish 32-bit programs could just use the 64-bit libraries, but I guess that just isn’t going to happen.

Frankly, I’m annoyed by all this “it’s not Ubuntu’s problem if a given application doesn’t work/run”. In my view that doesn’t square with the community guidelines. Regardless of a person’s view of a particular program, these comments are not helpful. Some users need these programs; your personal bias against Windows and games is just detrimental to the conversation.

As was pointed out in the legitimate-concerns category, there are some 64-bit apps that have 32-bit dependency issues, Wine being just one example. Luckily the apt system can enumerate such dependencies, and a solution can be found that covers most use cases.
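For what it’s worth, a rough sketch of that enumeration using stock dpkg/apt tools (package names such as wine32 are just illustrative and assume the i386 architecture is enabled in the sources):

```
# Every package currently installed for the i386 architecture
dpkg -l | awk '$4 == "i386" {print $2}'

# The 32-bit dependencies a given application would pull in
apt-cache depends wine32:i386

# What depends on a given 32-bit library
apt-cache rdepends libc6:i386
```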

1 Like

If it’s so little maintenance to take Debian’s i386 packages and run them in Ubuntu, perhaps you could make a respin of Ubuntu 19.10 which has full i386 support? Or is it not as easy as you’re making it out to be, and actually time-consuming to do?

You could call it Ubuntu i386 Remix 19.10 :slight_smile:

This was already explained in the thread about dropping 32-bit packages, so I won’t repeat it. The upstream of Ubuntu is not only Debian, and the upstream I was referring to was the software developers.

That was never what was at play here. Ubuntu was never removing the ability to run 32-bit software on Ubuntu.

You don’t say which thread exactly, but I’ve read the predecessor to this one, and it doesn’t go into any detail or give reasons; it just asserts.

Yes, you were, and you still are going to. Without a comprehensive set of 32-bit drivers, libraries, frameworks … 32-bit applications cannot run. Providing these is what an OS does. No 32-bit libs = no OS-level 32-bit software support. By that logic, you might just as well remove the 64-bit libs, because 64-bit code would still run on the hardware & kernel.
No, containers, VMs, emulators and such don’t count; at that point you’re running an OS inside an OS (and both of them might just as well not be Ubuntu). By that logic, Ubuntu also has the ability to run Windows software, SNES ROMs, …

But never mind. As I’ve said before, it’s your decision. It’s just that I prefer sound reasoning to underpin decisions – and so far all I’ve seen is “We’re desperately low on manpower & other resources” [worrying, as that tends to lead to cut corners and bad decisions]; “64 bits are more than 32, that must be better” & “removing compatibility equals progress” [are there only 20-year-olds left at the helm?]; “just upgrade, the hardware, the software, whatever, it’s cheap & easy!” [total disconnection from reality, also ironic] & “use-cases that aren’t mine are irrelevant”.
Also, communication was abysmal. No, discussions on internal mailing lists do not constitute advance warning, much less demonstrate a consensus among users.

In short, the whole thing has tarnished my image of Ubuntu as professional, dependable, and trustworthy.

The thread with the initial announcement.

Don’t say “you”, because I was neither involved in the decision (I knew there were discussions regarding it), nor do I represent Ubuntu or work for Canonical; I’m just an individual involved with the Ubuntu community, expressing my personal opinions.

Removing libraries doesn’t prevent those distributing software from adding them back, nor users from doing so, and Ubuntu provided several options for ways to use those libraries, even from the Ubuntu archive itself.
If Ubuntu were to drop i386 support in a way that prevents applications from running, it would do it at the kernel level (by switching kernel configurations to achieve that goal).

Yes, containers, VMs, chroots, and even snaps do count, because they offer a way to do it, which is what really matters to users.
Containers even run on the host system’s kernel; they just have their own isolated runtime (which is what Windows does to run 32-bit). And the suggestion was to use an older version of Ubuntu, or a container, or a runtime, not a different operating system.
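For illustration, the container route could look something like this with LXD (a sketch; the container name and package are made up, and the 18.04 image is just an example of an older release with full i386 multiarch):

```
# Run an older Ubuntu release inside LXD on a current host
lxc launch ubuntu:18.04 legacy32

# Enable i386 multiarch inside the container; the 32-bit libraries live there, not on the host
lxc exec legacy32 -- dpkg --add-architecture i386
lxc exec legacy32 -- apt update
lxc exec legacy32 -- apt install -y libc6:i386
```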

Noted, I’m sorry I got the wrong impression.

Could you please link it, so we’re on the same page?

I said removing the libs removes the OS-level support for 32-bit software (by definition, really), nothing about preventing it from running. Just because a feature can be added back doesn’t make it a supported feature.

Agree to disagree?

I don’t know much about Windows internals, but I do know that 64- and 32-bit software is not isolated in any user-visible way.

How is another version of an OS not another OS? How is keeping an older and sooner or later unmaintained version of an OS around not a maintenance and security nightmare? Why use Ubuntu when another distro can do it all via multiarch?
Even when isolation is desired: Why use an old 32-bit version of Ubuntu over a current 32-bit version of another distro as a base? Why not use the same distro for both, then, for simplicity?

What are these 32-bit apps that you want to run on 64-bit Ubuntu which don’t have 64-bit equivalents?

1 Like

Well, because of this I have to go through a long, complicated process to trick one of my 32-bit laptops into downgrading from Ubuntu 18.10 to 18.04, because I am stuck in limbo where none of the repositories exist and I can’t do any updates, which makes this way more complicated than it has to be.

Also, sorry for necroposting, I’m just pissed about this.

This is a good explanation!