New SSD, Legacy Bios and rebooting

24.04
KDE Plasma

The problem is that when I try to reboot or restart my PC, my SSD goes to sleep or shuts down and does not work again until the PC is powered off and then powered back on. During this state, the BIOS doesn’t even see the SSD. It is like the SSD is hanging in an unfinished state because it didn’t receive the proper command from the “restart” command, or it is switched off and won’t switch back on until power has been reset. If I just use shutdown and then power the PC back on right after, using the power button, the SSD works normally. So right now, whenever there are updates, I have to install the updates, shut down, and then power on instead of rebooting.

When using a 13+ year old SATA SSD (OCZ Vertex4), the system reboots with zero issues. When using a new SSD like a Samsung 870 Evo or WD Blue, the restart issue happens. The HDDs don’t have any issues.

Relevant System Information:

  • CPU: Phenom II 1075T
  • MB: ASRock 890GM Pro3 (legacy BIOS, not UEFI; I think this is a big part of the problem)
  • Old SSD: OCZ Vertex4 64GB SATA SSD (this is so old it acts more like an HDD; it doesn’t even track wear/life remaining)
  • New SSD: WD Blue 500GB SATA SSD
  • 1TB WD Blue HDD & 2TB WD Blue HDD
  • 32GB RAM

Screenshots or Error Messages:
The only real indicator of the issue is that the BIOS doesn’t see the SSD after restarting when using the new SSDs. Old SSDs and HDDs work fine.

What I’ve Tried:
I have tried searching. It seems this is a common issue, but I haven’t really been able to understand how to fix it or how to determine the cause. Windows 10 works fine. I have verified that the BIOS and all hardware are in perfect working order. I have verified the problem exists on a clean, most recent live USB version, so it isn’t just something unique to my install. I think I need to add a command somewhere to tell the BIOS/SSD how to properly restart.

Thank you for any help this wonderful place can provide.

Hi!
As you wrote, the fault happens with different new SSDs. Perhaps there is an incompatibility between your BIOS and the firmware of newer SSDs.

You may check your BIOS. Is it up to date? What about power-saving settings? What SATA mode (AHCI or something else) is used for the SSD?
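From the Linux side, the kernel’s SATA link power management policy can be checked with a read-only sketch like the one below. Aggressive policies can put a drive into a low-power state that some drive/BIOS combinations never leave. (The sysfs path is standard, but the attribute only exists when the AHCI driver is loaded.)

```shell
# Show the SATA link power management policy for every AHCI host.
# Values like min_power or med_power_with_dipm are the aggressive ones;
# max_performance keeps the link fully awake.
show_lpm_policy() {
    for f in /sys/class/scsi_host/host*/link_power_management_policy; do
        [ -e "$f" ] || continue
        printf '%s: %s\n' "$f" "$(cat "$f")"
    done
}

show_lpm_policy
```

If this prints nothing, either the controller isn’t running in AHCI mode under Linux or the driver doesn’t expose the attribute.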

Did you check other possible hardware issues (for example, a faulty data cable)?

I have made sure the newer SSDs are fully updated and that they work perfectly under Windows. It is just the reboot command in Linux that is causing the issue.

Keep in mind this is a legacy BIOS (not UEFI) that is 15 years old, so options for settings are limited.

SATA mode = AHCI

I have verified that the RAM, CPU, MOBO, SSDs and cables are all good. I tried multiple cables, multiple SATA ports and multiple PSU connectors. There are no data integrity issues, disk errors or any other signs that the SSDs are not working properly. Performance/speeds are perfect. Everything works absolutely perfectly while the system is running. It is just the SSDs going offline on the reboot command and requiring a power-off to wake them up again.

Thank you for your suggestions.

One thing I found: “Ways to shut down/sleep and wake up hard drives on command?”

In the first answer, under ‘Some Background’, the second point reads: ‘This mode needs a reset or a power cycle to recover from.’
I don’t know if this applies to SSDs, too. It seems that some parameters of HDDs can be changed using hdparm, but I do not know anything about hdparm.
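For reference, hdparm can both report a drive’s current power state and read or set its Advanced Power Management (APM) level. A hedged sketch, assuming the affected SSD is /dev/sda (check with `lsblk` and adjust the device first):

```shell
dev=/dev/sda   # assumption: the new SSD; verify with lsblk before running

if command -v hdparm >/dev/null && [ -b "$dev" ]; then
    sudo hdparm -C "$dev"   # current power mode (active/idle vs standby)
    sudo hdparm -B "$dev"   # APM level: 1..127 permit standby, 128..254 do not
    # To test whether aggressive power management is the trigger, raise the
    # APM level to the maximum setting that still forbids standby:
    # sudo hdparm -B 254 "$dev"
else
    echo "hdparm not installed or $dev not present"
fi
hdparm_probe_done=yes
```

Note that not every SSD implements APM, and the `-B` setting usually does not survive a power cycle, so it would need to be reapplied at boot if it helps.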

Only my thoughts: perhaps Ubuntu sends your new SSDs into this mode, which makes sense for powering down. But if you reboot and the power to the drive isn’t cut off before booting again, the drive won’t wake up. Do newer systems cut off power for a short moment on reboot? Do older systems (for example, yours) cut off power for a short moment on reboot? Just guessing …
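If it does turn out the kernel is parking the drives before the reboot, one commonly suggested workaround is to pin the link power management policy to max_performance with a udev rule (the filename below is hypothetical, and whether this helps on a legacy-BIOS board like this one is only a guess):

```
# /etc/udev/rules.d/90-sata-lpm.rules (hypothetical filename)
ACTION=="add", SUBSYSTEM=="scsi_host", KERNEL=="host*", \
  ATTR{link_power_management_policy}="max_performance"
```

After adding the rule, reload with `sudo udevadm control --reload` and then do a full power-off-and-on (given the symptom here) so it takes effect.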

With a system that old, is the coin battery dead? If the BIOS forgets its settings, that is most likely the issue.

Did you check the actual firmware version? My new Samsung SSD needed a firmware update.
Compare firmware versions against the vendor’s support site:

sudo dmidecode -s bios-version
udisksctl status
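The drive-side firmware revision can also be read with smartmontools, independently of the vendor tools (a sketch, assuming smartctl is installed and the SSD is /dev/sda):

```shell
dev=/dev/sda   # assumption: the SSD to check; verify with lsblk

if command -v smartctl >/dev/null && [ -b "$dev" ]; then
    # -i prints the identity block, including the "Firmware Version" line
    sudo smartctl -i "$dev"
else
    echo "smartctl not installed or $dev not present"
fi
smartctl_probe_done=yes
```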

The coin battery is good. The Samsung firmware was verified using the Samsung update software, and the same goes for the WD Blue SSD. I sent the Samsung back before getting the WD because I thought it was a bad-drive issue, but after getting the WD SSD I recognized the pattern and realized it was not a bad drive.

That sounds exactly like the issue. I guess I’ll try and go find out more about hdparm and the settings. Thank you.

While I still sometimes use hdparm, there is also nvme.

sudo apt install nvme-cli

nvme - the NVMe storage command line interface utility (nvme-cli)
nvme --help
sudo nvme list

More info:
https://wiki.archlinux.org/title/Solid_state_drive/NVMe


Reading through the link, I don’t think nvme-cli will see my SATA SSD.

Thank you for the link. It is something I had never heard of.

I should have realized you only had SATA SSDs, not newer NVMe drives.

My 2017 build had the then-new M.2 port, but NVMe drives were very expensive when new, so I used a smaller SATA SSD. Then, a year or so ago, I upgraded to a larger NVMe drive and put the old SATA SSD in a USB-to-M.2 adapter, which gave me a fast external drive.