Backup Options Guidance

Currently I’m using external drives to back up just the data on the NFS server.
So I’m exploring other software as an additional backup method to use in conjunction with the external drives.

Basically I’m asking what most people use, and what is actually the easiest to set up and operate.

I know Bacula is one option, Bareos another, and the list goes on and on.

Most if not all of my data is static and doesn’t change very much; it’s currently 6+ TB.

The media I currently have (excluding hard drives) are a DVD-RW burner drive and a couple of LG M-Disc burners that are sitting on the shelf, which I can install/pop into the system (I’ve really only used them as DVD players in the past and have never used the M-Disc capability).
I would like to control the operation from an Ubuntu desktop that I’m dual-booting with Windows 10 (the OSes are on different drives, so I select which drive/OS to boot at startup).

And of course I have considered adding a tape drive, if not now then maybe later, when I locate one at a price point I find desirable. Then again, maybe not, as the other two media may work just fine for my purposes. Oh yes, this is just for a home lab setup, so no “production” setup.

Hi sgt-mike

I just removed a DVD burning drive from one of my systems; I had not used it in years. Perhaps it’s still useful on some level.

You will get other suggestions, I’m sure. My favorite is rsync, used to create a duplicate of mostly static data: music, photos, videos, etc. There is Grsync for a GUI version.

For system files, running on ext4 + LVM, I like rdiff-backup, run with elevated permissions to preserve all file ownerships and permissions. For system-level backups, my strategy is to back up /home, /usr, and /etc, plus mountpoints and an installed-packages list. At restore time, I do a fresh OS reinstall, put back my custom files and scripts, and reinstall packages from the package list. Since system files are usually in use when backing them up, I use an LVM snapshot, which freezes the data, then run rdiff-backup against the snapshot. Once I get my backup, I delete the LVM snapshot. There is a bit of a learning curve for rdiff-backup, but it is a powerful tool. No GUI version of it, though.
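
The snapshot workflow described above could be sketched roughly like this (the volume group and logical volume names vg0/root, the snapshot size, the mountpoint, and the destination path are all hypothetical; adjust to your layout and run as root):

```shell
# 1. Create a read-only point-in-time snapshot of the live volume.
sudo lvcreate --size 2G --snapshot --name rootsnap /dev/vg0/root

# 2. Mount the frozen view somewhere temporary.
sudo mkdir -p /mnt/rootsnap
sudo mount -o ro /dev/vg0/rootsnap /mnt/rootsnap

# 3. Back up from the snapshot, not the live filesystem.
sudo rdiff-backup /mnt/rootsnap/etc /media/backupdrive/etc

# 4. Clean up: unmount and drop the snapshot.
sudo umount /mnt/rootsnap
sudo lvremove -y /dev/vg0/rootsnap
```

The snapshot only needs enough space to hold blocks that change while the backup runs, which is why a small size like 2G is usually sufficient.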

There are many other options. Hopefully, you will get other ideas here. Cheers!

I use BackInTime on Ubuntu 24.04. I’m on version 1.4.3, the latest version in the standard Ubuntu repositories, although there are newer versions available in its GitHub repository. It uses rsync and hard links, which is very space-efficient.

It’s not exactly a backup in the traditional sense, but I’ve found it invaluable to keep notes on how I configured things and why. That, along with having a backup of /etc, makes it a lot simpler to restore a system, or set up a new one that does the same thing.

Otherwise, just a list of installed packages and a backup of user data. These days I use rdiff-backup, but in the past I have used (and successfully restored from) DAR, which is a development of tar intended for disks rather than tapes.
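
The installed-packages list mentioned above can be captured on Debian/Ubuntu with dpkg; a small sketch (the file name is arbitrary, and apt-mark showmanual is an alternative that lists only explicitly installed packages):

```shell
# Save the full package selection list (Debian/Ubuntu).
PKGLIST=$(mktemp)
dpkg --get-selections > "$PKGLIST"
head -3 "$PKGLIST"

# On the freshly reinstalled system, restore with something like:
#   sudo dpkg --set-selections < pkglist.txt && sudo apt-get dselect-upgrade
```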

Of course, the best backup software is … the one you actually use. An “ideal” package that you find so intimidating that you just put off doing anything is no help at all.

Seems I forgot to add that on the NFS server I have a mixture of one LVM group and two different ZFS pools (but the OS is not on a ZFS boot; ZFS is just for the data storage).
I researched the media: tape, which I haven’t used in three decades, and M-Disc.
I was in sticker shock at the LTO prices, as the best candidate seemed to be LTO-7. If I go tape, I’ll probably need to step back several generations from LTO-7 to get to a comfortable price point.
I’m not completely sold on going back to tape, but it is an option.

M-Disc was the next runner-up at 100 GB (though I’m pretty sure the drives I have will take somewhat smaller media than what is stated in the wiki). Then of course there is the good ole DVD at its 4.7-ish GB per disc.
Looking at the two LG M-Disc drives I have:
1. model GH24NS72
2. model GH24NSC0
I’ll delve a little into the hardware side (I might have to use the DVD-RW drive already in the Ubuntu/Win10 box).
Huh, DVD-RAM is discontinued? And the LGs max out at 8 GB; I just checked, and the ASUS DVD multi drive that is in the box does everything the LGs do. I can see a Blu-ray burner using BD-RE in either SL or DL media in my very near future, although I might have a line on affordable LTO-4 SAS external tape drives.
(The only reason to bring up hardware is that it could affect software suggestions/choices.)

But thus far, of the software mentioned, I’ll need to look harder at what has been suggested.
Grsync looks promising, and I’ve heard good things about BackInTime and rdiff-backup.

Honestly, that means downloading the suggestions to the desktop and playing with small samples against the existing hardware for now.
Of course I already have one backup copy on hard disks (spinners).
I was just looking at a software/media solution to go along with the hard drives.
Notes: yeah, I agree; at my age I have to use notes.
Great suggestions thus far.
Update:
I installed the Grsync software. After some head-banging with hardware, it looks like I will have to reconfigure the desktop (hardware-wise, minus CD/DVD/Blu-ray/tape) to achieve the desired result, which will now have to be spinning rust. I’ll have to devise a strategy that reduces the time the drives are actually powered up, which should be easy. I’ll install BackInTime after I set the hardware up. Both look promising, as both are rsync-based. I’ll explore rdiff-backup as well.

But please continue the software suggestions; don’t let a bump in the road stop them.

+1 for rdiff-backup. I use it daily to automatically back up to a backup HD when I shut down. There are many posts on rdiff-backup on the old forum site.

I have never bothered to back up my OS, as it is so quick to reinstall should I do anything disastrous.

What I do have, however, is regular backups of everything in my /home partition, which in my case is separate from the OS and is how I’ve been running my chosen version (Xubuntu) for many years. Any system files which I have edited are backed up and copied to a folder in my /home, so I can quickly restore any personal configuration changes I have made by restoring those system files.

I am just a simple home desktop user and do not have anything on my machine now that is worth a versioned backup system, though I accept that for many, including business users, versioned backups can be important.

Your requirements may be very different from mine, so think about what you might need your backups for if a disaster occurs, and make sure you can do what you need; it’s pointless having backups if you find out when you need them that they cannot be used!

@ajgreeny
Yeah, I’m not backing up the OS. I agree I can set the server back up to fully functional in less than an hour, so I agree with you on the OS part. But I’m after the media files, documents, etc., going from the NFS server (headless) to a different location (hard drives): about 7+ TB worth.

@quarkrad Yeah, sadly I can’t seem to get to the old site. I know others can, and they say it’s slow, but for me it times out while loading.

Unless you have changed things, namely ZFS, this may be of interest:

https://dev.to/ikus060/how-to-configure-zfs-for-rdiff-backup-43dh

@1fallen
Nope, Rick, I only added an additional pool and an LVM group (using up the Seagate drives in raidZ vdevs and some old SATA drives for the LVM group); no other changes on the NFS server.
I’ll give that link and advice a serious look-over.

@aljames I did finally get around to testing Grsync a bit, even though I don’t have the hardware configured/developed yet that I want to slave to the desktop. It did well, except when I dumped the full 7 TB of data at it; the system sort of choked, LOL. When I broke it into smaller segments, it worked well, though. I liked that it only wrote when there was a change in the data sets, similar in nature to ZFS snapshots. Good suggestion, thank you.

@1fallen
LOL, well Rick, I read the articles and then followed them, setting up the datasets etc. as advised.
Then I went to install rdiffweb per the download/install instructions and ran into this:

$ sudo curl -L https://www.ikus-soft.com/archive/rdiffweb/public.key | gpg --dearmor > /usr/share/keyrings/rdiffweb-keyring.gpg
-bash: /usr/share/keyrings/rdiffweb-keyring.gpg: Permission denied
  % Total    % Received % Xferd  Average Speed   Time    Time     Time  Current
                                 Dload  Upload   Total   Spent    Left  Speed
100     5  100     5    0     0     16      0 --:--:-- --:--:-- --:--:--    16
100     5  100     5    0     0      8      0 --:--:-- --:--:-- --:--:--     8
100  2773  100  2773    0     0   3210      0 --:--:-- --:--:-- --:--:-- 39056
curl: Failed writing body
sudo echo "deb [arch=amd64 signed-by=/usr/share/keyrings/rdiffweb-keyring.gpg] https://nexus.ikus-soft.com/repository/apt-release-$(lsb_release -sc)/ $(lsb_release -sc) main" > /etc/apt/sources.list.d/rdiffweb.list
-bash: /etc/apt/sources.list.d/rdiffweb.list: Permission denied
sudo apt update
Get:1 http://security.ubuntu.com/ubuntu noble-security InRelease [126 kB]
Ign:2 https://download.webmin.com/download/newkey/repository stable InRelease
Hit:3 https://download.webmin.com/download/newkey/repository stable Release
Hit:4 http://us.archive.ubuntu.com/ubuntu noble InRelease
Get:6 http://us.archive.ubuntu.com/ubuntu noble-updates InRelease [126 kB]
Get:7 https://esm.ubuntu.com/apps/ubuntu noble-apps-security InRelease [7,547 B]
Get:8 http://us.archive.ubuntu.com/ubuntu noble-backports InRelease [126 kB]
Get:9 https://esm.ubuntu.com/apps/ubuntu noble-apps-updates InRelease [7,468 B]
Get:10 http://us.archive.ubuntu.com/ubuntu noble-updates/main amd64 Components [151 kB]
Get:11 http://us.archive.ubuntu.com/ubuntu noble-updates/restricted amd64 Components [212 B]
Get:12 http://us.archive.ubuntu.com/ubuntu noble-updates/universe amd64 Components [310 kB]
Get:13 http://security.ubuntu.com/ubuntu noble-security/main amd64 Components [7,236 B]
Get:14 https://esm.ubuntu.com/infra/ubuntu noble-infra-security InRelease [7,462 B]
Get:15 http://us.archive.ubuntu.com/ubuntu noble-updates/multiverse amd64 Components [940 B]
Get:16 http://us.archive.ubuntu.com/ubuntu noble-backports/main amd64 Components [208 B]
Get:17 http://us.archive.ubuntu.com/ubuntu noble-backports/restricted amd64 Components [216 B]
Get:18 http://us.archive.ubuntu.com/ubuntu noble-backports/universe amd64 Components [11.7 kB]
Get:19 http://us.archive.ubuntu.com/ubuntu noble-backports/multiverse amd64 Components [212 B]
Get:20 http://security.ubuntu.com/ubuntu noble-security/restricted amd64 Components [212 B]
Get:21 http://security.ubuntu.com/ubuntu noble-security/universe amd64 Components [51.9 kB]
Get:22 https://esm.ubuntu.com/infra/ubuntu noble-infra-updates InRelease [7,461 B]
Get:23 http://security.ubuntu.com/ubuntu noble-security/multiverse amd64 Components [208 B]
Fetched 942 kB in 1s (838 kB/s)
Reading package lists... Done
Building dependency tree... Done
Reading state information... Done
All packages are up to date.
sudo apt install rdiffweb
Reading package lists... Done
Building dependency tree... Done
Reading state information... Done
E: Unable to locate package rdiffweb
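
Side note on the "Permission denied" above: it is a shell quirk rather than an rdiffweb problem. In "sudo curl ... | gpg --dearmor > file", only curl runs as root; the redirection into /usr/share/keyrings is performed by the regular, unprivileged shell, and the same applies to the "sudo echo ... > file" line. A common workaround, sketched here with the same URLs as in the quoted commands, is to let sudo do the writing via tee:

```shell
# Let sudo perform the privileged writes instead of a shell redirect.
curl -L https://www.ikus-soft.com/archive/rdiffweb/public.key \
  | gpg --dearmor \
  | sudo tee /usr/share/keyrings/rdiffweb-keyring.gpg > /dev/null

echo "deb [arch=amd64 signed-by=/usr/share/keyrings/rdiffweb-keyring.gpg] https://nexus.ikus-soft.com/repository/apt-release-$(lsb_release -sc)/ $(lsb_release -sc) main" \
  | sudo tee /etc/apt/sources.list.d/rdiffweb.list
```

Whether the rdiffweb repository actually publishes packages for noble is a separate question; the later "Unable to locate package rdiffweb" suggests it may not.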

Google revealed that back in Nov 2024 several people had issues when they updated/upgraded to 24.04 Noble (and yes, I’m using 24.04 Noble). Of course, that was mentioned on the UF site; link here.

Again, I can’t access UF, as it literally times out on my end (could it be because I’m using SSO sign-in? IDK). So if anyone who can access the link would take a look, see what I’m doing wrong, and paste it here (I don’t know if it was solved or what), I would be grateful.
P.S.
I’m not whining about not being able to get to UF; the answer may very well be there. I am simply unable to access it.

I am able to access the UF link you posted.

Rick says the issue was resolved via a PM: https://prnt.sc/IVEHCCvU1GlA

Cool, then hopefully he will see this and shoot a PM to me with the fix.
Thanks for looking on UF for me, @rubi1200. I appreciate it a LOT.

I find qt-fsarchiver serves my home-use requirements. I too keep my /home folder on a separate partition and just back both up. Qt-fsarchiver can do a live backup from a working system, but I usually just whack in a USB live version of qt-fsarchiver and boot from that.

I have never had an issue with a failed backup, and I have been using fsarchiver since its CLI-only days, many years ago. With the various compression algorithms it employs, the backups (particularly system ones) take very little disc space.
Here is a link to Dieter Baum’s application on Sourceforge: QT-Fsarchiver

Cheers Tony

Nope, you don’t want to do this. I’ve seen the OP of that thread here, and they are currently using something different; I’m not sure why, though.
For my ZFS root I use sanoid and syncoid:

apt show sanoid
Package: sanoid
Version: 2.2.0-2
Priority: optional
Section: universe/admin
Origin: Ubuntu
Maintainer: Ubuntu Developers <ubuntu-devel-discuss@lists.ubuntu.com>
Original-Maintainer: Michael Jeanson <mjeanson@debian.org>
Bugs: https://bugs.launchpad.net/ubuntu/+filebug
Installed-Size: 206 kB
Depends: perl:any, pv, lzop, mbuffer, zfs-fuse | zfsutils-linux | zfsutils, libconfig-inifiles-perl, libcapture-tiny-perl
Homepage: https://github.com/jimsalterjrs/sanoid
Download-Size: 53.1 kB
APT-Sources: http://archive.ubuntu.com/ubuntu plucky/universe amd64 Packages
Description: Policy-driven ZFS snapshot management and replication tool
 Sanoid is a policy-driven snapshot management and replication tool for ZFS
 filesystems.
 .
 More prosaically, you can use Sanoid to create, automatically thin, and
 monitor snapshots and pool health from a single eminently human-readable
 TOML config file. Sanoid also includes a replication tool, syncoid, which
 facilitates the asynchronous incremental replication of ZFS filesystems.

It takes a beat to understand, though.
One link from Reddit with decent information:

https://www.reddit.com/user/kaihp/comments/fyhxdj/howto_setting_up_a_raspberry_pi_4_to_pull_backups/

Just use a different source rather than a “Raspberry Pi 4”.
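
Sanoid’s policy lives in /etc/sanoid/sanoid.conf; as a rough sketch (the dataset names and retention numbers here are made up, not from this thread), a minimal config might look like:

```ini
# /etc/sanoid/sanoid.conf (sketch; pool/dataset names are placeholders)
[rpool/ROOT]
        use_template = production
        recursive = yes

[template_production]
        hourly = 24
        daily = 14
        monthly = 3
        autosnap = yes
        autoprune = yes
```

Once sanoid is creating snapshots on that schedule, syncoid replicates them incrementally, e.g. "syncoid rpool/ROOT backuppool/ROOT" locally, or with a user@host: prefix on the target for a remote pull/push (pool names hypothetical).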

My /home/$USER is just copied over to different storage drives:

 zpool status
  pool: bpool
 state: ONLINE
config:

	NAME                                          STATE     READ WRITE CKSUM
	bpool                                         ONLINE       0     0     0
	  ata-WD_Blue_SN570_500GB_22394E807289-part2  ONLINE       0     0     0

errors: No known data errors

  pool: dozer
 state: ONLINE
config:

	NAME                                          STATE     READ WRITE CKSUM
	dozer                                         ONLINE       0     0     0
	  usb-SanDisk_Ultra_4C530001050423123075-0:0  ONLINE       0     0     0

errors: No known data errors

  pool: rpool
 state: ONLINE
config:

	NAME                                              STATE     READ WRITE CKSUM
	rpool                                             ONLINE       0     0     0
	  usb-WD_Blue_SN570_500GB_012345678909-0:0-part4  ONLINE       0     0     0

errors: No known data errors

  pool: tank
 state: ONLINE
config:

	NAME        STATE     READ WRITE CKSUM
	tank        ONLINE       0     0     0
	  sdc1      ONLINE       0     0     0

errors: No known data errors

cd /tank && ls | grep me-plucky
me-plucky

I always keep at least 2 at different locations:

cd /dozer && ls
etc  etc-buntu-lvm  etc-mint  home-mint  me

@1fallen
A beat, you say? LOL, I’ll have to dig deeper.
But here are the only changes I actually made; hopefully they didn’t/won’t negatively affect anything going forward.

2024-12-30.22:05:48 zfs create -o utf8only=off mediapool1/backups
2024-12-30.22:06:37 zfs set compression=lz4 mediapool1/backups
2024-12-30.22:09:46 zfs set primarycache=metadata mediapool1/backups
2024-12-30.22:10:12 zfs set secondarycache=metadata mediapool1/backups
2024-12-30.22:11:32 zfs set dedup=on mediapool1/backups

I’m pretty sure I can revert those, especially the dedup, if need be. I never did get rdiffweb to install, which honestly just means I’m at the CLI (scripts etc.) using rdiff-backup, which I did install.
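
For reference, reverting those properties later is straightforward; a sketch using the dataset name from the history above. One caveat: turning dedup off only stops deduplication for newly written data; blocks already written stay in the dedup table until they are rewritten.

```shell
# Revert the tuning shown in the history (affects new writes only).
sudo zfs set dedup=off mediapool1/backups
sudo zfs set primarycache=all mediapool1/backups
sudo zfs set secondarycache=all mediapool1/backups

# Confirm the current values.
sudo zfs get dedup,compression,primarycache,secondarycache mediapool1/backups
```

The compression=lz4 setting is generally worth keeping; it is cheap and usually a net win on backup datasets.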

Try this: https://ubuntuforums.org/showthread.php?t=2446164

Hmmm, again UF times out. It does that on every computer in the house.

Hi, I’m unable to offer anything of substance regarding the thread topic, but I just want to comment on this. As a retiring UF admin, I’ve been checking in on UF almost every day since the transition started. Most of the time it comes up quickly. At the moment there’s no trouble at all from where I am and the connection I am using. This is just a constructive observation to help you track down what is blocking you at your end; UF is working (mostly) just fine.

Good luck with finding a solution to both that and the current thread subject.