During my first test run of IncusOS in an Incus VM, I got the same “Secure Boot is disabled” issue at the beginning of the IncusOS install when using this parameter. Changing it to true made the installation proceed.
I brushed it off as something unimportant since I was following Trying out Incus OS , but reading this thread makes me realize this shouldn’t have happened… I have no clue as to why it didn’t work. I’m on Arch x64, if it matters.
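For anyone hitting the same thing: Secure Boot for an Incus VM is controlled by the security.secureboot instance option. A minimal sketch (the VM name and resource limits here are placeholders, not from this thread):

```shell
# Create an empty VM for the IncusOS install with Secure Boot
# explicitly enabled (name and limits are just examples)
incus create --empty --vm incus-os \
  -c security.secureboot=true \
  -c limits.cpu=2 \
  -c limits.memory=4GiB

# Or flip the option on an existing VM, then restart it
incus config set incus-os security.secureboot=true
incus restart incus-os
```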
So far, not really. It’s kinda like storage backends: they all suck in different ways, and you’ve got to find the one that sucks the least for your use case.
In reality, most of our customers tend to use whatever backup solution came from their storage vendor. So we’ve seen a fair bit of various generations of stuff out of Dell and HP, and some IBM stuff too. As for the generic ones, I just looked at my list, and Veeam is actually among those we’ve been asked about before, along with Commvault, Cohesity, Rubrik and a few others.
ZFS is pretty darn good for local storage so long as you are on a platform that supports it. Its not being in the mainline kernel is really the main downside, that and the slightly higher memory usage. It used to lack delegation support compared to btrfs, but that was fixed a couple of years ago now.
On the network storage side, it’s a bit more of a mixed bag. We basically have four options, each with its compromises:
Ceph has all the features but very high latency and is quite complex
Linstor can be used on the same kind of hardware as Ceph but it doesn’t have a filesystem or object storage story
TrueNAS has both block and filesystem (NFS) but we don’t currently have NFS support in Incus (it’s being worked on), it lacks object storage and it needs an external storage server
LVM cluster works with all external storage options (iSCSI, FC, NVMe-over-Fabric) but again, no filesystem or object storage handling
Basically what we’d want for software defined storage is “Ceph but fast”. There is upstream work to completely rework the OSD part of Ceph to deal with high performance NVMe SSDs, so hopefully that will help with the latency issue.
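For reference, here’s roughly how those backends map onto Incus storage pool drivers. A sketch only; the pool names, Ceph pool name and block device are placeholders, and each command assumes the corresponding backend is already set up:

```shell
# Ceph RBD pool (assumes an existing Ceph cluster and OSD pool)
incus storage create remote1 ceph ceph.osd.pool_name=incus

# Clustered LVM on shared block storage (iSCSI, FC or NVMe-oF LUN)
incus storage create remote2 lvmcluster source=/dev/sdb

# Plain local ZFS, for comparison
incus storage create local1 zfs
```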
Heh, funny you mention that, we just got ourselves an MSL2024 library, but I think we’ll go with Bareos (installed on Incus, of course!). I still have to work out the finicky details of how to make it work nicely with Incus, though.
Nothing changed.
I can complete the first stage up to the message “IncusOS was successfully installed”, but the next reboot hangs on “IncusOS is starting”, even on the VGA console.
Hmm, I’m not sure why it’s not working now. Maybe try deleting the VM and re-creating it, if you haven’t already?
I’m not super familiar with Void, but running as root I can successfully get IncusOS to start. (Since the packaged version of Incus is still 6.17 on Void, I’m using the query command to show the raw IncusOS API response.)
[root@void gibmat]# lsb_release -a
LSB Version: 1.0
Distributor ID: VoidLinux
Description: Void Linux
Release: rolling
Codename: void
[root@void gibmat]# incus version
Client version: 6.17
Server version: 6.18
[root@void gibmat]# incus remote list
+-------------------+------------------------------------+---------------+-------------+--------+--------+--------+
| NAME | URL | PROTOCOL | AUTH TYPE | PUBLIC | STATIC | GLOBAL |
+-------------------+------------------------------------+---------------+-------------+--------+--------+--------+
| IncusOS (current) | https://10.22.6.35:8443 | incus | tls | NO | NO | NO |
+-------------------+------------------------------------+---------------+-------------+--------+--------+--------+
| images | https://images.linuxcontainers.org | simplestreams | none | YES | NO | NO |
+-------------------+------------------------------------+---------------+-------------+--------+--------+--------+
| local | unix:// | incus | file access | NO | YES | NO |
+-------------------+------------------------------------+---------------+-------------+--------+--------+--------+
[root@void gibmat]# incus query /os/1.0
{
"environment": {
"hostname": "35837ed6-b407-48da-b6b5-1f502f5d6238",
"os_name": "IncusOS",
"os_version": "202511070055"
}
}