LXD 4.2: error when attaching additional disk to VM

Tried to attach an additional disk to a VM as a block device:

ubuntu@aa1-cptef101-n1:~$ lxc --project <project> storage volume attach ceph aa1_f91c_disk_pod-890-1000-01-vault_audit pod-890-1000-01 vaultaudit
ubuntu@aa1-cptef101-n1:~$ lxc --project <project> start pod-890-1000-01
Error: Error detecting file type "/var/snap/lxd/common/lxd/devices_<project>_pod-890-1000-01/aa1_f91c_disk_pod-890-1000-01-vault_audit.sock": stat /var/snap/lxd/common/lxd/devices/<project>_pod-890-1000-01/aa1_f91c_disk_pod-890-1000-01-vault_audit.sock: no such file or directory
Try `lxc info --show-log pod-890-1000-01` for more info

The logfile:
ubuntu@aa1-cptef101-n1:~$ lxc info --show-log pod-890-1000-01 --project <project>
Name: pod-890-1000-01
Location: aa1-cptef102-n4
Remote: unix://
Architecture: x86_64
Created: 2020/06/18 23:55 UTC
Status: Stopped
Type: virtual-machine
Profiles: stagingtwo-virtual-vault-890

Log:

Can you run lxc monitor --type=logging --pretty on aa1-cptef102-n4 while running lxc start again?

This will capture all log entries for all log levels during startup.
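
Something like this, in two terminals on aa1-cptef102-n4 (a sketch reusing the names from this thread):

lxc monitor --type=logging --pretty
lxc --project <project> start pod-890-1000-01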

@tomp can you take a look at that? It’s a VM 9p attach of a custom volume coming from a ceph storage pool.

Will do.

Hi,

I’ve not been able to reproduce your specific error yet, but I have found another issue related to using relative mount paths:

Please could you try altering your volume attach command to add an absolute target path:

lxc --project <project> storage volume detach ceph aa1_f91c_disk_pod-890-1000-01-vault_audit pod-890-1000-01
lxc --project <project> storage volume attach ceph aa1_f91c_disk_pod-890-1000-01-vault_audit pod-890-1000-01 vaultaudit /vaultaudit

I’ve not been able to reproduce this.

lxc project create test -c features.images=false -c features.profiles=false
lxc project switch test
lxc storage volume create default aa1_f91c_disk_pod-890-1000-01-vault_audit
lxc init images:ubuntu/focal pod-890-1000-01 --vm
lxc storage volume attach default aa1_f91c_disk_pod-890-1000-01-vault_audit pod-890-1000-01 vaultaudit /vaultaudit

lxc start pod-890-1000-01
lxc exec pod-890-1000-01 mount | grep vaultaudit
   lxd_vaultaudit on /vaultaudit type 9p (rw,relatime,sync,dirsync,access=client,trans=virtio)

Also checked on the LXD host and can see the virtfs-proxy-helper process running with the correct .sock path.

virtfs-proxy-helper -n -u 0 -g 0 -s /var/snap/lxd/common/lxd/devices/test_pod-890-1000-01/vaultaudit.sock -p /var/snap/lxd/common/lxd/storage-pools/default/custom/test_aa1_f91c_disk_pod-890-1000-01-vault_audit
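
If you want to check the same on your side, something like this should show whether the proxy process and its socket actually exist (paths adapted from your error message):

ps aux | grep virtfs-proxy-helper
ls -la /var/snap/lxd/common/lxd/devices/<project>_pod-890-1000-01/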

I noticed in your error that the character before <project> in the stat error path is _ which it shouldn’t be, it should be /. Can you confirm whether this is actually happening, or whether it is a result of you modifying the error to remove the project name?

It would be helpful to know the project name, in case it is an issue with the way the path is being generated from the project name.

Thanks

The project name is company-staging2.

Also - we are trying to attach additional block devices to a VM, not use 9p.

The custom volume you are specifying is a filesystem volume and so will be passed to the VM as a 9p share (as not all storage pools can represent custom filesystem volumes as a block device).

Let me try again with a “-” in the project name and see if I can reproduce.

Yep, no problems with dashes in the project name.

Can you confirm you have the virtfs-proxy-helper command available?

Ah - okay - that makes sense. How would we go about attaching an additional block device then?

That command does exist in the snap, yes, but not in the VM - I’m not sure which side you’re asking about.

Right now all custom volumes created by LXD are considered “filesystem” volumes.

Because of the different storage drivers that LXD supports, not all “filesystem” volumes are backed by block devices, and so LXD does not natively support passing “filesystem” volumes through as block devices.

So the command you are using will only pass the ceph custom filesystem volume as a 9p share:

lxc storage volume attach ceph aa1_f91c_disk_pod-890-1000-01-vault_audit pod-890-1000-01 vaultaudit /vaultaudit

Eventually we will add support for LXD custom block volumes, and as these will not have a filesystem that LXD is aware of, they will only be able to be passed as block volumes.

However, LXD does support passing external block devices (that LXD doesn’t manage) into a VM using the source setting of the disk device type (see https://linuxcontainers.org/lxd/docs/master/instances#type-disk).

source | string | - | Path on the host, either to a file/directory or to a block device

E.g.

lxc config device add <VM> <disk name> disk source=<block device path>
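
For instance, to hand a spare host disk to the VM (a sketch; /dev/sdb and the device name extradisk are hypothetical):

lxc --project <project> config device add pod-890-1000-01 extradisk disk source=/dev/sdb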

It also supports directly passing through an external ceph RBD device using the source format:

lxc config device add <instance> ceph-rbd1 disk source=ceph:<my_pool>/<my-volume> ceph.user_name=<username> ceph.cluster_name=<cluster name>

So you may be able to pass the LXD managed custom filesystem volume through to the VM as a block device by specifying it as an unmanaged external ceph volume.
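
As a sketch for your case (pool, user and cluster names taken from this thread; <disk name> and <rbd image name> are placeholders - whatever follows the pool is passed verbatim to QEMU as rbd:<pool>/<image>):

lxc --project <project> config device add pod-890-1000-00 <disk name> disk source=ceph:lxd-virtualvaultfs-01/<rbd image name> ceph.user_name=lxd ceph.cluster_name=ceph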

The RBD device attach is not working:

lxc --project <company>-staging2 config device add pod-890-1000-00 aa1_f91d_disk_pod-890-1000-00-a disk source=ceph:lxd-virtualvaultfs-01/aa1_f91d_disk_pod-890-1000-00-a ceph.user_name=lxd ceph.cluster_name=ceph

succeeds, however lxc start pod-890-1000-00 results in this error:

qemu-system-x86_64:/var/snap/lxd/common/lxd/logs/<company>-staging2_pod-890-1000-00/qemu.conf:224: error reading header from aa1_f91d_disk_pod-890-1000-00-a: No such file or directory

The disk exists in ceph, however:
root@aa1-ceph-master-1:~# rbd ls --pool lxd-virtualvaultfs-01
custom_-staging2_aa1_f91d_disk_pod-890-1000-00-a

Can you show /var/snap/lxd/common/lxd/logs/<company>-staging2_pod-890-1000-00/qemu.conf so we can see the full rbd config that was passed? (It may contain credentials and addresses, so you’ll want to remove those.)

# Machine
[machine]
graphics = "off"
type = "q35"
accel = "kvm"
usb = "off"
graphics = "off"

[global]
driver = "ICH9-LPC"
property = "disable_s3"
value = "1"

[global]
driver = "ICH9-LPC"
property = "disable_s4"
value = "1"
[boot-opts]
strict = "on"

# Console
[chardev "console"]
backend = "pty"

# Graphical console
[spice]
unix = "on"
addr = "/var/snap/lxd/common/lxd/logs/<company>-staging2_pod-890-1000-00/qemu.spice"
disable-ticketing = "on"

# CPU
[smp-opts]
cpus = "8"
sockets = "1"
cores = "8"
threads = "1"

# Memory
[memory]
size = "16000000000B"

# Firmware (read only)
[drive]
file = "/snap/lxd/current/share/qemu/OVMF_CODE.fd"
if = "pflash"
format = "raw"
unit = "0"
readonly = "on"

# Firmware settings (writable)
[drive]
file = "/var/snap/lxd/common/lxd/virtual-machines/<company>-staging2_pod-890-1000-00/qemu.nvram"
if = "pflash"
format = "raw"
unit = "1"

# Qemu control
[chardev "monitor"]
backend = "socket"
path = "/var/snap/lxd/common/lxd/logs/<company>-staging2_pod-890-1000-00/qemu.monitor"
server = "on"
wait = "off"

[mon]
chardev = "monitor"
mode = "control"

[device "qemu_pcie0"]
driver = "pcie-root-port"
bus = "pcie.0"
addr = "1.0"
chassis = "0"
multifunction = "on"

# Balloon driver
[device "qemu_balloon"]
driver = "virtio-balloon-pci"
bus = "qemu_pcie0"
addr = "00.0"

multifunction = "on"

# Random number generator
[object "qemu_rng"]
qom-type = "rng-random"
filename = "/dev/urandom"

[device "dev-qemu_rng"]
driver = "virtio-rng-pci"
bus = "qemu_pcie0"
addr = "00.1"

rng = "qemu_rng"


# Input
[device "qemu_keyboard"]
driver = "virtio-keyboard-pci"
bus = "qemu_pcie0"
addr = "00.2"



# Input
[device "qemu_tablet"]
driver = "virtio-tablet-pci"
bus = "qemu_pcie0"
addr = "00.3"



# Vsock
[device "qemu_vsock"]
driver = "vhost-vsock-pci"
bus = "qemu_pcie0"
addr = "00.4"

guest-cid = "71"


# LXD serial identifier
[device "dev-qemu_serial"]
driver = "virtio-serial-pci"
bus = "qemu_pcie0"
addr = "00.5"



[chardev "qemu_serial-chardev"]
backend = "ringbuf"
size = "16B"

[device "qemu_serial"]
driver = "virtserialport"
name = "org.linuxcontainers.lxd"
chardev = "qemu_serial-chardev"
bus = "dev-qemu_serial.0"

[device "qemu_pcie1"]
driver = "pcie-root-port"
bus = "pcie.0"
addr = "1.1"
chassis = "1"


# SCSI controller
[device "qemu_scsi"]
driver = "virtio-scsi-pci"
bus = "qemu_pcie1"
addr = "00.0"



[device "qemu_pcie2"]
driver = "pcie-root-port"
bus = "pcie.0"
addr = "1.2"
chassis = "2"


# Config drive
[fsdev "qemu_config"]
fsdriver = "local"
security_model = "none"
readonly = "on"
path = "/var/snap/lxd/common/lxd/virtual-machines/<company>-staging2_pod-890-1000-00/config"

[device "dev-qemu_config"]
driver = "virtio-9p-pci"
bus = "qemu_pcie2"
addr = "00.0"

mount_tag = "config"
fsdev = "qemu_config"
multifunction = "on"

[device "qemu_pcie3"]
driver = "pcie-root-port"
bus = "pcie.0"
addr = "1.3"
chassis = "3"


# GPU
[device "qemu_gpu"]
driver = "virtio-vga"
bus = "qemu_pcie3"
addr = "00.0"



[device "qemu_pcie4"]
driver = "pcie-root-port"
bus = "pcie.0"
addr = "1.4"
chassis = "4"


# Network card ("eth0" device)
[netdev "lxd_eth0"]
type = "tap"
vhost = "on"
ifname = "tap25090e45"
script = "no"
downscript = "no"

[device "dev-lxd_eth0"]
driver = "virtio-net-pci"
bus = "qemu_pcie4"
addr = "00.0"

netdev = "lxd_eth0"
mac = "<redacted>"
bootindex = "1"


# aa1_f91d_disk_pod-890-1000-00-a drive
[drive "lxd_aa1_f91d_disk_pod-890-1000-00-a"]
file = "rbd:lxd-virtualvaultfs-01/aa1_f91d_disk_pod-890-1000-00-a:id=lxd:conf=/etc/ceph/ceph.conf"
format = "raw"
if = "none"
cache = "none"
aio = "native"
discard = "on"

[device "dev-lxd_aa1_f91d_disk_pod-890-1000-00-a"]
driver = "scsi-hd"
bus = "qemu_scsi.0"
channel = "0"
scsi-id = "2"
lun = "1"
drive = "lxd_aa1_f91d_disk_pod-890-1000-00-a"
bootindex = "2"


# root drive
[drive "lxd_root"]
file = "/dev/rbd8"
format = "raw"
if = "none"
cache = "none"
aio = "native"
discard = "on"

[device "dev-lxd_root"]
driver = "scsi-hd"
bus = "qemu_scsi.0"
channel = "0"
scsi-id = "0"
lun = "1"
drive = "lxd_root"
bootindex = "0"

Can you show `lxc storage show lxd-virtualvaultfs-01 --target LOCATION`?

Where LOCATION is the server name where that container is hosted.

ubuntu@aa1-cptef101-n1:~$ lxc storage show ceph-vault --target aa1-cptef101-n1
config:
  ceph.cluster_name: ceph
  ceph.osd.pg_num: "32"
  ceph.osd.pool_name: lxd-virtualvaultfs-01
  ceph.user.name: lxd
  source: lxd-virtualvaultfs-01
  volatile.initial_source: lxd-virtualvaultfs-01
  volatile.pool.pristine: "false"
  volume.size: 256GB
description: ""
name: ceph-vault
driver: ceph
used_by:
- /1.0/profiles/default
- /1.0/profiles/default?project=<company>-staging2
- /1.0/profiles/default?project=<company>-staging2-pods
- /1.0/profiles/default?project=experimental-instances
- /1.0/profiles/experimental-6.8.50?project=experimental-instances
- /1.0/profiles/stagingtwo-2.2.25
- /1.0/profiles/stagingtwo-2.2.25?project=<company>-staging2
- /1.0/profiles/stagingtwo-2.4.250
- /1.0/profiles/stagingtwo-2.4.250?project=<company>-staging2
- /1.0/profiles/stagingtwo-4.10.100
- /1.0/profiles/stagingtwo-4.10.100?project=<company>-staging2
- /1.0/profiles/stagingtwo-4.32.1536?project=<company>-staging2
- /1.0/profiles/stagingtwo-4.4.150
- /1.0/profiles/stagingtwo-4.4.150?project=<company>-staging2
- /1.0/profiles/stagingtwo-4.4.50
- /1.0/profiles/stagingtwo-4.4.50?project=<company>-staging2
- /1.0/profiles/stagingtwo-8.16.128?project=<company>-staging2
- /1.0/profiles/stagingtwo-8.8.50
- /1.0/profiles/stagingtwo-8.8.50?project=<company>-staging2
- /1.0/profiles/stagingtwo-virtual-vault-890?project=<company>-staging2
- /1.0/storage-pools/ceph-vault/volumes/custom/aa1_f91d_disk_pod-890-1000-00-a?project=<company>-staging2&target=aa1-cptef101-n1
- /1.0/storage-pools/ceph-vault/volumes/custom/aa1_f91d_disk_pod-890-1000-00-a?project=<company>-staging2&target=aa1-cptef101-n2
- /1.0/storage-pools/ceph-vault/volumes/custom/aa1_f91d_disk_pod-890-1000-00-a?project=<company>-staging2&target=aa1-cptef101-n3
- /1.0/storage-pools/ceph-vault/volumes/custom/aa1_f91d_disk_pod-890-1000-00-a?project=<company>-staging2&target=aa1-cptef101-n4
- /1.0/storage-pools/ceph-vault/volumes/custom/aa1_f91d_disk_pod-890-1000-00-a?project=<company>-staging2&target=aa1-cptef102-n1
- /1.0/storage-pools/ceph-vault/volumes/custom/aa1_f91d_disk_pod-890-1000-00-a?project=<company>-staging2&target=aa1-cptef102-n2
- /1.0/storage-pools/ceph-vault/volumes/custom/aa1_f91d_disk_pod-890-1000-00-a?project=<company>-staging2&target=aa1-cptef102-n3
- /1.0/storage-pools/ceph-vault/volumes/custom/aa1_f91d_disk_pod-890-1000-00-a?project=<company>-staging2&target=aa1-cptef102-n4
- /1.0/storage-pools/ceph-vault/volumes/custom/aa1_f91d_disk_pod-890-1000-00-b?project=<company>-staging2&target=aa1-cptef101-n1
- /1.0/storage-pools/ceph-vault/volumes/custom/aa1_f91d_disk_pod-890-1000-00-b?project=<company>-staging2&target=aa1-cptef101-n2
- /1.0/storage-pools/ceph-vault/volumes/custom/aa1_f91d_disk_pod-890-1000-00-b?project=<company>-staging2&target=aa1-cptef101-n3
- /1.0/storage-pools/ceph-vault/volumes/custom/aa1_f91d_disk_pod-890-1000-00-b?project=<company>-staging2&target=aa1-cptef101-n4
- /1.0/storage-pools/ceph-vault/volumes/custom/aa1_f91d_disk_pod-890-1000-00-b?project=<company>-staging2&target=aa1-cptef102-n1
- /1.0/storage-pools/ceph-vault/volumes/custom/aa1_f91d_disk_pod-890-1000-00-b?project=<company>-staging2&target=aa1-cptef102-n2
- /1.0/storage-pools/ceph-vault/volumes/custom/aa1_f91d_disk_pod-890-1000-00-b?project=<company>-staging2&target=aa1-cptef102-n3
- /1.0/storage-pools/ceph-vault/volumes/custom/aa1_f91d_disk_pod-890-1000-00-b?project=<company>-staging2&target=aa1-cptef102-n4
- /1.0/storage-pools/ceph-vault/volumes/custom/aa1_f91d_disk_pod-890-1000-00-c?project=<company>-staging2&target=aa1-cptef101-n1
- /1.0/storage-pools/ceph-vault/volumes/custom/aa1_f91d_disk_pod-890-1000-00-c?project=<company>-staging2&target=aa1-cptef101-n2
- /1.0/storage-pools/ceph-vault/volumes/custom/aa1_f91d_disk_pod-890-1000-00-c?project=<company>-staging2&target=aa1-cptef101-n3
- /1.0/storage-pools/ceph-vault/volumes/custom/aa1_f91d_disk_pod-890-1000-00-c?project=<company>-staging2&target=aa1-cptef101-n4
- /1.0/storage-pools/ceph-vault/volumes/custom/aa1_f91d_disk_pod-890-1000-00-c?project=<company>-staging2&target=aa1-cptef102-n1
- /1.0/storage-pools/ceph-vault/volumes/custom/aa1_f91d_disk_pod-890-1000-00-c?project=<company>-staging2&target=aa1-cptef102-n2
- /1.0/storage-pools/ceph-vault/volumes/custom/aa1_f91d_disk_pod-890-1000-00-c?project=<company>-staging2&target=aa1-cptef102-n3
- /1.0/storage-pools/ceph-vault/volumes/custom/aa1_f91d_disk_pod-890-1000-00-c?project=<company>-staging2&target=aa1-cptef102-n4
- /1.0/storage-pools/ceph-vault/volumes/custom/aa1_f91d_disk_pod-890-1000-00-d?project=<company>-staging2&target=aa1-cptef101-n1
- /1.0/storage-pools/ceph-vault/volumes/custom/aa1_f91d_disk_pod-890-1000-00-d?project=<company>-staging2&target=aa1-cptef101-n2
- /1.0/storage-pools/ceph-vault/volumes/custom/aa1_f91d_disk_pod-890-1000-00-d?project=<company>-staging2&target=aa1-cptef101-n3
- /1.0/storage-pools/ceph-vault/volumes/custom/aa1_f91d_disk_pod-890-1000-00-d?project=<company>-staging2&target=aa1-cptef101-n4
- /1.0/storage-pools/ceph-vault/volumes/custom/aa1_f91d_disk_pod-890-1000-00-d?project=<company>-staging2&target=aa1-cptef102-n1
- /1.0/storage-pools/ceph-vault/volumes/custom/aa1_f91d_disk_pod-890-1000-00-d?project=<company>-staging2&target=aa1-cptef102-n2
- /1.0/storage-pools/ceph-vault/volumes/custom/aa1_f91d_disk_pod-890-1000-00-d?project=<company>-staging2&target=aa1-cptef102-n3
- /1.0/storage-pools/ceph-vault/volumes/custom/aa1_f91d_disk_pod-890-1000-00-d?project=<company>-staging2&target=aa1-cptef102-n4
- /1.0/storage-pools/ceph-vault/volumes/custom/aa1_f91d_disk_pod-890-1000-01-a?project=<company>-staging2&target=aa1-cptef101-n1
- /1.0/storage-pools/ceph-vault/volumes/custom/aa1_f91d_disk_pod-890-1000-01-a?project=<company>-staging2&target=aa1-cptef101-n2
- /1.0/storage-pools/ceph-vault/volumes/custom/aa1_f91d_disk_pod-890-1000-01-a?project=<company>-staging2&target=aa1-cptef101-n3
- /1.0/storage-pools/ceph-vault/volumes/custom/aa1_f91d_disk_pod-890-1000-01-a?project=<company>-staging2&target=aa1-cptef101-n4
- /1.0/storage-pools/ceph-vault/volumes/custom/aa1_f91d_disk_pod-890-1000-01-a?project=<company>-staging2&target=aa1-cptef102-n1
- /1.0/storage-pools/ceph-vault/volumes/custom/aa1_f91d_disk_pod-890-1000-01-a?project=<company>-staging2&target=aa1-cptef102-n2
- /1.0/storage-pools/ceph-vault/volumes/custom/aa1_f91d_disk_pod-890-1000-01-a?project=<company>-staging2&target=aa1-cptef102-n3
- /1.0/storage-pools/ceph-vault/volumes/custom/aa1_f91d_disk_pod-890-1000-01-a?project=<company>-staging2&target=aa1-cptef102-n4
- /1.0/storage-pools/ceph-vault/volumes/custom/aa1_f91d_disk_pod-890-1000-01-b?project=<company>-staging2&target=aa1-cptef101-n1
- /1.0/storage-pools/ceph-vault/volumes/custom/aa1_f91d_disk_pod-890-1000-01-b?project=<company>-staging2&target=aa1-cptef101-n2
- /1.0/storage-pools/ceph-vault/volumes/custom/aa1_f91d_disk_pod-890-1000-01-b?project=<company>-staging2&target=aa1-cptef101-n3
- /1.0/storage-pools/ceph-vault/volumes/custom/aa1_f91d_disk_pod-890-1000-01-b?project=<company>-staging2&target=aa1-cptef101-n4
- /1.0/storage-pools/ceph-vault/volumes/custom/aa1_f91d_disk_pod-890-1000-01-b?project=<company>-staging2&target=aa1-cptef102-n1
- /1.0/storage-pools/ceph-vault/volumes/custom/aa1_f91d_disk_pod-890-1000-01-b?project=<company>-staging2&target=aa1-cptef102-n2
- /1.0/storage-pools/ceph-vault/volumes/custom/aa1_f91d_disk_pod-890-1000-01-b?project=<company>-staging2&target=aa1-cptef102-n3
- /1.0/storage-pools/ceph-vault/volumes/custom/aa1_f91d_disk_pod-890-1000-01-b?project=<company>-staging2&target=aa1-cptef102-n4
- /1.0/storage-pools/ceph-vault/volumes/custom/aa1_f91d_disk_pod-890-1000-01-c?project=<company>-staging2&target=aa1-cptef101-n1
- /1.0/storage-pools/ceph-vault/volumes/custom/aa1_f91d_disk_pod-890-1000-01-c?project=<company>-staging2&target=aa1-cptef101-n2
- /1.0/storage-pools/ceph-vault/volumes/custom/aa1_f91d_disk_pod-890-1000-01-c?project=<company>-staging2&target=aa1-cptef101-n3
- /1.0/storage-pools/ceph-vault/volumes/custom/aa1_f91d_disk_pod-890-1000-01-c?project=<company>-staging2&target=aa1-cptef101-n4
- /1.0/storage-pools/ceph-vault/volumes/custom/aa1_f91d_disk_pod-890-1000-01-c?project=<company>-staging2&target=aa1-cptef102-n1
- /1.0/storage-pools/ceph-vault/volumes/custom/aa1_f91d_disk_pod-890-1000-01-c?project=<company>-staging2&target=aa1-cptef102-n2
- /1.0/storage-pools/ceph-vault/volumes/custom/aa1_f91d_disk_pod-890-1000-01-c?project=<company>-staging2&target=aa1-cptef102-n3
- /1.0/storage-pools/ceph-vault/volumes/custom/aa1_f91d_disk_pod-890-1000-01-c?project=<company>-staging2&target=aa1-cptef102-n4
- /1.0/storage-pools/ceph-vault/volumes/custom/aa1_f91d_disk_pod-890-1000-01-d?project=<company>-staging2&target=aa1-cptef101-n1
- /1.0/storage-pools/ceph-vault/volumes/custom/aa1_f91d_disk_pod-890-1000-01-d?project=<company>-staging2&target=aa1-cptef101-n2
- /1.0/storage-pools/ceph-vault/volumes/custom/aa1_f91d_disk_pod-890-1000-01-d?project=<company>-staging2&target=aa1-cptef101-n3
- /1.0/storage-pools/ceph-vault/volumes/custom/aa1_f91d_disk_pod-890-1000-01-d?project=<company>-staging2&target=aa1-cptef101-n4
- /1.0/storage-pools/ceph-vault/volumes/custom/aa1_f91d_disk_pod-890-1000-01-d?project=<company>-staging2&target=aa1-cptef102-n1
- /1.0/storage-pools/ceph-vault/volumes/custom/aa1_f91d_disk_pod-890-1000-01-d?project=<company>-staging2&target=aa1-cptef102-n2
- /1.0/storage-pools/ceph-vault/volumes/custom/aa1_f91d_disk_pod-890-1000-01-d?project=<company>-staging2&target=aa1-cptef102-n3
- /1.0/storage-pools/ceph-vault/volumes/custom/aa1_f91d_disk_pod-890-1000-01-d?project=<company>-staging2&target=aa1-cptef102-n4
- /1.0/storage-pools/ceph-vault/volumes/custom/aa1_f91d_disk_pod-890-1000-02-a?project=<company>-staging2&target=aa1-cptef101-n1
- /1.0/storage-pools/ceph-vault/volumes/custom/aa1_f91d_disk_pod-890-1000-02-a?project=<company>-staging2&target=aa1-cptef101-n2
- /1.0/storage-pools/ceph-vault/volumes/custom/aa1_f91d_disk_pod-890-1000-02-a?project=<company>-staging2&target=aa1-cptef101-n3
- /1.0/storage-pools/ceph-vault/volumes/custom/aa1_f91d_disk_pod-890-1000-02-a?project=<company>-staging2&target=aa1-cptef101-n4
- /1.0/storage-pools/ceph-vault/volumes/custom/aa1_f91d_disk_pod-890-1000-02-a?project=<company>-staging2&target=aa1-cptef102-n1
- /1.0/storage-pools/ceph-vault/volumes/custom/aa1_f91d_disk_pod-890-1000-02-a?project=<company>-staging2&target=aa1-cptef102-n2
- /1.0/storage-pools/ceph-vault/volumes/custom/aa1_f91d_disk_pod-890-1000-02-a?project=<company>-staging2&target=aa1-cptef102-n3
- /1.0/storage-pools/ceph-vault/volumes/custom/aa1_f91d_disk_pod-890-1000-02-a?project=<company>-staging2&target=aa1-cptef102-n4
- /1.0/storage-pools/ceph-vault/volumes/custom/aa1_f91d_disk_pod-890-1000-02-b?project=<company>-staging2&target=aa1-cptef101-n1
- /1.0/storage-pools/ceph-vault/volumes/custom/aa1_f91d_disk_pod-890-1000-02-b?project=<company>-staging2&target=aa1-cptef101-n2
- /1.0/storage-pools/ceph-vault/volumes/custom/aa1_f91d_disk_pod-890-1000-02-b?project=<company>-staging2&target=aa1-cptef101-n3
- /1.0/storage-pools/ceph-vault/volumes/custom/aa1_f91d_disk_pod-890-1000-02-b?project=<company>-staging2&target=aa1-cptef101-n4
- /1.0/storage-pools/ceph-vault/volumes/custom/aa1_f91d_disk_pod-890-1000-02-b?project=<company>-staging2&target=aa1-cptef102-n1
- /1.0/storage-pools/ceph-vault/volumes/custom/aa1_f91d_disk_pod-890-1000-02-b?project=<company>-staging2&target=aa1-cptef102-n2
- /1.0/storage-pools/ceph-vault/volumes/custom/aa1_f91d_disk_pod-890-1000-02-b?project=<company>-staging2&target=aa1-cptef102-n3
- /1.0/storage-pools/ceph-vault/volumes/custom/aa1_f91d_disk_pod-890-1000-02-b?project=<company>-staging2&target=aa1-cptef102-n4
- /1.0/storage-pools/ceph-vault/volumes/custom/aa1_f91d_disk_pod-890-1000-02-c?project=<company>-staging2&target=aa1-cptef101-n1
- /1.0/storage-pools/ceph-vault/volumes/custom/aa1_f91d_disk_pod-890-1000-02-c?project=<company>-staging2&target=aa1-cptef101-n2
- /1.0/storage-pools/ceph-vault/volumes/custom/aa1_f91d_disk_pod-890-1000-02-c?project=<company>-staging2&target=aa1-cptef101-n3
- /1.0/storage-pools/ceph-vault/volumes/custom/aa1_f91d_disk_pod-890-1000-02-c?project=<company>-staging2&target=aa1-cptef101-n4
- /1.0/storage-pools/ceph-vault/volumes/custom/aa1_f91d_disk_pod-890-1000-02-c?project=<company>-staging2&target=aa1-cptef102-n1
- /1.0/storage-pools/ceph-vault/volumes/custom/aa1_f91d_disk_pod-890-1000-02-c?project=<company>-staging2&target=aa1-cptef102-n2
- /1.0/storage-pools/ceph-vault/volumes/custom/aa1_f91d_disk_pod-890-1000-02-c?project=<company>-staging2&target=aa1-cptef102-n3
- /1.0/storage-pools/ceph-vault/volumes/custom/aa1_f91d_disk_pod-890-1000-02-c?project=<company>-staging2&target=aa1-cptef102-n4
- /1.0/storage-pools/ceph-vault/volumes/custom/aa1_f91d_disk_pod-890-1000-02-d?project=<company>-staging2&target=aa1-cptef101-n1
- /1.0/storage-pools/ceph-vault/volumes/custom/aa1_f91d_disk_pod-890-1000-02-d?project=<company>-staging2&target=aa1-cptef101-n2
- /1.0/storage-pools/ceph-vault/volumes/custom/aa1_f91d_disk_pod-890-1000-02-d?project=<company>-staging2&target=aa1-cptef101-n3
- /1.0/storage-pools/ceph-vault/volumes/custom/aa1_f91d_disk_pod-890-1000-02-d?project=<company>-staging2&target=aa1-cptef101-n4
- /1.0/storage-pools/ceph-vault/volumes/custom/aa1_f91d_disk_pod-890-1000-02-d?project=<company>-staging2&target=aa1-cptef102-n1
- /1.0/storage-pools/ceph-vault/volumes/custom/aa1_f91d_disk_pod-890-1000-02-d?project=<company>-staging2&target=aa1-cptef102-n2
- /1.0/storage-pools/ceph-vault/volumes/custom/aa1_f91d_disk_pod-890-1000-02-d?project=<company>-staging2&target=aa1-cptef102-n3
- /1.0/storage-pools/ceph-vault/volumes/custom/aa1_f91d_disk_pod-890-1000-02-d?project=<company>-staging2&target=aa1-cptef102-n4
- /1.0/storage-pools/ceph-vault/volumes/custom/aa1_f91d_disk_pod-890-1000-03-a?project=<company>-staging2&target=aa1-cptef101-n1
- /1.0/storage-pools/ceph-vault/volumes/custom/aa1_f91d_disk_pod-890-1000-03-a?project=<company>-staging2&target=aa1-cptef101-n2
- /1.0/storage-pools/ceph-vault/volumes/custom/aa1_f91d_disk_pod-890-1000-03-a?project=<company>-staging2&target=aa1-cptef101-n3
- /1.0/storage-pools/ceph-vault/volumes/custom/aa1_f91d_disk_pod-890-1000-03-a?project=<company>-staging2&target=aa1-cptef101-n4
- /1.0/storage-pools/ceph-vault/volumes/custom/aa1_f91d_disk_pod-890-1000-03-a?project=<company>-staging2&target=aa1-cptef102-n1
- /1.0/storage-pools/ceph-vault/volumes/custom/aa1_f91d_disk_pod-890-1000-03-a?project=<company>-staging2&target=aa1-cptef102-n2
- /1.0/storage-pools/ceph-vault/volumes/custom/aa1_f91d_disk_pod-890-1000-03-a?project=<company>-staging2&target=aa1-cptef102-n3
- /1.0/storage-pools/ceph-vault/volumes/custom/aa1_f91d_disk_pod-890-1000-03-a?project=<company>-staging2&target=aa1-cptef102-n4
- /1.0/storage-pools/ceph-vault/volumes/custom/aa1_f91d_disk_pod-890-1000-03-b?project=<company>-staging2&target=aa1-cptef101-n1
- /1.0/storage-pools/ceph-vault/volumes/custom/aa1_f91d_disk_pod-890-1000-03-b?project=<company>-staging2&target=aa1-cptef101-n2
- /1.0/storage-pools/ceph-vault/volumes/custom/aa1_f91d_disk_pod-890-1000-03-b?project=<company>-staging2&target=aa1-cptef101-n3
- /1.0/storage-pools/ceph-vault/volumes/custom/aa1_f91d_disk_pod-890-1000-03-b?project=<company>-staging2&target=aa1-cptef101-n4
- /1.0/storage-pools/ceph-vault/volumes/custom/aa1_f91d_disk_pod-890-1000-03-b?project=<company>-staging2&target=aa1-cptef102-n1
- /1.0/storage-pools/ceph-vault/volumes/custom/aa1_f91d_disk_pod-890-1000-03-b?project=<company>-staging2&target=aa1-cptef102-n2
- /1.0/storage-pools/ceph-vault/volumes/custom/aa1_f91d_disk_pod-890-1000-03-b?project=<company>-staging2&target=aa1-cptef102-n3
- /1.0/storage-pools/ceph-vault/volumes/custom/aa1_f91d_disk_pod-890-1000-03-b?project=<company>-staging2&target=aa1-cptef102-n4
- /1.0/storage-pools/ceph-vault/volumes/custom/aa1_f91d_disk_pod-890-1000-03-c?project=<company>-staging2&target=aa1-cptef101-n1
- /1.0/storage-pools/ceph-vault/volumes/custom/aa1_f91d_disk_pod-890-1000-03-c?project=<company>-staging2&target=aa1-cptef101-n2
- /1.0/storage-pools/ceph-vault/volumes/custom/aa1_f91d_disk_pod-890-1000-03-c?project=<company>-staging2&target=aa1-cptef101-n3
- /1.0/storage-pools/ceph-vault/volumes/custom/aa1_f91d_disk_pod-890-1000-03-c?project=<company>-staging2&target=aa1-cptef101-n4
- /1.0/storage-pools/ceph-vault/volumes/custom/aa1_f91d_disk_pod-890-1000-03-c?project=<company>-staging2&target=aa1-cptef102-n1
- /1.0/storage-pools/ceph-vault/volumes/custom/aa1_f91d_disk_pod-890-1000-03-c?project=<company>-staging2&target=aa1-cptef102-n2
- /1.0/storage-pools/ceph-vault/volumes/custom/aa1_f91d_disk_pod-890-1000-03-c?project=<company>-staging2&target=aa1-cptef102-n3
- /1.0/storage-pools/ceph-vault/volumes/custom/aa1_f91d_disk_pod-890-1000-03-c?project=<company>-staging2&target=aa1-cptef102-n4
- /1.0/storage-pools/ceph-vault/volumes/custom/aa1_f91d_disk_pod-890-1000-03-d?project=<company>-staging2&target=aa1-cptef101-n1
- /1.0/storage-pools/ceph-vault/volumes/custom/aa1_f91d_disk_pod-890-1000-03-d?project=<company>-staging2&target=aa1-cptef101-n2
- /1.0/storage-pools/ceph-vault/volumes/custom/aa1_f91d_disk_pod-890-1000-03-d?project=<company>-staging2&target=aa1-cptef101-n3
- /1.0/storage-pools/ceph-vault/volumes/custom/aa1_f91d_disk_pod-890-1000-03-d?project=<company>-staging2&target=aa1-cptef101-n4
- /1.0/storage-pools/ceph-vault/volumes/custom/aa1_f91d_disk_pod-890-1000-03-d?project=<company>-staging2&target=aa1-cptef102-n1
- /1.0/storage-pools/ceph-vault/volumes/custom/aa1_f91d_disk_pod-890-1000-03-d?project=<company>-staging2&target=aa1-cptef102-n2
- /1.0/storage-pools/ceph-vault/volumes/custom/aa1_f91d_disk_pod-890-1000-03-d?project=<company>-staging2&target=aa1-cptef102-n3
- /1.0/storage-pools/ceph-vault/volumes/custom/aa1_f91d_disk_pod-890-1000-03-d?project=<company>-staging2&target=aa1-cptef102-n4
- /1.0/storage-pools/ceph-vault/volumes/custom/aa1_f91d_disk_pod-890-1000-04-a?project=<company>-staging2&target=aa1-cptef101-n1
- /1.0/storage-pools/ceph-vault/volumes/custom/aa1_f91d_disk_pod-890-1000-04-a?project=<company>-staging2&target=aa1-cptef101-n2
- /1.0/storage-pools/ceph-vault/volumes/custom/aa1_f91d_disk_pod-890-1000-04-a?project=<company>-staging2&target=aa1-cptef101-n3
- /1.0/storage-pools/ceph-vault/volumes/custom/aa1_f91d_disk_pod-890-1000-04-a?project=<company>-staging2&target=aa1-cptef101-n4
- /1.0/storage-pools/ceph-vault/volumes/custom/aa1_f91d_disk_pod-890-1000-04-a?project=<company>-staging2&target=aa1-cptef102-n1
- /1.0/storage-pools/ceph-vault/volumes/custom/aa1_f91d_disk_pod-890-1000-04-a?project=<company>-staging2&target=aa1-cptef102-n2
- /1.0/storage-pools/ceph-vault/volumes/custom/aa1_f91d_disk_pod-890-1000-04-a?project=<company>-staging2&target=aa1-cptef102-n3
- /1.0/storage-pools/ceph-vault/volumes/custom/aa1_f91d_disk_pod-890-1000-04-a?project=<company>-staging2&target=aa1-cptef102-n4
- /1.0/storage-pools/ceph-vault/volumes/custom/aa1_f91d_disk_pod-890-1000-04-b?project=<company>-staging2&target=aa1-cptef101-n1
- /1.0/storage-pools/ceph-vault/volumes/custom/aa1_f91d_disk_pod-890-1000-04-b?project=<company>-staging2&target=aa1-cptef101-n2
- /1.0/storage-pools/ceph-vault/volumes/custom/aa1_f91d_disk_pod-890-1000-04-b?project=<company>-staging2&target=aa1-cptef101-n3
- /1.0/storage-pools/ceph-vault/volumes/custom/aa1_f91d_disk_pod-890-1000-04-b?project=<company>-staging2&target=aa1-cptef101-n4
- /1.0/storage-pools/ceph-vault/volumes/custom/aa1_f91d_disk_pod-890-1000-04-b?project=<company>-staging2&target=aa1-cptef102-n1
- /1.0/storage-pools/ceph-vault/volumes/custom/aa1_f91d_disk_pod-890-1000-04-b?project=<company>-staging2&target=aa1-cptef102-n2
- /1.0/storage-pools/ceph-vault/volumes/custom/aa1_f91d_disk_pod-890-1000-04-b?project=<company>-staging2&target=aa1-cptef102-n3
- /1.0/storage-pools/ceph-vault/volumes/custom/aa1_f91d_disk_pod-890-1000-04-b?project=<company>-staging2&target=aa1-cptef102-n4
- /1.0/storage-pools/ceph-vault/volumes/custom/aa1_f91d_disk_pod-890-1000-04-c?project=<company>-staging2&target=aa1-cptef101-n1
- /1.0/storage-pools/ceph-vault/volumes/custom/aa1_f91d_disk_pod-890-1000-04-c?project=<company>-staging2&target=aa1-cptef101-n2
- /1.0/storage-pools/ceph-vault/volumes/custom/aa1_f91d_disk_pod-890-1000-04-c?project=<company>-staging2&target=aa1-cptef101-n3
- /1.0/storage-pools/ceph-vault/volumes/custom/aa1_f91d_disk_pod-890-1000-04-c?project=<company>-staging2&target=aa1-cptef101-n4
- /1.0/storage-pools/ceph-vault/volumes/custom/aa1_f91d_disk_pod-890-1000-04-c?project=<company>-staging2&target=aa1-cptef102-n1
- /1.0/storage-pools/ceph-vault/volumes/custom/aa1_f91d_disk_pod-890-1000-04-c?project=<company>-staging2&target=aa1-cptef102-n2
- /1.0/storage-pools/ceph-vault/volumes/custom/aa1_f91d_disk_pod-890-1000-04-c?project=<company>-staging2&target=aa1-cptef102-n3
- /1.0/storage-pools/ceph-vault/volumes/custom/aa1_f91d_disk_pod-890-1000-04-c?project=<company>-staging2&target=aa1-cptef102-n4
- /1.0/storage-pools/ceph-vault/volumes/custom/aa1_f91d_disk_pod-890-1000-04-d?project=<company>-staging2&target=aa1-cptef101-n1
- /1.0/storage-pools/ceph-vault/volumes/custom/aa1_f91d_disk_pod-890-1000-04-d?project=<company>-staging2&target=aa1-cptef101-n2
- /1.0/storage-pools/ceph-vault/volumes/custom/aa1_f91d_disk_pod-890-1000-04-d?project=<company>-staging2&target=aa1-cptef101-n3
- /1.0/storage-pools/ceph-vault/volumes/custom/aa1_f91d_disk_pod-890-1000-04-d?project=<company>-staging2&target=aa1-cptef101-n4
- /1.0/storage-pools/ceph-vault/volumes/custom/aa1_f91d_disk_pod-890-1000-04-d?project=<company>-staging2&target=aa1-cptef102-n1
- /1.0/storage-pools/ceph-vault/volumes/custom/aa1_f91d_disk_pod-890-1000-04-d?project=<company>-staging2&target=aa1-cptef102-n2
- /1.0/storage-pools/ceph-vault/volumes/custom/aa1_f91d_disk_pod-890-1000-04-d?project=<company>-staging2&target=aa1-cptef102-n3
- /1.0/storage-pools/ceph-vault/volumes/custom/aa1_f91d_disk_pod-890-1000-04-d?project=<company>-staging2&target=aa1-cptef102-n4
- /1.0/storage-pools/ceph-vault/volumes/custom/aa1_f91d_disk_pod-890-1000-05-a?project=<company>-staging2&target=aa1-cptef101-n1
- /1.0/storage-pools/ceph-vault/volumes/custom/aa1_f91d_disk_pod-890-1000-05-a?project=<company>-staging2&target=aa1-cptef101-n2
- /1.0/storage-pools/ceph-vault/volumes/custom/aa1_f91d_disk_pod-890-1000-05-a?project=<company>-staging2&target=aa1-cptef101-n3
- /1.0/storage-pools/ceph-vault/volumes/custom/aa1_f91d_disk_pod-890-1000-05-a?project=<company>-staging2&target=aa1-cptef101-n4
- /1.0/storage-pools/ceph-vault/volumes/custom/aa1_f91d_disk_pod-890-1000-05-a?project=<company>-staging2&target=aa1-cptef102-n1
- /1.0/storage-pools/ceph-vault/volumes/custom/aa1_f91d_disk_pod-890-1000-05-a?project=<company>-staging2&target=aa1-cptef102-n2
- /1.0/storage-pools/ceph-vault/volumes/custom/aa1_f91d_disk_pod-890-1000-05-a?project=<company>-staging2&target=aa1-cptef102-n3
- /1.0/storage-pools/ceph-vault/volumes/custom/aa1_f91d_disk_pod-890-1000-05-a?project=<company>-staging2&target=aa1-cptef102-n4
- /1.0/storage-pools/ceph-vault/volumes/custom/aa1_f91d_disk_pod-890-1000-05-b?project=<company>-staging2&target=aa1-cptef101-n1
- /1.0/storage-pools/ceph-vault/volumes/custom/aa1_f91d_disk_pod-890-1000-05-b?project=<company>-staging2&target=aa1-cptef101-n2
- /1.0/storage-pools/ceph-vault/volumes/custom/aa1_f91d_disk_pod-890-1000-05-b?project=<company>-staging2&target=aa1-cptef101-n3
- /1.0/storage-pools/ceph-vault/volumes/custom/aa1_f91d_disk_pod-890-1000-05-b?project=<company>-staging2&target=aa1-cptef101-n4
- /1.0/storage-pools/ceph-vault/volumes/custom/aa1_f91d_disk_pod-890-1000-05-b?project=<company>-staging2&target=aa1-cptef102-n1
- /1.0/storage-pools/ceph-vault/volumes/custom/aa1_f91d_disk_pod-890-1000-05-b?project=<company>-staging2&target=aa1-cptef102-n2
- /1.0/storage-pools/ceph-vault/volumes/custom/aa1_f91d_disk_pod-890-1000-05-b?project=<company>-staging2&target=aa1-cptef102-n3
- /1.0/storage-pools/ceph-vault/volumes/custom/aa1_f91d_disk_pod-890-1000-05-b?project=<company>-staging2&target=aa1-cptef102-n4
- /1.0/storage-pools/ceph-vault/volumes/custom/aa1_f91d_disk_pod-890-1000-05-c?project=<company>-staging2&target=aa1-cptef101-n1
- /1.0/storage-pools/ceph-vault/volumes/custom/aa1_f91d_disk_pod-890-1000-05-c?project=<company>-staging2&target=aa1-cptef101-n2
- /1.0/storage-pools/ceph-vault/volumes/custom/aa1_f91d_disk_pod-890-1000-05-c?project=<company>-staging2&target=aa1-cptef101-n3
- /1.0/storage-pools/ceph-vault/volumes/custom/aa1_f91d_disk_pod-890-1000-05-c?project=<company>-staging2&target=aa1-cptef101-n4
- /1.0/storage-pools/ceph-vault/volumes/custom/aa1_f91d_disk_pod-890-1000-05-c?project=<company>-staging2&target=aa1-cptef102-n1
- /1.0/storage-pools/ceph-vault/volumes/custom/aa1_f91d_disk_pod-890-1000-05-c?project=<company>-staging2&target=aa1-cptef102-n2
- /1.0/storage-pools/ceph-vault/volumes/custom/aa1_f91d_disk_pod-890-1000-05-c?project=<company>-staging2&target=aa1-cptef102-n3
- /1.0/storage-pools/ceph-vault/volumes/custom/aa1_f91d_disk_pod-890-1000-05-c?project=<company>-staging2&target=aa1-cptef102-n4
- /1.0/storage-pools/ceph-vault/volumes/custom/aa1_f91d_disk_pod-890-1000-05-d?project=<company>-staging2&target=aa1-cptef101-n1
- /1.0/storage-pools/ceph-vault/volumes/custom/aa1_f91d_disk_pod-890-1000-05-d?project=<company>-staging2&target=aa1-cptef101-n2
- /1.0/storage-pools/ceph-vault/volumes/custom/aa1_f91d_disk_pod-890-1000-05-d?project=<company>-staging2&target=aa1-cptef101-n3
- /1.0/storage-pools/ceph-vault/volumes/custom/aa1_f91d_disk_pod-890-1000-05-d?project=<company>-staging2&target=aa1-cptef101-n4
- /1.0/storage-pools/ceph-vault/volumes/custom/aa1_f91d_disk_pod-890-1000-05-d?project=<company>-staging2&target=aa1-cptef102-n1
- /1.0/storage-pools/ceph-vault/volumes/custom/aa1_f91d_disk_pod-890-1000-05-d?project=<company>-staging2&target=aa1-cptef102-n2
- /1.0/storage-pools/ceph-vault/volumes/custom/aa1_f91d_disk_pod-890-1000-05-d?project=<company>-staging2&target=aa1-cptef102-n3
- /1.0/storage-pools/ceph-vault/volumes/custom/aa1_f91d_disk_pod-890-1000-05-d?project=<company>-staging2&target=aa1-cptef102-n4
- /1.0/storage-pools/ceph-vault/volumes/custom/aa1_f91d_disk_pod-890-1001-00-a?project=<company>-staging2&target=aa1-cptef101-n1
- /1.0/storage-pools/ceph-vault/volumes/custom/aa1_f91d_disk_pod-890-1001-00-a?project=<company>-staging2&target=aa1-cptef101-n2
- /1.0/storage-pools/ceph-vault/volumes/custom/aa1_f91d_disk_pod-890-1001-00-a?project=<company>-staging2&target=aa1-cptef101-n3
- /1.0/storage-pools/ceph-vault/volumes/custom/aa1_f91d_disk_pod-890-1001-00-a?project=<company>-staging2&target=aa1-cptef101-n4
- /1.0/storage-pools/ceph-vault/volumes/custom/aa1_f91d_disk_pod-890-1001-00-a?project=<company>-staging2&target=aa1-cptef102-n1
- /1.0/storage-pools/ceph-vault/volumes/custom/aa1_f91d_disk_pod-890-1001-00-a?project=<company>-staging2&target=aa1-cptef102-n2
- /1.0/storage-pools/ceph-vault/volumes/custom/aa1_f91d_disk_pod-890-1001-00-a?project=<company>-staging2&target=aa1-cptef102-n3
- /1.0/storage-pools/ceph-vault/volumes/custom/aa1_f91d_disk_pod-890-1001-00-a?project=<company>-staging2&target=aa1-cptef102-n4
- /1.0/storage-pools/ceph-vault/volumes/custom/aa1_f91d_disk_pod-890-1001-00-b?project=<company>-staging2&target=aa1-cptef101-n1
- /1.0/storage-pools/ceph-vault/volumes/custom/aa1_f91d_disk_pod-890-1001-00-b?project=<company>-staging2&target=aa1-cptef101-n2
- /1.0/storage-pools/ceph-vault/volumes/custom/aa1_f91d_disk_pod-890-1001-00-b?project=<company>-staging2&target=aa1-cptef101-n3
- /1.0/storage-pools/ceph-vault/volumes/custom/aa1_f91d_disk_pod-890-1001-00-b?project=<company>-staging2&target=aa1-cptef101-n4
- /1.0/storage-pools/ceph-vault/volumes/custom/aa1_f91d_disk_pod-890-1001-00-b?project=<company>-staging2&target=aa1-cptef102-n1
- /1.0/storage-pools/ceph-vault/volumes/custom/aa1_f91d_disk_pod-890-1001-00-b?project=<company>-staging2&target=aa1-cptef102-n2
- /1.0/storage-pools/ceph-vault/volumes/custom/aa1_f91d_disk_pod-890-1001-00-b?project=<company>-staging2&target=aa1-cptef102-n3
- /1.0/storage-pools/ceph-vault/volumes/custom/aa1_f91d_disk_pod-890-1001-00-b?project=<company>-staging2&target=aa1-cptef102-n4
- /1.0/storage-pools/ceph-vault/volumes/custom/aa1_f91d_disk_pod-890-1001-00-c?project=<company>-staging2&target=aa1-cptef101-n1
- /1.0/storage-pools/ceph-vault/volumes/custom/aa1_f91d_disk_pod-890-1001-00-c?project=<company>-staging2&target=aa1-cptef101-n2
- /1.0/storage-pools/ceph-vault/volumes/custom/aa1_f91d_disk_pod-890-1001-00-c?project=<company>-staging2&target=aa1-cptef101-n3
- /1.0/storage-pools/ceph-vault/volumes/custom/aa1_f91d_disk_pod-890-1001-00-c?project=<company>-staging2&target=aa1-cptef101-n4
- /1.0/storage-pools/ceph-vault/volumes/custom/aa1_f91d_disk_pod-890-1001-00-c?project=<company>-staging2&target=aa1-cptef102-n1
- /1.0/storage-pools/ceph-vault/volumes/custom/aa1_f91d_disk_pod-890-1001-00-c?project=<company>-staging2&target=aa1-cptef102-n2
- /1.0/storage-pools/ceph-vault/volumes/custom/aa1_f91d_disk_pod-890-1001-00-c?project=<company>-staging2&target=aa1-cptef102-n3
- /1.0/storage-pools/ceph-vault/volumes/custom/aa1_f91d_disk_pod-890-1001-00-c?project=<company>-staging2&target=aa1-cptef102-n4
- /1.0/storage-pools/ceph-vault/volumes/custom/aa1_f91d_disk_pod-890-1001-00-d?project=<company>-staging2&target=aa1-cptef101-n1
- /1.0/storage-pools/ceph-vault/volumes/custom/aa1_f91d_disk_pod-890-1001-00-d?project=<company>-staging2&target=aa1-cptef101-n2
- /1.0/storage-pools/ceph-vault/volumes/custom/aa1_f91d_disk_pod-890-1001-00-d?project=<company>-staging2&target=aa1-cptef101-n3
- /1.0/storage-pools/ceph-vault/volumes/custom/aa1_f91d_disk_pod-890-1001-00-d?project=<company>-staging2&target=aa1-cptef101-n4
- /1.0/storage-pools/ceph-vault/volumes/custom/aa1_f91d_disk_pod-890-1001-00-d?project=<company>-staging2&target=aa1-cptef102-n1
- /1.0/storage-pools/ceph-vault/volumes/custom/aa1_f91d_disk_pod-890-1001-00-d?project=<company>-staging2&target=aa1-cptef102-n2
- /1.0/storage-pools/ceph-vault/volumes/custom/aa1_f91d_disk_pod-890-1001-00-d?project=<company>-staging2&target=aa1-cptef102-n3
- /1.0/storage-pools/ceph-vault/volumes/custom/aa1_f91d_disk_pod-890-1001-00-d?project=<company>-staging2&target=aa1-cptef102-n4
- /1.0/storage-pools/ceph-vault/volumes/custom/aa1_f91d_disk_pod-890-1001-01-a?project=<company>-staging2&target=aa1-cptef101-n1
- /1.0/storage-pools/ceph-vault/volumes/custom/aa1_f91d_disk_pod-890-1001-01-a?project=<company>-staging2&target=aa1-cptef101-n2
- /1.0/storage-pools/ceph-vault/volumes/custom/aa1_f91d_disk_pod-890-1001-01-a?project=<company>-staging2&target=aa1-cptef101-n3
- /1.0/storage-pools/ceph-vault/volumes/custom/aa1_f91d_disk_pod-890-1001-01-a?project=<company>-staging2&target=aa1-cptef101-n4
- /1.0/storage-pools/ceph-vault/volumes/custom/aa1_f91d_disk_pod-890-1001-01-a?project=<company>-staging2&target=aa1-cptef102-n1
- /1.0/storage-pools/ceph-vault/volumes/custom/aa1_f91d_disk_pod-890-1001-01-a?project=<company>-staging2&target=aa1-cptef102-n2
- /1.0/storage-pools/ceph-vault/volumes/custom/aa1_f91d_disk_pod-890-1001-01-a?project=<company>-staging2&target=aa1-cptef102-n3
- /1.0/storage-pools/ceph-vault/volumes/custom/aa1_f91d_disk_pod-890-1001-01-a?project=<company>-staging2&target=aa1-cptef102-n4
- /1.0/storage-pools/ceph-vault/volumes/custom/aa1_f91d_disk_pod-890-1001-01-b?project=<company>-staging2&target=aa1-cptef101-n1
- /1.0/storage-pools/ceph-vault/volumes/custom/aa1_f91d_disk_pod-890-1001-01-b?project=<company>-staging2&target=aa1-cptef101-n2
- /1.0/storage-pools/ceph-vault/volumes/custom/aa1_f91d_disk_pod-890-1001-01-b?project=<company>-staging2&target=aa1-cptef101-n3
- /1.0/storage-pools/ceph-vault/volumes/custom/aa1_f91d_disk_pod-890-1001-01-b?project=<company>-staging2&target=aa1-cptef101-n4
- /1.0/storage-pools/ceph-vault/volumes/custom/aa1_f91d_disk_pod-890-1001-01-b?project=<company>-staging2&target=aa1-cptef102-n1
- /1.0/storage-pools/ceph-vault/volumes/custom/aa1_f91d_disk_pod-890-1001-01-b?project=<company>-staging2&target=aa1-cptef102-n2
- /1.0/storage-pools/ceph-vault/volumes/custom/aa1_f91d_disk_pod-890-1001-01-b?project=<company>-staging2&target=aa1-cptef102-n3
- /1.0/storage-pools/ceph-vault/volumes/custom/aa1_f91d_disk_pod-890-1001-01-b?project=<company>-staging2&target=aa1-cptef102-n4
- /1.0/storage-pools/ceph-vault/volumes/custom/aa1_f91d_disk_pod-890-1001-01-c?project=<company>-staging2&target=aa1-cptef101-n1
- /1.0/storage-pools/ceph-vault/volumes/custom/aa1_f91d_disk_pod-890-1001-01-c?project=<company>-staging2&target=aa1-cptef101-n2
- /1.0/storage-pools/ceph-vault/volumes/custom/aa1_f91d_disk_pod-890-1001-01-c?project=<company>-staging2&target=aa1-cptef101-n3
- /1.0/storage-pools/ceph-vault/volumes/custom/aa1_f91d_disk_pod-890-1001-01-c?project=<company>-staging2&target=aa1-cptef101-n4
- /1.0/storage-pools/ceph-vault/volumes/custom/aa1_f91d_disk_pod-890-1001-01-c?project=<company>-staging2&target=aa1-cptef102-n1
- /1.0/storage-pools/ceph-vault/volumes/custom/aa1_f91d_disk_pod-890-1001-01-c?project=<company>-staging2&target=aa1-cptef102-n2
- /1.0/storage-pools/ceph-vault/volumes/custom/aa1_f91d_disk_pod-890-1001-01-c?project=<company>-staging2&target=aa1-cptef102-n3
- /1.0/storage-pools/ceph-vault/volumes/custom/aa1_f91d_disk_pod-890-1001-01-c?project=<company>-staging2&target=aa1-cptef102-n4
- /1.0/storage-pools/ceph-vault/volumes/custom/aa1_f91d_disk_pod-890-1001-01-d?project=<company>-staging2&target=aa1-cptef101-n1
- /1.0/storage-pools/ceph-vault/volumes/custom/aa1_f91d_disk_pod-890-1001-01-d?project=<company>-staging2&target=aa1-cptef101-n2
- /1.0/storage-pools/ceph-vault/volumes/custom/aa1_f91d_disk_pod-890-1001-01-d?project=<company>-staging2&target=aa1-cptef101-n3
- /1.0/storage-pools/ceph-vault/volumes/custom/aa1_f91d_disk_pod-890-1001-01-d?project=<company>-staging2&target=aa1-cptef101-n4
- /1.0/storage-pools/ceph-vault/volumes/custom/aa1_f91d_disk_pod-890-1001-01-d?project=<company>-staging2&target=aa1-cptef102-n1
- /1.0/storage-pools/ceph-vault/volumes/custom/aa1_f91d_disk_pod-890-1001-01-d?project=<company>-staging2&target=aa1-cptef102-n2
- /1.0/storage-pools/ceph-vault/volumes/custom/aa1_f91d_disk_pod-890-1001-01-d?project=<company>-staging2&target=aa1-cptef102-n3
- /1.0/storage-pools/ceph-vault/volumes/custom/aa1_f91d_disk_pod-890-1001-01-d?project=<company>-staging2&target=aa1-cptef102-n4
- /1.0/storage-pools/ceph-vault/volumes/custom/aa1_f91d_disk_pod-890-1001-02-a?project=<company>-staging2&target=aa1-cptef101-n1
- /1.0/storage-pools/ceph-vault/volumes/custom/aa1_f91d_disk_pod-890-1001-02-a?project=<company>-staging2&target=aa1-cptef101-n2
- /1.0/storage-pools/ceph-vault/volumes/custom/aa1_f91d_disk_pod-890-1001-02-a?project=<company>-staging2&target=aa1-cptef101-n3
- /1.0/storage-pools/ceph-vault/volumes/custom/aa1_f91d_disk_pod-890-1001-02-a?project=<company>-staging2&target=aa1-cptef101-n4
- /1.0/storage-pools/ceph-vault/volumes/custom/aa1_f91d_disk_pod-890-1001-02-a?project=<company>-staging2&target=aa1-cptef102-n1
- /1.0/storage-pools/ceph-vault/volumes/custom/aa1_f91d_disk_pod-890-1001-02-a?project=<company>-staging2&target=aa1-cptef102-n2
- /1.0/storage-pools/ceph-vault/volumes/custom/aa1_f91d_disk_pod-890-1001-02-a?project=<company>-staging2&target=aa1-cptef102-n3
- /1.0/storage-pools/ceph-vault/volumes/custom/aa1_f91d_disk_pod-890-1001-02-a?project=<company>-staging2&target=aa1-cptef102-n4
- /1.0/storage-pools/ceph-vault/volumes/custom/aa1_f91d_disk_pod-890-1001-02-b?project=<company>-staging2&target=aa1-cptef101-n1
- /1.0/storage-pools/ceph-vault/volumes/custom/aa1_f91d_disk_pod-890-1001-02-b?project=<company>-staging2&target=aa1-cptef101-n2
- /1.0/storage-pools/ceph-vault/volumes/custom/aa1_f91d_disk_pod-890-1001-02-b?project=<company>-staging2&target=aa1-cptef101-n3
- /1.0/storage-pools/ceph-vault/volumes/custom/aa1_f91d_disk_pod-890-1001-02-b?project=<company>-staging2&target=aa1-cptef101-n4
- /1.0/storage-pools/ceph-vault/volumes/custom/aa1_f91d_disk_pod-890-1001-02-b?project=<company>-staging2&target=aa1-cptef102-n1
- /1.0/storage-pools/ceph-vault/volumes/custom/aa1_f91d_disk_pod-890-1001-02-b?project=<company>-staging2&target=aa1-cptef102-n2
- /1.0/storage-pools/ceph-vault/volumes/custom/aa1_f91d_disk_pod-890-1001-02-b?project=<company>-staging2&target=aa1-cptef102-n3
- /1.0/storage-pools/ceph-vault/volumes/custom/aa1_f91d_disk_pod-890-1001-02-b?project=<company>-staging2&target=aa1-cptef102-n4
- /1.0/storage-pools/ceph-vault/volumes/custom/aa1_f91d_disk_pod-890-1001-02-c?project=<company>-staging2&target=aa1-cptef101-n1
- /1.0/storage-pools/ceph-vault/volumes/custom/aa1_f91d_disk_pod-890-1001-02-c?project=<company>-staging2&target=aa1-cptef101-n2
- /1.0/storage-pools/ceph-vault/volumes/custom/aa1_f91d_disk_pod-890-1001-02-c?project=<company>-staging2&target=aa1-cptef101-n3
- /1.0/storage-pools/ceph-vault/volumes/custom/aa1_f91d_disk_pod-890-1001-02-c?project=<company>-staging2&target=aa1-cptef101-n4
- /1.0/storage-pools/ceph-vault/volumes/custom/aa1_f91d_disk_pod-890-1001-02-c?project=<company>-staging2&target=aa1-cptef102-n1
- /1.0/storage-pools/ceph-vault/volumes/custom/aa1_f91d_disk_pod-890-1001-02-c?project=<company>-staging2&target=aa1-cptef102-n2
- /1.0/storage-pools/ceph-vault/volumes/custom/aa1_f91d_disk_pod-890-1001-02-c?project=<company>-staging2&target=aa1-cptef102-n3
- /1.0/storage-pools/ceph-vault/volumes/custom/aa1_f91d_disk_pod-890-1001-02-c?project=<company>-staging2&target=aa1-cptef102-n4
- /1.0/storage-pools/ceph-vault/volumes/custom/aa1_f91d_disk_pod-890-1001-02-d?project=<company>-staging2&target=aa1-cptef101-n1
- /1.0/storage-pools/ceph-vault/volumes/custom/aa1_f91d_disk_pod-890-1001-02-d?project=<company>-staging2&target=aa1-cptef101-n2
- /1.0/storage-pools/ceph-vault/volumes/custom/aa1_f91d_disk_pod-890-1001-02-d?project=<company>-staging2&target=aa1-cptef101-n3
- /1.0/storage-pools/ceph-vault/volumes/custom/aa1_f91d_disk_pod-890-1001-02-d?project=<company>-staging2&target=aa1-cptef101-n4
- /1.0/storage-pools/ceph-vault/volumes/custom/aa1_f91d_disk_pod-890-1001-02-d?project=<company>-staging2&target=aa1-cptef102-n1
- /1.0/storage-pools/ceph-vault/volumes/custom/aa1_f91d_disk_pod-890-1001-02-d?project=<company>-staging2&target=aa1-cptef102-n2
- /1.0/storage-pools/ceph-vault/volumes/custom/aa1_f91d_disk_pod-890-1001-02-d?project=<company>-staging2&target=aa1-cptef102-n3
- /1.0/storage-pools/ceph-vault/volumes/custom/aa1_f91d_disk_pod-890-1001-02-d?project=<company>-staging2&target=aa1-cptef102-n4
- /1.0/storage-pools/ceph-vault/volumes/custom/aa1_f91d_disk_pod-890-1001-03-a?project=<company>-staging2&target=aa1-cptef101-n1
- /1.0/storage-pools/ceph-vault/volumes/custom/aa1_f91d_disk_pod-890-1001-03-a?project=<company>-staging2&target=aa1-cptef101-n2
- /1.0/storage-pools/ceph-vault/volumes/custom/aa1_f91d_disk_pod-890-1001-03-a?project=<company>-staging2&target=aa1-cptef101-n3
- /1.0/storage-pools/ceph-vault/volumes/custom/aa1_f91d_disk_pod-890-1001-03-a?project=<company>-staging2&target=aa1-cptef101-n4
- /1.0/storage-pools/ceph-vault/volumes/custom/aa1_f91d_disk_pod-890-1001-03-a?project=<company>-staging2&target=aa1-cptef102-n1
- /1.0/storage-pools/ceph-vault/volumes/custom/aa1_f91d_disk_pod-890-1001-03-a?project=<company>-staging2&target=aa1-cptef102-n2
- /1.0/storage-pools/ceph-vault/volumes/custom/aa1_f91d_disk_pod-890-1001-03-a?project=<company>-staging2&target=aa1-cptef102-n3
- /1.0/storage-pools/ceph-vault/volumes/custom/aa1_f91d_disk_pod-890-1001-03-a?project=<company>-staging2&target=aa1-cptef102-n4
- /1.0/storage-pools/ceph-vault/volumes/custom/aa1_f91d_disk_pod-890-1001-03-b?project=<company>-staging2&target=aa1-cptef101-n1
- /1.0/storage-pools/ceph-vault/volumes/custom/aa1_f91d_disk_pod-890-1001-03-b?project=<company>-staging2&target=aa1-cptef101-n2
- /1.0/storage-pools/ceph-vault/volumes/custom/aa1_f91d_disk_pod-890-1001-03-b?project=<company>-staging2&target=aa1-cptef101-n3
- /1.0/storage-pools/ceph-vault/volumes/custom/aa1_f91d_disk_pod-890-1001-03-b?project=<company>-staging2&target=aa1-cptef101-n4
- /1.0/storage-pools/ceph-vault/volumes/custom/aa1_f91d_disk_pod-890-1001-03-b?project=<company>-staging2&target=aa1-cptef102-n1
- /1.0/storage-pools/ceph-vault/volumes/custom/aa1_f91d_disk_pod-890-1001-03-b?project=<company>-staging2&target=aa1-cptef102-n2
- /1.0/storage-pools/ceph-vault/volumes/custom/aa1_f91d_disk_pod-890-1001-03-b?project=<company>-staging2&target=aa1-cptef102-n3
- /1.0/storage-pools/ceph-vault/volumes/custom/aa1_f91d_disk_pod-890-1001-03-b?project=<company>-staging2&target=aa1-cptef102-n4
- /1.0/storage-pools/ceph-vault/volumes/custom/aa1_f91d_disk_pod-890-1001-03-c?project=<company>-staging2&target=aa1-cptef101-n1
- /1.0/storage-pools/ceph-vault/volumes/custom/aa1_f91d_disk_pod-890-1001-03-c?project=<company>-staging2&target=aa1-cptef101-n2
- /1.0/storage-pools/ceph-vault/volumes/custom/aa1_f91d_disk_pod-890-1001-03-c?project=<company>-staging2&target=aa1-cptef101-n3
- /1.0/storage-pools/ceph-vault/volumes/custom/aa1_f91d_disk_pod-890-1001-03-c?project=<company>-staging2&target=aa1-cptef101-n4
- /1.0/storage-pools/ceph-vault/volumes/custom/aa1_f91d_disk_pod-890-1001-03-c?project=<company>-staging2&target=aa1-cptef102-n1
- /1.0/storage-pools/ceph-vault/volumes/custom/aa1_f91d_disk_pod-890-1001-03-c?project=<company>-staging2&target=aa1-cptef102-n2
- /1.0/storage-pools/ceph-vault/volumes/custom/aa1_f91d_disk_pod-890-1001-03-c?project=<company>-staging2&target=aa1-cptef102-n3
- /1.0/storage-pools/ceph-vault/volumes/custom/aa1_f91d_disk_pod-890-1001-03-c?project=<company>-staging2&target=aa1-cptef102-n4
- /1.0/storage-pools/ceph-vault/volumes/custom/aa1_f91d_disk_pod-890-1001-03-d?project=<company>-staging2&target=aa1-cptef101-n1
- /1.0/storage-pools/ceph-vault/volumes/custom/aa1_f91d_disk_pod-890-1001-03-d?project=<company>-staging2&target=aa1-cptef101-n2
- /1.0/storage-pools/ceph-vault/volumes/custom/aa1_f91d_disk_pod-890-1001-03-d?project=<company>-staging2&target=aa1-cptef101-n3
- /1.0/storage-pools/ceph-vault/volumes/custom/aa1_f91d_disk_pod-890-1001-03-d?project=<company>-staging2&target=aa1-cptef101-n4
- /1.0/storage-pools/ceph-vault/volumes/custom/aa1_f91d_disk_pod-890-1001-03-d?project=<company>-staging2&target=aa1-cptef102-n1
- /1.0/storage-pools/ceph-vault/volumes/custom/aa1_f91d_disk_pod-890-1001-03-d?project=<company>-staging2&target=aa1-cptef102-n2
- /1.0/storage-pools/ceph-vault/volumes/custom/aa1_f91d_disk_pod-890-1001-03-d?project=<company>-staging2&target=aa1-cptef102-n3
- /1.0/storage-pools/ceph-vault/volumes/custom/aa1_f91d_disk_pod-890-1001-03-d?project=<company>-staging2&target=aa1-cptef102-n4
- /1.0/storage-pools/ceph-vault/volumes/custom/aa1_f91d_disk_pod-890-1001-04-a?project=<company>-staging2&target=aa1-cptef101-n1
- /1.0/storage-pools/ceph-vault/volumes/custom/aa1_f91d_disk_pod-890-1001-04-a?project=<company>-staging2&target=aa1-cptef101-n2
- /1.0/storage-pools/ceph-vault/volumes/custom/aa1_f91d_disk_pod-890-1001-04-a?project=<company>-staging2&target=aa1-cptef101-n3
- /1.0/storage-pools/ceph-vault/volumes/custom/aa1_f91d_disk_pod-890-1001-04-a?project=<company>-staging2&target=aa1-cptef101-n4
- /1.0/storage-pools/ceph-vault/volumes/custom/aa1_f91d_disk_pod-890-1001-04-a?project=<company>-staging2&target=aa1-cptef102-n1
- /1.0/storage-pools/ceph-vault/volumes/custom/aa1_f91d_disk_pod-890-1001-04-a?project=<company>-staging2&target=aa1-cptef102-n2
- /1.0/storage-pools/ceph-vault/volumes/custom/aa1_f91d_disk_pod-890-1001-04-a?project=<company>-staging2&target=aa1-cptef102-n3
- /1.0/storage-pools/ceph-vault/volumes/custom/aa1_f91d_disk_pod-890-1001-04-a?project=<company>-staging2&target=aa1-cptef102-n4
- /1.0/storage-pools/ceph-vault/volumes/custom/aa1_f91d_disk_pod-890-1001-04-b?project=<company>-staging2&target=aa1-cptef101-n1
- /1.0/storage-pools/ceph-vault/volumes/custom/aa1_f91d_disk_pod-890-1001-04-b?project=<company>-staging2&target=aa1-cptef101-n2
- /1.0/storage-pools/ceph-vault/volumes/custom/aa1_f91d_disk_pod-890-1001-04-b?project=<company>-staging2&target=aa1-cptef101-n3
- /1.0/storage-pools/ceph-vault/volumes/custom/aa1_f91d_disk_pod-890-1001-04-b?project=<company>-staging2&target=aa1-cptef101-n4
- /1.0/storage-pools/ceph-vault/volumes/custom/aa1_f91d_disk_pod-890-1001-04-b?project=<company>-staging2&target=aa1-cptef102-n1
- /1.0/storage-pools/ceph-vault/volumes/custom/aa1_f91d_disk_pod-890-1001-04-b?project=<company>-staging2&target=aa1-cptef102-n2
- /1.0/storage-pools/ceph-vault/volumes/custom/aa1_f91d_disk_pod-890-1001-04-b?project=<company>-staging2&target=aa1-cptef102-n3
- /1.0/storage-pools/ceph-vault/volumes/custom/aa1_f91d_disk_pod-890-1001-04-b?project=<company>-staging2&target=aa1-cptef102-n4
- /1.0/storage-pools/ceph-vault/volumes/custom/aa1_f91d_disk_pod-890-1001-04-c?project=<company>-staging2&target=aa1-cptef101-n1
- /1.0/storage-pools/ceph-vault/volumes/custom/aa1_f91d_disk_pod-890-1001-04-c?project=<company>-staging2&target=aa1-cptef101-n2
- /1.0/storage-pools/ceph-vault/volumes/custom/aa1_f91d_disk_pod-890-1001-04-c?project=<company>-staging2&target=aa1-cptef101-n3
- /1.0/storage-pools/ceph-vault/volumes/custom/aa1_f91d_disk_pod-890-1001-04-c?project=<company>-staging2&target=aa1-cptef101-n4
- /1.0/storage-pools/ceph-vault/volumes/custom/aa1_f91d_disk_pod-890-1001-04-c?project=<company>-staging2&target=aa1-cptef102-n1
- /1.0/storage-pools/ceph-vault/volumes/custom/aa1_f91d_disk_pod-890-1001-04-c?project=<company>-staging2&target=aa1-cptef102-n2
- /1.0/storage-pools/ceph-vault/volumes/custom/aa1_f91d_disk_pod-890-1001-04-c?project=<company>-staging2&target=aa1-cptef102-n3
- /1.0/storage-pools/ceph-vault/volumes/custom/aa1_f91d_disk_pod-890-1001-04-c?project=<company>-staging2&target=aa1-cptef102-n4
- /1.0/storage-pools/ceph-vault/volumes/custom/aa1_f91d_disk_pod-890-1001-04-d?project=<company>-staging2&target=aa1-cptef101-n1
- /1.0/storage-pools/ceph-vault/volumes/custom/aa1_f91d_disk_pod-890-1001-04-d?project=<company>-staging2&target=aa1-cptef101-n2
- /1.0/storage-pools/ceph-vault/volumes/custom/aa1_f91d_disk_pod-890-1001-04-d?project=<company>-staging2&target=aa1-cptef101-n3
- /1.0/storage-pools/ceph-vault/volumes/custom/aa1_f91d_disk_pod-890-1001-04-d?project=<company>-staging2&target=aa1-cptef101-n4
- /1.0/storage-pools/ceph-vault/volumes/custom/aa1_f91d_disk_pod-890-1001-04-d?project=<company>-staging2&target=aa1-cptef102-n1
- /1.0/storage-pools/ceph-vault/volumes/custom/aa1_f91d_disk_pod-890-1001-04-d?project=<company>-staging2&target=aa1-cptef102-n2
- /1.0/storage-pools/ceph-vault/volumes/custom/aa1_f91d_disk_pod-890-1001-04-d?project=<company>-staging2&target=aa1-cptef102-n3
- /1.0/storage-pools/ceph-vault/volumes/custom/aa1_f91d_disk_pod-890-1001-04-d?project=<company>-staging2&target=aa1-cptef102-n4
- /1.0/storage-pools/ceph-vault/volumes/custom/aa1_f91d_disk_pod-890-1001-05-a?project=<company>-staging2&target=aa1-cptef101-n1
- /1.0/storage-pools/ceph-vault/volumes/custom/aa1_f91d_disk_pod-890-1001-05-a?project=<company>-staging2&target=aa1-cptef101-n2
- /1.0/storage-pools/ceph-vault/volumes/custom/aa1_f91d_disk_pod-890-1001-05-a?project=<company>-staging2&target=aa1-cptef101-n3
- /1.0/storage-pools/ceph-vault/volumes/custom/aa1_f91d_disk_pod-890-1001-05-a?project=<company>-staging2&target=aa1-cptef101-n4
- /1.0/storage-pools/ceph-vault/volumes/custom/aa1_f91d_disk_pod-890-1001-05-a?project=<company>-staging2&target=aa1-cptef102-n1
- /1.0/storage-pools/ceph-vault/volumes/custom/aa1_f91d_disk_pod-890-1001-05-a?project=<company>-staging2&target=aa1-cptef102-n2
- /1.0/storage-pools/ceph-vault/volumes/custom/aa1_f91d_disk_pod-890-1001-05-a?project=<company>-staging2&target=aa1-cptef102-n3
- /1.0/storage-pools/ceph-vault/volumes/custom/aa1_f91d_disk_pod-890-1001-05-a?project=<company>-staging2&target=aa1-cptef102-n4
- /1.0/storage-pools/ceph-vault/volumes/custom/aa1_f91d_disk_pod-890-1001-05-b?project=<company>-staging2&target=aa1-cptef101-n1
- /1.0/storage-pools/ceph-vault/volumes/custom/aa1_f91d_disk_pod-890-1001-05-b?project=<company>-staging2&target=aa1-cptef101-n2
- /1.0/storage-pools/ceph-vault/volumes/custom/aa1_f91d_disk_pod-890-1001-05-b?project=<company>-staging2&target=aa1-cptef101-n3
- /1.0/storage-pools/ceph-vault/volumes/custom/aa1_f91d_disk_pod-890-1001-05-b?project=<company>-staging2&target=aa1-cptef101-n4
- /1.0/storage-pools/ceph-vault/volumes/custom/aa1_f91d_disk_pod-890-1001-05-b?project=<company>-staging2&target=aa1-cptef102-n1
- /1.0/storage-pools/ceph-vault/volumes/custom/aa1_f91d_disk_pod-890-1001-05-b?project=<company>-staging2&target=aa1-cptef102-n2
- /1.0/storage-pools/ceph-vault/volumes/custom/aa1_f91d_disk_pod-890-1001-05-b?project=<company>-staging2&target=aa1-cptef102-n3
- /1.0/storage-pools/ceph-vault/volumes/custom/aa1_f91d_disk_pod-890-1001-05-b?project=<company>-staging2&target=aa1-cptef102-n4
- /1.0/storage-pools/ceph-vault/volumes/custom/aa1_f91d_disk_pod-890-1001-05-c?project=<company>-staging2&target=aa1-cptef101-n1
- /1.0/storage-pools/ceph-vault/volumes/custom/aa1_f91d_disk_pod-890-1001-05-c?project=<company>-staging2&target=aa1-cptef101-n2
- /1.0/storage-pools/ceph-vault/volumes/custom/aa1_f91d_disk_pod-890-1001-05-c?project=<company>-staging2&target=aa1-cptef101-n3
- /1.0/storage-pools/ceph-vault/volumes/custom/aa1_f91d_disk_pod-890-1001-05-c?project=<company>-staging2&target=aa1-cptef101-n4
- /1.0/storage-pools/ceph-vault/volumes/custom/aa1_f91d_disk_pod-890-1001-05-c?project=<company>-staging2&target=aa1-cptef102-n1
- /1.0/storage-pools/ceph-vault/volumes/custom/aa1_f91d_disk_pod-890-1001-05-c?project=<company>-staging2&target=aa1-cptef102-n2
- /1.0/storage-pools/ceph-vault/volumes/custom/aa1_f91d_disk_pod-890-1001-05-c?project=<company>-staging2&target=aa1-cptef102-n3
- /1.0/storage-pools/ceph-vault/volumes/custom/aa1_f91d_disk_pod-890-1001-05-c?project=<company>-staging2&target=aa1-cptef102-n4
- /1.0/storage-pools/ceph-vault/volumes/custom/aa1_f91d_disk_pod-890-1001-05-d?project=<company>-staging2&target=aa1-cptef101-n1
- /1.0/storage-pools/ceph-vault/volumes/custom/aa1_f91d_disk_pod-890-1001-05-d?project=<company>-staging2&target=aa1-cptef101-n2
- /1.0/storage-pools/ceph-vault/volumes/custom/aa1_f91d_disk_pod-890-1001-05-d?project=<company>-staging2&target=aa1-cptef101-n3
- /1.0/storage-pools/ceph-vault/volumes/custom/aa1_f91d_disk_pod-890-1001-05-d?project=<company>-staging2&target=aa1-cptef101-n4
- /1.0/storage-pools/ceph-vault/volumes/custom/aa1_f91d_disk_pod-890-1001-05-d?project=<company>-staging2&target=aa1-cptef102-n1
- /1.0/storage-pools/ceph-vault/volumes/custom/aa1_f91d_disk_pod-890-1001-05-d?project=<company>-staging2&target=aa1-cptef102-n2
- /1.0/storage-pools/ceph-vault/volumes/custom/aa1_f91d_disk_pod-890-1001-05-d?project=<company>-staging2&target=aa1-cptef102-n3
- /1.0/storage-pools/ceph-vault/volumes/custom/aa1_f91d_disk_pod-890-1001-05-d?project=<company>-staging2&target=aa1-cptef102-n4
status: Created
locations:
- aa1-cptef101-n1
- aa1-cptef101-n2
- aa1-cptef101-n3
- aa1-cptef101-n4
- aa1-cptef102-n1
- aa1-cptef102-n2
- aa1-cptef102-n3
- aa1-cptef102-n4

Does /etc/ceph/ceph.conf exist on your system?

Never mind - the output above suggests it probably does, as that’s your cluster name.


Not having any luck reproducing this issue:


root@rpi-cluster01:~# lxc config device add maas-vm15 test disk source=ceph:lxd-virtualvaultfs-01/aa1_f91d_disk_pod-890-1000-00-a ceph.user_name=admin ceph.cluster_name=ceph
Device test added to maas-vm15
root@rpi-cluster01:~# lxc start maas-vm15
root@rpi-cluster01:~# 

The only scenario where I can get the exact same error you have is if I enter an invalid (non-existent) volume name.

All other scenarios (wrong pool name, wrong user name, wrong cluster name, etc.) all result in different errors.
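
One way to cross-check the exact image name from the ceph side (a sketch; --id selects the cephx user, here the lxd user from your pool config):

rbd ls --pool lxd-virtualvaultfs-01 --id lxd
rbd info lxd-virtualvaultfs-01/custom_<company>-staging2_aa1_f91d_disk_pod-890-1000-01-a --id lxd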

ubuntu@aa1-cptef101-n1:~$ lxc --project <company>-staging2 config device add pod-890-1000-01 aa1_f91d_disk_pod-890-1000-01-a disk source=ceph:lxd-virtualvaultfs-01/aa1_f91d_disk_pod-890-1000-01-a ceph.user_name=lxd ceph.cluster_name=ceph
Device aa1_f91d_disk_pod-890-1000-01-a added to pod-890-1000-01
ubuntu@aa1-cptef101-n1:~$ lxc start --project <company>-staging2 pod-890-1000-01
Error: Failed to run: /snap/lxd/current/bin/lxd forklimits limit=memlock:unlimited:unlimited fd=3 -- /snap/lxd/15724/bin/qemu-system-x86_64 -S -name pod-890-1000-01 -uuid f22cc6ac-cdc4-40db-93d6-49149439b7c6 -daemonize -cpu host -nographic -serial chardev:console -nodefaults -no-reboot -no-user-config -sandbox on,obsolete=deny,elevateprivileges=allow,spawn=deny,resourcecontrol=deny -readconfig /var/snap/lxd/common/lxd/logs/<company>-staging2_pod-890-1000-01/qemu.conf -pidfile /var/snap/lxd/common/lxd/logs/<company>-staging2_pod-890-1000-01/qemu.pid -D /var/snap/lxd/common/lxd/logs/<company>-staging2_pod-890-1000-01/qemu.log -chroot /var/snap/lxd/common/lxd/virtual-machines/<company>-staging2_pod-890-1000-01 -smbios type=2,manufacturer=Canonical Ltd.,product=LXD -runas lxd: : exit status 1
Try `lxc info --show-log pod-890-1000-01` for more info

Tried with a second VM assigned to a different host in the cluster and it still fails.

root@aa1-ceph-master-1:~# rbd ls --pool lxd-virtualvaultfs-01 | grep custom_<company>-staging2_aa1_f91d_disk_pod-890-1000-01-a
custom_<company>-staging2_aa1_f91d_disk_pod-890-1000-01-a