LXD 4.12 unable to start privileged container

Hello.
I recently updated my Arch system to the latest available version. I also updated the LXD snap package, and now I can't start my privileged container.
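
In case the exact versions matter, the installed snap revision can be checked, and reverting to the previous revision is one possible rollback (a sketch; I haven't verified that reverting fixes this):

$ snap list lxd          # shows installed revision and tracking channel
$ sudo snap revert lxd   # roll back to the previously installed revision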

I suspect this is related to the new warning that LXD reports every time:

WARNING: cgroup v2 is not fully supported yet, proceeding with partial confinement
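
From what I understand, this warning means the host booted with the unified cgroup v2 hierarchy, which LXD 4.12 only partially supports. A quick way to confirm which hierarchy is in use, plus a possible (untested here) workaround via a kernel parameter:

$ stat -fc %T /sys/fs/cgroup/
cgroup2fs    # "cgroup2fs" = unified v2; "tmpfs" = legacy/hybrid layout

# Possible workaround: boot back into the hybrid hierarchy by adding
# systemd.unified_cgroup_hierarchy=0 to the kernel command line.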

I have managed to start a restored container, but it loses its privileged setting, and I can't even upgrade it:

$ lxc config get xpraclient-arch security.privileged
WARNING: cgroup v2 is not fully supported yet, proceeding with partial confinement

As a result, running an update inside the container gives this error:

/bin/sh: line 1: /dev/null: Operation not permitted

From inside the container:

# ls -la /dev/null 
crw-rw-rw- 1 nobody nobody 1, 3 Apr  5 07:51 /dev/null
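
The nobody:nobody ownership usually means the files are owned by host uids/gids that fall outside the container's current idmap. The map LXD applied can be inspected via the volatile keys (a sketch; output format varies):

$ lxc config get xpraclient-arch volatile.idmap.current
$ lxc config get xpraclient-arch volatile.last_state.idmap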

If I enable security.privileged, the container becomes unusable.
At that point, restart doesn't work as expected: the container acts as if the command was never issued, and to complete the restart I have to shut it down from inside.
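
When a restart hangs like that, forcing the stop from the host is the usual fallback (assuming the container still responds to a hard kill):

$ lxc stop --force xpraclient-arch
$ lxc start xpraclient-arch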

sudo tail -f /var/snap/lxd/common/lxd/logs/lxd.log
t=2021-04-05T16:38:38+0300 lvl=dbug msg="GetInstanceUsage finished" driver=lvm instance=xpraclient-arch/210318-xpra-4.1.1-good pool=xpraclientTPool project=default
t=2021-04-05T16:38:38+0300 lvl=dbug msg="GetInstanceUsage started" driver=lvm instance=xpraclient-arch/210319-xpra-4.1.1-better pool=xpraclientTPool project=default
t=2021-04-05T16:38:38+0300 lvl=dbug msg="GetInstanceUsage finished" driver=lvm instance=xpraclient-arch/210319-xpra-4.1.1-better pool=xpraclientTPool project=default
t=2021-04-05T16:38:38+0300 lvl=dbug msg="GetInstanceUsage started" driver=lvm instance=xpraclient-arch/210320-export pool=xpraclientTPool project=default
t=2021-04-05T16:38:38+0300 lvl=dbug msg="GetInstanceUsage finished" driver=lvm instance=xpraclient-arch/210320-export pool=xpraclientTPool project=default
t=2021-04-05T16:38:38+0300 lvl=dbug msg="GetInstanceUsage started" driver=lvm instance=xpraclient-arch/210402-before-update pool=xpraclientTPool project=default
t=2021-04-05T16:38:38+0300 lvl=dbug msg="GetInstanceUsage finished" driver=lvm instance=xpraclient-arch/210402-before-update pool=xpraclientTPool project=default
t=2021-04-05T16:38:38+0300 lvl=dbug msg="GetInstanceUsage started" driver=lvm instance=xpraclient-arch/210405-broken pool=xpraclientTPool project=default
t=2021-04-05T16:38:38+0300 lvl=dbug msg="GetInstanceUsage finished" driver=lvm instance=xpraclient-arch/210405-broken pool=xpraclientTPool project=default
t=2021-04-05T16:38:38+0300 lvl=dbug msg=Handling ip=@ method=GET protocol=unix url=/1.0/instances/xpraclient-arch/logs/lxc.log username=colt
t=2021-04-05T16:41:19+0300 lvl=dbug msg=Handling ip=@ method=GET protocol=unix url=/1.0 username=colt
t=2021-04-05T16:41:19+0300 lvl=dbug msg=Handling ip=@ method=GET protocol=unix url=/1.0/events username=colt
t=2021-04-05T16:41:19+0300 lvl=dbug msg="New event listener: 66caf421-2345-4aab-b028-fe0c3682ba5f" 
t=2021-04-05T16:41:19+0300 lvl=dbug msg=Handling ip=@ method=PUT protocol=unix url=/1.0/instances/xpraclient-arch/state username=colt
t=2021-04-05T16:41:19+0300 lvl=dbug msg="\n\t{\n\t\t\"action\": \"restart\",\n\t\t\"timeout\": -1,\n\t\t\"force\": false,\n\t\t\"stateful\": false\n\t}" 
t=2021-04-05T16:41:19+0300 lvl=dbug msg="New task Operation: 3a98bd1b-dbe5-4b8d-936c-e7aacd8cdac0" 
t=2021-04-05T16:41:19+0300 lvl=dbug msg="Started task operation: 3a98bd1b-dbe5-4b8d-936c-e7aacd8cdac0" 
t=2021-04-05T16:41:19+0300 lvl=info msg="Restarting container" action=shutdown created=2021-03-16T16:26:40+0200 ephemeral=false instance=xpraclient-arch instanceType=container project=default timeout=-1ns used=2021-04-05T16:35:51+0300
t=2021-04-05T16:41:19+0300 lvl=dbug msg="\n\t{\n\t\t\"type\": \"async\",\n\t\t\"status\": \"Operation created\",\n\t\t\"status_code\": 100,\n\t\t\"operation\": \"/1.0/operations/3a98bd1b-dbe5-4b8d-936c-e7aacd8cdac0\",\n\t\t\"error_code\": 0,\n\t\t\"error\": \"\",\n\t\t\"metadata\": {\n\t\t\t\"id\": \"3a98bd1b-dbe5-4b8d-936c-e7aacd8cdac0\",\n\t\t\t\"class\": \"task\",\n\t\t\t\"description\": \"Restarting instance\",\n\t\t\t\"created_at\": \"2021-04-05T16:41:19.156326097+03:00\",\n\t\t\t\"updated_at\": \"2021-04-05T16:41:19.156326097+03:00\",\n\t\t\t\"status\": \"Running\",\n\t\t\t\"status_code\": 103,\n\t\t\t\"resources\": {\n\t\t\t\t\"instances\": [\n\t\t\t\t\t\"/1.0/instances/xpraclient-arch\"\n\t\t\t\t]\n\t\t\t},\n\t\t\t\"metadata\": null,\n\t\t\t\"may_cancel\": false,\n\t\t\t\"err\": \"\",\n\t\t\t\"location\": \"none\"\n\t\t}\n\t}" 
t=2021-04-05T16:41:19+0300 lvl=dbug msg=Handling ip=@ method=GET protocol=unix url=/1.0/operations/3a98bd1b-dbe5-4b8d-936c-e7aacd8cdac0 username=colt
t=2021-04-05T16:42:02+0300 lvl=dbug msg=Handling ip=@ method=GET protocol=unix url=/1.0 username=colt
t=2021-04-05T16:42:02+0300 lvl=dbug msg=Handling ip=@ method=GET protocol=unix url="/1.0/instances?recursion=2" username=colt
t=2021-04-05T16:42:02+0300 lvl=dbug msg="GetInstanceUsage started" driver=btrfs instance=m-tmp pool=messangers2 project=default
t=2021-04-05T16:42:02+0300 lvl=dbug msg="GetInstanceUsage started" driver=btrfs instance=messangers-tmp pool=messangers2 project=default
t=2021-04-05T16:42:02+0300 lvl=dbug msg="GetInstanceUsage finished" driver=btrfs instance=m-tmp pool=messangers2 project=default
t=2021-04-05T16:42:02+0300 lvl=dbug msg="GetInstanceUsage finished" driver=btrfs instance=messangers-tmp pool=messangers2 project=default
t=2021-04-05T16:42:02+0300 lvl=dbug msg="GetInstanceUsage started" driver=lvm instance=messangers pool=messangersTPool project=default
t=2021-04-05T16:42:02+0300 lvl=dbug msg="GetInstanceUsage started" driver=lvm instance=xpraclient-arch pool=xpraclientTPool project=default
t=2021-04-05T16:42:02+0300 lvl=dbug msg="GetInstanceUsage finished" driver=lvm instance=xpraclient-arch pool=xpraclientTPool project=default
t=2021-04-05T16:42:02+0300 lvl=dbug msg="GetInstanceUsage finished" driver=lvm instance=messangers pool=messangersTPool project=default
t=2021-04-05T16:42:38+0300 lvl=dbug msg=Handling ip=@ method=GET protocol=unix url=/1.0 username=colt
t=2021-04-05T16:42:38+0300 lvl=dbug msg=Handling ip=@ method=GET protocol=unix url="/1.0/instances?recursion=2" username=colt
t=2021-04-05T16:42:38+0300 lvl=dbug msg="GetInstanceUsage started" driver=btrfs instance=messangers-tmp pool=messangers2 project=default
t=2021-04-05T16:42:38+0300 lvl=dbug msg="GetInstanceUsage started" driver=btrfs instance=m-tmp pool=messangers2 project=default
t=2021-04-05T16:42:38+0300 lvl=dbug msg="GetInstanceUsage finished" driver=btrfs instance=m-tmp pool=messangers2 project=default
t=2021-04-05T16:42:38+0300 lvl=dbug msg="GetInstanceUsage finished" driver=btrfs instance=messangers-tmp pool=messangers2 project=default
t=2021-04-05T16:42:38+0300 lvl=dbug msg="GetInstanceUsage started" driver=lvm instance=messangers pool=messangersTPool project=default
t=2021-04-05T16:42:38+0300 lvl=dbug msg="GetInstanceUsage finished" driver=lvm instance=messangers pool=messangersTPool project=default
t=2021-04-05T16:42:38+0300 lvl=dbug msg="GetInstanceUsage started" driver=lvm instance=xpraclient-arch pool=xpraclientTPool project=default
t=2021-04-05T16:42:38+0300 lvl=dbug msg="GetInstanceUsage finished" driver=lvm instance=xpraclient-arch pool=xpraclientTPool project=default
t=2021-04-05T16:42:41+0300 lvl=dbug msg=Handling ip=@ method=GET protocol=unix url=/1.0 username=colt
t=2021-04-05T16:42:41+0300 lvl=dbug msg=Handling ip=@ method=GET protocol=unix url=/1.0/instances/xpraclient-arch username=colt
t=2021-04-05T16:42:41+0300 lvl=dbug msg=Handling ip=@ method=GET protocol=unix url=/1.0/instances/xpraclient-arch/state username=colt
t=2021-04-05T16:42:41+0300 lvl=dbug msg="GetInstanceUsage started" driver=lvm instance=xpraclient-arch pool=xpraclientTPool project=default
t=2021-04-05T16:42:41+0300 lvl=dbug msg="GetInstanceUsage finished" driver=lvm instance=xpraclient-arch pool=xpraclientTPool project=default
t=2021-04-05T16:42:41+0300 lvl=dbug msg=Handling ip=@ method=GET protocol=unix url="/1.0/instances/xpraclient-arch/snapshots?recursion=1" username=colt
t=2021-04-05T16:42:41+0300 lvl=dbug msg="GetInstanceUsage started" driver=lvm instance=xpraclient-arch/210318-xpra-4.0.9-ready pool=xpraclientTPool project=default
t=2021-04-05T16:42:41+0300 lvl=dbug msg="GetInstanceUsage finished" driver=lvm instance=xpraclient-arch/210318-xpra-4.0.9-ready pool=xpraclientTPool project=default
t=2021-04-05T16:42:41+0300 lvl=dbug msg="GetInstanceUsage started" driver=lvm instance=xpraclient-arch/210318-xpra-4.1.1-good pool=xpraclientTPool project=default
t=2021-04-05T16:42:41+0300 lvl=dbug msg="GetInstanceUsage finished" driver=lvm instance=xpraclient-arch/210318-xpra-4.1.1-good pool=xpraclientTPool project=default
t=2021-04-05T16:42:41+0300 lvl=dbug msg="GetInstanceUsage started" driver=lvm instance=xpraclient-arch/210319-xpra-4.1.1-better pool=xpraclientTPool project=default
t=2021-04-05T16:42:41+0300 lvl=dbug msg="GetInstanceUsage finished" driver=lvm instance=xpraclient-arch/210319-xpra-4.1.1-better pool=xpraclientTPool project=default
t=2021-04-05T16:42:41+0300 lvl=dbug msg="GetInstanceUsage started" driver=lvm instance=xpraclient-arch/210320-export pool=xpraclientTPool project=default
t=2021-04-05T16:42:41+0300 lvl=dbug msg="GetInstanceUsage finished" driver=lvm instance=xpraclient-arch/210320-export pool=xpraclientTPool project=default
t=2021-04-05T16:42:41+0300 lvl=dbug msg="GetInstanceUsage started" driver=lvm instance=xpraclient-arch/210402-before-update pool=xpraclientTPool project=default
t=2021-04-05T16:42:41+0300 lvl=dbug msg="GetInstanceUsage finished" driver=lvm instance=xpraclient-arch/210402-before-update pool=xpraclientTPool project=default
t=2021-04-05T16:42:41+0300 lvl=dbug msg="GetInstanceUsage started" driver=lvm instance=xpraclient-arch/210405-broken pool=xpraclientTPool project=default
t=2021-04-05T16:42:41+0300 lvl=dbug msg="GetInstanceUsage finished" driver=lvm instance=xpraclient-arch/210405-broken pool=xpraclientTPool project=default
t=2021-04-05T16:42:41+0300 lvl=dbug msg=Handling ip=@ method=GET protocol=unix url=/1.0/instances/xpraclient-arch/logs/lxc.log username=colt
t=2021-04-05T16:42:54+0300 lvl=dbug msg=Handling ip=@ method=GET protocol=unix url=/1.0 username=colt
t=2021-04-05T16:42:54+0300 lvl=dbug msg=Handling ip=@ method=GET protocol=unix url="/1.0/instances?recursion=2" username=colt
t=2021-04-05T16:42:54+0300 lvl=dbug msg="GetInstanceUsage started" driver=btrfs instance=messangers-tmp pool=messangers2 project=default
t=2021-04-05T16:42:54+0300 lvl=dbug msg="GetInstanceUsage started" driver=btrfs instance=m-tmp pool=messangers2 project=default
t=2021-04-05T16:42:54+0300 lvl=dbug msg="GetInstanceUsage finished" driver=btrfs instance=messangers-tmp pool=messangers2 project=default
t=2021-04-05T16:42:54+0300 lvl=dbug msg="GetInstanceUsage finished" driver=btrfs instance=m-tmp pool=messangers2 project=default
t=2021-04-05T16:42:54+0300 lvl=dbug msg="GetInstanceUsage started" driver=lvm instance=xpraclient-arch pool=xpraclientTPool project=default
t=2021-04-05T16:42:54+0300 lvl=dbug msg="GetInstanceUsage finished" driver=lvm instance=xpraclient-arch pool=xpraclientTPool project=default
t=2021-04-05T16:42:54+0300 lvl=dbug msg="GetInstanceUsage started" driver=lvm instance=messangers pool=messangersTPool project=default
t=2021-04-05T16:42:54+0300 lvl=dbug msg="GetInstanceUsage finished" driver=lvm instance=messangers pool=messangersTPool project=default
t=2021-04-05T16:42:56+0300 lvl=dbug msg=Handling ip=@ method=GET protocol=unix url=/1.0 username=colt
t=2021-04-05T16:42:56+0300 lvl=dbug msg=Handling ip=@ method=GET protocol=unix url=/1.0/instances/xpraclient-arch username=colt
t=2021-04-05T16:42:56+0300 lvl=dbug msg=Handling ip=@ method=GET protocol=unix url=/1.0/instances/xpraclient-arch/state username=colt
t=2021-04-05T16:42:56+0300 lvl=dbug msg="GetInstanceUsage started" driver=lvm instance=xpraclient-arch pool=xpraclientTPool project=default
t=2021-04-05T16:42:56+0300 lvl=dbug msg="GetInstanceUsage finished" driver=lvm instance=xpraclient-arch pool=xpraclientTPool project=default
t=2021-04-05T16:42:56+0300 lvl=dbug msg=Handling ip=@ method=GET protocol=unix url="/1.0/instances/xpraclient-arch/snapshots?recursion=1" username=colt
t=2021-04-05T16:42:56+0300 lvl=dbug msg="GetInstanceUsage started" driver=lvm instance=xpraclient-arch/210318-xpra-4.0.9-ready pool=xpraclientTPool project=default
t=2021-04-05T16:42:56+0300 lvl=dbug msg="GetInstanceUsage finished" driver=lvm instance=xpraclient-arch/210318-xpra-4.0.9-ready pool=xpraclientTPool project=default
t=2021-04-05T16:42:56+0300 lvl=dbug msg="GetInstanceUsage started" driver=lvm instance=xpraclient-arch/210318-xpra-4.1.1-good pool=xpraclientTPool project=default
t=2021-04-05T16:42:56+0300 lvl=dbug msg="GetInstanceUsage finished" driver=lvm instance=xpraclient-arch/210318-xpra-4.1.1-good pool=xpraclientTPool project=default
t=2021-04-05T16:42:56+0300 lvl=dbug msg="GetInstanceUsage started" driver=lvm instance=xpraclient-arch/210319-xpra-4.1.1-better pool=xpraclientTPool project=default
t=2021-04-05T16:42:56+0300 lvl=dbug msg="GetInstanceUsage finished" driver=lvm instance=xpraclient-arch/210319-xpra-4.1.1-better pool=xpraclientTPool project=default
t=2021-04-05T16:42:56+0300 lvl=dbug msg="GetInstanceUsage started" driver=lvm instance=xpraclient-arch/210320-export pool=xpraclientTPool project=default
t=2021-04-05T16:42:56+0300 lvl=dbug msg="GetInstanceUsage finished" driver=lvm instance=xpraclient-arch/210320-export pool=xpraclientTPool project=default
t=2021-04-05T16:42:56+0300 lvl=dbug msg="GetInstanceUsage started" driver=lvm instance=xpraclient-arch/210402-before-update pool=xpraclientTPool project=default
t=2021-04-05T16:42:56+0300 lvl=dbug msg="GetInstanceUsage finished" driver=lvm instance=xpraclient-arch/210402-before-update pool=xpraclientTPool project=default
t=2021-04-05T16:42:56+0300 lvl=dbug msg="GetInstanceUsage started" driver=lvm instance=xpraclient-arch/210405-broken pool=xpraclientTPool project=default
t=2021-04-05T16:42:56+0300 lvl=dbug msg="GetInstanceUsage finished" driver=lvm instance=xpraclient-arch/210405-broken pool=xpraclientTPool project=default
t=2021-04-05T16:42:56+0300 lvl=dbug msg=Handling ip=@ method=GET protocol=unix url=/1.0/instances/xpraclient-arch/logs/lxc.log username=colt
t=2021-04-05T16:43:57+0300 lvl=dbug msg=Handling ip=@ method=GET protocol=unix url=/1.0 username=colt
t=2021-04-05T16:43:57+0300 lvl=dbug msg=Handling ip=@ method=GET protocol=unix url=/1.0/instances/xpraclient-arch username=colt
t=2021-04-05T16:43:57+0300 lvl=dbug msg=Handling ip=@ method=GET protocol=unix url=/1.0/instances/xpraclient-arch/state username=colt
t=2021-04-05T16:43:57+0300 lvl=dbug msg="GetInstanceUsage started" driver=lvm instance=xpraclient-arch pool=xpraclientTPool project=default
t=2021-04-05T16:43:57+0300 lvl=dbug msg="GetInstanceUsage finished" driver=lvm instance=xpraclient-arch pool=xpraclientTPool project=default
t=2021-04-05T16:43:57+0300 lvl=dbug msg=Handling ip=@ method=GET protocol=unix url="/1.0/instances/xpraclient-arch/snapshots?recursion=1" username=colt
t=2021-04-05T16:43:57+0300 lvl=dbug msg="GetInstanceUsage started" driver=lvm instance=xpraclient-arch/210318-xpra-4.0.9-ready pool=xpraclientTPool project=default
t=2021-04-05T16:43:57+0300 lvl=dbug msg="GetInstanceUsage finished" driver=lvm instance=xpraclient-arch/210318-xpra-4.0.9-ready pool=xpraclientTPool project=default
t=2021-04-05T16:43:57+0300 lvl=dbug msg="GetInstanceUsage started" driver=lvm instance=xpraclient-arch/210318-xpra-4.1.1-good pool=xpraclientTPool project=default
t=2021-04-05T16:43:57+0300 lvl=dbug msg="GetInstanceUsage finished" driver=lvm instance=xpraclient-arch/210318-xpra-4.1.1-good pool=xpraclientTPool project=default
t=2021-04-05T16:43:57+0300 lvl=dbug msg="GetInstanceUsage started" driver=lvm instance=xpraclient-arch/210319-xpra-4.1.1-better pool=xpraclientTPool project=default
t=2021-04-05T16:43:57+0300 lvl=dbug msg="GetInstanceUsage finished" driver=lvm instance=xpraclient-arch/210319-xpra-4.1.1-better pool=xpraclientTPool project=default
t=2021-04-05T16:43:57+0300 lvl=dbug msg="GetInstanceUsage started" driver=lvm instance=xpraclient-arch/210320-export pool=xpraclientTPool project=default
t=2021-04-05T16:43:57+0300 lvl=dbug msg="GetInstanceUsage finished" driver=lvm instance=xpraclient-arch/210320-export pool=xpraclientTPool project=default
t=2021-04-05T16:43:57+0300 lvl=dbug msg="GetInstanceUsage started" driver=lvm instance=xpraclient-arch/210402-before-update pool=xpraclientTPool project=default
t=2021-04-05T16:43:57+0300 lvl=dbug msg="GetInstanceUsage finished" driver=lvm instance=xpraclient-arch/210402-before-update pool=xpraclientTPool project=default
t=2021-04-05T16:43:57+0300 lvl=dbug msg="GetInstanceUsage started" driver=lvm instance=xpraclient-arch/210405-broken pool=xpraclientTPool project=default
t=2021-04-05T16:43:57+0300 lvl=dbug msg="GetInstanceUsage finished" driver=lvm instance=xpraclient-arch/210405-broken pool=xpraclientTPool project=default
t=2021-04-05T16:43:57+0300 lvl=dbug msg=Handling ip=@ method=GET protocol=unix url=/1.0/instances/xpraclient-arch/logs/lxc.log username=colt
t=2021-04-05T16:44:39+0300 lvl=dbug msg=Handling ip=@ method=GET protocol=unix url=/1.0 username=colt
t=2021-04-05T16:44:39+0300 lvl=dbug msg=Handling ip=@ method=GET protocol=unix url=/1.0/instances/xpraclient-arch username=colt
t=2021-04-05T16:44:39+0300 lvl=dbug msg=Handling ip=@ method=GET protocol=unix url=/1.0/instances/xpraclient-arch/state username=colt
t=2021-04-05T16:44:39+0300 lvl=dbug msg="GetInstanceUsage started" driver=lvm instance=xpraclient-arch pool=xpraclientTPool project=default
t=2021-04-05T16:44:39+0300 lvl=dbug msg="GetInstanceUsage finished" driver=lvm instance=xpraclient-arch pool=xpraclientTPool project=default
t=2021-04-05T16:44:39+0300 lvl=dbug msg=Handling ip=@ method=GET protocol=unix url="/1.0/instances/xpraclient-arch/snapshots?recursion=1" username=colt
t=2021-04-05T16:44:39+0300 lvl=dbug msg="GetInstanceUsage started" driver=lvm instance=xpraclient-arch/210318-xpra-4.0.9-ready pool=xpraclientTPool project=default
t=2021-04-05T16:44:39+0300 lvl=dbug msg="GetInstanceUsage finished" driver=lvm instance=xpraclient-arch/210318-xpra-4.0.9-ready pool=xpraclientTPool project=default
t=2021-04-05T16:44:39+0300 lvl=dbug msg="GetInstanceUsage started" driver=lvm instance=xpraclient-arch/210318-xpra-4.1.1-good pool=xpraclientTPool project=default
t=2021-04-05T16:44:39+0300 lvl=dbug msg="GetInstanceUsage finished" driver=lvm instance=xpraclient-arch/210318-xpra-4.1.1-good pool=xpraclientTPool project=default
t=2021-04-05T16:44:39+0300 lvl=dbug msg="GetInstanceUsage started" driver=lvm instance=xpraclient-arch/210319-xpra-4.1.1-better pool=xpraclientTPool project=default
t=2021-04-05T16:44:39+0300 lvl=dbug msg="GetInstanceUsage finished" driver=lvm instance=xpraclient-arch/210319-xpra-4.1.1-better pool=xpraclientTPool project=default
t=2021-04-05T16:44:39+0300 lvl=dbug msg="GetInstanceUsage started" driver=lvm instance=xpraclient-arch/210320-export pool=xpraclientTPool project=default
t=2021-04-05T16:44:39+0300 lvl=dbug msg="GetInstanceUsage finished" driver=lvm instance=xpraclient-arch/210320-export pool=xpraclientTPool project=default
t=2021-04-05T16:44:39+0300 lvl=dbug msg="GetInstanceUsage started" driver=lvm instance=xpraclient-arch/210402-before-update pool=xpraclientTPool project=default
t=2021-04-05T16:44:39+0300 lvl=dbug msg="GetInstanceUsage finished" driver=lvm instance=xpraclient-arch/210402-before-update pool=xpraclientTPool project=default
t=2021-04-05T16:44:39+0300 lvl=dbug msg="GetInstanceUsage started" driver=lvm instance=xpraclient-arch/210405-broken pool=xpraclientTPool project=default
t=2021-04-05T16:44:39+0300 lvl=dbug msg="GetInstanceUsage finished" driver=lvm instance=xpraclient-arch/210405-broken pool=xpraclientTPool project=default
t=2021-04-05T16:44:39+0300 lvl=dbug msg=Handling ip=@ method=GET protocol=unix url=/1.0/instances/xpraclient-arch/logs/lxc.log username=colt
t=2021-04-05T16:44:40+0300 lvl=dbug msg=Handling ip=@ method=GET protocol=unix url=/1.0 username=colt
t=2021-04-05T16:44:40+0300 lvl=dbug msg=Handling ip=@ method=GET protocol=unix url=/1.0/instances/xpraclient-arch username=colt
t=2021-04-05T16:44:40+0300 lvl=dbug msg=Handling ip=@ method=GET protocol=unix url=/1.0/instances/xpraclient-arch/state username=colt
t=2021-04-05T16:44:40+0300 lvl=dbug msg="GetInstanceUsage started" driver=lvm instance=xpraclient-arch pool=xpraclientTPool project=default
t=2021-04-05T16:44:40+0300 lvl=dbug msg="GetInstanceUsage finished" driver=lvm instance=xpraclient-arch pool=xpraclientTPool project=default
t=2021-04-05T16:44:40+0300 lvl=dbug msg=Handling ip=@ method=GET protocol=unix url="/1.0/instances/xpraclient-arch/snapshots?recursion=1" username=colt
t=2021-04-05T16:44:40+0300 lvl=dbug msg="GetInstanceUsage started" driver=lvm instance=xpraclient-arch/210318-xpra-4.0.9-ready pool=xpraclientTPool project=default
t=2021-04-05T16:44:40+0300 lvl=dbug msg="GetInstanceUsage finished" driver=lvm instance=xpraclient-arch/210318-xpra-4.0.9-ready pool=xpraclientTPool project=default
t=2021-04-05T16:44:40+0300 lvl=dbug msg="GetInstanceUsage started" driver=lvm instance=xpraclient-arch/210318-xpra-4.1.1-good pool=xpraclientTPool project=default
t=2021-04-05T16:44:40+0300 lvl=dbug msg="GetInstanceUsage finished" driver=lvm instance=xpraclient-arch/210318-xpra-4.1.1-good pool=xpraclientTPool project=default
t=2021-04-05T16:44:40+0300 lvl=dbug msg="GetInstanceUsage started" driver=lvm instance=xpraclient-arch/210319-xpra-4.1.1-better pool=xpraclientTPool project=default
t=2021-04-05T16:44:40+0300 lvl=dbug msg="GetInstanceUsage finished" driver=lvm instance=xpraclient-arch/210319-xpra-4.1.1-better pool=xpraclientTPool project=default
t=2021-04-05T16:44:40+0300 lvl=dbug msg="GetInstanceUsage started" driver=lvm instance=xpraclient-arch/210320-export pool=xpraclientTPool project=default
t=2021-04-05T16:44:40+0300 lvl=dbug msg="GetInstanceUsage finished" driver=lvm instance=xpraclient-arch/210320-export pool=xpraclientTPool project=default
t=2021-04-05T16:44:40+0300 lvl=dbug msg="GetInstanceUsage started" driver=lvm instance=xpraclient-arch/210402-before-update pool=xpraclientTPool project=default
t=2021-04-05T16:44:40+0300 lvl=dbug msg="GetInstanceUsage finished" driver=lvm instance=xpraclient-arch/210402-before-update pool=xpraclientTPool project=default
t=2021-04-05T16:44:40+0300 lvl=dbug msg="GetInstanceUsage started" driver=lvm instance=xpraclient-arch/210405-broken pool=xpraclientTPool project=default
t=2021-04-05T16:44:40+0300 lvl=dbug msg="GetInstanceUsage finished" driver=lvm instance=xpraclient-arch/210405-broken pool=xpraclientTPool project=default
t=2021-04-05T16:44:40+0300 lvl=dbug msg=Handling ip=@ method=GET protocol=unix url=/1.0/instances/xpraclient-arch/logs/lxc.log username=colt
t=2021-04-05T16:44:41+0300 lvl=dbug msg=Handling ip=@ method=GET protocol=unix url=/1.0 username=colt
t=2021-04-05T16:44:41+0300 lvl=dbug msg=Handling ip=@ method=GET protocol=unix url=/1.0/instances/xpraclient-arch username=colt
t=2021-04-05T16:44:42+0300 lvl=dbug msg=Handling ip=@ method=GET protocol=unix url=/1.0/instances/xpraclient-arch/state username=colt
t=2021-04-05T16:44:42+0300 lvl=dbug msg="GetInstanceUsage started" driver=lvm instance=xpraclient-arch pool=xpraclientTPool project=default
t=2021-04-05T16:44:42+0300 lvl=dbug msg="GetInstanceUsage finished" driver=lvm instance=xpraclient-arch pool=xpraclientTPool project=default
t=2021-04-05T16:44:42+0300 lvl=dbug msg=Handling ip=@ method=GET protocol=unix url="/1.0/instances/xpraclient-arch/snapshots?recursion=1" username=colt
t=2021-04-05T16:44:42+0300 lvl=dbug msg="GetInstanceUsage started" driver=lvm instance=xpraclient-arch/210318-xpra-4.0.9-ready pool=xpraclientTPool project=default
t=2021-04-05T16:44:42+0300 lvl=dbug msg="GetInstanceUsage finished" driver=lvm instance=xpraclient-arch/210318-xpra-4.0.9-ready pool=xpraclientTPool project=default
t=2021-04-05T16:44:42+0300 lvl=dbug msg="GetInstanceUsage started" driver=lvm instance=xpraclient-arch/210318-xpra-4.1.1-good pool=xpraclientTPool project=default
t=2021-04-05T16:44:42+0300 lvl=dbug msg="GetInstanceUsage finished" driver=lvm instance=xpraclient-arch/210318-xpra-4.1.1-good pool=xpraclientTPool project=default
t=2021-04-05T16:44:42+0300 lvl=dbug msg="GetInstanceUsage started" driver=lvm instance=xpraclient-arch/210319-xpra-4.1.1-better pool=xpraclientTPool project=default
t=2021-04-05T16:44:42+0300 lvl=dbug msg="GetInstanceUsage finished" driver=lvm instance=xpraclient-arch/210319-xpra-4.1.1-better pool=xpraclientTPool project=default
t=2021-04-05T16:44:42+0300 lvl=dbug msg="GetInstanceUsage started" driver=lvm instance=xpraclient-arch/210320-export pool=xpraclientTPool project=default
t=2021-04-05T16:44:42+0300 lvl=dbug msg="GetInstanceUsage finished" driver=lvm instance=xpraclient-arch/210320-export pool=xpraclientTPool project=default
t=2021-04-05T16:44:42+0300 lvl=dbug msg="GetInstanceUsage started" driver=lvm instance=xpraclient-arch/210402-before-update pool=xpraclientTPool project=default
t=2021-04-05T16:44:42+0300 lvl=dbug msg="GetInstanceUsage finished" driver=lvm instance=xpraclient-arch/210402-before-update pool=xpraclientTPool project=default
t=2021-04-05T16:44:42+0300 lvl=dbug msg="GetInstanceUsage started" driver=lvm instance=xpraclient-arch/210405-broken pool=xpraclientTPool project=default
t=2021-04-05T16:44:42+0300 lvl=dbug msg="GetInstanceUsage finished" driver=lvm instance=xpraclient-arch/210405-broken pool=xpraclientTPool project=default
t=2021-04-05T16:44:42+0300 lvl=dbug msg=Handling ip=@ method=GET protocol=unix url=/1.0/instances/xpraclient-arch/logs/lxc.log username=colt
t=2021-04-05T16:44:50+0300 lvl=dbug msg=Handling ip=@ method=GET protocol=unix url=/1.0 username=colt
t=2021-04-05T16:44:50+0300 lvl=dbug msg=Handling ip=@ method=GET protocol=unix url="/1.0/instances?recursion=2" username=colt
t=2021-04-05T16:44:50+0300 lvl=dbug msg="GetInstanceUsage started" driver=btrfs instance=m-tmp pool=messangers2 project=default
t=2021-04-05T16:44:50+0300 lvl=dbug msg="GetInstanceUsage started" driver=btrfs instance=messangers-tmp pool=messangers2 project=default
t=2021-04-05T16:44:50+0300 lvl=dbug msg="GetInstanceUsage finished" driver=btrfs instance=messangers-tmp pool=messangers2 project=default
t=2021-04-05T16:44:50+0300 lvl=dbug msg="GetInstanceUsage finished" driver=btrfs instance=m-tmp pool=messangers2 project=default
t=2021-04-05T16:44:50+0300 lvl=dbug msg="GetInstanceUsage started" driver=lvm instance=messangers pool=messangersTPool project=default
t=2021-04-05T16:44:50+0300 lvl=dbug msg="GetInstanceUsage finished" driver=lvm instance=messangers pool=messangersTPool project=default
t=2021-04-05T16:44:50+0300 lvl=dbug msg="GetInstanceUsage started" driver=lvm instance=xpraclient-arch pool=xpraclientTPool project=default
t=2021-04-05T16:44:50+0300 lvl=dbug msg="GetInstanceUsage finished" driver=lvm instance=xpraclient-arch pool=xpraclientTPool project=default
t=2021-04-05T16:45:56+0300 lvl=dbug msg=Handling ip=@ method=GET protocol=unix url=/1.0 username=colt
t=2021-04-05T16:45:56+0300 lvl=dbug msg=Handling ip=@ method=GET protocol=unix url="/1.0/instances?recursion=2" username=colt
t=2021-04-05T16:45:56+0300 lvl=dbug msg="GetInstanceUsage started" driver=btrfs instance=messangers-tmp pool=messangers2 project=default
t=2021-04-05T16:45:56+0300 lvl=dbug msg="GetInstanceUsage started" driver=btrfs instance=m-tmp pool=messangers2 project=default
t=2021-04-05T16:45:56+0300 lvl=dbug msg="GetInstanceUsage finished" driver=btrfs instance=messangers-tmp pool=messangers2 project=default
t=2021-04-05T16:45:56+0300 lvl=dbug msg="GetInstanceUsage finished" driver=btrfs instance=m-tmp pool=messangers2 project=default
t=2021-04-05T16:45:56+0300 lvl=dbug msg="GetInstanceUsage started" driver=lvm instance=messangers pool=messangersTPool project=default
t=2021-04-05T16:45:56+0300 lvl=dbug msg="GetInstanceUsage finished" driver=lvm instance=messangers pool=messangersTPool project=default
t=2021-04-05T16:45:56+0300 lvl=dbug msg="GetInstanceUsage started" driver=lvm instance=xpraclient-arch pool=xpraclientTPool project=default
t=2021-04-05T16:45:56+0300 lvl=dbug msg="GetInstanceUsage finished" driver=lvm instance=xpraclient-arch pool=xpraclientTPool project=default
t=2021-04-05T16:46:34+0300 lvl=dbug msg=Handling ip=@ method=GET protocol=unix url=/1.0 username=colt
t=2021-04-05T16:46:34+0300 lvl=dbug msg=Handling ip=@ method=GET protocol=unix url=/1.0/instances/xpraclient-arch username=colt
t=2021-04-05T16:46:34+0300 lvl=dbug msg=Handling ip=@ method=GET protocol=unix url=/1.0/instances/xpraclient-arch/state username=colt
t=2021-04-05T16:46:34+0300 lvl=dbug msg="GetInstanceUsage started" driver=lvm instance=xpraclient-arch pool=xpraclientTPool project=default
t=2021-04-05T16:46:34+0300 lvl=dbug msg="GetInstanceUsage finished" driver=lvm instance=xpraclient-arch pool=xpraclientTPool project=default
t=2021-04-05T16:46:34+0300 lvl=dbug msg=Handling ip=@ method=GET protocol=unix url="/1.0/instances/xpraclient-arch/snapshots?recursion=1" username=colt
t=2021-04-05T16:46:34+0300 lvl=dbug msg="GetInstanceUsage started" driver=lvm instance=xpraclient-arch/210318-xpra-4.0.9-ready pool=xpraclientTPool project=default
t=2021-04-05T16:46:34+0300 lvl=dbug msg="GetInstanceUsage finished" driver=lvm instance=xpraclient-arch/210318-xpra-4.0.9-ready pool=xpraclientTPool project=default
t=2021-04-05T16:46:34+0300 lvl=dbug msg="GetInstanceUsage started" driver=lvm instance=xpraclient-arch/210318-xpra-4.1.1-good pool=xpraclientTPool project=default
t=2021-04-05T16:46:34+0300 lvl=dbug msg="GetInstanceUsage finished" driver=lvm instance=xpraclient-arch/210318-xpra-4.1.1-good pool=xpraclientTPool project=default
t=2021-04-05T16:46:34+0300 lvl=dbug msg="GetInstanceUsage started" driver=lvm instance=xpraclient-arch/210319-xpra-4.1.1-better pool=xpraclientTPool project=default
t=2021-04-05T16:46:34+0300 lvl=dbug msg="GetInstanceUsage finished" driver=lvm instance=xpraclient-arch/210319-xpra-4.1.1-better pool=xpraclientTPool project=default
t=2021-04-05T16:46:34+0300 lvl=dbug msg="GetInstanceUsage started" driver=lvm instance=xpraclient-arch/210320-export pool=xpraclientTPool project=default
t=2021-04-05T16:46:34+0300 lvl=dbug msg="GetInstanceUsage finished" driver=lvm instance=xpraclient-arch/210320-export pool=xpraclientTPool project=default
t=2021-04-05T16:46:34+0300 lvl=dbug msg="GetInstanceUsage started" driver=lvm instance=xpraclient-arch/210402-before-update pool=xpraclientTPool project=default
t=2021-04-05T16:46:34+0300 lvl=dbug msg="GetInstanceUsage finished" driver=lvm instance=xpraclient-arch/210402-before-update pool=xpraclientTPool project=default
t=2021-04-05T16:46:34+0300 lvl=dbug msg="GetInstanceUsage started" driver=lvm instance=xpraclient-arch/210405-broken pool=xpraclientTPool project=default
t=2021-04-05T16:46:34+0300 lvl=dbug msg="GetInstanceUsage finished" driver=lvm instance=xpraclient-arch/210405-broken pool=xpraclientTPool project=default
t=2021-04-05T16:46:34+0300 lvl=dbug msg=Handling ip=@ method=GET protocol=unix url=/1.0/instances/xpraclient-arch/logs/lxc.log username=colt
t=2021-04-05T16:47:22+0300 lvl=dbug msg=Handling ip=@ method=GET protocol=unix url=/1.0 username=colt
t=2021-04-05T16:47:22+0300 lvl=dbug msg=Handling ip=@ method=GET protocol=unix url=/1.0/events username=colt
t=2021-04-05T16:47:22+0300 lvl=dbug msg="New event listener: 4b378ce2-8686-4931-ae0d-3dede97b07a3" 
t=2021-04-05T16:47:22+0300 lvl=dbug msg=Handling ip=@ method=PUT protocol=unix url=/1.0/instances/xpraclient-arch/state username=colt
t=2021-04-05T16:47:22+0300 lvl=dbug msg="\n\t{\n\t\t\"action\": \"stop\",\n\t\t\"timeout\": -1,\n\t\t\"force\": false,\n\t\t\"stateful\": false\n\t}" 
t=2021-04-05T16:47:22+0300 lvl=dbug msg="New task Operation: dbaf1f5c-269d-4148-9725-05c0416a96d7" 
t=2021-04-05T16:47:22+0300 lvl=dbug msg="Started task operation: dbaf1f5c-269d-4148-9725-05c0416a96d7" 
t=2021-04-05T16:47:22+0300 lvl=info msg="Shutting down container" action=shutdown created=2021-03-16T16:26:40+0200 ephemeral=false instance=xpraclient-arch instanceType=container project=default timeout=-1s used=2021-04-05T16:35:51+0300
t=2021-04-05T16:47:22+0300 lvl=dbug msg="\n\t{\n\t\t\"type\": \"async\",\n\t\t\"status\": \"Operation created\",\n\t\t\"status_code\": 100,\n\t\t\"operation\": \"/1.0/operations/dbaf1f5c-269d-4148-9725-05c0416a96d7\",\n\t\t\"error_code\": 0,\n\t\t\"error\": \"\",\n\t\t\"metadata\": {\n\t\t\t\"id\": \"dbaf1f5c-269d-4148-9725-05c0416a96d7\",\n\t\t\t\"class\": \"task\",\n\t\t\t\"description\": \"Stopping instance\",\n\t\t\t\"created_at\": \"2021-04-05T16:47:22.271011947+03:00\",\n\t\t\t\"updated_at\": \"2021-04-05T16:47:22.271011947+03:00\",\n\t\t\t\"status\": \"Running\",\n\t\t\t\"status_code\": 103,\n\t\t\t\"resources\": {\n\t\t\t\t\"instances\": [\n\t\t\t\t\t\"/1.0/instances/xpraclient-arch\"\n\t\t\t\t]\n\t\t\t},\n\t\t\t\"metadata\": null,\n\t\t\t\"may_cancel\": false,\n\t\t\t\"err\": \"\",\n\t\t\t\"location\": \"none\"\n\t\t}\n\t}" 
t=2021-04-05T16:47:22+0300 lvl=dbug msg=Handling ip=@ method=GET protocol=unix url=/1.0/operations/dbaf1f5c-269d-4148-9725-05c0416a96d7 username=colt
t=2021-04-05T16:47:29+0300 lvl=dbug msg=Handling ip=@ method=DELETE protocol=unix url=/1.0/operations/dbaf1f5c-269d-4148-9725-05c0416a96d7 username=colt
t=2021-04-05T16:47:32+0300 lvl=dbug msg=Handling ip=@ method=DELETE protocol=unix url=/1.0/operations/dbaf1f5c-269d-4148-9725-05c0416a96d7 username=colt
t=2021-04-05T16:47:32+0300 lvl=dbug msg=Handling ip=@ method=DELETE protocol=unix url=/1.0/operations/dbaf1f5c-269d-4148-9725-05c0416a96d7 username=colt
t=2021-04-05T16:47:32+0300 lvl=dbug msg="Event listener finished: 4b378ce2-8686-4931-ae0d-3dede97b07a3" 
t=2021-04-05T16:47:32+0300 lvl=dbug msg="Disconnected event listener: 4b378ce2-8686-4931-ae0d-3dede97b07a3" 
t=2021-04-05T16:47:36+0300 lvl=dbug msg=Handling ip=@ method=GET protocol=unix url=/1.0 username=colt
t=2021-04-05T16:47:36+0300 lvl=dbug msg=Handling ip=@ method=GET protocol=unix url=/1.0/instances/xpraclient-arch username=colt
t=2021-04-05T16:47:36+0300 lvl=dbug msg=Handling ip=@ method=GET protocol=unix url=/1.0/instances/xpraclient-arch/state username=colt
t=2021-04-05T16:47:36+0300 lvl=dbug msg="GetInstanceUsage started" driver=lvm instance=xpraclient-arch pool=xpraclientTPool project=default
t=2021-04-05T16:47:36+0300 lvl=dbug msg="GetInstanceUsage finished" driver=lvm instance=xpraclient-arch pool=xpraclientTPool project=default
t=2021-04-05T16:47:36+0300 lvl=dbug msg=Handling ip=@ method=GET protocol=unix url="/1.0/instances/xpraclient-arch/snapshots?recursion=1" username=colt
t=2021-04-05T16:47:36+0300 lvl=dbug msg="GetInstanceUsage started" driver=lvm instance=xpraclient-arch/210318-xpra-4.0.9-ready pool=xpraclientTPool project=default
t=2021-04-05T16:47:36+0300 lvl=dbug msg="GetInstanceUsage finished" driver=lvm instance=xpraclient-arch/210318-xpra-4.0.9-ready pool=xpraclientTPool project=default
t=2021-04-05T16:47:36+0300 lvl=dbug msg="GetInstanceUsage started" driver=lvm instance=xpraclient-arch/210318-xpra-4.1.1-good pool=xpraclientTPool project=default
t=2021-04-05T16:47:36+0300 lvl=dbug msg="GetInstanceUsage finished" driver=lvm instance=xpraclient-arch/210318-xpra-4.1.1-good pool=xpraclientTPool project=default
t=2021-04-05T16:47:36+0300 lvl=dbug msg="GetInstanceUsage started" driver=lvm instance=xpraclient-arch/210319-xpra-4.1.1-better pool=xpraclientTPool project=default
t=2021-04-05T16:47:36+0300 lvl=dbug msg="GetInstanceUsage finished" driver=lvm instance=xpraclient-arch/210319-xpra-4.1.1-better pool=xpraclientTPool project=default
t=2021-04-05T16:47:36+0300 lvl=dbug msg="GetInstanceUsage started" driver=lvm instance=xpraclient-arch/210320-export pool=xpraclientTPool project=default
t=2021-04-05T16:47:36+0300 lvl=dbug msg="GetInstanceUsage finished" driver=lvm instance=xpraclient-arch/210320-export pool=xpraclientTPool project=default
t=2021-04-05T16:47:36+0300 lvl=dbug msg="GetInstanceUsage started" driver=lvm instance=xpraclient-arch/210402-before-update pool=xpraclientTPool project=default
t=2021-04-05T16:47:36+0300 lvl=dbug msg="GetInstanceUsage finished" driver=lvm instance=xpraclient-arch/210402-before-update pool=xpraclientTPool project=default
t=2021-04-05T16:47:36+0300 lvl=dbug msg="GetInstanceUsage started" driver=lvm instance=xpraclient-arch/210405-broken pool=xpraclientTPool project=default
t=2021-04-05T16:47:36+0300 lvl=dbug msg="GetInstanceUsage finished" driver=lvm instance=xpraclient-arch/210405-broken pool=xpraclientTPool project=default
t=2021-04-05T16:47:36+0300 lvl=dbug msg=Handling ip=@ method=GET protocol=unix url=/1.0/instances/xpraclient-arch/logs/lxc.log username=colt
t=2021-04-05T16:47:53+0300 lvl=dbug msg=Handling ip=@ method=GET protocol=unix url=/1.0 username=colt
t=2021-04-05T16:47:53+0300 lvl=dbug msg=Handling ip=@ method=GET protocol=unix url=/1.0/events username=colt
t=2021-04-05T16:47:53+0300 lvl=dbug msg="New event listener: 5efb2922-68db-44a6-a429-cfdf7825f249" 
t=2021-04-05T16:47:53+0300 lvl=dbug msg=Handling ip=@ method=POST protocol=unix url=/1.0/instances/xpraclient-arch/exec username=colt
t=2021-04-05T16:47:53+0300 lvl=dbug msg="\n\t{\n\t\t\"command\": [\n\t\t\t\"bash\"\n\t\t],\n\t\t\"wait-for-websocket\": true,\n\t\t\"interactive\": true,\n\t\t\"environment\": {\n\t\t\t\"TERM\": \"xterm-256color\"\n\t\t},\n\t\t\"width\": 118,\n\t\t\"height\": 31,\n\t\t\"record-output\": false,\n\t\t\"user\": 0,\n\t\t\"group\": 0,\n\t\t\"cwd\": \"\"\n\t}" 
t=2021-04-05T16:47:53+0300 lvl=dbug msg="MountInstance started" driver=lvm instance=xpraclient-arch pool=xpraclientTPool project=default
t=2021-04-05T16:47:53+0300 lvl=dbug msg="MountInstance finished" driver=lvm instance=xpraclient-arch pool=xpraclientTPool project=default
t=2021-04-05T16:47:53+0300 lvl=dbug msg=forkcheckfile instance=xpraclient-arch instanceType=container line="Path doesn't exist: No such file or directory" project=default
t=2021-04-05T16:47:53+0300 lvl=dbug msg="UnmountInstance started" driver=lvm instance=xpraclient-arch pool=xpraclientTPool project=default
t=2021-04-05T16:47:53+0300 lvl=dbug msg="Skipping unmount as in use" driver=lvm pool=xpraclientTPool refCount=1
t=2021-04-05T16:47:53+0300 lvl=dbug msg="UnmountInstance finished" driver=lvm instance=xpraclient-arch pool=xpraclientTPool project=default
t=2021-04-05T16:47:53+0300 lvl=dbug msg="New websocket Operation: 101a3452-332d-4983-a34a-76d89bbe9692" 
t=2021-04-05T16:47:53+0300 lvl=dbug msg="Started websocket operation: 101a3452-332d-4983-a34a-76d89bbe9692" 
t=2021-04-05T16:47:53+0300 lvl=dbug msg="\n\t{\n\t\t\"type\": \"async\",\n\t\t\"status\": \"Operation created\",\n\t\t\"status_code\": 100,\n\t\t\"operation\": \"/1.0/operations/101a3452-332d-4983-a34a-76d89bbe9692\",\n\t\t\"error_code\": 0,\n\t\t\"error\": \"\",\n\t\t\"metadata\": {\n\t\t\t\"id\": \"101a3452-332d-4983-a34a-76d89bbe9692\",\n\t\t\t\"class\": \"websocket\",\n\t\t\t\"description\": \"Executing command\",\n\t\t\t\"created_at\": \"2021-04-05T16:47:53.34610374+03:00\",\n\t\t\t\"updated_at\": \"2021-04-05T16:47:53.34610374+03:00\",\n\t\t\t\"status\": \"Running\",\n\t\t\t\"status_code\": 103,\n\t\t\t\"resources\": {\n\t\t\t\t\"containers\": [\n\t\t\t\t\t\"/1.0/containers/xpraclient-arch\"\n\t\t\t\t],\n\t\t\t\t\"instances\": [\n\t\t\t\t\t\"/1.0/instances/xpraclient-arch\"\n\t\t\t\t]\n\t\t\t},\n\t\t\t\"metadata\": {\n\t\t\t\t\"command\": [\n\t\t\t\t\t\"bash\"\n\t\t\t\t],\n\t\t\t\t\"environment\": {\n\t\t\t\t\t\"DISPLAY\": \":0\",\n\t\t\t\t\t\"HOME\": \"/root\",\n\t\t\t\t\t\"LANG\": \"C.UTF-8\",\n\t\t\t\t\t\"PATH\": \"/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin\",\n\t\t\t\t\t\"TERM\": \"xterm-256color\",\n\t\t\t\t\t\"USER\": \"root\"\n\t\t\t\t},\n\t\t\t\t\"fds\": {\n\t\t\t\t\t\"0\": \"7b25c9029638648215e0334bca6f1efe5de9d2040827aa93417d5240f3b96ec1\",\n\t\t\t\t\t\"control\": \"dcc4c8946db7d61a6620dfd0146ce58937abc6d2644fa3926347c80c711e7a8c\"\n\t\t\t\t},\n\t\t\t\t\"interactive\": true\n\t\t\t},\n\t\t\t\"may_cancel\": false,\n\t\t\t\"err\": \"\",\n\t\t\t\"location\": \"none\"\n\t\t}\n\t}" 
t=2021-04-05T16:47:53+0300 lvl=dbug msg=Handling ip=@ method=GET protocol=unix url="/1.0/operations/101a3452-332d-4983-a34a-76d89bbe9692/websocket?secret=dcc4c8946db7d61a6620dfd0146ce58937abc6d2644fa3926347c80c711e7a8c" username=colt
t=2021-04-05T16:47:53+0300 lvl=dbug msg="Connected websocket Operation: 101a3452-332d-4983-a34a-76d89bbe9692" 
t=2021-04-05T16:47:53+0300 lvl=dbug msg="Handled websocket Operation: 101a3452-332d-4983-a34a-76d89bbe9692" 
t=2021-04-05T16:47:53+0300 lvl=dbug msg=Handling ip=@ method=GET protocol=unix url="/1.0/operations/101a3452-332d-4983-a34a-76d89bbe9692/websocket?secret=7b25c9029638648215e0334bca6f1efe5de9d2040827aa93417d5240f3b96ec1" username=colt
t=2021-04-05T16:47:53+0300 lvl=dbug msg="Connected websocket Operation: 101a3452-332d-4983-a34a-76d89bbe9692" 
t=2021-04-05T16:47:53+0300 lvl=dbug msg="Handled websocket Operation: 101a3452-332d-4983-a34a-76d89bbe9692" 
t=2021-04-05T16:47:53+0300 lvl=dbug msg=Handling ip=@ method=GET protocol=unix url=/1.0/operations/101a3452-332d-4983-a34a-76d89bbe9692 username=colt
t=2021-04-05T16:47:53+0300 lvl=dbug msg="Retrieved PID of executing child process" attachedPid=147909 instance=xpraclient-arch instanceType=container project=default
t=2021-04-05T16:47:53+0300 lvl=dbug msg="Instance process started" PID=147909 instance=xpraclient-arch
t=2021-04-05T16:47:53+0300 lvl=dbug msg="Interactive child process handler started" PID=147909 instance=xpraclient-arch
t=2021-04-05T16:47:53+0300 lvl=dbug msg="Started mirroring websocket" PID=147909 instance=xpraclient-arch
t=2021-04-05T16:47:58+0300 lvl=dbug msg="Instance process stopped" PID=147909 instance=xpraclient-arch
t=2021-04-05T16:47:58+0300 lvl=dbug msg="Detected poll(POLLHUP) event: exiting." 
t=2021-04-05T16:47:58+0300 lvl=dbug msg="Sending write barrier" 
t=2021-04-05T16:47:58+0300 lvl=dbug msg="Got error getting next reader" PID=147909 err="read unix /var/snap/lxd/common/lxd/unix.socket->@: use of closed network connection" instance=xpraclient-arch
t=2021-04-05T16:47:58+0300 lvl=dbug msg="Interactive child process handler finished" PID=147909 instance=xpraclient-arch
t=2021-04-05T16:47:58+0300 lvl=dbug msg="Got error getting next reader websocket: close 1006 (abnormal closure): unexpected EOF" 
t=2021-04-05T16:47:58+0300 lvl=dbug msg="Finished mirroring websocket" PID=147909 instance=xpraclient-arch
t=2021-04-05T16:47:58+0300 lvl=dbug msg="Updated metadata for websocket Operation: 101a3452-332d-4983-a34a-76d89bbe9692" 
t=2021-04-05T16:47:58+0300 lvl=dbug msg="Success for websocket operation: 101a3452-332d-4983-a34a-76d89bbe9692" 
t=2021-04-05T16:47:58+0300 lvl=dbug msg="Event listener finished: 5efb2922-68db-44a6-a429-cfdf7825f249" 
t=2021-04-05T16:47:58+0300 lvl=dbug msg="Disconnected event listener: 5efb2922-68db-44a6-a429-cfdf7825f249" 
t=2021-04-05T16:47:58+0300 lvl=dbug msg=Handling ip=@ method=GET protocol=unix url="/internal/containers/xpraclient-arch/onstopns?netns=%2Fproc%2F143753%2Ffd%2F4&project=default&target=stop" username=root
t=2021-04-05T16:47:58+0300 lvl=dbug msg="Stopping device" device=eth0 instance=xpraclient-arch instanceType=container project=default type=nic
t=2021-04-05T16:47:58+0300 lvl=dbug msg="Clearing instance firewall static filters" dev=eth0 host_name=veth11c102f0 hwaddr=00:16:3e:7c:12:de instance=xpraclient-arch ipv4=0.0.0.0 ipv6=:: parent=lxdbr0 project=default
t=2021-04-05T16:47:58+0300 lvl=dbug msg="Clearing instance firewall dynamic filters" dev=eth0 host_name=veth11c102f0 hwaddr=00:16:3e:7c:12:de instance=xpraclient-arch ipv4=<nil> ipv6=<nil> parent=lxdbr0 project=default
t=2021-04-05T16:47:58+0300 lvl=dbug msg=Handling ip=@ method=GET protocol=unix url="/internal/containers/xpraclient-arch/onstop?project=default&target=stop" username=root
t=2021-04-05T16:47:58+0300 lvl=dbug msg="Container initiated" action=stop created=2021-03-16T16:26:40+0200 ephemeral=false instance=xpraclient-arch instanceType=container project=default stateful=false used=2021-04-05T16:35:51+0300
t=2021-04-05T16:47:58+0300 lvl=dbug msg="Container stopped, cleaning up" instance=xpraclient-arch instanceType=container project=default
t=2021-04-05T16:47:58+0300 lvl=dbug msg="Stopping device" device=mygpu instance=xpraclient-arch instanceType=container project=default type=gpu
t=2021-04-05T16:47:58+0300 lvl=dbug msg="Stopping device" device=X0 instance=xpraclient-arch instanceType=container project=default type=proxy
t=2021-04-05T16:47:58+0300 lvl=info msg="Shut down container" action=shutdown created=2021-03-16T16:26:40+0200 ephemeral=false instance=xpraclient-arch instanceType=container project=default timeout=-1s used=2021-04-05T16:35:51+0300
t=2021-04-05T16:47:58+0300 lvl=dbug msg="Stopping device" device=Xauthority instance=xpraclient-arch instanceType=container project=default type=disk
t=2021-04-05T16:47:58+0300 lvl=dbug msg="Success for task operation: dbaf1f5c-269d-4148-9725-05c0416a96d7" 
t=2021-04-05T16:47:58+0300 lvl=dbug msg="Stopping device" device=Xauthority2 instance=xpraclient-arch instanceType=container project=default type=disk
t=2021-04-05T16:47:58+0300 lvl=dbug msg="Stopping device" device=shared_storage instance=xpraclient-arch instanceType=container project=default type=disk
t=2021-04-05T16:47:58+0300 lvl=dbug msg="Stopping device" device=root instance=xpraclient-arch instanceType=container project=default type=disk
t=2021-04-05T16:47:58+0300 lvl=dbug msg="MountInstance started" driver=lvm instance=xpraclient-arch pool=xpraclientTPool project=default
t=2021-04-05T16:47:58+0300 lvl=dbug msg="UnmountInstance started" driver=lvm instance=xpraclient-arch pool=xpraclientTPool project=default
t=2021-04-05T16:47:58+0300 lvl=dbug msg="MountInstance finished" driver=lvm instance=xpraclient-arch pool=xpraclientTPool project=default
t=2021-04-05T16:47:58+0300 lvl=dbug msg="Container idmap changed, remapping" instance=xpraclient-arch instanceType=container project=default
t=2021-04-05T16:47:58+0300 lvl=dbug msg="Updated metadata for task Operation: 3a98bd1b-dbe5-4b8d-936c-e7aacd8cdac0" 
t=2021-04-05T16:47:58+0300 lvl=dbug msg="Skipping unmount as in use" driver=lvm pool=xpraclientTPool refCount=1
t=2021-04-05T16:47:58+0300 lvl=dbug msg="UnmountInstance finished" driver=lvm instance=xpraclient-arch pool=xpraclientTPool project=default
t=2021-04-05T16:48:22+0300 lvl=dbug msg="UnmountInstance started" driver=lvm instance=xpraclient-arch pool=xpraclientTPool project=default
t=2021-04-05T16:48:27+0300 lvl=dbug msg="Unmounted logical volume" driver=lvm keepBlockDev=false path=/var/snap/lxd/common/lxd/storage-pools/xpraclientTPool/containers/xpraclient-arch pool=xpraclientTPool
t=2021-04-05T16:48:27+0300 lvl=dbug msg="Deactivated logical volume" dev=/dev/xpraclientTPoolVG/containers_xpraclient--arch driver=lvm pool=xpraclientTPool
t=2021-04-05T16:48:27+0300 lvl=dbug msg="UnmountInstance finished" driver=lvm instance=xpraclient-arch pool=xpraclientTPool project=default
t=2021-04-05T16:48:27+0300 lvl=dbug msg="Failure for task operation: 3a98bd1b-dbe5-4b8d-936c-e7aacd8cdac0: Failed preparing container for start: invalid argument - Failed to change ACLs on /var/snap/lxd/common/lxd/storage-pools/xpraclientTPool/containers/xpraclient-arch/rootfs/var/log/journal" 
t=2021-04-05T16:48:27+0300 lvl=dbug msg="Event listener finished: 66caf421-2345-4aab-b028-fe0c3682ba5f" 
t=2021-04-05T16:48:27+0300 lvl=dbug msg="Disconnected event listener: 66caf421-2345-4aab-b028-fe0c3682ba5f" 

Restart result:

$ lxc restart xpraclient-arch
WARNING: cgroup v2 is not fully supported yet, proceeding with partial confinement
Error: Failed preparing container for start: invalid argument - Failed to change ACLs on /var/snap/lxd/common/lxd/storage-pools/xpraclientTPool/containers/xpraclient-arch/rootfs/var/log/journal
Try `lxc info --show-log xpraclient-arch` for more info

$ sudo nsenter -t $(cat /var/snap/lxd/common/lxd.pid) -m -u -i -n -- /bin/bash -c "/bin/ls -la /var/snap/lxd/common/lxd/storage-pools/xpraclientTPool/containers/xpraclient-arch/rootfs/var/log/journal"
/bin/ls: cannot access '/var/snap/lxd/common/lxd/storage-pools/xpraclientTPool/containers/xpraclient-arch/rootfs/var/log/journal': No such file or directory
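
The rootfs path only exists while the logical volume is mounted, and the log above shows LXD unmounting it right after the failure. To inspect or strip the ACLs that the start fails on, the volume from the log can be activated and mounted by hand (a sketch using the device path from the log; /mnt is an arbitrary mount point):

$ sudo lvchange -ay /dev/xpraclientTPoolVG/containers_xpraclient--arch
$ sudo mount /dev/xpraclientTPoolVG/containers_xpraclient--arch /mnt
$ sudo getfacl /mnt/rootfs/var/log/journal         # show the current ACL entries
$ sudo setfacl -b -R /mnt/rootfs/var/log/journal   # remove all ACL entries recursively
$ sudo umount /mnt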

Container log:

$ lxc info --show-log xpraclient-arch
WARNING: cgroup v2 is not fully supported yet, proceeding with partial confinement
Name: xpraclient-arch
Location: none
Remote: unix://
Architecture: x86_64
Created: 2021/03/16 14:26 UTC
Status: Stopped
Type: container
Profiles: xpraclient-arch, shared_storage, x11
Snapshots:
  210402-before-update (taken at 2021/04/02 08:18 UTC) (stateless)
  210405-broken (taken at 2021/04/05 10:27 UTC) (stateless)

Log:

lxc 20210405135304.545 TRACE    commands - commands.c:lxc_cmd:302 - Connection refused - Command "get_state" failed to connect command socket
lxc 20210405135304.550 TRACE    commands - commands.c:lxc_cmd:302 - Connection refused - Command "get_state" failed to connect command socket
lxc 20210405135304.550 TRACE    commands - commands.c:lxc_cmd:302 - Connection refused - Command "get_state" failed to connect command socket

I have managed to start it by restoring it from a snapshot with the uid/gid mapping profile enabled.
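
For reference, the restore step looks like this (snapshot name taken from the listing above):

$ lxc restore xpraclient-arch 210402-before-update
$ lxc start xpraclient-arch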

$ lxc info
WARNING: cgroup v2 is not fully supported yet, proceeding with partial confinement
config:
  core.https_address: '[::]:8443'
  images.auto_update_interval: "0"
api_extensions:
- storage_zfs_remove_snapshots
- container_host_shutdown_timeout
- container_stop_priority
- container_syscall_filtering
- auth_pki
- container_last_used_at
- etag
- patch
- usb_devices
- https_allowed_credentials
- image_compression_algorithm
- directory_manipulation
- container_cpu_time
- storage_zfs_use_refquota
- storage_lvm_mount_options
- network
- profile_usedby
- container_push
- container_exec_recording
- certificate_update
- container_exec_signal_handling
- gpu_devices
- container_image_properties
- migration_progress
- id_map
- network_firewall_filtering
- network_routes
- storage
- file_delete
- file_append
- network_dhcp_expiry
- storage_lvm_vg_rename
- storage_lvm_thinpool_rename
- network_vlan
- image_create_aliases
- container_stateless_copy
- container_only_migration
- storage_zfs_clone_copy
- unix_device_rename
- storage_lvm_use_thinpool
- storage_rsync_bwlimit
- network_vxlan_interface
- storage_btrfs_mount_options
- entity_description
- image_force_refresh
- storage_lvm_lv_resizing
- id_map_base
- file_symlinks
- container_push_target
- network_vlan_physical
- storage_images_delete
- container_edit_metadata
- container_snapshot_stateful_migration
- storage_driver_ceph
- storage_ceph_user_name
- resource_limits
- storage_volatile_initial_source
- storage_ceph_force_osd_reuse
- storage_block_filesystem_btrfs
- resources
- kernel_limits
- storage_api_volume_rename
- macaroon_authentication
- network_sriov
- console
- restrict_devlxd
- migration_pre_copy
- infiniband
- maas_network
- devlxd_events
- proxy
- network_dhcp_gateway
- file_get_symlink
- network_leases
- unix_device_hotplug
- storage_api_local_volume_handling
- operation_description
- clustering
- event_lifecycle
- storage_api_remote_volume_handling
- nvidia_runtime
- container_mount_propagation
- container_backup
- devlxd_images
- container_local_cross_pool_handling
- proxy_unix
- proxy_udp
- clustering_join
- proxy_tcp_udp_multi_port_handling
- network_state
- proxy_unix_dac_properties
- container_protection_delete
- unix_priv_drop
- pprof_http
- proxy_haproxy_protocol
- network_hwaddr
- proxy_nat
- network_nat_order
- container_full
- candid_authentication
- backup_compression
- candid_config
- nvidia_runtime_config
- storage_api_volume_snapshots
- storage_unmapped
- projects
- candid_config_key
- network_vxlan_ttl
- container_incremental_copy
- usb_optional_vendorid
- snapshot_scheduling
- container_copy_project
- clustering_server_address
- clustering_image_replication
- container_protection_shift
- snapshot_expiry
- container_backup_override_pool
- snapshot_expiry_creation
- network_leases_location
- resources_cpu_socket
- resources_gpu
- resources_numa
- kernel_features
- id_map_current
- event_location
- storage_api_remote_volume_snapshots
- network_nat_address
- container_nic_routes
- rbac
- cluster_internal_copy
- seccomp_notify
- lxc_features
- container_nic_ipvlan
- network_vlan_sriov
- storage_cephfs
- container_nic_ipfilter
- resources_v2
- container_exec_user_group_cwd
- container_syscall_intercept
- container_disk_shift
- storage_shifted
- resources_infiniband
- daemon_storage
- instances
- image_types
- resources_disk_sata
- clustering_roles
- images_expiry
- resources_network_firmware
- backup_compression_algorithm
- ceph_data_pool_name
- container_syscall_intercept_mount
- compression_squashfs
- container_raw_mount
- container_nic_routed
- container_syscall_intercept_mount_fuse
- container_disk_ceph
- virtual-machines
- image_profiles
- clustering_architecture
- resources_disk_id
- storage_lvm_stripes
- vm_boot_priority
- unix_hotplug_devices
- api_filtering
- instance_nic_network
- clustering_sizing
- firewall_driver
- projects_limits
- container_syscall_intercept_hugetlbfs
- limits_hugepages
- container_nic_routed_gateway
- projects_restrictions
- custom_volume_snapshot_expiry
- volume_snapshot_scheduling
- trust_ca_certificates
- snapshot_disk_usage
- clustering_edit_roles
- container_nic_routed_host_address
- container_nic_ipvlan_gateway
- resources_usb_pci
- resources_cpu_threads_numa
- resources_cpu_core_die
- api_os
- container_nic_routed_host_table
- container_nic_ipvlan_host_table
- container_nic_ipvlan_mode
- resources_system
- images_push_relay
- network_dns_search
- container_nic_routed_limits
- instance_nic_bridged_vlan
- network_state_bond_bridge
- usedby_consistency
- custom_block_volumes
- clustering_failure_domains
- resources_gpu_mdev
- console_vga_type
- projects_limits_disk
- network_type_macvlan
- network_type_sriov
- container_syscall_intercept_bpf_devices
- network_type_ovn
- projects_networks
- projects_networks_restricted_uplinks
- custom_volume_backup
- backup_override_name
- storage_rsync_compression
- network_type_physical
- network_ovn_external_subnets
- network_ovn_nat
- network_ovn_external_routes_remove
- tpm_device_type
- storage_zfs_clone_copy_rebase
- gpu_mdev
- resources_pci_iommu
- resources_network_usb
- resources_disk_address
- network_physical_ovn_ingress_mode
- network_ovn_dhcp
- network_physical_routes_anycast
- projects_limits_instances
- network_state_vlan
- instance_nic_bridged_port_isolation
- instance_bulk_state_change
- network_gvrp
- instance_pool_move
- gpu_sriov
- pci_device_type
- storage_volume_state
- network_acl
- migration_stateful
- disk_state_quota
- storage_ceph_features
- projects_compression
- projects_images_remote_cache_expiry
- certificate_project
- network_ovn_acl
- projects_images_auto_update
- projects_restricted_cluster_target
api_status: stable
api_version: "1.0"
auth: trusted
public: false
auth_methods:
- tls
environment:
  addresses:
  - 192.168.1.200:8443
  - 10.160.195.1:8443
  architectures:
  - x86_64
  - i686
  certificate: |
    -----BEGIN CERTIFICATE-----
    MIICGDCCAZ+gAwIBAgIRAL++gzfM8xNuOb4BsKBTCaMwCgYIKoZIzj0EAwMwOzEc
    MBoGA1UEChMTbGludXhjb250YWluZXJzLm9yZzEbMBkGA1UEAwwScm9vdEBMSVZF
    LkFSQ0guQk9YMB4XDTIwMDgyNTEyMTUwOVoXDTMwMDgyMzEyMTUwOVowOzEcMBoG
    A1UEChMTbGludXhjb250YWluZXJzLm9yZzEbMBkGA1UEAwwScm9vdEBMSVZFLkFS
    Q0guQk9YMHYwEAYHKoZIzj0CAQYFK4EEACIDYgAEY17ximGHRxC439CHtZhF2k0q
    vBfYRoPgaxFs/1SLx19saWiVBPRlnyG9+l8fb3zyr0aggZXMyJ4UsIUz9YlvFL+I
    n1n4rpNYbw5OT6dC+RQef0aHBZAxoPdIqXX0122fo2cwZTAOBgNVHQ8BAf8EBAMC
    BaAwEwYDVR0lBAwwCgYIKwYBBQUHAwEwDAYDVR0TAQH/BAIwADAwBgNVHREEKTAn
    gg1MSVZFLkFSQ0guQk9YhwR/AAABhxAAAAAAAAAAAAAAAAAAAAABMAoGCCqGSM49
    BAMDA2cAMGQCMHElVmytnDb/FFlJlfnNOjvCH02teNlagxnUkVfHsDWNpCHlVgro
    xldds5u/pggnswIwHU3MVytvgzShaqjetiWK9uo5K2FzWr1sp9j65dzMx7nGurDD
    ohn/xOdz9DISuVU2
    -----END CERTIFICATE-----
  certificate_fingerprint: ccbd532659224204c192fa609b86cce2282e722b5d6a7dde67e8f988661b9803
  driver: lxc | qemu
  driver_version: 4.0.6 | 5.2.0
  firewall: nftables
  kernel: Linux
  kernel_architecture: x86_64
  kernel_features:
    netnsid_getifaddrs: "true"
    seccomp_listener: "true"
    seccomp_listener_continue: "true"
    shiftfs: "false"
    uevent_injection: "true"
    unpriv_fscaps: "true"
  kernel_version: 5.11.11-arch1-1
  lxc_features:
    cgroup2: "true"
    devpts_fd: "true"
    mount_injection_file: "true"
    network_gateway_device_route: "true"
    network_ipvlan: "true"
    network_l2proxy: "true"
    network_phys_macvlan_mtu: "true"
    network_veth_router: "true"
    pidfd: "true"
    seccomp_allow_deny_syntax: "true"
    seccomp_notify: "true"
    seccomp_proxy_send_notify_fd: "true"
  os_name: Arch Linux
  os_version: ""
  project: default
  server: lxd
  server_clustered: false
  server_name: LIVE.ARCH.BOX
  server_pid: 1118
  server_version: "4.12"
  storage: lvm | btrfs
  storage_version: 2.03.11(2) (2021-01-08) / 1.02.175 (2021-01-08) / 4.43.0 | 4.15.1

Another attempt: restore -> remove the uid/gid mapping profile -> try to start -> add the uid/gid mapping profile back -> try to start (another error) -> set the container privileged -> try to start …
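
Roughly the commands involved (a sketch; the snapshot and uid/gid mapping profile names here are placeholders, not necessarily my exact ones):

lxc restore xpraclient-arch 210402-before-update
lxc profile remove xpraclient-arch idmap            # placeholder profile name
lxc start xpraclient-arch                           # starts, but unprivileged
lxc profile add xpraclient-arch idmap
lxc start xpraclient-arch                           # another error
lxc config set xpraclient-arch security.privileged true
lxc start xpraclient-arch                           # fails, see the log below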

sudo tail -f /var/snap/lxd/common/lxd/logs/lxd.log
t=2021-04-05T16:58:26+0300 lvl=dbug msg="New task Operation: aa83c921-7528-47e0-9599-9a9aaf08d505" 
t=2021-04-05T16:58:26+0300 lvl=dbug msg="Started task operation: aa83c921-7528-47e0-9599-9a9aaf08d505" 
t=2021-04-05T16:58:26+0300 lvl=dbug msg="\n\t{\n\t\t\"type\": \"async\",\n\t\t\"status\": \"Operation created\",\n\t\t\"status_code\": 100,\n\t\t\"operation\": \"/1.0/operations/aa83c921-7528-47e0-9599-9a9aaf08d505\",\n\t\t\"error_code\": 0,\n\t\t\"error\": \"\",\n\t\t\"metadata\": {\n\t\t\t\"id\": \"aa83c921-7528-47e0-9599-9a9aaf08d505\",\n\t\t\t\"class\": \"task\",\n\t\t\t\"description\": \"Starting instance\",\n\t\t\t\"created_at\": \"2021-04-05T16:58:26.24754608+03:00\",\n\t\t\t\"updated_at\": \"2021-04-05T16:58:26.24754608+03:00\",\n\t\t\t\"status\": \"Running\",\n\t\t\t\"status_code\": 103,\n\t\t\t\"resources\": {\n\t\t\t\t\"instances\": [\n\t\t\t\t\t\"/1.0/instances/xpraclient-arch\"\n\t\t\t\t]\n\t\t\t},\n\t\t\t\"metadata\": null,\n\t\t\t\"may_cancel\": false,\n\t\t\t\"err\": \"\",\n\t\t\t\"location\": \"none\"\n\t\t}\n\t}" 
t=2021-04-05T16:58:26+0300 lvl=dbug msg="MountInstance started" driver=lvm instance=xpraclient-arch pool=xpraclientTPool project=default
t=2021-04-05T16:58:26+0300 lvl=dbug msg=Handling ip=@ method=GET protocol=unix url=/1.0/operations/aa83c921-7528-47e0-9599-9a9aaf08d505 username=colt
t=2021-04-05T16:58:26+0300 lvl=dbug msg="Activated logical volume" dev=/dev/xpraclientTPoolVG/containers_xpraclient--arch driver=lvm pool=xpraclientTPool
t=2021-04-05T16:58:26+0300 lvl=dbug msg="Mounted logical volume" dev=/dev/xpraclientTPoolVG/containers_xpraclient--arch driver=lvm options=discard path=/var/snap/lxd/common/lxd/storage-pools/xpraclientTPool/containers/xpraclient-arch pool=xpraclientTPool
t=2021-04-05T16:58:26+0300 lvl=dbug msg="MountInstance finished" driver=lvm instance=xpraclient-arch pool=xpraclientTPool project=default
t=2021-04-05T16:58:26+0300 lvl=dbug msg="Container idmap changed, remapping" instance=xpraclient-arch instanceType=container project=default
t=2021-04-05T16:58:26+0300 lvl=dbug msg="Updated metadata for task Operation: aa83c921-7528-47e0-9599-9a9aaf08d505" 
t=2021-04-05T16:58:49+0300 lvl=dbug msg="UnmountInstance started" driver=lvm instance=xpraclient-arch pool=xpraclientTPool project=default
t=2021-04-05T16:58:53+0300 lvl=dbug msg="Unmounted logical volume" driver=lvm keepBlockDev=false path=/var/snap/lxd/common/lxd/storage-pools/xpraclientTPool/containers/xpraclient-arch pool=xpraclientTPool
t=2021-04-05T16:58:53+0300 lvl=dbug msg="Deactivated logical volume" dev=/dev/xpraclientTPoolVG/containers_xpraclient--arch driver=lvm pool=xpraclientTPool
t=2021-04-05T16:58:53+0300 lvl=dbug msg="UnmountInstance finished" driver=lvm instance=xpraclient-arch pool=xpraclientTPool project=default
t=2021-04-05T16:58:53+0300 lvl=dbug msg="Failure for task operation: aa83c921-7528-47e0-9599-9a9aaf08d505: Failed preparing container for start: invalid argument - Failed to change ACLs on /var/snap/lxd/common/lxd/storage-pools/xpraclientTPool/containers/xpraclient-arch/rootfs/var/log/journal" 
t=2021-04-05T16:58:53+0300 lvl=dbug msg="Event listener finished: e941f934-d3b3-4da5-8e76-f14fb9eacbbc" 
t=2021-04-05T16:58:53+0300 lvl=dbug msg="Disconnected event listener: e941f934-d3b3-4da5-8e76-f14fb9eacbbc" 
t=2021-04-05T16:59:28+0300 lvl=dbug msg=Handling ip=@ method=GET protocol=unix url=/1.0 username=colt
t=2021-04-05T16:59:28+0300 lvl=dbug msg=Handling ip=@ method=GET protocol=unix url=/1.0/instances/xpraclient-arch username=colt
t=2021-04-05T16:59:28+0300 lvl=dbug msg=Handling ip=@ method=GET protocol=unix url=/1.0/events username=colt
t=2021-04-05T16:59:28+0300 lvl=dbug msg="New event listener: e722e336-af45-42ba-abf7-8732e9ca4fd6" 
t=2021-04-05T16:59:28+0300 lvl=dbug msg=Handling ip=@ method=PUT protocol=unix url=/1.0/instances/xpraclient-arch username=colt
t=2021-04-05T16:59:28+0300 lvl=dbug msg="\n\t{\n\t\t\"architecture\": \"x86_64\",\n\t\t\"config\": {\n\t\t\t\"image.architecture\": \"x86_64\",\n\t\t\t\"image.description\": \"Archlinux  x86_64 (20201120_04:33)\",\n\t\t\t\"image.name\": \"archlinux--x86_64-default-20201120_04:33\",\n\t\t\t\"image.os\": \"archlinux\",\n\t\t\t\"image.serial\": \"20201120_04:33\",\n\t\t\t\"image.variant\": \"default\",\n\t\t\t\"volatile.base_image\": \"9423095458216311d4022896fff73bb390fc72c57ff49940e0b8e09c398ce896\",\n\t\t\t\"volatile.eth0.hwaddr\": \"00:16:3e:7c:12:de\",\n\t\t\t\"volatile.idmap.base\": \"0\",\n\t\t\t\"volatile.idmap.current\": \"[{\\\"Isuid\\\":true,\\\"Isgid\\\":false,\\\"Hostid\\\":1000000,\\\"Nsid\\\":0,\\\"Maprange\\\":1000},{\\\"Isuid\\\":true,\\\"Isgid\\\":false,\\\"Hostid\\\":1000,\\\"Nsid\\\":1000,\\\"Maprange\\\":1},{\\\"Isuid\\\":true,\\\"Isgid\\\":false,\\\"Hostid\\\":1001001,\\\"Nsid\\\":1001,\\\"Maprange\\\":999998999},{\\\"Isuid\\\":false,\\\"Isgid\\\":true,\\\"Hostid\\\":1000000,\\\"Nsid\\\":0,\\\"Maprange\\\":985},{\\\"Isuid\\\":false,\\\"Isgid\\\":true,\\\"Hostid\\\":985,\\\"Nsid\\\":985,\\\"Maprange\\\":1},{\\\"Isuid\\\":false,\\\"Isgid\\\":true,\\\"Hostid\\\":1000986,\\\"Nsid\\\":986,\\\"Maprange\\\":999999014}]\",\n\t\t\t\"volatile.idmap.next\": \"[{\\\"Isuid\\\":true,\\\"Isgid\\\":false,\\\"Hostid\\\":1000000,\\\"Nsid\\\":0,\\\"Maprange\\\":1000000000},{\\\"Isuid\\\":false,\\\"Isgid\\\":true,\\\"Hostid\\\":1000000,\\\"Nsid\\\":0,\\\"Maprange\\\":1000000000}]\",\n\t\t\t\"volatile.last_state.idmap\": \"[{\\\"Isuid\\\":true,\\\"Isgid\\\":false,\\\"Hostid\\\":1000000,\\\"Nsid\\\":0,\\\"Maprange\\\":1000},{\\\"Isuid\\\":true,\\\"Isgid\\\":false,\\\"Hostid\\\":1000,\\\"Nsid\\\":1000,\\\"Maprange\\\":1},{\\\"Isuid\\\":true,\\\"Isgid\\\":false,\\\"Hostid\\\":1001001,\\\"Nsid\\\":1001,\\\"Maprange\\\":999998999},{\\\"Isuid\\\":false,\\\"Isgid\\\":true,\\\"Hostid\\\":1000000,\\\"Nsid\\\":0,\\\"Maprange\\\":985},{\\\"Isuid\\\":false,\\\"Isgid\\\":true,\\\"Hostid\\\":985,\\\"Nsid\\\":985,\\\"Maprange\\\":1},{\\\"Isuid\\\":false,\\\"Isgid\\\":true,\\\"Hostid\\\":1000986,\\\"Nsid\\\":986,\\\"Maprange\\\":999999014}]\",\n\t\t\t\"volatile.last_state.power\": \"RUNNING\",\n\t\t\t\"volatile.uuid\": \"c72e2b3c-56b4-42e4-9081-6c098e3da3d5\"\n\t\t},\n\t\t\"devices\": {\n\t\t\t\"root\": {\n\t\t\t\t\"path\": \"/\",\n\t\t\t\t\"pool\": \"xpraclientTPool\",\n\t\t\t\t\"type\": \"disk\"\n\t\t\t}\n\t\t},\n\t\t\"ephemeral\": false,\n\t\t\"profiles\": [\n\t\t\t\"xpraclient-arch\",\n\t\t\t\"shared_storage\",\n\t\t\t\"x11\"\n\t\t],\n\t\t\"stateful\": false,\n\t\t\"description\": \"\"\n\t}" 
t=2021-04-05T16:59:28+0300 lvl=dbug msg="New task Operation: b729fa62-dca0-4fbd-bf65-074cc13e2314" 
t=2021-04-05T16:59:28+0300 lvl=dbug msg="Started task operation: b729fa62-dca0-4fbd-bf65-074cc13e2314" 
t=2021-04-05T16:59:28+0300 lvl=dbug msg="\n\t{\n\t\t\"type\": \"async\",\n\t\t\"status\": \"Operation created\",\n\t\t\"status_code\": 100,\n\t\t\"operation\": \"/1.0/operations/b729fa62-dca0-4fbd-bf65-074cc13e2314\",\n\t\t\"error_code\": 0,\n\t\t\"error\": \"\",\n\t\t\"metadata\": {\n\t\t\t\"id\": \"b729fa62-dca0-4fbd-bf65-074cc13e2314\",\n\t\t\t\"class\": \"task\",\n\t\t\t\"description\": \"Updating instance\",\n\t\t\t\"created_at\": \"2021-04-05T16:59:28.218831861+03:00\",\n\t\t\t\"updated_at\": \"2021-04-05T16:59:28.218831861+03:00\",\n\t\t\t\"status\": \"Running\",\n\t\t\t\"status_code\": 103,\n\t\t\t\"resources\": {\n\t\t\t\t\"containers\": [\n\t\t\t\t\t\"/1.0/containers/xpraclient-arch\"\n\t\t\t\t],\n\t\t\t\t\"instances\": [\n\t\t\t\t\t\"/1.0/instances/xpraclient-arch\"\n\t\t\t\t]\n\t\t\t},\n\t\t\t\"metadata\": null,\n\t\t\t\"may_cancel\": false,\n\t\t\t\"err\": \"\",\n\t\t\t\"location\": \"none\"\n\t\t}\n\t}" 
t=2021-04-05T16:59:28+0300 lvl=dbug msg=Handling ip=@ method=GET protocol=unix url=/1.0/operations/b729fa62-dca0-4fbd-bf65-074cc13e2314 username=colt
t=2021-04-05T16:59:28+0300 lvl=dbug msg="UpdateInstanceBackupFile started" driver=lvm instance=xpraclient-arch pool=xpraclientTPool project=default
t=2021-04-05T16:59:28+0300 lvl=dbug msg="Activated logical volume" dev=/dev/xpraclientTPoolVG/containers_xpraclient--arch driver=lvm pool=xpraclientTPool
t=2021-04-05T16:59:28+0300 lvl=dbug msg="Mounted logical volume" dev=/dev/xpraclientTPoolVG/containers_xpraclient--arch driver=lvm options=discard path=/var/snap/lxd/common/lxd/storage-pools/xpraclientTPool/containers/xpraclient-arch pool=xpraclientTPool
t=2021-04-05T16:59:28+0300 lvl=dbug msg="Unmounted logical volume" driver=lvm keepBlockDev=false path=/var/snap/lxd/common/lxd/storage-pools/xpraclientTPool/containers/xpraclient-arch pool=xpraclientTPool
t=2021-04-05T16:59:28+0300 lvl=dbug msg="Deactivated logical volume" dev=/dev/xpraclientTPoolVG/containers_xpraclient--arch driver=lvm pool=xpraclientTPool
t=2021-04-05T16:59:28+0300 lvl=dbug msg="UpdateInstanceBackupFile finished" driver=lvm instance=xpraclient-arch pool=xpraclientTPool project=default
t=2021-04-05T16:59:28+0300 lvl=dbug msg="Success for task operation: b729fa62-dca0-4fbd-bf65-074cc13e2314" 
t=2021-04-05T16:59:28+0300 lvl=dbug msg="Event listener finished: e722e336-af45-42ba-abf7-8732e9ca4fd6" 
t=2021-04-05T16:59:28+0300 lvl=dbug msg="Disconnected event listener: e722e336-af45-42ba-abf7-8732e9ca4fd6" 
t=2021-04-05T16:59:56+0300 lvl=dbug msg=Handling ip=@ method=GET protocol=unix url=/1.0 username=colt
t=2021-04-05T16:59:56+0300 lvl=dbug msg=Handling ip=@ method=GET protocol=unix url=/1.0/instances/xpraclient-arch username=colt
t=2021-04-05T16:59:56+0300 lvl=dbug msg=Handling ip=@ method=GET protocol=unix url=/1.0/events username=colt
t=2021-04-05T16:59:56+0300 lvl=dbug msg="New event listener: da94e425-da00-4c21-943e-8866b838721b" 
t=2021-04-05T16:59:56+0300 lvl=dbug msg=Handling ip=@ method=PUT protocol=unix url=/1.0/instances/xpraclient-arch/state username=colt
t=2021-04-05T16:59:56+0300 lvl=dbug msg="\n\t{\n\t\t\"action\": \"start\",\n\t\t\"timeout\": 0,\n\t\t\"force\": false,\n\t\t\"stateful\": false\n\t}" 
t=2021-04-05T16:59:56+0300 lvl=dbug msg="New task Operation: 3d26051f-321c-4a6f-bfdb-0998349f8f52" 
t=2021-04-05T16:59:56+0300 lvl=dbug msg="Started task operation: 3d26051f-321c-4a6f-bfdb-0998349f8f52" 
t=2021-04-05T16:59:56+0300 lvl=dbug msg="\n\t{\n\t\t\"type\": \"async\",\n\t\t\"status\": \"Operation created\",\n\t\t\"status_code\": 100,\n\t\t\"operation\": \"/1.0/operations/3d26051f-321c-4a6f-bfdb-0998349f8f52\",\n\t\t\"error_code\": 0,\n\t\t\"error\": \"\",\n\t\t\"metadata\": {\n\t\t\t\"id\": \"3d26051f-321c-4a6f-bfdb-0998349f8f52\",\n\t\t\t\"class\": \"task\",\n\t\t\t\"description\": \"Starting instance\",\n\t\t\t\"created_at\": \"2021-04-05T16:59:56.102566192+03:00\",\n\t\t\t\"updated_at\": \"2021-04-05T16:59:56.102566192+03:00\",\n\t\t\t\"status\": \"Running\",\n\t\t\t\"status_code\": 103,\n\t\t\t\"resources\": {\n\t\t\t\t\"instances\": [\n\t\t\t\t\t\"/1.0/instances/xpraclient-arch\"\n\t\t\t\t]\n\t\t\t},\n\t\t\t\"metadata\": null,\n\t\t\t\"may_cancel\": false,\n\t\t\t\"err\": \"\",\n\t\t\t\"location\": \"none\"\n\t\t}\n\t}" 
t=2021-04-05T16:59:56+0300 lvl=dbug msg="MountInstance started" driver=lvm instance=xpraclient-arch pool=xpraclientTPool project=default
t=2021-04-05T16:59:56+0300 lvl=dbug msg=Handling ip=@ method=GET protocol=unix url=/1.0/operations/3d26051f-321c-4a6f-bfdb-0998349f8f52 username=colt
t=2021-04-05T16:59:56+0300 lvl=dbug msg="Activated logical volume" dev=/dev/xpraclientTPoolVG/containers_xpraclient--arch driver=lvm pool=xpraclientTPool
t=2021-04-05T16:59:56+0300 lvl=dbug msg="Mounted logical volume" dev=/dev/xpraclientTPoolVG/containers_xpraclient--arch driver=lvm options=discard path=/var/snap/lxd/common/lxd/storage-pools/xpraclientTPool/containers/xpraclient-arch pool=xpraclientTPool
t=2021-04-05T16:59:56+0300 lvl=dbug msg="MountInstance finished" driver=lvm instance=xpraclient-arch pool=xpraclientTPool project=default
t=2021-04-05T16:59:56+0300 lvl=dbug msg="Starting device" device=eth0 instance=xpraclient-arch instanceType=container project=default type=nic
t=2021-04-05T16:59:56+0300 lvl=dbug msg="Starting device" device=root instance=xpraclient-arch instanceType=container project=default type=disk
t=2021-04-05T16:59:56+0300 lvl=dbug msg="Starting device" device=shared_storage instance=xpraclient-arch instanceType=container project=default type=disk
t=2021-04-05T16:59:56+0300 lvl=dbug msg="Starting device" device=Xauthority2 instance=xpraclient-arch instanceType=container project=default type=disk
t=2021-04-05T16:59:56+0300 lvl=dbug msg="Starting device" device=Xauthority instance=xpraclient-arch instanceType=container project=default type=disk
t=2021-04-05T16:59:56+0300 lvl=dbug msg="Starting device" device=X0 instance=xpraclient-arch instanceType=container project=default type=proxy
t=2021-04-05T16:59:56+0300 lvl=dbug msg="Starting device" device=mygpu instance=xpraclient-arch instanceType=container project=default type=gpu
t=2021-04-05T16:59:56+0300 lvl=dbug msg="UpdateInstanceBackupFile started" driver=lvm instance=xpraclient-arch pool=xpraclientTPool project=default
t=2021-04-05T16:59:56+0300 lvl=dbug msg="Skipping unmount as in use" driver=lvm pool=xpraclientTPool refCount=1
t=2021-04-05T16:59:56+0300 lvl=dbug msg="UpdateInstanceBackupFile finished" driver=lvm instance=xpraclient-arch pool=xpraclientTPool project=default
t=2021-04-05T16:59:56+0300 lvl=info msg="Starting container" action=start created=2021-03-16T16:26:40+0200 ephemeral=false instance=xpraclient-arch instanceType=container project=default stateful=false used=2021-04-05T16:35:51+0300
t=2021-04-05T16:59:56+0300 lvl=dbug msg=Handling ip=@ method=GET protocol=unix url="/internal/containers/xpraclient-arch/onstart?project=default" username=root
t=2021-04-05T16:59:56+0300 lvl=dbug msg="Scheduler: container xpraclient-arch started: re-balancing" 
t=2021-04-05T16:59:56+0300 lvl=dbug msg=Handling ip=@ method=GET protocol=unix url="/internal/containers/xpraclient-arch/onstopns?netns=%2Fproc%2F151778%2Ffd%2F4&project=default&target=stop" username=root
t=2021-04-05T16:59:56+0300 lvl=dbug msg="Stopping device" device=eth0 instance=xpraclient-arch instanceType=container project=default type=nic
t=2021-04-05T16:59:56+0300 lvl=dbug msg="Clearing instance firewall static filters" dev=eth0 host_name=veth9639ea57 hwaddr=00:16:3e:7c:12:de instance=xpraclient-arch ipv4=0.0.0.0 ipv6=:: parent=lxdbr0 project=default
t=2021-04-05T16:59:56+0300 lvl=dbug msg="Clearing instance firewall dynamic filters" dev=eth0 host_name=veth9639ea57 hwaddr=00:16:3e:7c:12:de instance=xpraclient-arch ipv4=<nil> ipv6=<nil> parent=lxdbr0 project=default
t=2021-04-05T16:59:56+0300 lvl=eror msg="Failed starting container" action=start created=2021-03-16T16:26:40+0200 ephemeral=false instance=xpraclient-arch instanceType=container project=default stateful=false used=2021-04-05T16:35:51+0300
t=2021-04-05T16:59:56+0300 lvl=dbug msg="Failure for task operation: 3d26051f-321c-4a6f-bfdb-0998349f8f52: Failed to run: /snap/lxd/current/bin/lxd forkstart xpraclient-arch /var/snap/lxd/common/lxd/containers /var/snap/lxd/common/lxd/logs/xpraclient-arch/lxc.conf: " 
t=2021-04-05T16:59:56+0300 lvl=dbug msg="Event listener finished: da94e425-da00-4c21-943e-8866b838721b" 
t=2021-04-05T16:59:56+0300 lvl=dbug msg="Disconnected event listener: da94e425-da00-4c21-943e-8866b838721b" 
t=2021-04-05T16:59:57+0300 lvl=dbug msg=Handling ip=@ method=GET protocol=unix url="/internal/containers/xpraclient-arch/onstop?project=default&target=stop" username=root
t=2021-04-05T16:59:57+0300 lvl=dbug msg="Container initiated" action=stop created=2021-03-16T16:26:40+0200 ephemeral=false instance=xpraclient-arch instanceType=container project=default stateful=false used=2021-04-05T16:59:56+0300
t=2021-04-05T16:59:57+0300 lvl=dbug msg="Container stopped, cleaning up" instance=xpraclient-arch instanceType=container project=default
t=2021-04-05T16:59:57+0300 lvl=dbug msg="Stopping device" device=mygpu instance=xpraclient-arch instanceType=container project=default type=gpu
t=2021-04-05T16:59:57+0300 lvl=dbug msg="Stopping device" device=X0 instance=xpraclient-arch instanceType=container project=default type=proxy
t=2021-04-05T16:59:57+0300 lvl=dbug msg="Stopping device" device=Xauthority instance=xpraclient-arch instanceType=container project=default type=disk
t=2021-04-05T16:59:57+0300 lvl=dbug msg="Stopping device" device=Xauthority2 instance=xpraclient-arch instanceType=container project=default type=disk
t=2021-04-05T16:59:57+0300 lvl=dbug msg="Stopping device" device=shared_storage instance=xpraclient-arch instanceType=container project=default type=disk
t=2021-04-05T16:59:57+0300 lvl=dbug msg="Stopping device" device=root instance=xpraclient-arch instanceType=container project=default type=disk
t=2021-04-05T16:59:57+0300 lvl=dbug msg="UnmountInstance started" driver=lvm instance=xpraclient-arch pool=xpraclientTPool project=default
t=2021-04-05T16:59:57+0300 lvl=dbug msg="Unmounted logical volume" driver=lvm keepBlockDev=false path=/var/snap/lxd/common/lxd/storage-pools/xpraclientTPool/containers/xpraclient-arch pool=xpraclientTPool
t=2021-04-05T16:59:57+0300 lvl=dbug msg="Deactivated logical volume" dev=/dev/xpraclientTPoolVG/containers_xpraclient--arch driver=lvm pool=xpraclientTPool
t=2021-04-05T16:59:57+0300 lvl=dbug msg="UnmountInstance finished" driver=lvm instance=xpraclient-arch pool=xpraclientTPool project=default
t=2021-04-05T16:59:57+0300 lvl=info msg="Shut down container" action=stop created=2021-03-16T16:26:40+0200 ephemeral=false instance=xpraclient-arch instanceType=container project=default stateful=false used=2021-04-05T16:59:56+0300
t=2021-04-05T16:59:57+0300 lvl=dbug msg="Scheduler: container xpraclient-arch stopped: re-balancing" 
t=2021-04-05T17:01:05+0300 lvl=dbug msg=Handling ip=@ method=GET protocol=unix url=/1.0 username=colt
t=2021-04-05T17:01:05+0300 lvl=dbug msg=Handling ip=@ method=GET protocol=unix url=/1.0/instances/xpraclient-arch username=colt
t=2021-04-05T17:01:13+0300 lvl=dbug msg=Handling ip=@ method=GET protocol=unix url=/1.0 username=colt
t=2021-04-05T17:01:13+0300 lvl=dbug msg=Handling ip=@ method=GET protocol=unix url=/1.0/instances/xpraclient-arch username=colt
t=2021-04-05T17:01:13+0300 lvl=dbug msg=Handling ip=@ method=GET protocol=unix url=/1.0/events username=colt
t=2021-04-05T17:01:13+0300 lvl=dbug msg="New event listener: 4bdd06a2-a455-4320-a990-6732171f4f41" 
t=2021-04-05T17:01:13+0300 lvl=dbug msg=Handling ip=@ method=PUT protocol=unix url=/1.0/instances/xpraclient-arch username=colt
t=2021-04-05T17:01:13+0300 lvl=dbug msg="\n\t{\n\t\t\"architecture\": \"x86_64\",\n\t\t\"config\": {\n\t\t\t\"image.architecture\": \"x86_64\",\n\t\t\t\"image.description\": \"Archlinux  x86_64 (20201120_04:33)\",\n\t\t\t\"image.name\": \"archlinux--x86_64-default-20201120_04:33\",\n\t\t\t\"image.os\": \"archlinux\",\n\t\t\t\"image.serial\": \"20201120_04:33\",\n\t\t\t\"image.variant\": \"default\",\n\t\t\t\"security.privileged\": \"true\",\n\t\t\t\"volatile.base_image\": \"9423095458216311d4022896fff73bb390fc72c57ff49940e0b8e09c398ce896\",\n\t\t\t\"volatile.eth0.hwaddr\": \"00:16:3e:7c:12:de\",\n\t\t\t\"volatile.idmap.base\": \"0\",\n\t\t\t\"volatile.idmap.current\": \"[{\\\"Isuid\\\":true,\\\"Isgid\\\":false,\\\"Hostid\\\":1000000,\\\"Nsid\\\":0,\\\"Maprange\\\":1000},{\\\"Isuid\\\":true,\\\"Isgid\\\":false,\\\"Hostid\\\":1000,\\\"Nsid\\\":1000,\\\"Maprange\\\":1},{\\\"Isuid\\\":true,\\\"Isgid\\\":false,\\\"Hostid\\\":1001001,\\\"Nsid\\\":1001,\\\"Maprange\\\":999998999},{\\\"Isuid\\\":false,\\\"Isgid\\\":true,\\\"Hostid\\\":1000000,\\\"Nsid\\\":0,\\\"Maprange\\\":985},{\\\"Isuid\\\":false,\\\"Isgid\\\":true,\\\"Hostid\\\":985,\\\"Nsid\\\":985,\\\"Maprange\\\":1},{\\\"Isuid\\\":false,\\\"Isgid\\\":true,\\\"Hostid\\\":1000986,\\\"Nsid\\\":986,\\\"Maprange\\\":999999014}]\",\n\t\t\t\"volatile.idmap.next\": \"[{\\\"Isuid\\\":true,\\\"Isgid\\\":false,\\\"Hostid\\\":1000000,\\\"Nsid\\\":0,\\\"Maprange\\\":1000},{\\\"Isuid\\\":true,\\\"Isgid\\\":false,\\\"Hostid\\\":1000,\\\"Nsid\\\":1000,\\\"Maprange\\\":1},{\\\"Isuid\\\":true,\\\"Isgid\\\":false,\\\"Hostid\\\":1001001,\\\"Nsid\\\":1001,\\\"Maprange\\\":999998999},{\\\"Isuid\\\":false,\\\"Isgid\\\":true,\\\"Hostid\\\":1000000,\\\"Nsid\\\":0,\\\"Maprange\\\":985},{\\\"Isuid\\\":false,\\\"Isgid\\\":true,\\\"Hostid\\\":985,\\\"Nsid\\\":985,\\\"Maprange\\\":1},{\\\"Isuid\\\":false,\\\"Isgid\\\":true,\\\"Hostid\\\":1000986,\\\"Nsid\\\":986,\\\"Maprange\\\":999999014}]\",\n\t\t\t\"volatile.last_state.idmap\": \"[{\\\"Isuid\\\":true,\\\"Isgid\\\":false,\\\"Hostid\\\":1000000,\\\"Nsid\\\":0,\\\"Maprange\\\":1000},{\\\"Isuid\\\":true,\\\"Isgid\\\":false,\\\"Hostid\\\":1000,\\\"Nsid\\\":1000,\\\"Maprange\\\":1},{\\\"Isuid\\\":true,\\\"Isgid\\\":false,\\\"Hostid\\\":1001001,\\\"Nsid\\\":1001,\\\"Maprange\\\":999998999},{\\\"Isuid\\\":false,\\\"Isgid\\\":true,\\\"Hostid\\\":1000000,\\\"Nsid\\\":0,\\\"Maprange\\\":985},{\\\"Isuid\\\":false,\\\"Isgid\\\":true,\\\"Hostid\\\":985,\\\"Nsid\\\":985,\\\"Maprange\\\":1},{\\\"Isuid\\\":false,\\\"Isgid\\\":true,\\\"Hostid\\\":1000986,\\\"Nsid\\\":986,\\\"Maprange\\\":999999014}]\",\n\t\t\t\"volatile.last_state.power\": \"STOPPED\",\n\t\t\t\"volatile.uuid\": \"c72e2b3c-56b4-42e4-9081-6c098e3da3d5\"\n\t\t},\n\t\t\"devices\": {\n\t\t\t\"root\": {\n\t\t\t\t\"path\": \"/\",\n\t\t\t\t\"pool\": \"xpraclientTPool\",\n\t\t\t\t\"type\": \"disk\"\n\t\t\t}\n\t\t},\n\t\t\"ephemeral\": false,\n\t\t\"profiles\": [\n\t\t\t\"xpraclient-arch\",\n\t\t\t\"shared_storage\",\n\t\t\t\"x11\"\n\t\t],\n\t\t\"stateful\": false,\n\t\t\"description\": \"\"\n\t}" 
t=2021-04-05T17:01:13+0300 lvl=dbug msg="New task Operation: 41c16458-b0f2-4f55-b383-0ecdcc503962" 
t=2021-04-05T17:01:13+0300 lvl=dbug msg="Started task operation: 41c16458-b0f2-4f55-b383-0ecdcc503962" 
t=2021-04-05T17:01:13+0300 lvl=dbug msg="\n\t{\n\t\t\"type\": \"async\",\n\t\t\"status\": \"Operation created\",\n\t\t\"status_code\": 100,\n\t\t\"operation\": \"/1.0/operations/41c16458-b0f2-4f55-b383-0ecdcc503962\",\n\t\t\"error_code\": 0,\n\t\t\"error\": \"\",\n\t\t\"metadata\": {\n\t\t\t\"id\": \"41c16458-b0f2-4f55-b383-0ecdcc503962\",\n\t\t\t\"class\": \"task\",\n\t\t\t\"description\": \"Updating instance\",\n\t\t\t\"created_at\": \"2021-04-05T17:01:13.395574516+03:00\",\n\t\t\t\"updated_at\": \"2021-04-05T17:01:13.395574516+03:00\",\n\t\t\t\"status\": \"Running\",\n\t\t\t\"status_code\": 103,\n\t\t\t\"resources\": {\n\t\t\t\t\"containers\": [\n\t\t\t\t\t\"/1.0/containers/xpraclient-arch\"\n\t\t\t\t],\n\t\t\t\t\"instances\": [\n\t\t\t\t\t\"/1.0/instances/xpraclient-arch\"\n\t\t\t\t]\n\t\t\t},\n\t\t\t\"metadata\": null,\n\t\t\t\"may_cancel\": false,\n\t\t\t\"err\": \"\",\n\t\t\t\"location\": \"none\"\n\t\t}\n\t}" 
t=2021-04-05T17:01:13+0300 lvl=dbug msg=Handling ip=@ method=GET protocol=unix url=/1.0/operations/41c16458-b0f2-4f55-b383-0ecdcc503962 username=colt
t=2021-04-05T17:01:13+0300 lvl=dbug msg="UpdateInstanceBackupFile started" driver=lvm instance=xpraclient-arch pool=xpraclientTPool project=default
t=2021-04-05T17:01:13+0300 lvl=dbug msg="Activated logical volume" dev=/dev/xpraclientTPoolVG/containers_xpraclient--arch driver=lvm pool=xpraclientTPool
t=2021-04-05T17:01:13+0300 lvl=dbug msg="Mounted logical volume" dev=/dev/xpraclientTPoolVG/containers_xpraclient--arch driver=lvm options=discard path=/var/snap/lxd/common/lxd/storage-pools/xpraclientTPool/containers/xpraclient-arch pool=xpraclientTPool
t=2021-04-05T17:01:13+0300 lvl=dbug msg="Unmounted logical volume" driver=lvm keepBlockDev=false path=/var/snap/lxd/common/lxd/storage-pools/xpraclientTPool/containers/xpraclient-arch pool=xpraclientTPool
t=2021-04-05T17:01:14+0300 lvl=dbug msg="Deactivated logical volume" dev=/dev/xpraclientTPoolVG/containers_xpraclient--arch driver=lvm pool=xpraclientTPool
t=2021-04-05T17:01:14+0300 lvl=dbug msg="UpdateInstanceBackupFile finished" driver=lvm instance=xpraclient-arch pool=xpraclientTPool project=default
t=2021-04-05T17:01:14+0300 lvl=dbug msg="Success for task operation: 41c16458-b0f2-4f55-b383-0ecdcc503962" 
t=2021-04-05T17:01:14+0300 lvl=dbug msg="Event listener finished: 4bdd06a2-a455-4320-a990-6732171f4f41" 
t=2021-04-05T17:01:14+0300 lvl=dbug msg="Disconnected event listener: 4bdd06a2-a455-4320-a990-6732171f4f41" 
t=2021-04-05T17:01:21+0300 lvl=dbug msg=Handling ip=@ method=GET protocol=unix url=/1.0 username=colt
t=2021-04-05T17:01:21+0300 lvl=dbug msg=Handling ip=@ method=GET protocol=unix url=/1.0/instances/xpraclient-arch username=colt
t=2021-04-05T17:01:21+0300 lvl=dbug msg=Handling ip=@ method=GET protocol=unix url=/1.0/events username=colt
t=2021-04-05T17:01:21+0300 lvl=dbug msg="New event listener: 4c631b84-e27c-444a-af1e-8a2a1677c077" 
t=2021-04-05T17:01:21+0300 lvl=dbug msg=Handling ip=@ method=PUT protocol=unix url=/1.0/instances/xpraclient-arch/state username=colt
t=2021-04-05T17:01:21+0300 lvl=dbug msg="\n\t{\n\t\t\"action\": \"start\",\n\t\t\"timeout\": 0,\n\t\t\"force\": false,\n\t\t\"stateful\": false\n\t}" 
t=2021-04-05T17:01:21+0300 lvl=dbug msg="New task Operation: 86d6716d-5341-4bc1-aa30-01b269da208e" 
t=2021-04-05T17:01:21+0300 lvl=dbug msg="Started task operation: 86d6716d-5341-4bc1-aa30-01b269da208e" 
t=2021-04-05T17:01:21+0300 lvl=dbug msg="\n\t{\n\t\t\"type\": \"async\",\n\t\t\"status\": \"Operation created\",\n\t\t\"status_code\": 100,\n\t\t\"operation\": \"/1.0/operations/86d6716d-5341-4bc1-aa30-01b269da208e\",\n\t\t\"error_code\": 0,\n\t\t\"error\": \"\",\n\t\t\"metadata\": {\n\t\t\t\"id\": \"86d6716d-5341-4bc1-aa30-01b269da208e\",\n\t\t\t\"class\": \"task\",\n\t\t\t\"description\": \"Starting instance\",\n\t\t\t\"created_at\": \"2021-04-05T17:01:21.082762196+03:00\",\n\t\t\t\"updated_at\": \"2021-04-05T17:01:21.082762196+03:00\",\n\t\t\t\"status\": \"Running\",\n\t\t\t\"status_code\": 103,\n\t\t\t\"resources\": {\n\t\t\t\t\"instances\": [\n\t\t\t\t\t\"/1.0/instances/xpraclient-arch\"\n\t\t\t\t]\n\t\t\t},\n\t\t\t\"metadata\": null,\n\t\t\t\"may_cancel\": false,\n\t\t\t\"err\": \"\",\n\t\t\t\"location\": \"none\"\n\t\t}\n\t}" 
t=2021-04-05T17:01:21+0300 lvl=dbug msg=Handling ip=@ method=GET protocol=unix url=/1.0/operations/86d6716d-5341-4bc1-aa30-01b269da208e username=colt
t=2021-04-05T17:01:21+0300 lvl=dbug msg="MountInstance started" driver=lvm instance=xpraclient-arch pool=xpraclientTPool project=default
t=2021-04-05T17:01:21+0300 lvl=dbug msg="Activated logical volume" dev=/dev/xpraclientTPoolVG/containers_xpraclient--arch driver=lvm pool=xpraclientTPool
t=2021-04-05T17:01:21+0300 lvl=dbug msg="Mounted logical volume" dev=/dev/xpraclientTPoolVG/containers_xpraclient--arch driver=lvm options=discard path=/var/snap/lxd/common/lxd/storage-pools/xpraclientTPool/containers/xpraclient-arch pool=xpraclientTPool
t=2021-04-05T17:01:21+0300 lvl=dbug msg="MountInstance finished" driver=lvm instance=xpraclient-arch pool=xpraclientTPool project=default
t=2021-04-05T17:01:21+0300 lvl=dbug msg="Container idmap changed, remapping" instance=xpraclient-arch instanceType=container project=default
t=2021-04-05T17:01:21+0300 lvl=dbug msg="Updated metadata for task Operation: 86d6716d-5341-4bc1-aa30-01b269da208e" 
t=2021-04-05T17:01:40+0300 lvl=dbug msg="UnmountInstance started" driver=lvm instance=xpraclient-arch pool=xpraclientTPool project=default
t=2021-04-05T17:01:41+0300 lvl=dbug msg="Unmounted logical volume" driver=lvm keepBlockDev=false path=/var/snap/lxd/common/lxd/storage-pools/xpraclientTPool/containers/xpraclient-arch pool=xpraclientTPool
t=2021-04-05T17:01:42+0300 lvl=dbug msg="Deactivated logical volume" dev=/dev/xpraclientTPoolVG/containers_xpraclient--arch driver=lvm pool=xpraclientTPool
t=2021-04-05T17:01:42+0300 lvl=dbug msg="UnmountInstance finished" driver=lvm instance=xpraclient-arch pool=xpraclientTPool project=default
t=2021-04-05T17:01:42+0300 lvl=dbug msg="Failure for task operation: 86d6716d-5341-4bc1-aa30-01b269da208e: Failed preparing container for start: invalid argument - Failed to change ACLs on /var/snap/lxd/common/lxd/storage-pools/xpraclientTPool/containers/xpraclient-arch/rootfs/var/log/journal" 
t=2021-04-05T17:01:42+0300 lvl=dbug msg="Event listener finished: 4c631b84-e27c-444a-af1e-8a2a1677c077" 
t=2021-04-05T17:01:42+0300 lvl=dbug msg="Disconnected event listener: 4c631b84-e27c-444a-af1e-8a2a1677c077"

UPD: Luckily, lxc copy worked with no errors; however, to start the container I had to enable privileged mode, otherwise there was an error:

msg="Failure for task operation: 6d2ea60c-6e38-4511-b798-d24733de9638: Failed to run: /snap/lxd/current/bin/lxd forkstart xprabro /var/snap/lxd/common/lxd/containers /var/snap/lxd/common/lxd/logs/xprabro/lxc.conf: "

And after a clean boot, I have a problem with containers that are stored on the LVM backend.
I need to manually deactivate the _tmeta and _tdata volumes to be able to start the corresponding containers (see the commands below).

$ lxc start messangers
WARNING: cgroup v2 is not fully supported yet, proceeding with partial confinement
Error: Failed preparing container for start: Failed to activate LVM logical volume "/dev/messangersTPoolVG/containers_messangers": Failed to run: lvchange --activate y --ignoreactivationskip /dev/messangersTPoolVG/containers_messangers: Activation of logical volume messangersTPoolVG/containers_messangers is prohibited while logical volume messangersTPoolVG/messangersTPool_tmeta is active.
Try `lxc info --show-log messangers` for more info

$ lxc info --show-log messangers
WARNING: cgroup v2 is not fully supported yet, proceeding with partial confinement
Name: messangers
Location: none
Remote: unix://
Architecture: x86_64
Created: 2021/03/17 07:11 UTC
Status: Stopped
Type: container
Profiles: pulseaudio, x11, shared_storage_messangers, messangers
Snapshots:
  210330 (taken at 2021/03/30 08:41 UTC) (stateless)

Log:

lxc 20210405075740.545 TRACE    commands - commands.c:lxc_cmd:302 - Connection refused - Command "get_state" failed to connect command socket
lxc 20210405075740.551 TRACE    commands - commands.c:lxc_cmd:302 - Connection refused - Command "get_state" failed to connect command socket
lxc 20210405075740.551 TRACE    commands - commands.c:lxc_cmd:302 - Connection refused - Command "get_state" failed to connect command socket

Run to fix:

# deactivate the thin pool's metadata and data sub-volumes
sudo lvchange -an messangersTPoolVG/messangersTPool_tmeta
sudo lvchange -an messangersTPoolVG/messangersTPool_tdata
# then activate the thin pool itself
sudo lvchange -ay messangersTPoolVG/messangersTPool

# check the result
sudo lvscan

This looks like the issue:

Can you show the output of sudo lvs, please?

$ sudo lvs
  LV                                                      VG                Attr       LSize   Pool            Origin                                             Data%  Meta%  Move Log Cpy%Sync Convert                                                                                    
  containers_xprabro                                      lvmTPoolVG01      Vwi---tz-k <46.57g lvmTPool01                                                                                                
  lvmTPool01                                              lvmTPoolVG01      twi---tz-- <48.00g                                                                                                           
  containers_messangers                                   messangersTPoolVG Vwi-aotz-k <18.63g messangersTPool containers_messangers-210406--before--silos        62.09                                  
  containers_messangers-210330--before--zoom              messangersTPoolVG Vri---tz-k <18.63g messangersTPool                                                                                           
  containers_messangers-210405--fixed--before--yay        messangersTPoolVG Vri---tz-k <18.63g messangersTPool                                                                                           
  containers_messangers-210406--before--silos             messangersTPoolVG Vri---tz-k <18.63g messangersTPool                                                                                           
  messangersTPool                                         messangersTPoolVG twi-aotz-- <18.00g                                                                    85.33  2.74

Thanks.

Do those problem LVs become activated on the next boot?

Yes, I have the same issue on the next boot.
P.S. All containers on LVM backends are affected, e.g. xprabro, too.

Can you show the output of sudo lvs -a after a fresh boot (before activating anything)? I’m going to compare it to my local system running LXD 4.12 on Ubuntu Focal.

E.g. on first boot after LXD has started (but before it has started any instances), I see:

 lvs -a 
  LV                                                                      VG  Attr       LSize  Pool        Origin Data%  Meta%  Move Log Cpy%Sync Convert
  LXDThinPool                                                             lvm twi-aotz-- <6.38g                    10.28  1.61                            
  [LXDThinPool_tdata]                                                     lvm Twi-ao---- <6.38g                                                           
  [LXDThinPool_tmeta]                                                     lvm ewi-ao----  1.00g  

The a in the Attr column indicates they are all activated.

If I deactivate the thinpool volume, both of the other associated volumes get deactivated too:

lvchange -an lvm/LXDThinPool
lvs -a 
  LV                                                                      VG  Attr       LSize  Pool        Origin Data%  Meta%  Move Log Cpy%Sync Convert
  LXDThinPool                                                             lvm twi---tz-- <6.38g                                                           
  [LXDThinPool_tdata]                                                     lvm Twi------- <6.38g                                                           
  [LXDThinPool_tmeta]                                                     lvm ewi-------  1.00g                                                           

So I’m not sure why your system is preventing the use of the thinpool when those are activated.

It could potentially be a mismatch between the version of LVM in your distro and the one bundled with the snap. You could try doing snap set lxd lvm.external=true and then rebooting to get LXD to use the LVM tools from your system.
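
I.e. something like:

sudo snap set lxd lvm.external=true
sudo reboot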

Ok, I’ll post it later.
But I had already enabled the external LVM tool when I moved to the LVM backend.

$ sudo snap get lxd lvm.external
true

$ sudo lvm version
  LVM version:     2.03.11(2) (2021-01-08)
  Library version: 1.02.175 (2021-01-08)
  Driver version:  4.43.0
  Configuration:   ./configure CONFIG_SHELL=/bin/bash --prefix=/usr --sbindir=/usr/bin --sysconfdir=/etc --localstatedir=/var --enable-cmdlib --enable-dmeventd --enable-lvmpolld --enable-pkgconfig --enable-readline --enable-udev_rules --enable-udev_sync --with-cache=internal --with-default-dm-run-dir=/run --with-default-locking-dir=/run/lock/lvm --with-default-pid-dir=/run --with-default-run-dir=/run/lvm --with-systemdsystemunitdir=/usr/lib/systemd/system --with-thin=internal --with-udev-prefix=/usr --enable-udev-systemd-background-jobs

Can you try creating a new LVM pool and seeing if it is afflicted with the same issue?

If it is, that suggests something is affecting all LVM pools on your system; if not, it suggests something particular to your original LVM pool.

Also, is it possible to look at the lvs -a output before LXD has started (by ensuring all instances are stopped and none are set to auto-start before rebooting)? That way we can see whether it is the system altering the LV state or the act of starting LXD.
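
E.g. something along these lines (the pool and instance names are just examples):

lxc storage create lvmtest lvm                      # fresh LVM pool (loop-backed by default)
lxc launch images:archlinux test-lvm -s lvmtest     # test instance on the new pool
lxc stop test-lvm
lxc config set test-lvm boot.autostart false        # ensure nothing auto-starts on reboot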

I have disabled the snap and rebooted, to make sure that LXD doesn’t alter the LVM state.
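
For reference, a sketch of what I ran:

sudo snap disable lxd
sudo reboot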

$ sudo lvs -a
  LV                                                      VG                Attr       LSize   Pool            Origin                                             Data%  Meta%  Move Log Cpy%Sync Convert                                                               
  containers_xprabro                                      lvmTPoolVG01      Vwi---tz-k <46.57g lvmTPool01                                                                                                
  lvmTPool01                                              lvmTPoolVG01      twi---tz-- <48.00g                                                                                                           
  [lvmTPool01_tdata]                                      lvmTPoolVG01      Twi-a----- <48.00g                                                                                                           
  [lvmTPool01_tmeta]                                      lvmTPoolVG01      ewi-a-----   1.00g                                                                                                           
  [lvol0_pmspare]                                         lvmTPoolVG01      ewi-------   1.00g                                                                                                           
  containers_messangers                                   messangersTPoolVG Vwi---tz-k <18.63g messangersTPool containers_messangers-210406--before--silos                                               
  containers_messangers-210330--before--zoom              messangersTPoolVG Vri---tz-k <18.63g messangersTPool                                                                                           
  containers_messangers-210405--fixed--before--yay        messangersTPoolVG Vri---tz-k <18.63g messangersTPool                                                                                           
  containers_messangers-210406--before--silos             messangersTPoolVG Vri---tz-k <18.63g messangersTPool                                                                                           
  [lvol0_pmspare]                                         messangersTPoolVG ewi-------   1.00g                                                                                                           
  messangersTPool                                         messangersTPoolVG twi---tz-- <18.00g                                                                                                           
  [messangersTPool_tdata]                                 messangersTPoolVG Twi-a----- <18.00g                                                                                                           
  [messangersTPool_tmeta]                                 messangersTPoolVG ewi-a-----   1.00g           

I see that the ‘o’ attribute is missing; if I’ve found the proper description, it means “device (o)pen (volume is in active state or may be mounted)”.
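
Side note: instead of decoding the attribute string, lvs can report the activation state directly; lv_active is a standard reporting field:

sudo lvs -a -o lv_name,vg_name,lv_attr,lv_active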

If a loop-backed (image) LVM pool is good enough for the test and it’s still necessary, then I’ll try it later.

UPD: Interesting thing. After a clean boot with LXD enabled again, I have the ‘o’ attribute on one LVM pool, and when I try to start the corresponding container I receive another error.

  LV                                                      VG                Attr       LSize   Pool            Origin                                             Data%  Meta%  Move Log Cpy%Sync Convert                                                                                                  
  containers_xprabro                                      lvmTPoolVG01      Vwi---tz-k <46.57g lvmTPool01                                                                                                
  lvmTPool01                                              lvmTPoolVG01      twi-i-tz-- <48.00g                                                                                                           
  [lvmTPool01_tdata]                                      lvmTPoolVG01      Twi-ao---- <48.00g                                                                                                           
  [lvmTPool01_tmeta]                                      lvmTPoolVG01      ewi-ao----   1.00g                                                                                                           
  [lvol0_pmspare]                                         lvmTPoolVG01      ewi-------   1.00g                                                                                                           
  containers_messangers                                   messangersTPoolVG Vwi---tz-k <18.63g messangersTPool containers_messangers-210406--before--silos                                               
  containers_messangers-210330--before--zoom              messangersTPoolVG Vri---tz-k <18.63g messangersTPool                                                                                           
  containers_messangers-210405--fixed--before--yay        messangersTPoolVG Vri---tz-k <18.63g messangersTPool                                                                                           
  containers_messangers-210406--before--silos             messangersTPoolVG Vri---tz-k <18.63g messangersTPool                                                                                           
  [lvol0_pmspare]                                         messangersTPoolVG ewi-------   1.00g                                                                                                           
  messangersTPool                                         messangersTPoolVG twi---tz-- <18.00g                                                                                                           
  [messangersTPool_tdata]                                 messangersTPoolVG Twi-a----- <18.00g                                                                                                           
  [messangersTPool_tmeta]                                 messangersTPoolVG ewi-a-----   1.00g                                                                                                           

$ lxc start xprabro
WARNING: cgroup v2 is not fully supported yet, proceeding with partial confinement
Error: Failed preparing container for start: Failed to activate LVM logical volume "/dev/lvmTPoolVG01/containers_xprabro": Failed to run: lvchange --activate y --ignoreactivationskip /dev/lvmTPoolVG01/containers_xprabro: device-mapper: reload ioctl on  (254:12) failed: Invalid argument
Try `lxc info --show-log xprabro` for more info

$ lxc info --show-log xprabro
WARNING: cgroup v2 is not fully supported yet, proceeding with partial confinement
Name: xprabro
Location: none
Remote: unix://
Architecture: x86_64
Created: 2021/04/05 14:40 UTC
Status: Stopped
Type: container
Profiles: xpraclient, shared_storage, x11

Log:

lxc 20210407051504.999 TRACE    commands - commands.c:lxc_cmd:302 - Connection refused - Command "get_state" failed to connect command socket
lxc 20210407051505.380 TRACE    commands - commands.c:lxc_cmd:302 - Connection refused - Command "get_state" failed to connect command socket
lxc 20210407051505.384 TRACE    commands - commands.c:lxc_cmd:302 - Connection refused - Command "get_state" failed to connect command socket

UPD2: A loop-backed (image) LVM pool is working as expected; when I find a spare disk to try LVM directly on, I’ll post an update.

Meanwhile, I have disabled lvm.external and the containers started with no worries.

$ sudo snap get lxd lvm.external
false
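
For anyone following along, switching back to the bundled LVM tools looks like this (a sketch; I believe the daemon needs a reload for the setting to take effect):

sudo snap set lxd lvm.external=false
sudo systemctl reload snap.lxd.daemon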

$ sudo lvs --all
  LV                                                      VG                Attr       LSize   Pool            Origin                                             Data%  Meta%  Move Log Cpy%Sync Convert                                                                                                      
  containers_xprabro                                      lvmTPoolVG01      Vwi---tz-k <46.57g lvmTPool01                                                                                                
  lvmTPool01                                              lvmTPoolVG01      twi---tz-- <48.00g                                                                                                           
  [lvmTPool01_tdata]                                      lvmTPoolVG01      Twi-a----- <48.00g                                                                                                           
  [lvmTPool01_tmeta]                                      lvmTPoolVG01      ewi-a-----   1.00g                                                                                                           
  [lvol0_pmspare]                                         lvmTPoolVG01      ewi-------   1.00g                                                                                                           
  containers_messangers                                   messangersTPoolVG Vwi-aotz-k <18.63g messangersTPool containers_messangers-210406--before--silos        62.51                                  
  containers_messangers-210330--before--zoom              messangersTPoolVG Vri---tz-k <18.63g messangersTPool                                                                                           
  containers_messangers-210405--fixed--before--yay        messangersTPoolVG Vri---tz-k <18.63g messangersTPool                                                                                           
  containers_messangers-210406--before--silos             messangersTPoolVG Vri---tz-k <18.63g messangersTPool                                                                                           
  [lvol0_pmspare]                                         messangersTPoolVG ewi-------   1.00g                                                                                                           
  messangersTPool                                         messangersTPoolVG twi---tz-- <18.00g                                                                    86.92  2.79                            
  [messangersTPool_tdata]                                 messangersTPoolVG Twi-ao---- <18.00g                                                                                                           
  [messangersTPool_tmeta]                                 messangersTPoolVG ewi-ao----   1.00g  

The ‘o’ attribute is set (automatically, as it should be) on the LVs whose containers are up; e.g. I haven’t started any container on lvmTPool01, so it has no ‘o’.
