ansible-playbook [core 2.12.6]
  config file = /etc/ansible/ansible.cfg
  configured module search path = ['/root/.ansible/plugins/modules', '/usr/share/ansible/plugins/modules']
  ansible python module location = /usr/lib/python3.9/site-packages/ansible
  ansible collection location = /tmp/tmpi19f9hzy
  executable location = /usr/bin/ansible-playbook
  python version = 3.9.13 (main, May 18 2022, 00:00:00) [GCC 11.3.1 20220421 (Red Hat 11.3.1-2)]
  jinja version = 2.11.3
  libyaml = True
Using /etc/ansible/ansible.cfg as config file
Skipping callback 'debug', as we already have a stdout callback.
Skipping callback 'default', as we already have a stdout callback.
Skipping callback 'minimal', as we already have a stdout callback.
Skipping callback 'oneline', as we already have a stdout callback.

PLAYBOOK: tests_change_disk_mount.yml ******************************************
1 plays in /tmp/tmpsvhdx31t/tests/tests_change_disk_mount.yml

PLAY [all] *********************************************************************

TASK [Gathering Facts] *********************************************************
task path: /tmp/tmpsvhdx31t/tests/tests_change_disk_mount.yml:2
Thursday 21 July 2022  07:14:04 +0000 (0:00:00.012)       0:00:00.012 ********* 
ok: [/cache/rhel-9.qcow2.snap]
META: ran handlers

TASK [include_role : linux-system-roles.storage] *******************************
task path: /tmp/tmpsvhdx31t/tests/tests_change_disk_mount.yml:10
Thursday 21 July 2022  07:14:05 +0000 (0:00:01.423)       0:00:01.436 ********* 

TASK [linux-system-roles.storage : set platform/version specific variables] ****
task path: /tmp/tmpsvhdx31t/tests/roles/linux-system-roles.storage/tasks/main.yml:2
Thursday 21 July 2022  07:14:05 +0000 (0:00:00.042)       0:00:01.479 ********* 
included: /tmp/tmpsvhdx31t/tests/roles/linux-system-roles.storage/tasks/set_vars.yml for /cache/rhel-9.qcow2.snap

TASK [linux-system-roles.storage : Ensure ansible_facts used by role] **********
task path: /tmp/tmpsvhdx31t/tests/roles/linux-system-roles.storage/tasks/set_vars.yml:2
Thursday 21 July 2022  07:14:05 +0000 (0:00:00.030)       0:00:01.509 ********* 
ok: [/cache/rhel-9.qcow2.snap]

TASK [linux-system-roles.storage : Set platform/version specific variables] ****
task path: /tmp/tmpsvhdx31t/tests/roles/linux-system-roles.storage/tasks/set_vars.yml:8
Thursday 21 July 2022  07:14:06 +0000 (0:00:00.519)       0:00:02.028 ********* 
skipping: [/cache/rhel-9.qcow2.snap] => (item=RedHat.yml)  => {
    "ansible_loop_var": "item",
    "changed": false,
    "item": "RedHat.yml",
    "skip_reason": "Conditional result was False"
}
skipping: [/cache/rhel-9.qcow2.snap] => (item=RedHat.yml)  => {
    "ansible_loop_var": "item",
    "changed": false,
    "item": "RedHat.yml",
    "skip_reason": "Conditional result was False"
}
ok: [/cache/rhel-9.qcow2.snap] => (item=RedHat_9.yml) => {
    "ansible_facts": {
        "blivet_package_list": [
            "python3-blivet",
            "libblockdev-crypto",
            "libblockdev-dm",
            "libblockdev-lvm",
            "libblockdev-mdraid",
            "libblockdev-swap",
            "vdo",
            "kmod-kvdo",
            "xfsprogs"
        ]
    },
    "ansible_included_var_files": [
        "/tmp/tmpsvhdx31t/tests/roles/linux-system-roles.storage/vars/RedHat_9.yml"
    ],
    "ansible_loop_var": "item",
    "changed": false,
    "item": "RedHat_9.yml"
}
skipping: [/cache/rhel-9.qcow2.snap] => (item=RedHat_9.0.yml)  => {
    "ansible_loop_var": "item",
    "changed": false,
    "item": "RedHat_9.0.yml",
    "skip_reason": "Conditional result was False"
}

TASK [linux-system-roles.storage : define an empty list of pools to be used in testing] ***
task path: /tmp/tmpsvhdx31t/tests/roles/linux-system-roles.storage/tasks/main.yml:5
Thursday 21 July 2022  07:14:06 +0000 (0:00:00.055)       0:00:02.084 ********* 
ok: [/cache/rhel-9.qcow2.snap] => {
    "ansible_facts": {
        "_storage_pools_list": []
    },
    "changed": false
}

TASK [linux-system-roles.storage : define an empty list of volumes to be used in testing] ***
task path: /tmp/tmpsvhdx31t/tests/roles/linux-system-roles.storage/tasks/main.yml:9
Thursday 21 July 2022  07:14:06 +0000 (0:00:00.029)       0:00:02.113 ********* 
ok: [/cache/rhel-9.qcow2.snap] => {
    "ansible_facts": {
        "_storage_volumes_list": []
    },
    "changed": false
}

TASK [linux-system-roles.storage : include the appropriate provider tasks] *****
task path: /tmp/tmpsvhdx31t/tests/roles/linux-system-roles.storage/tasks/main.yml:13
Thursday 21 July 2022  07:14:06 +0000 (0:00:00.037)       0:00:02.151 ********* 
redirecting (type: modules) ansible.builtin.mount to ansible.posix.mount
redirecting (type: modules) ansible.builtin.mount to ansible.posix.mount
included: /tmp/tmpsvhdx31t/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml for /cache/rhel-9.qcow2.snap

TASK [linux-system-roles.storage : get a list of rpm packages installed on host machine] ***
task path: /tmp/tmpsvhdx31t/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:2
Thursday 21 July 2022  07:14:06 +0000 (0:00:00.048)       0:00:02.199 ********* 
skipping: [/cache/rhel-9.qcow2.snap] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [linux-system-roles.storage : make sure blivet is available] **************
task path: /tmp/tmpsvhdx31t/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:7
Thursday 21 July 2022  07:14:06 +0000 (0:00:00.017)       0:00:02.216 ********* 
ok: [/cache/rhel-9.qcow2.snap] => {
    "changed": false,
    "rc": 0,
    "results": []
}

MSG:

Nothing to do

TASK [linux-system-roles.storage : show storage_pools] *************************
task path: /tmp/tmpsvhdx31t/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:14
Thursday 21 July 2022  07:14:07 +0000 (0:00:01.274)       0:00:03.491 ********* 
ok: [/cache/rhel-9.qcow2.snap] => {
    "storage_pools": "VARIABLE IS NOT DEFINED!: 'storage_pools' is undefined"
}

TASK [linux-system-roles.storage : show storage_volumes] ***********************
task path: /tmp/tmpsvhdx31t/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:19
Thursday 21 July 2022  07:14:07 +0000 (0:00:00.032)       0:00:03.524 ********* 
ok: [/cache/rhel-9.qcow2.snap] => {
    "storage_volumes": "VARIABLE IS NOT DEFINED!: 'storage_volumes' is undefined"
}

TASK [linux-system-roles.storage : get required packages] **********************
task path: /tmp/tmpsvhdx31t/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:24
Thursday 21 July 2022  07:14:07 +0000 (0:00:00.032)       0:00:03.556 ********* 
ok: [/cache/rhel-9.qcow2.snap] => {
    "actions": [],
    "changed": false,
    "crypts": [],
    "leaves": [],
    "mounts": [],
    "packages": [],
    "pools": [],
    "volumes": []
}

TASK [linux-system-roles.storage : enable copr repositories if needed] *********
task path: /tmp/tmpsvhdx31t/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:37
Thursday 21 July 2022  07:14:08 +0000 (0:00:00.724)       0:00:04.280 ********* 
included: /tmp/tmpsvhdx31t/tests/roles/linux-system-roles.storage/tasks/enable_coprs.yml for /cache/rhel-9.qcow2.snap

TASK [linux-system-roles.storage : check if the COPR support packages should be installed] ***
task path: /tmp/tmpsvhdx31t/tests/roles/linux-system-roles.storage/tasks/enable_coprs.yml:2
Thursday 21 July 2022  07:14:08 +0000 (0:00:00.041)       0:00:04.322 ********* 

TASK [linux-system-roles.storage : make sure COPR support packages are present] ***
task path: /tmp/tmpsvhdx31t/tests/roles/linux-system-roles.storage/tasks/enable_coprs.yml:13
Thursday 21 July 2022  07:14:08 +0000 (0:00:00.036)       0:00:04.359 ********* 
skipping: [/cache/rhel-9.qcow2.snap] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [linux-system-roles.storage : enable COPRs] *******************************
task path: /tmp/tmpsvhdx31t/tests/roles/linux-system-roles.storage/tasks/enable_coprs.yml:18
Thursday 21 July 2022  07:14:08 +0000 (0:00:00.041)       0:00:04.401 ********* 

TASK [linux-system-roles.storage : make sure required packages are installed] ***
task path: /tmp/tmpsvhdx31t/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:44
Thursday 21 July 2022  07:14:08 +0000 (0:00:00.033)       0:00:04.434 ********* 
ok: [/cache/rhel-9.qcow2.snap] => {
    "changed": false,
    "rc": 0,
    "results": []
}

MSG:

Nothing to do

TASK [linux-system-roles.storage : get service facts] **************************
task path: /tmp/tmpsvhdx31t/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:51
Thursday 21 July 2022  07:14:09 +0000 (0:00:00.948)       0:00:05.382 ********* 
ok: [/cache/rhel-9.qcow2.snap] => {
    "ansible_facts": {
        "services": {
            "NetworkManager-dispatcher.service": {
                "name": "NetworkManager-dispatcher.service",
                "source": "systemd",
                "state": "inactive",
                "status": "enabled"
            },
            "NetworkManager-wait-online.service": {
                "name": "NetworkManager-wait-online.service",
                "source": "systemd",
                "state": "stopped",
                "status": "enabled"
            },
            "NetworkManager.service": {
                "name": "NetworkManager.service",
                "source": "systemd",
                "state": "running",
                "status": "enabled"
            },
            "auditd.service": {
                "name": "auditd.service",
                "source": "systemd",
                "state": "running",
                "status": "enabled"
            },
            "auth-rpcgss-module.service": {
                "name": "auth-rpcgss-module.service",
                "source": "systemd",
                "state": "stopped",
                "status": "static"
            },
            "autofs.service": {
                "name": "autofs.service",
                "source": "systemd",
                "state": "stopped",
                "status": "not-found"
            },
            "autovt@.service": {
                "name": "autovt@.service",
                "source": "systemd",
                "state": "unknown",
                "status": "alias"
            },
            "blivet.service": {
                "name": "blivet.service",
                "source": "systemd",
                "state": "inactive",
                "status": "static"
            },
            "blk-availability.service": {
                "name": "blk-availability.service",
                "source": "systemd",
                "state": "stopped",
                "status": "disabled"
            },
            "chrony-wait.service": {
                "name": "chrony-wait.service",
                "source": "systemd",
                "state": "inactive",
                "status": "disabled"
            },
            "chronyd.service": {
                "name": "chronyd.service",
                "source": "systemd",
                "state": "running",
                "status": "enabled"
            },
            "cloud-config.service": {
                "name": "cloud-config.service",
                "source": "systemd",
                "state": "stopped",
                "status": "enabled"
            },
            "cloud-final.service": {
                "name": "cloud-final.service",
                "source": "systemd",
                "state": "stopped",
                "status": "enabled"
            },
            "cloud-init-local.service": {
                "name": "cloud-init-local.service",
                "source": "systemd",
                "state": "stopped",
                "status": "enabled"
            },
            "cloud-init.service": {
                "name": "cloud-init.service",
                "source": "systemd",
                "state": "stopped",
                "status": "enabled"
            },
            "cockpit-motd.service": {
                "name": "cockpit-motd.service",
                "source": "systemd",
                "state": "inactive",
                "status": "static"
            },
            "cockpit-wsinstance-http.service": {
                "name": "cockpit-wsinstance-http.service",
                "source": "systemd",
                "state": "inactive",
                "status": "static"
            },
            "cockpit-wsinstance-https-factory@.service": {
                "name": "cockpit-wsinstance-https-factory@.service",
                "source": "systemd",
                "state": "unknown",
                "status": "static"
            },
            "cockpit-wsinstance-https@.service": {
                "name": "cockpit-wsinstance-https@.service",
                "source": "systemd",
                "state": "unknown",
                "status": "static"
            },
            "cockpit.service": {
                "name": "cockpit.service",
                "source": "systemd",
                "state": "inactive",
                "status": "static"
            },
            "console-getty.service": {
                "name": "console-getty.service",
                "source": "systemd",
                "state": "inactive",
                "status": "disabled"
            },
            "container-getty@.service": {
                "name": "container-getty@.service",
                "source": "systemd",
                "state": "unknown",
                "status": "static"
            },
            "cpupower.service": {
                "name": "cpupower.service",
                "source": "systemd",
                "state": "stopped",
                "status": "disabled"
            },
            "crond.service": {
                "name": "crond.service",
                "source": "systemd",
                "state": "running",
                "status": "enabled"
            },
            "dbus-broker.service": {
                "name": "dbus-broker.service",
                "source": "systemd",
                "state": "running",
                "status": "enabled"
            },
            "dbus-org.freedesktop.hostname1.service": {
                "name": "dbus-org.freedesktop.hostname1.service",
                "source": "systemd",
                "state": "active",
                "status": "alias"
            },
            "dbus-org.freedesktop.locale1.service": {
                "name": "dbus-org.freedesktop.locale1.service",
                "source": "systemd",
                "state": "inactive",
                "status": "alias"
            },
            "dbus-org.freedesktop.login1.service": {
                "name": "dbus-org.freedesktop.login1.service",
                "source": "systemd",
                "state": "active",
                "status": "alias"
            },
            "dbus-org.freedesktop.nm-dispatcher.service": {
                "name": "dbus-org.freedesktop.nm-dispatcher.service",
                "source": "systemd",
                "state": "inactive",
                "status": "alias"
            },
            "dbus-org.freedesktop.timedate1.service": {
                "name": "dbus-org.freedesktop.timedate1.service",
                "source": "systemd",
                "state": "inactive",
                "status": "alias"
            },
            "dbus.service": {
                "name": "dbus.service",
                "source": "systemd",
                "state": "active",
                "status": "alias"
            },
            "debug-shell.service": {
                "name": "debug-shell.service",
                "source": "systemd",
                "state": "inactive",
                "status": "disabled"
            },
            "display-manager.service": {
                "name": "display-manager.service",
                "source": "systemd",
                "state": "stopped",
                "status": "not-found"
            },
            "dm-event.service": {
                "name": "dm-event.service",
                "source": "systemd",
                "state": "stopped",
                "status": "static"
            },
            "dnf-makecache.service": {
                "name": "dnf-makecache.service",
                "source": "systemd",
                "state": "stopped",
                "status": "static"
            },
            "dracut-cmdline.service": {
                "name": "dracut-cmdline.service",
                "source": "systemd",
                "state": "stopped",
                "status": "static"
            },
            "dracut-initqueue.service": {
                "name": "dracut-initqueue.service",
                "source": "systemd",
                "state": "stopped",
                "status": "static"
            },
            "dracut-mount.service": {
                "name": "dracut-mount.service",
                "source": "systemd",
                "state": "stopped",
                "status": "static"
            },
            "dracut-pre-mount.service": {
                "name": "dracut-pre-mount.service",
                "source": "systemd",
                "state": "stopped",
                "status": "static"
            },
            "dracut-pre-pivot.service": {
                "name": "dracut-pre-pivot.service",
                "source": "systemd",
                "state": "stopped",
                "status": "static"
            },
            "dracut-pre-trigger.service": {
                "name": "dracut-pre-trigger.service",
                "source": "systemd",
                "state": "stopped",
                "status": "static"
            },
            "dracut-pre-udev.service": {
                "name": "dracut-pre-udev.service",
                "source": "systemd",
                "state": "stopped",
                "status": "static"
            },
            "dracut-shutdown-onfailure.service": {
                "name": "dracut-shutdown-onfailure.service",
                "source": "systemd",
                "state": "stopped",
                "status": "static"
            },
            "dracut-shutdown.service": {
                "name": "dracut-shutdown.service",
                "source": "systemd",
                "state": "stopped",
                "status": "static"
            },
            "emergency.service": {
                "name": "emergency.service",
                "source": "systemd",
                "state": "stopped",
                "status": "static"
            },
            "fcoe.service": {
                "name": "fcoe.service",
                "source": "systemd",
                "state": "stopped",
                "status": "not-found"
            },
            "fstrim.service": {
                "name": "fstrim.service",
                "source": "systemd",
                "state": "inactive",
                "status": "static"
            },
            "fwupd-offline-update.service": {
                "name": "fwupd-offline-update.service",
                "source": "systemd",
                "state": "inactive",
                "status": "static"
            },
            "fwupd-refresh.service": {
                "name": "fwupd-refresh.service",
                "source": "systemd",
                "state": "inactive",
                "status": "static"
            },
            "fwupd.service": {
                "name": "fwupd.service",
                "source": "systemd",
                "state": "inactive",
                "status": "static"
            },
            "getty@.service": {
                "name": "getty@.service",
                "source": "systemd",
                "state": "unknown",
                "status": "enabled"
            },
            "getty@tty1.service": {
                "name": "getty@tty1.service",
                "source": "systemd",
                "state": "running",
                "status": "active"
            },
            "grub-boot-indeterminate.service": {
                "name": "grub-boot-indeterminate.service",
                "source": "systemd",
                "state": "inactive",
                "status": "static"
            },
            "grub2-systemd-integration.service": {
                "name": "grub2-systemd-integration.service",
                "source": "systemd",
                "state": "inactive",
                "status": "static"
            },
            "gssproxy.service": {
                "name": "gssproxy.service",
                "source": "systemd",
                "state": "running",
                "status": "disabled"
            },
            "initrd-cleanup.service": {
                "name": "initrd-cleanup.service",
                "source": "systemd",
                "state": "stopped",
                "status": "static"
            },
            "initrd-parse-etc.service": {
                "name": "initrd-parse-etc.service",
                "source": "systemd",
                "state": "stopped",
                "status": "static"
            },
            "initrd-switch-root.service": {
                "name": "initrd-switch-root.service",
                "source": "systemd",
                "state": "stopped",
                "status": "static"
            },
            "initrd-udevadm-cleanup-db.service": {
                "name": "initrd-udevadm-cleanup-db.service",
                "source": "systemd",
                "state": "stopped",
                "status": "static"
            },
            "insights-client-boot.service": {
                "name": "insights-client-boot.service",
                "source": "systemd",
                "state": "stopped",
                "status": "enabled"
            },
            "insights-client-results.service": {
                "name": "insights-client-results.service",
                "source": "systemd",
                "state": "inactive",
                "status": "static"
            },
            "insights-client.service": {
                "name": "insights-client.service",
                "source": "systemd",
                "state": "inactive",
                "status": "static"
            },
            "irqbalance.service": {
                "name": "irqbalance.service",
                "source": "systemd",
                "state": "running",
                "status": "enabled"
            },
            "iscsi-shutdown.service": {
                "name": "iscsi-shutdown.service",
                "source": "systemd",
                "state": "stopped",
                "status": "not-found"
            },
            "iscsi.service": {
                "name": "iscsi.service",
                "source": "systemd",
                "state": "stopped",
                "status": "not-found"
            },
            "iscsid.service": {
                "name": "iscsid.service",
                "source": "systemd",
                "state": "stopped",
                "status": "not-found"
            },
            "kdump.service": {
                "name": "kdump.service",
                "source": "systemd",
                "state": "stopped",
                "status": "enabled"
            },
            "kmod-static-nodes.service": {
                "name": "kmod-static-nodes.service",
                "source": "systemd",
                "state": "stopped",
                "status": "static"
            },
            "kvm_stat.service": {
                "name": "kvm_stat.service",
                "source": "systemd",
                "state": "inactive",
                "status": "disabled"
            },
            "ldconfig.service": {
                "name": "ldconfig.service",
                "source": "systemd",
                "state": "stopped",
                "status": "static"
            },
            "logrotate.service": {
                "name": "logrotate.service",
                "source": "systemd",
                "state": "stopped",
                "status": "static"
            },
            "lvm2-activation-early.service": {
                "name": "lvm2-activation-early.service",
                "source": "systemd",
                "state": "stopped",
                "status": "not-found"
            },
            "lvm2-lvmpolld.service": {
                "name": "lvm2-lvmpolld.service",
                "source": "systemd",
                "state": "stopped",
                "status": "static"
            },
            "lvm2-monitor.service": {
                "name": "lvm2-monitor.service",
                "source": "systemd",
                "state": "stopped",
                "status": "enabled"
            },
            "man-db-cache-update.service": {
                "name": "man-db-cache-update.service",
                "source": "systemd",
                "state": "inactive",
                "status": "static"
            },
            "man-db-restart-cache-update.service": {
                "name": "man-db-restart-cache-update.service",
                "source": "systemd",
                "state": "inactive",
                "status": "disabled"
            },
            "mdadm-grow-continue@.service": {
                "name": "mdadm-grow-continue@.service",
                "source": "systemd",
                "state": "unknown",
                "status": "static"
            },
            "mdadm-last-resort@.service": {
                "name": "mdadm-last-resort@.service",
                "source": "systemd",
                "state": "unknown",
                "status": "static"
            },
            "mdcheck_continue.service": {
                "name": "mdcheck_continue.service",
                "source": "systemd",
                "state": "inactive",
                "status": "static"
            },
            "mdcheck_start.service": {
                "name": "mdcheck_start.service",
                "source": "systemd",
                "state": "inactive",
                "status": "static"
            },
            "mdmon@.service": {
                "name": "mdmon@.service",
                "source": "systemd",
                "state": "unknown",
                "status": "static"
            },
            "mdmonitor-oneshot.service": {
                "name": "mdmonitor-oneshot.service",
                "source": "systemd",
                "state": "inactive",
                "status": "static"
            },
            "mdmonitor.service": {
                "name": "mdmonitor.service",
                "source": "systemd",
                "state": "stopped",
                "status": "enabled"
            },
            "microcode.service": {
                "name": "microcode.service",
                "source": "systemd",
                "state": "stopped",
                "status": "enabled"
            },
            "modprobe@.service": {
                "name": "modprobe@.service",
                "source": "systemd",
                "state": "unknown",
                "status": "static"
            },
            "modprobe@configfs.service": {
                "name": "modprobe@configfs.service",
                "source": "systemd",
                "state": "stopped",
                "status": "inactive"
            },
            "modprobe@drm.service": {
                "name": "modprobe@drm.service",
                "source": "systemd",
                "state": "stopped",
                "status": "inactive"
            },
            "modprobe@fuse.service": {
                "name": "modprobe@fuse.service",
                "source": "systemd",
                "state": "stopped",
                "status": "inactive"
            },
            "multipathd.service": {
                "name": "multipathd.service",
                "source": "systemd",
                "state": "stopped",
                "status": "enabled"
            },
            "ndctl-monitor.service": {
                "name": "ndctl-monitor.service",
                "source": "systemd",
                "state": "inactive",
                "status": "disabled"
            },
            "network.service": {
                "name": "network.service",
                "source": "systemd",
                "state": "stopped",
                "status": "not-found"
            },
            "nfs-blkmap.service": {
                "name": "nfs-blkmap.service",
                "source": "systemd",
                "state": "inactive",
                "status": "disabled"
            },
            "nfs-idmapd.service": {
                "name": "nfs-idmapd.service",
                "source": "systemd",
                "state": "stopped",
                "status": "static"
            },
            "nfs-mountd.service": {
                "name": "nfs-mountd.service",
                "source": "systemd",
                "state": "stopped",
                "status": "static"
            },
            "nfs-server.service": {
                "name": "nfs-server.service",
                "source": "systemd",
                "state": "stopped",
                "status": "disabled"
            },
            "nfs-utils.service": {
                "name": "nfs-utils.service",
                "source": "systemd",
                "state": "stopped",
                "status": "static"
            },
            "nfsdcld.service": {
                "name": "nfsdcld.service",
                "source": "systemd",
                "state": "stopped",
                "status": "static"
            },
            "nis-domainname.service": {
                "name": "nis-domainname.service",
                "source": "systemd",
                "state": "stopped",
                "status": "enabled"
            },
            "nm-priv-helper.service": {
                "name": "nm-priv-helper.service",
                "source": "systemd",
                "state": "inactive",
                "status": "static"
            },
            "ntpd.service": {
                "name": "ntpd.service",
                "source": "systemd",
                "state": "stopped",
                "status": "not-found"
            },
            "ntpdate.service": {
                "name": "ntpdate.service",
                "source": "systemd",
                "state": "stopped",
                "status": "not-found"
            },
            "oddjobd.service": {
                "name": "oddjobd.service",
                "source": "systemd",
                "state": "inactive",
                "status": "disabled"
            },
            "packagekit-offline-update.service": {
                "name": "packagekit-offline-update.service",
                "source": "systemd",
                "state": "inactive",
                "status": "static"
            },
            "packagekit.service": {
                "name": "packagekit.service",
                "source": "systemd",
                "state": "inactive",
                "status": "static"
            },
            "pam_namespace.service": {
                "name": "pam_namespace.service",
                "source": "systemd",
                "state": "inactive",
                "status": "static"
            },
            "plymouth-quit-wait.service": {
                "name": "plymouth-quit-wait.service",
                "source": "systemd",
                "state": "stopped",
                "status": "not-found"
            },
            "plymouth-start.service": {
                "name": "plymouth-start.service",
                "source": "systemd",
                "state": "stopped",
                "status": "not-found"
            },
            "polkit.service": {
                "name": "polkit.service",
                "source": "systemd",
                "state": "running",
                "status": "static"
            },
            "qemu-guest-agent.service": {
                "name": "qemu-guest-agent.service",
                "source": "systemd",
                "state": "inactive",
                "status": "enabled"
            },
            "quotaon.service": {
                "name": "quotaon.service",
                "source": "systemd",
                "state": "inactive",
                "status": "static"
            },
            "raid-check.service": {
                "name": "raid-check.service",
                "source": "systemd",
                "state": "inactive",
                "status": "static"
            },
            "rbdmap.service": {
                "name": "rbdmap.service",
                "source": "systemd",
                "state": "stopped",
                "status": "not-found"
            },
            "rc-local.service": {
                "name": "rc-local.service",
                "source": "systemd",
                "state": "stopped",
                "status": "static"
            },
            "rdisc.service": {
                "name": "rdisc.service",
                "source": "systemd",
                "state": "inactive",
                "status": "disabled"
            },
            "rescue.service": {
                "name": "rescue.service",
                "source": "systemd",
                "state": "stopped",
                "status": "static"
            },
            "rhsm-facts.service": {
                "name": "rhsm-facts.service",
                "source": "systemd",
                "state": "inactive",
                "status": "disabled"
            },
            "rhsm.service": {
                "name": "rhsm.service",
                "source": "systemd",
                "state": "inactive",
                "status": "disabled"
            },
            "rhsmcertd.service": {
                "name": "rhsmcertd.service",
                "source": "systemd",
                "state": "running",
                "status": "enabled"
            },
            "rpc-gssd.service": {
                "name": "rpc-gssd.service",
                "source": "systemd",
                "state": "stopped",
                "status": "static"
            },
            "rpc-statd-notify.service": {
                "name": "rpc-statd-notify.service",
                "source": "systemd",
                "state": "stopped",
                "status": "static"
            },
            "rpc-statd.service": {
                "name": "rpc-statd.service",
                "source": "systemd",
                "state": "stopped",
                "status": "static"
            },
            "rpc-svcgssd.service": {
                "name": "rpc-svcgssd.service",
                "source": "systemd",
                "state": "stopped",
                "status": "not-found"
            },
            "rpcbind.service": {
                "name": "rpcbind.service",
                "source": "systemd",
                "state": "running",
                "status": "enabled"
            },
            "rpmdb-rebuild.service": {
                "name": "rpmdb-rebuild.service",
                "source": "systemd",
                "state": "inactive",
                "status": "disabled"
            },
            "rsyslog.service": {
                "name": "rsyslog.service",
                "source": "systemd",
                "state": "running",
                "status": "enabled"
            },
            "selinux-autorelabel-mark.service": {
                "name": "selinux-autorelabel-mark.service",
                "source": "systemd",
                "state": "stopped",
                "status": "enabled"
            },
            "selinux-autorelabel.service": {
                "name": "selinux-autorelabel.service",
                "source": "systemd",
                "state": "inactive",
                "status": "static"
            },
            "serial-getty@.service": {
                "name": "serial-getty@.service",
                "source": "systemd",
                "state": "unknown",
                "status": "indirect"
            },
            "serial-getty@ttyS0.service": {
                "name": "serial-getty@ttyS0.service",
                "source": "systemd",
                "state": "running",
                "status": "active"
            },
            "sntp.service": {
                "name": "sntp.service",
                "source": "systemd",
                "state": "stopped",
                "status": "not-found"
            },
            "sshd-keygen.service": {
                "name": "sshd-keygen.service",
                "source": "systemd",
                "state": "stopped",
                "status": "not-found"
            },
            "sshd-keygen@.service": {
                "name": "sshd-keygen@.service",
                "source": "systemd",
                "state": "unknown",
                "status": "disabled"
            },
            "sshd-keygen@ecdsa.service": {
                "name": "sshd-keygen@ecdsa.service",
                "source": "systemd",
                "state": "stopped",
                "status": "inactive"
            },
            "sshd-keygen@ed25519.service": {
                "name": "sshd-keygen@ed25519.service",
                "source": "systemd",
                "state": "stopped",
                "status": "inactive"
            },
            "sshd-keygen@rsa.service": {
                "name": "sshd-keygen@rsa.service",
                "source": "systemd",
                "state": "stopped",
                "status": "inactive"
            },
            "sshd.service": {
                "name": "sshd.service",
                "source": "systemd",
                "state": "running",
                "status": "enabled"
            },
            "sshd@.service": {
                "name": "sshd@.service",
                "source": "systemd",
                "state": "unknown",
                "status": "static"
            },
            "sssd-autofs.service": {
                "name": "sssd-autofs.service",
                "source": "systemd",
                "state": "inactive",
                "status": "indirect"
            },
            "sssd-kcm.service": {
                "name": "sssd-kcm.service",
                "source": "systemd",
                "state": "stopped",
                "status": "indirect"
            },
            "sssd-nss.service": {
                "name": "sssd-nss.service",
                "source": "systemd",
                "state": "inactive",
                "status": "indirect"
            },
            "sssd-pac.service": {
                "name": "sssd-pac.service",
                "source": "systemd",
                "state": "inactive",
                "status": "indirect"
            },
            "sssd-pam.service": {
                "name": "sssd-pam.service",
                "source": "systemd",
                "state": "inactive",
                "status": "indirect"
            },
            "sssd-ssh.service": {
                "name": "sssd-ssh.service",
                "source": "systemd",
                "state": "inactive",
                "status": "indirect"
            },
            "sssd-sudo.service": {
                "name": "sssd-sudo.service",
                "source": "systemd",
                "state": "inactive",
                "status": "indirect"
            },
            "sssd.service": {
                "name": "sssd.service",
                "source": "systemd",
                "state": "stopped",
                "status": "enabled"
            },
            "syslog.service": {
                "name": "syslog.service",
                "source": "systemd",
                "state": "stopped",
                "status": "not-found"
            },
            "system-update-cleanup.service": {
                "name": "system-update-cleanup.service",
                "source": "systemd",
                "state": "inactive",
                "status": "static"
            },
            "systemd-ask-password-console.service": {
                "name": "systemd-ask-password-console.service",
                "source": "systemd",
                "state": "stopped",
                "status": "static"
            },
            "systemd-ask-password-wall.service": {
                "name": "systemd-ask-password-wall.service",
                "source": "systemd",
                "state": "stopped",
                "status": "static"
            },
            "systemd-backlight@.service": {
                "name": "systemd-backlight@.service",
                "source": "systemd",
                "state": "unknown",
                "status": "static"
            },
            "systemd-binfmt.service": {
                "name": "systemd-binfmt.service",
                "source": "systemd",
                "state": "stopped",
                "status": "static"
            },
            "systemd-boot-check-no-failures.service": {
                "name": "systemd-boot-check-no-failures.service",
                "source": "systemd",
                "state": "inactive",
                "status": "disabled"
            },
            "systemd-coredump@.service": {
                "name": "systemd-coredump@.service",
                "source": "systemd",
                "state": "unknown",
                "status": "static"
            },
            "systemd-exit.service": {
                "name": "systemd-exit.service",
                "source": "systemd",
                "state": "inactive",
                "status": "static"
            },
            "systemd-firstboot.service": {
                "name": "systemd-firstboot.service",
                "source": "systemd",
                "state": "stopped",
                "status": "static"
            },
            "systemd-fsck-root.service": {
                "name": "systemd-fsck-root.service",
                "source": "systemd",
                "state": "stopped",
                "status": "static"
            },
            "systemd-fsck@.service": {
                "name": "systemd-fsck@.service",
                "source": "systemd",
                "state": "unknown",
                "status": "static"
            },
            "systemd-fsck@dev-disk-by\\x2duuid-7B77\\x2d95E7.service": {
                "name": "systemd-fsck@dev-disk-by\\x2duuid-7B77\\x2d95E7.service",
                "source": "systemd",
                "state": "stopped",
                "status": "active"
            },
            "systemd-fsck@dev-vda2.service": {
                "name": "systemd-fsck@dev-vda2.service",
                "source": "systemd",
                "state": "stopped",
                "status": "inactive"
            },
            "systemd-fsck@dev-vdb1.service": {
                "name": "systemd-fsck@dev-vdb1.service",
                "source": "systemd",
                "state": "stopped",
                "status": "active"
            },
            "systemd-fsck@dev-vdc1.service": {
                "name": "systemd-fsck@dev-vdc1.service",
                "source": "systemd",
                "state": "stopped",
                "status": "active"
            },
            "systemd-halt.service": {
                "name": "systemd-halt.service",
                "source": "systemd",
                "state": "inactive",
                "status": "static"
            },
            "systemd-hibernate-resume@.service": {
                "name": "systemd-hibernate-resume@.service",
                "source": "systemd",
                "state": "unknown",
                "status": "static"
            },
            "systemd-hibernate.service": {
                "name": "systemd-hibernate.service",
                "source": "systemd",
                "state": "inactive",
                "status": "static"
            },
            "systemd-hostnamed.service": {
                "name": "systemd-hostnamed.service",
                "source": "systemd",
                "state": "running",
                "status": "static"
            },
            "systemd-hwdb-update.service": {
                "name": "systemd-hwdb-update.service",
                "source": "systemd",
                "state": "stopped",
                "status": "static"
            },
            "systemd-hybrid-sleep.service": {
                "name": "systemd-hybrid-sleep.service",
                "source": "systemd",
                "state": "inactive",
                "status": "static"
            },
            "systemd-initctl.service": {
                "name": "systemd-initctl.service",
                "source": "systemd",
                "state": "stopped",
                "status": "static"
            },
            "systemd-journal-catalog-update.service": {
                "name": "systemd-journal-catalog-update.service",
                "source": "systemd",
                "state": "stopped",
                "status": "static"
            },
            "systemd-journal-flush.service": {
                "name": "systemd-journal-flush.service",
                "source": "systemd",
                "state": "stopped",
                "status": "static"
            },
            "systemd-journald.service": {
                "name": "systemd-journald.service",
                "source": "systemd",
                "state": "running",
                "status": "static"
            },
            "systemd-journald@.service": {
                "name": "systemd-journald@.service",
                "source": "systemd",
                "state": "unknown",
                "status": "static"
            },
            "systemd-kexec.service": {
                "name": "systemd-kexec.service",
                "source": "systemd",
                "state": "inactive",
                "status": "static"
            },
            "systemd-localed.service": {
                "name": "systemd-localed.service",
                "source": "systemd",
                "state": "inactive",
                "status": "static"
            },
            "systemd-logind.service": {
                "name": "systemd-logind.service",
                "source": "systemd",
                "state": "running",
                "status": "static"
            },
            "systemd-machine-id-commit.service": {
                "name": "systemd-machine-id-commit.service",
                "source": "systemd",
                "state": "stopped",
                "status": "static"
            },
            "systemd-modules-load.service": {
                "name": "systemd-modules-load.service",
                "source": "systemd",
                "state": "stopped",
                "status": "static"
            },
            "systemd-network-generator.service": {
                "name": "systemd-network-generator.service",
                "source": "systemd",
                "state": "stopped",
                "status": "enabled"
            },
            "systemd-poweroff.service": {
                "name": "systemd-poweroff.service",
                "source": "systemd",
                "state": "inactive",
                "status": "static"
            },
            "systemd-pstore.service": {
                "name": "systemd-pstore.service",
                "source": "systemd",
                "state": "inactive",
                "status": "disabled"
            },
            "systemd-quotacheck.service": {
                "name": "systemd-quotacheck.service",
                "source": "systemd",
                "state": "stopped",
                "status": "static"
            },
            "systemd-random-seed.service": {
                "name": "systemd-random-seed.service",
                "source": "systemd",
                "state": "stopped",
                "status": "static"
            },
            "systemd-reboot.service": {
                "name": "systemd-reboot.service",
                "source": "systemd",
                "state": "inactive",
                "status": "static"
            },
            "systemd-remount-fs.service": {
                "name": "systemd-remount-fs.service",
                "source": "systemd",
                "state": "stopped",
                "status": "enabled-runtime"
            },
            "systemd-repart.service": {
                "name": "systemd-repart.service",
                "source": "systemd",
                "state": "stopped",
                "status": "static"
            },
            "systemd-rfkill.service": {
                "name": "systemd-rfkill.service",
                "source": "systemd",
                "state": "stopped",
                "status": "static"
            },
            "systemd-suspend-then-hibernate.service": {
                "name": "systemd-suspend-then-hibernate.service",
                "source": "systemd",
                "state": "inactive",
                "status": "static"
            },
            "systemd-suspend.service": {
                "name": "systemd-suspend.service",
                "source": "systemd",
                "state": "inactive",
                "status": "static"
            },
            "systemd-sysctl.service": {
                "name": "systemd-sysctl.service",
                "source": "systemd",
                "state": "stopped",
                "status": "static"
            },
            "systemd-sysext.service": {
                "name": "systemd-sysext.service",
                "source": "systemd",
                "state": "inactive",
                "status": "disabled"
            },
            "systemd-sysusers.service": {
                "name": "systemd-sysusers.service",
                "source": "systemd",
                "state": "stopped",
                "status": "static"
            },
            "systemd-timedated.service": {
                "name": "systemd-timedated.service",
                "source": "systemd",
                "state": "inactive",
                "status": "static"
            },
            "systemd-timesyncd.service": {
                "name": "systemd-timesyncd.service",
                "source": "systemd",
                "state": "stopped",
                "status": "not-found"
            },
            "systemd-tmpfiles-clean.service": {
                "name": "systemd-tmpfiles-clean.service",
                "source": "systemd",
                "state": "stopped",
                "status": "static"
            },
            "systemd-tmpfiles-setup-dev.service": {
                "name": "systemd-tmpfiles-setup-dev.service",
                "source": "systemd",
                "state": "stopped",
                "status": "static"
            },
            "systemd-tmpfiles-setup.service": {
                "name": "systemd-tmpfiles-setup.service",
                "source": "systemd",
                "state": "stopped",
                "status": "static"
            },
            "systemd-udev-settle.service": {
                "name": "systemd-udev-settle.service",
                "source": "systemd",
                "state": "stopped",
                "status": "static"
            },
            "systemd-udev-trigger.service": {
                "name": "systemd-udev-trigger.service",
                "source": "systemd",
                "state": "stopped",
                "status": "static"
            },
            "systemd-udevd.service": {
                "name": "systemd-udevd.service",
                "source": "systemd",
                "state": "running",
                "status": "static"
            },
            "systemd-update-done.service": {
                "name": "systemd-update-done.service",
                "source": "systemd",
                "state": "stopped",
                "status": "static"
            },
            "systemd-update-utmp-runlevel.service": {
                "name": "systemd-update-utmp-runlevel.service",
                "source": "systemd",
                "state": "stopped",
                "status": "static"
            },
            "systemd-update-utmp.service": {
                "name": "systemd-update-utmp.service",
                "source": "systemd",
                "state": "stopped",
                "status": "static"
            },
            "systemd-user-sessions.service": {
                "name": "systemd-user-sessions.service",
                "source": "systemd",
                "state": "stopped",
                "status": "static"
            },
            "systemd-vconsole-setup.service": {
                "name": "systemd-vconsole-setup.service",
                "source": "systemd",
                "state": "stopped",
                "status": "static"
            },
            "systemd-volatile-root.service": {
                "name": "systemd-volatile-root.service",
                "source": "systemd",
                "state": "inactive",
                "status": "static"
            },
            "teamd@.service": {
                "name": "teamd@.service",
                "source": "systemd",
                "state": "unknown",
                "status": "static"
            },
            "tuned.service": {
                "name": "tuned.service",
                "source": "systemd",
                "state": "running",
                "status": "enabled"
            },
            "user-runtime-dir@.service": {
                "name": "user-runtime-dir@.service",
                "source": "systemd",
                "state": "unknown",
                "status": "static"
            },
            "user-runtime-dir@0.service": {
                "name": "user-runtime-dir@0.service",
                "source": "systemd",
                "state": "stopped",
                "status": "active"
            },
            "user@.service": {
                "name": "user@.service",
                "source": "systemd",
                "state": "unknown",
                "status": "static"
            },
            "user@0.service": {
                "name": "user@0.service",
                "source": "systemd",
                "state": "running",
                "status": "active"
            },
            "ypbind.service": {
                "name": "ypbind.service",
                "source": "systemd",
                "state": "stopped",
                "status": "not-found"
            },
            "yppasswdd.service": {
                "name": "yppasswdd.service",
                "source": "systemd",
                "state": "stopped",
                "status": "not-found"
            },
            "ypserv.service": {
                "name": "ypserv.service",
                "source": "systemd",
                "state": "stopped",
                "status": "not-found"
            },
            "ypxfrd.service": {
                "name": "ypxfrd.service",
                "source": "systemd",
                "state": "stopped",
                "status": "not-found"
            }
        }
    },
    "changed": false
}

TASK [linux-system-roles.storage : Set storage_cryptsetup_services] ************
task path: /tmp/tmpsvhdx31t/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:58
Thursday 21 July 2022  07:14:11 +0000 (0:00:01.891)       0:00:07.273 ********* 
ok: [/cache/rhel-9.qcow2.snap] => {
    "ansible_facts": {
        "storage_cryptsetup_services": []
    },
    "changed": false
}

TASK [linux-system-roles.storage : Mask the systemd cryptsetup services] *******
task path: /tmp/tmpsvhdx31t/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:71
Thursday 21 July 2022  07:14:11 +0000 (0:00:00.055)       0:00:07.329 ********* 

TASK [linux-system-roles.storage : manage the pools and volumes to match the specified state] ***
task path: /tmp/tmpsvhdx31t/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:77
Thursday 21 July 2022  07:14:11 +0000 (0:00:00.021)       0:00:07.351 ********* 
ok: [/cache/rhel-9.qcow2.snap] => {
    "actions": [],
    "changed": false,
    "crypts": [],
    "leaves": [],
    "mounts": [],
    "packages": [],
    "pools": [],
    "volumes": []
}

TASK [linux-system-roles.storage : Workaround for udev issue on some platforms] ***
task path: /tmp/tmpsvhdx31t/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:91
Thursday 21 July 2022  07:14:12 +0000 (0:00:00.518)       0:00:07.869 ********* 
skipping: [/cache/rhel-9.qcow2.snap] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [linux-system-roles.storage : Unmask the systemd cryptsetup services] *****
task path: /tmp/tmpsvhdx31t/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:103
Thursday 21 July 2022  07:14:12 +0000 (0:00:00.036)       0:00:07.906 ********* 

TASK [linux-system-roles.storage : show blivet_output] *************************
task path: /tmp/tmpsvhdx31t/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:109
Thursday 21 July 2022  07:14:12 +0000 (0:00:00.022)       0:00:07.928 ********* 
ok: [/cache/rhel-9.qcow2.snap] => {
    "blivet_output": {
        "actions": [],
        "changed": false,
        "crypts": [],
        "failed": false,
        "leaves": [],
        "mounts": [],
        "packages": [],
        "pools": [],
        "volumes": []
    }
}

TASK [linux-system-roles.storage : set the list of pools for test verification] ***
task path: /tmp/tmpsvhdx31t/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:114
Thursday 21 July 2022  07:14:12 +0000 (0:00:00.034)       0:00:07.963 ********* 
ok: [/cache/rhel-9.qcow2.snap] => {
    "ansible_facts": {
        "_storage_pools_list": []
    },
    "changed": false
}

TASK [linux-system-roles.storage : set the list of volumes for test verification] ***
task path: /tmp/tmpsvhdx31t/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:118
Thursday 21 July 2022  07:14:12 +0000 (0:00:00.035)       0:00:07.998 ********* 
ok: [/cache/rhel-9.qcow2.snap] => {
    "ansible_facts": {
        "_storage_volumes_list": []
    },
    "changed": false
}

TASK [linux-system-roles.storage : remove obsolete mounts] *********************
task path: /tmp/tmpsvhdx31t/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:134
Thursday 21 July 2022  07:14:12 +0000 (0:00:00.035)       0:00:08.034 ********* 

TASK [linux-system-roles.storage : tell systemd to refresh its view of /etc/fstab] ***
task path: /tmp/tmpsvhdx31t/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:146
Thursday 21 July 2022  07:14:12 +0000 (0:00:00.038)       0:00:08.073 ********* 
skipping: [/cache/rhel-9.qcow2.snap] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [linux-system-roles.storage : set up new/current mounts] ******************
task path: /tmp/tmpsvhdx31t/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:151
Thursday 21 July 2022  07:14:12 +0000 (0:00:00.025)       0:00:08.098 ********* 

TASK [linux-system-roles.storage : tell systemd to refresh its view of /etc/fstab] ***
task path: /tmp/tmpsvhdx31t/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:163
Thursday 21 July 2022  07:14:12 +0000 (0:00:00.038)       0:00:08.137 ********* 
skipping: [/cache/rhel-9.qcow2.snap] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [linux-system-roles.storage : retrieve facts for the /etc/crypttab file] ***
task path: /tmp/tmpsvhdx31t/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:171
Thursday 21 July 2022  07:14:12 +0000 (0:00:00.024)       0:00:08.161 ********* 
ok: [/cache/rhel-9.qcow2.snap] => {
    "changed": false,
    "stat": {
        "exists": false
    }
}

TASK [linux-system-roles.storage : manage /etc/crypttab to account for changes we just made] ***
task path: /tmp/tmpsvhdx31t/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:176
Thursday 21 July 2022  07:14:13 +0000 (0:00:00.525)       0:00:08.686 ********* 

TASK [linux-system-roles.storage : Update facts] *******************************
task path: /tmp/tmpsvhdx31t/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:198
Thursday 21 July 2022  07:14:13 +0000 (0:00:00.021)       0:00:08.707 ********* 
ok: [/cache/rhel-9.qcow2.snap]
META: role_complete for /cache/rhel-9.qcow2.snap

TASK [Mark tasks to be skipped] ************************************************
task path: /tmp/tmpsvhdx31t/tests/tests_change_disk_mount.yml:13
Thursday 21 July 2022  07:14:14 +0000 (0:00:01.003)       0:00:09.711 ********* 
ok: [/cache/rhel-9.qcow2.snap] => {
    "ansible_facts": {
        "storage_skip_checks": [
            "blivet_available",
            "packages_installed",
            "service_facts"
        ]
    },
    "changed": false
}

TASK [include_tasks] ***********************************************************
task path: /tmp/tmpsvhdx31t/tests/tests_change_disk_mount.yml:20
Thursday 21 July 2022  07:14:14 +0000 (0:00:00.078)       0:00:09.790 ********* 
included: /tmp/tmpsvhdx31t/tests/get_unused_disk.yml for /cache/rhel-9.qcow2.snap

TASK [Find unused disks in the system] *****************************************
task path: /tmp/tmpsvhdx31t/tests/get_unused_disk.yml:2
Thursday 21 July 2022  07:14:14 +0000 (0:00:00.073)       0:00:09.864 ********* 
ok: [/cache/rhel-9.qcow2.snap] => {
    "changed": false,
    "disks": [
        "sda"
    ]
}

TASK [Set unused_disks if necessary] *******************************************
task path: /tmp/tmpsvhdx31t/tests/get_unused_disk.yml:9
Thursday 21 July 2022  07:14:14 +0000 (0:00:00.489)       0:00:10.353 ********* 
ok: [/cache/rhel-9.qcow2.snap] => {
    "ansible_facts": {
        "unused_disks": [
            "sda"
        ]
    },
    "changed": false
}

TASK [Exit playbook when there's not enough unused disks in the system] ********
task path: /tmp/tmpsvhdx31t/tests/get_unused_disk.yml:14
Thursday 21 July 2022  07:14:14 +0000 (0:00:00.037)       0:00:10.390 ********* 
skipping: [/cache/rhel-9.qcow2.snap] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [Print unused disks] ******************************************************
task path: /tmp/tmpsvhdx31t/tests/get_unused_disk.yml:19
Thursday 21 July 2022  07:14:14 +0000 (0:00:00.037)       0:00:10.428 ********* 
ok: [/cache/rhel-9.qcow2.snap] => {
    "unused_disks": [
        "sda"
    ]
}

TASK [Create a disk device mounted at "/opt/test1"] ****************************
task path: /tmp/tmpsvhdx31t/tests/tests_change_disk_mount.yml:25
Thursday 21 July 2022  07:14:14 +0000 (0:00:00.035)       0:00:10.464 ********* 

TASK [linux-system-roles.storage : set platform/version specific variables] ****
task path: /tmp/tmpsvhdx31t/tests/roles/linux-system-roles.storage/tasks/main.yml:2
Thursday 21 July 2022  07:14:14 +0000 (0:00:00.043)       0:00:10.507 ********* 
included: /tmp/tmpsvhdx31t/tests/roles/linux-system-roles.storage/tasks/set_vars.yml for /cache/rhel-9.qcow2.snap

TASK [linux-system-roles.storage : Ensure ansible_facts used by role] **********
task path: /tmp/tmpsvhdx31t/tests/roles/linux-system-roles.storage/tasks/set_vars.yml:2
Thursday 21 July 2022  07:14:14 +0000 (0:00:00.034)       0:00:10.541 ********* 
ok: [/cache/rhel-9.qcow2.snap]

TASK [linux-system-roles.storage : Set platform/version specific variables] ****
task path: /tmp/tmpsvhdx31t/tests/roles/linux-system-roles.storage/tasks/set_vars.yml:8
Thursday 21 July 2022  07:14:15 +0000 (0:00:00.519)       0:00:11.061 ********* 
skipping: [/cache/rhel-9.qcow2.snap] => (item=RedHat.yml)  => {
    "ansible_loop_var": "item",
    "changed": false,
    "item": "RedHat.yml",
    "skip_reason": "Conditional result was False"
}
skipping: [/cache/rhel-9.qcow2.snap] => (item=RedHat.yml)  => {
    "ansible_loop_var": "item",
    "changed": false,
    "item": "RedHat.yml",
    "skip_reason": "Conditional result was False"
}
ok: [/cache/rhel-9.qcow2.snap] => (item=RedHat_9.yml) => {
    "ansible_facts": {
        "blivet_package_list": [
            "python3-blivet",
            "libblockdev-crypto",
            "libblockdev-dm",
            "libblockdev-lvm",
            "libblockdev-mdraid",
            "libblockdev-swap",
            "vdo",
            "kmod-kvdo",
            "xfsprogs"
        ]
    },
    "ansible_included_var_files": [
        "/tmp/tmpsvhdx31t/tests/roles/linux-system-roles.storage/vars/RedHat_9.yml"
    ],
    "ansible_loop_var": "item",
    "changed": false,
    "item": "RedHat_9.yml"
}
skipping: [/cache/rhel-9.qcow2.snap] => (item=RedHat_9.0.yml)  => {
    "ansible_loop_var": "item",
    "changed": false,
    "item": "RedHat_9.0.yml",
    "skip_reason": "Conditional result was False"
}

TASK [linux-system-roles.storage : define an empty list of pools to be used in testing] ***
task path: /tmp/tmpsvhdx31t/tests/roles/linux-system-roles.storage/tasks/main.yml:5
Thursday 21 July 2022  07:14:15 +0000 (0:00:00.060)       0:00:11.121 ********* 
ok: [/cache/rhel-9.qcow2.snap] => {
    "ansible_facts": {
        "_storage_pools_list": []
    },
    "changed": false
}

TASK [linux-system-roles.storage : define an empty list of volumes to be used in testing] ***
task path: /tmp/tmpsvhdx31t/tests/roles/linux-system-roles.storage/tasks/main.yml:9
Thursday 21 July 2022  07:14:15 +0000 (0:00:00.032)       0:00:11.153 ********* 
ok: [/cache/rhel-9.qcow2.snap] => {
    "ansible_facts": {
        "_storage_volumes_list": []
    },
    "changed": false
}

TASK [linux-system-roles.storage : include the appropriate provider tasks] *****
task path: /tmp/tmpsvhdx31t/tests/roles/linux-system-roles.storage/tasks/main.yml:13
Thursday 21 July 2022  07:14:15 +0000 (0:00:00.034)       0:00:11.188 ********* 
redirecting (type: modules) ansible.builtin.mount to ansible.posix.mount
redirecting (type: modules) ansible.builtin.mount to ansible.posix.mount
included: /tmp/tmpsvhdx31t/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml for /cache/rhel-9.qcow2.snap

TASK [linux-system-roles.storage : get a list of rpm packages installed on host machine] ***
task path: /tmp/tmpsvhdx31t/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:2
Thursday 21 July 2022  07:14:15 +0000 (0:00:00.047)       0:00:11.236 ********* 
skipping: [/cache/rhel-9.qcow2.snap] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [linux-system-roles.storage : make sure blivet is available] **************
task path: /tmp/tmpsvhdx31t/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:7
Thursday 21 July 2022  07:14:15 +0000 (0:00:00.022)       0:00:11.258 ********* 
skipping: [/cache/rhel-9.qcow2.snap] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [linux-system-roles.storage : show storage_pools] *************************
task path: /tmp/tmpsvhdx31t/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:14
Thursday 21 July 2022  07:14:15 +0000 (0:00:00.030)       0:00:11.289 ********* 
ok: [/cache/rhel-9.qcow2.snap] => {
    "storage_pools": "VARIABLE IS NOT DEFINED!: 'storage_pools' is undefined"
}

TASK [linux-system-roles.storage : show storage_volumes] ***********************
task path: /tmp/tmpsvhdx31t/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:19
Thursday 21 July 2022  07:14:15 +0000 (0:00:00.040)       0:00:11.329 ********* 
ok: [/cache/rhel-9.qcow2.snap] => {
    "storage_volumes": [
        {
            "disks": [
                "sda"
            ],
            "mount_point": "/opt/test1",
            "name": "test1",
            "type": "disk"
        }
    ]
}
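
For context, the storage_volumes value shown above maps directly onto the role invocation this test performs. Below is a minimal sketch of such an invocation, reconstructed from the logged variable rather than from the actual contents of tests_change_disk_mount.yml; the use of include_role and of the unused_disks variable are assumptions:

- name: Create a disk device mounted at "/opt/test1"
  include_role:
    name: linux-system-roles.storage
  vars:
    storage_volumes:
      - name: test1
        type: disk
        disks: "{{ unused_disks }}"   # assumed; resolves to ['sda'] in this run (set in get_unused_disk.yml)
        mount_point: /opt/test1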

TASK [linux-system-roles.storage : get required packages] **********************
task path: /tmp/tmpsvhdx31t/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:24
Thursday 21 July 2022  07:14:15 +0000 (0:00:00.039)       0:00:11.369 ********* 
skipping: [/cache/rhel-9.qcow2.snap] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [linux-system-roles.storage : enable copr repositories if needed] *********
task path: /tmp/tmpsvhdx31t/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:37
Thursday 21 July 2022  07:14:15 +0000 (0:00:00.028)       0:00:11.398 ********* 
skipping: [/cache/rhel-9.qcow2.snap] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [linux-system-roles.storage : make sure required packages are installed] ***
task path: /tmp/tmpsvhdx31t/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:44
Thursday 21 July 2022  07:14:15 +0000 (0:00:00.028)       0:00:11.426 ********* 
skipping: [/cache/rhel-9.qcow2.snap] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [linux-system-roles.storage : get service facts] **************************
task path: /tmp/tmpsvhdx31t/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:51
Thursday 21 July 2022  07:14:15 +0000 (0:00:00.028)       0:00:11.455 ********* 
skipping: [/cache/rhel-9.qcow2.snap] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [linux-system-roles.storage : Set storage_cryptsetup_services] ************
task path: /tmp/tmpsvhdx31t/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:58
Thursday 21 July 2022  07:14:15 +0000 (0:00:00.034)       0:00:11.489 ********* 
ok: [/cache/rhel-9.qcow2.snap] => {
    "ansible_facts": {
        "storage_cryptsetup_services": []
    },
    "changed": false
}

TASK [linux-system-roles.storage : Mask the systemd cryptsetup services] *******
task path: /tmp/tmpsvhdx31t/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:71
Thursday 21 July 2022  07:14:15 +0000 (0:00:00.069)       0:00:11.559 ********* 

TASK [linux-system-roles.storage : manage the pools and volumes to match the specified state] ***
task path: /tmp/tmpsvhdx31t/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:77
Thursday 21 July 2022  07:14:16 +0000 (0:00:00.022)       0:00:11.581 ********* 
changed: [/cache/rhel-9.qcow2.snap] => {
    "actions": [
        {
            "action": "create format",
            "device": "/dev/sda",
            "fs_type": "xfs"
        }
    ],
    "changed": true,
    "crypts": [],
    "leaves": [
        "/dev/sr0",
        "/dev/vda1",
        "/dev/vda2",
        "/dev/vda3",
        "/dev/vda4",
        "/dev/sda",
        "/dev/sdb",
        "/dev/sdc",
        "/dev/nvme0n1",
        "/dev/nvme1n1",
        "/dev/nvme2n1",
        "/dev/vdb1",
        "/dev/vdc1",
        "/dev/vdd",
        "/dev/vde",
        "/dev/vdf"
    ],
    "mounts": [
        {
            "dump": 0,
            "fstype": "xfs",
            "opts": "defaults",
            "passno": 0,
            "path": "/opt/test1",
            "src": "UUID=31c6244a-312c-4308-8a54-e9662b6a934a",
            "state": "mounted"
        }
    ],
    "packages": [
        "xfsprogs",
        "e2fsprogs",
        "dosfstools"
    ],
    "pools": [],
    "volumes": [
        {
            "_device": "/dev/sda",
            "_kernel_device": "/dev/sda",
            "_mount_id": "UUID=31c6244a-312c-4308-8a54-e9662b6a934a",
            "_raw_device": "/dev/sda",
            "_raw_kernel_device": "/dev/sda",
            "cache_devices": [],
            "cache_mode": null,
            "cache_size": 0,
            "cached": false,
            "compression": null,
            "deduplication": null,
            "disks": [
                "sda"
            ],
            "encryption": false,
            "encryption_cipher": null,
            "encryption_key": null,
            "encryption_key_size": null,
            "encryption_luks_version": null,
            "encryption_password": null,
            "fs_create_options": "",
            "fs_label": "",
            "fs_overwrite_existing": true,
            "fs_type": "xfs",
            "mount_check": 0,
            "mount_device_identifier": "uuid",
            "mount_options": "defaults",
            "mount_passno": 0,
            "mount_point": "/opt/test1",
            "name": "test1",
            "raid_chunk_size": null,
            "raid_device_count": null,
            "raid_level": null,
            "raid_metadata_version": null,
            "raid_spare_count": null,
            "size": 10737418240,
            "state": "present",
            "thin": null,
            "thin_pool_name": null,
            "thin_pool_size": null,
            "type": "disk",
            "vdo_pool_size": null
        }
    ]
}

TASK [linux-system-roles.storage : Workaround for udev issue on some platforms] ***
task path: /tmp/tmpsvhdx31t/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:91
Thursday 21 July 2022  07:14:17 +0000 (0:00:01.742)       0:00:13.323 ********* 
skipping: [/cache/rhel-9.qcow2.snap] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [linux-system-roles.storage : Unmask the systemd cryptsetup services] *****
task path: /tmp/tmpsvhdx31t/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:103
Thursday 21 July 2022  07:14:17 +0000 (0:00:00.037)       0:00:13.360 ********* 

TASK [linux-system-roles.storage : show blivet_output] *************************
task path: /tmp/tmpsvhdx31t/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:109
Thursday 21 July 2022  07:14:17 +0000 (0:00:00.023)       0:00:13.383 ********* 
ok: [/cache/rhel-9.qcow2.snap] => {
    "blivet_output": {
        "actions": [
            {
                "action": "create format",
                "device": "/dev/sda",
                "fs_type": "xfs"
            }
        ],
        "changed": true,
        "crypts": [],
        "failed": false,
        "leaves": [
            "/dev/sr0",
            "/dev/vda1",
            "/dev/vda2",
            "/dev/vda3",
            "/dev/vda4",
            "/dev/sda",
            "/dev/sdb",
            "/dev/sdc",
            "/dev/nvme0n1",
            "/dev/nvme1n1",
            "/dev/nvme2n1",
            "/dev/vdb1",
            "/dev/vdc1",
            "/dev/vdd",
            "/dev/vde",
            "/dev/vdf"
        ],
        "mounts": [
            {
                "dump": 0,
                "fstype": "xfs",
                "opts": "defaults",
                "passno": 0,
                "path": "/opt/test1",
                "src": "UUID=31c6244a-312c-4308-8a54-e9662b6a934a",
                "state": "mounted"
            }
        ],
        "packages": [
            "xfsprogs",
            "e2fsprogs",
            "dosfstools"
        ],
        "pools": [],
        "volumes": [
            {
                "_device": "/dev/sda",
                "_kernel_device": "/dev/sda",
                "_mount_id": "UUID=31c6244a-312c-4308-8a54-e9662b6a934a",
                "_raw_device": "/dev/sda",
                "_raw_kernel_device": "/dev/sda",
                "cache_devices": [],
                "cache_mode": null,
                "cache_size": 0,
                "cached": false,
                "compression": null,
                "deduplication": null,
                "disks": [
                    "sda"
                ],
                "encryption": false,
                "encryption_cipher": null,
                "encryption_key": null,
                "encryption_key_size": null,
                "encryption_luks_version": null,
                "encryption_password": null,
                "fs_create_options": "",
                "fs_label": "",
                "fs_overwrite_existing": true,
                "fs_type": "xfs",
                "mount_check": 0,
                "mount_device_identifier": "uuid",
                "mount_options": "defaults",
                "mount_passno": 0,
                "mount_point": "/opt/test1",
                "name": "test1",
                "raid_chunk_size": null,
                "raid_device_count": null,
                "raid_level": null,
                "raid_metadata_version": null,
                "raid_spare_count": null,
                "size": 10737418240,
                "state": "present",
                "thin": null,
                "thin_pool_name": null,
                "thin_pool_size": null,
                "type": "disk",
                "vdo_pool_size": null
            }
        ]
    }
}

TASK [linux-system-roles.storage : set the list of pools for test verification] ***
task path: /tmp/tmpsvhdx31t/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:114
Thursday 21 July 2022  07:14:17 +0000 (0:00:00.037)       0:00:13.421 ********* 
ok: [/cache/rhel-9.qcow2.snap] => {
    "ansible_facts": {
        "_storage_pools_list": []
    },
    "changed": false
}

TASK [linux-system-roles.storage : set the list of volumes for test verification] ***
task path: /tmp/tmpsvhdx31t/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:118
Thursday 21 July 2022  07:14:17 +0000 (0:00:00.037)       0:00:13.458 ********* 
ok: [/cache/rhel-9.qcow2.snap] => {
    "ansible_facts": {
        "_storage_volumes_list": [
            {
                "_device": "/dev/sda",
                "_kernel_device": "/dev/sda",
                "_mount_id": "UUID=31c6244a-312c-4308-8a54-e9662b6a934a",
                "_raw_device": "/dev/sda",
                "_raw_kernel_device": "/dev/sda",
                "cache_devices": [],
                "cache_mode": null,
                "cache_size": 0,
                "cached": false,
                "compression": null,
                "deduplication": null,
                "disks": [
                    "sda"
                ],
                "encryption": false,
                "encryption_cipher": null,
                "encryption_key": null,
                "encryption_key_size": null,
                "encryption_luks_version": null,
                "encryption_password": null,
                "fs_create_options": "",
                "fs_label": "",
                "fs_overwrite_existing": true,
                "fs_type": "xfs",
                "mount_check": 0,
                "mount_device_identifier": "uuid",
                "mount_options": "defaults",
                "mount_passno": 0,
                "mount_point": "/opt/test1",
                "name": "test1",
                "raid_chunk_size": null,
                "raid_device_count": null,
                "raid_level": null,
                "raid_metadata_version": null,
                "raid_spare_count": null,
                "size": 10737418240,
                "state": "present",
                "thin": null,
                "thin_pool_name": null,
                "thin_pool_size": null,
                "type": "disk",
                "vdo_pool_size": null
            }
        ]
    },
    "changed": false
}

TASK [linux-system-roles.storage : remove obsolete mounts] *********************
task path: /tmp/tmpsvhdx31t/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:134
Thursday 21 July 2022  07:14:17 +0000 (0:00:00.038)       0:00:13.497 ********* 

TASK [linux-system-roles.storage : tell systemd to refresh its view of /etc/fstab] ***
task path: /tmp/tmpsvhdx31t/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:146
Thursday 21 July 2022  07:14:17 +0000 (0:00:00.039)       0:00:13.536 ********* 
ok: [/cache/rhel-9.qcow2.snap] => {
    "changed": false,
    "name": null,
    "status": {}
}
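
The empty "name": null / "status": {} result above is what a plain systemd daemon-reload returns. A minimal sketch of that kind of task, assuming the ansible.builtin.systemd module (the role's actual task file is not reproduced in this log):

- name: Tell systemd to refresh its view of /etc/fstab
  ansible.builtin.systemd:
    daemon_reload: true   # reloads the systemd manager configuration so the new fstab entry is picked up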

TASK [linux-system-roles.storage : set up new/current mounts] ******************
task path: /tmp/tmpsvhdx31t/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:151
Thursday 21 July 2022  07:14:18 +0000 (0:00:00.946)       0:00:14.483 ********* 
redirecting (type: modules) ansible.builtin.mount to ansible.posix.mount
changed: [/cache/rhel-9.qcow2.snap] => (item={'src': 'UUID=31c6244a-312c-4308-8a54-e9662b6a934a', 'path': '/opt/test1', 'fstype': 'xfs', 'opts': 'defaults', 'dump': 0, 'passno': 0, 'state': 'mounted'}) => {
    "ansible_loop_var": "mount_info",
    "backup_file": "",
    "boot": "yes",
    "changed": true,
    "dump": "0",
    "fstab": "/etc/fstab",
    "fstype": "xfs",
    "mount_info": {
        "dump": 0,
        "fstype": "xfs",
        "opts": "defaults",
        "passno": 0,
        "path": "/opt/test1",
        "src": "UUID=31c6244a-312c-4308-8a54-e9662b6a934a",
        "state": "mounted"
    },
    "name": "/opt/test1",
    "opts": "defaults",
    "passno": "0",
    "src": "UUID=31c6244a-312c-4308-8a54-e9662b6a934a"
}
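
The mount set up above goes through ansible.posix.mount (note the module redirect logged just before the result). An equivalent standalone task, using the values from this particular run (the UUID is specific to this test system):

- name: Mount the new xfs filesystem at /opt/test1
  ansible.posix.mount:
    src: UUID=31c6244a-312c-4308-8a54-e9662b6a934a
    path: /opt/test1
    fstype: xfs
    opts: defaults
    state: mounted   # writes the fstab entry and mounts it immediately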

TASK [linux-system-roles.storage : tell systemd to refresh its view of /etc/fstab] ***
task path: /tmp/tmpsvhdx31t/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:163
Thursday 21 July 2022  07:14:19 +0000 (0:00:00.519)       0:00:15.002 ********* 
ok: [/cache/rhel-9.qcow2.snap] => {
    "changed": false,
    "name": null,
    "status": {}
}

TASK [linux-system-roles.storage : retrieve facts for the /etc/crypttab file] ***
task path: /tmp/tmpsvhdx31t/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:171
Thursday 21 July 2022  07:14:20 +0000 (0:00:00.634)       0:00:15.637 ********* 
ok: [/cache/rhel-9.qcow2.snap] => {
    "changed": false,
    "stat": {
        "exists": false
    }
}

TASK [linux-system-roles.storage : manage /etc/crypttab to account for changes we just made] ***
task path: /tmp/tmpsvhdx31t/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:176
Thursday 21 July 2022  07:14:20 +0000 (0:00:00.356)       0:00:15.994 ********* 

TASK [linux-system-roles.storage : Update facts] *******************************
task path: /tmp/tmpsvhdx31t/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:198
Thursday 21 July 2022  07:14:20 +0000 (0:00:00.023)       0:00:16.017 ********* 
ok: [/cache/rhel-9.qcow2.snap]
META: role_complete for /cache/rhel-9.qcow2.snap

TASK [include_tasks] ***********************************************************
task path: /tmp/tmpsvhdx31t/tests/tests_change_disk_mount.yml:35
Thursday 21 July 2022  07:14:21 +0000 (0:00:00.976)       0:00:16.994 ********* 
included: /tmp/tmpsvhdx31t/tests/verify-role-results.yml for /cache/rhel-9.qcow2.snap

TASK [Print out pool information] **********************************************
task path: /tmp/tmpsvhdx31t/tests/verify-role-results.yml:1
Thursday 21 July 2022  07:14:21 +0000 (0:00:00.041)       0:00:17.035 ********* 
skipping: [/cache/rhel-9.qcow2.snap] => {}

TASK [Print out volume information] ********************************************
task path: /tmp/tmpsvhdx31t/tests/verify-role-results.yml:6
Thursday 21 July 2022  07:14:21 +0000 (0:00:00.036)       0:00:17.072 ********* 
ok: [/cache/rhel-9.qcow2.snap] => {
    "_storage_volumes_list": [
        {
            "_device": "/dev/sda",
            "_kernel_device": "/dev/sda",
            "_mount_id": "UUID=31c6244a-312c-4308-8a54-e9662b6a934a",
            "_raw_device": "/dev/sda",
            "_raw_kernel_device": "/dev/sda",
            "cache_devices": [],
            "cache_mode": null,
            "cache_size": 0,
            "cached": false,
            "compression": null,
            "deduplication": null,
            "disks": [
                "sda"
            ],
            "encryption": false,
            "encryption_cipher": null,
            "encryption_key": null,
            "encryption_key_size": null,
            "encryption_luks_version": null,
            "encryption_password": null,
            "fs_create_options": "",
            "fs_label": "",
            "fs_overwrite_existing": true,
            "fs_type": "xfs",
            "mount_check": 0,
            "mount_device_identifier": "uuid",
            "mount_options": "defaults",
            "mount_passno": 0,
            "mount_point": "/opt/test1",
            "name": "test1",
            "raid_chunk_size": null,
            "raid_device_count": null,
            "raid_level": null,
            "raid_metadata_version": null,
            "raid_spare_count": null,
            "size": 10737418240,
            "state": "present",
            "thin": null,
            "thin_pool_name": null,
            "thin_pool_size": null,
            "type": "disk",
            "vdo_pool_size": null
        }
    ]
}

TASK [Collect info about the volumes.] *****************************************
task path: /tmp/tmpsvhdx31t/tests/verify-role-results.yml:14
Thursday 21 July 2022  07:14:21 +0000 (0:00:00.082)       0:00:17.155 ********* 
ok: [/cache/rhel-9.qcow2.snap] => {
    "changed": false,
    "info": {
        "/dev/nvme0n1": {
            "fstype": "",
            "label": "",
            "name": "/dev/nvme0n1",
            "size": "10G",
            "type": "disk",
            "uuid": ""
        },
        "/dev/nvme1n1": {
            "fstype": "",
            "label": "",
            "name": "/dev/nvme1n1",
            "size": "10G",
            "type": "disk",
            "uuid": ""
        },
        "/dev/nvme2n1": {
            "fstype": "",
            "label": "",
            "name": "/dev/nvme2n1",
            "size": "10G",
            "type": "disk",
            "uuid": ""
        },
        "/dev/sda": {
            "fstype": "xfs",
            "label": "",
            "name": "/dev/sda",
            "size": "10G",
            "type": "disk",
            "uuid": "31c6244a-312c-4308-8a54-e9662b6a934a"
        },
        "/dev/sdb": {
            "fstype": "",
            "label": "",
            "name": "/dev/sdb",
            "size": "10G",
            "type": "disk",
            "uuid": ""
        },
        "/dev/sdc": {
            "fstype": "",
            "label": "",
            "name": "/dev/sdc",
            "size": "10G",
            "type": "disk",
            "uuid": ""
        },
        "/dev/sr0": {
            "fstype": "iso9660",
            "label": "cidata",
            "name": "/dev/sr0",
            "size": "364K",
            "type": "rom",
            "uuid": "2022-07-21-07-13-51-00"
        },
        "/dev/vda": {
            "fstype": "",
            "label": "",
            "name": "/dev/vda",
            "size": "10G",
            "type": "disk",
            "uuid": ""
        },
        "/dev/vda1": {
            "fstype": "",
            "label": "",
            "name": "/dev/vda1",
            "size": "1M",
            "type": "partition",
            "uuid": ""
        },
        "/dev/vda2": {
            "fstype": "vfat",
            "label": "",
            "name": "/dev/vda2",
            "size": "200M",
            "type": "partition",
            "uuid": "7B77-95E7"
        },
        "/dev/vda3": {
            "fstype": "xfs",
            "label": "boot",
            "name": "/dev/vda3",
            "size": "500M",
            "type": "partition",
            "uuid": "6e74e171-0370-451f-8340-f16ad2839183"
        },
        "/dev/vda4": {
            "fstype": "xfs",
            "label": "root",
            "name": "/dev/vda4",
            "size": "9.3G",
            "type": "partition",
            "uuid": "1bb53e4d-984c-4316-908a-59b5a62fa30e"
        },
        "/dev/vdb": {
            "fstype": "",
            "label": "",
            "name": "/dev/vdb",
            "size": "2G",
            "type": "disk",
            "uuid": ""
        },
        "/dev/vdb1": {
            "fstype": "ext4",
            "label": "yumcache",
            "name": "/dev/vdb1",
            "size": "2G",
            "type": "partition",
            "uuid": "12be66f7-7f03-4d54-9a48-a520825d8fa9"
        },
        "/dev/vdc": {
            "fstype": "",
            "label": "",
            "name": "/dev/vdc",
            "size": "2G",
            "type": "disk",
            "uuid": ""
        },
        "/dev/vdc1": {
            "fstype": "ext4",
            "label": "yumvarlib",
            "name": "/dev/vdc1",
            "size": "2G",
            "type": "partition",
            "uuid": "9bd037ed-b1bb-4b9c-a008-9f07d34ab071"
        },
        "/dev/vdd": {
            "fstype": "",
            "label": "",
            "name": "/dev/vdd",
            "size": "10G",
            "type": "disk",
            "uuid": ""
        },
        "/dev/vde": {
            "fstype": "",
            "label": "",
            "name": "/dev/vde",
            "size": "10G",
            "type": "disk",
            "uuid": ""
        },
        "/dev/vdf": {
            "fstype": "",
            "label": "",
            "name": "/dev/vdf",
            "size": "10G",
            "type": "disk",
            "uuid": ""
        }
    }
}

TASK [Read the /etc/fstab file for volume existence] ***************************
task path: /tmp/tmpsvhdx31t/tests/verify-role-results.yml:19
Thursday 21 July 2022  07:14:22 +0000 (0:00:00.495)       0:00:17.651 ********* 
ok: [/cache/rhel-9.qcow2.snap] => {
    "changed": false,
    "cmd": [
        "cat",
        "/etc/fstab"
    ],
    "delta": "0:00:00.003069",
    "end": "2022-07-21 03:14:21.860544",
    "rc": 0,
    "start": "2022-07-21 03:14:21.857475"
}

STDOUT:

UUID=6e74e171-0370-451f-8340-f16ad2839183	/boot	xfs	defaults	0	0
UUID=1bb53e4d-984c-4316-908a-59b5a62fa30e	/	xfs	defaults	0	0
UUID=7B77-95E7	/boot/efi	vfat	defaults,uid=0,gid=0,umask=077,shortname=winnt	0	2
/dev/vdb1	/var/cache/dnf	auto	defaults,nofail,x-systemd.requires=cloud-init.service,_netdev,comment=cloudconfig	0	2
/dev/vdc1	/var/lib/dnf	auto	defaults,nofail,x-systemd.requires=cloud-init.service,_netdev,comment=cloudconfig	0	2
UUID=31c6244a-312c-4308-8a54-e9662b6a934a /opt/test1 xfs defaults 0 0

TASK [Read the /etc/crypttab file] *********************************************
task path: /tmp/tmpsvhdx31t/tests/verify-role-results.yml:24
Thursday 21 July 2022  07:14:22 +0000 (0:00:00.492)       0:00:18.143 ********* 
ok: [/cache/rhel-9.qcow2.snap] => {
    "changed": false,
    "cmd": [
        "cat",
        "/etc/crypttab"
    ],
    "delta": "0:00:00.003450",
    "end": "2022-07-21 03:14:22.228765",
    "failed_when_result": false,
    "rc": 1,
    "start": "2022-07-21 03:14:22.225315"
}

STDERR:

cat: /etc/crypttab: No such file or directory


MSG:

non-zero return code
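
The rc=1 from cat combined with "failed_when_result": false shows that a missing /etc/crypttab is tolerated rather than treated as a test failure. A sketch of how such a read is commonly expressed (the register name and the exact failed_when expression are assumptions, not the actual contents of verify-role-results.yml):

- name: Read the /etc/crypttab file
  ansible.builtin.command: cat /etc/crypttab
  register: crypttab                         # hypothetical variable name
  failed_when: crypttab.rc not in [0, 1]     # only rc > 1 (a real error) fails the task
  changed_when: false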

TASK [Verify the volumes listed in storage_pools were correctly managed] *******
task path: /tmp/tmpsvhdx31t/tests/verify-role-results.yml:33
Thursday 21 July 2022  07:14:22 +0000 (0:00:00.365)       0:00:18.509 ********* 

TASK [Verify the volumes with no pool were correctly managed] ******************
task path: /tmp/tmpsvhdx31t/tests/verify-role-results.yml:43
Thursday 21 July 2022  07:14:22 +0000 (0:00:00.022)       0:00:18.532 ********* 
included: /tmp/tmpsvhdx31t/tests/test-verify-volume.yml for /cache/rhel-9.qcow2.snap => (item={'encryption': False, 'encryption_cipher': None, 'encryption_key': None, 'encryption_key_size': None, 'encryption_luks_version': None, 'encryption_password': None, 'fs_create_options': '', 'fs_label': '', 'fs_type': 'xfs', 'mount_options': 'defaults', 'mount_point': '/opt/test1', 'name': 'test1', 'raid_level': None, 'size': 10737418240, 'state': 'present', 'type': 'disk', 'disks': ['sda'], 'raid_device_count': None, 'raid_spare_count': None, 'raid_metadata_version': None, 'fs_overwrite_existing': True, 'mount_check': 0, 'mount_passno': 0, 'mount_device_identifier': 'uuid', 'raid_chunk_size': None, 'compression': None, 'deduplication': None, 'vdo_pool_size': None, 'thin': None, 'thin_pool_name': None, 'thin_pool_size': None, 'cached': False, 'cache_size': 0, 'cache_mode': None, 'cache_devices': [], '_device': '/dev/sda', '_raw_device': '/dev/sda', '_mount_id': 'UUID=31c6244a-312c-4308-8a54-e9662b6a934a', '_kernel_device': '/dev/sda', '_raw_kernel_device': '/dev/sda'})

TASK [set_fact] ****************************************************************
task path: /tmp/tmpsvhdx31t/tests/test-verify-volume.yml:2
Thursday 21 July 2022  07:14:23 +0000 (0:00:00.062)       0:00:18.595 ********* 
ok: [/cache/rhel-9.qcow2.snap] => {
    "ansible_facts": {
        "_storage_test_volume_present": true,
        "_storage_volume_tests": [
            "mount",
            "fstab",
            "fs",
            "device",
            "encryption",
            "md",
            "size",
            "cache"
        ]
    },
    "changed": false
}

TASK [include_tasks] ***********************************************************
task path: /tmp/tmpsvhdx31t/tests/test-verify-volume.yml:10
Thursday 21 July 2022  07:14:23 +0000 (0:00:00.048)       0:00:18.643 ********* 
included: /tmp/tmpsvhdx31t/tests/test-verify-volume-mount.yml for /cache/rhel-9.qcow2.snap => (item=mount)
included: /tmp/tmpsvhdx31t/tests/test-verify-volume-fstab.yml for /cache/rhel-9.qcow2.snap => (item=fstab)
included: /tmp/tmpsvhdx31t/tests/test-verify-volume-fs.yml for /cache/rhel-9.qcow2.snap => (item=fs)
included: /tmp/tmpsvhdx31t/tests/test-verify-volume-device.yml for /cache/rhel-9.qcow2.snap => (item=device)
included: /tmp/tmpsvhdx31t/tests/test-verify-volume-encryption.yml for /cache/rhel-9.qcow2.snap => (item=encryption)
included: /tmp/tmpsvhdx31t/tests/test-verify-volume-md.yml for /cache/rhel-9.qcow2.snap => (item=md)
included: /tmp/tmpsvhdx31t/tests/test-verify-volume-size.yml for /cache/rhel-9.qcow2.snap => (item=size)
included: /tmp/tmpsvhdx31t/tests/test-verify-volume-cache.yml for /cache/rhel-9.qcow2.snap => (item=cache)
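
The eight entries in _storage_volume_tests drive the per-area includes listed just above; each item selects a test-verify-volume-<item>.yml file. A sketch consistent with the logged includes (the exact task at test-verify-volume.yml:10 may differ):

- name: include_tasks
  include_tasks: "test-verify-volume-{{ item }}.yml"
  loop: "{{ _storage_volume_tests }}"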

TASK [Get expected mount device based on device type] **************************
task path: /tmp/tmpsvhdx31t/tests/test-verify-volume-mount.yml:6
Thursday 21 July 2022  07:14:23 +0000 (0:00:00.076)       0:00:18.720 ********* 
ok: [/cache/rhel-9.qcow2.snap] => {
    "ansible_facts": {
        "storage_test_device_path": "/dev/sda"
    },
    "changed": false
}

TASK [Set some facts] **********************************************************
task path: /tmp/tmpsvhdx31t/tests/test-verify-volume-mount.yml:14
Thursday 21 July 2022  07:14:23 +0000 (0:00:00.042)       0:00:18.762 ********* 
ok: [/cache/rhel-9.qcow2.snap] => {
    "ansible_facts": {
        "storage_test_mount_device_matches": [
            {
                "block_available": 2592358,
                "block_size": 4096,
                "block_total": 2618880,
                "block_used": 26522,
                "device": "/dev/sda",
                "fstype": "xfs",
                "inode_available": 5242877,
                "inode_total": 5242880,
                "inode_used": 3,
                "mount": "/opt/test1",
                "options": "rw,seclabel,relatime,attr2,inode64,logbufs=8,logbsize=32k,noquota",
                "size_available": 10618298368,
                "size_total": 10726932480,
                "uuid": "31c6244a-312c-4308-8a54-e9662b6a934a"
            }
        ],
        "storage_test_mount_expected_match_count": "1",
        "storage_test_mount_point_matches": [
            {
                "block_available": 2592358,
                "block_size": 4096,
                "block_total": 2618880,
                "block_used": 26522,
                "device": "/dev/sda",
                "fstype": "xfs",
                "inode_available": 5242877,
                "inode_total": 5242880,
                "inode_used": 3,
                "mount": "/opt/test1",
                "options": "rw,seclabel,relatime,attr2,inode64,logbufs=8,logbsize=32k,noquota",
                "size_available": 10618298368,
                "size_total": 10726932480,
                "uuid": "31c6244a-312c-4308-8a54-e9662b6a934a"
            }
        ],
        "storage_test_swap_expected_matches": "0"
    },
    "changed": false
}

TASK [Verify the current mount state by device] ********************************
task path: /tmp/tmpsvhdx31t/tests/test-verify-volume-mount.yml:28
Thursday 21 July 2022  07:14:23 +0000 (0:00:00.055)       0:00:18.818 ********* 
ok: [/cache/rhel-9.qcow2.snap] => {
    "changed": false
}

MSG:

All assertions passed

TASK [Verify the current mount state by mount point] ***************************
task path: /tmp/tmpsvhdx31t/tests/test-verify-volume-mount.yml:37
Thursday 21 July 2022  07:14:23 +0000 (0:00:00.050)       0:00:18.869 ********* 
ok: [/cache/rhel-9.qcow2.snap] => {
    "changed": false
}

MSG:

All assertions passed

TASK [Verify the mount fs type] ************************************************
task path: /tmp/tmpsvhdx31t/tests/test-verify-volume-mount.yml:45
Thursday 21 July 2022  07:14:23 +0000 (0:00:00.051)       0:00:18.920 ********* 
ok: [/cache/rhel-9.qcow2.snap] => {
    "changed": false
}

MSG:

All assertions passed

TASK [command] *****************************************************************
task path: /tmp/tmpsvhdx31t/tests/test-verify-volume-mount.yml:54
Thursday 21 July 2022  07:14:23 +0000 (0:00:00.051)       0:00:18.971 ********* 
skipping: [/cache/rhel-9.qcow2.snap] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [Gather swap info] ********************************************************
task path: /tmp/tmpsvhdx31t/tests/test-verify-volume-mount.yml:58
Thursday 21 July 2022  07:14:23 +0000 (0:00:00.024)       0:00:18.995 ********* 
skipping: [/cache/rhel-9.qcow2.snap] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [Verify swap status] ******************************************************
task path: /tmp/tmpsvhdx31t/tests/test-verify-volume-mount.yml:63
Thursday 21 July 2022  07:14:23 +0000 (0:00:00.024)       0:00:19.020 ********* 
skipping: [/cache/rhel-9.qcow2.snap] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [Unset facts] *************************************************************
task path: /tmp/tmpsvhdx31t/tests/test-verify-volume-mount.yml:75
Thursday 21 July 2022  07:14:23 +0000 (0:00:00.035)       0:00:19.056 ********* 
ok: [/cache/rhel-9.qcow2.snap] => {
    "ansible_facts": {
        "storage_test_mount_device_matches": null,
        "storage_test_mount_expected_match_count": null,
        "storage_test_mount_point_matches": null,
        "storage_test_swap_expected_matches": null,
        "storage_test_swaps": null,
        "storage_test_sys_node": null
    },
    "changed": false
}

TASK [Set some variables for fstab checking] ***********************************
task path: /tmp/tmpsvhdx31t/tests/test-verify-volume-fstab.yml:2
Thursday 21 July 2022  07:14:23 +0000 (0:00:00.038)       0:00:19.094 ********* 
ok: [/cache/rhel-9.qcow2.snap] => {
    "ansible_facts": {
        "storage_test_fstab_expected_id_matches": "1",
        "storage_test_fstab_expected_mount_options_matches": "1",
        "storage_test_fstab_expected_mount_point_matches": "1",
        "storage_test_fstab_id_matches": [
            "UUID=31c6244a-312c-4308-8a54-e9662b6a934a "
        ],
        "storage_test_fstab_mount_options_matches": [
            " /opt/test1 xfs defaults "
        ],
        "storage_test_fstab_mount_point_matches": [
            " /opt/test1 "
        ]
    },
    "changed": false
}

TASK [Verify that the device identifier appears in /etc/fstab] *****************
task path: /tmp/tmpsvhdx31t/tests/test-verify-volume-fstab.yml:25
Thursday 21 July 2022  07:14:23 +0000 (0:00:00.064)       0:00:19.158 ********* 
ok: [/cache/rhel-9.qcow2.snap] => {
    "changed": false
}

MSG:

All assertions passed

TASK [Verify the fstab mount point] ********************************************
task path: /tmp/tmpsvhdx31t/tests/test-verify-volume-fstab.yml:32
Thursday 21 July 2022  07:14:23 +0000 (0:00:00.052)       0:00:19.210 ********* 
ok: [/cache/rhel-9.qcow2.snap] => {
    "changed": false
}

MSG:

All assertions passed

TASK [Verify mount_options] ****************************************************
task path: /tmp/tmpsvhdx31t/tests/test-verify-volume-fstab.yml:39
Thursday 21 July 2022  07:14:23 +0000 (0:00:00.050)       0:00:19.261 ********* 
skipping: [/cache/rhel-9.qcow2.snap] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [Clean up variables] ******************************************************
task path: /tmp/tmpsvhdx31t/tests/test-verify-volume-fstab.yml:49
Thursday 21 July 2022  07:14:23 +0000 (0:00:00.035)       0:00:19.296 ********* 
ok: [/cache/rhel-9.qcow2.snap] => {
    "ansible_facts": {
        "storage_test_fstab_expected_id_matches": null,
        "storage_test_fstab_expected_mount_options_matches": null,
        "storage_test_fstab_expected_mount_point_matches": null,
        "storage_test_fstab_id_matches": null,
        "storage_test_fstab_mount_options_matches": null,
        "storage_test_fstab_mount_point_matches": null
    },
    "changed": false
}

TASK [Verify fs type] **********************************************************
task path: /tmp/tmpsvhdx31t/tests/test-verify-volume-fs.yml:4
Thursday 21 July 2022  07:14:23 +0000 (0:00:00.081)       0:00:19.378 ********* 
ok: [/cache/rhel-9.qcow2.snap] => {
    "changed": false
}

MSG:

All assertions passed

TASK [Verify fs label] *********************************************************
task path: /tmp/tmpsvhdx31t/tests/test-verify-volume-fs.yml:10
Thursday 21 July 2022  07:14:23 +0000 (0:00:00.069)       0:00:19.447 ********* 
ok: [/cache/rhel-9.qcow2.snap] => {
    "changed": false
}

MSG:

All assertions passed

TASK [See whether the device node is present] **********************************
task path: /tmp/tmpsvhdx31t/tests/test-verify-volume-device.yml:4
Thursday 21 July 2022  07:14:23 +0000 (0:00:00.039)       0:00:19.486 ********* 
ok: [/cache/rhel-9.qcow2.snap] => {
    "changed": false,
    "stat": {
        "atime": 1658387656.9902117,
        "attr_flags": "",
        "attributes": [],
        "block_size": 4096,
        "blocks": 0,
        "charset": "binary",
        "ctime": 1658387656.9902117,
        "dev": 5,
        "device_type": 2048,
        "executable": false,
        "exists": true,
        "gid": 6,
        "gr_name": "disk",
        "inode": 363,
        "isblk": true,
        "ischr": false,
        "isdir": false,
        "isfifo": false,
        "isgid": false,
        "islnk": false,
        "isreg": false,
        "issock": false,
        "isuid": false,
        "mimetype": "inode/blockdevice",
        "mode": "0660",
        "mtime": 1658387656.9902117,
        "nlink": 1,
        "path": "/dev/sda",
        "pw_name": "root",
        "readable": true,
        "rgrp": true,
        "roth": false,
        "rusr": true,
        "size": 0,
        "uid": 0,
        "version": null,
        "wgrp": true,
        "woth": false,
        "writeable": true,
        "wusr": true,
        "xgrp": false,
        "xoth": false,
        "xusr": false
    }
}

TASK [Verify the presence/absence of the device node] **************************
task path: /tmp/tmpsvhdx31t/tests/test-verify-volume-device.yml:10
Thursday 21 July 2022  07:14:24 +0000 (0:00:00.382)       0:00:19.868 ********* 
ok: [/cache/rhel-9.qcow2.snap] => {
    "changed": false
}

MSG:

All assertions passed

TASK [Make sure we got info about this volume] *********************************
task path: /tmp/tmpsvhdx31t/tests/test-verify-volume-device.yml:18
Thursday 21 July 2022  07:14:24 +0000 (0:00:00.038)       0:00:19.906 ********* 
ok: [/cache/rhel-9.qcow2.snap] => {
    "changed": false
}

MSG:

All assertions passed

TASK [(1/2) Process volume type (set initial value)] ***************************
task path: /tmp/tmpsvhdx31t/tests/test-verify-volume-device.yml:24
Thursday 21 July 2022  07:14:24 +0000 (0:00:00.036)       0:00:19.943 ********* 
ok: [/cache/rhel-9.qcow2.snap] => {
    "ansible_facts": {
        "st_volume_type": "disk"
    },
    "changed": false
}

TASK [(2/2) Process volume type (get RAID value)] ******************************
task path: /tmp/tmpsvhdx31t/tests/test-verify-volume-device.yml:28
Thursday 21 July 2022  07:14:24 +0000 (0:00:00.037)       0:00:19.981 ********* 
skipping: [/cache/rhel-9.qcow2.snap] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [Verify the volume's device type] *****************************************
task path: /tmp/tmpsvhdx31t/tests/test-verify-volume-device.yml:33
Thursday 21 July 2022  07:14:24 +0000 (0:00:00.024)       0:00:20.005 ********* 
ok: [/cache/rhel-9.qcow2.snap] => {
    "changed": false
}

MSG:

All assertions passed

TASK [Stat the LUKS device, if encrypted] **************************************
task path: /tmp/tmpsvhdx31t/tests/test-verify-volume-encryption.yml:3
Thursday 21 July 2022  07:14:24 +0000 (0:00:00.039)       0:00:20.045 ********* 
skipping: [/cache/rhel-9.qcow2.snap] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [Ensure cryptsetup is present] ********************************************
task path: /tmp/tmpsvhdx31t/tests/test-verify-volume-encryption.yml:10
Thursday 21 July 2022  07:14:24 +0000 (0:00:00.023)       0:00:20.069 ********* 
ok: [/cache/rhel-9.qcow2.snap] => {
    "changed": false,
    "rc": 0,
    "results": []
}

MSG:

Nothing to do

TASK [Collect LUKS info for this volume] ***************************************
task path: /tmp/tmpsvhdx31t/tests/test-verify-volume-encryption.yml:15
Thursday 21 July 2022  07:14:25 +0000 (0:00:00.871)       0:00:20.940 ********* 
skipping: [/cache/rhel-9.qcow2.snap] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [Verify the presence/absence of the LUKS device node] *********************
task path: /tmp/tmpsvhdx31t/tests/test-verify-volume-encryption.yml:21
Thursday 21 July 2022  07:14:25 +0000 (0:00:00.023)       0:00:20.963 ********* 
skipping: [/cache/rhel-9.qcow2.snap] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [Verify that the raw device is the same as the device if not encrypted] ***
task path: /tmp/tmpsvhdx31t/tests/test-verify-volume-encryption.yml:30
Thursday 21 July 2022  07:14:25 +0000 (0:00:00.022)       0:00:20.986 ********* 
ok: [/cache/rhel-9.qcow2.snap] => {
    "changed": false
}

MSG:

All assertions passed

TASK [Make sure we got info about the LUKS volume if encrypted] ****************
task path: /tmp/tmpsvhdx31t/tests/test-verify-volume-encryption.yml:38
Thursday 21 July 2022  07:14:25 +0000 (0:00:00.049)       0:00:21.035 ********* 
skipping: [/cache/rhel-9.qcow2.snap] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [Verify the LUKS volume's device type if encrypted] ***********************
task path: /tmp/tmpsvhdx31t/tests/test-verify-volume-encryption.yml:44
Thursday 21 July 2022  07:14:25 +0000 (0:00:00.023)       0:00:21.059 ********* 
skipping: [/cache/rhel-9.qcow2.snap] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [Check LUKS version] ******************************************************
task path: /tmp/tmpsvhdx31t/tests/test-verify-volume-encryption.yml:49
Thursday 21 July 2022  07:14:25 +0000 (0:00:00.023)       0:00:21.082 ********* 
skipping: [/cache/rhel-9.qcow2.snap] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [Check LUKS key size] *****************************************************
task path: /tmp/tmpsvhdx31t/tests/test-verify-volume-encryption.yml:55
Thursday 21 July 2022  07:14:25 +0000 (0:00:00.023)       0:00:21.106 ********* 
skipping: [/cache/rhel-9.qcow2.snap] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [Check LUKS cipher] *******************************************************
task path: /tmp/tmpsvhdx31t/tests/test-verify-volume-encryption.yml:61
Thursday 21 July 2022  07:14:25 +0000 (0:00:00.023)       0:00:21.129 ********* 
skipping: [/cache/rhel-9.qcow2.snap] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [set_fact] ****************************************************************
task path: /tmp/tmpsvhdx31t/tests/test-verify-volume-encryption.yml:67
Thursday 21 July 2022  07:14:25 +0000 (0:00:00.022)       0:00:21.152 ********* 
ok: [/cache/rhel-9.qcow2.snap] => {
    "ansible_facts": {
        "_storage_test_crypttab_entries": [],
        "_storage_test_expected_crypttab_entries": "0",
        "_storage_test_expected_crypttab_key_file": "-"
    },
    "changed": false
}

TASK [Check for /etc/crypttab entry] *******************************************
task path: /tmp/tmpsvhdx31t/tests/test-verify-volume-encryption.yml:74
Thursday 21 July 2022  07:14:25 +0000 (0:00:00.050)       0:00:21.203 ********* 
ok: [/cache/rhel-9.qcow2.snap] => {
    "changed": false
}

MSG:

All assertions passed

TASK [Validate the format of the crypttab entry] *******************************
task path: /tmp/tmpsvhdx31t/tests/test-verify-volume-encryption.yml:79
Thursday 21 July 2022  07:14:25 +0000 (0:00:00.049)       0:00:21.252 ********* 
skipping: [/cache/rhel-9.qcow2.snap] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [Check backing device of crypttab entry] **********************************
task path: /tmp/tmpsvhdx31t/tests/test-verify-volume-encryption.yml:85
Thursday 21 July 2022  07:14:25 +0000 (0:00:00.035)       0:00:21.288 ********* 
skipping: [/cache/rhel-9.qcow2.snap] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [Check key file of crypttab entry] ****************************************
task path: /tmp/tmpsvhdx31t/tests/test-verify-volume-encryption.yml:91
Thursday 21 July 2022  07:14:25 +0000 (0:00:00.035)       0:00:21.323 ********* 
skipping: [/cache/rhel-9.qcow2.snap] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [set_fact] ****************************************************************
task path: /tmp/tmpsvhdx31t/tests/test-verify-volume-encryption.yml:97
Thursday 21 July 2022  07:14:25 +0000 (0:00:00.035)       0:00:21.358 ********* 
ok: [/cache/rhel-9.qcow2.snap] => {
    "ansible_facts": {
        "_storage_test_crypttab_entries": null,
        "_storage_test_expected_crypttab_entries": null,
        "_storage_test_expected_crypttab_key_file": null
    },
    "changed": false
}

TASK [get information about RAID] **********************************************
task path: /tmp/tmpsvhdx31t/tests/test-verify-volume-md.yml:7
Thursday 21 July 2022  07:14:25 +0000 (0:00:00.037)       0:00:21.396 ********* 
skipping: [/cache/rhel-9.qcow2.snap] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [set_fact] ****************************************************************
task path: /tmp/tmpsvhdx31t/tests/test-verify-volume-md.yml:13
Thursday 21 July 2022  07:14:25 +0000 (0:00:00.037)       0:00:21.434 ********* 
skipping: [/cache/rhel-9.qcow2.snap] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [set_fact] ****************************************************************
task path: /tmp/tmpsvhdx31t/tests/test-verify-volume-md.yml:17
Thursday 21 July 2022  07:14:25 +0000 (0:00:00.037)       0:00:21.471 ********* 
skipping: [/cache/rhel-9.qcow2.snap] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [set_fact] ****************************************************************
task path: /tmp/tmpsvhdx31t/tests/test-verify-volume-md.yml:21
Thursday 21 July 2022  07:14:25 +0000 (0:00:00.037)       0:00:21.508 ********* 
skipping: [/cache/rhel-9.qcow2.snap] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [check RAID active devices count] *****************************************
task path: /tmp/tmpsvhdx31t/tests/test-verify-volume-md.yml:25
Thursday 21 July 2022  07:14:25 +0000 (0:00:00.041)       0:00:21.550 ********* 
skipping: [/cache/rhel-9.qcow2.snap] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [check RAID spare devices count] ******************************************
task path: /tmp/tmpsvhdx31t/tests/test-verify-volume-md.yml:31
Thursday 21 July 2022  07:14:26 +0000 (0:00:00.036)       0:00:21.586 ********* 
skipping: [/cache/rhel-9.qcow2.snap] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [check RAID metadata version] *********************************************
task path: /tmp/tmpsvhdx31t/tests/test-verify-volume-md.yml:37
Thursday 21 July 2022  07:14:26 +0000 (0:00:00.035)       0:00:21.622 ********* 
skipping: [/cache/rhel-9.qcow2.snap] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [parse the actual size of the volume] *************************************
task path: /tmp/tmpsvhdx31t/tests/test-verify-volume-size.yml:3
Thursday 21 July 2022  07:14:26 +0000 (0:00:00.035)       0:00:21.657 ********* 
skipping: [/cache/rhel-9.qcow2.snap] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [parse the requested size of the volume] **********************************
task path: /tmp/tmpsvhdx31t/tests/test-verify-volume-size.yml:9
Thursday 21 July 2022  07:14:26 +0000 (0:00:00.024)       0:00:21.682 ********* 
skipping: [/cache/rhel-9.qcow2.snap] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [Establish base value for expected size] **********************************
task path: /tmp/tmpsvhdx31t/tests/test-verify-volume-size.yml:15
Thursday 21 July 2022  07:14:26 +0000 (0:00:00.036)       0:00:21.718 ********* 
skipping: [/cache/rhel-9.qcow2.snap] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [debug] *******************************************************************
task path: /tmp/tmpsvhdx31t/tests/test-verify-volume-size.yml:20
Thursday 21 July 2022  07:14:26 +0000 (0:00:00.035)       0:00:21.753 ********* 
ok: [/cache/rhel-9.qcow2.snap] => {
    "storage_test_expected_size": "VARIABLE IS NOT DEFINED!: 'storage_test_expected_size' is undefined"
}

TASK [debug] *******************************************************************
task path: /tmp/tmpsvhdx31t/tests/test-verify-volume-size.yml:25
Thursday 21 July 2022  07:14:26 +0000 (0:00:00.065)       0:00:21.819 ********* 
skipping: [/cache/rhel-9.qcow2.snap] => {}

TASK [debug] *******************************************************************
task path: /tmp/tmpsvhdx31t/tests/test-verify-volume-size.yml:28
Thursday 21 July 2022  07:14:26 +0000 (0:00:00.036)       0:00:21.855 ********* 
skipping: [/cache/rhel-9.qcow2.snap] => {}

TASK [Get the size of parent/pool device] **************************************
task path: /tmp/tmpsvhdx31t/tests/test-verify-volume-size.yml:31
Thursday 21 July 2022  07:14:26 +0000 (0:00:00.039)       0:00:21.895 ********* 
skipping: [/cache/rhel-9.qcow2.snap] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [debug] *******************************************************************
task path: /tmp/tmpsvhdx31t/tests/test-verify-volume-size.yml:36
Thursday 21 July 2022  07:14:26 +0000 (0:00:00.035)       0:00:21.930 ********* 
skipping: [/cache/rhel-9.qcow2.snap] => {}

TASK [Calculate the expected size based on pool size and percentage value] *****
task path: /tmp/tmpsvhdx31t/tests/test-verify-volume-size.yml:39
Thursday 21 July 2022  07:14:26 +0000 (0:00:00.036)       0:00:21.966 ********* 
skipping: [/cache/rhel-9.qcow2.snap] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [debug] *******************************************************************
task path: /tmp/tmpsvhdx31t/tests/test-verify-volume-size.yml:44
Thursday 21 July 2022  07:14:26 +0000 (0:00:00.065)       0:00:22.032 ********* 
ok: [/cache/rhel-9.qcow2.snap] => {
    "storage_test_actual_size": {
        "changed": false,
        "skip_reason": "Conditional result was False",
        "skipped": true
    }
}

TASK [debug] *******************************************************************
task path: /tmp/tmpsvhdx31t/tests/test-verify-volume-size.yml:47
Thursday 21 July 2022  07:14:26 +0000 (0:00:00.037)       0:00:22.069 ********* 
ok: [/cache/rhel-9.qcow2.snap] => {
    "storage_test_expected_size": "VARIABLE IS NOT DEFINED!: 'storage_test_expected_size' is undefined"
}

TASK [assert] ******************************************************************
task path: /tmp/tmpsvhdx31t/tests/test-verify-volume-size.yml:50
Thursday 21 July 2022  07:14:26 +0000 (0:00:00.036)       0:00:22.105 ********* 
skipping: [/cache/rhel-9.qcow2.snap] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [Get information about the LV] ********************************************
task path: /tmp/tmpsvhdx31t/tests/test-verify-volume-cache.yml:6
Thursday 21 July 2022  07:14:26 +0000 (0:00:00.024)       0:00:22.130 ********* 
skipping: [/cache/rhel-9.qcow2.snap] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [set_fact] ****************************************************************
task path: /tmp/tmpsvhdx31t/tests/test-verify-volume-cache.yml:14
Thursday 21 July 2022  07:14:26 +0000 (0:00:00.023)       0:00:22.153 ********* 
skipping: [/cache/rhel-9.qcow2.snap] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [check segment type] ******************************************************
task path: /tmp/tmpsvhdx31t/tests/test-verify-volume-cache.yml:17
Thursday 21 July 2022  07:14:26 +0000 (0:00:00.023)       0:00:22.177 ********* 
skipping: [/cache/rhel-9.qcow2.snap] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [set_fact] ****************************************************************
task path: /tmp/tmpsvhdx31t/tests/test-verify-volume-cache.yml:22
Thursday 21 July 2022  07:14:26 +0000 (0:00:00.023)       0:00:22.200 ********* 
skipping: [/cache/rhel-9.qcow2.snap] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [parse the requested cache size] ******************************************
task path: /tmp/tmpsvhdx31t/tests/test-verify-volume-cache.yml:26
Thursday 21 July 2022  07:14:26 +0000 (0:00:00.028)       0:00:22.229 ********* 
skipping: [/cache/rhel-9.qcow2.snap] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [set_fact] ****************************************************************
task path: /tmp/tmpsvhdx31t/tests/test-verify-volume-cache.yml:32
Thursday 21 July 2022  07:14:26 +0000 (0:00:00.025)       0:00:22.255 ********* 
skipping: [/cache/rhel-9.qcow2.snap] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [Check cache size] ********************************************************
task path: /tmp/tmpsvhdx31t/tests/test-verify-volume-cache.yml:36
Thursday 21 July 2022  07:14:26 +0000 (0:00:00.025)       0:00:22.281 ********* 
skipping: [/cache/rhel-9.qcow2.snap] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [Clean up facts] **********************************************************
task path: /tmp/tmpsvhdx31t/tests/test-verify-volume.yml:16
Thursday 21 July 2022  07:14:26 +0000 (0:00:00.025)       0:00:22.307 ********* 
ok: [/cache/rhel-9.qcow2.snap] => {
    "ansible_facts": {
        "_storage_test_volume_present": null
    },
    "changed": false
}

TASK [Clean up variable namespace] *********************************************
task path: /tmp/tmpsvhdx31t/tests/verify-role-results.yml:53
Thursday 21 July 2022  07:14:26 +0000 (0:00:00.039)       0:00:22.346 ********* 
ok: [/cache/rhel-9.qcow2.snap] => {
    "ansible_facts": {
        "storage_test_blkinfo": null,
        "storage_test_crypttab": null,
        "storage_test_fstab": null
    },
    "changed": false
}

TASK [Change the disk device mount location to "/opt/test2"] *******************
task path: /tmp/tmpsvhdx31t/tests/tests_change_disk_mount.yml:37
Thursday 21 July 2022  07:14:26 +0000 (0:00:00.034)       0:00:22.381 ********* 

TASK [linux-system-roles.storage : set platform/version specific variables] ****
task path: /tmp/tmpsvhdx31t/tests/roles/linux-system-roles.storage/tasks/main.yml:2
Thursday 21 July 2022  07:14:26 +0000 (0:00:00.049)       0:00:22.430 ********* 
included: /tmp/tmpsvhdx31t/tests/roles/linux-system-roles.storage/tasks/set_vars.yml for /cache/rhel-9.qcow2.snap

TASK [linux-system-roles.storage : Ensure ansible_facts used by role] **********
task path: /tmp/tmpsvhdx31t/tests/roles/linux-system-roles.storage/tasks/set_vars.yml:2
Thursday 21 July 2022  07:14:26 +0000 (0:00:00.035)       0:00:22.465 ********* 
ok: [/cache/rhel-9.qcow2.snap]

TASK [linux-system-roles.storage : Set platform/version specific variables] ****
task path: /tmp/tmpsvhdx31t/tests/roles/linux-system-roles.storage/tasks/set_vars.yml:8
Thursday 21 July 2022  07:14:27 +0000 (0:00:00.518)       0:00:22.984 ********* 
skipping: [/cache/rhel-9.qcow2.snap] => (item=RedHat.yml)  => {
    "ansible_loop_var": "item",
    "changed": false,
    "item": "RedHat.yml",
    "skip_reason": "Conditional result was False"
}
ok: [/cache/rhel-9.qcow2.snap] => (item=RedHat_9.yml) => {
    "ansible_facts": {
        "blivet_package_list": [
            "python3-blivet",
            "libblockdev-crypto",
            "libblockdev-dm",
            "libblockdev-lvm",
            "libblockdev-mdraid",
            "libblockdev-swap",
            "vdo",
            "kmod-kvdo",
            "xfsprogs"
        ]
    },
    "ansible_included_var_files": [
        "/tmp/tmpsvhdx31t/tests/roles/linux-system-roles.storage/vars/RedHat_9.yml"
    ],
    "ansible_loop_var": "item",
    "changed": false,
    "item": "RedHat_9.yml"
}
skipping: [/cache/rhel-9.qcow2.snap] => (item=RedHat_9.0.yml)  => {
    "ansible_loop_var": "item",
    "changed": false,
    "item": "RedHat_9.0.yml",
    "skip_reason": "Conditional result was False"
}

TASK [linux-system-roles.storage : define an empty list of pools to be used in testing] ***
task path: /tmp/tmpsvhdx31t/tests/roles/linux-system-roles.storage/tasks/main.yml:5
Thursday 21 July 2022  07:14:27 +0000 (0:00:00.062)       0:00:23.046 ********* 
ok: [/cache/rhel-9.qcow2.snap] => {
    "ansible_facts": {
        "_storage_pools_list": []
    },
    "changed": false
}

TASK [linux-system-roles.storage : define an empty list of volumes to be used in testing] ***
task path: /tmp/tmpsvhdx31t/tests/roles/linux-system-roles.storage/tasks/main.yml:9
Thursday 21 July 2022  07:14:27 +0000 (0:00:00.034)       0:00:23.081 ********* 
ok: [/cache/rhel-9.qcow2.snap] => {
    "ansible_facts": {
        "_storage_volumes_list": []
    },
    "changed": false
}

TASK [linux-system-roles.storage : include the appropriate provider tasks] *****
task path: /tmp/tmpsvhdx31t/tests/roles/linux-system-roles.storage/tasks/main.yml:13
Thursday 21 July 2022  07:14:27 +0000 (0:00:00.032)       0:00:23.113 ********* 
redirecting (type: modules) ansible.builtin.mount to ansible.posix.mount
redirecting (type: modules) ansible.builtin.mount to ansible.posix.mount
included: /tmp/tmpsvhdx31t/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml for /cache/rhel-9.qcow2.snap

TASK [linux-system-roles.storage : get a list of rpm packages installed on host machine] ***
task path: /tmp/tmpsvhdx31t/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:2
Thursday 21 July 2022  07:14:27 +0000 (0:00:00.045)       0:00:23.159 ********* 
skipping: [/cache/rhel-9.qcow2.snap] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [linux-system-roles.storage : make sure blivet is available] **************
task path: /tmp/tmpsvhdx31t/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:7
Thursday 21 July 2022  07:14:27 +0000 (0:00:00.022)       0:00:23.181 ********* 
skipping: [/cache/rhel-9.qcow2.snap] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [linux-system-roles.storage : show storage_pools] *************************
task path: /tmp/tmpsvhdx31t/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:14
Thursday 21 July 2022  07:14:27 +0000 (0:00:00.027)       0:00:23.209 ********* 
ok: [/cache/rhel-9.qcow2.snap] => {
    "storage_pools": "VARIABLE IS NOT DEFINED!: 'storage_pools' is undefined"
}

TASK [linux-system-roles.storage : show storage_volumes] ***********************
task path: /tmp/tmpsvhdx31t/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:19
Thursday 21 July 2022  07:14:27 +0000 (0:00:00.035)       0:00:23.244 ********* 
ok: [/cache/rhel-9.qcow2.snap] => {
    "storage_volumes": [
        {
            "disks": [
                "sda"
            ],
            "mount_point": "/opt/test2",
            "name": "test1",
            "type": "disk"
        }
    ]
}
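
For reference, a minimal sketch of the play step that would feed the role the storage_volumes value echoed above — the task name, role name, and volume fields are taken from this log; the surrounding file layout and anything not shown here is assumed:

    - name: Change the disk device mount location to "/opt/test2"
      include_role:
        name: linux-system-roles.storage
      vars:
        storage_volumes:
          - name: test1
            type: disk
            disks:
              - sda
            mount_point: /opt/test2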

TASK [linux-system-roles.storage : get required packages] **********************
task path: /tmp/tmpsvhdx31t/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:24
Thursday 21 July 2022  07:14:27 +0000 (0:00:00.038)       0:00:23.282 ********* 
skipping: [/cache/rhel-9.qcow2.snap] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [linux-system-roles.storage : enable copr repositories if needed] *********
task path: /tmp/tmpsvhdx31t/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:37
Thursday 21 July 2022  07:14:27 +0000 (0:00:00.029)       0:00:23.311 ********* 
skipping: [/cache/rhel-9.qcow2.snap] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [linux-system-roles.storage : make sure required packages are installed] ***
task path: /tmp/tmpsvhdx31t/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:44
Thursday 21 July 2022  07:14:27 +0000 (0:00:00.031)       0:00:23.343 ********* 
skipping: [/cache/rhel-9.qcow2.snap] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [linux-system-roles.storage : get service facts] **************************
task path: /tmp/tmpsvhdx31t/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:51
Thursday 21 July 2022  07:14:27 +0000 (0:00:00.028)       0:00:23.371 ********* 
skipping: [/cache/rhel-9.qcow2.snap] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [linux-system-roles.storage : Set storage_cryptsetup_services] ************
task path: /tmp/tmpsvhdx31t/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:58
Thursday 21 July 2022  07:14:27 +0000 (0:00:00.031)       0:00:23.403 ********* 
ok: [/cache/rhel-9.qcow2.snap] => {
    "ansible_facts": {
        "storage_cryptsetup_services": []
    },
    "changed": false
}

TASK [linux-system-roles.storage : Mask the systemd cryptsetup services] *******
task path: /tmp/tmpsvhdx31t/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:71
Thursday 21 July 2022  07:14:27 +0000 (0:00:00.058)       0:00:23.461 ********* 

TASK [linux-system-roles.storage : manage the pools and volumes to match the specified state] ***
task path: /tmp/tmpsvhdx31t/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:77
Thursday 21 July 2022  07:14:27 +0000 (0:00:00.021)       0:00:23.482 ********* 
ok: [/cache/rhel-9.qcow2.snap] => {
    "actions": [],
    "changed": false,
    "crypts": [],
    "leaves": [
        "/dev/sr0",
        "/dev/vda1",
        "/dev/vda2",
        "/dev/vda3",
        "/dev/vda4",
        "/dev/sda",
        "/dev/sdb",
        "/dev/sdc",
        "/dev/nvme0n1",
        "/dev/nvme1n1",
        "/dev/nvme2n1",
        "/dev/vdb1",
        "/dev/vdc1",
        "/dev/vdd",
        "/dev/vde",
        "/dev/vdf"
    ],
    "mounts": [
        {
            "path": "/opt/test1",
            "state": "absent"
        },
        {
            "dump": 0,
            "fstype": "xfs",
            "opts": "defaults",
            "passno": 0,
            "path": "/opt/test2",
            "src": "UUID=31c6244a-312c-4308-8a54-e9662b6a934a",
            "state": "mounted"
        }
    ],
    "packages": [
        "e2fsprogs",
        "dosfstools",
        "xfsprogs"
    ],
    "pools": [],
    "volumes": [
        {
            "_device": "/dev/sda",
            "_kernel_device": "/dev/sda",
            "_mount_id": "UUID=31c6244a-312c-4308-8a54-e9662b6a934a",
            "_raw_device": "/dev/sda",
            "_raw_kernel_device": "/dev/sda",
            "cache_devices": [],
            "cache_mode": null,
            "cache_size": 0,
            "cached": false,
            "compression": null,
            "deduplication": null,
            "disks": [
                "sda"
            ],
            "encryption": false,
            "encryption_cipher": null,
            "encryption_key": null,
            "encryption_key_size": null,
            "encryption_luks_version": null,
            "encryption_password": null,
            "fs_create_options": "",
            "fs_label": "",
            "fs_overwrite_existing": true,
            "fs_type": "xfs",
            "mount_check": 0,
            "mount_device_identifier": "uuid",
            "mount_options": "defaults",
            "mount_passno": 0,
            "mount_point": "/opt/test2",
            "name": "test1",
            "raid_chunk_size": null,
            "raid_device_count": null,
            "raid_level": null,
            "raid_metadata_version": null,
            "raid_spare_count": null,
            "size": 10737418240,
            "state": "present",
            "thin": null,
            "thin_pool_name": null,
            "thin_pool_size": null,
            "type": "disk",
            "vdo_pool_size": null
        }
    ]
}

TASK [linux-system-roles.storage : Workaround for udev issue on some platforms] ***
task path: /tmp/tmpsvhdx31t/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:91
Thursday 21 July 2022  07:14:29 +0000 (0:00:01.383)       0:00:24.866 ********* 
skipping: [/cache/rhel-9.qcow2.snap] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [linux-system-roles.storage : Unmask the systemd cryptsetup services] *****
task path: /tmp/tmpsvhdx31t/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:103
Thursday 21 July 2022  07:14:29 +0000 (0:00:00.035)       0:00:24.902 ********* 

TASK [linux-system-roles.storage : show blivet_output] *************************
task path: /tmp/tmpsvhdx31t/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:109
Thursday 21 July 2022  07:14:29 +0000 (0:00:00.020)       0:00:24.923 ********* 
ok: [/cache/rhel-9.qcow2.snap] => {
    "blivet_output": {
        "actions": [],
        "changed": false,
        "crypts": [],
        "failed": false,
        "leaves": [
            "/dev/sr0",
            "/dev/vda1",
            "/dev/vda2",
            "/dev/vda3",
            "/dev/vda4",
            "/dev/sda",
            "/dev/sdb",
            "/dev/sdc",
            "/dev/nvme0n1",
            "/dev/nvme1n1",
            "/dev/nvme2n1",
            "/dev/vdb1",
            "/dev/vdc1",
            "/dev/vdd",
            "/dev/vde",
            "/dev/vdf"
        ],
        "mounts": [
            {
                "path": "/opt/test1",
                "state": "absent"
            },
            {
                "dump": 0,
                "fstype": "xfs",
                "opts": "defaults",
                "passno": 0,
                "path": "/opt/test2",
                "src": "UUID=31c6244a-312c-4308-8a54-e9662b6a934a",
                "state": "mounted"
            }
        ],
        "packages": [
            "e2fsprogs",
            "dosfstools",
            "xfsprogs"
        ],
        "pools": [],
        "volumes": [
            {
                "_device": "/dev/sda",
                "_kernel_device": "/dev/sda",
                "_mount_id": "UUID=31c6244a-312c-4308-8a54-e9662b6a934a",
                "_raw_device": "/dev/sda",
                "_raw_kernel_device": "/dev/sda",
                "cache_devices": [],
                "cache_mode": null,
                "cache_size": 0,
                "cached": false,
                "compression": null,
                "deduplication": null,
                "disks": [
                    "sda"
                ],
                "encryption": false,
                "encryption_cipher": null,
                "encryption_key": null,
                "encryption_key_size": null,
                "encryption_luks_version": null,
                "encryption_password": null,
                "fs_create_options": "",
                "fs_label": "",
                "fs_overwrite_existing": true,
                "fs_type": "xfs",
                "mount_check": 0,
                "mount_device_identifier": "uuid",
                "mount_options": "defaults",
                "mount_passno": 0,
                "mount_point": "/opt/test2",
                "name": "test1",
                "raid_chunk_size": null,
                "raid_device_count": null,
                "raid_level": null,
                "raid_metadata_version": null,
                "raid_spare_count": null,
                "size": 10737418240,
                "state": "present",
                "thin": null,
                "thin_pool_name": null,
                "thin_pool_size": null,
                "type": "disk",
                "vdo_pool_size": null
            }
        ]
    }
}

TASK [linux-system-roles.storage : set the list of pools for test verification] ***
task path: /tmp/tmpsvhdx31t/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:114
Thursday 21 July 2022  07:14:29 +0000 (0:00:00.037)       0:00:24.961 ********* 
ok: [/cache/rhel-9.qcow2.snap] => {
    "ansible_facts": {
        "_storage_pools_list": []
    },
    "changed": false
}

TASK [linux-system-roles.storage : set the list of volumes for test verification] ***
task path: /tmp/tmpsvhdx31t/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:118
Thursday 21 July 2022  07:14:29 +0000 (0:00:00.036)       0:00:24.998 ********* 
ok: [/cache/rhel-9.qcow2.snap] => {
    "ansible_facts": {
        "_storage_volumes_list": [
            {
                "_device": "/dev/sda",
                "_kernel_device": "/dev/sda",
                "_mount_id": "UUID=31c6244a-312c-4308-8a54-e9662b6a934a",
                "_raw_device": "/dev/sda",
                "_raw_kernel_device": "/dev/sda",
                "cache_devices": [],
                "cache_mode": null,
                "cache_size": 0,
                "cached": false,
                "compression": null,
                "deduplication": null,
                "disks": [
                    "sda"
                ],
                "encryption": false,
                "encryption_cipher": null,
                "encryption_key": null,
                "encryption_key_size": null,
                "encryption_luks_version": null,
                "encryption_password": null,
                "fs_create_options": "",
                "fs_label": "",
                "fs_overwrite_existing": true,
                "fs_type": "xfs",
                "mount_check": 0,
                "mount_device_identifier": "uuid",
                "mount_options": "defaults",
                "mount_passno": 0,
                "mount_point": "/opt/test2",
                "name": "test1",
                "raid_chunk_size": null,
                "raid_device_count": null,
                "raid_level": null,
                "raid_metadata_version": null,
                "raid_spare_count": null,
                "size": 10737418240,
                "state": "present",
                "thin": null,
                "thin_pool_name": null,
                "thin_pool_size": null,
                "type": "disk",
                "vdo_pool_size": null
            }
        ]
    },
    "changed": false
}

TASK [linux-system-roles.storage : remove obsolete mounts] *********************
task path: /tmp/tmpsvhdx31t/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:134
Thursday 21 July 2022  07:14:29 +0000 (0:00:00.038)       0:00:25.037 ********* 
redirecting (type: modules) ansible.builtin.mount to ansible.posix.mount
changed: [/cache/rhel-9.qcow2.snap] => (item={'path': '/opt/test1', 'state': 'absent'}) => {
    "ansible_loop_var": "mount_info",
    "backup_file": "",
    "boot": "yes",
    "changed": true,
    "dump": "0",
    "fstab": "/etc/fstab",
    "mount_info": {
        "path": "/opt/test1",
        "state": "absent"
    },
    "name": "/opt/test1",
    "opts": "defaults",
    "passno": "0"
}

TASK [linux-system-roles.storage : tell systemd to refresh its view of /etc/fstab] ***
task path: /tmp/tmpsvhdx31t/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:146
Thursday 21 July 2022  07:14:29 +0000 (0:00:00.389)       0:00:25.426 ********* 
ok: [/cache/rhel-9.qcow2.snap] => {
    "changed": false,
    "name": null,
    "status": {}
}

TASK [linux-system-roles.storage : set up new/current mounts] ******************
task path: /tmp/tmpsvhdx31t/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:151
Thursday 21 July 2022  07:14:30 +0000 (0:00:00.665)       0:00:26.092 ********* 
redirecting (type: modules) ansible.builtin.mount to ansible.posix.mount
changed: [/cache/rhel-9.qcow2.snap] => (item={'src': 'UUID=31c6244a-312c-4308-8a54-e9662b6a934a', 'path': '/opt/test2', 'fstype': 'xfs', 'opts': 'defaults', 'dump': 0, 'passno': 0, 'state': 'mounted'}) => {
    "ansible_loop_var": "mount_info",
    "backup_file": "",
    "boot": "yes",
    "changed": true,
    "dump": "0",
    "fstab": "/etc/fstab",
    "fstype": "xfs",
    "mount_info": {
        "dump": 0,
        "fstype": "xfs",
        "opts": "defaults",
        "passno": 0,
        "path": "/opt/test2",
        "src": "UUID=31c6244a-312c-4308-8a54-e9662b6a934a",
        "state": "mounted"
    },
    "name": "/opt/test2",
    "opts": "defaults",
    "passno": "0",
    "src": "UUID=31c6244a-312c-4308-8a54-e9662b6a934a"
}
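
The changed result above implies a mount task roughly of the following shape, looping over blivet_output.mounts with a loop variable named mount_info (both names are visible in this log); the exact parameters and filters used by the role are assumptions, so treat this as an illustrative sketch rather than the role's actual task:

    - name: set up new/current mounts (sketch)
      ansible.posix.mount:
        src: "{{ mount_info['src'] | default(omit) }}"
        path: "{{ mount_info['path'] }}"
        fstype: "{{ mount_info['fstype'] | default(omit) }}"
        opts: "{{ mount_info['opts'] | default(omit) }}"
        state: "{{ mount_info['state'] }}"
      loop: "{{ blivet_output.mounts }}"
      loop_control:
        loop_var: mount_info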

TASK [linux-system-roles.storage : tell systemd to refresh its view of /etc/fstab] ***
task path: /tmp/tmpsvhdx31t/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:163
Thursday 21 July 2022  07:14:30 +0000 (0:00:00.397)       0:00:26.490 ********* 
ok: [/cache/rhel-9.qcow2.snap] => {
    "changed": false,
    "name": null,
    "status": {}
}

TASK [linux-system-roles.storage : retrieve facts for the /etc/crypttab file] ***
task path: /tmp/tmpsvhdx31t/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:171
Thursday 21 July 2022  07:14:31 +0000 (0:00:00.632)       0:00:27.122 ********* 
ok: [/cache/rhel-9.qcow2.snap] => {
    "changed": false,
    "stat": {
        "exists": false
    }
}

TASK [linux-system-roles.storage : manage /etc/crypttab to account for changes we just made] ***
task path: /tmp/tmpsvhdx31t/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:176
Thursday 21 July 2022  07:14:31 +0000 (0:00:00.349)       0:00:27.472 ********* 

TASK [linux-system-roles.storage : Update facts] *******************************
task path: /tmp/tmpsvhdx31t/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:198
Thursday 21 July 2022  07:14:31 +0000 (0:00:00.023)       0:00:27.495 ********* 
ok: [/cache/rhel-9.qcow2.snap]
META: role_complete for /cache/rhel-9.qcow2.snap

TASK [include_tasks] ***********************************************************
task path: /tmp/tmpsvhdx31t/tests/tests_change_disk_mount.yml:47
Thursday 21 July 2022  07:14:32 +0000 (0:00:00.966)       0:00:28.461 ********* 
included: /tmp/tmpsvhdx31t/tests/verify-role-results.yml for /cache/rhel-9.qcow2.snap

TASK [Print out pool information] **********************************************
task path: /tmp/tmpsvhdx31t/tests/verify-role-results.yml:1
Thursday 21 July 2022  07:14:32 +0000 (0:00:00.041)       0:00:28.503 ********* 
skipping: [/cache/rhel-9.qcow2.snap] => {}

TASK [Print out volume information] ********************************************
task path: /tmp/tmpsvhdx31t/tests/verify-role-results.yml:6
Thursday 21 July 2022  07:14:32 +0000 (0:00:00.035)       0:00:28.539 ********* 
ok: [/cache/rhel-9.qcow2.snap] => {
    "_storage_volumes_list": [
        {
            "_device": "/dev/sda",
            "_kernel_device": "/dev/sda",
            "_mount_id": "UUID=31c6244a-312c-4308-8a54-e9662b6a934a",
            "_raw_device": "/dev/sda",
            "_raw_kernel_device": "/dev/sda",
            "cache_devices": [],
            "cache_mode": null,
            "cache_size": 0,
            "cached": false,
            "compression": null,
            "deduplication": null,
            "disks": [
                "sda"
            ],
            "encryption": false,
            "encryption_cipher": null,
            "encryption_key": null,
            "encryption_key_size": null,
            "encryption_luks_version": null,
            "encryption_password": null,
            "fs_create_options": "",
            "fs_label": "",
            "fs_overwrite_existing": true,
            "fs_type": "xfs",
            "mount_check": 0,
            "mount_device_identifier": "uuid",
            "mount_options": "defaults",
            "mount_passno": 0,
            "mount_point": "/opt/test2",
            "name": "test1",
            "raid_chunk_size": null,
            "raid_device_count": null,
            "raid_level": null,
            "raid_metadata_version": null,
            "raid_spare_count": null,
            "size": 10737418240,
            "state": "present",
            "thin": null,
            "thin_pool_name": null,
            "thin_pool_size": null,
            "type": "disk",
            "vdo_pool_size": null
        }
    ]
}

TASK [Collect info about the volumes.] *****************************************
task path: /tmp/tmpsvhdx31t/tests/verify-role-results.yml:14
Thursday 21 July 2022  07:14:33 +0000 (0:00:00.082)       0:00:28.621 ********* 
ok: [/cache/rhel-9.qcow2.snap] => {
    "changed": false,
    "info": {
        "/dev/nvme0n1": {
            "fstype": "",
            "label": "",
            "name": "/dev/nvme0n1",
            "size": "10G",
            "type": "disk",
            "uuid": ""
        },
        "/dev/nvme1n1": {
            "fstype": "",
            "label": "",
            "name": "/dev/nvme1n1",
            "size": "10G",
            "type": "disk",
            "uuid": ""
        },
        "/dev/nvme2n1": {
            "fstype": "",
            "label": "",
            "name": "/dev/nvme2n1",
            "size": "10G",
            "type": "disk",
            "uuid": ""
        },
        "/dev/sda": {
            "fstype": "xfs",
            "label": "",
            "name": "/dev/sda",
            "size": "10G",
            "type": "disk",
            "uuid": "31c6244a-312c-4308-8a54-e9662b6a934a"
        },
        "/dev/sdb": {
            "fstype": "",
            "label": "",
            "name": "/dev/sdb",
            "size": "10G",
            "type": "disk",
            "uuid": ""
        },
        "/dev/sdc": {
            "fstype": "",
            "label": "",
            "name": "/dev/sdc",
            "size": "10G",
            "type": "disk",
            "uuid": ""
        },
        "/dev/sr0": {
            "fstype": "iso9660",
            "label": "cidata",
            "name": "/dev/sr0",
            "size": "364K",
            "type": "rom",
            "uuid": "2022-07-21-07-13-51-00"
        },
        "/dev/vda": {
            "fstype": "",
            "label": "",
            "name": "/dev/vda",
            "size": "10G",
            "type": "disk",
            "uuid": ""
        },
        "/dev/vda1": {
            "fstype": "",
            "label": "",
            "name": "/dev/vda1",
            "size": "1M",
            "type": "partition",
            "uuid": ""
        },
        "/dev/vda2": {
            "fstype": "vfat",
            "label": "",
            "name": "/dev/vda2",
            "size": "200M",
            "type": "partition",
            "uuid": "7B77-95E7"
        },
        "/dev/vda3": {
            "fstype": "xfs",
            "label": "boot",
            "name": "/dev/vda3",
            "size": "500M",
            "type": "partition",
            "uuid": "6e74e171-0370-451f-8340-f16ad2839183"
        },
        "/dev/vda4": {
            "fstype": "xfs",
            "label": "root",
            "name": "/dev/vda4",
            "size": "9.3G",
            "type": "partition",
            "uuid": "1bb53e4d-984c-4316-908a-59b5a62fa30e"
        },
        "/dev/vdb": {
            "fstype": "",
            "label": "",
            "name": "/dev/vdb",
            "size": "2G",
            "type": "disk",
            "uuid": ""
        },
        "/dev/vdb1": {
            "fstype": "ext4",
            "label": "yumcache",
            "name": "/dev/vdb1",
            "size": "2G",
            "type": "partition",
            "uuid": "12be66f7-7f03-4d54-9a48-a520825d8fa9"
        },
        "/dev/vdc": {
            "fstype": "",
            "label": "",
            "name": "/dev/vdc",
            "size": "2G",
            "type": "disk",
            "uuid": ""
        },
        "/dev/vdc1": {
            "fstype": "ext4",
            "label": "yumvarlib",
            "name": "/dev/vdc1",
            "size": "2G",
            "type": "partition",
            "uuid": "9bd037ed-b1bb-4b9c-a008-9f07d34ab071"
        },
        "/dev/vdd": {
            "fstype": "",
            "label": "",
            "name": "/dev/vdd",
            "size": "10G",
            "type": "disk",
            "uuid": ""
        },
        "/dev/vde": {
            "fstype": "",
            "label": "",
            "name": "/dev/vde",
            "size": "10G",
            "type": "disk",
            "uuid": ""
        },
        "/dev/vdf": {
            "fstype": "",
            "label": "",
            "name": "/dev/vdf",
            "size": "10G",
            "type": "disk",
            "uuid": ""
        }
    }
}

TASK [Read the /etc/fstab file for volume existence] ***************************
task path: /tmp/tmpsvhdx31t/tests/verify-role-results.yml:19
Thursday 21 July 2022  07:14:33 +0000 (0:00:00.402)       0:00:29.024 ********* 
ok: [/cache/rhel-9.qcow2.snap] => {
    "changed": false,
    "cmd": [
        "cat",
        "/etc/fstab"
    ],
    "delta": "0:00:00.002425",
    "end": "2022-07-21 03:14:33.096446",
    "rc": 0,
    "start": "2022-07-21 03:14:33.094021"
}

STDOUT:

UUID=6e74e171-0370-451f-8340-f16ad2839183	/boot	xfs	defaults	0	0
UUID=1bb53e4d-984c-4316-908a-59b5a62fa30e	/	xfs	defaults	0	0
UUID=7B77-95E7	/boot/efi	vfat	defaults,uid=0,gid=0,umask=077,shortname=winnt	0	2
/dev/vdb1	/var/cache/dnf	auto	defaults,nofail,x-systemd.requires=cloud-init.service,_netdev,comment=cloudconfig	0	2
/dev/vdc1	/var/lib/dnf	auto	defaults,nofail,x-systemd.requires=cloud-init.service,_netdev,comment=cloudconfig	0	2
UUID=31c6244a-312c-4308-8a54-e9662b6a934a /opt/test2 xfs defaults 0 0
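
The /opt/test2 line above is what the fstab verification further below keys on. A minimal equivalent spot-check could look like the following sketch — storage_test_fstab is a variable name that appears elsewhere in this log, but whether the test registers the cat output under that name, and the exact assertion text, are assumptions:

    - name: Spot-check that the new mount entry is present (illustrative)
      ansible.builtin.assert:
        that:
          - "'UUID=31c6244a-312c-4308-8a54-e9662b6a934a /opt/test2 xfs defaults 0 0' in storage_test_fstab.stdout"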

TASK [Read the /etc/crypttab file] *********************************************
task path: /tmp/tmpsvhdx31t/tests/verify-role-results.yml:24
Thursday 21 July 2022  07:14:33 +0000 (0:00:00.351)       0:00:29.376 ********* 
ok: [/cache/rhel-9.qcow2.snap] => {
    "changed": false,
    "cmd": [
        "cat",
        "/etc/crypttab"
    ],
    "delta": "0:00:00.002405",
    "end": "2022-07-21 03:14:33.446479",
    "failed_when_result": false,
    "rc": 1,
    "start": "2022-07-21 03:14:33.444074"
}

STDERR:

cat: /etc/crypttab: No such file or directory


MSG:

non-zero return code

TASK [Verify the volumes listed in storage_pools were correctly managed] *******
task path: /tmp/tmpsvhdx31t/tests/verify-role-results.yml:33
Thursday 21 July 2022  07:14:34 +0000 (0:00:00.348)       0:00:29.724 ********* 

TASK [Verify the volumes with no pool were correctly managed] ******************
task path: /tmp/tmpsvhdx31t/tests/verify-role-results.yml:43
Thursday 21 July 2022  07:14:34 +0000 (0:00:00.021)       0:00:29.746 ********* 
included: /tmp/tmpsvhdx31t/tests/test-verify-volume.yml for /cache/rhel-9.qcow2.snap => (item={'encryption': False, 'encryption_cipher': None, 'encryption_key': None, 'encryption_key_size': None, 'encryption_luks_version': None, 'encryption_password': None, 'fs_create_options': '', 'fs_label': '', 'fs_type': 'xfs', 'mount_options': 'defaults', 'mount_point': '/opt/test2', 'name': 'test1', 'raid_level': None, 'size': 10737418240, 'state': 'present', 'type': 'disk', 'disks': ['sda'], 'raid_device_count': None, 'raid_spare_count': None, 'raid_metadata_version': None, 'fs_overwrite_existing': True, 'mount_check': 0, 'mount_passno': 0, 'mount_device_identifier': 'uuid', 'raid_chunk_size': None, 'compression': None, 'deduplication': None, 'vdo_pool_size': None, 'thin': None, 'thin_pool_name': None, 'thin_pool_size': None, 'cached': False, 'cache_size': 0, 'cache_mode': None, 'cache_devices': [], '_device': '/dev/sda', '_raw_device': '/dev/sda', '_mount_id': 'UUID=31c6244a-312c-4308-8a54-e9662b6a934a', '_kernel_device': '/dev/sda', '_raw_kernel_device': '/dev/sda'})

TASK [set_fact] ****************************************************************
task path: /tmp/tmpsvhdx31t/tests/test-verify-volume.yml:2
Thursday 21 July 2022  07:14:34 +0000 (0:00:00.055)       0:00:29.801 ********* 
ok: [/cache/rhel-9.qcow2.snap] => {
    "ansible_facts": {
        "_storage_test_volume_present": true,
        "_storage_volume_tests": [
            "mount",
            "fstab",
            "fs",
            "device",
            "encryption",
            "md",
            "size",
            "cache"
        ]
    },
    "changed": false
}

TASK [include_tasks] ***********************************************************
task path: /tmp/tmpsvhdx31t/tests/test-verify-volume.yml:10
Thursday 21 July 2022  07:14:34 +0000 (0:00:00.047)       0:00:29.849 ********* 
included: /tmp/tmpsvhdx31t/tests/test-verify-volume-mount.yml for /cache/rhel-9.qcow2.snap => (item=mount)
included: /tmp/tmpsvhdx31t/tests/test-verify-volume-fstab.yml for /cache/rhel-9.qcow2.snap => (item=fstab)
included: /tmp/tmpsvhdx31t/tests/test-verify-volume-fs.yml for /cache/rhel-9.qcow2.snap => (item=fs)
included: /tmp/tmpsvhdx31t/tests/test-verify-volume-device.yml for /cache/rhel-9.qcow2.snap => (item=device)
included: /tmp/tmpsvhdx31t/tests/test-verify-volume-encryption.yml for /cache/rhel-9.qcow2.snap => (item=encryption)
included: /tmp/tmpsvhdx31t/tests/test-verify-volume-md.yml for /cache/rhel-9.qcow2.snap => (item=md)
included: /tmp/tmpsvhdx31t/tests/test-verify-volume-size.yml for /cache/rhel-9.qcow2.snap => (item=size)
included: /tmp/tmpsvhdx31t/tests/test-verify-volume-cache.yml for /cache/rhel-9.qcow2.snap => (item=cache)

TASK [Get expected mount device based on device type] **************************
task path: /tmp/tmpsvhdx31t/tests/test-verify-volume-mount.yml:6
Thursday 21 July 2022  07:14:34 +0000 (0:00:00.071)       0:00:29.921 ********* 
ok: [/cache/rhel-9.qcow2.snap] => {
    "ansible_facts": {
        "storage_test_device_path": "/dev/sda"
    },
    "changed": false
}

TASK [Set some facts] **********************************************************
task path: /tmp/tmpsvhdx31t/tests/test-verify-volume-mount.yml:14
Thursday 21 July 2022  07:14:34 +0000 (0:00:00.039)       0:00:29.960 ********* 
ok: [/cache/rhel-9.qcow2.snap] => {
    "ansible_facts": {
        "storage_test_mount_device_matches": [
            {
                "block_available": 2592358,
                "block_size": 4096,
                "block_total": 2618880,
                "block_used": 26522,
                "device": "/dev/sda",
                "fstype": "xfs",
                "inode_available": 5242877,
                "inode_total": 5242880,
                "inode_used": 3,
                "mount": "/opt/test2",
                "options": "rw,seclabel,relatime,attr2,inode64,logbufs=8,logbsize=32k,noquota",
                "size_available": 10618298368,
                "size_total": 10726932480,
                "uuid": "31c6244a-312c-4308-8a54-e9662b6a934a"
            }
        ],
        "storage_test_mount_expected_match_count": "1",
        "storage_test_mount_point_matches": [
            {
                "block_available": 2592358,
                "block_size": 4096,
                "block_total": 2618880,
                "block_used": 26522,
                "device": "/dev/sda",
                "fstype": "xfs",
                "inode_available": 5242877,
                "inode_total": 5242880,
                "inode_used": 3,
                "mount": "/opt/test2",
                "options": "rw,seclabel,relatime,attr2,inode64,logbufs=8,logbsize=32k,noquota",
                "size_available": 10618298368,
                "size_total": 10726932480,
                "uuid": "31c6244a-312c-4308-8a54-e9662b6a934a"
            }
        ],
        "storage_test_swap_expected_matches": "0"
    },
    "changed": false
}

TASK [Verify the current mount state by device] ********************************
task path: /tmp/tmpsvhdx31t/tests/test-verify-volume-mount.yml:28
Thursday 21 July 2022  07:14:34 +0000 (0:00:00.052)       0:00:30.012 ********* 
ok: [/cache/rhel-9.qcow2.snap] => {
    "changed": false
}

MSG:

All assertions passed

TASK [Verify the current mount state by mount point] ***************************
task path: /tmp/tmpsvhdx31t/tests/test-verify-volume-mount.yml:37
Thursday 21 July 2022  07:14:34 +0000 (0:00:00.048)       0:00:30.061 ********* 
ok: [/cache/rhel-9.qcow2.snap] => {
    "changed": false
}

MSG:

All assertions passed

TASK [Verify the mount fs type] ************************************************
task path: /tmp/tmpsvhdx31t/tests/test-verify-volume-mount.yml:45
Thursday 21 July 2022  07:14:34 +0000 (0:00:00.051)       0:00:30.113 ********* 
ok: [/cache/rhel-9.qcow2.snap] => {
    "changed": false
}

MSG:

All assertions passed

TASK [command] *****************************************************************
task path: /tmp/tmpsvhdx31t/tests/test-verify-volume-mount.yml:54
Thursday 21 July 2022  07:14:34 +0000 (0:00:00.050)       0:00:30.163 ********* 
skipping: [/cache/rhel-9.qcow2.snap] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [Gather swap info] ********************************************************
task path: /tmp/tmpsvhdx31t/tests/test-verify-volume-mount.yml:58
Thursday 21 July 2022  07:14:34 +0000 (0:00:00.023)       0:00:30.186 ********* 
skipping: [/cache/rhel-9.qcow2.snap] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [Verify swap status] ******************************************************
task path: /tmp/tmpsvhdx31t/tests/test-verify-volume-mount.yml:63
Thursday 21 July 2022  07:14:34 +0000 (0:00:00.025)       0:00:30.211 ********* 
skipping: [/cache/rhel-9.qcow2.snap] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [Unset facts] *************************************************************
task path: /tmp/tmpsvhdx31t/tests/test-verify-volume-mount.yml:75
Thursday 21 July 2022  07:14:34 +0000 (0:00:00.026)       0:00:30.238 ********* 
ok: [/cache/rhel-9.qcow2.snap] => {
    "ansible_facts": {
        "storage_test_mount_device_matches": null,
        "storage_test_mount_expected_match_count": null,
        "storage_test_mount_point_matches": null,
        "storage_test_swap_expected_matches": null,
        "storage_test_swaps": null,
        "storage_test_sys_node": null
    },
    "changed": false
}

TASK [Set some variables for fstab checking] ***********************************
task path: /tmp/tmpsvhdx31t/tests/test-verify-volume-fstab.yml:2
Thursday 21 July 2022  07:14:34 +0000 (0:00:00.033)       0:00:30.272 ********* 
ok: [/cache/rhel-9.qcow2.snap] => {
    "ansible_facts": {
        "storage_test_fstab_expected_id_matches": "1",
        "storage_test_fstab_expected_mount_options_matches": "1",
        "storage_test_fstab_expected_mount_point_matches": "1",
        "storage_test_fstab_id_matches": [
            "UUID=31c6244a-312c-4308-8a54-e9662b6a934a "
        ],
        "storage_test_fstab_mount_options_matches": [
            " /opt/test2 xfs defaults "
        ],
        "storage_test_fstab_mount_point_matches": [
            " /opt/test2 "
        ]
    },
    "changed": false
}

TASK [Verify that the device identifier appears in /etc/fstab] *****************
task path: /tmp/tmpsvhdx31t/tests/test-verify-volume-fstab.yml:25
Thursday 21 July 2022  07:14:34 +0000 (0:00:00.060)       0:00:30.332 ********* 
ok: [/cache/rhel-9.qcow2.snap] => {
    "changed": false
}

MSG:

All assertions passed

TASK [Verify the fstab mount point] ********************************************
task path: /tmp/tmpsvhdx31t/tests/test-verify-volume-fstab.yml:32
Thursday 21 July 2022  07:14:34 +0000 (0:00:00.054)       0:00:30.387 ********* 
ok: [/cache/rhel-9.qcow2.snap] => {
    "changed": false
}

MSG:

All assertions passed

TASK [Verify mount_options] ****************************************************
task path: /tmp/tmpsvhdx31t/tests/test-verify-volume-fstab.yml:39
Thursday 21 July 2022  07:14:34 +0000 (0:00:00.048)       0:00:30.435 ********* 
skipping: [/cache/rhel-9.qcow2.snap] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [Clean up variables] ******************************************************
task path: /tmp/tmpsvhdx31t/tests/test-verify-volume-fstab.yml:49
Thursday 21 July 2022  07:14:34 +0000 (0:00:00.037)       0:00:30.473 ********* 
ok: [/cache/rhel-9.qcow2.snap] => {
    "ansible_facts": {
        "storage_test_fstab_expected_id_matches": null,
        "storage_test_fstab_expected_mount_options_matches": null,
        "storage_test_fstab_expected_mount_point_matches": null,
        "storage_test_fstab_id_matches": null,
        "storage_test_fstab_mount_options_matches": null,
        "storage_test_fstab_mount_point_matches": null
    },
    "changed": false
}

TASK [Verify fs type] **********************************************************
task path: /tmp/tmpsvhdx31t/tests/test-verify-volume-fs.yml:4
Thursday 21 July 2022  07:14:34 +0000 (0:00:00.035)       0:00:30.508 ********* 
ok: [/cache/rhel-9.qcow2.snap] => {
    "changed": false
}

MSG:

All assertions passed

TASK [Verify fs label] *********************************************************
task path: /tmp/tmpsvhdx31t/tests/test-verify-volume-fs.yml:10
Thursday 21 July 2022  07:14:34 +0000 (0:00:00.039)       0:00:30.548 ********* 
ok: [/cache/rhel-9.qcow2.snap] => {
    "changed": false
}

MSG:

All assertions passed

TASK [See whether the device node is present] **********************************
task path: /tmp/tmpsvhdx31t/tests/test-verify-volume-device.yml:4
Thursday 21 July 2022  07:14:35 +0000 (0:00:00.038)       0:00:30.587 ********* 
ok: [/cache/rhel-9.qcow2.snap] => {
    "changed": false,
    "stat": {
        "atime": 1658387656.9902117,
        "attr_flags": "",
        "attributes": [],
        "block_size": 4096,
        "blocks": 0,
        "charset": "binary",
        "ctime": 1658387656.9902117,
        "dev": 5,
        "device_type": 2048,
        "executable": false,
        "exists": true,
        "gid": 6,
        "gr_name": "disk",
        "inode": 363,
        "isblk": true,
        "ischr": false,
        "isdir": false,
        "isfifo": false,
        "isgid": false,
        "islnk": false,
        "isreg": false,
        "issock": false,
        "isuid": false,
        "mimetype": "inode/blockdevice",
        "mode": "0660",
        "mtime": 1658387656.9902117,
        "nlink": 1,
        "path": "/dev/sda",
        "pw_name": "root",
        "readable": true,
        "rgrp": true,
        "roth": false,
        "rusr": true,
        "size": 0,
        "uid": 0,
        "version": null,
        "wgrp": true,
        "woth": false,
        "writeable": true,
        "wusr": true,
        "xgrp": false,
        "xoth": false,
        "xusr": false
    }
}

TASK [Verify the presence/absence of the device node] **************************
task path: /tmp/tmpsvhdx31t/tests/test-verify-volume-device.yml:10
Thursday 21 July 2022  07:14:35 +0000 (0:00:00.413)       0:00:31.001 ********* 
ok: [/cache/rhel-9.qcow2.snap] => {
    "changed": false
}

MSG:

All assertions passed

TASK [Make sure we got info about this volume] *********************************
task path: /tmp/tmpsvhdx31t/tests/test-verify-volume-device.yml:18
Thursday 21 July 2022  07:14:35 +0000 (0:00:00.037)       0:00:31.038 ********* 
ok: [/cache/rhel-9.qcow2.snap] => {
    "changed": false
}

MSG:

All assertions passed

TASK [(1/2) Process volume type (set initial value)] ***************************
task path: /tmp/tmpsvhdx31t/tests/test-verify-volume-device.yml:24
Thursday 21 July 2022  07:14:35 +0000 (0:00:00.035)       0:00:31.074 ********* 
ok: [/cache/rhel-9.qcow2.snap] => {
    "ansible_facts": {
        "st_volume_type": "disk"
    },
    "changed": false
}

TASK [(2/2) Process volume type (get RAID value)] ******************************
task path: /tmp/tmpsvhdx31t/tests/test-verify-volume-device.yml:28
Thursday 21 July 2022  07:14:35 +0000 (0:00:00.034)       0:00:31.108 ********* 
skipping: [/cache/rhel-9.qcow2.snap] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [Verify the volume's device type] *****************************************
task path: /tmp/tmpsvhdx31t/tests/test-verify-volume-device.yml:33
Thursday 21 July 2022  07:14:35 +0000 (0:00:00.022)       0:00:31.130 ********* 
ok: [/cache/rhel-9.qcow2.snap] => {
    "changed": false
}

MSG:

All assertions passed

TASK [Stat the LUKS device, if encrypted] **************************************
task path: /tmp/tmpsvhdx31t/tests/test-verify-volume-encryption.yml:3
Thursday 21 July 2022  07:14:35 +0000 (0:00:00.039)       0:00:31.170 ********* 
skipping: [/cache/rhel-9.qcow2.snap] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [Ensure cryptsetup is present] ********************************************
task path: /tmp/tmpsvhdx31t/tests/test-verify-volume-encryption.yml:10
Thursday 21 July 2022  07:14:35 +0000 (0:00:00.022)       0:00:31.192 ********* 
ok: [/cache/rhel-9.qcow2.snap] => {
    "changed": false,
    "rc": 0,
    "results": []
}

MSG:

Nothing to do

TASK [Collect LUKS info for this volume] ***************************************
task path: /tmp/tmpsvhdx31t/tests/test-verify-volume-encryption.yml:15
Thursday 21 July 2022  07:14:36 +0000 (0:00:00.810)       0:00:32.003 ********* 
skipping: [/cache/rhel-9.qcow2.snap] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [Verify the presence/absence of the LUKS device node] *********************
task path: /tmp/tmpsvhdx31t/tests/test-verify-volume-encryption.yml:21
Thursday 21 July 2022  07:14:36 +0000 (0:00:00.023)       0:00:32.026 ********* 
skipping: [/cache/rhel-9.qcow2.snap] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [Verify that the raw device is the same as the device if not encrypted] ***
task path: /tmp/tmpsvhdx31t/tests/test-verify-volume-encryption.yml:30
Thursday 21 July 2022  07:14:36 +0000 (0:00:00.022)       0:00:32.049 ********* 
ok: [/cache/rhel-9.qcow2.snap] => {
    "changed": false
}

MSG:

All assertions passed

TASK [Make sure we got info about the LUKS volume if encrypted] ****************
task path: /tmp/tmpsvhdx31t/tests/test-verify-volume-encryption.yml:38
Thursday 21 July 2022  07:14:36 +0000 (0:00:00.049)       0:00:32.099 ********* 
skipping: [/cache/rhel-9.qcow2.snap] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [Verify the LUKS volume's device type if encrypted] ***********************
task path: /tmp/tmpsvhdx31t/tests/test-verify-volume-encryption.yml:44
Thursday 21 July 2022  07:14:36 +0000 (0:00:00.023)       0:00:32.122 ********* 
skipping: [/cache/rhel-9.qcow2.snap] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [Check LUKS version] ******************************************************
task path: /tmp/tmpsvhdx31t/tests/test-verify-volume-encryption.yml:49
Thursday 21 July 2022  07:14:36 +0000 (0:00:00.022)       0:00:32.145 ********* 
skipping: [/cache/rhel-9.qcow2.snap] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [Check LUKS key size] *****************************************************
task path: /tmp/tmpsvhdx31t/tests/test-verify-volume-encryption.yml:55
Thursday 21 July 2022  07:14:36 +0000 (0:00:00.022)       0:00:32.168 ********* 
skipping: [/cache/rhel-9.qcow2.snap] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [Check LUKS cipher] *******************************************************
task path: /tmp/tmpsvhdx31t/tests/test-verify-volume-encryption.yml:61
Thursday 21 July 2022  07:14:36 +0000 (0:00:00.022)       0:00:32.190 ********* 
skipping: [/cache/rhel-9.qcow2.snap] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [set_fact] ****************************************************************
task path: /tmp/tmpsvhdx31t/tests/test-verify-volume-encryption.yml:67
Thursday 21 July 2022  07:14:36 +0000 (0:00:00.023)       0:00:32.214 ********* 
ok: [/cache/rhel-9.qcow2.snap] => {
    "ansible_facts": {
        "_storage_test_crypttab_entries": [],
        "_storage_test_expected_crypttab_entries": "0",
        "_storage_test_expected_crypttab_key_file": "-"
    },
    "changed": false
}

TASK [Check for /etc/crypttab entry] *******************************************
task path: /tmp/tmpsvhdx31t/tests/test-verify-volume-encryption.yml:74
Thursday 21 July 2022  07:14:36 +0000 (0:00:00.047)       0:00:32.261 ********* 
ok: [/cache/rhel-9.qcow2.snap] => {
    "changed": false
}

MSG:

All assertions passed

TASK [Validate the format of the crypttab entry] *******************************
task path: /tmp/tmpsvhdx31t/tests/test-verify-volume-encryption.yml:79
Thursday 21 July 2022  07:14:36 +0000 (0:00:00.046)       0:00:32.308 ********* 
skipping: [/cache/rhel-9.qcow2.snap] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [Check backing device of crypttab entry] **********************************
task path: /tmp/tmpsvhdx31t/tests/test-verify-volume-encryption.yml:85
Thursday 21 July 2022  07:14:36 +0000 (0:00:00.036)       0:00:32.344 ********* 
skipping: [/cache/rhel-9.qcow2.snap] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [Check key file of crypttab entry] ****************************************
task path: /tmp/tmpsvhdx31t/tests/test-verify-volume-encryption.yml:91
Thursday 21 July 2022  07:14:36 +0000 (0:00:00.034)       0:00:32.379 ********* 
skipping: [/cache/rhel-9.qcow2.snap] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [set_fact] ****************************************************************
task path: /tmp/tmpsvhdx31t/tests/test-verify-volume-encryption.yml:97
Thursday 21 July 2022  07:14:36 +0000 (0:00:00.034)       0:00:32.413 ********* 
ok: [/cache/rhel-9.qcow2.snap] => {
    "ansible_facts": {
        "_storage_test_crypttab_entries": null,
        "_storage_test_expected_crypttab_entries": null,
        "_storage_test_expected_crypttab_key_file": null
    },
    "changed": false
}

TASK [get information about RAID] **********************************************
task path: /tmp/tmpsvhdx31t/tests/test-verify-volume-md.yml:7
Thursday 21 July 2022  07:14:36 +0000 (0:00:00.033)       0:00:32.446 ********* 
skipping: [/cache/rhel-9.qcow2.snap] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [set_fact] ****************************************************************
task path: /tmp/tmpsvhdx31t/tests/test-verify-volume-md.yml:13
Thursday 21 July 2022  07:14:36 +0000 (0:00:00.038)       0:00:32.485 ********* 
skipping: [/cache/rhel-9.qcow2.snap] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [set_fact] ****************************************************************
task path: /tmp/tmpsvhdx31t/tests/test-verify-volume-md.yml:17
Thursday 21 July 2022  07:14:36 +0000 (0:00:00.034)       0:00:32.519 ********* 
skipping: [/cache/rhel-9.qcow2.snap] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [set_fact] ****************************************************************
task path: /tmp/tmpsvhdx31t/tests/test-verify-volume-md.yml:21
Thursday 21 July 2022  07:14:36 +0000 (0:00:00.034)       0:00:32.554 ********* 
skipping: [/cache/rhel-9.qcow2.snap] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [check RAID active devices count] *****************************************
task path: /tmp/tmpsvhdx31t/tests/test-verify-volume-md.yml:25
Thursday 21 July 2022  07:14:37 +0000 (0:00:00.036)       0:00:32.590 ********* 
skipping: [/cache/rhel-9.qcow2.snap] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [check RAID spare devices count] ******************************************
task path: /tmp/tmpsvhdx31t/tests/test-verify-volume-md.yml:31
Thursday 21 July 2022  07:14:37 +0000 (0:00:00.038)       0:00:32.629 ********* 
skipping: [/cache/rhel-9.qcow2.snap] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [check RAID metadata version] *********************************************
task path: /tmp/tmpsvhdx31t/tests/test-verify-volume-md.yml:37
Thursday 21 July 2022  07:14:37 +0000 (0:00:00.040)       0:00:32.670 ********* 
skipping: [/cache/rhel-9.qcow2.snap] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [parse the actual size of the volume] *************************************
task path: /tmp/tmpsvhdx31t/tests/test-verify-volume-size.yml:3
Thursday 21 July 2022  07:14:37 +0000 (0:00:00.036)       0:00:32.706 ********* 
skipping: [/cache/rhel-9.qcow2.snap] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [parse the requested size of the volume] **********************************
task path: /tmp/tmpsvhdx31t/tests/test-verify-volume-size.yml:9
Thursday 21 July 2022  07:14:37 +0000 (0:00:00.023)       0:00:32.729 ********* 
skipping: [/cache/rhel-9.qcow2.snap] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [Establish base value for expected size] **********************************
task path: /tmp/tmpsvhdx31t/tests/test-verify-volume-size.yml:15
Thursday 21 July 2022  07:14:37 +0000 (0:00:00.066)       0:00:32.796 ********* 
skipping: [/cache/rhel-9.qcow2.snap] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [debug] *******************************************************************
task path: /tmp/tmpsvhdx31t/tests/test-verify-volume-size.yml:20
Thursday 21 July 2022  07:14:37 +0000 (0:00:00.036)       0:00:32.832 ********* 
ok: [/cache/rhel-9.qcow2.snap] => {
    "storage_test_expected_size": "VARIABLE IS NOT DEFINED!: 'storage_test_expected_size' is undefined"
}

TASK [debug] *******************************************************************
task path: /tmp/tmpsvhdx31t/tests/test-verify-volume-size.yml:25
Thursday 21 July 2022  07:14:37 +0000 (0:00:00.034)       0:00:32.867 ********* 
skipping: [/cache/rhel-9.qcow2.snap] => {}

TASK [debug] *******************************************************************
task path: /tmp/tmpsvhdx31t/tests/test-verify-volume-size.yml:28
Thursday 21 July 2022  07:14:37 +0000 (0:00:00.035)       0:00:32.903 ********* 
skipping: [/cache/rhel-9.qcow2.snap] => {}

TASK [Get the size of parent/pool device] **************************************
task path: /tmp/tmpsvhdx31t/tests/test-verify-volume-size.yml:31
Thursday 21 July 2022  07:14:37 +0000 (0:00:00.036)       0:00:32.940 ********* 
skipping: [/cache/rhel-9.qcow2.snap] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [debug] *******************************************************************
task path: /tmp/tmpsvhdx31t/tests/test-verify-volume-size.yml:36
Thursday 21 July 2022  07:14:37 +0000 (0:00:00.035)       0:00:32.975 ********* 
skipping: [/cache/rhel-9.qcow2.snap] => {}

TASK [Calculate the expected size based on pool size and percentage value] *****
task path: /tmp/tmpsvhdx31t/tests/test-verify-volume-size.yml:39
Thursday 21 July 2022  07:14:37 +0000 (0:00:00.035)       0:00:33.010 ********* 
skipping: [/cache/rhel-9.qcow2.snap] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [debug] *******************************************************************
task path: /tmp/tmpsvhdx31t/tests/test-verify-volume-size.yml:44
Thursday 21 July 2022  07:14:37 +0000 (0:00:00.035)       0:00:33.045 ********* 
ok: [/cache/rhel-9.qcow2.snap] => {
    "storage_test_actual_size": {
        "changed": false,
        "skip_reason": "Conditional result was False",
        "skipped": true
    }
}

TASK [debug] *******************************************************************
task path: /tmp/tmpsvhdx31t/tests/test-verify-volume-size.yml:47
Thursday 21 July 2022  07:14:37 +0000 (0:00:00.036)       0:00:33.082 ********* 
ok: [/cache/rhel-9.qcow2.snap] => {
    "storage_test_expected_size": "VARIABLE IS NOT DEFINED!: 'storage_test_expected_size' is undefined"
}

TASK [assert] ******************************************************************
task path: /tmp/tmpsvhdx31t/tests/test-verify-volume-size.yml:50
Thursday 21 July 2022  07:14:37 +0000 (0:00:00.033)       0:00:33.116 ********* 
skipping: [/cache/rhel-9.qcow2.snap] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [Get information about the LV] ********************************************
task path: /tmp/tmpsvhdx31t/tests/test-verify-volume-cache.yml:6
Thursday 21 July 2022  07:14:37 +0000 (0:00:00.023)       0:00:33.140 ********* 
skipping: [/cache/rhel-9.qcow2.snap] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [set_fact] ****************************************************************
task path: /tmp/tmpsvhdx31t/tests/test-verify-volume-cache.yml:14
Thursday 21 July 2022  07:14:37 +0000 (0:00:00.023)       0:00:33.163 ********* 
skipping: [/cache/rhel-9.qcow2.snap] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [check segment type] ******************************************************
task path: /tmp/tmpsvhdx31t/tests/test-verify-volume-cache.yml:17
Thursday 21 July 2022  07:14:37 +0000 (0:00:00.024)       0:00:33.188 ********* 
skipping: [/cache/rhel-9.qcow2.snap] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [set_fact] ****************************************************************
task path: /tmp/tmpsvhdx31t/tests/test-verify-volume-cache.yml:22
Thursday 21 July 2022  07:14:37 +0000 (0:00:00.023)       0:00:33.211 ********* 
skipping: [/cache/rhel-9.qcow2.snap] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [parse the requested cache size] ******************************************
task path: /tmp/tmpsvhdx31t/tests/test-verify-volume-cache.yml:26
Thursday 21 July 2022  07:14:37 +0000 (0:00:00.023)       0:00:33.235 ********* 
skipping: [/cache/rhel-9.qcow2.snap] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [set_fact] ****************************************************************
task path: /tmp/tmpsvhdx31t/tests/test-verify-volume-cache.yml:32
Thursday 21 July 2022  07:14:37 +0000 (0:00:00.025)       0:00:33.260 ********* 
skipping: [/cache/rhel-9.qcow2.snap] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [Check cache size] ********************************************************
task path: /tmp/tmpsvhdx31t/tests/test-verify-volume-cache.yml:36
Thursday 21 July 2022  07:14:37 +0000 (0:00:00.026)       0:00:33.287 ********* 
skipping: [/cache/rhel-9.qcow2.snap] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [Clean up facts] **********************************************************
task path: /tmp/tmpsvhdx31t/tests/test-verify-volume.yml:16
Thursday 21 July 2022  07:14:37 +0000 (0:00:00.026)       0:00:33.313 ********* 
ok: [/cache/rhel-9.qcow2.snap] => {
    "ansible_facts": {
        "_storage_test_volume_present": null
    },
    "changed": false
}

TASK [Clean up variable namespace] *********************************************
task path: /tmp/tmpsvhdx31t/tests/verify-role-results.yml:53
Thursday 21 July 2022  07:14:37 +0000 (0:00:00.037)       0:00:33.351 ********* 
ok: [/cache/rhel-9.qcow2.snap] => {
    "ansible_facts": {
        "storage_test_blkinfo": null,
        "storage_test_crypttab": null,
        "storage_test_fstab": null
    },
    "changed": false
}

TASK [Repeat the previous invocation to verify idempotence] ********************
task path: /tmp/tmpsvhdx31t/tests/tests_change_disk_mount.yml:49
Thursday 21 July 2022  07:14:37 +0000 (0:00:00.034)       0:00:33.386 ********* 

TASK [linux-system-roles.storage : set platform/version specific variables] ****
task path: /tmp/tmpsvhdx31t/tests/roles/linux-system-roles.storage/tasks/main.yml:2
Thursday 21 July 2022  07:14:37 +0000 (0:00:00.049)       0:00:33.435 ********* 
included: /tmp/tmpsvhdx31t/tests/roles/linux-system-roles.storage/tasks/set_vars.yml for /cache/rhel-9.qcow2.snap

TASK [linux-system-roles.storage : Ensure ansible_facts used by role] **********
task path: /tmp/tmpsvhdx31t/tests/roles/linux-system-roles.storage/tasks/set_vars.yml:2
Thursday 21 July 2022  07:14:37 +0000 (0:00:00.034)       0:00:33.470 ********* 
ok: [/cache/rhel-9.qcow2.snap]

TASK [linux-system-roles.storage : Set platform/version specific variables] ****
task path: /tmp/tmpsvhdx31t/tests/roles/linux-system-roles.storage/tasks/set_vars.yml:8
Thursday 21 July 2022  07:14:38 +0000 (0:00:00.497)       0:00:33.967 ********* 
skipping: [/cache/rhel-9.qcow2.snap] => (item=RedHat.yml)  => {
    "ansible_loop_var": "item",
    "changed": false,
    "item": "RedHat.yml",
    "skip_reason": "Conditional result was False"
}
skipping: [/cache/rhel-9.qcow2.snap] => (item=RedHat.yml)  => {
    "ansible_loop_var": "item",
    "changed": false,
    "item": "RedHat.yml",
    "skip_reason": "Conditional result was False"
}
ok: [/cache/rhel-9.qcow2.snap] => (item=RedHat_9.yml) => {
    "ansible_facts": {
        "blivet_package_list": [
            "python3-blivet",
            "libblockdev-crypto",
            "libblockdev-dm",
            "libblockdev-lvm",
            "libblockdev-mdraid",
            "libblockdev-swap",
            "vdo",
            "kmod-kvdo",
            "xfsprogs"
        ]
    },
    "ansible_included_var_files": [
        "/tmp/tmpsvhdx31t/tests/roles/linux-system-roles.storage/vars/RedHat_9.yml"
    ],
    "ansible_loop_var": "item",
    "changed": false,
    "item": "RedHat_9.yml"
}
skipping: [/cache/rhel-9.qcow2.snap] => (item=RedHat_9.0.yml)  => {
    "ansible_loop_var": "item",
    "changed": false,
    "item": "RedHat_9.0.yml",
    "skip_reason": "Conditional result was False"
}

TASK [linux-system-roles.storage : define an empty list of pools to be used in testing] ***
task path: /tmp/tmpsvhdx31t/tests/roles/linux-system-roles.storage/tasks/main.yml:5
Thursday 21 July 2022  07:14:38 +0000 (0:00:00.061)       0:00:34.028 ********* 
ok: [/cache/rhel-9.qcow2.snap] => {
    "ansible_facts": {
        "_storage_pools_list": []
    },
    "changed": false
}

TASK [linux-system-roles.storage : define an empty list of volumes to be used in testing] ***
task path: /tmp/tmpsvhdx31t/tests/roles/linux-system-roles.storage/tasks/main.yml:9
Thursday 21 July 2022  07:14:38 +0000 (0:00:00.032)       0:00:34.061 ********* 
ok: [/cache/rhel-9.qcow2.snap] => {
    "ansible_facts": {
        "_storage_volumes_list": []
    },
    "changed": false
}

TASK [linux-system-roles.storage : include the appropriate provider tasks] *****
task path: /tmp/tmpsvhdx31t/tests/roles/linux-system-roles.storage/tasks/main.yml:13
Thursday 21 July 2022  07:14:38 +0000 (0:00:00.033)       0:00:34.094 ********* 
redirecting (type: modules) ansible.builtin.mount to ansible.posix.mount
redirecting (type: modules) ansible.builtin.mount to ansible.posix.mount
included: /tmp/tmpsvhdx31t/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml for /cache/rhel-9.qcow2.snap

TASK [linux-system-roles.storage : get a list of rpm packages installed on host machine] ***
task path: /tmp/tmpsvhdx31t/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:2
Thursday 21 July 2022  07:14:38 +0000 (0:00:00.047)       0:00:34.142 ********* 
skipping: [/cache/rhel-9.qcow2.snap] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [linux-system-roles.storage : make sure blivet is available] **************
task path: /tmp/tmpsvhdx31t/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:7
Thursday 21 July 2022  07:14:38 +0000 (0:00:00.021)       0:00:34.163 ********* 
skipping: [/cache/rhel-9.qcow2.snap] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [linux-system-roles.storage : show storage_pools] *************************
task path: /tmp/tmpsvhdx31t/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:14
Thursday 21 July 2022  07:14:38 +0000 (0:00:00.027)       0:00:34.190 ********* 
ok: [/cache/rhel-9.qcow2.snap] => {
    "storage_pools": "VARIABLE IS NOT DEFINED!: 'storage_pools' is undefined"
}

TASK [linux-system-roles.storage : show storage_volumes] ***********************
task path: /tmp/tmpsvhdx31t/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:19
Thursday 21 July 2022  07:14:38 +0000 (0:00:00.036)       0:00:34.227 ********* 
ok: [/cache/rhel-9.qcow2.snap] => {
    "storage_volumes": [
        {
            "disks": [
                "sda"
            ],
            "mount_point": "/opt/test2",
            "name": "test1",
            "type": "disk"
        }
    ]
}
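
The volume specification echoed above corresponds to the variables the test passes into the role for this run. A minimal sketch of such an invocation, reconstructed purely from the logged values (the actual tests_change_disk_mount.yml may wrap this differently):

    - name: Mount the disk on /opt/test2
      include_role:
        name: linux-system-roles.storage
      vars:
        storage_volumes:
          - name: test1
            type: disk
            disks:
              - sda
            mount_point: /opt/test2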

TASK [linux-system-roles.storage : get required packages] **********************
task path: /tmp/tmpsvhdx31t/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:24
Thursday 21 July 2022  07:14:38 +0000 (0:00:00.034)       0:00:34.262 ********* 
skipping: [/cache/rhel-9.qcow2.snap] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [linux-system-roles.storage : enable copr repositories if needed] *********
task path: /tmp/tmpsvhdx31t/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:37
Thursday 21 July 2022  07:14:38 +0000 (0:00:00.026)       0:00:34.289 ********* 
skipping: [/cache/rhel-9.qcow2.snap] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [linux-system-roles.storage : make sure required packages are installed] ***
task path: /tmp/tmpsvhdx31t/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:44
Thursday 21 July 2022  07:14:38 +0000 (0:00:00.026)       0:00:34.315 ********* 
skipping: [/cache/rhel-9.qcow2.snap] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [linux-system-roles.storage : get service facts] **************************
task path: /tmp/tmpsvhdx31t/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:51
Thursday 21 July 2022  07:14:38 +0000 (0:00:00.027)       0:00:34.342 ********* 
skipping: [/cache/rhel-9.qcow2.snap] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [linux-system-roles.storage : Set storage_cryptsetup_services] ************
task path: /tmp/tmpsvhdx31t/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:58
Thursday 21 July 2022  07:14:38 +0000 (0:00:00.072)       0:00:34.415 ********* 
ok: [/cache/rhel-9.qcow2.snap] => {
    "ansible_facts": {
        "storage_cryptsetup_services": []
    },
    "changed": false
}

TASK [linux-system-roles.storage : Mask the systemd cryptsetup services] *******
task path: /tmp/tmpsvhdx31t/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:71
Thursday 21 July 2022  07:14:38 +0000 (0:00:00.056)       0:00:34.471 ********* 

TASK [linux-system-roles.storage : manage the pools and volumes to match the specified state] ***
task path: /tmp/tmpsvhdx31t/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:77
Thursday 21 July 2022  07:14:38 +0000 (0:00:00.021)       0:00:34.493 ********* 
ok: [/cache/rhel-9.qcow2.snap] => {
    "actions": [],
    "changed": false,
    "crypts": [],
    "leaves": [
        "/dev/sr0",
        "/dev/vda1",
        "/dev/vda2",
        "/dev/vda3",
        "/dev/vda4",
        "/dev/sda",
        "/dev/sdb",
        "/dev/sdc",
        "/dev/nvme0n1",
        "/dev/nvme1n1",
        "/dev/nvme2n1",
        "/dev/vdb1",
        "/dev/vdc1",
        "/dev/vdd",
        "/dev/vde",
        "/dev/vdf"
    ],
    "mounts": [
        {
            "dump": 0,
            "fstype": "xfs",
            "opts": "defaults",
            "passno": 0,
            "path": "/opt/test2",
            "src": "UUID=31c6244a-312c-4308-8a54-e9662b6a934a",
            "state": "mounted"
        }
    ],
    "packages": [
        "dosfstools",
        "e2fsprogs",
        "xfsprogs"
    ],
    "pools": [],
    "volumes": [
        {
            "_device": "/dev/sda",
            "_kernel_device": "/dev/sda",
            "_mount_id": "UUID=31c6244a-312c-4308-8a54-e9662b6a934a",
            "_raw_device": "/dev/sda",
            "_raw_kernel_device": "/dev/sda",
            "cache_devices": [],
            "cache_mode": null,
            "cache_size": 0,
            "cached": false,
            "compression": null,
            "deduplication": null,
            "disks": [
                "sda"
            ],
            "encryption": false,
            "encryption_cipher": null,
            "encryption_key": null,
            "encryption_key_size": null,
            "encryption_luks_version": null,
            "encryption_password": null,
            "fs_create_options": "",
            "fs_label": "",
            "fs_overwrite_existing": true,
            "fs_type": "xfs",
            "mount_check": 0,
            "mount_device_identifier": "uuid",
            "mount_options": "defaults",
            "mount_passno": 0,
            "mount_point": "/opt/test2",
            "name": "test1",
            "raid_chunk_size": null,
            "raid_device_count": null,
            "raid_level": null,
            "raid_metadata_version": null,
            "raid_spare_count": null,
            "size": 10737418240,
            "state": "present",
            "thin": null,
            "thin_pool_name": null,
            "thin_pool_size": null,
            "type": "disk",
            "vdo_pool_size": null
        }
    ]
}

TASK [linux-system-roles.storage : Workaround for udev issue on some platforms] ***
task path: /tmp/tmpsvhdx31t/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:91
Thursday 21 July 2022  07:14:40 +0000 (0:00:01.341)       0:00:35.835 ********* 
skipping: [/cache/rhel-9.qcow2.snap] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [linux-system-roles.storage : Unmask the systemd cryptsetup services] *****
task path: /tmp/tmpsvhdx31t/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:103
Thursday 21 July 2022  07:14:40 +0000 (0:00:00.035)       0:00:35.870 ********* 

TASK [linux-system-roles.storage : show blivet_output] *************************
task path: /tmp/tmpsvhdx31t/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:109
Thursday 21 July 2022  07:14:40 +0000 (0:00:00.021)       0:00:35.892 ********* 
ok: [/cache/rhel-9.qcow2.snap] => {
    "blivet_output": {
        "actions": [],
        "changed": false,
        "crypts": [],
        "failed": false,
        "leaves": [
            "/dev/sr0",
            "/dev/vda1",
            "/dev/vda2",
            "/dev/vda3",
            "/dev/vda4",
            "/dev/sda",
            "/dev/sdb",
            "/dev/sdc",
            "/dev/nvme0n1",
            "/dev/nvme1n1",
            "/dev/nvme2n1",
            "/dev/vdb1",
            "/dev/vdc1",
            "/dev/vdd",
            "/dev/vde",
            "/dev/vdf"
        ],
        "mounts": [
            {
                "dump": 0,
                "fstype": "xfs",
                "opts": "defaults",
                "passno": 0,
                "path": "/opt/test2",
                "src": "UUID=31c6244a-312c-4308-8a54-e9662b6a934a",
                "state": "mounted"
            }
        ],
        "packages": [
            "dosfstools",
            "e2fsprogs",
            "xfsprogs"
        ],
        "pools": [],
        "volumes": [
            {
                "_device": "/dev/sda",
                "_kernel_device": "/dev/sda",
                "_mount_id": "UUID=31c6244a-312c-4308-8a54-e9662b6a934a",
                "_raw_device": "/dev/sda",
                "_raw_kernel_device": "/dev/sda",
                "cache_devices": [],
                "cache_mode": null,
                "cache_size": 0,
                "cached": false,
                "compression": null,
                "deduplication": null,
                "disks": [
                    "sda"
                ],
                "encryption": false,
                "encryption_cipher": null,
                "encryption_key": null,
                "encryption_key_size": null,
                "encryption_luks_version": null,
                "encryption_password": null,
                "fs_create_options": "",
                "fs_label": "",
                "fs_overwrite_existing": true,
                "fs_type": "xfs",
                "mount_check": 0,
                "mount_device_identifier": "uuid",
                "mount_options": "defaults",
                "mount_passno": 0,
                "mount_point": "/opt/test2",
                "name": "test1",
                "raid_chunk_size": null,
                "raid_device_count": null,
                "raid_level": null,
                "raid_metadata_version": null,
                "raid_spare_count": null,
                "size": 10737418240,
                "state": "present",
                "thin": null,
                "thin_pool_name": null,
                "thin_pool_size": null,
                "type": "disk",
                "vdo_pool_size": null
            }
        ]
    }
}
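
The empty "actions" list and "changed": false in blivet_output above are what the idempotence re-run is meant to demonstrate: re-applying the same specification performs no work. A check for that could be written roughly as follows (a sketch; it is not the literal assertion used by verify-role-results.yml):

    - name: Assert that the repeated invocation changed nothing
      ansible.builtin.assert:
        that:
          - not blivet_output.changed
          - blivet_output.actions | length == 0
        msg: Repeated storage role invocation was not idempotent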

TASK [linux-system-roles.storage : set the list of pools for test verification] ***
task path: /tmp/tmpsvhdx31t/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:114
Thursday 21 July 2022  07:14:40 +0000 (0:00:00.037)       0:00:35.929 ********* 
ok: [/cache/rhel-9.qcow2.snap] => {
    "ansible_facts": {
        "_storage_pools_list": []
    },
    "changed": false
}

TASK [linux-system-roles.storage : set the list of volumes for test verification] ***
task path: /tmp/tmpsvhdx31t/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:118
Thursday 21 July 2022  07:14:40 +0000 (0:00:00.036)       0:00:35.965 ********* 
ok: [/cache/rhel-9.qcow2.snap] => {
    "ansible_facts": {
        "_storage_volumes_list": [
            {
                "_device": "/dev/sda",
                "_kernel_device": "/dev/sda",
                "_mount_id": "UUID=31c6244a-312c-4308-8a54-e9662b6a934a",
                "_raw_device": "/dev/sda",
                "_raw_kernel_device": "/dev/sda",
                "cache_devices": [],
                "cache_mode": null,
                "cache_size": 0,
                "cached": false,
                "compression": null,
                "deduplication": null,
                "disks": [
                    "sda"
                ],
                "encryption": false,
                "encryption_cipher": null,
                "encryption_key": null,
                "encryption_key_size": null,
                "encryption_luks_version": null,
                "encryption_password": null,
                "fs_create_options": "",
                "fs_label": "",
                "fs_overwrite_existing": true,
                "fs_type": "xfs",
                "mount_check": 0,
                "mount_device_identifier": "uuid",
                "mount_options": "defaults",
                "mount_passno": 0,
                "mount_point": "/opt/test2",
                "name": "test1",
                "raid_chunk_size": null,
                "raid_device_count": null,
                "raid_level": null,
                "raid_metadata_version": null,
                "raid_spare_count": null,
                "size": 10737418240,
                "state": "present",
                "thin": null,
                "thin_pool_name": null,
                "thin_pool_size": null,
                "type": "disk",
                "vdo_pool_size": null
            }
        ]
    },
    "changed": false
}

TASK [linux-system-roles.storage : remove obsolete mounts] *********************
task path: /tmp/tmpsvhdx31t/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:134
Thursday 21 July 2022  07:14:40 +0000 (0:00:00.037)       0:00:36.003 ********* 

TASK [linux-system-roles.storage : tell systemd to refresh its view of /etc/fstab] ***
task path: /tmp/tmpsvhdx31t/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:146
Thursday 21 July 2022  07:14:40 +0000 (0:00:00.038)       0:00:36.041 ********* 
ok: [/cache/rhel-9.qcow2.snap] => {
    "changed": false,
    "name": null,
    "status": {}
}

TASK [linux-system-roles.storage : set up new/current mounts] ******************
task path: /tmp/tmpsvhdx31t/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:151
Thursday 21 July 2022  07:14:41 +0000 (0:00:00.649)       0:00:36.690 ********* 
redirecting (type: modules) ansible.builtin.mount to ansible.posix.mount
ok: [/cache/rhel-9.qcow2.snap] => (item={'src': 'UUID=31c6244a-312c-4308-8a54-e9662b6a934a', 'path': '/opt/test2', 'fstype': 'xfs', 'opts': 'defaults', 'dump': 0, 'passno': 0, 'state': 'mounted'}) => {
    "ansible_loop_var": "mount_info",
    "backup_file": "",
    "boot": "yes",
    "changed": false,
    "dump": "0",
    "fstab": "/etc/fstab",
    "fstype": "xfs",
    "mount_info": {
        "dump": 0,
        "fstype": "xfs",
        "opts": "defaults",
        "passno": 0,
        "path": "/opt/test2",
        "src": "UUID=31c6244a-312c-4308-8a54-e9662b6a934a",
        "state": "mounted"
    },
    "name": "/opt/test2",
    "opts": "defaults",
    "passno": "0",
    "src": "UUID=31c6244a-312c-4308-8a54-e9662b6a934a"
}
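
The result above is ansible.posix.mount keeping the fstab entry and mount for the volume in place; the role's own task loops over the reported mounts (ansible_loop_var: mount_info above). Stripped of that templating, the equivalent standalone task would look roughly like this, with values taken from the logged result:

    - name: Ensure /opt/test2 is mounted from the volume's UUID
      ansible.posix.mount:
        src: UUID=31c6244a-312c-4308-8a54-e9662b6a934a
        path: /opt/test2
        fstype: xfs
        opts: defaults
        state: mounted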

TASK [linux-system-roles.storage : tell systemd to refresh its view of /etc/fstab] ***
task path: /tmp/tmpsvhdx31t/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:163
Thursday 21 July 2022  07:14:41 +0000 (0:00:00.384)       0:00:37.075 ********* 
ok: [/cache/rhel-9.qcow2.snap] => {
    "changed": false,
    "name": null,
    "status": {}
}

TASK [linux-system-roles.storage : retrieve facts for the /etc/crypttab file] ***
task path: /tmp/tmpsvhdx31t/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:171
Thursday 21 July 2022  07:14:42 +0000 (0:00:00.661)       0:00:37.736 ********* 
ok: [/cache/rhel-9.qcow2.snap] => {
    "changed": false,
    "stat": {
        "exists": false
    }
}

TASK [linux-system-roles.storage : manage /etc/crypttab to account for changes we just made] ***
task path: /tmp/tmpsvhdx31t/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:176
Thursday 21 July 2022  07:14:42 +0000 (0:00:00.367)       0:00:38.104 ********* 

TASK [linux-system-roles.storage : Update facts] *******************************
task path: /tmp/tmpsvhdx31t/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:198
Thursday 21 July 2022  07:14:42 +0000 (0:00:00.021)       0:00:38.126 ********* 
ok: [/cache/rhel-9.qcow2.snap]
META: role_complete for /cache/rhel-9.qcow2.snap

TASK [include_tasks] ***********************************************************
task path: /tmp/tmpsvhdx31t/tests/tests_change_disk_mount.yml:59
Thursday 21 July 2022  07:14:43 +0000 (0:00:00.981)       0:00:39.107 ********* 
included: /tmp/tmpsvhdx31t/tests/verify-role-results.yml for /cache/rhel-9.qcow2.snap

TASK [Print out pool information] **********************************************
task path: /tmp/tmpsvhdx31t/tests/verify-role-results.yml:1
Thursday 21 July 2022  07:14:43 +0000 (0:00:00.044)       0:00:39.152 ********* 
skipping: [/cache/rhel-9.qcow2.snap] => {}

TASK [Print out volume information] ********************************************
task path: /tmp/tmpsvhdx31t/tests/verify-role-results.yml:6
Thursday 21 July 2022  07:14:43 +0000 (0:00:00.038)       0:00:39.191 ********* 
ok: [/cache/rhel-9.qcow2.snap] => {
    "_storage_volumes_list": [
        {
            "_device": "/dev/sda",
            "_kernel_device": "/dev/sda",
            "_mount_id": "UUID=31c6244a-312c-4308-8a54-e9662b6a934a",
            "_raw_device": "/dev/sda",
            "_raw_kernel_device": "/dev/sda",
            "cache_devices": [],
            "cache_mode": null,
            "cache_size": 0,
            "cached": false,
            "compression": null,
            "deduplication": null,
            "disks": [
                "sda"
            ],
            "encryption": false,
            "encryption_cipher": null,
            "encryption_key": null,
            "encryption_key_size": null,
            "encryption_luks_version": null,
            "encryption_password": null,
            "fs_create_options": "",
            "fs_label": "",
            "fs_overwrite_existing": true,
            "fs_type": "xfs",
            "mount_check": 0,
            "mount_device_identifier": "uuid",
            "mount_options": "defaults",
            "mount_passno": 0,
            "mount_point": "/opt/test2",
            "name": "test1",
            "raid_chunk_size": null,
            "raid_device_count": null,
            "raid_level": null,
            "raid_metadata_version": null,
            "raid_spare_count": null,
            "size": 10737418240,
            "state": "present",
            "thin": null,
            "thin_pool_name": null,
            "thin_pool_size": null,
            "type": "disk",
            "vdo_pool_size": null
        }
    ]
}

TASK [Collect info about the volumes.] *****************************************
task path: /tmp/tmpsvhdx31t/tests/verify-role-results.yml:14
Thursday 21 July 2022  07:14:43 +0000 (0:00:00.079)       0:00:39.270 ********* 
ok: [/cache/rhel-9.qcow2.snap] => {
    "changed": false,
    "info": {
        "/dev/nvme0n1": {
            "fstype": "",
            "label": "",
            "name": "/dev/nvme0n1",
            "size": "10G",
            "type": "disk",
            "uuid": ""
        },
        "/dev/nvme1n1": {
            "fstype": "",
            "label": "",
            "name": "/dev/nvme1n1",
            "size": "10G",
            "type": "disk",
            "uuid": ""
        },
        "/dev/nvme2n1": {
            "fstype": "",
            "label": "",
            "name": "/dev/nvme2n1",
            "size": "10G",
            "type": "disk",
            "uuid": ""
        },
        "/dev/sda": {
            "fstype": "xfs",
            "label": "",
            "name": "/dev/sda",
            "size": "10G",
            "type": "disk",
            "uuid": "31c6244a-312c-4308-8a54-e9662b6a934a"
        },
        "/dev/sdb": {
            "fstype": "",
            "label": "",
            "name": "/dev/sdb",
            "size": "10G",
            "type": "disk",
            "uuid": ""
        },
        "/dev/sdc": {
            "fstype": "",
            "label": "",
            "name": "/dev/sdc",
            "size": "10G",
            "type": "disk",
            "uuid": ""
        },
        "/dev/sr0": {
            "fstype": "iso9660",
            "label": "cidata",
            "name": "/dev/sr0",
            "size": "364K",
            "type": "rom",
            "uuid": "2022-07-21-07-13-51-00"
        },
        "/dev/vda": {
            "fstype": "",
            "label": "",
            "name": "/dev/vda",
            "size": "10G",
            "type": "disk",
            "uuid": ""
        },
        "/dev/vda1": {
            "fstype": "",
            "label": "",
            "name": "/dev/vda1",
            "size": "1M",
            "type": "partition",
            "uuid": ""
        },
        "/dev/vda2": {
            "fstype": "vfat",
            "label": "",
            "name": "/dev/vda2",
            "size": "200M",
            "type": "partition",
            "uuid": "7B77-95E7"
        },
        "/dev/vda3": {
            "fstype": "xfs",
            "label": "boot",
            "name": "/dev/vda3",
            "size": "500M",
            "type": "partition",
            "uuid": "6e74e171-0370-451f-8340-f16ad2839183"
        },
        "/dev/vda4": {
            "fstype": "xfs",
            "label": "root",
            "name": "/dev/vda4",
            "size": "9.3G",
            "type": "partition",
            "uuid": "1bb53e4d-984c-4316-908a-59b5a62fa30e"
        },
        "/dev/vdb": {
            "fstype": "",
            "label": "",
            "name": "/dev/vdb",
            "size": "2G",
            "type": "disk",
            "uuid": ""
        },
        "/dev/vdb1": {
            "fstype": "ext4",
            "label": "yumcache",
            "name": "/dev/vdb1",
            "size": "2G",
            "type": "partition",
            "uuid": "12be66f7-7f03-4d54-9a48-a520825d8fa9"
        },
        "/dev/vdc": {
            "fstype": "",
            "label": "",
            "name": "/dev/vdc",
            "size": "2G",
            "type": "disk",
            "uuid": ""
        },
        "/dev/vdc1": {
            "fstype": "ext4",
            "label": "yumvarlib",
            "name": "/dev/vdc1",
            "size": "2G",
            "type": "partition",
            "uuid": "9bd037ed-b1bb-4b9c-a008-9f07d34ab071"
        },
        "/dev/vdd": {
            "fstype": "",
            "label": "",
            "name": "/dev/vdd",
            "size": "10G",
            "type": "disk",
            "uuid": ""
        },
        "/dev/vde": {
            "fstype": "",
            "label": "",
            "name": "/dev/vde",
            "size": "10G",
            "type": "disk",
            "uuid": ""
        },
        "/dev/vdf": {
            "fstype": "",
            "label": "",
            "name": "/dev/vdf",
            "size": "10G",
            "type": "disk",
            "uuid": ""
        }
    }
}
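
The per-device fields collected here (name, fstype, label, size, type, uuid) match the standard lsblk columns, so a rough manual cross-check on the host could look like this (an illustrative task, not the module the test itself uses):

    - name: Spot-check block device info by hand
      ansible.builtin.command:
        cmd: lsblk -o NAME,FSTYPE,LABEL,SIZE,TYPE,UUID
      register: lsblk_out
      changed_when: false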

TASK [Read the /etc/fstab file for volume existence] ***************************
task path: /tmp/tmpsvhdx31t/tests/verify-role-results.yml:19
Thursday 21 July 2022  07:14:44 +0000 (0:00:00.401)       0:00:39.671 ********* 
ok: [/cache/rhel-9.qcow2.snap] => {
    "changed": false,
    "cmd": [
        "cat",
        "/etc/fstab"
    ],
    "delta": "0:00:00.002377",
    "end": "2022-07-21 03:14:43.787141",
    "rc": 0,
    "start": "2022-07-21 03:14:43.784764"
}

STDOUT:

UUID=6e74e171-0370-451f-8340-f16ad2839183	/boot	xfs	defaults	0	0
UUID=1bb53e4d-984c-4316-908a-59b5a62fa30e	/	xfs	defaults	0	0
UUID=7B77-95E7	/boot/efi	vfat	defaults,uid=0,gid=0,umask=077,shortname=winnt	0	2
/dev/vdb1	/var/cache/dnf	auto	defaults,nofail,x-systemd.requires=cloud-init.service,_netdev,comment=cloudconfig	0	2
/dev/vdc1	/var/lib/dnf	auto	defaults,nofail,x-systemd.requires=cloud-init.service,_netdev,comment=cloudconfig	0	2
UUID=31c6244a-312c-4308-8a54-e9662b6a934a /opt/test2 xfs defaults 0 0

TASK [Read the /etc/crypttab file] *********************************************
task path: /tmp/tmpsvhdx31t/tests/verify-role-results.yml:24
Thursday 21 July 2022  07:14:44 +0000 (0:00:00.396)       0:00:40.067 ********* 
ok: [/cache/rhel-9.qcow2.snap] => {
    "changed": false,
    "cmd": [
        "cat",
        "/etc/crypttab"
    ],
    "delta": "0:00:00.002549",
    "end": "2022-07-21 03:14:44.146242",
    "failed_when_result": false,
    "rc": 1,
    "start": "2022-07-21 03:14:44.143693"
}

STDERR:

cat: /etc/crypttab: No such file or directory


MSG:

non-zero return code
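
The non-zero rc is expected here: /etc/crypttab legitimately may not exist, and the task's failure condition is overridden ("failed_when_result": false) so the missing file does not fail the play. The general pattern looks like this (a simplified sketch with an assumed register name; the test's own condition may be more specific than a blanket failed_when: false):

    - name: Read /etc/crypttab, tolerating its absence
      ansible.builtin.command:
        cmd: cat /etc/crypttab
      register: crypttab_out
      failed_when: false
      changed_when: false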

TASK [Verify the volumes listed in storage_pools were correctly managed] *******
task path: /tmp/tmpsvhdx31t/tests/verify-role-results.yml:33
Thursday 21 July 2022  07:14:44 +0000 (0:00:00.360)       0:00:40.428 ********* 

TASK [Verify the volumes with no pool were correctly managed] ******************
task path: /tmp/tmpsvhdx31t/tests/verify-role-results.yml:43
Thursday 21 July 2022  07:14:44 +0000 (0:00:00.020)       0:00:40.449 ********* 
included: /tmp/tmpsvhdx31t/tests/test-verify-volume.yml for /cache/rhel-9.qcow2.snap => (item={'encryption': False, 'encryption_cipher': None, 'encryption_key': None, 'encryption_key_size': None, 'encryption_luks_version': None, 'encryption_password': None, 'fs_create_options': '', 'fs_label': '', 'fs_type': 'xfs', 'mount_options': 'defaults', 'mount_point': '/opt/test2', 'name': 'test1', 'raid_level': None, 'size': 10737418240, 'state': 'present', 'type': 'disk', 'disks': ['sda'], 'raid_device_count': None, 'raid_spare_count': None, 'raid_metadata_version': None, 'fs_overwrite_existing': True, 'mount_check': 0, 'mount_passno': 0, 'mount_device_identifier': 'uuid', 'raid_chunk_size': None, 'compression': None, 'deduplication': None, 'vdo_pool_size': None, 'thin': None, 'thin_pool_name': None, 'thin_pool_size': None, 'cached': False, 'cache_size': 0, 'cache_mode': None, 'cache_devices': [], '_device': '/dev/sda', '_raw_device': '/dev/sda', '_mount_id': 'UUID=31c6244a-312c-4308-8a54-e9662b6a934a', '_kernel_device': '/dev/sda', '_raw_kernel_device': '/dev/sda'})

TASK [set_fact] ****************************************************************
task path: /tmp/tmpsvhdx31t/tests/test-verify-volume.yml:2
Thursday 21 July 2022  07:14:44 +0000 (0:00:00.062)       0:00:40.511 ********* 
ok: [/cache/rhel-9.qcow2.snap] => {
    "ansible_facts": {
        "_storage_test_volume_present": true,
        "_storage_volume_tests": [
            "mount",
            "fstab",
            "fs",
            "device",
            "encryption",
            "md",
            "size",
            "cache"
        ]
    },
    "changed": false
}

TASK [include_tasks] ***********************************************************
task path: /tmp/tmpsvhdx31t/tests/test-verify-volume.yml:10
Thursday 21 July 2022  07:14:44 +0000 (0:00:00.049)       0:00:40.560 ********* 
included: /tmp/tmpsvhdx31t/tests/test-verify-volume-mount.yml for /cache/rhel-9.qcow2.snap => (item=mount)
included: /tmp/tmpsvhdx31t/tests/test-verify-volume-fstab.yml for /cache/rhel-9.qcow2.snap => (item=fstab)
included: /tmp/tmpsvhdx31t/tests/test-verify-volume-fs.yml for /cache/rhel-9.qcow2.snap => (item=fs)
included: /tmp/tmpsvhdx31t/tests/test-verify-volume-device.yml for /cache/rhel-9.qcow2.snap => (item=device)
included: /tmp/tmpsvhdx31t/tests/test-verify-volume-encryption.yml for /cache/rhel-9.qcow2.snap => (item=encryption)
included: /tmp/tmpsvhdx31t/tests/test-verify-volume-md.yml for /cache/rhel-9.qcow2.snap => (item=md)
included: /tmp/tmpsvhdx31t/tests/test-verify-volume-size.yml for /cache/rhel-9.qcow2.snap => (item=size)
included: /tmp/tmpsvhdx31t/tests/test-verify-volume-cache.yml for /cache/rhel-9.qcow2.snap => (item=cache)

TASK [Get expected mount device based on device type] **************************
task path: /tmp/tmpsvhdx31t/tests/test-verify-volume-mount.yml:6
Thursday 21 July 2022  07:14:45 +0000 (0:00:00.069)       0:00:40.630 ********* 
ok: [/cache/rhel-9.qcow2.snap] => {
    "ansible_facts": {
        "storage_test_device_path": "/dev/sda"
    },
    "changed": false
}

TASK [Set some facts] **********************************************************
task path: /tmp/tmpsvhdx31t/tests/test-verify-volume-mount.yml:14
Thursday 21 July 2022  07:14:45 +0000 (0:00:00.040)       0:00:40.670 ********* 
ok: [/cache/rhel-9.qcow2.snap] => {
    "ansible_facts": {
        "storage_test_mount_device_matches": [
            {
                "block_available": 2592358,
                "block_size": 4096,
                "block_total": 2618880,
                "block_used": 26522,
                "device": "/dev/sda",
                "fstype": "xfs",
                "inode_available": 5242877,
                "inode_total": 5242880,
                "inode_used": 3,
                "mount": "/opt/test2",
                "options": "rw,seclabel,relatime,attr2,inode64,logbufs=8,logbsize=32k,noquota",
                "size_available": 10618298368,
                "size_total": 10726932480,
                "uuid": "31c6244a-312c-4308-8a54-e9662b6a934a"
            }
        ],
        "storage_test_mount_expected_match_count": "1",
        "storage_test_mount_point_matches": [
            {
                "block_available": 2592358,
                "block_size": 4096,
                "block_total": 2618880,
                "block_used": 26522,
                "device": "/dev/sda",
                "fstype": "xfs",
                "inode_available": 5242877,
                "inode_total": 5242880,
                "inode_used": 3,
                "mount": "/opt/test2",
                "options": "rw,seclabel,relatime,attr2,inode64,logbufs=8,logbsize=32k,noquota",
                "size_available": 10618298368,
                "size_total": 10726932480,
                "uuid": "31c6244a-312c-4308-8a54-e9662b6a934a"
            }
        ],
        "storage_test_swap_expected_matches": "0"
    },
    "changed": false
}
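
The two match lists above come from filtering the gathered mount facts by device and by mount point, with the expected match counts checked by the assertions that follow. In Jinja terms the filtering amounts to something like this (a sketch of the idea, using literal values from the output above; the test builds the expressions from the volume variables instead):

    - name: Collect mount entries matching the device and the mount point
      ansible.builtin.set_fact:
        storage_test_mount_device_matches: "{{ ansible_facts.mounts | selectattr('device', 'equalto', '/dev/sda') | list }}"
        storage_test_mount_point_matches: "{{ ansible_facts.mounts | selectattr('mount', 'equalto', '/opt/test2') | list }}"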

TASK [Verify the current mount state by device] ********************************
task path: /tmp/tmpsvhdx31t/tests/test-verify-volume-mount.yml:28
Thursday 21 July 2022  07:14:45 +0000 (0:00:00.067)       0:00:40.738 ********* 
ok: [/cache/rhel-9.qcow2.snap] => {
    "changed": false
}

MSG:

All assertions passed

TASK [Verify the current mount state by mount point] ***************************
task path: /tmp/tmpsvhdx31t/tests/test-verify-volume-mount.yml:37
Thursday 21 July 2022  07:14:45 +0000 (0:00:00.050)       0:00:40.789 ********* 
ok: [/cache/rhel-9.qcow2.snap] => {
    "changed": false
}

MSG:

All assertions passed

TASK [Verify the mount fs type] ************************************************
task path: /tmp/tmpsvhdx31t/tests/test-verify-volume-mount.yml:45
Thursday 21 July 2022  07:14:45 +0000 (0:00:00.049)       0:00:40.838 ********* 
ok: [/cache/rhel-9.qcow2.snap] => {
    "changed": false
}

MSG:

All assertions passed

TASK [command] *****************************************************************
task path: /tmp/tmpsvhdx31t/tests/test-verify-volume-mount.yml:54
Thursday 21 July 2022  07:14:45 +0000 (0:00:00.055)       0:00:40.894 ********* 
skipping: [/cache/rhel-9.qcow2.snap] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [Gather swap info] ********************************************************
task path: /tmp/tmpsvhdx31t/tests/test-verify-volume-mount.yml:58
Thursday 21 July 2022  07:14:45 +0000 (0:00:00.023)       0:00:40.918 ********* 
skipping: [/cache/rhel-9.qcow2.snap] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [Verify swap status] ******************************************************
task path: /tmp/tmpsvhdx31t/tests/test-verify-volume-mount.yml:63
Thursday 21 July 2022  07:14:45 +0000 (0:00:00.023)       0:00:40.941 ********* 
skipping: [/cache/rhel-9.qcow2.snap] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [Unset facts] *************************************************************
task path: /tmp/tmpsvhdx31t/tests/test-verify-volume-mount.yml:75
Thursday 21 July 2022  07:14:45 +0000 (0:00:00.024)       0:00:40.966 ********* 
ok: [/cache/rhel-9.qcow2.snap] => {
    "ansible_facts": {
        "storage_test_mount_device_matches": null,
        "storage_test_mount_expected_match_count": null,
        "storage_test_mount_point_matches": null,
        "storage_test_swap_expected_matches": null,
        "storage_test_swaps": null,
        "storage_test_sys_node": null
    },
    "changed": false
}

TASK [Set some variables for fstab checking] ***********************************
task path: /tmp/tmpsvhdx31t/tests/test-verify-volume-fstab.yml:2
Thursday 21 July 2022  07:14:45 +0000 (0:00:00.032)       0:00:40.999 ********* 
ok: [/cache/rhel-9.qcow2.snap] => {
    "ansible_facts": {
        "storage_test_fstab_expected_id_matches": "1",
        "storage_test_fstab_expected_mount_options_matches": "1",
        "storage_test_fstab_expected_mount_point_matches": "1",
        "storage_test_fstab_id_matches": [
            "UUID=31c6244a-312c-4308-8a54-e9662b6a934a "
        ],
        "storage_test_fstab_mount_options_matches": [
            " /opt/test2 xfs defaults "
        ],
        "storage_test_fstab_mount_point_matches": [
            " /opt/test2 "
        ]
    },
    "changed": false
}
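
These fstab-checking facts are line matches pulled out of the /etc/fstab contents read earlier, presumably compared against the expected counts by the assertions that follow. Assuming the cat /etc/fstab result is registered as storage_test_fstab (as the variable cleanup in verify-role-results.yml suggests), the device-identifier match could be built roughly like this (a sketch, not the test's exact expression):

    - name: Collect fstab lines that reference the volume's UUID
      ansible.builtin.set_fact:
        storage_test_fstab_id_matches: "{{ storage_test_fstab.stdout_lines | select('search', 'UUID=31c6244a-312c-4308-8a54-e9662b6a934a') | list }}"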

TASK [Verify that the device identifier appears in /etc/fstab] *****************
task path: /tmp/tmpsvhdx31t/tests/test-verify-volume-fstab.yml:25
Thursday 21 July 2022  07:14:45 +0000 (0:00:00.060)       0:00:41.059 ********* 
ok: [/cache/rhel-9.qcow2.snap] => {
    "changed": false
}

MSG:

All assertions passed

TASK [Verify the fstab mount point] ********************************************
task path: /tmp/tmpsvhdx31t/tests/test-verify-volume-fstab.yml:32
Thursday 21 July 2022  07:14:45 +0000 (0:00:00.051)       0:00:41.111 ********* 
ok: [/cache/rhel-9.qcow2.snap] => {
    "changed": false
}

MSG:

All assertions passed

TASK [Verify mount_options] ****************************************************
task path: /tmp/tmpsvhdx31t/tests/test-verify-volume-fstab.yml:39
Thursday 21 July 2022  07:14:45 +0000 (0:00:00.049)       0:00:41.160 ********* 
skipping: [/cache/rhel-9.qcow2.snap] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [Clean up variables] ******************************************************
task path: /tmp/tmpsvhdx31t/tests/test-verify-volume-fstab.yml:49
Thursday 21 July 2022  07:14:45 +0000 (0:00:00.035)       0:00:41.196 ********* 
ok: [/cache/rhel-9.qcow2.snap] => {
    "ansible_facts": {
        "storage_test_fstab_expected_id_matches": null,
        "storage_test_fstab_expected_mount_options_matches": null,
        "storage_test_fstab_expected_mount_point_matches": null,
        "storage_test_fstab_id_matches": null,
        "storage_test_fstab_mount_options_matches": null,
        "storage_test_fstab_mount_point_matches": null
    },
    "changed": false
}

TASK [Verify fs type] **********************************************************
task path: /tmp/tmpsvhdx31t/tests/test-verify-volume-fs.yml:4
Thursday 21 July 2022  07:14:45 +0000 (0:00:00.034)       0:00:41.230 ********* 
ok: [/cache/rhel-9.qcow2.snap] => {
    "changed": false
}

MSG:

All assertions passed

TASK [Verify fs label] *********************************************************
task path: /tmp/tmpsvhdx31t/tests/test-verify-volume-fs.yml:10
Thursday 21 July 2022  07:14:45 +0000 (0:00:00.042)       0:00:41.273 ********* 
ok: [/cache/rhel-9.qcow2.snap] => {
    "changed": false
}

MSG:

All assertions passed

TASK [See whether the device node is present] **********************************
task path: /tmp/tmpsvhdx31t/tests/test-verify-volume-device.yml:4
Thursday 21 July 2022  07:14:45 +0000 (0:00:00.039)       0:00:41.313 ********* 
ok: [/cache/rhel-9.qcow2.snap] => {
    "changed": false,
    "stat": {
        "atime": 1658387656.9902117,
        "attr_flags": "",
        "attributes": [],
        "block_size": 4096,
        "blocks": 0,
        "charset": "binary",
        "ctime": 1658387656.9902117,
        "dev": 5,
        "device_type": 2048,
        "executable": false,
        "exists": true,
        "gid": 6,
        "gr_name": "disk",
        "inode": 363,
        "isblk": true,
        "ischr": false,
        "isdir": false,
        "isfifo": false,
        "isgid": false,
        "islnk": false,
        "isreg": false,
        "issock": false,
        "isuid": false,
        "mimetype": "inode/blockdevice",
        "mode": "0660",
        "mtime": 1658387656.9902117,
        "nlink": 1,
        "path": "/dev/sda",
        "pw_name": "root",
        "readable": true,
        "rgrp": true,
        "roth": false,
        "rusr": true,
        "size": 0,
        "uid": 0,
        "version": null,
        "wgrp": true,
        "woth": false,
        "writeable": true,
        "wusr": true,
        "xgrp": false,
        "xoth": false,
        "xusr": false
    }
}

TASK [Verify the presence/absence of the device node] **************************
task path: /tmp/tmpsvhdx31t/tests/test-verify-volume-device.yml:10
Thursday 21 July 2022  07:14:46 +0000 (0:00:00.409)       0:00:41.722 ********* 
ok: [/cache/rhel-9.qcow2.snap] => {
    "changed": false
}

MSG:

All assertions passed

TASK [Make sure we got info about this volume] *********************************
task path: /tmp/tmpsvhdx31t/tests/test-verify-volume-device.yml:18
Thursday 21 July 2022  07:14:46 +0000 (0:00:00.109)       0:00:41.831 ********* 
ok: [/cache/rhel-9.qcow2.snap] => {
    "changed": false
}

MSG:

All assertions passed

TASK [(1/2) Process volume type (set initial value)] ***************************
task path: /tmp/tmpsvhdx31t/tests/test-verify-volume-device.yml:24
Thursday 21 July 2022  07:14:46 +0000 (0:00:00.037)       0:00:41.869 ********* 
ok: [/cache/rhel-9.qcow2.snap] => {
    "ansible_facts": {
        "st_volume_type": "disk"
    },
    "changed": false
}

TASK [(2/2) Process volume type (get RAID value)] ******************************
task path: /tmp/tmpsvhdx31t/tests/test-verify-volume-device.yml:28
Thursday 21 July 2022  07:14:46 +0000 (0:00:00.035)       0:00:41.905 ********* 
skipping: [/cache/rhel-9.qcow2.snap] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [Verify the volume's device type] *****************************************
task path: /tmp/tmpsvhdx31t/tests/test-verify-volume-device.yml:33
Thursday 21 July 2022  07:14:46 +0000 (0:00:00.022)       0:00:41.928 ********* 
ok: [/cache/rhel-9.qcow2.snap] => {
    "changed": false
}

MSG:

All assertions passed

TASK [Stat the LUKS device, if encrypted] **************************************
task path: /tmp/tmpsvhdx31t/tests/test-verify-volume-encryption.yml:3
Thursday 21 July 2022  07:14:46 +0000 (0:00:00.040)       0:00:41.968 ********* 
skipping: [/cache/rhel-9.qcow2.snap] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [Ensure cryptsetup is present] ********************************************
task path: /tmp/tmpsvhdx31t/tests/test-verify-volume-encryption.yml:10
Thursday 21 July 2022  07:14:46 +0000 (0:00:00.024)       0:00:41.992 ********* 
ok: [/cache/rhel-9.qcow2.snap] => {
    "changed": false,
    "rc": 0,
    "results": []
}

MSG:

Nothing to do

TASK [Collect LUKS info for this volume] ***************************************
task path: /tmp/tmpsvhdx31t/tests/test-verify-volume-encryption.yml:15
Thursday 21 July 2022  07:14:47 +0000 (0:00:00.876)       0:00:42.869 ********* 
skipping: [/cache/rhel-9.qcow2.snap] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [Verify the presence/absence of the LUKS device node] *********************
task path: /tmp/tmpsvhdx31t/tests/test-verify-volume-encryption.yml:21
Thursday 21 July 2022  07:14:47 +0000 (0:00:00.023)       0:00:42.892 ********* 
skipping: [/cache/rhel-9.qcow2.snap] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [Verify that the raw device is the same as the device if not encrypted] ***
task path: /tmp/tmpsvhdx31t/tests/test-verify-volume-encryption.yml:30
Thursday 21 July 2022  07:14:47 +0000 (0:00:00.022)       0:00:42.915 ********* 
ok: [/cache/rhel-9.qcow2.snap] => {
    "changed": false
}

MSG:

All assertions passed

TASK [Make sure we got info about the LUKS volume if encrypted] ****************
task path: /tmp/tmpsvhdx31t/tests/test-verify-volume-encryption.yml:38
Thursday 21 July 2022  07:14:47 +0000 (0:00:00.055)       0:00:42.970 ********* 
skipping: [/cache/rhel-9.qcow2.snap] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [Verify the LUKS volume's device type if encrypted] ***********************
task path: /tmp/tmpsvhdx31t/tests/test-verify-volume-encryption.yml:44
Thursday 21 July 2022  07:14:47 +0000 (0:00:00.023)       0:00:42.994 ********* 
skipping: [/cache/rhel-9.qcow2.snap] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [Check LUKS version] ******************************************************
task path: /tmp/tmpsvhdx31t/tests/test-verify-volume-encryption.yml:49
Thursday 21 July 2022  07:14:47 +0000 (0:00:00.022)       0:00:43.016 ********* 
skipping: [/cache/rhel-9.qcow2.snap] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [Check LUKS key size] *****************************************************
task path: /tmp/tmpsvhdx31t/tests/test-verify-volume-encryption.yml:55
Thursday 21 July 2022  07:14:47 +0000 (0:00:00.023)       0:00:43.039 ********* 
skipping: [/cache/rhel-9.qcow2.snap] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [Check LUKS cipher] *******************************************************
task path: /tmp/tmpsvhdx31t/tests/test-verify-volume-encryption.yml:61
Thursday 21 July 2022  07:14:47 +0000 (0:00:00.024)       0:00:43.063 ********* 
skipping: [/cache/rhel-9.qcow2.snap] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [set_fact] ****************************************************************
task path: /tmp/tmpsvhdx31t/tests/test-verify-volume-encryption.yml:67
Thursday 21 July 2022  07:14:47 +0000 (0:00:00.024)       0:00:43.088 ********* 
ok: [/cache/rhel-9.qcow2.snap] => {
    "ansible_facts": {
        "_storage_test_crypttab_entries": [],
        "_storage_test_expected_crypttab_entries": "0",
        "_storage_test_expected_crypttab_key_file": "-"
    },
    "changed": false
}

TASK [Check for /etc/crypttab entry] *******************************************
task path: /tmp/tmpsvhdx31t/tests/test-verify-volume-encryption.yml:74
Thursday 21 July 2022  07:14:47 +0000 (0:00:00.057)       0:00:43.145 ********* 
ok: [/cache/rhel-9.qcow2.snap] => {
    "changed": false
}

MSG:

All assertions passed

TASK [Validate the format of the crypttab entry] *******************************
task path: /tmp/tmpsvhdx31t/tests/test-verify-volume-encryption.yml:79
Thursday 21 July 2022  07:14:47 +0000 (0:00:00.049)       0:00:43.195 ********* 
skipping: [/cache/rhel-9.qcow2.snap] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [Check backing device of crypttab entry] **********************************
task path: /tmp/tmpsvhdx31t/tests/test-verify-volume-encryption.yml:85
Thursday 21 July 2022  07:14:47 +0000 (0:00:00.035)       0:00:43.230 ********* 
skipping: [/cache/rhel-9.qcow2.snap] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [Check key file of crypttab entry] ****************************************
task path: /tmp/tmpsvhdx31t/tests/test-verify-volume-encryption.yml:91
Thursday 21 July 2022  07:14:47 +0000 (0:00:00.039)       0:00:43.270 ********* 
skipping: [/cache/rhel-9.qcow2.snap] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [set_fact] ****************************************************************
task path: /tmp/tmpsvhdx31t/tests/test-verify-volume-encryption.yml:97
Thursday 21 July 2022  07:14:47 +0000 (0:00:00.035)       0:00:43.305 ********* 
ok: [/cache/rhel-9.qcow2.snap] => {
    "ansible_facts": {
        "_storage_test_crypttab_entries": null,
        "_storage_test_expected_crypttab_entries": null,
        "_storage_test_expected_crypttab_key_file": null
    },
    "changed": false
}

TASK [get information about RAID] **********************************************
task path: /tmp/tmpsvhdx31t/tests/test-verify-volume-md.yml:7
Thursday 21 July 2022  07:14:47 +0000 (0:00:00.036)       0:00:43.342 ********* 
skipping: [/cache/rhel-9.qcow2.snap] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [set_fact] ****************************************************************
task path: /tmp/tmpsvhdx31t/tests/test-verify-volume-md.yml:13
Thursday 21 July 2022  07:14:47 +0000 (0:00:00.036)       0:00:43.378 ********* 
skipping: [/cache/rhel-9.qcow2.snap] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [set_fact] ****************************************************************
task path: /tmp/tmpsvhdx31t/tests/test-verify-volume-md.yml:17
Thursday 21 July 2022  07:14:47 +0000 (0:00:00.034)       0:00:43.413 ********* 
skipping: [/cache/rhel-9.qcow2.snap] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [set_fact] ****************************************************************
task path: /tmp/tmpsvhdx31t/tests/test-verify-volume-md.yml:21
Thursday 21 July 2022  07:14:47 +0000 (0:00:00.045)       0:00:43.458 ********* 
skipping: [/cache/rhel-9.qcow2.snap] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [check RAID active devices count] *****************************************
task path: /tmp/tmpsvhdx31t/tests/test-verify-volume-md.yml:25
Thursday 21 July 2022  07:14:47 +0000 (0:00:00.043)       0:00:43.502 ********* 
skipping: [/cache/rhel-9.qcow2.snap] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [check RAID spare devices count] ******************************************
task path: /tmp/tmpsvhdx31t/tests/test-verify-volume-md.yml:31
Thursday 21 July 2022  07:14:47 +0000 (0:00:00.037)       0:00:43.540 ********* 
skipping: [/cache/rhel-9.qcow2.snap] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [check RAID metadata version] *********************************************
task path: /tmp/tmpsvhdx31t/tests/test-verify-volume-md.yml:37
Thursday 21 July 2022  07:14:47 +0000 (0:00:00.036)       0:00:43.576 ********* 
skipping: [/cache/rhel-9.qcow2.snap] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [parse the actual size of the volume] *************************************
task path: /tmp/tmpsvhdx31t/tests/test-verify-volume-size.yml:3
Thursday 21 July 2022  07:14:48 +0000 (0:00:00.035)       0:00:43.612 ********* 
skipping: [/cache/rhel-9.qcow2.snap] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [parse the requested size of the volume] **********************************
task path: /tmp/tmpsvhdx31t/tests/test-verify-volume-size.yml:9
Thursday 21 July 2022  07:14:48 +0000 (0:00:00.024)       0:00:43.637 ********* 
skipping: [/cache/rhel-9.qcow2.snap] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [Establish base value for expected size] **********************************
task path: /tmp/tmpsvhdx31t/tests/test-verify-volume-size.yml:15
Thursday 21 July 2022  07:14:48 +0000 (0:00:00.036)       0:00:43.673 ********* 
skipping: [/cache/rhel-9.qcow2.snap] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [debug] *******************************************************************
task path: /tmp/tmpsvhdx31t/tests/test-verify-volume-size.yml:20
Thursday 21 July 2022  07:14:48 +0000 (0:00:00.036)       0:00:43.710 ********* 
ok: [/cache/rhel-9.qcow2.snap] => {
    "storage_test_expected_size": "VARIABLE IS NOT DEFINED!: 'storage_test_expected_size' is undefined"
}

TASK [debug] *******************************************************************
task path: /tmp/tmpsvhdx31t/tests/test-verify-volume-size.yml:25
Thursday 21 July 2022  07:14:48 +0000 (0:00:00.065)       0:00:43.775 ********* 
skipping: [/cache/rhel-9.qcow2.snap] => {}

TASK [debug] *******************************************************************
task path: /tmp/tmpsvhdx31t/tests/test-verify-volume-size.yml:28
Thursday 21 July 2022  07:14:48 +0000 (0:00:00.038)       0:00:43.814 ********* 
skipping: [/cache/rhel-9.qcow2.snap] => {}

TASK [Get the size of parent/pool device] **************************************
task path: /tmp/tmpsvhdx31t/tests/test-verify-volume-size.yml:31
Thursday 21 July 2022  07:14:48 +0000 (0:00:00.035)       0:00:43.849 ********* 
skipping: [/cache/rhel-9.qcow2.snap] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [debug] *******************************************************************
task path: /tmp/tmpsvhdx31t/tests/test-verify-volume-size.yml:36
Thursday 21 July 2022  07:14:48 +0000 (0:00:00.035)       0:00:43.885 ********* 
skipping: [/cache/rhel-9.qcow2.snap] => {}

TASK [Calculate the expected size based on pool size and percentage value] *****
task path: /tmp/tmpsvhdx31t/tests/test-verify-volume-size.yml:39
Thursday 21 July 2022  07:14:48 +0000 (0:00:00.036)       0:00:43.921 ********* 
skipping: [/cache/rhel-9.qcow2.snap] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [debug] *******************************************************************
task path: /tmp/tmpsvhdx31t/tests/test-verify-volume-size.yml:44
Thursday 21 July 2022  07:14:48 +0000 (0:00:00.040)       0:00:43.962 ********* 
ok: [/cache/rhel-9.qcow2.snap] => {
    "storage_test_actual_size": {
        "changed": false,
        "skip_reason": "Conditional result was False",
        "skipped": true
    }
}

TASK [debug] *******************************************************************
task path: /tmp/tmpsvhdx31t/tests/test-verify-volume-size.yml:47
Thursday 21 July 2022  07:14:48 +0000 (0:00:00.035)       0:00:43.997 ********* 
ok: [/cache/rhel-9.qcow2.snap] => {
    "storage_test_expected_size": "VARIABLE IS NOT DEFINED!: 'storage_test_expected_size' is undefined"
}

TASK [assert] ******************************************************************
task path: /tmp/tmpsvhdx31t/tests/test-verify-volume-size.yml:50
Thursday 21 July 2022  07:14:48 +0000 (0:00:00.066)       0:00:44.063 ********* 
skipping: [/cache/rhel-9.qcow2.snap] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [Get information about the LV] ********************************************
task path: /tmp/tmpsvhdx31t/tests/test-verify-volume-cache.yml:6
Thursday 21 July 2022  07:14:48 +0000 (0:00:00.024)       0:00:44.088 ********* 
skipping: [/cache/rhel-9.qcow2.snap] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [set_fact] ****************************************************************
task path: /tmp/tmpsvhdx31t/tests/test-verify-volume-cache.yml:14
Thursday 21 July 2022  07:14:48 +0000 (0:00:00.049)       0:00:44.138 ********* 
skipping: [/cache/rhel-9.qcow2.snap] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [check segment type] ******************************************************
task path: /tmp/tmpsvhdx31t/tests/test-verify-volume-cache.yml:17
Thursday 21 July 2022  07:14:48 +0000 (0:00:00.024)       0:00:44.163 ********* 
skipping: [/cache/rhel-9.qcow2.snap] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [set_fact] ****************************************************************
task path: /tmp/tmpsvhdx31t/tests/test-verify-volume-cache.yml:22
Thursday 21 July 2022  07:14:48 +0000 (0:00:00.023)       0:00:44.186 ********* 
skipping: [/cache/rhel-9.qcow2.snap] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [parse the requested cache size] ******************************************
task path: /tmp/tmpsvhdx31t/tests/test-verify-volume-cache.yml:26
Thursday 21 July 2022  07:14:48 +0000 (0:00:00.023)       0:00:44.209 ********* 
skipping: [/cache/rhel-9.qcow2.snap] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [set_fact] ****************************************************************
task path: /tmp/tmpsvhdx31t/tests/test-verify-volume-cache.yml:32
Thursday 21 July 2022  07:14:48 +0000 (0:00:00.024)       0:00:44.234 ********* 
skipping: [/cache/rhel-9.qcow2.snap] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [Check cache size] ********************************************************
task path: /tmp/tmpsvhdx31t/tests/test-verify-volume-cache.yml:36
Thursday 21 July 2022  07:14:48 +0000 (0:00:00.024)       0:00:44.258 ********* 
skipping: [/cache/rhel-9.qcow2.snap] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [Clean up facts] **********************************************************
task path: /tmp/tmpsvhdx31t/tests/test-verify-volume.yml:16
Thursday 21 July 2022  07:14:48 +0000 (0:00:00.022)       0:00:44.281 ********* 
ok: [/cache/rhel-9.qcow2.snap] => {
    "ansible_facts": {
        "_storage_test_volume_present": null
    },
    "changed": false
}

TASK [Clean up variable namespace] *********************************************
task path: /tmp/tmpsvhdx31t/tests/verify-role-results.yml:53
Thursday 21 July 2022  07:14:48 +0000 (0:00:00.032)       0:00:44.314 ********* 
ok: [/cache/rhel-9.qcow2.snap] => {
    "ansible_facts": {
        "storage_test_blkinfo": null,
        "storage_test_crypttab": null,
        "storage_test_fstab": null
    },
    "changed": false
}

TASK [Clean up] ****************************************************************
task path: /tmp/tmpsvhdx31t/tests/tests_change_disk_mount.yml:61
Thursday 21 July 2022  07:14:48 +0000 (0:00:00.033)       0:00:44.348 ********* 

TASK [linux-system-roles.storage : set platform/version specific variables] ****
task path: /tmp/tmpsvhdx31t/tests/roles/linux-system-roles.storage/tasks/main.yml:2
Thursday 21 July 2022  07:14:48 +0000 (0:00:00.058)       0:00:44.406 ********* 
included: /tmp/tmpsvhdx31t/tests/roles/linux-system-roles.storage/tasks/set_vars.yml for /cache/rhel-9.qcow2.snap

TASK [linux-system-roles.storage : Ensure ansible_facts used by role] **********
task path: /tmp/tmpsvhdx31t/tests/roles/linux-system-roles.storage/tasks/set_vars.yml:2
Thursday 21 July 2022  07:14:48 +0000 (0:00:00.034)       0:00:44.440 ********* 
ok: [/cache/rhel-9.qcow2.snap]

TASK [linux-system-roles.storage : Set platform/version specific variables] ****
task path: /tmp/tmpsvhdx31t/tests/roles/linux-system-roles.storage/tasks/set_vars.yml:8
Thursday 21 July 2022  07:14:49 +0000 (0:00:00.515)       0:00:44.955 ********* 
skipping: [/cache/rhel-9.qcow2.snap] => (item=RedHat.yml)  => {
    "ansible_loop_var": "item",
    "changed": false,
    "item": "RedHat.yml",
    "skip_reason": "Conditional result was False"
}
skipping: [/cache/rhel-9.qcow2.snap] => (item=RedHat.yml)  => {
    "ansible_loop_var": "item",
    "changed": false,
    "item": "RedHat.yml",
    "skip_reason": "Conditional result was False"
}
ok: [/cache/rhel-9.qcow2.snap] => (item=RedHat_9.yml) => {
    "ansible_facts": {
        "blivet_package_list": [
            "python3-blivet",
            "libblockdev-crypto",
            "libblockdev-dm",
            "libblockdev-lvm",
            "libblockdev-mdraid",
            "libblockdev-swap",
            "vdo",
            "kmod-kvdo",
            "xfsprogs"
        ]
    },
    "ansible_included_var_files": [
        "/tmp/tmpsvhdx31t/tests/roles/linux-system-roles.storage/vars/RedHat_9.yml"
    ],
    "ansible_loop_var": "item",
    "changed": false,
    "item": "RedHat_9.yml"
}
skipping: [/cache/rhel-9.qcow2.snap] => (item=RedHat_9.0.yml)  => {
    "ansible_loop_var": "item",
    "changed": false,
    "item": "RedHat_9.0.yml",
    "skip_reason": "Conditional result was False"
}

TASK [linux-system-roles.storage : define an empty list of pools to be used in testing] ***
task path: /tmp/tmpsvhdx31t/tests/roles/linux-system-roles.storage/tasks/main.yml:5
Thursday 21 July 2022  07:14:49 +0000 (0:00:00.059)       0:00:45.015 ********* 
ok: [/cache/rhel-9.qcow2.snap] => {
    "ansible_facts": {
        "_storage_pools_list": []
    },
    "changed": false
}

TASK [linux-system-roles.storage : define an empty list of volumes to be used in testing] ***
task path: /tmp/tmpsvhdx31t/tests/roles/linux-system-roles.storage/tasks/main.yml:9
Thursday 21 July 2022  07:14:49 +0000 (0:00:00.034)       0:00:45.050 ********* 
ok: [/cache/rhel-9.qcow2.snap] => {
    "ansible_facts": {
        "_storage_volumes_list": []
    },
    "changed": false
}

TASK [linux-system-roles.storage : include the appropriate provider tasks] *****
task path: /tmp/tmpsvhdx31t/tests/roles/linux-system-roles.storage/tasks/main.yml:13
Thursday 21 July 2022  07:14:49 +0000 (0:00:00.033)       0:00:45.083 ********* 
redirecting (type: modules) ansible.builtin.mount to ansible.posix.mount
redirecting (type: modules) ansible.builtin.mount to ansible.posix.mount
included: /tmp/tmpsvhdx31t/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml for /cache/rhel-9.qcow2.snap

TASK [linux-system-roles.storage : get a list of rpm packages installed on host machine] ***
task path: /tmp/tmpsvhdx31t/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:2
Thursday 21 July 2022  07:14:49 +0000 (0:00:00.044)       0:00:45.128 ********* 
skipping: [/cache/rhel-9.qcow2.snap] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [linux-system-roles.storage : make sure blivet is available] **************
task path: /tmp/tmpsvhdx31t/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:7
Thursday 21 July 2022  07:14:49 +0000 (0:00:00.021)       0:00:45.149 ********* 
skipping: [/cache/rhel-9.qcow2.snap] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [linux-system-roles.storage : show storage_pools] *************************
task path: /tmp/tmpsvhdx31t/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:14
Thursday 21 July 2022  07:14:49 +0000 (0:00:00.026)       0:00:45.176 ********* 
ok: [/cache/rhel-9.qcow2.snap] => {
    "storage_pools": "VARIABLE IS NOT DEFINED!: 'storage_pools' is undefined"
}

TASK [linux-system-roles.storage : show storage_volumes] ***********************
task path: /tmp/tmpsvhdx31t/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:19
Thursday 21 July 2022  07:14:49 +0000 (0:00:00.034)       0:00:45.210 ********* 
ok: [/cache/rhel-9.qcow2.snap] => {
    "storage_volumes": [
        {
            "disks": [
                "sda"
            ],
            "mount_point": "/opt/test2",
            "name": "test1",
            "state": "absent",
            "type": "disk"
        }
    ]
}
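
The storage_volumes value printed above is the input that asks the role to remove the disk-backed volume and its mount. A minimal sketch of how such an invocation might look in a playbook (field values copied from the output above; the literal test task may differ):

    - name: Remove the test1 disk volume and its mount point
      include_role:
        name: linux-system-roles.storage
      vars:
        storage_volumes:
          - name: test1
            type: disk
            disks:
              - sda
            mount_point: /opt/test2
            state: absent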

TASK [linux-system-roles.storage : get required packages] **********************
task path: /tmp/tmpsvhdx31t/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:24
Thursday 21 July 2022  07:14:49 +0000 (0:00:00.038)       0:00:45.249 ********* 
skipping: [/cache/rhel-9.qcow2.snap] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [linux-system-roles.storage : enable copr repositories if needed] *********
task path: /tmp/tmpsvhdx31t/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:37
Thursday 21 July 2022  07:14:49 +0000 (0:00:00.029)       0:00:45.279 ********* 
skipping: [/cache/rhel-9.qcow2.snap] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [linux-system-roles.storage : make sure required packages are installed] ***
task path: /tmp/tmpsvhdx31t/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:44
Thursday 21 July 2022  07:14:49 +0000 (0:00:00.026)       0:00:45.305 ********* 
skipping: [/cache/rhel-9.qcow2.snap] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [linux-system-roles.storage : get service facts] **************************
task path: /tmp/tmpsvhdx31t/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:51
Thursday 21 July 2022  07:14:49 +0000 (0:00:00.026)       0:00:45.332 ********* 
skipping: [/cache/rhel-9.qcow2.snap] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [linux-system-roles.storage : Set storage_cryptsetup_services] ************
task path: /tmp/tmpsvhdx31t/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:58
Thursday 21 July 2022  07:14:49 +0000 (0:00:00.026)       0:00:45.358 ********* 
ok: [/cache/rhel-9.qcow2.snap] => {
    "ansible_facts": {
        "storage_cryptsetup_services": []
    },
    "changed": false
}

TASK [linux-system-roles.storage : Mask the systemd cryptsetup services] *******
task path: /tmp/tmpsvhdx31t/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:71
Thursday 21 July 2022  07:14:49 +0000 (0:00:00.056)       0:00:45.415 ********* 

TASK [linux-system-roles.storage : manage the pools and volumes to match the specified state] ***
task path: /tmp/tmpsvhdx31t/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:77
Thursday 21 July 2022  07:14:49 +0000 (0:00:00.021)       0:00:45.436 ********* 
changed: [/cache/rhel-9.qcow2.snap] => {
    "actions": [
        {
            "action": "destroy format",
            "device": "/dev/sda",
            "fs_type": "xfs"
        }
    ],
    "changed": true,
    "crypts": [],
    "leaves": [
        "/dev/sr0",
        "/dev/vda1",
        "/dev/vda2",
        "/dev/vda3",
        "/dev/vda4",
        "/dev/sda",
        "/dev/sdb",
        "/dev/sdc",
        "/dev/nvme0n1",
        "/dev/nvme1n1",
        "/dev/nvme2n1",
        "/dev/vdb1",
        "/dev/vdc1",
        "/dev/vdd",
        "/dev/vde",
        "/dev/vdf"
    ],
    "mounts": [
        {
            "fstype": "xfs",
            "path": "/opt/test2",
            "src": "UUID=31c6244a-312c-4308-8a54-e9662b6a934a",
            "state": "absent"
        }
    ],
    "packages": [
        "dosfstools",
        "xfsprogs",
        "e2fsprogs"
    ],
    "pools": [],
    "volumes": [
        {
            "_device": "/dev/sda",
            "_mount_id": "UUID=31c6244a-312c-4308-8a54-e9662b6a934a",
            "_raw_device": "/dev/sda",
            "cache_devices": [],
            "cache_mode": null,
            "cache_size": 0,
            "cached": false,
            "compression": null,
            "deduplication": null,
            "disks": [
                "sda"
            ],
            "encryption": false,
            "encryption_cipher": null,
            "encryption_key": null,
            "encryption_key_size": null,
            "encryption_luks_version": null,
            "encryption_password": null,
            "fs_create_options": "",
            "fs_label": "",
            "fs_overwrite_existing": true,
            "fs_type": "xfs",
            "mount_check": 0,
            "mount_device_identifier": "uuid",
            "mount_options": "defaults",
            "mount_passno": 0,
            "mount_point": "/opt/test2",
            "name": "test1",
            "raid_chunk_size": null,
            "raid_device_count": null,
            "raid_level": null,
            "raid_metadata_version": null,
            "raid_spare_count": null,
            "size": 10737418240,
            "state": "absent",
            "thin": null,
            "thin_pool_name": null,
            "thin_pool_size": null,
            "type": "disk",
            "vdo_pool_size": null
        }
    ]
}

TASK [linux-system-roles.storage : Workaround for udev issue on some platforms] ***
task path: /tmp/tmpsvhdx31t/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:91
Thursday 21 July 2022  07:14:51 +0000 (0:00:01.655)       0:00:47.091 ********* 
skipping: [/cache/rhel-9.qcow2.snap] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [linux-system-roles.storage : Unmask the systemd cryptsetup services] *****
task path: /tmp/tmpsvhdx31t/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:103
Thursday 21 July 2022  07:14:51 +0000 (0:00:00.038)       0:00:47.129 ********* 

TASK [linux-system-roles.storage : show blivet_output] *************************
task path: /tmp/tmpsvhdx31t/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:109
Thursday 21 July 2022  07:14:51 +0000 (0:00:00.022)       0:00:47.152 ********* 
ok: [/cache/rhel-9.qcow2.snap] => {
    "blivet_output": {
        "actions": [
            {
                "action": "destroy format",
                "device": "/dev/sda",
                "fs_type": "xfs"
            }
        ],
        "changed": true,
        "crypts": [],
        "failed": false,
        "leaves": [
            "/dev/sr0",
            "/dev/vda1",
            "/dev/vda2",
            "/dev/vda3",
            "/dev/vda4",
            "/dev/sda",
            "/dev/sdb",
            "/dev/sdc",
            "/dev/nvme0n1",
            "/dev/nvme1n1",
            "/dev/nvme2n1",
            "/dev/vdb1",
            "/dev/vdc1",
            "/dev/vdd",
            "/dev/vde",
            "/dev/vdf"
        ],
        "mounts": [
            {
                "fstype": "xfs",
                "path": "/opt/test2",
                "src": "UUID=31c6244a-312c-4308-8a54-e9662b6a934a",
                "state": "absent"
            }
        ],
        "packages": [
            "dosfstools",
            "xfsprogs",
            "e2fsprogs"
        ],
        "pools": [],
        "volumes": [
            {
                "_device": "/dev/sda",
                "_mount_id": "UUID=31c6244a-312c-4308-8a54-e9662b6a934a",
                "_raw_device": "/dev/sda",
                "cache_devices": [],
                "cache_mode": null,
                "cache_size": 0,
                "cached": false,
                "compression": null,
                "deduplication": null,
                "disks": [
                    "sda"
                ],
                "encryption": false,
                "encryption_cipher": null,
                "encryption_key": null,
                "encryption_key_size": null,
                "encryption_luks_version": null,
                "encryption_password": null,
                "fs_create_options": "",
                "fs_label": "",
                "fs_overwrite_existing": true,
                "fs_type": "xfs",
                "mount_check": 0,
                "mount_device_identifier": "uuid",
                "mount_options": "defaults",
                "mount_passno": 0,
                "mount_point": "/opt/test2",
                "name": "test1",
                "raid_chunk_size": null,
                "raid_device_count": null,
                "raid_level": null,
                "raid_metadata_version": null,
                "raid_spare_count": null,
                "size": 10737418240,
                "state": "absent",
                "thin": null,
                "thin_pool_name": null,
                "thin_pool_size": null,
                "type": "disk",
                "vdo_pool_size": null
            }
        ]
    }
}

TASK [linux-system-roles.storage : set the list of pools for test verification] ***
task path: /tmp/tmpsvhdx31t/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:114
Thursday 21 July 2022  07:14:51 +0000 (0:00:00.039)       0:00:47.192 ********* 
ok: [/cache/rhel-9.qcow2.snap] => {
    "ansible_facts": {
        "_storage_pools_list": []
    },
    "changed": false
}

TASK [linux-system-roles.storage : set the list of volumes for test verification] ***
task path: /tmp/tmpsvhdx31t/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:118
Thursday 21 July 2022  07:14:51 +0000 (0:00:00.036)       0:00:47.228 ********* 
ok: [/cache/rhel-9.qcow2.snap] => {
    "ansible_facts": {
        "_storage_volumes_list": [
            {
                "_device": "/dev/sda",
                "_mount_id": "UUID=31c6244a-312c-4308-8a54-e9662b6a934a",
                "_raw_device": "/dev/sda",
                "cache_devices": [],
                "cache_mode": null,
                "cache_size": 0,
                "cached": false,
                "compression": null,
                "deduplication": null,
                "disks": [
                    "sda"
                ],
                "encryption": false,
                "encryption_cipher": null,
                "encryption_key": null,
                "encryption_key_size": null,
                "encryption_luks_version": null,
                "encryption_password": null,
                "fs_create_options": "",
                "fs_label": "",
                "fs_overwrite_existing": true,
                "fs_type": "xfs",
                "mount_check": 0,
                "mount_device_identifier": "uuid",
                "mount_options": "defaults",
                "mount_passno": 0,
                "mount_point": "/opt/test2",
                "name": "test1",
                "raid_chunk_size": null,
                "raid_device_count": null,
                "raid_level": null,
                "raid_metadata_version": null,
                "raid_spare_count": null,
                "size": 10737418240,
                "state": "absent",
                "thin": null,
                "thin_pool_name": null,
                "thin_pool_size": null,
                "type": "disk",
                "vdo_pool_size": null
            }
        ]
    },
    "changed": false
}

TASK [linux-system-roles.storage : remove obsolete mounts] *********************
task path: /tmp/tmpsvhdx31t/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:134
Thursday 21 July 2022  07:14:51 +0000 (0:00:00.038)       0:00:47.267 ********* 
redirecting (type: modules) ansible.builtin.mount to ansible.posix.mount
changed: [/cache/rhel-9.qcow2.snap] => (item={'src': 'UUID=31c6244a-312c-4308-8a54-e9662b6a934a', 'path': '/opt/test2', 'state': 'absent', 'fstype': 'xfs'}) => {
    "ansible_loop_var": "mount_info",
    "backup_file": "",
    "boot": "yes",
    "changed": true,
    "dump": "0",
    "fstab": "/etc/fstab",
    "fstype": "xfs",
    "mount_info": {
        "fstype": "xfs",
        "path": "/opt/test2",
        "src": "UUID=31c6244a-312c-4308-8a54-e9662b6a934a",
        "state": "absent"
    },
    "name": "/opt/test2",
    "opts": "defaults",
    "passno": "0",
    "src": "UUID=31c6244a-312c-4308-8a54-e9662b6a934a"
}
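
Per the module redirect noted above, this step runs ansible.posix.mount with state: absent for each obsolete entry. An equivalent standalone task using the mount_info values shown (illustrative sketch, not the role's literal task):

    - name: Remove the obsolete /opt/test2 entry from /etc/fstab
      ansible.posix.mount:
        src: UUID=31c6244a-312c-4308-8a54-e9662b6a934a
        path: /opt/test2
        fstype: xfs
        state: absent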

TASK [linux-system-roles.storage : tell systemd to refresh its view of /etc/fstab] ***
task path: /tmp/tmpsvhdx31t/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:146
Thursday 21 July 2022  07:14:52 +0000 (0:00:00.378)       0:00:47.645 ********* 
ok: [/cache/rhel-9.qcow2.snap] => {
    "changed": false,
    "name": null,
    "status": {}
}
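
The null name and empty status are consistent with a systemd daemon reload rather than a unit operation. A hedged sketch of such a task, assuming the ansible.builtin.systemd module is what performs this step:

    - name: Ask systemd to re-read /etc/fstab
      ansible.builtin.systemd:
        daemon_reload: true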

TASK [linux-system-roles.storage : set up new/current mounts] ******************
task path: /tmp/tmpsvhdx31t/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:151
Thursday 21 July 2022  07:14:52 +0000 (0:00:00.645)       0:00:48.290 ********* 

TASK [linux-system-roles.storage : tell systemd to refresh its view of /etc/fstab] ***
task path: /tmp/tmpsvhdx31t/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:163
Thursday 21 July 2022  07:14:52 +0000 (0:00:00.038)       0:00:48.329 ********* 
ok: [/cache/rhel-9.qcow2.snap] => {
    "changed": false,
    "name": null,
    "status": {}
}

TASK [linux-system-roles.storage : retrieve facts for the /etc/crypttab file] ***
task path: /tmp/tmpsvhdx31t/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:171
Thursday 21 July 2022  07:14:53 +0000 (0:00:00.644)       0:00:48.973 ********* 
ok: [/cache/rhel-9.qcow2.snap] => {
    "changed": false,
    "stat": {
        "exists": false
    }
}

TASK [linux-system-roles.storage : manage /etc/crypttab to account for changes we just made] ***
task path: /tmp/tmpsvhdx31t/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:176
Thursday 21 July 2022  07:14:53 +0000 (0:00:00.356)       0:00:49.330 ********* 

TASK [linux-system-roles.storage : Update facts] *******************************
task path: /tmp/tmpsvhdx31t/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:198
Thursday 21 July 2022  07:14:53 +0000 (0:00:00.022)       0:00:49.352 ********* 
ok: [/cache/rhel-9.qcow2.snap]
META: role_complete for /cache/rhel-9.qcow2.snap

TASK [include_tasks] ***********************************************************
task path: /tmp/tmpsvhdx31t/tests/tests_change_disk_mount.yml:72
Thursday 21 July 2022  07:14:54 +0000 (0:00:00.946)       0:00:50.298 ********* 
included: /tmp/tmpsvhdx31t/tests/verify-role-results.yml for /cache/rhel-9.qcow2.snap

TASK [Print out pool information] **********************************************
task path: /tmp/tmpsvhdx31t/tests/verify-role-results.yml:1
Thursday 21 July 2022  07:14:54 +0000 (0:00:00.047)       0:00:50.346 ********* 
skipping: [/cache/rhel-9.qcow2.snap] => {}

TASK [Print out volume information] ********************************************
task path: /tmp/tmpsvhdx31t/tests/verify-role-results.yml:6
Thursday 21 July 2022  07:14:54 +0000 (0:00:00.035)       0:00:50.381 ********* 
ok: [/cache/rhel-9.qcow2.snap] => {
    "_storage_volumes_list": [
        {
            "_device": "/dev/sda",
            "_mount_id": "UUID=31c6244a-312c-4308-8a54-e9662b6a934a",
            "_raw_device": "/dev/sda",
            "cache_devices": [],
            "cache_mode": null,
            "cache_size": 0,
            "cached": false,
            "compression": null,
            "deduplication": null,
            "disks": [
                "sda"
            ],
            "encryption": false,
            "encryption_cipher": null,
            "encryption_key": null,
            "encryption_key_size": null,
            "encryption_luks_version": null,
            "encryption_password": null,
            "fs_create_options": "",
            "fs_label": "",
            "fs_overwrite_existing": true,
            "fs_type": "xfs",
            "mount_check": 0,
            "mount_device_identifier": "uuid",
            "mount_options": "defaults",
            "mount_passno": 0,
            "mount_point": "/opt/test2",
            "name": "test1",
            "raid_chunk_size": null,
            "raid_device_count": null,
            "raid_level": null,
            "raid_metadata_version": null,
            "raid_spare_count": null,
            "size": 10737418240,
            "state": "absent",
            "thin": null,
            "thin_pool_name": null,
            "thin_pool_size": null,
            "type": "disk",
            "vdo_pool_size": null
        }
    ]
}

TASK [Collect info about the volumes.] *****************************************
task path: /tmp/tmpsvhdx31t/tests/verify-role-results.yml:14
Thursday 21 July 2022  07:14:54 +0000 (0:00:00.078)       0:00:50.460 ********* 
ok: [/cache/rhel-9.qcow2.snap] => {
    "changed": false,
    "info": {
        "/dev/nvme0n1": {
            "fstype": "",
            "label": "",
            "name": "/dev/nvme0n1",
            "size": "10G",
            "type": "disk",
            "uuid": ""
        },
        "/dev/nvme1n1": {
            "fstype": "",
            "label": "",
            "name": "/dev/nvme1n1",
            "size": "10G",
            "type": "disk",
            "uuid": ""
        },
        "/dev/nvme2n1": {
            "fstype": "",
            "label": "",
            "name": "/dev/nvme2n1",
            "size": "10G",
            "type": "disk",
            "uuid": ""
        },
        "/dev/sda": {
            "fstype": "",
            "label": "",
            "name": "/dev/sda",
            "size": "10G",
            "type": "disk",
            "uuid": ""
        },
        "/dev/sdb": {
            "fstype": "",
            "label": "",
            "name": "/dev/sdb",
            "size": "10G",
            "type": "disk",
            "uuid": ""
        },
        "/dev/sdc": {
            "fstype": "",
            "label": "",
            "name": "/dev/sdc",
            "size": "10G",
            "type": "disk",
            "uuid": ""
        },
        "/dev/sr0": {
            "fstype": "iso9660",
            "label": "cidata",
            "name": "/dev/sr0",
            "size": "364K",
            "type": "rom",
            "uuid": "2022-07-21-07-13-51-00"
        },
        "/dev/vda": {
            "fstype": "",
            "label": "",
            "name": "/dev/vda",
            "size": "10G",
            "type": "disk",
            "uuid": ""
        },
        "/dev/vda1": {
            "fstype": "",
            "label": "",
            "name": "/dev/vda1",
            "size": "1M",
            "type": "partition",
            "uuid": ""
        },
        "/dev/vda2": {
            "fstype": "vfat",
            "label": "",
            "name": "/dev/vda2",
            "size": "200M",
            "type": "partition",
            "uuid": "7B77-95E7"
        },
        "/dev/vda3": {
            "fstype": "xfs",
            "label": "boot",
            "name": "/dev/vda3",
            "size": "500M",
            "type": "partition",
            "uuid": "6e74e171-0370-451f-8340-f16ad2839183"
        },
        "/dev/vda4": {
            "fstype": "xfs",
            "label": "root",
            "name": "/dev/vda4",
            "size": "9.3G",
            "type": "partition",
            "uuid": "1bb53e4d-984c-4316-908a-59b5a62fa30e"
        },
        "/dev/vdb": {
            "fstype": "",
            "label": "",
            "name": "/dev/vdb",
            "size": "2G",
            "type": "disk",
            "uuid": ""
        },
        "/dev/vdb1": {
            "fstype": "ext4",
            "label": "yumcache",
            "name": "/dev/vdb1",
            "size": "2G",
            "type": "partition",
            "uuid": "12be66f7-7f03-4d54-9a48-a520825d8fa9"
        },
        "/dev/vdc": {
            "fstype": "",
            "label": "",
            "name": "/dev/vdc",
            "size": "2G",
            "type": "disk",
            "uuid": ""
        },
        "/dev/vdc1": {
            "fstype": "ext4",
            "label": "yumvarlib",
            "name": "/dev/vdc1",
            "size": "2G",
            "type": "partition",
            "uuid": "9bd037ed-b1bb-4b9c-a008-9f07d34ab071"
        },
        "/dev/vdd": {
            "fstype": "",
            "label": "",
            "name": "/dev/vdd",
            "size": "10G",
            "type": "disk",
            "uuid": ""
        },
        "/dev/vde": {
            "fstype": "",
            "label": "",
            "name": "/dev/vde",
            "size": "10G",
            "type": "disk",
            "uuid": ""
        },
        "/dev/vdf": {
            "fstype": "",
            "label": "",
            "name": "/dev/vdf",
            "size": "10G",
            "type": "disk",
            "uuid": ""
        }
    }
}
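
The per-device info above (fstype, label, size, type, uuid) is collected by the test's own helper. Roughly the same data can be gathered by hand with lsblk; an illustrative task, not the module the test actually calls:

    - name: Inspect block devices manually (illustrative only)
      ansible.builtin.command: lsblk -o NAME,FSTYPE,LABEL,SIZE,TYPE,UUID
      register: lsblk_info    # register name chosen for this sketch
      changed_when: false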

TASK [Read the /etc/fstab file for volume existence] ***************************
task path: /tmp/tmpsvhdx31t/tests/verify-role-results.yml:19
Thursday 21 July 2022  07:14:55 +0000 (0:00:00.402)       0:00:50.862 ********* 
ok: [/cache/rhel-9.qcow2.snap] => {
    "changed": false,
    "cmd": [
        "cat",
        "/etc/fstab"
    ],
    "delta": "0:00:00.002566",
    "end": "2022-07-21 03:14:54.939798",
    "rc": 0,
    "start": "2022-07-21 03:14:54.937232"
}

STDOUT:

UUID=6e74e171-0370-451f-8340-f16ad2839183	/boot	xfs	defaults	0	0
UUID=1bb53e4d-984c-4316-908a-59b5a62fa30e	/	xfs	defaults	0	0
UUID=7B77-95E7	/boot/efi	vfat	defaults,uid=0,gid=0,umask=077,shortname=winnt	0	2
/dev/vdb1	/var/cache/dnf	auto	defaults,nofail,x-systemd.requires=cloud-init.service,_netdev,comment=cloudconfig	0	2
/dev/vdc1	/var/lib/dnf	auto	defaults,nofail,x-systemd.requires=cloud-init.service,_netdev,comment=cloudconfig	0	2

TASK [Read the /etc/crypttab file] *********************************************
task path: /tmp/tmpsvhdx31t/tests/verify-role-results.yml:24
Thursday 21 July 2022  07:14:55 +0000 (0:00:00.362)       0:00:51.225 ********* 
ok: [/cache/rhel-9.qcow2.snap] => {
    "changed": false,
    "cmd": [
        "cat",
        "/etc/crypttab"
    ],
    "delta": "0:00:00.002738",
    "end": "2022-07-21 03:14:55.330391",
    "failed_when_result": false,
    "rc": 1,
    "start": "2022-07-21 03:14:55.327653"
}

STDERR:

cat: /etc/crypttab: No such file or directory


MSG:

non-zero return code
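
The rc of 1 from the missing /etc/crypttab does not fail the play because the test relaxes the failure condition, which is what "failed_when_result": false records. One way to express that tolerance (a sketch of the pattern; the test's exact expression may differ):

    - name: Read /etc/crypttab, tolerating a missing file
      ansible.builtin.command: cat /etc/crypttab
      register: crypttab_out    # register name chosen for this sketch
      changed_when: false
      failed_when: crypttab_out.rc not in [0, 1]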

TASK [Verify the volumes listed in storage_pools were correctly managed] *******
task path: /tmp/tmpsvhdx31t/tests/verify-role-results.yml:33
Thursday 21 July 2022  07:14:56 +0000 (0:00:00.384)       0:00:51.610 ********* 

TASK [Verify the volumes with no pool were correctly managed] ******************
task path: /tmp/tmpsvhdx31t/tests/verify-role-results.yml:43
Thursday 21 July 2022  07:14:56 +0000 (0:00:00.022)       0:00:51.632 ********* 
included: /tmp/tmpsvhdx31t/tests/test-verify-volume.yml for /cache/rhel-9.qcow2.snap => (item={'encryption': False, 'encryption_cipher': None, 'encryption_key': None, 'encryption_key_size': None, 'encryption_luks_version': None, 'encryption_password': None, 'fs_create_options': '', 'fs_label': '', 'fs_type': 'xfs', 'mount_options': 'defaults', 'mount_point': '/opt/test2', 'name': 'test1', 'raid_level': None, 'size': 10737418240, 'state': 'absent', 'type': 'disk', 'disks': ['sda'], 'raid_device_count': None, 'raid_spare_count': None, 'raid_metadata_version': None, 'fs_overwrite_existing': True, 'mount_check': 0, 'mount_passno': 0, 'mount_device_identifier': 'uuid', 'raid_chunk_size': None, 'compression': None, 'deduplication': None, 'vdo_pool_size': None, 'thin': None, 'thin_pool_name': None, 'thin_pool_size': None, 'cached': False, 'cache_size': 0, 'cache_mode': None, 'cache_devices': [], '_device': '/dev/sda', '_raw_device': '/dev/sda', '_mount_id': 'UUID=31c6244a-312c-4308-8a54-e9662b6a934a'})

TASK [set_fact] ****************************************************************
task path: /tmp/tmpsvhdx31t/tests/test-verify-volume.yml:2
Thursday 21 July 2022  07:14:56 +0000 (0:00:00.056)       0:00:51.689 ********* 
ok: [/cache/rhel-9.qcow2.snap] => {
    "ansible_facts": {
        "_storage_test_volume_present": false,
        "_storage_volume_tests": [
            "mount",
            "fstab",
            "fs",
            "device",
            "encryption",
            "md",
            "size",
            "cache"
        ]
    },
    "changed": false
}

TASK [include_tasks] ***********************************************************
task path: /tmp/tmpsvhdx31t/tests/test-verify-volume.yml:10
Thursday 21 July 2022  07:14:56 +0000 (0:00:00.048)       0:00:51.738 ********* 
included: /tmp/tmpsvhdx31t/tests/test-verify-volume-mount.yml for /cache/rhel-9.qcow2.snap => (item=mount)
included: /tmp/tmpsvhdx31t/tests/test-verify-volume-fstab.yml for /cache/rhel-9.qcow2.snap => (item=fstab)
included: /tmp/tmpsvhdx31t/tests/test-verify-volume-fs.yml for /cache/rhel-9.qcow2.snap => (item=fs)
included: /tmp/tmpsvhdx31t/tests/test-verify-volume-device.yml for /cache/rhel-9.qcow2.snap => (item=device)
included: /tmp/tmpsvhdx31t/tests/test-verify-volume-encryption.yml for /cache/rhel-9.qcow2.snap => (item=encryption)
included: /tmp/tmpsvhdx31t/tests/test-verify-volume-md.yml for /cache/rhel-9.qcow2.snap => (item=md)
included: /tmp/tmpsvhdx31t/tests/test-verify-volume-size.yml for /cache/rhel-9.qcow2.snap => (item=size)
included: /tmp/tmpsvhdx31t/tests/test-verify-volume-cache.yml for /cache/rhel-9.qcow2.snap => (item=cache)
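
Each item in the _storage_volume_tests list set just above maps to a test-verify-volume-<item>.yml file, as the included paths show. The include step therefore presumably looks something like:

    - name: Run the per-volume verification task files
      ansible.builtin.include_tasks: "test-verify-volume-{{ item }}.yml"
      loop: "{{ _storage_volume_tests }}"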

TASK [Get expected mount device based on device type] **************************
task path: /tmp/tmpsvhdx31t/tests/test-verify-volume-mount.yml:6
Thursday 21 July 2022  07:14:56 +0000 (0:00:00.069)       0:00:51.808 ********* 
ok: [/cache/rhel-9.qcow2.snap] => {
    "ansible_facts": {
        "storage_test_device_path": "/dev/sda"
    },
    "changed": false
}

TASK [Set some facts] **********************************************************
task path: /tmp/tmpsvhdx31t/tests/test-verify-volume-mount.yml:14
Thursday 21 July 2022  07:14:56 +0000 (0:00:00.040)       0:00:51.848 ********* 
ok: [/cache/rhel-9.qcow2.snap] => {
    "ansible_facts": {
        "storage_test_mount_device_matches": [],
        "storage_test_mount_expected_match_count": "0",
        "storage_test_mount_point_matches": [],
        "storage_test_swap_expected_matches": "0"
    },
    "changed": false
}

TASK [Verify the current mount state by device] ********************************
task path: /tmp/tmpsvhdx31t/tests/test-verify-volume-mount.yml:28
Thursday 21 July 2022  07:14:56 +0000 (0:00:00.053)       0:00:51.902 ********* 
skipping: [/cache/rhel-9.qcow2.snap] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [Verify the current mount state by mount point] ***************************
task path: /tmp/tmpsvhdx31t/tests/test-verify-volume-mount.yml:37
Thursday 21 July 2022  07:14:56 +0000 (0:00:00.023)       0:00:51.925 ********* 
ok: [/cache/rhel-9.qcow2.snap] => {
    "changed": false
}

MSG:

All assertions passed

TASK [Verify the mount fs type] ************************************************
task path: /tmp/tmpsvhdx31t/tests/test-verify-volume-mount.yml:45
Thursday 21 July 2022  07:14:56 +0000 (0:00:00.049)       0:00:51.974 ********* 
skipping: [/cache/rhel-9.qcow2.snap] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [command] *****************************************************************
task path: /tmp/tmpsvhdx31t/tests/test-verify-volume-mount.yml:54
Thursday 21 July 2022  07:14:56 +0000 (0:00:00.034)       0:00:52.009 ********* 
skipping: [/cache/rhel-9.qcow2.snap] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [Gather swap info] ********************************************************
task path: /tmp/tmpsvhdx31t/tests/test-verify-volume-mount.yml:58
Thursday 21 July 2022  07:14:56 +0000 (0:00:00.023)       0:00:52.033 ********* 
skipping: [/cache/rhel-9.qcow2.snap] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [Verify swap status] ******************************************************
task path: /tmp/tmpsvhdx31t/tests/test-verify-volume-mount.yml:63
Thursday 21 July 2022  07:14:56 +0000 (0:00:00.023)       0:00:52.056 ********* 
skipping: [/cache/rhel-9.qcow2.snap] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [Unset facts] *************************************************************
task path: /tmp/tmpsvhdx31t/tests/test-verify-volume-mount.yml:75
Thursday 21 July 2022  07:14:56 +0000 (0:00:00.024)       0:00:52.080 ********* 
ok: [/cache/rhel-9.qcow2.snap] => {
    "ansible_facts": {
        "storage_test_mount_device_matches": null,
        "storage_test_mount_expected_match_count": null,
        "storage_test_mount_point_matches": null,
        "storage_test_swap_expected_matches": null,
        "storage_test_swaps": null,
        "storage_test_sys_node": null
    },
    "changed": false
}

TASK [Set some variables for fstab checking] ***********************************
task path: /tmp/tmpsvhdx31t/tests/test-verify-volume-fstab.yml:2
Thursday 21 July 2022  07:14:56 +0000 (0:00:00.032)       0:00:52.113 ********* 
ok: [/cache/rhel-9.qcow2.snap] => {
    "ansible_facts": {
        "storage_test_fstab_expected_id_matches": "0",
        "storage_test_fstab_expected_mount_options_matches": "0",
        "storage_test_fstab_expected_mount_point_matches": "0",
        "storage_test_fstab_id_matches": [],
        "storage_test_fstab_mount_options_matches": [],
        "storage_test_fstab_mount_point_matches": []
    },
    "changed": false
}

TASK [Verify that the device identifier appears in /etc/fstab] *****************
task path: /tmp/tmpsvhdx31t/tests/test-verify-volume-fstab.yml:25
Thursday 21 July 2022  07:14:56 +0000 (0:00:00.059)       0:00:52.172 ********* 
skipping: [/cache/rhel-9.qcow2.snap] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [Verify the fstab mount point] ********************************************
task path: /tmp/tmpsvhdx31t/tests/test-verify-volume-fstab.yml:32
Thursday 21 July 2022  07:14:56 +0000 (0:00:00.022)       0:00:52.195 ********* 
ok: [/cache/rhel-9.qcow2.snap] => {
    "changed": false
}

MSG:

All assertions passed

TASK [Verify mount_options] ****************************************************
task path: /tmp/tmpsvhdx31t/tests/test-verify-volume-fstab.yml:39
Thursday 21 July 2022  07:14:56 +0000 (0:00:00.048)       0:00:52.243 ********* 
skipping: [/cache/rhel-9.qcow2.snap] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [Clean up variables] ******************************************************
task path: /tmp/tmpsvhdx31t/tests/test-verify-volume-fstab.yml:49
Thursday 21 July 2022  07:14:56 +0000 (0:00:00.035)       0:00:52.279 ********* 
ok: [/cache/rhel-9.qcow2.snap] => {
    "ansible_facts": {
        "storage_test_fstab_expected_id_matches": null,
        "storage_test_fstab_expected_mount_options_matches": null,
        "storage_test_fstab_expected_mount_point_matches": null,
        "storage_test_fstab_id_matches": null,
        "storage_test_fstab_mount_options_matches": null,
        "storage_test_fstab_mount_point_matches": null
    },
    "changed": false
}

TASK [Verify fs type] **********************************************************
task path: /tmp/tmpsvhdx31t/tests/test-verify-volume-fs.yml:4
Thursday 21 July 2022  07:14:56 +0000 (0:00:00.034)       0:00:52.313 ********* 
skipping: [/cache/rhel-9.qcow2.snap] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [Verify fs label] *********************************************************
task path: /tmp/tmpsvhdx31t/tests/test-verify-volume-fs.yml:10
Thursday 21 July 2022  07:14:56 +0000 (0:00:00.024)       0:00:52.338 ********* 
skipping: [/cache/rhel-9.qcow2.snap] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [See whether the device node is present] **********************************
task path: /tmp/tmpsvhdx31t/tests/test-verify-volume-device.yml:4
Thursday 21 July 2022  07:14:56 +0000 (0:00:00.023)       0:00:52.361 ********* 
ok: [/cache/rhel-9.qcow2.snap] => {
    "changed": false,
    "stat": {
        "atime": 1658387690.7492118,
        "attr_flags": "",
        "attributes": [],
        "block_size": 4096,
        "blocks": 0,
        "charset": "binary",
        "ctime": 1658387690.7492118,
        "dev": 5,
        "device_type": 2048,
        "executable": false,
        "exists": true,
        "gid": 6,
        "gr_name": "disk",
        "inode": 363,
        "isblk": true,
        "ischr": false,
        "isdir": false,
        "isfifo": false,
        "isgid": false,
        "islnk": false,
        "isreg": false,
        "issock": false,
        "isuid": false,
        "mimetype": "inode/blockdevice",
        "mode": "0660",
        "mtime": 1658387690.7492118,
        "nlink": 1,
        "path": "/dev/sda",
        "pw_name": "root",
        "readable": true,
        "rgrp": true,
        "roth": false,
        "rusr": true,
        "size": 0,
        "uid": 0,
        "version": null,
        "wgrp": true,
        "woth": false,
        "writeable": true,
        "wusr": true,
        "xgrp": false,
        "xoth": false,
        "xusr": false
    }
}
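
The block above is standard ansible.builtin.stat output for the /dev/sda node. A minimal sketch of the kind of task that produces it:

    - name: Stat the volume's device node
      ansible.builtin.stat:
        path: /dev/sda
      register: storage_test_dev    # register name chosen for this sketch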

TASK [Verify the presence/absence of the device node] **************************
task path: /tmp/tmpsvhdx31t/tests/test-verify-volume-device.yml:10
Thursday 21 July 2022  07:14:57 +0000 (0:00:00.403)       0:00:52.764 ********* 
ok: [/cache/rhel-9.qcow2.snap] => {
    "changed": false
}

MSG:

All assertions passed

TASK [Make sure we got info about this volume] *********************************
task path: /tmp/tmpsvhdx31t/tests/test-verify-volume-device.yml:18
Thursday 21 July 2022  07:14:57 +0000 (0:00:00.037)       0:00:52.802 ********* 
skipping: [/cache/rhel-9.qcow2.snap] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [(1/2) Process volume type (set initial value)] ***************************
task path: /tmp/tmpsvhdx31t/tests/test-verify-volume-device.yml:24
Thursday 21 July 2022  07:14:57 +0000 (0:00:00.023)       0:00:52.826 ********* 
ok: [/cache/rhel-9.qcow2.snap] => {
    "ansible_facts": {
        "st_volume_type": "disk"
    },
    "changed": false
}

TASK [(2/2) Process volume type (get RAID value)] ******************************
task path: /tmp/tmpsvhdx31t/tests/test-verify-volume-device.yml:28
Thursday 21 July 2022  07:14:57 +0000 (0:00:00.033)       0:00:52.860 ********* 
skipping: [/cache/rhel-9.qcow2.snap] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [Verify the volume's device type] *****************************************
task path: /tmp/tmpsvhdx31t/tests/test-verify-volume-device.yml:33
Thursday 21 July 2022  07:14:57 +0000 (0:00:00.023)       0:00:52.883 ********* 
skipping: [/cache/rhel-9.qcow2.snap] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [Stat the LUKS device, if encrypted] **************************************
task path: /tmp/tmpsvhdx31t/tests/test-verify-volume-encryption.yml:3
Thursday 21 July 2022  07:14:57 +0000 (0:00:00.023)       0:00:52.907 ********* 
skipping: [/cache/rhel-9.qcow2.snap] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [Ensure cryptsetup is present] ********************************************
task path: /tmp/tmpsvhdx31t/tests/test-verify-volume-encryption.yml:10
Thursday 21 July 2022  07:14:57 +0000 (0:00:00.025)       0:00:52.932 ********* 
ok: [/cache/rhel-9.qcow2.snap] => {
    "changed": false,
    "rc": 0,
    "results": []
}

MSG:

Nothing to do
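
"Nothing to do" with rc 0 and empty results is typical dnf output when the package is already installed. A hedged sketch of the step, assuming the generic package module is used:

    - name: Ensure cryptsetup is installed
      ansible.builtin.package:
        name: cryptsetup
        state: present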

TASK [Collect LUKS info for this volume] ***************************************
task path: /tmp/tmpsvhdx31t/tests/test-verify-volume-encryption.yml:15
Thursday 21 July 2022  07:14:58 +0000 (0:00:00.869)       0:00:53.802 ********* 
skipping: [/cache/rhel-9.qcow2.snap] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [Verify the presence/absence of the LUKS device node] *********************
task path: /tmp/tmpsvhdx31t/tests/test-verify-volume-encryption.yml:21
Thursday 21 July 2022  07:14:58 +0000 (0:00:00.024)       0:00:53.826 ********* 
skipping: [/cache/rhel-9.qcow2.snap] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [Verify that the raw device is the same as the device if not encrypted] ***
task path: /tmp/tmpsvhdx31t/tests/test-verify-volume-encryption.yml:30
Thursday 21 July 2022  07:14:58 +0000 (0:00:00.023)       0:00:53.850 ********* 
skipping: [/cache/rhel-9.qcow2.snap] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [Make sure we got info about the LUKS volume if encrypted] ****************
task path: /tmp/tmpsvhdx31t/tests/test-verify-volume-encryption.yml:38
Thursday 21 July 2022  07:14:58 +0000 (0:00:00.023)       0:00:53.873 ********* 
skipping: [/cache/rhel-9.qcow2.snap] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [Verify the LUKS volume's device type if encrypted] ***********************
task path: /tmp/tmpsvhdx31t/tests/test-verify-volume-encryption.yml:44
Thursday 21 July 2022  07:14:58 +0000 (0:00:00.025)       0:00:53.899 ********* 
skipping: [/cache/rhel-9.qcow2.snap] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [Check LUKS version] ******************************************************
task path: /tmp/tmpsvhdx31t/tests/test-verify-volume-encryption.yml:49
Thursday 21 July 2022  07:14:58 +0000 (0:00:00.024)       0:00:53.923 ********* 
skipping: [/cache/rhel-9.qcow2.snap] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [Check LUKS key size] *****************************************************
task path: /tmp/tmpsvhdx31t/tests/test-verify-volume-encryption.yml:55
Thursday 21 July 2022  07:14:58 +0000 (0:00:00.024)       0:00:53.947 ********* 
skipping: [/cache/rhel-9.qcow2.snap] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [Check LUKS cipher] *******************************************************
task path: /tmp/tmpsvhdx31t/tests/test-verify-volume-encryption.yml:61
Thursday 21 July 2022  07:14:58 +0000 (0:00:00.023)       0:00:53.971 ********* 
skipping: [/cache/rhel-9.qcow2.snap] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [set_fact] ****************************************************************
task path: /tmp/tmpsvhdx31t/tests/test-verify-volume-encryption.yml:67
Thursday 21 July 2022  07:14:58 +0000 (0:00:00.025)       0:00:53.996 ********* 
ok: [/cache/rhel-9.qcow2.snap] => {
    "ansible_facts": {
        "_storage_test_crypttab_entries": [],
        "_storage_test_expected_crypttab_entries": "0",
        "_storage_test_expected_crypttab_key_file": "-"
    },
    "changed": false
}

TASK [Check for /etc/crypttab entry] *******************************************
task path: /tmp/tmpsvhdx31t/tests/test-verify-volume-encryption.yml:74
Thursday 21 July 2022  07:14:58 +0000 (0:00:00.051)       0:00:54.048 ********* 
ok: [/cache/rhel-9.qcow2.snap] => {
    "changed": false
}

MSG:

All assertions passed
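
Given the facts set just above (an empty _storage_test_crypttab_entries list and an expected count of "0"), the check presumably compares the two. An illustrative assertion of that shape (the test's actual condition may differ):

    - name: Assert the expected number of /etc/crypttab entries
      ansible.builtin.assert:
        that:
          - _storage_test_crypttab_entries | length == _storage_test_expected_crypttab_entries | int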

TASK [Validate the format of the crypttab entry] *******************************
task path: /tmp/tmpsvhdx31t/tests/test-verify-volume-encryption.yml:79
Thursday 21 July 2022  07:14:58 +0000 (0:00:00.050)       0:00:54.098 ********* 
skipping: [/cache/rhel-9.qcow2.snap] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [Check backing device of crypttab entry] **********************************
task path: /tmp/tmpsvhdx31t/tests/test-verify-volume-encryption.yml:85
Thursday 21 July 2022  07:14:58 +0000 (0:00:00.037)       0:00:54.135 ********* 
skipping: [/cache/rhel-9.qcow2.snap] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [Check key file of crypttab entry] ****************************************
task path: /tmp/tmpsvhdx31t/tests/test-verify-volume-encryption.yml:91
Thursday 21 July 2022  07:14:58 +0000 (0:00:00.036)       0:00:54.172 ********* 
skipping: [/cache/rhel-9.qcow2.snap] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [set_fact] ****************************************************************
task path: /tmp/tmpsvhdx31t/tests/test-verify-volume-encryption.yml:97
Thursday 21 July 2022  07:14:58 +0000 (0:00:00.036)       0:00:54.209 ********* 
ok: [/cache/rhel-9.qcow2.snap] => {
    "ansible_facts": {
        "_storage_test_crypttab_entries": null,
        "_storage_test_expected_crypttab_entries": null,
        "_storage_test_expected_crypttab_key_file": null
    },
    "changed": false
}

TASK [get information about RAID] **********************************************
task path: /tmp/tmpsvhdx31t/tests/test-verify-volume-md.yml:7
Thursday 21 July 2022  07:14:58 +0000 (0:00:00.035)       0:00:54.244 ********* 
skipping: [/cache/rhel-9.qcow2.snap] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [set_fact] ****************************************************************
task path: /tmp/tmpsvhdx31t/tests/test-verify-volume-md.yml:13
Thursday 21 July 2022  07:14:58 +0000 (0:00:00.037)       0:00:54.282 ********* 
skipping: [/cache/rhel-9.qcow2.snap] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [set_fact] ****************************************************************
task path: /tmp/tmpsvhdx31t/tests/test-verify-volume-md.yml:17
Thursday 21 July 2022  07:14:58 +0000 (0:00:00.037)       0:00:54.319 ********* 
skipping: [/cache/rhel-9.qcow2.snap] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [set_fact] ****************************************************************
task path: /tmp/tmpsvhdx31t/tests/test-verify-volume-md.yml:21
Thursday 21 July 2022  07:14:58 +0000 (0:00:00.044)       0:00:54.364 ********* 
skipping: [/cache/rhel-9.qcow2.snap] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [check RAID active devices count] *****************************************
task path: /tmp/tmpsvhdx31t/tests/test-verify-volume-md.yml:25
Thursday 21 July 2022  07:14:58 +0000 (0:00:00.039)       0:00:54.403 ********* 
skipping: [/cache/rhel-9.qcow2.snap] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [check RAID spare devices count] ******************************************
task path: /tmp/tmpsvhdx31t/tests/test-verify-volume-md.yml:31
Thursday 21 July 2022  07:14:58 +0000 (0:00:00.038)       0:00:54.442 ********* 
skipping: [/cache/rhel-9.qcow2.snap] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [check RAID metadata version] *********************************************
task path: /tmp/tmpsvhdx31t/tests/test-verify-volume-md.yml:37
Thursday 21 July 2022  07:14:58 +0000 (0:00:00.036)       0:00:54.478 ********* 
skipping: [/cache/rhel-9.qcow2.snap] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [parse the actual size of the volume] *************************************
task path: /tmp/tmpsvhdx31t/tests/test-verify-volume-size.yml:3
Thursday 21 July 2022  07:14:58 +0000 (0:00:00.035)       0:00:54.514 ********* 
skipping: [/cache/rhel-9.qcow2.snap] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [parse the requested size of the volume] **********************************
task path: /tmp/tmpsvhdx31t/tests/test-verify-volume-size.yml:9
Thursday 21 July 2022  07:14:58 +0000 (0:00:00.023)       0:00:54.537 ********* 
skipping: [/cache/rhel-9.qcow2.snap] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [Establish base value for expected size] **********************************
task path: /tmp/tmpsvhdx31t/tests/test-verify-volume-size.yml:15
Thursday 21 July 2022  07:14:58 +0000 (0:00:00.036)       0:00:54.574 ********* 
skipping: [/cache/rhel-9.qcow2.snap] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [debug] *******************************************************************
task path: /tmp/tmpsvhdx31t/tests/test-verify-volume-size.yml:20
Thursday 21 July 2022  07:14:59 +0000 (0:00:00.078)       0:00:54.653 ********* 
ok: [/cache/rhel-9.qcow2.snap] => {
    "storage_test_expected_size": "VARIABLE IS NOT DEFINED!: 'storage_test_expected_size' is undefined"
}

TASK [debug] *******************************************************************
task path: /tmp/tmpsvhdx31t/tests/test-verify-volume-size.yml:25
Thursday 21 July 2022  07:14:59 +0000 (0:00:00.035)       0:00:54.688 ********* 
skipping: [/cache/rhel-9.qcow2.snap] => {}

TASK [debug] *******************************************************************
task path: /tmp/tmpsvhdx31t/tests/test-verify-volume-size.yml:28
Thursday 21 July 2022  07:14:59 +0000 (0:00:00.035)       0:00:54.724 ********* 
skipping: [/cache/rhel-9.qcow2.snap] => {}

TASK [Get the size of parent/pool device] **************************************
task path: /tmp/tmpsvhdx31t/tests/test-verify-volume-size.yml:31
Thursday 21 July 2022  07:14:59 +0000 (0:00:00.036)       0:00:54.760 ********* 
skipping: [/cache/rhel-9.qcow2.snap] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [debug] *******************************************************************
task path: /tmp/tmpsvhdx31t/tests/test-verify-volume-size.yml:36
Thursday 21 July 2022  07:14:59 +0000 (0:00:00.035)       0:00:54.796 ********* 
skipping: [/cache/rhel-9.qcow2.snap] => {}

TASK [Calculate the expected size based on pool size and percentage value] *****
task path: /tmp/tmpsvhdx31t/tests/test-verify-volume-size.yml:39
Thursday 21 July 2022  07:14:59 +0000 (0:00:00.035)       0:00:54.831 ********* 
skipping: [/cache/rhel-9.qcow2.snap] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [debug] *******************************************************************
task path: /tmp/tmpsvhdx31t/tests/test-verify-volume-size.yml:44
Thursday 21 July 2022  07:14:59 +0000 (0:00:00.035)       0:00:54.866 ********* 
ok: [/cache/rhel-9.qcow2.snap] => {
    "storage_test_actual_size": {
        "changed": false,
        "skip_reason": "Conditional result was False",
        "skipped": true
    }
}

TASK [debug] *******************************************************************
task path: /tmp/tmpsvhdx31t/tests/test-verify-volume-size.yml:47
Thursday 21 July 2022  07:14:59 +0000 (0:00:00.037)       0:00:54.904 ********* 
ok: [/cache/rhel-9.qcow2.snap] => {
    "storage_test_expected_size": "VARIABLE IS NOT DEFINED!: 'storage_test_expected_size' is undefined"
}

TASK [assert] ******************************************************************
task path: /tmp/tmpsvhdx31t/tests/test-verify-volume-size.yml:50
Thursday 21 July 2022  07:14:59 +0000 (0:00:00.034)       0:00:54.938 ********* 
skipping: [/cache/rhel-9.qcow2.snap] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [Get information about the LV] ********************************************
task path: /tmp/tmpsvhdx31t/tests/test-verify-volume-cache.yml:6
Thursday 21 July 2022  07:14:59 +0000 (0:00:00.024)       0:00:54.963 ********* 
skipping: [/cache/rhel-9.qcow2.snap] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [set_fact] ****************************************************************
task path: /tmp/tmpsvhdx31t/tests/test-verify-volume-cache.yml:14
Thursday 21 July 2022  07:14:59 +0000 (0:00:00.024)       0:00:54.987 ********* 
skipping: [/cache/rhel-9.qcow2.snap] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [check segment type] ******************************************************
task path: /tmp/tmpsvhdx31t/tests/test-verify-volume-cache.yml:17
Thursday 21 July 2022  07:14:59 +0000 (0:00:00.024)       0:00:55.012 ********* 
skipping: [/cache/rhel-9.qcow2.snap] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [set_fact] ****************************************************************
task path: /tmp/tmpsvhdx31t/tests/test-verify-volume-cache.yml:22
Thursday 21 July 2022  07:14:59 +0000 (0:00:00.023)       0:00:55.035 ********* 
skipping: [/cache/rhel-9.qcow2.snap] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [parse the requested cache size] ******************************************
task path: /tmp/tmpsvhdx31t/tests/test-verify-volume-cache.yml:26
Thursday 21 July 2022  07:14:59 +0000 (0:00:00.023)       0:00:55.058 ********* 
skipping: [/cache/rhel-9.qcow2.snap] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [set_fact] ****************************************************************
task path: /tmp/tmpsvhdx31t/tests/test-verify-volume-cache.yml:32
Thursday 21 July 2022  07:14:59 +0000 (0:00:00.024)       0:00:55.083 ********* 
skipping: [/cache/rhel-9.qcow2.snap] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [Check cache size] ********************************************************
task path: /tmp/tmpsvhdx31t/tests/test-verify-volume-cache.yml:36
Thursday 21 July 2022  07:14:59 +0000 (0:00:00.024)       0:00:55.107 ********* 
skipping: [/cache/rhel-9.qcow2.snap] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}
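
Note on the run of skipped tasks above: each verify task carries a when: guard, and for this test's plain disk-backed volume those conditions evaluate to false, so Ansible reports "Conditional result was False" instead of running the check. A rough sketch of that guard pattern, with purely illustrative variable names (the real expressions in test-verify-volume-md.yml and the other verify files are not reproduced in this log):

# Illustrative sketch only; the variable names below are assumptions, not the test's actual ones.
- name: check RAID active devices count
  assert:
    that:
      - storage_test_volume.raid_device_count == __expected_raid_device_count
  when: storage_test_volume.type == "raid"   # false for a plain 'disk' volume, hence the skip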

TASK [Clean up facts] **********************************************************
task path: /tmp/tmpsvhdx31t/tests/test-verify-volume.yml:16
Thursday 21 July 2022  07:14:59 +0000 (0:00:00.023)       0:00:55.131 ********* 
ok: [/cache/rhel-9.qcow2.snap] => {
    "ansible_facts": {
        "_storage_test_volume_present": null
    },
    "changed": false
}

TASK [Clean up variable namespace] *********************************************
task path: /tmp/tmpsvhdx31t/tests/verify-role-results.yml:53
Thursday 21 July 2022  07:14:59 +0000 (0:00:00.034)       0:00:55.166 ********* 
ok: [/cache/rhel-9.qcow2.snap] => {
    "ansible_facts": {
        "storage_test_blkinfo": null,
        "storage_test_crypttab": null,
        "storage_test_fstab": null
    },
    "changed": false
}
META: ran handlers
META: ran handlers

PLAY RECAP *********************************************************************
/cache/rhel-9.qcow2.snap   : ok=259  changed=6    unreachable=0    failed=0    skipped=228  rescued=0    ignored=0   

Thursday 21 July 2022  07:14:59 +0000 (0:00:00.047)       0:00:55.214 ********* 
=============================================================================== 
linux-system-roles.storage : get service facts -------------------------- 1.89s
/tmp/tmpsvhdx31t/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:51 
linux-system-roles.storage : manage the pools and volumes to match the specified state --- 1.74s
/tmp/tmpsvhdx31t/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:77 
linux-system-roles.storage : manage the pools and volumes to match the specified state --- 1.66s
/tmp/tmpsvhdx31t/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:77 
Gathering Facts --------------------------------------------------------- 1.42s
/tmp/tmpsvhdx31t/tests/tests_change_disk_mount.yml:2 --------------------------
linux-system-roles.storage : manage the pools and volumes to match the specified state --- 1.38s
/tmp/tmpsvhdx31t/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:77 
linux-system-roles.storage : manage the pools and volumes to match the specified state --- 1.34s
/tmp/tmpsvhdx31t/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:77 
linux-system-roles.storage : make sure blivet is available -------------- 1.27s
/tmp/tmpsvhdx31t/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:7 
linux-system-roles.storage : Update facts ------------------------------- 1.00s
/tmp/tmpsvhdx31t/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:198 
linux-system-roles.storage : Update facts ------------------------------- 0.98s
/tmp/tmpsvhdx31t/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:198 
linux-system-roles.storage : Update facts ------------------------------- 0.98s
/tmp/tmpsvhdx31t/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:198 
linux-system-roles.storage : Update facts ------------------------------- 0.97s
/tmp/tmpsvhdx31t/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:198 
linux-system-roles.storage : make sure required packages are installed --- 0.95s
/tmp/tmpsvhdx31t/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:44 
linux-system-roles.storage : Update facts ------------------------------- 0.95s
/tmp/tmpsvhdx31t/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:198 
linux-system-roles.storage : tell systemd to refresh its view of /etc/fstab --- 0.95s
/tmp/tmpsvhdx31t/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:146 
Ensure cryptsetup is present -------------------------------------------- 0.88s
/tmp/tmpsvhdx31t/tests/test-verify-volume-encryption.yml:10 -------------------
Ensure cryptsetup is present -------------------------------------------- 0.87s
/tmp/tmpsvhdx31t/tests/test-verify-volume-encryption.yml:10 -------------------
Ensure cryptsetup is present -------------------------------------------- 0.87s
/tmp/tmpsvhdx31t/tests/test-verify-volume-encryption.yml:10 -------------------
Ensure cryptsetup is present -------------------------------------------- 0.81s
/tmp/tmpsvhdx31t/tests/test-verify-volume-encryption.yml:10 -------------------
linux-system-roles.storage : get required packages ---------------------- 0.72s
/tmp/tmpsvhdx31t/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:24 
linux-system-roles.storage : tell systemd to refresh its view of /etc/fstab --- 0.67s
/tmp/tmpsvhdx31t/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:146 
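
The timing summary above closes the first pass, which consumed the role under its legacy linux-system-roles.storage name; the output that follows is a second ansible-playbook invocation driving the same test through the fedora.linux_system_roles collection. For orientation only, a minimal sketch of how a test play like tests_change_disk_mount.yml typically invokes the role; the mount points and the unused_disks fact below are illustrative assumptions, not taken from the real test source:

# Hypothetical sketch; the actual tests_change_disk_mount.yml is not reproduced in this log.
- hosts: all
  become: true
  vars:
    mount_location_1: /opt/test1          # assumed example value
    mount_location_2: /opt/test2          # assumed example value
  tasks:
    - name: Mount a whole-disk volume at the first location
      include_role:
        name: linux-system-roles.storage  # the run below uses fedora.linux_system_roles.storage instead
      vars:
        storage_volumes:
          - name: test1
            type: disk
            disks: "{{ unused_disks }}"   # assumed fact gathered earlier in the test
            fs_type: xfs
            mount_point: "{{ mount_location_1 }}"

Re-invoking the role with mount_point switched to the second location, and finally with the volume removed for cleanup, is presumably what accounts for the several "manage the pools and volumes to match the specified state" entries in the summary above.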
ansible-playbook [core 2.12.6]
  config file = /etc/ansible/ansible.cfg
  configured module search path = ['/root/.ansible/plugins/modules', '/usr/share/ansible/plugins/modules']
  ansible python module location = /usr/lib/python3.9/site-packages/ansible
  ansible collection location = /tmp/tmpi19f9hzy
  executable location = /usr/bin/ansible-playbook
  python version = 3.9.13 (main, May 18 2022, 00:00:00) [GCC 11.3.1 20220421 (Red Hat 11.3.1-2)]
  jinja version = 2.11.3
  libyaml = True
Using /etc/ansible/ansible.cfg as config file
Skipping callback 'debug', as we already have a stdout callback.
Skipping callback 'default', as we already have a stdout callback.
Skipping callback 'minimal', as we already have a stdout callback.
Skipping callback 'oneline', as we already have a stdout callback.

PLAYBOOK: tests_change_disk_mount.yml ******************************************
1 plays in /tmp/tmpfc26zqih/tests/storage/tests_change_disk_mount.yml

PLAY [all] *********************************************************************

TASK [Gathering Facts] *********************************************************
task path: /tmp/tmpfc26zqih/tests/storage/tests_change_disk_mount.yml:2
Thursday 21 July 2022  10:28:37 +0000 (0:00:00.012)       0:00:00.012 ********* 
ok: [/cache/rhel-9.qcow2.snap]
META: ran handlers

TASK [include_role : fedora.linux_system_roles.storage] ************************
task path: /tmp/tmpfc26zqih/tests/storage/tests_change_disk_mount.yml:10
Thursday 21 July 2022  10:28:38 +0000 (0:00:01.362)       0:00:01.375 ********* 

TASK [fedora.linux_system_roles.storage : set platform/version specific variables] ***
task path: /tmp/tmpi19f9hzy/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:2
Thursday 21 July 2022  10:28:38 +0000 (0:00:00.036)       0:00:01.411 ********* 
included: /tmp/tmpi19f9hzy/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml for /cache/rhel-9.qcow2.snap

TASK [fedora.linux_system_roles.storage : Ensure ansible_facts used by role] ***
task path: /tmp/tmpi19f9hzy/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:2
Thursday 21 July 2022  10:28:38 +0000 (0:00:00.032)       0:00:01.444 ********* 
ok: [/cache/rhel-9.qcow2.snap]

TASK [fedora.linux_system_roles.storage : Set platform/version specific variables] ***
task path: /tmp/tmpi19f9hzy/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:7
Thursday 21 July 2022  10:28:39 +0000 (0:00:00.527)       0:00:01.971 ********* 
skipping: [/cache/rhel-9.qcow2.snap] => (item=RedHat.yml)  => {
    "ansible_loop_var": "item",
    "changed": false,
    "item": "RedHat.yml",
    "skip_reason": "Conditional result was False"
}
skipping: [/cache/rhel-9.qcow2.snap] => (item=RedHat.yml)  => {
    "ansible_loop_var": "item",
    "changed": false,
    "item": "RedHat.yml",
    "skip_reason": "Conditional result was False"
}
ok: [/cache/rhel-9.qcow2.snap] => (item=RedHat_9.yml) => {
    "ansible_facts": {
        "blivet_package_list": [
            "python3-blivet",
            "libblockdev-crypto",
            "libblockdev-dm",
            "libblockdev-lvm",
            "libblockdev-mdraid",
            "libblockdev-swap",
            "vdo",
            "kmod-kvdo",
            "xfsprogs"
        ]
    },
    "ansible_included_var_files": [
        "/tmp/tmpi19f9hzy/ansible_collections/fedora/linux_system_roles/roles/storage/vars/RedHat_9.yml"
    ],
    "ansible_loop_var": "item",
    "changed": false,
    "item": "RedHat_9.yml"
}
skipping: [/cache/rhel-9.qcow2.snap] => (item=RedHat_9.0.yml)  => {
    "ansible_loop_var": "item",
    "changed": false,
    "item": "RedHat_9.0.yml",
    "skip_reason": "Conditional result was False"
}
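
The loop items above track the host's facts: os_family and distribution both resolve to RedHat.yml (skipped twice, presumably because no such vars file ships with the role), distribution plus major version to RedHat_9.yml (found and included), and the full distribution version to RedHat_9.0.yml (skipped). A sketch of the pattern behind that task, assuming the usual linux-system-roles layout; the role's internal variable name may differ:

# Illustrative sketch of the vars-file selection loop; not copied from the role source.
- name: Set platform/version specific variables
  include_vars: "{{ __vars_file }}"
  loop:
    - "{{ ansible_facts['os_family'] }}.yml"                                                       # RedHat.yml
    - "{{ ansible_facts['distribution'] }}.yml"                                                    # RedHat.yml
    - "{{ ansible_facts['distribution'] }}_{{ ansible_facts['distribution_major_version'] }}.yml"  # RedHat_9.yml
    - "{{ ansible_facts['distribution'] }}_{{ ansible_facts['distribution_version'] }}.yml"        # RedHat_9.0.yml
  vars:
    __vars_file: "{{ role_path }}/vars/{{ item }}"
  when: __vars_file is file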

TASK [fedora.linux_system_roles.storage : define an empty list of pools to be used in testing] ***
task path: /tmp/tmpi19f9hzy/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:5
Thursday 21 July 2022  10:28:39 +0000 (0:00:00.056)       0:00:02.028 ********* 
ok: [/cache/rhel-9.qcow2.snap] => {
    "ansible_facts": {
        "_storage_pools_list": []
    },
    "changed": false
}

TASK [fedora.linux_system_roles.storage : define an empty list of volumes to be used in testing] ***
task path: /tmp/tmpi19f9hzy/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:9
Thursday 21 July 2022  10:28:39 +0000 (0:00:00.034)       0:00:02.063 ********* 
ok: [/cache/rhel-9.qcow2.snap] => {
    "ansible_facts": {
        "_storage_volumes_list": []
    },
    "changed": false
}

TASK [fedora.linux_system_roles.storage : include the appropriate provider tasks] ***
task path: /tmp/tmpi19f9hzy/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:13
Thursday 21 July 2022  10:28:39 +0000 (0:00:00.030)       0:00:02.093 ********* 
redirecting (type: modules) ansible.builtin.mount to ansible.posix.mount
redirecting (type: modules) ansible.builtin.mount to ansible.posix.mount
included: /tmp/tmpi19f9hzy/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml for /cache/rhel-9.qcow2.snap
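
The provider dispatch above lands in main-blivet.yml because blivet is the role's default storage provider; the two "redirecting (type: modules) ansible.builtin.mount to ansible.posix.mount" lines merely record module resolution on the control node. A sketch of what such a dispatch typically looks like, assuming the provider name lives in a variable along the lines of storage_provider (an assumption; the role's actual variable may be named differently):

# Hypothetical sketch of the provider dispatch; not copied from the role source.
- name: include the appropriate provider tasks
  include_tasks: "main-{{ storage_provider | default('blivet') }}.yml"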

TASK [fedora.linux_system_roles.storage : get a list of rpm packages installed on host machine] ***
task path: /tmp/tmpi19f9hzy/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:2
Thursday 21 July 2022  10:28:39 +0000 (0:00:00.053)       0:00:02.147 ********* 
skipping: [/cache/rhel-9.qcow2.snap] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [fedora.linux_system_roles.storage : make sure blivet is available] *******
task path: /tmp/tmpi19f9hzy/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:7
Thursday 21 July 2022  10:28:39 +0000 (0:00:00.018)       0:00:02.165 ********* 
ok: [/cache/rhel-9.qcow2.snap] => {
    "changed": false,
    "rc": 0,
    "results": []
}

MSG:

Nothing to do
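
"Nothing to do" here means the required packages were already present on the image. A sketch of the installation step, assuming it feeds the blivet_package_list fact that the RedHat_9.yml vars file set earlier in this run (the exact task in the role may differ):

# Sketch only; assumes blivet_package_list as set by the included vars file.
- name: make sure blivet is available
  package:
    name: "{{ blivet_package_list }}"
    state: present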

TASK [fedora.linux_system_roles.storage : show storage_pools] ******************
task path: /tmp/tmpi19f9hzy/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:13
Thursday 21 July 2022  10:28:40 +0000 (0:00:01.292)       0:00:03.457 ********* 
ok: [/cache/rhel-9.qcow2.snap] => {
    "storage_pools": "VARIABLE IS NOT DEFINED!: 'storage_pools' is undefined"
}

TASK [fedora.linux_system_roles.storage : show storage_volumes] ****************
task path: /tmp/tmpi19f9hzy/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:18
Thursday 21 July 2022  10:28:40 +0000 (0:00:00.032)       0:00:03.490 ********* 
ok: [/cache/rhel-9.qcow2.snap] => {
    "storage_volumes": "VARIABLE IS NOT DEFINED!: 'storage_volumes' is undefined"
}

TASK [fedora.linux_system_roles.storage : get required packages] ***************
task path: /tmp/tmpi19f9hzy/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:23
Thursday 21 July 2022  10:28:40 +0000 (0:00:00.034)       0:00:03.524 ********* 
ok: [/cache/rhel-9.qcow2.snap] => {
    "actions": [],
    "changed": false,
    "crypts": [],
    "leaves": [],
    "mounts": [],
    "packages": [],
    "pools": [],
    "volumes": []
}

TASK [fedora.linux_system_roles.storage : enable copr repositories if needed] ***
task path: /tmp/tmpi19f9hzy/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:35
Thursday 21 July 2022  10:28:41 +0000 (0:00:00.746)       0:00:04.271 ********* 
included: /tmp/tmpi19f9hzy/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml for /cache/rhel-9.qcow2.snap

TASK [fedora.linux_system_roles.storage : check if the COPR support packages should be installed] ***
task path: /tmp/tmpi19f9hzy/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:2
Thursday 21 July 2022  10:28:41 +0000 (0:00:00.041)       0:00:04.312 ********* 

TASK [fedora.linux_system_roles.storage : make sure COPR support packages are present] ***
task path: /tmp/tmpi19f9hzy/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:13
Thursday 21 July 2022  10:28:41 +0000 (0:00:00.033)       0:00:04.346 ********* 
skipping: [/cache/rhel-9.qcow2.snap] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [fedora.linux_system_roles.storage : enable COPRs] ************************
task path: /tmp/tmpi19f9hzy/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:18
Thursday 21 July 2022  10:28:41 +0000 (0:00:00.035)       0:00:04.381 ********* 

TASK [fedora.linux_system_roles.storage : make sure required packages are installed] ***
task path: /tmp/tmpi19f9hzy/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:41
Thursday 21 July 2022  10:28:41 +0000 (0:00:00.029)       0:00:04.411 ********* 
ok: [/cache/rhel-9.qcow2.snap] => {
    "changed": false,
    "rc": 0,
    "results": []
}

MSG:

Nothing to do

TASK [fedora.linux_system_roles.storage : get service facts] *******************
task path: /tmp/tmpi19f9hzy/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:47
Thursday 21 July 2022  10:28:42 +0000 (0:00:00.891)       0:00:05.303 ********* 
ok: [/cache/rhel-9.qcow2.snap] => {
    "ansible_facts": {
        "services": {
            "NetworkManager-dispatcher.service": {
                "name": "NetworkManager-dispatcher.service",
                "source": "systemd",
                "state": "inactive",
                "status": "enabled"
            },
            "NetworkManager-wait-online.service": {
                "name": "NetworkManager-wait-online.service",
                "source": "systemd",
                "state": "stopped",
                "status": "enabled"
            },
            "NetworkManager.service": {
                "name": "NetworkManager.service",
                "source": "systemd",
                "state": "running",
                "status": "enabled"
            },
            "auditd.service": {
                "name": "auditd.service",
                "source": "systemd",
                "state": "running",
                "status": "enabled"
            },
            "auth-rpcgss-module.service": {
                "name": "auth-rpcgss-module.service",
                "source": "systemd",
                "state": "stopped",
                "status": "static"
            },
            "autofs.service": {
                "name": "autofs.service",
                "source": "systemd",
                "state": "stopped",
                "status": "not-found"
            },
            "autovt@.service": {
                "name": "autovt@.service",
                "source": "systemd",
                "state": "unknown",
                "status": "alias"
            },
            "blivet.service": {
                "name": "blivet.service",
                "source": "systemd",
                "state": "inactive",
                "status": "static"
            },
            "blk-availability.service": {
                "name": "blk-availability.service",
                "source": "systemd",
                "state": "stopped",
                "status": "disabled"
            },
            "chrony-wait.service": {
                "name": "chrony-wait.service",
                "source": "systemd",
                "state": "inactive",
                "status": "disabled"
            },
            "chronyd.service": {
                "name": "chronyd.service",
                "source": "systemd",
                "state": "running",
                "status": "enabled"
            },
            "cloud-config.service": {
                "name": "cloud-config.service",
                "source": "systemd",
                "state": "stopped",
                "status": "enabled"
            },
            "cloud-final.service": {
                "name": "cloud-final.service",
                "source": "systemd",
                "state": "stopped",
                "status": "enabled"
            },
            "cloud-init-local.service": {
                "name": "cloud-init-local.service",
                "source": "systemd",
                "state": "stopped",
                "status": "enabled"
            },
            "cloud-init.service": {
                "name": "cloud-init.service",
                "source": "systemd",
                "state": "stopped",
                "status": "enabled"
            },
            "cockpit-motd.service": {
                "name": "cockpit-motd.service",
                "source": "systemd",
                "state": "inactive",
                "status": "static"
            },
            "cockpit-wsinstance-http.service": {
                "name": "cockpit-wsinstance-http.service",
                "source": "systemd",
                "state": "inactive",
                "status": "static"
            },
            "cockpit-wsinstance-https-factory@.service": {
                "name": "cockpit-wsinstance-https-factory@.service",
                "source": "systemd",
                "state": "unknown",
                "status": "static"
            },
            "cockpit-wsinstance-https@.service": {
                "name": "cockpit-wsinstance-https@.service",
                "source": "systemd",
                "state": "unknown",
                "status": "static"
            },
            "cockpit.service": {
                "name": "cockpit.service",
                "source": "systemd",
                "state": "inactive",
                "status": "static"
            },
            "console-getty.service": {
                "name": "console-getty.service",
                "source": "systemd",
                "state": "inactive",
                "status": "disabled"
            },
            "container-getty@.service": {
                "name": "container-getty@.service",
                "source": "systemd",
                "state": "unknown",
                "status": "static"
            },
            "cpupower.service": {
                "name": "cpupower.service",
                "source": "systemd",
                "state": "stopped",
                "status": "disabled"
            },
            "crond.service": {
                "name": "crond.service",
                "source": "systemd",
                "state": "running",
                "status": "enabled"
            },
            "dbus-broker.service": {
                "name": "dbus-broker.service",
                "source": "systemd",
                "state": "running",
                "status": "enabled"
            },
            "dbus-org.freedesktop.hostname1.service": {
                "name": "dbus-org.freedesktop.hostname1.service",
                "source": "systemd",
                "state": "active",
                "status": "alias"
            },
            "dbus-org.freedesktop.locale1.service": {
                "name": "dbus-org.freedesktop.locale1.service",
                "source": "systemd",
                "state": "inactive",
                "status": "alias"
            },
            "dbus-org.freedesktop.login1.service": {
                "name": "dbus-org.freedesktop.login1.service",
                "source": "systemd",
                "state": "active",
                "status": "alias"
            },
            "dbus-org.freedesktop.nm-dispatcher.service": {
                "name": "dbus-org.freedesktop.nm-dispatcher.service",
                "source": "systemd",
                "state": "inactive",
                "status": "alias"
            },
            "dbus-org.freedesktop.timedate1.service": {
                "name": "dbus-org.freedesktop.timedate1.service",
                "source": "systemd",
                "state": "inactive",
                "status": "alias"
            },
            "dbus.service": {
                "name": "dbus.service",
                "source": "systemd",
                "state": "active",
                "status": "alias"
            },
            "debug-shell.service": {
                "name": "debug-shell.service",
                "source": "systemd",
                "state": "inactive",
                "status": "disabled"
            },
            "display-manager.service": {
                "name": "display-manager.service",
                "source": "systemd",
                "state": "stopped",
                "status": "not-found"
            },
            "dm-event.service": {
                "name": "dm-event.service",
                "source": "systemd",
                "state": "stopped",
                "status": "static"
            },
            "dnf-makecache.service": {
                "name": "dnf-makecache.service",
                "source": "systemd",
                "state": "stopped",
                "status": "static"
            },
            "dracut-cmdline.service": {
                "name": "dracut-cmdline.service",
                "source": "systemd",
                "state": "stopped",
                "status": "static"
            },
            "dracut-initqueue.service": {
                "name": "dracut-initqueue.service",
                "source": "systemd",
                "state": "stopped",
                "status": "static"
            },
            "dracut-mount.service": {
                "name": "dracut-mount.service",
                "source": "systemd",
                "state": "stopped",
                "status": "static"
            },
            "dracut-pre-mount.service": {
                "name": "dracut-pre-mount.service",
                "source": "systemd",
                "state": "stopped",
                "status": "static"
            },
            "dracut-pre-pivot.service": {
                "name": "dracut-pre-pivot.service",
                "source": "systemd",
                "state": "stopped",
                "status": "static"
            },
            "dracut-pre-trigger.service": {
                "name": "dracut-pre-trigger.service",
                "source": "systemd",
                "state": "stopped",
                "status": "static"
            },
            "dracut-pre-udev.service": {
                "name": "dracut-pre-udev.service",
                "source": "systemd",
                "state": "stopped",
                "status": "static"
            },
            "dracut-shutdown-onfailure.service": {
                "name": "dracut-shutdown-onfailure.service",
                "source": "systemd",
                "state": "stopped",
                "status": "static"
            },
            "dracut-shutdown.service": {
                "name": "dracut-shutdown.service",
                "source": "systemd",
                "state": "stopped",
                "status": "static"
            },
            "emergency.service": {
                "name": "emergency.service",
                "source": "systemd",
                "state": "stopped",
                "status": "static"
            },
            "fcoe.service": {
                "name": "fcoe.service",
                "source": "systemd",
                "state": "stopped",
                "status": "not-found"
            },
            "fstrim.service": {
                "name": "fstrim.service",
                "source": "systemd",
                "state": "inactive",
                "status": "static"
            },
            "fwupd-offline-update.service": {
                "name": "fwupd-offline-update.service",
                "source": "systemd",
                "state": "inactive",
                "status": "static"
            },
            "fwupd-refresh.service": {
                "name": "fwupd-refresh.service",
                "source": "systemd",
                "state": "inactive",
                "status": "static"
            },
            "fwupd.service": {
                "name": "fwupd.service",
                "source": "systemd",
                "state": "inactive",
                "status": "static"
            },
            "getty@.service": {
                "name": "getty@.service",
                "source": "systemd",
                "state": "unknown",
                "status": "enabled"
            },
            "getty@tty1.service": {
                "name": "getty@tty1.service",
                "source": "systemd",
                "state": "running",
                "status": "active"
            },
            "grub-boot-indeterminate.service": {
                "name": "grub-boot-indeterminate.service",
                "source": "systemd",
                "state": "inactive",
                "status": "static"
            },
            "grub2-systemd-integration.service": {
                "name": "grub2-systemd-integration.service",
                "source": "systemd",
                "state": "inactive",
                "status": "static"
            },
            "gssproxy.service": {
                "name": "gssproxy.service",
                "source": "systemd",
                "state": "running",
                "status": "disabled"
            },
            "initrd-cleanup.service": {
                "name": "initrd-cleanup.service",
                "source": "systemd",
                "state": "stopped",
                "status": "static"
            },
            "initrd-parse-etc.service": {
                "name": "initrd-parse-etc.service",
                "source": "systemd",
                "state": "stopped",
                "status": "static"
            },
            "initrd-switch-root.service": {
                "name": "initrd-switch-root.service",
                "source": "systemd",
                "state": "stopped",
                "status": "static"
            },
            "initrd-udevadm-cleanup-db.service": {
                "name": "initrd-udevadm-cleanup-db.service",
                "source": "systemd",
                "state": "stopped",
                "status": "static"
            },
            "insights-client-boot.service": {
                "name": "insights-client-boot.service",
                "source": "systemd",
                "state": "stopped",
                "status": "enabled"
            },
            "insights-client-results.service": {
                "name": "insights-client-results.service",
                "source": "systemd",
                "state": "inactive",
                "status": "static"
            },
            "insights-client.service": {
                "name": "insights-client.service",
                "source": "systemd",
                "state": "inactive",
                "status": "static"
            },
            "irqbalance.service": {
                "name": "irqbalance.service",
                "source": "systemd",
                "state": "running",
                "status": "enabled"
            },
            "iscsi-shutdown.service": {
                "name": "iscsi-shutdown.service",
                "source": "systemd",
                "state": "stopped",
                "status": "not-found"
            },
            "iscsi.service": {
                "name": "iscsi.service",
                "source": "systemd",
                "state": "stopped",
                "status": "not-found"
            },
            "iscsid.service": {
                "name": "iscsid.service",
                "source": "systemd",
                "state": "stopped",
                "status": "not-found"
            },
            "kdump.service": {
                "name": "kdump.service",
                "source": "systemd",
                "state": "stopped",
                "status": "enabled"
            },
            "kmod-static-nodes.service": {
                "name": "kmod-static-nodes.service",
                "source": "systemd",
                "state": "stopped",
                "status": "static"
            },
            "kvm_stat.service": {
                "name": "kvm_stat.service",
                "source": "systemd",
                "state": "inactive",
                "status": "disabled"
            },
            "ldconfig.service": {
                "name": "ldconfig.service",
                "source": "systemd",
                "state": "stopped",
                "status": "static"
            },
            "logrotate.service": {
                "name": "logrotate.service",
                "source": "systemd",
                "state": "stopped",
                "status": "static"
            },
            "lvm2-activation-early.service": {
                "name": "lvm2-activation-early.service",
                "source": "systemd",
                "state": "stopped",
                "status": "not-found"
            },
            "lvm2-lvmpolld.service": {
                "name": "lvm2-lvmpolld.service",
                "source": "systemd",
                "state": "stopped",
                "status": "static"
            },
            "lvm2-monitor.service": {
                "name": "lvm2-monitor.service",
                "source": "systemd",
                "state": "stopped",
                "status": "enabled"
            },
            "man-db-cache-update.service": {
                "name": "man-db-cache-update.service",
                "source": "systemd",
                "state": "inactive",
                "status": "static"
            },
            "man-db-restart-cache-update.service": {
                "name": "man-db-restart-cache-update.service",
                "source": "systemd",
                "state": "inactive",
                "status": "disabled"
            },
            "mdadm-grow-continue@.service": {
                "name": "mdadm-grow-continue@.service",
                "source": "systemd",
                "state": "unknown",
                "status": "static"
            },
            "mdadm-last-resort@.service": {
                "name": "mdadm-last-resort@.service",
                "source": "systemd",
                "state": "unknown",
                "status": "static"
            },
            "mdcheck_continue.service": {
                "name": "mdcheck_continue.service",
                "source": "systemd",
                "state": "inactive",
                "status": "static"
            },
            "mdcheck_start.service": {
                "name": "mdcheck_start.service",
                "source": "systemd",
                "state": "inactive",
                "status": "static"
            },
            "mdmon@.service": {
                "name": "mdmon@.service",
                "source": "systemd",
                "state": "unknown",
                "status": "static"
            },
            "mdmonitor-oneshot.service": {
                "name": "mdmonitor-oneshot.service",
                "source": "systemd",
                "state": "inactive",
                "status": "static"
            },
            "mdmonitor.service": {
                "name": "mdmonitor.service",
                "source": "systemd",
                "state": "stopped",
                "status": "enabled"
            },
            "microcode.service": {
                "name": "microcode.service",
                "source": "systemd",
                "state": "stopped",
                "status": "enabled"
            },
            "modprobe@.service": {
                "name": "modprobe@.service",
                "source": "systemd",
                "state": "unknown",
                "status": "static"
            },
            "modprobe@configfs.service": {
                "name": "modprobe@configfs.service",
                "source": "systemd",
                "state": "stopped",
                "status": "inactive"
            },
            "modprobe@drm.service": {
                "name": "modprobe@drm.service",
                "source": "systemd",
                "state": "stopped",
                "status": "inactive"
            },
            "modprobe@fuse.service": {
                "name": "modprobe@fuse.service",
                "source": "systemd",
                "state": "stopped",
                "status": "inactive"
            },
            "multipathd.service": {
                "name": "multipathd.service",
                "source": "systemd",
                "state": "stopped",
                "status": "enabled"
            },
            "ndctl-monitor.service": {
                "name": "ndctl-monitor.service",
                "source": "systemd",
                "state": "inactive",
                "status": "disabled"
            },
            "network.service": {
                "name": "network.service",
                "source": "systemd",
                "state": "stopped",
                "status": "not-found"
            },
            "nfs-blkmap.service": {
                "name": "nfs-blkmap.service",
                "source": "systemd",
                "state": "inactive",
                "status": "disabled"
            },
            "nfs-idmapd.service": {
                "name": "nfs-idmapd.service",
                "source": "systemd",
                "state": "stopped",
                "status": "static"
            },
            "nfs-mountd.service": {
                "name": "nfs-mountd.service",
                "source": "systemd",
                "state": "stopped",
                "status": "static"
            },
            "nfs-server.service": {
                "name": "nfs-server.service",
                "source": "systemd",
                "state": "stopped",
                "status": "disabled"
            },
            "nfs-utils.service": {
                "name": "nfs-utils.service",
                "source": "systemd",
                "state": "stopped",
                "status": "static"
            },
            "nfsdcld.service": {
                "name": "nfsdcld.service",
                "source": "systemd",
                "state": "stopped",
                "status": "static"
            },
            "nis-domainname.service": {
                "name": "nis-domainname.service",
                "source": "systemd",
                "state": "stopped",
                "status": "enabled"
            },
            "nm-priv-helper.service": {
                "name": "nm-priv-helper.service",
                "source": "systemd",
                "state": "inactive",
                "status": "static"
            },
            "ntpd.service": {
                "name": "ntpd.service",
                "source": "systemd",
                "state": "stopped",
                "status": "not-found"
            },
            "ntpdate.service": {
                "name": "ntpdate.service",
                "source": "systemd",
                "state": "stopped",
                "status": "not-found"
            },
            "oddjobd.service": {
                "name": "oddjobd.service",
                "source": "systemd",
                "state": "inactive",
                "status": "disabled"
            },
            "packagekit-offline-update.service": {
                "name": "packagekit-offline-update.service",
                "source": "systemd",
                "state": "inactive",
                "status": "static"
            },
            "packagekit.service": {
                "name": "packagekit.service",
                "source": "systemd",
                "state": "inactive",
                "status": "static"
            },
            "pam_namespace.service": {
                "name": "pam_namespace.service",
                "source": "systemd",
                "state": "inactive",
                "status": "static"
            },
            "plymouth-quit-wait.service": {
                "name": "plymouth-quit-wait.service",
                "source": "systemd",
                "state": "stopped",
                "status": "not-found"
            },
            "plymouth-start.service": {
                "name": "plymouth-start.service",
                "source": "systemd",
                "state": "stopped",
                "status": "not-found"
            },
            "polkit.service": {
                "name": "polkit.service",
                "source": "systemd",
                "state": "running",
                "status": "static"
            },
            "qemu-guest-agent.service": {
                "name": "qemu-guest-agent.service",
                "source": "systemd",
                "state": "inactive",
                "status": "enabled"
            },
            "quotaon.service": {
                "name": "quotaon.service",
                "source": "systemd",
                "state": "inactive",
                "status": "static"
            },
            "raid-check.service": {
                "name": "raid-check.service",
                "source": "systemd",
                "state": "inactive",
                "status": "static"
            },
            "rbdmap.service": {
                "name": "rbdmap.service",
                "source": "systemd",
                "state": "stopped",
                "status": "not-found"
            },
            "rc-local.service": {
                "name": "rc-local.service",
                "source": "systemd",
                "state": "stopped",
                "status": "static"
            },
            "rdisc.service": {
                "name": "rdisc.service",
                "source": "systemd",
                "state": "inactive",
                "status": "disabled"
            },
            "rescue.service": {
                "name": "rescue.service",
                "source": "systemd",
                "state": "stopped",
                "status": "static"
            },
            "rhsm-facts.service": {
                "name": "rhsm-facts.service",
                "source": "systemd",
                "state": "inactive",
                "status": "disabled"
            },
            "rhsm.service": {
                "name": "rhsm.service",
                "source": "systemd",
                "state": "inactive",
                "status": "disabled"
            },
            "rhsmcertd.service": {
                "name": "rhsmcertd.service",
                "source": "systemd",
                "state": "running",
                "status": "enabled"
            },
            "rpc-gssd.service": {
                "name": "rpc-gssd.service",
                "source": "systemd",
                "state": "stopped",
                "status": "static"
            },
            "rpc-statd-notify.service": {
                "name": "rpc-statd-notify.service",
                "source": "systemd",
                "state": "stopped",
                "status": "static"
            },
            "rpc-statd.service": {
                "name": "rpc-statd.service",
                "source": "systemd",
                "state": "stopped",
                "status": "static"
            },
            "rpc-svcgssd.service": {
                "name": "rpc-svcgssd.service",
                "source": "systemd",
                "state": "stopped",
                "status": "not-found"
            },
            "rpcbind.service": {
                "name": "rpcbind.service",
                "source": "systemd",
                "state": "running",
                "status": "enabled"
            },
            "rpmdb-rebuild.service": {
                "name": "rpmdb-rebuild.service",
                "source": "systemd",
                "state": "inactive",
                "status": "disabled"
            },
            "rsyslog.service": {
                "name": "rsyslog.service",
                "source": "systemd",
                "state": "running",
                "status": "enabled"
            },
            "selinux-autorelabel-mark.service": {
                "name": "selinux-autorelabel-mark.service",
                "source": "systemd",
                "state": "stopped",
                "status": "enabled"
            },
            "selinux-autorelabel.service": {
                "name": "selinux-autorelabel.service",
                "source": "systemd",
                "state": "inactive",
                "status": "static"
            },
            "serial-getty@.service": {
                "name": "serial-getty@.service",
                "source": "systemd",
                "state": "unknown",
                "status": "indirect"
            },
            "serial-getty@ttyS0.service": {
                "name": "serial-getty@ttyS0.service",
                "source": "systemd",
                "state": "running",
                "status": "active"
            },
            "sntp.service": {
                "name": "sntp.service",
                "source": "systemd",
                "state": "stopped",
                "status": "not-found"
            },
            "sshd-keygen.service": {
                "name": "sshd-keygen.service",
                "source": "systemd",
                "state": "stopped",
                "status": "not-found"
            },
            "sshd-keygen@.service": {
                "name": "sshd-keygen@.service",
                "source": "systemd",
                "state": "unknown",
                "status": "disabled"
            },
            "sshd-keygen@ecdsa.service": {
                "name": "sshd-keygen@ecdsa.service",
                "source": "systemd",
                "state": "stopped",
                "status": "inactive"
            },
            "sshd-keygen@ed25519.service": {
                "name": "sshd-keygen@ed25519.service",
                "source": "systemd",
                "state": "stopped",
                "status": "inactive"
            },
            "sshd-keygen@rsa.service": {
                "name": "sshd-keygen@rsa.service",
                "source": "systemd",
                "state": "stopped",
                "status": "inactive"
            },
            "sshd.service": {
                "name": "sshd.service",
                "source": "systemd",
                "state": "running",
                "status": "enabled"
            },
            "sshd@.service": {
                "name": "sshd@.service",
                "source": "systemd",
                "state": "unknown",
                "status": "static"
            },
            "sssd-autofs.service": {
                "name": "sssd-autofs.service",
                "source": "systemd",
                "state": "inactive",
                "status": "indirect"
            },
            "sssd-kcm.service": {
                "name": "sssd-kcm.service",
                "source": "systemd",
                "state": "stopped",
                "status": "indirect"
            },
            "sssd-nss.service": {
                "name": "sssd-nss.service",
                "source": "systemd",
                "state": "inactive",
                "status": "indirect"
            },
            "sssd-pac.service": {
                "name": "sssd-pac.service",
                "source": "systemd",
                "state": "inactive",
                "status": "indirect"
            },
            "sssd-pam.service": {
                "name": "sssd-pam.service",
                "source": "systemd",
                "state": "inactive",
                "status": "indirect"
            },
            "sssd-ssh.service": {
                "name": "sssd-ssh.service",
                "source": "systemd",
                "state": "inactive",
                "status": "indirect"
            },
            "sssd-sudo.service": {
                "name": "sssd-sudo.service",
                "source": "systemd",
                "state": "inactive",
                "status": "indirect"
            },
            "sssd.service": {
                "name": "sssd.service",
                "source": "systemd",
                "state": "stopped",
                "status": "enabled"
            },
            "syslog.service": {
                "name": "syslog.service",
                "source": "systemd",
                "state": "stopped",
                "status": "not-found"
            },
            "system-update-cleanup.service": {
                "name": "system-update-cleanup.service",
                "source": "systemd",
                "state": "inactive",
                "status": "static"
            },
            "systemd-ask-password-console.service": {
                "name": "systemd-ask-password-console.service",
                "source": "systemd",
                "state": "stopped",
                "status": "static"
            },
            "systemd-ask-password-wall.service": {
                "name": "systemd-ask-password-wall.service",
                "source": "systemd",
                "state": "stopped",
                "status": "static"
            },
            "systemd-backlight@.service": {
                "name": "systemd-backlight@.service",
                "source": "systemd",
                "state": "unknown",
                "status": "static"
            },
            "systemd-binfmt.service": {
                "name": "systemd-binfmt.service",
                "source": "systemd",
                "state": "stopped",
                "status": "static"
            },
            "systemd-boot-check-no-failures.service": {
                "name": "systemd-boot-check-no-failures.service",
                "source": "systemd",
                "state": "inactive",
                "status": "disabled"
            },
            "systemd-coredump@.service": {
                "name": "systemd-coredump@.service",
                "source": "systemd",
                "state": "unknown",
                "status": "static"
            },
            "systemd-exit.service": {
                "name": "systemd-exit.service",
                "source": "systemd",
                "state": "inactive",
                "status": "static"
            },
            "systemd-firstboot.service": {
                "name": "systemd-firstboot.service",
                "source": "systemd",
                "state": "stopped",
                "status": "static"
            },
            "systemd-fsck-root.service": {
                "name": "systemd-fsck-root.service",
                "source": "systemd",
                "state": "stopped",
                "status": "static"
            },
            "systemd-fsck@.service": {
                "name": "systemd-fsck@.service",
                "source": "systemd",
                "state": "unknown",
                "status": "static"
            },
            "systemd-fsck@dev-disk-by\\x2duuid-7B77\\x2d95E7.service": {
                "name": "systemd-fsck@dev-disk-by\\x2duuid-7B77\\x2d95E7.service",
                "source": "systemd",
                "state": "stopped",
                "status": "active"
            },
            "systemd-fsck@dev-vda2.service": {
                "name": "systemd-fsck@dev-vda2.service",
                "source": "systemd",
                "state": "stopped",
                "status": "inactive"
            },
            "systemd-fsck@dev-vdb1.service": {
                "name": "systemd-fsck@dev-vdb1.service",
                "source": "systemd",
                "state": "stopped",
                "status": "active"
            },
            "systemd-fsck@dev-vdc1.service": {
                "name": "systemd-fsck@dev-vdc1.service",
                "source": "systemd",
                "state": "stopped",
                "status": "active"
            },
            "systemd-halt.service": {
                "name": "systemd-halt.service",
                "source": "systemd",
                "state": "inactive",
                "status": "static"
            },
            "systemd-hibernate-resume@.service": {
                "name": "systemd-hibernate-resume@.service",
                "source": "systemd",
                "state": "unknown",
                "status": "static"
            },
            "systemd-hibernate.service": {
                "name": "systemd-hibernate.service",
                "source": "systemd",
                "state": "inactive",
                "status": "static"
            },
            "systemd-hostnamed.service": {
                "name": "systemd-hostnamed.service",
                "source": "systemd",
                "state": "running",
                "status": "static"
            },
            "systemd-hwdb-update.service": {
                "name": "systemd-hwdb-update.service",
                "source": "systemd",
                "state": "stopped",
                "status": "static"
            },
            "systemd-hybrid-sleep.service": {
                "name": "systemd-hybrid-sleep.service",
                "source": "systemd",
                "state": "inactive",
                "status": "static"
            },
            "systemd-initctl.service": {
                "name": "systemd-initctl.service",
                "source": "systemd",
                "state": "stopped",
                "status": "static"
            },
            "systemd-journal-catalog-update.service": {
                "name": "systemd-journal-catalog-update.service",
                "source": "systemd",
                "state": "stopped",
                "status": "static"
            },
            "systemd-journal-flush.service": {
                "name": "systemd-journal-flush.service",
                "source": "systemd",
                "state": "stopped",
                "status": "static"
            },
            "systemd-journald.service": {
                "name": "systemd-journald.service",
                "source": "systemd",
                "state": "running",
                "status": "static"
            },
            "systemd-journald@.service": {
                "name": "systemd-journald@.service",
                "source": "systemd",
                "state": "unknown",
                "status": "static"
            },
            "systemd-kexec.service": {
                "name": "systemd-kexec.service",
                "source": "systemd",
                "state": "inactive",
                "status": "static"
            },
            "systemd-localed.service": {
                "name": "systemd-localed.service",
                "source": "systemd",
                "state": "inactive",
                "status": "static"
            },
            "systemd-logind.service": {
                "name": "systemd-logind.service",
                "source": "systemd",
                "state": "running",
                "status": "static"
            },
            "systemd-machine-id-commit.service": {
                "name": "systemd-machine-id-commit.service",
                "source": "systemd",
                "state": "stopped",
                "status": "static"
            },
            "systemd-modules-load.service": {
                "name": "systemd-modules-load.service",
                "source": "systemd",
                "state": "stopped",
                "status": "static"
            },
            "systemd-network-generator.service": {
                "name": "systemd-network-generator.service",
                "source": "systemd",
                "state": "stopped",
                "status": "enabled"
            },
            "systemd-poweroff.service": {
                "name": "systemd-poweroff.service",
                "source": "systemd",
                "state": "inactive",
                "status": "static"
            },
            "systemd-pstore.service": {
                "name": "systemd-pstore.service",
                "source": "systemd",
                "state": "inactive",
                "status": "disabled"
            },
            "systemd-quotacheck.service": {
                "name": "systemd-quotacheck.service",
                "source": "systemd",
                "state": "stopped",
                "status": "static"
            },
            "systemd-random-seed.service": {
                "name": "systemd-random-seed.service",
                "source": "systemd",
                "state": "stopped",
                "status": "static"
            },
            "systemd-reboot.service": {
                "name": "systemd-reboot.service",
                "source": "systemd",
                "state": "inactive",
                "status": "static"
            },
            "systemd-remount-fs.service": {
                "name": "systemd-remount-fs.service",
                "source": "systemd",
                "state": "stopped",
                "status": "enabled-runtime"
            },
            "systemd-repart.service": {
                "name": "systemd-repart.service",
                "source": "systemd",
                "state": "stopped",
                "status": "static"
            },
            "systemd-rfkill.service": {
                "name": "systemd-rfkill.service",
                "source": "systemd",
                "state": "stopped",
                "status": "static"
            },
            "systemd-suspend-then-hibernate.service": {
                "name": "systemd-suspend-then-hibernate.service",
                "source": "systemd",
                "state": "inactive",
                "status": "static"
            },
            "systemd-suspend.service": {
                "name": "systemd-suspend.service",
                "source": "systemd",
                "state": "inactive",
                "status": "static"
            },
            "systemd-sysctl.service": {
                "name": "systemd-sysctl.service",
                "source": "systemd",
                "state": "stopped",
                "status": "static"
            },
            "systemd-sysext.service": {
                "name": "systemd-sysext.service",
                "source": "systemd",
                "state": "inactive",
                "status": "disabled"
            },
            "systemd-sysusers.service": {
                "name": "systemd-sysusers.service",
                "source": "systemd",
                "state": "stopped",
                "status": "static"
            },
            "systemd-timedated.service": {
                "name": "systemd-timedated.service",
                "source": "systemd",
                "state": "inactive",
                "status": "static"
            },
            "systemd-timesyncd.service": {
                "name": "systemd-timesyncd.service",
                "source": "systemd",
                "state": "stopped",
                "status": "not-found"
            },
            "systemd-tmpfiles-clean.service": {
                "name": "systemd-tmpfiles-clean.service",
                "source": "systemd",
                "state": "stopped",
                "status": "static"
            },
            "systemd-tmpfiles-setup-dev.service": {
                "name": "systemd-tmpfiles-setup-dev.service",
                "source": "systemd",
                "state": "stopped",
                "status": "static"
            },
            "systemd-tmpfiles-setup.service": {
                "name": "systemd-tmpfiles-setup.service",
                "source": "systemd",
                "state": "stopped",
                "status": "static"
            },
            "systemd-udev-settle.service": {
                "name": "systemd-udev-settle.service",
                "source": "systemd",
                "state": "stopped",
                "status": "static"
            },
            "systemd-udev-trigger.service": {
                "name": "systemd-udev-trigger.service",
                "source": "systemd",
                "state": "stopped",
                "status": "static"
            },
            "systemd-udevd.service": {
                "name": "systemd-udevd.service",
                "source": "systemd",
                "state": "running",
                "status": "static"
            },
            "systemd-update-done.service": {
                "name": "systemd-update-done.service",
                "source": "systemd",
                "state": "stopped",
                "status": "static"
            },
            "systemd-update-utmp-runlevel.service": {
                "name": "systemd-update-utmp-runlevel.service",
                "source": "systemd",
                "state": "stopped",
                "status": "static"
            },
            "systemd-update-utmp.service": {
                "name": "systemd-update-utmp.service",
                "source": "systemd",
                "state": "stopped",
                "status": "static"
            },
            "systemd-user-sessions.service": {
                "name": "systemd-user-sessions.service",
                "source": "systemd",
                "state": "stopped",
                "status": "static"
            },
            "systemd-vconsole-setup.service": {
                "name": "systemd-vconsole-setup.service",
                "source": "systemd",
                "state": "stopped",
                "status": "static"
            },
            "systemd-volatile-root.service": {
                "name": "systemd-volatile-root.service",
                "source": "systemd",
                "state": "inactive",
                "status": "static"
            },
            "teamd@.service": {
                "name": "teamd@.service",
                "source": "systemd",
                "state": "unknown",
                "status": "static"
            },
            "tuned.service": {
                "name": "tuned.service",
                "source": "systemd",
                "state": "running",
                "status": "enabled"
            },
            "user-runtime-dir@.service": {
                "name": "user-runtime-dir@.service",
                "source": "systemd",
                "state": "unknown",
                "status": "static"
            },
            "user-runtime-dir@0.service": {
                "name": "user-runtime-dir@0.service",
                "source": "systemd",
                "state": "stopped",
                "status": "active"
            },
            "user@.service": {
                "name": "user@.service",
                "source": "systemd",
                "state": "unknown",
                "status": "static"
            },
            "user@0.service": {
                "name": "user@0.service",
                "source": "systemd",
                "state": "running",
                "status": "active"
            },
            "ypbind.service": {
                "name": "ypbind.service",
                "source": "systemd",
                "state": "stopped",
                "status": "not-found"
            },
            "yppasswdd.service": {
                "name": "yppasswdd.service",
                "source": "systemd",
                "state": "stopped",
                "status": "not-found"
            },
            "ypserv.service": {
                "name": "ypserv.service",
                "source": "systemd",
                "state": "stopped",
                "status": "not-found"
            },
            "ypxfrd.service": {
                "name": "ypxfrd.service",
                "source": "systemd",
                "state": "stopped",
                "status": "not-found"
            }
        }
    },
    "changed": false
}

TASK [fedora.linux_system_roles.storage : Set storage_cryptsetup_services] *****
task path: /tmp/tmpi19f9hzy/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:53
Thursday 21 July 2022  10:28:44 +0000 (0:00:01.799)       0:00:07.102 ********* 
ok: [/cache/rhel-9.qcow2.snap] => {
    "ansible_facts": {
        "storage_cryptsetup_services": []
    },
    "changed": false
}
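
The empty storage_cryptsetup_services list above follows from the service facts gathered earlier: no systemd-cryptsetup@*.service units appear in that output, so there is nothing for the role to mask. A minimal sketch of how such a list could be derived (the role's exact expression is not shown in this log, so the filter below is illustrative only):

    - name: Set storage_cryptsetup_services (illustrative sketch only)
      set_fact:
        storage_cryptsetup_services: "{{ ansible_facts.services | select('match', 'systemd-cryptsetup@') | list }}"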

TASK [fedora.linux_system_roles.storage : Mask the systemd cryptsetup services] ***
task path: /tmp/tmpi19f9hzy/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:58
Thursday 21 July 2022  10:28:44 +0000 (0:00:00.095)       0:00:07.198 ********* 

TASK [fedora.linux_system_roles.storage : manage the pools and volumes to match the specified state] ***
task path: /tmp/tmpi19f9hzy/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:64
Thursday 21 July 2022  10:28:44 +0000 (0:00:00.053)       0:00:07.252 ********* 
ok: [/cache/rhel-9.qcow2.snap] => {
    "actions": [],
    "changed": false,
    "crypts": [],
    "leaves": [],
    "mounts": [],
    "packages": [],
    "pools": [],
    "volumes": []
}

TASK [fedora.linux_system_roles.storage : Workaround for udev issue on some platforms] ***
task path: /tmp/tmpi19f9hzy/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:78
Thursday 21 July 2022  10:28:45 +0000 (0:00:00.516)       0:00:07.768 ********* 
skipping: [/cache/rhel-9.qcow2.snap] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [fedora.linux_system_roles.storage : Unmask the systemd cryptsetup services] ***
task path: /tmp/tmpi19f9hzy/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:90
Thursday 21 July 2022  10:28:45 +0000 (0:00:00.036)       0:00:07.804 ********* 

TASK [fedora.linux_system_roles.storage : show blivet_output] ******************
task path: /tmp/tmpi19f9hzy/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:96
Thursday 21 July 2022  10:28:45 +0000 (0:00:00.021)       0:00:07.826 ********* 
ok: [/cache/rhel-9.qcow2.snap] => {
    "blivet_output": {
        "actions": [],
        "changed": false,
        "crypts": [],
        "failed": false,
        "leaves": [],
        "mounts": [],
        "packages": [],
        "pools": [],
        "volumes": []
    }
}

TASK [fedora.linux_system_roles.storage : set the list of pools for test verification] ***
task path: /tmp/tmpi19f9hzy/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:101
Thursday 21 July 2022  10:28:45 +0000 (0:00:00.036)       0:00:07.862 ********* 
ok: [/cache/rhel-9.qcow2.snap] => {
    "ansible_facts": {
        "_storage_pools_list": []
    },
    "changed": false
}

TASK [fedora.linux_system_roles.storage : set the list of volumes for test verification] ***
task path: /tmp/tmpi19f9hzy/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:105
Thursday 21 July 2022  10:28:45 +0000 (0:00:00.039)       0:00:07.902 ********* 
ok: [/cache/rhel-9.qcow2.snap] => {
    "ansible_facts": {
        "_storage_volumes_list": []
    },
    "changed": false
}

TASK [fedora.linux_system_roles.storage : remove obsolete mounts] **************
task path: /tmp/tmpi19f9hzy/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:121
Thursday 21 July 2022  10:28:45 +0000 (0:00:00.045)       0:00:07.947 ********* 

TASK [fedora.linux_system_roles.storage : tell systemd to refresh its view of /etc/fstab] ***
task path: /tmp/tmpi19f9hzy/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:132
Thursday 21 July 2022  10:28:45 +0000 (0:00:00.033)       0:00:07.981 ********* 
skipping: [/cache/rhel-9.qcow2.snap] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [fedora.linux_system_roles.storage : set up new/current mounts] ***********
task path: /tmp/tmpi19f9hzy/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:137
Thursday 21 July 2022  10:28:45 +0000 (0:00:00.022)       0:00:08.003 ********* 

TASK [fedora.linux_system_roles.storage : tell systemd to refresh its view of /etc/fstab] ***
task path: /tmp/tmpi19f9hzy/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:148
Thursday 21 July 2022  10:28:45 +0000 (0:00:00.032)       0:00:08.036 ********* 
skipping: [/cache/rhel-9.qcow2.snap] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [fedora.linux_system_roles.storage : retrieve facts for the /etc/crypttab file] ***
task path: /tmp/tmpi19f9hzy/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:156
Thursday 21 July 2022  10:28:45 +0000 (0:00:00.024)       0:00:08.060 ********* 
ok: [/cache/rhel-9.qcow2.snap] => {
    "changed": false,
    "stat": {
        "exists": false
    }
}

TASK [fedora.linux_system_roles.storage : manage /etc/crypttab to account for changes we just made] ***
task path: /tmp/tmpi19f9hzy/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:161
Thursday 21 July 2022  10:28:46 +0000 (0:00:00.524)       0:00:08.585 ********* 

TASK [fedora.linux_system_roles.storage : Update facts] ************************
task path: /tmp/tmpi19f9hzy/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:183
Thursday 21 July 2022  10:28:46 +0000 (0:00:00.023)       0:00:08.608 ********* 
ok: [/cache/rhel-9.qcow2.snap]
META: role_complete for /cache/rhel-9.qcow2.snap

TASK [Mark tasks to be skipped] ************************************************
task path: /tmp/tmpfc26zqih/tests/storage/tests_change_disk_mount.yml:13
Thursday 21 July 2022  10:28:47 +0000 (0:00:00.987)       0:00:09.596 ********* 
ok: [/cache/rhel-9.qcow2.snap] => {
    "ansible_facts": {
        "storage_skip_checks": [
            "blivet_available",
            "packages_installed",
            "service_facts"
        ]
    },
    "changed": false
}
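
With storage_skip_checks set, later runs of the role in this play skip the blivet availability check, the package installation, and the service-facts gathering (visible as skipped tasks further below). A hedged sketch of roughly how such a gate looks in a task (the role's exact conditional may differ):

    - name: get service facts
      service_facts:
      when: "'service_facts' not in (storage_skip_checks | default([]))"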

TASK [include_tasks] ***********************************************************
task path: /tmp/tmpfc26zqih/tests/storage/tests_change_disk_mount.yml:20
Thursday 21 July 2022  10:28:47 +0000 (0:00:00.031)       0:00:09.628 ********* 
included: /tmp/tmpfc26zqih/tests/storage/get_unused_disk.yml for /cache/rhel-9.qcow2.snap

TASK [Find unused disks in the system] *****************************************
task path: /tmp/tmpfc26zqih/tests/storage/get_unused_disk.yml:2
Thursday 21 July 2022  10:28:47 +0000 (0:00:00.035)       0:00:09.664 ********* 
ok: [/cache/rhel-9.qcow2.snap] => {
    "changed": false,
    "disks": [
        "sda"
    ]
}

TASK [Set unused_disks if necessary] *******************************************
task path: /tmp/tmpfc26zqih/tests/storage/get_unused_disk.yml:9
Thursday 21 July 2022  10:28:47 +0000 (0:00:00.504)       0:00:10.169 ********* 
ok: [/cache/rhel-9.qcow2.snap] => {
    "ansible_facts": {
        "unused_disks": [
            "sda"
        ]
    },
    "changed": false
}

TASK [Exit playbook when there's not enough unused disks in the system] ********
task path: /tmp/tmpfc26zqih/tests/storage/get_unused_disk.yml:14
Thursday 21 July 2022  10:28:47 +0000 (0:00:00.069)       0:00:10.238 ********* 
skipping: [/cache/rhel-9.qcow2.snap] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [Print unused disks] ******************************************************
task path: /tmp/tmpfc26zqih/tests/storage/get_unused_disk.yml:19
Thursday 21 July 2022  10:28:47 +0000 (0:00:00.039)       0:00:10.277 ********* 
ok: [/cache/rhel-9.qcow2.snap] => {
    "unused_disks": [
        "sda"
    ]
}

TASK [Create a disk device mounted at "/opt/test1"] ****************************
task path: /tmp/tmpfc26zqih/tests/storage/tests_change_disk_mount.yml:25
Thursday 21 July 2022  10:28:47 +0000 (0:00:00.035)       0:00:10.313 ********* 

TASK [fedora.linux_system_roles.storage : set platform/version specific variables] ***
task path: /tmp/tmpi19f9hzy/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:2
Thursday 21 July 2022  10:28:47 +0000 (0:00:00.071)       0:00:10.384 ********* 
included: /tmp/tmpi19f9hzy/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml for /cache/rhel-9.qcow2.snap

TASK [fedora.linux_system_roles.storage : Ensure ansible_facts used by role] ***
task path: /tmp/tmpi19f9hzy/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:2
Thursday 21 July 2022  10:28:47 +0000 (0:00:00.035)       0:00:10.420 ********* 
ok: [/cache/rhel-9.qcow2.snap]

TASK [fedora.linux_system_roles.storage : Set platform/version specific variables] ***
task path: /tmp/tmpi19f9hzy/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:7
Thursday 21 July 2022  10:28:48 +0000 (0:00:00.524)       0:00:10.944 ********* 
skipping: [/cache/rhel-9.qcow2.snap] => (item=RedHat.yml)  => {
    "ansible_loop_var": "item",
    "changed": false,
    "item": "RedHat.yml",
    "skip_reason": "Conditional result was False"
}
skipping: [/cache/rhel-9.qcow2.snap] => (item=RedHat.yml)  => {
    "ansible_loop_var": "item",
    "changed": false,
    "item": "RedHat.yml",
    "skip_reason": "Conditional result was False"
}
ok: [/cache/rhel-9.qcow2.snap] => (item=RedHat_9.yml) => {
    "ansible_facts": {
        "blivet_package_list": [
            "python3-blivet",
            "libblockdev-crypto",
            "libblockdev-dm",
            "libblockdev-lvm",
            "libblockdev-mdraid",
            "libblockdev-swap",
            "vdo",
            "kmod-kvdo",
            "xfsprogs"
        ]
    },
    "ansible_included_var_files": [
        "/tmp/tmpi19f9hzy/ansible_collections/fedora/linux_system_roles/roles/storage/vars/RedHat_9.yml"
    ],
    "ansible_loop_var": "item",
    "changed": false,
    "item": "RedHat_9.yml"
}
skipping: [/cache/rhel-9.qcow2.snap] => (item=RedHat_9.0.yml)  => {
    "ansible_loop_var": "item",
    "changed": false,
    "item": "RedHat_9.0.yml",
    "skip_reason": "Conditional result was False"
}

TASK [fedora.linux_system_roles.storage : define an empty list of pools to be used in testing] ***
task path: /tmp/tmpi19f9hzy/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:5
Thursday 21 July 2022  10:28:48 +0000 (0:00:00.060)       0:00:11.005 ********* 
ok: [/cache/rhel-9.qcow2.snap] => {
    "ansible_facts": {
        "_storage_pools_list": []
    },
    "changed": false
}

TASK [fedora.linux_system_roles.storage : define an empty list of volumes to be used in testing] ***
task path: /tmp/tmpi19f9hzy/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:9
Thursday 21 July 2022  10:28:48 +0000 (0:00:00.035)       0:00:11.040 ********* 
ok: [/cache/rhel-9.qcow2.snap] => {
    "ansible_facts": {
        "_storage_volumes_list": []
    },
    "changed": false
}

TASK [fedora.linux_system_roles.storage : include the appropriate provider tasks] ***
task path: /tmp/tmpi19f9hzy/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:13
Thursday 21 July 2022  10:28:48 +0000 (0:00:00.033)       0:00:11.074 ********* 
redirecting (type: modules) ansible.builtin.mount to ansible.posix.mount
redirecting (type: modules) ansible.builtin.mount to ansible.posix.mount
included: /tmp/tmpi19f9hzy/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml for /cache/rhel-9.qcow2.snap

TASK [fedora.linux_system_roles.storage : get a list of rpm packages installed on host machine] ***
task path: /tmp/tmpi19f9hzy/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:2
Thursday 21 July 2022  10:28:48 +0000 (0:00:00.053)       0:00:11.128 ********* 
skipping: [/cache/rhel-9.qcow2.snap] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [fedora.linux_system_roles.storage : make sure blivet is available] *******
task path: /tmp/tmpi19f9hzy/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:7
Thursday 21 July 2022  10:28:48 +0000 (0:00:00.020)       0:00:11.148 ********* 
skipping: [/cache/rhel-9.qcow2.snap] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [fedora.linux_system_roles.storage : show storage_pools] ******************
task path: /tmp/tmpi19f9hzy/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:13
Thursday 21 July 2022  10:28:48 +0000 (0:00:00.027)       0:00:11.176 ********* 
ok: [/cache/rhel-9.qcow2.snap] => {
    "storage_pools": "VARIABLE IS NOT DEFINED!: 'storage_pools' is undefined"
}

TASK [fedora.linux_system_roles.storage : show storage_volumes] ****************
task path: /tmp/tmpi19f9hzy/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:18
Thursday 21 July 2022  10:28:48 +0000 (0:00:00.041)       0:00:11.218 ********* 
ok: [/cache/rhel-9.qcow2.snap] => {
    "storage_volumes": [
        {
            "disks": [
                "sda"
            ],
            "mount_point": "/opt/test1",
            "name": "test1",
            "type": "disk"
        }
    ]
}
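
The storage_volumes value echoed above matches what the test passes to the role for this step. A minimal sketch of the invocation, reconstructed from the values in this log (the actual wording of tests_change_disk_mount.yml may differ slightly):

    - name: Create a disk device mounted at "/opt/test1"
      include_role:
        name: fedora.linux_system_roles.storage
      vars:
        storage_volumes:
          - name: test1
            type: disk
            disks: "{{ unused_disks }}"
            mount_point: /opt/test1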

TASK [fedora.linux_system_roles.storage : get required packages] ***************
task path: /tmp/tmpi19f9hzy/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:23
Thursday 21 July 2022  10:28:48 +0000 (0:00:00.038)       0:00:11.256 ********* 
skipping: [/cache/rhel-9.qcow2.snap] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [fedora.linux_system_roles.storage : enable copr repositories if needed] ***
task path: /tmp/tmpi19f9hzy/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:35
Thursday 21 July 2022  10:28:48 +0000 (0:00:00.029)       0:00:11.286 ********* 
skipping: [/cache/rhel-9.qcow2.snap] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [fedora.linux_system_roles.storage : make sure required packages are installed] ***
task path: /tmp/tmpi19f9hzy/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:41
Thursday 21 July 2022  10:28:48 +0000 (0:00:00.027)       0:00:11.313 ********* 
skipping: [/cache/rhel-9.qcow2.snap] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [fedora.linux_system_roles.storage : get service facts] *******************
task path: /tmp/tmpi19f9hzy/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:47
Thursday 21 July 2022  10:28:48 +0000 (0:00:00.028)       0:00:11.342 ********* 
skipping: [/cache/rhel-9.qcow2.snap] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [fedora.linux_system_roles.storage : Set storage_cryptsetup_services] *****
task path: /tmp/tmpi19f9hzy/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:53
Thursday 21 July 2022  10:28:48 +0000 (0:00:00.027)       0:00:11.369 ********* 
ok: [/cache/rhel-9.qcow2.snap] => {
    "ansible_facts": {
        "storage_cryptsetup_services": []
    },
    "changed": false
}

TASK [fedora.linux_system_roles.storage : Mask the systemd cryptsetup services] ***
task path: /tmp/tmpi19f9hzy/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:58
Thursday 21 July 2022  10:28:48 +0000 (0:00:00.061)       0:00:11.431 ********* 

TASK [fedora.linux_system_roles.storage : manage the pools and volumes to match the specified state] ***
task path: /tmp/tmpi19f9hzy/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:64
Thursday 21 July 2022  10:28:48 +0000 (0:00:00.021)       0:00:11.452 ********* 
changed: [/cache/rhel-9.qcow2.snap] => {
    "actions": [
        {
            "action": "create format",
            "device": "/dev/sda",
            "fs_type": "xfs"
        }
    ],
    "changed": true,
    "crypts": [],
    "leaves": [
        "/dev/sr0",
        "/dev/vda1",
        "/dev/vda2",
        "/dev/vda3",
        "/dev/vda4",
        "/dev/sda",
        "/dev/sdb",
        "/dev/sdc",
        "/dev/nvme0n1",
        "/dev/nvme1n1",
        "/dev/nvme2n1",
        "/dev/vdb1",
        "/dev/vdc1",
        "/dev/vdd",
        "/dev/vde",
        "/dev/vdf"
    ],
    "mounts": [
        {
            "dump": 0,
            "fstype": "xfs",
            "opts": "defaults",
            "passno": 0,
            "path": "/opt/test1",
            "src": "UUID=dc31169e-edd9-4a64-ab59-3286f5673c34",
            "state": "mounted"
        }
    ],
    "packages": [
        "xfsprogs",
        "dosfstools",
        "e2fsprogs"
    ],
    "pools": [],
    "volumes": [
        {
            "_device": "/dev/sda",
            "_kernel_device": "/dev/sda",
            "_mount_id": "UUID=dc31169e-edd9-4a64-ab59-3286f5673c34",
            "_raw_device": "/dev/sda",
            "_raw_kernel_device": "/dev/sda",
            "cache_devices": [],
            "cache_mode": null,
            "cache_size": 0,
            "cached": false,
            "compression": null,
            "deduplication": null,
            "disks": [
                "sda"
            ],
            "encryption": false,
            "encryption_cipher": null,
            "encryption_key": null,
            "encryption_key_size": null,
            "encryption_luks_version": null,
            "encryption_password": null,
            "fs_create_options": "",
            "fs_label": "",
            "fs_overwrite_existing": true,
            "fs_type": "xfs",
            "mount_check": 0,
            "mount_device_identifier": "uuid",
            "mount_options": "defaults",
            "mount_passno": 0,
            "mount_point": "/opt/test1",
            "name": "test1",
            "raid_chunk_size": null,
            "raid_device_count": null,
            "raid_level": null,
            "raid_metadata_version": null,
            "raid_spare_count": null,
            "size": 10737418240,
            "state": "present",
            "thin": null,
            "thin_pool_name": null,
            "thin_pool_size": null,
            "type": "disk",
            "vdo_pool_size": null
        }
    ]
}

TASK [fedora.linux_system_roles.storage : Workaround for udev issue on some platforms] ***
task path: /tmp/tmpi19f9hzy/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:78
Thursday 21 July 2022  10:28:50 +0000 (0:00:01.647)       0:00:13.099 ********* 
skipping: [/cache/rhel-9.qcow2.snap] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [fedora.linux_system_roles.storage : Unmask the systemd cryptsetup services] ***
task path: /tmp/tmpi19f9hzy/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:90
Thursday 21 July 2022  10:28:50 +0000 (0:00:00.039)       0:00:13.138 ********* 

TASK [fedora.linux_system_roles.storage : show blivet_output] ******************
task path: /tmp/tmpi19f9hzy/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:96
Thursday 21 July 2022  10:28:50 +0000 (0:00:00.023)       0:00:13.163 ********* 
ok: [/cache/rhel-9.qcow2.snap] => {
    "blivet_output": {
        "actions": [
            {
                "action": "create format",
                "device": "/dev/sda",
                "fs_type": "xfs"
            }
        ],
        "changed": true,
        "crypts": [],
        "failed": false,
        "leaves": [
            "/dev/sr0",
            "/dev/vda1",
            "/dev/vda2",
            "/dev/vda3",
            "/dev/vda4",
            "/dev/sda",
            "/dev/sdb",
            "/dev/sdc",
            "/dev/nvme0n1",
            "/dev/nvme1n1",
            "/dev/nvme2n1",
            "/dev/vdb1",
            "/dev/vdc1",
            "/dev/vdd",
            "/dev/vde",
            "/dev/vdf"
        ],
        "mounts": [
            {
                "dump": 0,
                "fstype": "xfs",
                "opts": "defaults",
                "passno": 0,
                "path": "/opt/test1",
                "src": "UUID=dc31169e-edd9-4a64-ab59-3286f5673c34",
                "state": "mounted"
            }
        ],
        "packages": [
            "xfsprogs",
            "dosfstools",
            "e2fsprogs"
        ],
        "pools": [],
        "volumes": [
            {
                "_device": "/dev/sda",
                "_kernel_device": "/dev/sda",
                "_mount_id": "UUID=dc31169e-edd9-4a64-ab59-3286f5673c34",
                "_raw_device": "/dev/sda",
                "_raw_kernel_device": "/dev/sda",
                "cache_devices": [],
                "cache_mode": null,
                "cache_size": 0,
                "cached": false,
                "compression": null,
                "deduplication": null,
                "disks": [
                    "sda"
                ],
                "encryption": false,
                "encryption_cipher": null,
                "encryption_key": null,
                "encryption_key_size": null,
                "encryption_luks_version": null,
                "encryption_password": null,
                "fs_create_options": "",
                "fs_label": "",
                "fs_overwrite_existing": true,
                "fs_type": "xfs",
                "mount_check": 0,
                "mount_device_identifier": "uuid",
                "mount_options": "defaults",
                "mount_passno": 0,
                "mount_point": "/opt/test1",
                "name": "test1",
                "raid_chunk_size": null,
                "raid_device_count": null,
                "raid_level": null,
                "raid_metadata_version": null,
                "raid_spare_count": null,
                "size": 10737418240,
                "state": "present",
                "thin": null,
                "thin_pool_name": null,
                "thin_pool_size": null,
                "type": "disk",
                "vdo_pool_size": null
            }
        ]
    }
}
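
blivet_output above records a single "create format" action: the whole disk /dev/sda received an xfs filesystem (no partition table or pool is involved for a volume of type "disk"), and the mounts list tells the following tasks what to put in /etc/fstab. Purely as an illustration of consuming this registered result (this check is not part of the role or the test):

    - name: "Example only: confirm the xfs format was created"
      assert:
        that:
          - blivet_output.changed
          - blivet_output.actions | selectattr('action', 'equalto', 'create format') | list | length == 1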

TASK [fedora.linux_system_roles.storage : set the list of pools for test verification] ***
task path: /tmp/tmpi19f9hzy/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:101
Thursday 21 July 2022  10:28:50 +0000 (0:00:00.068)       0:00:13.231 ********* 
ok: [/cache/rhel-9.qcow2.snap] => {
    "ansible_facts": {
        "_storage_pools_list": []
    },
    "changed": false
}

TASK [fedora.linux_system_roles.storage : set the list of volumes for test verification] ***
task path: /tmp/tmpi19f9hzy/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:105
Thursday 21 July 2022  10:28:50 +0000 (0:00:00.067)       0:00:13.298 ********* 
ok: [/cache/rhel-9.qcow2.snap] => {
    "ansible_facts": {
        "_storage_volumes_list": [
            {
                "_device": "/dev/sda",
                "_kernel_device": "/dev/sda",
                "_mount_id": "UUID=dc31169e-edd9-4a64-ab59-3286f5673c34",
                "_raw_device": "/dev/sda",
                "_raw_kernel_device": "/dev/sda",
                "cache_devices": [],
                "cache_mode": null,
                "cache_size": 0,
                "cached": false,
                "compression": null,
                "deduplication": null,
                "disks": [
                    "sda"
                ],
                "encryption": false,
                "encryption_cipher": null,
                "encryption_key": null,
                "encryption_key_size": null,
                "encryption_luks_version": null,
                "encryption_password": null,
                "fs_create_options": "",
                "fs_label": "",
                "fs_overwrite_existing": true,
                "fs_type": "xfs",
                "mount_check": 0,
                "mount_device_identifier": "uuid",
                "mount_options": "defaults",
                "mount_passno": 0,
                "mount_point": "/opt/test1",
                "name": "test1",
                "raid_chunk_size": null,
                "raid_device_count": null,
                "raid_level": null,
                "raid_metadata_version": null,
                "raid_spare_count": null,
                "size": 10737418240,
                "state": "present",
                "thin": null,
                "thin_pool_name": null,
                "thin_pool_size": null,
                "type": "disk",
                "vdo_pool_size": null
            }
        ]
    },
    "changed": false
}

TASK [fedora.linux_system_roles.storage : remove obsolete mounts] **************
task path: /tmp/tmpi19f9hzy/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:121
Thursday 21 July 2022  10:28:50 +0000 (0:00:00.039)       0:00:13.338 ********* 

TASK [fedora.linux_system_roles.storage : tell systemd to refresh its view of /etc/fstab] ***
task path: /tmp/tmpi19f9hzy/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:132
Thursday 21 July 2022  10:28:50 +0000 (0:00:00.039)       0:00:13.378 ********* 
ok: [/cache/rhel-9.qcow2.snap] => {
    "changed": false,
    "name": null,
    "status": {}
}

TASK [fedora.linux_system_roles.storage : set up new/current mounts] ***********
task path: /tmp/tmpi19f9hzy/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:137
Thursday 21 July 2022  10:28:51 +0000 (0:00:00.917)       0:00:14.295 ********* 
redirecting (type: modules) ansible.builtin.mount to ansible.posix.mount
changed: [/cache/rhel-9.qcow2.snap] => (item={'src': 'UUID=dc31169e-edd9-4a64-ab59-3286f5673c34', 'path': '/opt/test1', 'fstype': 'xfs', 'opts': 'defaults', 'dump': 0, 'passno': 0, 'state': 'mounted'}) => {
    "ansible_loop_var": "mount_info",
    "backup_file": "",
    "boot": "yes",
    "changed": true,
    "dump": "0",
    "fstab": "/etc/fstab",
    "fstype": "xfs",
    "mount_info": {
        "dump": 0,
        "fstype": "xfs",
        "opts": "defaults",
        "passno": 0,
        "path": "/opt/test1",
        "src": "UUID=dc31169e-edd9-4a64-ab59-3286f5673c34",
        "state": "mounted"
    },
    "name": "/opt/test1",
    "opts": "defaults",
    "passno": "0",
    "src": "UUID=dc31169e-edd9-4a64-ab59-3286f5673c34"
}
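
The role feeds each entry of blivet_output.mounts to ansible.posix.mount (note the module redirection logged above). A standalone task equivalent to the loop item shown here would look roughly like this:

    - name: Mount the new filesystem at /opt/test1
      ansible.posix.mount:
        src: UUID=dc31169e-edd9-4a64-ab59-3286f5673c34
        path: /opt/test1
        fstype: xfs
        opts: defaults
        state: mounted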

TASK [fedora.linux_system_roles.storage : tell systemd to refresh its view of /etc/fstab] ***
task path: /tmp/tmpi19f9hzy/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:148
Thursday 21 July 2022  10:28:52 +0000 (0:00:00.543)       0:00:14.839 ********* 
ok: [/cache/rhel-9.qcow2.snap] => {
    "changed": false,
    "name": null,
    "status": {}
}

TASK [fedora.linux_system_roles.storage : retrieve facts for the /etc/crypttab file] ***
task path: /tmp/tmpi19f9hzy/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:156
Thursday 21 July 2022  10:28:52 +0000 (0:00:00.679)       0:00:15.519 ********* 
ok: [/cache/rhel-9.qcow2.snap] => {
    "changed": false,
    "stat": {
        "exists": false
    }
}

TASK [fedora.linux_system_roles.storage : manage /etc/crypttab to account for changes we just made] ***
task path: /tmp/tmpi19f9hzy/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:161
Thursday 21 July 2022  10:28:53 +0000 (0:00:00.371)       0:00:15.890 ********* 

TASK [fedora.linux_system_roles.storage : Update facts] ************************
task path: /tmp/tmpi19f9hzy/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:183
Thursday 21 July 2022  10:28:53 +0000 (0:00:00.024)       0:00:15.915 ********* 
ok: [/cache/rhel-9.qcow2.snap]
META: role_complete for /cache/rhel-9.qcow2.snap

TASK [include_tasks] ***********************************************************
task path: /tmp/tmpfc26zqih/tests/storage/tests_change_disk_mount.yml:35
Thursday 21 July 2022  10:28:54 +0000 (0:00:00.983)       0:00:16.899 ********* 
included: /tmp/tmpfc26zqih/tests/storage/verify-role-results.yml for /cache/rhel-9.qcow2.snap

TASK [Print out pool information] **********************************************
task path: /tmp/tmpfc26zqih/tests/storage/verify-role-results.yml:1
Thursday 21 July 2022  10:28:54 +0000 (0:00:00.042)       0:00:16.941 ********* 
skipping: [/cache/rhel-9.qcow2.snap] => {}

TASK [Print out volume information] ********************************************
task path: /tmp/tmpfc26zqih/tests/storage/verify-role-results.yml:6
Thursday 21 July 2022  10:28:54 +0000 (0:00:00.042)       0:00:16.983 ********* 
ok: [/cache/rhel-9.qcow2.snap] => {
    "_storage_volumes_list": [
        {
            "_device": "/dev/sda",
            "_kernel_device": "/dev/sda",
            "_mount_id": "UUID=dc31169e-edd9-4a64-ab59-3286f5673c34",
            "_raw_device": "/dev/sda",
            "_raw_kernel_device": "/dev/sda",
            "cache_devices": [],
            "cache_mode": null,
            "cache_size": 0,
            "cached": false,
            "compression": null,
            "deduplication": null,
            "disks": [
                "sda"
            ],
            "encryption": false,
            "encryption_cipher": null,
            "encryption_key": null,
            "encryption_key_size": null,
            "encryption_luks_version": null,
            "encryption_password": null,
            "fs_create_options": "",
            "fs_label": "",
            "fs_overwrite_existing": true,
            "fs_type": "xfs",
            "mount_check": 0,
            "mount_device_identifier": "uuid",
            "mount_options": "defaults",
            "mount_passno": 0,
            "mount_point": "/opt/test1",
            "name": "test1",
            "raid_chunk_size": null,
            "raid_device_count": null,
            "raid_level": null,
            "raid_metadata_version": null,
            "raid_spare_count": null,
            "size": 10737418240,
            "state": "present",
            "thin": null,
            "thin_pool_name": null,
            "thin_pool_size": null,
            "type": "disk",
            "vdo_pool_size": null
        }
    ]
}

TASK [Collect info about the volumes.] *****************************************
task path: /tmp/tmpfc26zqih/tests/storage/verify-role-results.yml:14
Thursday 21 July 2022  10:28:54 +0000 (0:00:00.053)       0:00:17.037 ********* 
ok: [/cache/rhel-9.qcow2.snap] => {
    "changed": false,
    "info": {
        "/dev/nvme0n1": {
            "fstype": "",
            "label": "",
            "name": "/dev/nvme0n1",
            "size": "10G",
            "type": "disk",
            "uuid": ""
        },
        "/dev/nvme1n1": {
            "fstype": "",
            "label": "",
            "name": "/dev/nvme1n1",
            "size": "10G",
            "type": "disk",
            "uuid": ""
        },
        "/dev/nvme2n1": {
            "fstype": "",
            "label": "",
            "name": "/dev/nvme2n1",
            "size": "10G",
            "type": "disk",
            "uuid": ""
        },
        "/dev/sda": {
            "fstype": "xfs",
            "label": "",
            "name": "/dev/sda",
            "size": "10G",
            "type": "disk",
            "uuid": "dc31169e-edd9-4a64-ab59-3286f5673c34"
        },
        "/dev/sdb": {
            "fstype": "",
            "label": "",
            "name": "/dev/sdb",
            "size": "10G",
            "type": "disk",
            "uuid": ""
        },
        "/dev/sdc": {
            "fstype": "",
            "label": "",
            "name": "/dev/sdc",
            "size": "10G",
            "type": "disk",
            "uuid": ""
        },
        "/dev/sr0": {
            "fstype": "iso9660",
            "label": "cidata",
            "name": "/dev/sr0",
            "size": "364K",
            "type": "rom",
            "uuid": "2022-07-21-10-28-24-00"
        },
        "/dev/vda": {
            "fstype": "",
            "label": "",
            "name": "/dev/vda",
            "size": "10G",
            "type": "disk",
            "uuid": ""
        },
        "/dev/vda1": {
            "fstype": "",
            "label": "",
            "name": "/dev/vda1",
            "size": "1M",
            "type": "partition",
            "uuid": ""
        },
        "/dev/vda2": {
            "fstype": "vfat",
            "label": "",
            "name": "/dev/vda2",
            "size": "200M",
            "type": "partition",
            "uuid": "7B77-95E7"
        },
        "/dev/vda3": {
            "fstype": "xfs",
            "label": "boot",
            "name": "/dev/vda3",
            "size": "500M",
            "type": "partition",
            "uuid": "6e74e171-0370-451f-8340-f16ad2839183"
        },
        "/dev/vda4": {
            "fstype": "xfs",
            "label": "root",
            "name": "/dev/vda4",
            "size": "9.3G",
            "type": "partition",
            "uuid": "1bb53e4d-984c-4316-908a-59b5a62fa30e"
        },
        "/dev/vdb": {
            "fstype": "",
            "label": "",
            "name": "/dev/vdb",
            "size": "2G",
            "type": "disk",
            "uuid": ""
        },
        "/dev/vdb1": {
            "fstype": "ext4",
            "label": "yumcache",
            "name": "/dev/vdb1",
            "size": "2G",
            "type": "partition",
            "uuid": "12be66f7-7f03-4d54-9a48-a520825d8fa9"
        },
        "/dev/vdc": {
            "fstype": "",
            "label": "",
            "name": "/dev/vdc",
            "size": "2G",
            "type": "disk",
            "uuid": ""
        },
        "/dev/vdc1": {
            "fstype": "ext4",
            "label": "yumvarlib",
            "name": "/dev/vdc1",
            "size": "2G",
            "type": "partition",
            "uuid": "9bd037ed-b1bb-4b9c-a008-9f07d34ab071"
        },
        "/dev/vdd": {
            "fstype": "",
            "label": "",
            "name": "/dev/vdd",
            "size": "10G",
            "type": "disk",
            "uuid": ""
        },
        "/dev/vde": {
            "fstype": "",
            "label": "",
            "name": "/dev/vde",
            "size": "10G",
            "type": "disk",
            "uuid": ""
        },
        "/dev/vdf": {
            "fstype": "",
            "label": "",
            "name": "/dev/vdf",
            "size": "10G",
            "type": "disk",
            "uuid": ""
        }
    }
}

TASK [Read the /etc/fstab file for volume existence] ***************************
task path: /tmp/tmpfc26zqih/tests/storage/verify-role-results.yml:19
Thursday 21 July 2022  10:28:55 +0000 (0:00:00.546)       0:00:17.583 ********* 
ok: [/cache/rhel-9.qcow2.snap] => {
    "changed": false,
    "cmd": [
        "cat",
        "/etc/fstab"
    ],
    "delta": "0:00:00.002302",
    "end": "2022-07-21 06:28:55.898657",
    "rc": 0,
    "start": "2022-07-21 06:28:55.896355"
}

STDOUT:

UUID=6e74e171-0370-451f-8340-f16ad2839183	/boot	xfs	defaults	0	0
UUID=1bb53e4d-984c-4316-908a-59b5a62fa30e	/	xfs	defaults	0	0
UUID=7B77-95E7	/boot/efi	vfat	defaults,uid=0,gid=0,umask=077,shortname=winnt	0	2
/dev/vdb1	/var/cache/dnf	auto	defaults,nofail,x-systemd.requires=cloud-init.service,_netdev,comment=cloudconfig	0	2
/dev/vdc1	/var/lib/dnf	auto	defaults,nofail,x-systemd.requires=cloud-init.service,_netdev,comment=cloudconfig	0	2
UUID=dc31169e-edd9-4a64-ab59-3286f5673c34 /opt/test1 xfs defaults 0 0

TASK [Read the /etc/crypttab file] *********************************************
task path: /tmp/tmpfc26zqih/tests/storage/verify-role-results.yml:24
Thursday 21 July 2022  10:28:55 +0000 (0:00:00.487)       0:00:18.070 ********* 
ok: [/cache/rhel-9.qcow2.snap] => {
    "changed": false,
    "cmd": [
        "cat",
        "/etc/crypttab"
    ],
    "delta": "0:00:00.002612",
    "end": "2022-07-21 06:28:56.269242",
    "failed_when_result": false,
    "rc": 1,
    "start": "2022-07-21 06:28:56.266630"
}

STDERR:

cat: /etc/crypttab: No such file or directory


MSG:

non-zero return code
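
The rc=1 from cat together with failed_when_result=false shows the test deliberately tolerates a missing /etc/crypttab. A hedged sketch of such a task (the register name and the exact failed_when expression are assumptions):

    - name: Read the /etc/crypttab file
      command: cat /etc/crypttab
      register: storage_test_crypttab
      failed_when: storage_test_crypttab.rc not in [0, 1]
      changed_when: false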

TASK [Verify the volumes listed in storage_pools were correctly managed] *******
task path: /tmp/tmpfc26zqih/tests/storage/verify-role-results.yml:33
Thursday 21 July 2022  10:28:55 +0000 (0:00:00.370)       0:00:18.441 ********* 

TASK [Verify the volumes with no pool were correctly managed] ******************
task path: /tmp/tmpfc26zqih/tests/storage/verify-role-results.yml:43
Thursday 21 July 2022  10:28:55 +0000 (0:00:00.023)       0:00:18.465 ********* 
included: /tmp/tmpfc26zqih/tests/storage/test-verify-volume.yml for /cache/rhel-9.qcow2.snap => (item={'encryption': False, 'encryption_cipher': None, 'encryption_key': None, 'encryption_key_size': None, 'encryption_luks_version': None, 'encryption_password': None, 'fs_create_options': '', 'fs_label': '', 'fs_type': 'xfs', 'mount_options': 'defaults', 'mount_point': '/opt/test1', 'name': 'test1', 'raid_level': None, 'size': 10737418240, 'state': 'present', 'type': 'disk', 'disks': ['sda'], 'raid_device_count': None, 'raid_spare_count': None, 'raid_metadata_version': None, 'fs_overwrite_existing': True, 'mount_check': 0, 'mount_passno': 0, 'mount_device_identifier': 'uuid', 'raid_chunk_size': None, 'compression': None, 'deduplication': None, 'vdo_pool_size': None, 'thin': None, 'thin_pool_name': None, 'thin_pool_size': None, 'cached': False, 'cache_size': 0, 'cache_mode': None, 'cache_devices': [], '_device': '/dev/sda', '_raw_device': '/dev/sda', '_mount_id': 'UUID=dc31169e-edd9-4a64-ab59-3286f5673c34', '_kernel_device': '/dev/sda', '_raw_kernel_device': '/dev/sda'})

TASK [set_fact] ****************************************************************
task path: /tmp/tmpfc26zqih/tests/storage/test-verify-volume.yml:2
Thursday 21 July 2022  10:28:55 +0000 (0:00:00.059)       0:00:18.524 ********* 
ok: [/cache/rhel-9.qcow2.snap] => {
    "ansible_facts": {
        "_storage_test_volume_present": true,
        "_storage_volume_tests": [
            "mount",
            "fstab",
            "fs",
            "device",
            "encryption",
            "md",
            "size",
            "cache"
        ]
    },
    "changed": false
}

TASK [include_tasks] ***********************************************************
task path: /tmp/tmpfc26zqih/tests/storage/test-verify-volume.yml:10
Thursday 21 July 2022  10:28:56 +0000 (0:00:00.079)       0:00:18.603 ********* 
included: /tmp/tmpfc26zqih/tests/storage/test-verify-volume-mount.yml for /cache/rhel-9.qcow2.snap => (item=mount)
included: /tmp/tmpfc26zqih/tests/storage/test-verify-volume-fstab.yml for /cache/rhel-9.qcow2.snap => (item=fstab)
included: /tmp/tmpfc26zqih/tests/storage/test-verify-volume-fs.yml for /cache/rhel-9.qcow2.snap => (item=fs)
included: /tmp/tmpfc26zqih/tests/storage/test-verify-volume-device.yml for /cache/rhel-9.qcow2.snap => (item=device)
included: /tmp/tmpfc26zqih/tests/storage/test-verify-volume-encryption.yml for /cache/rhel-9.qcow2.snap => (item=encryption)
included: /tmp/tmpfc26zqih/tests/storage/test-verify-volume-md.yml for /cache/rhel-9.qcow2.snap => (item=md)
included: /tmp/tmpfc26zqih/tests/storage/test-verify-volume-size.yml for /cache/rhel-9.qcow2.snap => (item=size)
included: /tmp/tmpfc26zqih/tests/storage/test-verify-volume-cache.yml for /cache/rhel-9.qcow2.snap => (item=cache)
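
Each entry of _storage_volume_tests maps to a test-verify-volume-<subset>.yml include, as listed above. A sketch of the loop that would produce these includes (the loop variable name is an assumption):

    - name: Run the per-volume verification subsets
      include_tasks: "test-verify-volume-{{ storage_test_volume_subset }}.yml"
      loop: "{{ _storage_volume_tests }}"
      loop_control:
        loop_var: storage_test_volume_subset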

TASK [Get expected mount device based on device type] **************************
task path: /tmp/tmpfc26zqih/tests/storage/test-verify-volume-mount.yml:6
Thursday 21 July 2022  10:28:56 +0000 (0:00:00.113)       0:00:18.716 ********* 
ok: [/cache/rhel-9.qcow2.snap] => {
    "ansible_facts": {
        "storage_test_device_path": "/dev/sda"
    },
    "changed": false
}

TASK [Set some facts] **********************************************************
task path: /tmp/tmpfc26zqih/tests/storage/test-verify-volume-mount.yml:10
Thursday 21 July 2022  10:28:56 +0000 (0:00:00.041)       0:00:18.758 ********* 
ok: [/cache/rhel-9.qcow2.snap] => {
    "ansible_facts": {
        "storage_test_mount_device_matches": [
            {
                "block_available": 2592358,
                "block_size": 4096,
                "block_total": 2618880,
                "block_used": 26522,
                "device": "/dev/sda",
                "fstype": "xfs",
                "inode_available": 5242877,
                "inode_total": 5242880,
                "inode_used": 3,
                "mount": "/opt/test1",
                "options": "rw,seclabel,relatime,attr2,inode64,logbufs=8,logbsize=32k,noquota",
                "size_available": 10618298368,
                "size_total": 10726932480,
                "uuid": "dc31169e-edd9-4a64-ab59-3286f5673c34"
            }
        ],
        "storage_test_mount_expected_match_count": "1",
        "storage_test_mount_point_matches": [
            {
                "block_available": 2592358,
                "block_size": 4096,
                "block_total": 2618880,
                "block_used": 26522,
                "device": "/dev/sda",
                "fstype": "xfs",
                "inode_available": 5242877,
                "inode_total": 5242880,
                "inode_used": 3,
                "mount": "/opt/test1",
                "options": "rw,seclabel,relatime,attr2,inode64,logbufs=8,logbsize=32k,noquota",
                "size_available": 10618298368,
                "size_total": 10726932480,
                "uuid": "dc31169e-edd9-4a64-ab59-3286f5673c34"
            }
        ],
        "storage_test_swap_expected_matches": "0"
    },
    "changed": false
}
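
The match lists above come from the gathered mount facts, filtered once by device and once by mount point. A minimal sketch of how those selections can be expressed (the literal values stand in for the test's own variables):

    - name: Select mount facts for the device and for the mount point
      set_fact:
        storage_test_mount_device_matches: "{{ ansible_facts.mounts | selectattr('device', 'equalto', '/dev/sda') | list }}"
        storage_test_mount_point_matches: "{{ ansible_facts.mounts | selectattr('mount', 'equalto', '/opt/test1') | list }}"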

TASK [Verify the current mount state by device] ********************************
task path: /tmp/tmpfc26zqih/tests/storage/test-verify-volume-mount.yml:20
Thursday 21 July 2022  10:28:56 +0000 (0:00:00.057)       0:00:18.815 ********* 
ok: [/cache/rhel-9.qcow2.snap] => {
    "changed": false
}

MSG:

All assertions passed

TASK [Verify the current mount state by mount point] ***************************
task path: /tmp/tmpfc26zqih/tests/storage/test-verify-volume-mount.yml:29
Thursday 21 July 2022  10:28:56 +0000 (0:00:00.050)       0:00:18.866 ********* 
ok: [/cache/rhel-9.qcow2.snap] => {
    "changed": false
}

MSG:

All assertions passed

TASK [Verify the mount fs type] ************************************************
task path: /tmp/tmpfc26zqih/tests/storage/test-verify-volume-mount.yml:37
Thursday 21 July 2022  10:28:56 +0000 (0:00:00.049)       0:00:18.916 ********* 
ok: [/cache/rhel-9.qcow2.snap] => {
    "changed": false
}

MSG:

All assertions passed

TASK [command] *****************************************************************
task path: /tmp/tmpfc26zqih/tests/storage/test-verify-volume-mount.yml:46
Thursday 21 July 2022  10:28:56 +0000 (0:00:00.050)       0:00:18.966 ********* 
skipping: [/cache/rhel-9.qcow2.snap] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [Gather swap info] ********************************************************
task path: /tmp/tmpfc26zqih/tests/storage/test-verify-volume-mount.yml:50
Thursday 21 July 2022  10:28:56 +0000 (0:00:00.022)       0:00:18.989 ********* 
skipping: [/cache/rhel-9.qcow2.snap] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [Verify swap status] ******************************************************
task path: /tmp/tmpfc26zqih/tests/storage/test-verify-volume-mount.yml:55
Thursday 21 July 2022  10:28:56 +0000 (0:00:00.023)       0:00:19.012 ********* 
skipping: [/cache/rhel-9.qcow2.snap] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [Unset facts] *************************************************************
task path: /tmp/tmpfc26zqih/tests/storage/test-verify-volume-mount.yml:65
Thursday 21 July 2022  10:28:56 +0000 (0:00:00.022)       0:00:19.035 ********* 
ok: [/cache/rhel-9.qcow2.snap] => {
    "ansible_facts": {
        "storage_test_mount_device_matches": null,
        "storage_test_mount_expected_match_count": null,
        "storage_test_mount_point_matches": null,
        "storage_test_swap_expected_matches": null,
        "storage_test_swaps": null,
        "storage_test_sys_node": null
    },
    "changed": false
}

TASK [Set some variables for fstab checking] ***********************************
task path: /tmp/tmpfc26zqih/tests/storage/test-verify-volume-fstab.yml:2
Thursday 21 July 2022  10:28:56 +0000 (0:00:00.035)       0:00:19.070 ********* 
ok: [/cache/rhel-9.qcow2.snap] => {
    "ansible_facts": {
        "storage_test_fstab_expected_id_matches": "1",
        "storage_test_fstab_expected_mount_options_matches": "1",
        "storage_test_fstab_expected_mount_point_matches": "1",
        "storage_test_fstab_id_matches": [
            "UUID=dc31169e-edd9-4a64-ab59-3286f5673c34 "
        ],
        "storage_test_fstab_mount_options_matches": [
            " /opt/test1 xfs defaults "
        ],
        "storage_test_fstab_mount_point_matches": [
            " /opt/test1 "
        ]
    },
    "changed": false
}
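
These facts count how often the volume's mount id, mount point, and mount options appear in the /etc/fstab contents read earlier; the single match on the volume's UUID confirms exactly one fstab entry for it. One plausible way to build such a match list (variable names and the pattern are illustrative, not the test's exact code):

    - name: Collect fstab text that references the volume's mount id
      set_fact:
        storage_test_fstab_id_matches: "{{ storage_test_fstab.stdout | regex_findall(storage_test_volume._mount_id ~ ' ') }}"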

TASK [Verify that the device identifier appears in /etc/fstab] *****************
task path: /tmp/tmpfc26zqih/tests/storage/test-verify-volume-fstab.yml:12
Thursday 21 July 2022  10:28:56 +0000 (0:00:00.057)       0:00:19.128 ********* 
ok: [/cache/rhel-9.qcow2.snap] => {
    "changed": false
}

MSG:

All assertions passed

TASK [Verify the fstab mount point] ********************************************
task path: /tmp/tmpfc26zqih/tests/storage/test-verify-volume-fstab.yml:19
Thursday 21 July 2022  10:28:56 +0000 (0:00:00.051)       0:00:19.179 ********* 
ok: [/cache/rhel-9.qcow2.snap] => {
    "changed": false
}

MSG:

All assertions passed

TASK [Verify mount_options] ****************************************************
task path: /tmp/tmpfc26zqih/tests/storage/test-verify-volume-fstab.yml:25
Thursday 21 July 2022  10:28:56 +0000 (0:00:00.050)       0:00:19.229 ********* 
skipping: [/cache/rhel-9.qcow2.snap] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [Clean up variables] ******************************************************
task path: /tmp/tmpfc26zqih/tests/storage/test-verify-volume-fstab.yml:34
Thursday 21 July 2022  10:28:56 +0000 (0:00:00.039)       0:00:19.268 ********* 
ok: [/cache/rhel-9.qcow2.snap] => {
    "ansible_facts": {
        "storage_test_fstab_expected_id_matches": null,
        "storage_test_fstab_expected_mount_options_matches": null,
        "storage_test_fstab_expected_mount_point_matches": null,
        "storage_test_fstab_id_matches": null,
        "storage_test_fstab_mount_options_matches": null,
        "storage_test_fstab_mount_point_matches": null
    },
    "changed": false
}

TASK [Verify fs type] **********************************************************
task path: /tmp/tmpfc26zqih/tests/storage/test-verify-volume-fs.yml:4
Thursday 21 July 2022  10:28:56 +0000 (0:00:00.035)       0:00:19.304 ********* 
ok: [/cache/rhel-9.qcow2.snap] => {
    "changed": false
}

MSG:

All assertions passed

TASK [Verify fs label] *********************************************************
task path: /tmp/tmpfc26zqih/tests/storage/test-verify-volume-fs.yml:10
Thursday 21 July 2022  10:28:56 +0000 (0:00:00.038)       0:00:19.342 ********* 
ok: [/cache/rhel-9.qcow2.snap] => {
    "changed": false
}

MSG:

All assertions passed

TASK [See whether the device node is present] **********************************
task path: /tmp/tmpfc26zqih/tests/storage/test-verify-volume-device.yml:4
Thursday 21 July 2022  10:28:56 +0000 (0:00:00.039)       0:00:19.382 ********* 
ok: [/cache/rhel-9.qcow2.snap] => {
    "changed": false,
    "stat": {
        "atime": 1658399330.874754,
        "attr_flags": "",
        "attributes": [],
        "block_size": 4096,
        "blocks": 0,
        "charset": "binary",
        "ctime": 1658399330.874754,
        "dev": 5,
        "device_type": 2048,
        "executable": false,
        "exists": true,
        "gid": 6,
        "gr_name": "disk",
        "inode": 356,
        "isblk": true,
        "ischr": false,
        "isdir": false,
        "isfifo": false,
        "isgid": false,
        "islnk": false,
        "isreg": false,
        "issock": false,
        "isuid": false,
        "mimetype": "inode/blockdevice",
        "mode": "0660",
        "mtime": 1658399330.874754,
        "nlink": 1,
        "path": "/dev/sda",
        "pw_name": "root",
        "readable": true,
        "rgrp": true,
        "roth": false,
        "rusr": true,
        "size": 0,
        "uid": 0,
        "version": null,
        "wgrp": true,
        "woth": false,
        "writeable": true,
        "wusr": true,
        "xgrp": false,
        "xoth": false,
        "xusr": false
    }
}
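
The stat result above confirms /dev/sda exists and is a block device (isblk: true, mimetype inode/blockdevice). A sketch of the underlying task (the register name is an assumption, and the path would normally come from the volume's _device field):

    - name: See whether the device node is present
      stat:
        path: /dev/sda
      register: storage_test_dev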

TASK [Verify the presence/absence of the device node] **************************
task path: /tmp/tmpfc26zqih/tests/storage/test-verify-volume-device.yml:10
Thursday 21 July 2022  10:28:57 +0000 (0:00:00.352)       0:00:19.734 ********* 
ok: [/cache/rhel-9.qcow2.snap] => {
    "changed": false
}

MSG:

All assertions passed

TASK [Make sure we got info about this volume] *********************************
task path: /tmp/tmpfc26zqih/tests/storage/test-verify-volume-device.yml:15
Thursday 21 July 2022  10:28:57 +0000 (0:00:00.038)       0:00:19.773 ********* 
ok: [/cache/rhel-9.qcow2.snap] => {
    "changed": false
}

MSG:

All assertions passed

TASK [(1/2) Process volume type (set initial value)] ***************************
task path: /tmp/tmpfc26zqih/tests/storage/test-verify-volume-device.yml:21
Thursday 21 July 2022  10:28:57 +0000 (0:00:00.039)       0:00:19.812 ********* 
ok: [/cache/rhel-9.qcow2.snap] => {
    "ansible_facts": {
        "st_volume_type": "disk"
    },
    "changed": false
}

TASK [(2/2) Process volume type (get RAID value)] ******************************
task path: /tmp/tmpfc26zqih/tests/storage/test-verify-volume-device.yml:25
Thursday 21 July 2022  10:28:57 +0000 (0:00:00.035)       0:00:19.848 ********* 
skipping: [/cache/rhel-9.qcow2.snap] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [Verify the volume's device type] *****************************************
task path: /tmp/tmpfc26zqih/tests/storage/test-verify-volume-device.yml:30
Thursday 21 July 2022  10:28:57 +0000 (0:00:00.023)       0:00:19.871 ********* 
ok: [/cache/rhel-9.qcow2.snap] => {
    "changed": false
}

MSG:

All assertions passed

TASK [Stat the LUKS device, if encrypted] **************************************
task path: /tmp/tmpfc26zqih/tests/storage/test-verify-volume-encryption.yml:3
Thursday 21 July 2022  10:28:57 +0000 (0:00:00.039)       0:00:19.910 ********* 
skipping: [/cache/rhel-9.qcow2.snap] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [Ensure cryptsetup is present] ********************************************
task path: /tmp/tmpfc26zqih/tests/storage/test-verify-volume-encryption.yml:10
Thursday 21 July 2022  10:28:57 +0000 (0:00:00.024)       0:00:19.935 ********* 
ok: [/cache/rhel-9.qcow2.snap] => {
    "changed": false,
    "rc": 0,
    "results": []
}

MSG:

Nothing to do

TASK [Collect LUKS info for this volume] ***************************************
task path: /tmp/tmpfc26zqih/tests/storage/test-verify-volume-encryption.yml:15
Thursday 21 July 2022  10:28:58 +0000 (0:00:00.870)       0:00:20.806 ********* 
skipping: [/cache/rhel-9.qcow2.snap] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [Verify the presence/absence of the LUKS device node] *********************
task path: /tmp/tmpfc26zqih/tests/storage/test-verify-volume-encryption.yml:21
Thursday 21 July 2022  10:28:58 +0000 (0:00:00.058)       0:00:20.864 ********* 
skipping: [/cache/rhel-9.qcow2.snap] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [Verify that the raw device is the same as the device if not encrypted] ***
task path: /tmp/tmpfc26zqih/tests/storage/test-verify-volume-encryption.yml:27
Thursday 21 July 2022  10:28:58 +0000 (0:00:00.023)       0:00:20.888 ********* 
ok: [/cache/rhel-9.qcow2.snap] => {
    "changed": false
}

MSG:

All assertions passed

TASK [Make sure we got info about the LUKS volume if encrypted] ****************
task path: /tmp/tmpfc26zqih/tests/storage/test-verify-volume-encryption.yml:33
Thursday 21 July 2022  10:28:58 +0000 (0:00:00.050)       0:00:20.939 ********* 
skipping: [/cache/rhel-9.qcow2.snap] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [Verify the LUKS volume's device type if encrypted] ***********************
task path: /tmp/tmpfc26zqih/tests/storage/test-verify-volume-encryption.yml:39
Thursday 21 July 2022  10:28:58 +0000 (0:00:00.023)       0:00:20.962 ********* 
skipping: [/cache/rhel-9.qcow2.snap] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [Check LUKS version] ******************************************************
task path: /tmp/tmpfc26zqih/tests/storage/test-verify-volume-encryption.yml:44
Thursday 21 July 2022  10:28:58 +0000 (0:00:00.024)       0:00:20.986 ********* 
skipping: [/cache/rhel-9.qcow2.snap] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [Check LUKS key size] *****************************************************
task path: /tmp/tmpfc26zqih/tests/storage/test-verify-volume-encryption.yml:50
Thursday 21 July 2022  10:28:58 +0000 (0:00:00.023)       0:00:21.010 ********* 
skipping: [/cache/rhel-9.qcow2.snap] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [Check LUKS cipher] *******************************************************
task path: /tmp/tmpfc26zqih/tests/storage/test-verify-volume-encryption.yml:56
Thursday 21 July 2022  10:28:58 +0000 (0:00:00.022)       0:00:21.033 ********* 
skipping: [/cache/rhel-9.qcow2.snap] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [set_fact] ****************************************************************
task path: /tmp/tmpfc26zqih/tests/storage/test-verify-volume-encryption.yml:62
Thursday 21 July 2022  10:28:58 +0000 (0:00:00.022)       0:00:21.055 ********* 
ok: [/cache/rhel-9.qcow2.snap] => {
    "ansible_facts": {
        "_storage_test_crypttab_entries": [],
        "_storage_test_expected_crypttab_entries": "0",
        "_storage_test_expected_crypttab_key_file": "-"
    },
    "changed": false
}
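
Note: these facts drive the crypttab checks that follow; with encryption disabled, the expected entry count is "0" and the expected key file is "-". A sketch of the kind of assertion this sets up, assuming a simple length comparison (the exact expression in test-verify-volume-encryption.yml may differ):

- name: Check for /etc/crypttab entry
  ansible.builtin.assert:
    that:
      - _storage_test_crypttab_entries | length == _storage_test_expected_crypttab_entries | int
    msg: Expected no /etc/crypttab entry for the unencrypted test1 volume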

TASK [Check for /etc/crypttab entry] *******************************************
task path: /tmp/tmpfc26zqih/tests/storage/test-verify-volume-encryption.yml:67
Thursday 21 July 2022  10:28:58 +0000 (0:00:00.048)       0:00:21.104 ********* 
ok: [/cache/rhel-9.qcow2.snap] => {
    "changed": false
}

MSG:

All assertions passed

TASK [Validate the format of the crypttab entry] *******************************
task path: /tmp/tmpfc26zqih/tests/storage/test-verify-volume-encryption.yml:72
Thursday 21 July 2022  10:28:58 +0000 (0:00:00.046)       0:00:21.150 ********* 
skipping: [/cache/rhel-9.qcow2.snap] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [Check backing device of crypttab entry] **********************************
task path: /tmp/tmpfc26zqih/tests/storage/test-verify-volume-encryption.yml:78
Thursday 21 July 2022  10:28:58 +0000 (0:00:00.034)       0:00:21.185 ********* 
skipping: [/cache/rhel-9.qcow2.snap] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [Check key file of crypttab entry] ****************************************
task path: /tmp/tmpfc26zqih/tests/storage/test-verify-volume-encryption.yml:84
Thursday 21 July 2022  10:28:58 +0000 (0:00:00.037)       0:00:21.222 ********* 
skipping: [/cache/rhel-9.qcow2.snap] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [set_fact] ****************************************************************
task path: /tmp/tmpfc26zqih/tests/storage/test-verify-volume-encryption.yml:90
Thursday 21 July 2022  10:28:58 +0000 (0:00:00.034)       0:00:21.256 ********* 
ok: [/cache/rhel-9.qcow2.snap] => {
    "ansible_facts": {
        "_storage_test_crypttab_entries": null,
        "_storage_test_expected_crypttab_entries": null,
        "_storage_test_expected_crypttab_key_file": null
    },
    "changed": false
}

TASK [get information about RAID] **********************************************
task path: /tmp/tmpfc26zqih/tests/storage/test-verify-volume-md.yml:7
Thursday 21 July 2022  10:28:58 +0000 (0:00:00.033)       0:00:21.290 ********* 
skipping: [/cache/rhel-9.qcow2.snap] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [set_fact] ****************************************************************
task path: /tmp/tmpfc26zqih/tests/storage/test-verify-volume-md.yml:13
Thursday 21 July 2022  10:28:58 +0000 (0:00:00.035)       0:00:21.325 ********* 
skipping: [/cache/rhel-9.qcow2.snap] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [set_fact] ****************************************************************
task path: /tmp/tmpfc26zqih/tests/storage/test-verify-volume-md.yml:17
Thursday 21 July 2022  10:28:58 +0000 (0:00:00.038)       0:00:21.363 ********* 
skipping: [/cache/rhel-9.qcow2.snap] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [set_fact] ****************************************************************
task path: /tmp/tmpfc26zqih/tests/storage/test-verify-volume-md.yml:21
Thursday 21 July 2022  10:28:58 +0000 (0:00:00.034)       0:00:21.398 ********* 
skipping: [/cache/rhel-9.qcow2.snap] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [check RAID active devices count] *****************************************
task path: /tmp/tmpfc26zqih/tests/storage/test-verify-volume-md.yml:25
Thursday 21 July 2022  10:28:58 +0000 (0:00:00.034)       0:00:21.432 ********* 
skipping: [/cache/rhel-9.qcow2.snap] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [check RAID spare devices count] ******************************************
task path: /tmp/tmpfc26zqih/tests/storage/test-verify-volume-md.yml:31
Thursday 21 July 2022  10:28:58 +0000 (0:00:00.034)       0:00:21.466 ********* 
skipping: [/cache/rhel-9.qcow2.snap] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [check RAID metadata version] *********************************************
task path: /tmp/tmpfc26zqih/tests/storage/test-verify-volume-md.yml:37
Thursday 21 July 2022  10:28:58 +0000 (0:00:00.038)       0:00:21.504 ********* 
skipping: [/cache/rhel-9.qcow2.snap] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [parse the actual size of the volume] *************************************
task path: /tmp/tmpfc26zqih/tests/storage/test-verify-volume-size.yml:3
Thursday 21 July 2022  10:28:58 +0000 (0:00:00.034)       0:00:21.539 ********* 
skipping: [/cache/rhel-9.qcow2.snap] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [parse the requested size of the volume] **********************************
task path: /tmp/tmpfc26zqih/tests/storage/test-verify-volume-size.yml:9
Thursday 21 July 2022  10:28:58 +0000 (0:00:00.023)       0:00:21.562 ********* 
skipping: [/cache/rhel-9.qcow2.snap] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [Establish base value for expected size] **********************************
task path: /tmp/tmpfc26zqih/tests/storage/test-verify-volume-size.yml:15
Thursday 21 July 2022  10:28:59 +0000 (0:00:00.035)       0:00:21.597 ********* 
skipping: [/cache/rhel-9.qcow2.snap] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [debug] *******************************************************************
task path: /tmp/tmpfc26zqih/tests/storage/test-verify-volume-size.yml:20
Thursday 21 July 2022  10:28:59 +0000 (0:00:00.038)       0:00:21.635 ********* 
ok: [/cache/rhel-9.qcow2.snap] => {
    "storage_test_expected_size": "VARIABLE IS NOT DEFINED!: 'storage_test_expected_size' is undefined"
}

TASK [debug] *******************************************************************
task path: /tmp/tmpfc26zqih/tests/storage/test-verify-volume-size.yml:25
Thursday 21 July 2022  10:28:59 +0000 (0:00:00.035)       0:00:21.671 ********* 
skipping: [/cache/rhel-9.qcow2.snap] => {}

TASK [debug] *******************************************************************
task path: /tmp/tmpfc26zqih/tests/storage/test-verify-volume-size.yml:28
Thursday 21 July 2022  10:28:59 +0000 (0:00:00.034)       0:00:21.705 ********* 
skipping: [/cache/rhel-9.qcow2.snap] => {}

TASK [Get the size of parent/pool device] **************************************
task path: /tmp/tmpfc26zqih/tests/storage/test-verify-volume-size.yml:31
Thursday 21 July 2022  10:28:59 +0000 (0:00:00.034)       0:00:21.740 ********* 
skipping: [/cache/rhel-9.qcow2.snap] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [debug] *******************************************************************
task path: /tmp/tmpfc26zqih/tests/storage/test-verify-volume-size.yml:36
Thursday 21 July 2022  10:28:59 +0000 (0:00:00.038)       0:00:21.778 ********* 
skipping: [/cache/rhel-9.qcow2.snap] => {}

TASK [Calculate the expected size based on pool size and percentage value] *****
task path: /tmp/tmpfc26zqih/tests/storage/test-verify-volume-size.yml:39
Thursday 21 July 2022  10:28:59 +0000 (0:00:00.035)       0:00:21.813 ********* 
skipping: [/cache/rhel-9.qcow2.snap] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [debug] *******************************************************************
task path: /tmp/tmpfc26zqih/tests/storage/test-verify-volume-size.yml:44
Thursday 21 July 2022  10:28:59 +0000 (0:00:00.035)       0:00:21.849 ********* 
ok: [/cache/rhel-9.qcow2.snap] => {
    "storage_test_actual_size": {
        "changed": false,
        "skip_reason": "Conditional result was False",
        "skipped": true
    }
}

TASK [debug] *******************************************************************
task path: /tmp/tmpfc26zqih/tests/storage/test-verify-volume-size.yml:47
Thursday 21 July 2022  10:28:59 +0000 (0:00:00.035)       0:00:21.884 ********* 
ok: [/cache/rhel-9.qcow2.snap] => {
    "storage_test_expected_size": "VARIABLE IS NOT DEFINED!: 'storage_test_expected_size' is undefined"
}

TASK [assert] ******************************************************************
task path: /tmp/tmpfc26zqih/tests/storage/test-verify-volume-size.yml:50
Thursday 21 July 2022  10:28:59 +0000 (0:00:00.038)       0:00:21.923 ********* 
skipping: [/cache/rhel-9.qcow2.snap] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [Get information about the LV] ********************************************
task path: /tmp/tmpfc26zqih/tests/storage/test-verify-volume-cache.yml:6
Thursday 21 July 2022  10:28:59 +0000 (0:00:00.023)       0:00:21.946 ********* 
skipping: [/cache/rhel-9.qcow2.snap] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [set_fact] ****************************************************************
task path: /tmp/tmpfc26zqih/tests/storage/test-verify-volume-cache.yml:14
Thursday 21 July 2022  10:28:59 +0000 (0:00:00.023)       0:00:21.969 ********* 
skipping: [/cache/rhel-9.qcow2.snap] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [check segment type] ******************************************************
task path: /tmp/tmpfc26zqih/tests/storage/test-verify-volume-cache.yml:17
Thursday 21 July 2022  10:28:59 +0000 (0:00:00.023)       0:00:21.993 ********* 
skipping: [/cache/rhel-9.qcow2.snap] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [set_fact] ****************************************************************
task path: /tmp/tmpfc26zqih/tests/storage/test-verify-volume-cache.yml:22
Thursday 21 July 2022  10:28:59 +0000 (0:00:00.024)       0:00:22.017 ********* 
skipping: [/cache/rhel-9.qcow2.snap] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [parse the requested cache size] ******************************************
task path: /tmp/tmpfc26zqih/tests/storage/test-verify-volume-cache.yml:26
Thursday 21 July 2022  10:28:59 +0000 (0:00:00.023)       0:00:22.041 ********* 
skipping: [/cache/rhel-9.qcow2.snap] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [set_fact] ****************************************************************
task path: /tmp/tmpfc26zqih/tests/storage/test-verify-volume-cache.yml:32
Thursday 21 July 2022  10:28:59 +0000 (0:00:00.022)       0:00:22.063 ********* 
skipping: [/cache/rhel-9.qcow2.snap] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [Check cache size] ********************************************************
task path: /tmp/tmpfc26zqih/tests/storage/test-verify-volume-cache.yml:36
Thursday 21 July 2022  10:28:59 +0000 (0:00:00.022)       0:00:22.086 ********* 
skipping: [/cache/rhel-9.qcow2.snap] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [Clean up facts] **********************************************************
task path: /tmp/tmpfc26zqih/tests/storage/test-verify-volume.yml:16
Thursday 21 July 2022  10:28:59 +0000 (0:00:00.023)       0:00:22.110 ********* 
ok: [/cache/rhel-9.qcow2.snap] => {
    "ansible_facts": {
        "_storage_test_volume_present": null
    },
    "changed": false
}

TASK [Clean up variable namespace] *********************************************
task path: /tmp/tmpfc26zqih/tests/storage/verify-role-results.yml:53
Thursday 21 July 2022  10:28:59 +0000 (0:00:00.030)       0:00:22.140 ********* 
ok: [/cache/rhel-9.qcow2.snap] => {
    "ansible_facts": {
        "storage_test_blkinfo": null,
        "storage_test_crypttab": null,
        "storage_test_fstab": null
    },
    "changed": false
}

TASK [Change the disk device mount location to "/opt/test2"] *******************
task path: /tmp/tmpfc26zqih/tests/storage/tests_change_disk_mount.yml:37
Thursday 21 July 2022  10:28:59 +0000 (0:00:00.063)       0:00:22.204 ********* 

TASK [fedora.linux_system_roles.storage : set platform/version specific variables] ***
task path: /tmp/tmpi19f9hzy/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:2
Thursday 21 July 2022  10:28:59 +0000 (0:00:00.074)       0:00:22.278 ********* 
included: /tmp/tmpi19f9hzy/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml for /cache/rhel-9.qcow2.snap

TASK [fedora.linux_system_roles.storage : Ensure ansible_facts used by role] ***
task path: /tmp/tmpi19f9hzy/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:2
Thursday 21 July 2022  10:28:59 +0000 (0:00:00.035)       0:00:22.313 ********* 
ok: [/cache/rhel-9.qcow2.snap]

TASK [fedora.linux_system_roles.storage : Set platform/version specific variables] ***
task path: /tmp/tmpi19f9hzy/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:7
Thursday 21 July 2022  10:29:00 +0000 (0:00:00.513)       0:00:22.826 ********* 
skipping: [/cache/rhel-9.qcow2.snap] => (item=RedHat.yml)  => {
    "ansible_loop_var": "item",
    "changed": false,
    "item": "RedHat.yml",
    "skip_reason": "Conditional result was False"
}
skipping: [/cache/rhel-9.qcow2.snap] => (item=RedHat.yml)  => {
    "ansible_loop_var": "item",
    "changed": false,
    "item": "RedHat.yml",
    "skip_reason": "Conditional result was False"
}
ok: [/cache/rhel-9.qcow2.snap] => (item=RedHat_9.yml) => {
    "ansible_facts": {
        "blivet_package_list": [
            "python3-blivet",
            "libblockdev-crypto",
            "libblockdev-dm",
            "libblockdev-lvm",
            "libblockdev-mdraid",
            "libblockdev-swap",
            "vdo",
            "kmod-kvdo",
            "xfsprogs"
        ]
    },
    "ansible_included_var_files": [
        "/tmp/tmpi19f9hzy/ansible_collections/fedora/linux_system_roles/roles/storage/vars/RedHat_9.yml"
    ],
    "ansible_loop_var": "item",
    "changed": false,
    "item": "RedHat_9.yml"
}
skipping: [/cache/rhel-9.qcow2.snap] => (item=RedHat_9.0.yml)  => {
    "ansible_loop_var": "item",
    "changed": false,
    "item": "RedHat_9.0.yml",
    "skip_reason": "Conditional result was False"
}

TASK [fedora.linux_system_roles.storage : define an empty list of pools to be used in testing] ***
task path: /tmp/tmpi19f9hzy/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:5
Thursday 21 July 2022  10:29:00 +0000 (0:00:00.061)       0:00:22.888 ********* 
ok: [/cache/rhel-9.qcow2.snap] => {
    "ansible_facts": {
        "_storage_pools_list": []
    },
    "changed": false
}

TASK [fedora.linux_system_roles.storage : define an empty list of volumes to be used in testing] ***
task path: /tmp/tmpi19f9hzy/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:9
Thursday 21 July 2022  10:29:00 +0000 (0:00:00.033)       0:00:22.921 ********* 
ok: [/cache/rhel-9.qcow2.snap] => {
    "ansible_facts": {
        "_storage_volumes_list": []
    },
    "changed": false
}

TASK [fedora.linux_system_roles.storage : include the appropriate provider tasks] ***
task path: /tmp/tmpi19f9hzy/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:13
Thursday 21 July 2022  10:29:00 +0000 (0:00:00.034)       0:00:22.956 ********* 
redirecting (type: modules) ansible.builtin.mount to ansible.posix.mount
redirecting (type: modules) ansible.builtin.mount to ansible.posix.mount
included: /tmp/tmpi19f9hzy/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml for /cache/rhel-9.qcow2.snap

TASK [fedora.linux_system_roles.storage : get a list of rpm packages installed on host machine] ***
task path: /tmp/tmpi19f9hzy/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:2
Thursday 21 July 2022  10:29:00 +0000 (0:00:00.054)       0:00:23.011 ********* 
skipping: [/cache/rhel-9.qcow2.snap] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [fedora.linux_system_roles.storage : make sure blivet is available] *******
task path: /tmp/tmpi19f9hzy/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:7
Thursday 21 July 2022  10:29:00 +0000 (0:00:00.020)       0:00:23.031 ********* 
skipping: [/cache/rhel-9.qcow2.snap] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [fedora.linux_system_roles.storage : show storage_pools] ******************
task path: /tmp/tmpi19f9hzy/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:13
Thursday 21 July 2022  10:29:00 +0000 (0:00:00.028)       0:00:23.059 ********* 
ok: [/cache/rhel-9.qcow2.snap] => {
    "storage_pools": "VARIABLE IS NOT DEFINED!: 'storage_pools' is undefined"
}

TASK [fedora.linux_system_roles.storage : show storage_volumes] ****************
task path: /tmp/tmpi19f9hzy/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:18
Thursday 21 July 2022  10:29:00 +0000 (0:00:00.036)       0:00:23.096 ********* 
ok: [/cache/rhel-9.qcow2.snap] => {
    "storage_volumes": [
        {
            "disks": [
                "sda"
            ],
            "mount_point": "/opt/test2",
            "name": "test1",
            "type": "disk"
        }
    ]
}
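
Note: this is the volume specification the test passes for the "Change the disk device mount location to /opt/test2" step; only mount_point changes, while the disk, name and type stay the same. A minimal sketch of the kind of invocation behind it, assuming the test hands storage_volumes to the role as a variable (the exact wrapper in tests_change_disk_mount.yml may differ):

- name: Change the disk device mount location to "/opt/test2"
  ansible.builtin.include_role:
    name: fedora.linux_system_roles.storage
  vars:
    storage_volumes:
      - name: test1
        type: disk
        disks:
          - sda
        mount_point: /opt/test2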

TASK [fedora.linux_system_roles.storage : get required packages] ***************
task path: /tmp/tmpi19f9hzy/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:23
Thursday 21 July 2022  10:29:00 +0000 (0:00:00.039)       0:00:23.136 ********* 
skipping: [/cache/rhel-9.qcow2.snap] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [fedora.linux_system_roles.storage : enable copr repositories if needed] ***
task path: /tmp/tmpi19f9hzy/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:35
Thursday 21 July 2022  10:29:00 +0000 (0:00:00.027)       0:00:23.163 ********* 
skipping: [/cache/rhel-9.qcow2.snap] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [fedora.linux_system_roles.storage : make sure required packages are installed] ***
task path: /tmp/tmpi19f9hzy/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:41
Thursday 21 July 2022  10:29:00 +0000 (0:00:00.026)       0:00:23.190 ********* 
skipping: [/cache/rhel-9.qcow2.snap] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [fedora.linux_system_roles.storage : get service facts] *******************
task path: /tmp/tmpi19f9hzy/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:47
Thursday 21 July 2022  10:29:00 +0000 (0:00:00.026)       0:00:23.217 ********* 
skipping: [/cache/rhel-9.qcow2.snap] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [fedora.linux_system_roles.storage : Set storage_cryptsetup_services] *****
task path: /tmp/tmpi19f9hzy/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:53
Thursday 21 July 2022  10:29:00 +0000 (0:00:00.028)       0:00:23.245 ********* 
ok: [/cache/rhel-9.qcow2.snap] => {
    "ansible_facts": {
        "storage_cryptsetup_services": []
    },
    "changed": false
}

TASK [fedora.linux_system_roles.storage : Mask the systemd cryptsetup services] ***
task path: /tmp/tmpi19f9hzy/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:58
Thursday 21 July 2022  10:29:00 +0000 (0:00:00.058)       0:00:23.303 ********* 

TASK [fedora.linux_system_roles.storage : manage the pools and volumes to match the specified state] ***
task path: /tmp/tmpi19f9hzy/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:64
Thursday 21 July 2022  10:29:00 +0000 (0:00:00.023)       0:00:23.327 ********* 
ok: [/cache/rhel-9.qcow2.snap] => {
    "actions": [],
    "changed": false,
    "crypts": [],
    "leaves": [
        "/dev/sr0",
        "/dev/vda1",
        "/dev/vda2",
        "/dev/vda3",
        "/dev/vda4",
        "/dev/sda",
        "/dev/sdb",
        "/dev/sdc",
        "/dev/nvme0n1",
        "/dev/nvme1n1",
        "/dev/nvme2n1",
        "/dev/vdb1",
        "/dev/vdc1",
        "/dev/vdd",
        "/dev/vde",
        "/dev/vdf"
    ],
    "mounts": [
        {
            "path": "/opt/test1",
            "state": "absent"
        },
        {
            "dump": 0,
            "fstype": "xfs",
            "opts": "defaults",
            "passno": 0,
            "path": "/opt/test2",
            "src": "UUID=dc31169e-edd9-4a64-ab59-3286f5673c34",
            "state": "mounted"
        }
    ],
    "packages": [
        "dosfstools",
        "e2fsprogs",
        "xfsprogs"
    ],
    "pools": [],
    "volumes": [
        {
            "_device": "/dev/sda",
            "_kernel_device": "/dev/sda",
            "_mount_id": "UUID=dc31169e-edd9-4a64-ab59-3286f5673c34",
            "_raw_device": "/dev/sda",
            "_raw_kernel_device": "/dev/sda",
            "cache_devices": [],
            "cache_mode": null,
            "cache_size": 0,
            "cached": false,
            "compression": null,
            "deduplication": null,
            "disks": [
                "sda"
            ],
            "encryption": false,
            "encryption_cipher": null,
            "encryption_key": null,
            "encryption_key_size": null,
            "encryption_luks_version": null,
            "encryption_password": null,
            "fs_create_options": "",
            "fs_label": "",
            "fs_overwrite_existing": true,
            "fs_type": "xfs",
            "mount_check": 0,
            "mount_device_identifier": "uuid",
            "mount_options": "defaults",
            "mount_passno": 0,
            "mount_point": "/opt/test2",
            "name": "test1",
            "raid_chunk_size": null,
            "raid_device_count": null,
            "raid_level": null,
            "raid_metadata_version": null,
            "raid_spare_count": null,
            "size": 10737418240,
            "state": "present",
            "thin": null,
            "thin_pool_name": null,
            "thin_pool_size": null,
            "type": "disk",
            "vdo_pool_size": null
        }
    ]
}
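
Note: with the volume already formatted, blivet reports no actions, only a mounts list: remove the old /opt/test1 entry and mount /opt/test2 by UUID. The "remove obsolete mounts" and "set up new/current mounts" tasks further down consume that list via ansible.posix.mount. A rough sketch of such a loop, assuming the mounts are split on their state field (the role's actual tasks in main-blivet.yml may be written differently):

- name: remove obsolete mounts
  ansible.posix.mount:
    path: "{{ mount_info['path'] }}"
    state: absent
  loop: "{{ blivet_output.mounts | selectattr('state', 'equalto', 'absent') | list }}"
  loop_control:
    loop_var: mount_info

- name: set up new/current mounts
  ansible.posix.mount:
    src: "{{ mount_info['src'] }}"
    path: "{{ mount_info['path'] }}"
    fstype: "{{ mount_info['fstype'] }}"
    opts: "{{ mount_info['opts'] }}"
    state: mounted
  loop: "{{ blivet_output.mounts | selectattr('state', 'equalto', 'mounted') | list }}"
  loop_control:
    loop_var: mount_info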

TASK [fedora.linux_system_roles.storage : Workaround for udev issue on some platforms] ***
task path: /tmp/tmpi19f9hzy/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:78
Thursday 21 July 2022  10:29:02 +0000 (0:00:01.313)       0:00:24.640 ********* 
skipping: [/cache/rhel-9.qcow2.snap] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [fedora.linux_system_roles.storage : Unmask the systemd cryptsetup services] ***
task path: /tmp/tmpi19f9hzy/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:90
Thursday 21 July 2022  10:29:02 +0000 (0:00:00.035)       0:00:24.675 ********* 

TASK [fedora.linux_system_roles.storage : show blivet_output] ******************
task path: /tmp/tmpi19f9hzy/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:96
Thursday 21 July 2022  10:29:02 +0000 (0:00:00.021)       0:00:24.697 ********* 
ok: [/cache/rhel-9.qcow2.snap] => {
    "blivet_output": {
        "actions": [],
        "changed": false,
        "crypts": [],
        "failed": false,
        "leaves": [
            "/dev/sr0",
            "/dev/vda1",
            "/dev/vda2",
            "/dev/vda3",
            "/dev/vda4",
            "/dev/sda",
            "/dev/sdb",
            "/dev/sdc",
            "/dev/nvme0n1",
            "/dev/nvme1n1",
            "/dev/nvme2n1",
            "/dev/vdb1",
            "/dev/vdc1",
            "/dev/vdd",
            "/dev/vde",
            "/dev/vdf"
        ],
        "mounts": [
            {
                "path": "/opt/test1",
                "state": "absent"
            },
            {
                "dump": 0,
                "fstype": "xfs",
                "opts": "defaults",
                "passno": 0,
                "path": "/opt/test2",
                "src": "UUID=dc31169e-edd9-4a64-ab59-3286f5673c34",
                "state": "mounted"
            }
        ],
        "packages": [
            "dosfstools",
            "e2fsprogs",
            "xfsprogs"
        ],
        "pools": [],
        "volumes": [
            {
                "_device": "/dev/sda",
                "_kernel_device": "/dev/sda",
                "_mount_id": "UUID=dc31169e-edd9-4a64-ab59-3286f5673c34",
                "_raw_device": "/dev/sda",
                "_raw_kernel_device": "/dev/sda",
                "cache_devices": [],
                "cache_mode": null,
                "cache_size": 0,
                "cached": false,
                "compression": null,
                "deduplication": null,
                "disks": [
                    "sda"
                ],
                "encryption": false,
                "encryption_cipher": null,
                "encryption_key": null,
                "encryption_key_size": null,
                "encryption_luks_version": null,
                "encryption_password": null,
                "fs_create_options": "",
                "fs_label": "",
                "fs_overwrite_existing": true,
                "fs_type": "xfs",
                "mount_check": 0,
                "mount_device_identifier": "uuid",
                "mount_options": "defaults",
                "mount_passno": 0,
                "mount_point": "/opt/test2",
                "name": "test1",
                "raid_chunk_size": null,
                "raid_device_count": null,
                "raid_level": null,
                "raid_metadata_version": null,
                "raid_spare_count": null,
                "size": 10737418240,
                "state": "present",
                "thin": null,
                "thin_pool_name": null,
                "thin_pool_size": null,
                "type": "disk",
                "vdo_pool_size": null
            }
        ]
    }
}

TASK [fedora.linux_system_roles.storage : set the list of pools for test verification] ***
task path: /tmp/tmpi19f9hzy/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:101
Thursday 21 July 2022  10:29:02 +0000 (0:00:00.084)       0:00:24.781 ********* 
ok: [/cache/rhel-9.qcow2.snap] => {
    "ansible_facts": {
        "_storage_pools_list": []
    },
    "changed": false
}

TASK [fedora.linux_system_roles.storage : set the list of volumes for test verification] ***
task path: /tmp/tmpi19f9hzy/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:105
Thursday 21 July 2022  10:29:02 +0000 (0:00:00.036)       0:00:24.817 ********* 
ok: [/cache/rhel-9.qcow2.snap] => {
    "ansible_facts": {
        "_storage_volumes_list": [
            {
                "_device": "/dev/sda",
                "_kernel_device": "/dev/sda",
                "_mount_id": "UUID=dc31169e-edd9-4a64-ab59-3286f5673c34",
                "_raw_device": "/dev/sda",
                "_raw_kernel_device": "/dev/sda",
                "cache_devices": [],
                "cache_mode": null,
                "cache_size": 0,
                "cached": false,
                "compression": null,
                "deduplication": null,
                "disks": [
                    "sda"
                ],
                "encryption": false,
                "encryption_cipher": null,
                "encryption_key": null,
                "encryption_key_size": null,
                "encryption_luks_version": null,
                "encryption_password": null,
                "fs_create_options": "",
                "fs_label": "",
                "fs_overwrite_existing": true,
                "fs_type": "xfs",
                "mount_check": 0,
                "mount_device_identifier": "uuid",
                "mount_options": "defaults",
                "mount_passno": 0,
                "mount_point": "/opt/test2",
                "name": "test1",
                "raid_chunk_size": null,
                "raid_device_count": null,
                "raid_level": null,
                "raid_metadata_version": null,
                "raid_spare_count": null,
                "size": 10737418240,
                "state": "present",
                "thin": null,
                "thin_pool_name": null,
                "thin_pool_size": null,
                "type": "disk",
                "vdo_pool_size": null
            }
        ]
    },
    "changed": false
}

TASK [fedora.linux_system_roles.storage : remove obsolete mounts] **************
task path: /tmp/tmpi19f9hzy/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:121
Thursday 21 July 2022  10:29:02 +0000 (0:00:00.066)       0:00:24.884 ********* 
redirecting (type: modules) ansible.builtin.mount to ansible.posix.mount
changed: [/cache/rhel-9.qcow2.snap] => (item={'path': '/opt/test1', 'state': 'absent'}) => {
    "ansible_loop_var": "mount_info",
    "backup_file": "",
    "boot": "yes",
    "changed": true,
    "dump": "0",
    "fstab": "/etc/fstab",
    "mount_info": {
        "path": "/opt/test1",
        "state": "absent"
    },
    "name": "/opt/test1",
    "opts": "defaults",
    "passno": "0"
}

TASK [fedora.linux_system_roles.storage : tell systemd to refresh its view of /etc/fstab] ***
task path: /tmp/tmpi19f9hzy/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:132
Thursday 21 July 2022  10:29:02 +0000 (0:00:00.421)       0:00:25.305 ********* 
ok: [/cache/rhel-9.qcow2.snap] => {
    "changed": false,
    "name": null,
    "status": {}
}

TASK [fedora.linux_system_roles.storage : set up new/current mounts] ***********
task path: /tmp/tmpi19f9hzy/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:137
Thursday 21 July 2022  10:29:03 +0000 (0:00:00.729)       0:00:26.034 ********* 
redirecting (type: modules) ansible.builtin.mount to ansible.posix.mount
changed: [/cache/rhel-9.qcow2.snap] => (item={'src': 'UUID=dc31169e-edd9-4a64-ab59-3286f5673c34', 'path': '/opt/test2', 'fstype': 'xfs', 'opts': 'defaults', 'dump': 0, 'passno': 0, 'state': 'mounted'}) => {
    "ansible_loop_var": "mount_info",
    "backup_file": "",
    "boot": "yes",
    "changed": true,
    "dump": "0",
    "fstab": "/etc/fstab",
    "fstype": "xfs",
    "mount_info": {
        "dump": 0,
        "fstype": "xfs",
        "opts": "defaults",
        "passno": 0,
        "path": "/opt/test2",
        "src": "UUID=dc31169e-edd9-4a64-ab59-3286f5673c34",
        "state": "mounted"
    },
    "name": "/opt/test2",
    "opts": "defaults",
    "passno": "0",
    "src": "UUID=dc31169e-edd9-4a64-ab59-3286f5673c34"
}

TASK [fedora.linux_system_roles.storage : tell systemd to refresh its view of /etc/fstab] ***
task path: /tmp/tmpi19f9hzy/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:148
Thursday 21 July 2022  10:29:03 +0000 (0:00:00.411)       0:00:26.446 ********* 
ok: [/cache/rhel-9.qcow2.snap] => {
    "changed": false,
    "name": null,
    "status": {}
}

TASK [fedora.linux_system_roles.storage : retrieve facts for the /etc/crypttab file] ***
task path: /tmp/tmpi19f9hzy/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:156
Thursday 21 July 2022  10:29:04 +0000 (0:00:00.658)       0:00:27.104 ********* 
ok: [/cache/rhel-9.qcow2.snap] => {
    "changed": false,
    "stat": {
        "exists": false
    }
}

TASK [fedora.linux_system_roles.storage : manage /etc/crypttab to account for changes we just made] ***
task path: /tmp/tmpi19f9hzy/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:161
Thursday 21 July 2022  10:29:04 +0000 (0:00:00.353)       0:00:27.458 ********* 

TASK [fedora.linux_system_roles.storage : Update facts] ************************
task path: /tmp/tmpi19f9hzy/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:183
Thursday 21 July 2022  10:29:04 +0000 (0:00:00.021)       0:00:27.479 ********* 
ok: [/cache/rhel-9.qcow2.snap]
META: role_complete for /cache/rhel-9.qcow2.snap

TASK [include_tasks] ***********************************************************
task path: /tmp/tmpfc26zqih/tests/storage/tests_change_disk_mount.yml:47
Thursday 21 July 2022  10:29:05 +0000 (0:00:00.957)       0:00:28.437 ********* 
included: /tmp/tmpfc26zqih/tests/storage/verify-role-results.yml for /cache/rhel-9.qcow2.snap

TASK [Print out pool information] **********************************************
task path: /tmp/tmpfc26zqih/tests/storage/verify-role-results.yml:1
Thursday 21 July 2022  10:29:05 +0000 (0:00:00.041)       0:00:28.479 ********* 
skipping: [/cache/rhel-9.qcow2.snap] => {}

TASK [Print out volume information] ********************************************
task path: /tmp/tmpfc26zqih/tests/storage/verify-role-results.yml:6
Thursday 21 July 2022  10:29:05 +0000 (0:00:00.039)       0:00:28.519 ********* 
ok: [/cache/rhel-9.qcow2.snap] => {
    "_storage_volumes_list": [
        {
            "_device": "/dev/sda",
            "_kernel_device": "/dev/sda",
            "_mount_id": "UUID=dc31169e-edd9-4a64-ab59-3286f5673c34",
            "_raw_device": "/dev/sda",
            "_raw_kernel_device": "/dev/sda",
            "cache_devices": [],
            "cache_mode": null,
            "cache_size": 0,
            "cached": false,
            "compression": null,
            "deduplication": null,
            "disks": [
                "sda"
            ],
            "encryption": false,
            "encryption_cipher": null,
            "encryption_key": null,
            "encryption_key_size": null,
            "encryption_luks_version": null,
            "encryption_password": null,
            "fs_create_options": "",
            "fs_label": "",
            "fs_overwrite_existing": true,
            "fs_type": "xfs",
            "mount_check": 0,
            "mount_device_identifier": "uuid",
            "mount_options": "defaults",
            "mount_passno": 0,
            "mount_point": "/opt/test2",
            "name": "test1",
            "raid_chunk_size": null,
            "raid_device_count": null,
            "raid_level": null,
            "raid_metadata_version": null,
            "raid_spare_count": null,
            "size": 10737418240,
            "state": "present",
            "thin": null,
            "thin_pool_name": null,
            "thin_pool_size": null,
            "type": "disk",
            "vdo_pool_size": null
        }
    ]
}

TASK [Collect info about the volumes.] *****************************************
task path: /tmp/tmpfc26zqih/tests/storage/verify-role-results.yml:14
Thursday 21 July 2022  10:29:05 +0000 (0:00:00.047)       0:00:28.567 ********* 
ok: [/cache/rhel-9.qcow2.snap] => {
    "changed": false,
    "info": {
        "/dev/nvme0n1": {
            "fstype": "",
            "label": "",
            "name": "/dev/nvme0n1",
            "size": "10G",
            "type": "disk",
            "uuid": ""
        },
        "/dev/nvme1n1": {
            "fstype": "",
            "label": "",
            "name": "/dev/nvme1n1",
            "size": "10G",
            "type": "disk",
            "uuid": ""
        },
        "/dev/nvme2n1": {
            "fstype": "",
            "label": "",
            "name": "/dev/nvme2n1",
            "size": "10G",
            "type": "disk",
            "uuid": ""
        },
        "/dev/sda": {
            "fstype": "xfs",
            "label": "",
            "name": "/dev/sda",
            "size": "10G",
            "type": "disk",
            "uuid": "dc31169e-edd9-4a64-ab59-3286f5673c34"
        },
        "/dev/sdb": {
            "fstype": "",
            "label": "",
            "name": "/dev/sdb",
            "size": "10G",
            "type": "disk",
            "uuid": ""
        },
        "/dev/sdc": {
            "fstype": "",
            "label": "",
            "name": "/dev/sdc",
            "size": "10G",
            "type": "disk",
            "uuid": ""
        },
        "/dev/sr0": {
            "fstype": "iso9660",
            "label": "cidata",
            "name": "/dev/sr0",
            "size": "364K",
            "type": "rom",
            "uuid": "2022-07-21-10-28-24-00"
        },
        "/dev/vda": {
            "fstype": "",
            "label": "",
            "name": "/dev/vda",
            "size": "10G",
            "type": "disk",
            "uuid": ""
        },
        "/dev/vda1": {
            "fstype": "",
            "label": "",
            "name": "/dev/vda1",
            "size": "1M",
            "type": "partition",
            "uuid": ""
        },
        "/dev/vda2": {
            "fstype": "vfat",
            "label": "",
            "name": "/dev/vda2",
            "size": "200M",
            "type": "partition",
            "uuid": "7B77-95E7"
        },
        "/dev/vda3": {
            "fstype": "xfs",
            "label": "boot",
            "name": "/dev/vda3",
            "size": "500M",
            "type": "partition",
            "uuid": "6e74e171-0370-451f-8340-f16ad2839183"
        },
        "/dev/vda4": {
            "fstype": "xfs",
            "label": "root",
            "name": "/dev/vda4",
            "size": "9.3G",
            "type": "partition",
            "uuid": "1bb53e4d-984c-4316-908a-59b5a62fa30e"
        },
        "/dev/vdb": {
            "fstype": "",
            "label": "",
            "name": "/dev/vdb",
            "size": "2G",
            "type": "disk",
            "uuid": ""
        },
        "/dev/vdb1": {
            "fstype": "ext4",
            "label": "yumcache",
            "name": "/dev/vdb1",
            "size": "2G",
            "type": "partition",
            "uuid": "12be66f7-7f03-4d54-9a48-a520825d8fa9"
        },
        "/dev/vdc": {
            "fstype": "",
            "label": "",
            "name": "/dev/vdc",
            "size": "2G",
            "type": "disk",
            "uuid": ""
        },
        "/dev/vdc1": {
            "fstype": "ext4",
            "label": "yumvarlib",
            "name": "/dev/vdc1",
            "size": "2G",
            "type": "partition",
            "uuid": "9bd037ed-b1bb-4b9c-a008-9f07d34ab071"
        },
        "/dev/vdd": {
            "fstype": "",
            "label": "",
            "name": "/dev/vdd",
            "size": "10G",
            "type": "disk",
            "uuid": ""
        },
        "/dev/vde": {
            "fstype": "",
            "label": "",
            "name": "/dev/vde",
            "size": "10G",
            "type": "disk",
            "uuid": ""
        },
        "/dev/vdf": {
            "fstype": "",
            "label": "",
            "name": "/dev/vdf",
            "size": "10G",
            "type": "disk",
            "uuid": ""
        }
    }
}
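
Note: the "Collect info about the volumes." task returns fstype, label, size, type and uuid per block device; /dev/sda now carries the xfs filesystem with UUID dc31169e-edd9-4a64-ab59-3286f5673c34. The test suite uses its own helper for this, but roughly the same information can be pulled by hand with lsblk, for example:

- name: Collect block device info manually (equivalent check, not the test's helper)
  ansible.builtin.command:
    cmd: lsblk -o NAME,FSTYPE,LABEL,SIZE,TYPE,UUID --pairs
  register: lsblk_info
  changed_when: false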

TASK [Read the /etc/fstab file for volume existence] ***************************
task path: /tmp/tmpfc26zqih/tests/storage/verify-role-results.yml:19
Thursday 21 July 2022  10:29:06 +0000 (0:00:00.391)       0:00:28.958 ********* 
ok: [/cache/rhel-9.qcow2.snap] => {
    "changed": false,
    "cmd": [
        "cat",
        "/etc/fstab"
    ],
    "delta": "0:00:00.003108",
    "end": "2022-07-21 06:29:07.151058",
    "rc": 0,
    "start": "2022-07-21 06:29:07.147950"
}

STDOUT:

UUID=6e74e171-0370-451f-8340-f16ad2839183	/boot	xfs	defaults	0	0
UUID=1bb53e4d-984c-4316-908a-59b5a62fa30e	/	xfs	defaults	0	0
UUID=7B77-95E7	/boot/efi	vfat	defaults,uid=0,gid=0,umask=077,shortname=winnt	0	2
/dev/vdb1	/var/cache/dnf	auto	defaults,nofail,x-systemd.requires=cloud-init.service,_netdev,comment=cloudconfig	0	2
/dev/vdc1	/var/lib/dnf	auto	defaults,nofail,x-systemd.requires=cloud-init.service,_netdev,comment=cloudconfig	0	2
UUID=dc31169e-edd9-4a64-ab59-3286f5673c34 /opt/test2 xfs defaults 0 0

TASK [Read the /etc/crypttab file] *********************************************
task path: /tmp/tmpfc26zqih/tests/storage/verify-role-results.yml:24
Thursday 21 July 2022  10:29:06 +0000 (0:00:00.368)       0:00:29.327 ********* 
ok: [/cache/rhel-9.qcow2.snap] => {
    "changed": false,
    "cmd": [
        "cat",
        "/etc/crypttab"
    ],
    "delta": "0:00:00.003099",
    "end": "2022-07-21 06:29:07.558333",
    "failed_when_result": false,
    "rc": 1,
    "start": "2022-07-21 06:29:07.555234"
}

STDERR:

cat: /etc/crypttab: No such file or directory


MSG:

non-zero return code
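
Note: the cat exits with rc=1 because /etc/crypttab does not exist on this host; "failed_when_result": false shows the test deliberately tolerates the non-zero return code. A sketch of a command task written that way (register name illustrative):

- name: Read the /etc/crypttab file
  ansible.builtin.command:
    cmd: cat /etc/crypttab
  register: storage_test_crypttab
  failed_when: false
  changed_when: false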

TASK [Verify the volumes listed in storage_pools were correctly managed] *******
task path: /tmp/tmpfc26zqih/tests/storage/verify-role-results.yml:33
Thursday 21 July 2022  10:29:07 +0000 (0:00:00.407)       0:00:29.735 ********* 

TASK [Verify the volumes with no pool were correctly managed] ******************
task path: /tmp/tmpfc26zqih/tests/storage/verify-role-results.yml:43
Thursday 21 July 2022  10:29:07 +0000 (0:00:00.021)       0:00:29.756 ********* 
included: /tmp/tmpfc26zqih/tests/storage/test-verify-volume.yml for /cache/rhel-9.qcow2.snap => (item={'encryption': False, 'encryption_cipher': None, 'encryption_key': None, 'encryption_key_size': None, 'encryption_luks_version': None, 'encryption_password': None, 'fs_create_options': '', 'fs_label': '', 'fs_type': 'xfs', 'mount_options': 'defaults', 'mount_point': '/opt/test2', 'name': 'test1', 'raid_level': None, 'size': 10737418240, 'state': 'present', 'type': 'disk', 'disks': ['sda'], 'raid_device_count': None, 'raid_spare_count': None, 'raid_metadata_version': None, 'fs_overwrite_existing': True, 'mount_check': 0, 'mount_passno': 0, 'mount_device_identifier': 'uuid', 'raid_chunk_size': None, 'compression': None, 'deduplication': None, 'vdo_pool_size': None, 'thin': None, 'thin_pool_name': None, 'thin_pool_size': None, 'cached': False, 'cache_size': 0, 'cache_mode': None, 'cache_devices': [], '_device': '/dev/sda', '_raw_device': '/dev/sda', '_mount_id': 'UUID=dc31169e-edd9-4a64-ab59-3286f5673c34', '_kernel_device': '/dev/sda', '_raw_kernel_device': '/dev/sda'})

TASK [set_fact] ****************************************************************
task path: /tmp/tmpfc26zqih/tests/storage/test-verify-volume.yml:2
Thursday 21 July 2022  10:29:07 +0000 (0:00:00.105)       0:00:29.862 ********* 
ok: [/cache/rhel-9.qcow2.snap] => {
    "ansible_facts": {
        "_storage_test_volume_present": true,
        "_storage_volume_tests": [
            "mount",
            "fstab",
            "fs",
            "device",
            "encryption",
            "md",
            "size",
            "cache"
        ]
    },
    "changed": false
}

TASK [include_tasks] ***********************************************************
task path: /tmp/tmpfc26zqih/tests/storage/test-verify-volume.yml:10
Thursday 21 July 2022  10:29:07 +0000 (0:00:00.046)       0:00:29.908 ********* 
included: /tmp/tmpfc26zqih/tests/storage/test-verify-volume-mount.yml for /cache/rhel-9.qcow2.snap => (item=mount)
included: /tmp/tmpfc26zqih/tests/storage/test-verify-volume-fstab.yml for /cache/rhel-9.qcow2.snap => (item=fstab)
included: /tmp/tmpfc26zqih/tests/storage/test-verify-volume-fs.yml for /cache/rhel-9.qcow2.snap => (item=fs)
included: /tmp/tmpfc26zqih/tests/storage/test-verify-volume-device.yml for /cache/rhel-9.qcow2.snap => (item=device)
included: /tmp/tmpfc26zqih/tests/storage/test-verify-volume-encryption.yml for /cache/rhel-9.qcow2.snap => (item=encryption)
included: /tmp/tmpfc26zqih/tests/storage/test-verify-volume-md.yml for /cache/rhel-9.qcow2.snap => (item=md)
included: /tmp/tmpfc26zqih/tests/storage/test-verify-volume-size.yml for /cache/rhel-9.qcow2.snap => (item=size)
included: /tmp/tmpfc26zqih/tests/storage/test-verify-volume-cache.yml for /cache/rhel-9.qcow2.snap => (item=cache)

TASK [Get expected mount device based on device type] **************************
task path: /tmp/tmpfc26zqih/tests/storage/test-verify-volume-mount.yml:6
Thursday 21 July 2022  10:29:07 +0000 (0:00:00.071)       0:00:29.980 ********* 
ok: [/cache/rhel-9.qcow2.snap] => {
    "ansible_facts": {
        "storage_test_device_path": "/dev/sda"
    },
    "changed": false
}

TASK [Set some facts] **********************************************************
task path: /tmp/tmpfc26zqih/tests/storage/test-verify-volume-mount.yml:10
Thursday 21 July 2022  10:29:07 +0000 (0:00:00.040)       0:00:30.021 ********* 
ok: [/cache/rhel-9.qcow2.snap] => {
    "ansible_facts": {
        "storage_test_mount_device_matches": [
            {
                "block_available": 2592358,
                "block_size": 4096,
                "block_total": 2618880,
                "block_used": 26522,
                "device": "/dev/sda",
                "fstype": "xfs",
                "inode_available": 5242877,
                "inode_total": 5242880,
                "inode_used": 3,
                "mount": "/opt/test2",
                "options": "rw,seclabel,relatime,attr2,inode64,logbufs=8,logbsize=32k,noquota",
                "size_available": 10618298368,
                "size_total": 10726932480,
                "uuid": "dc31169e-edd9-4a64-ab59-3286f5673c34"
            }
        ],
        "storage_test_mount_expected_match_count": "1",
        "storage_test_mount_point_matches": [
            {
                "block_available": 2592358,
                "block_size": 4096,
                "block_total": 2618880,
                "block_used": 26522,
                "device": "/dev/sda",
                "fstype": "xfs",
                "inode_available": 5242877,
                "inode_total": 5242880,
                "inode_used": 3,
                "mount": "/opt/test2",
                "options": "rw,seclabel,relatime,attr2,inode64,logbufs=8,logbsize=32k,noquota",
                "size_available": 10618298368,
                "size_total": 10726932480,
                "uuid": "dc31169e-edd9-4a64-ab59-3286f5673c34"
            }
        ],
        "storage_test_swap_expected_matches": "0"
    },
    "changed": false
}
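
Note: these match lists are presumably filtered out of the gathered mount facts; exactly one mounted filesystem is expected to match both the device /dev/sda and the mount point /opt/test2, and no swap entries are expected. A sketch of that kind of filtering, assuming ansible_mounts and selectattr (the real expressions in test-verify-volume-mount.yml may differ):

- name: Set some facts
  ansible.builtin.set_fact:
    storage_test_mount_device_matches: "{{ ansible_mounts | selectattr('device', 'equalto', '/dev/sda') | list }}"
    storage_test_mount_point_matches: "{{ ansible_mounts | selectattr('mount', 'equalto', '/opt/test2') | list }}"
    storage_test_mount_expected_match_count: "1"
    storage_test_swap_expected_matches: "0"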

TASK [Verify the current mount state by device] ********************************
task path: /tmp/tmpfc26zqih/tests/storage/test-verify-volume-mount.yml:20
Thursday 21 July 2022  10:29:07 +0000 (0:00:00.053)       0:00:30.074 ********* 
ok: [/cache/rhel-9.qcow2.snap] => {
    "changed": false
}

MSG:

All assertions passed

TASK [Verify the current mount state by mount point] ***************************
task path: /tmp/tmpfc26zqih/tests/storage/test-verify-volume-mount.yml:29
Thursday 21 July 2022  10:29:07 +0000 (0:00:00.049)       0:00:30.124 ********* 
ok: [/cache/rhel-9.qcow2.snap] => {
    "changed": false
}

MSG:

All assertions passed

TASK [Verify the mount fs type] ************************************************
task path: /tmp/tmpfc26zqih/tests/storage/test-verify-volume-mount.yml:37
Thursday 21 July 2022  10:29:07 +0000 (0:00:00.048)       0:00:30.173 ********* 
ok: [/cache/rhel-9.qcow2.snap] => {
    "changed": false
}

MSG:

All assertions passed

TASK [command] *****************************************************************
task path: /tmp/tmpfc26zqih/tests/storage/test-verify-volume-mount.yml:46
Thursday 21 July 2022  10:29:07 +0000 (0:00:00.049)       0:00:30.222 ********* 
skipping: [/cache/rhel-9.qcow2.snap] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [Gather swap info] ********************************************************
task path: /tmp/tmpfc26zqih/tests/storage/test-verify-volume-mount.yml:50
Thursday 21 July 2022  10:29:07 +0000 (0:00:00.023)       0:00:30.246 ********* 
skipping: [/cache/rhel-9.qcow2.snap] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [Verify swap status] ******************************************************
task path: /tmp/tmpfc26zqih/tests/storage/test-verify-volume-mount.yml:55
Thursday 21 July 2022  10:29:07 +0000 (0:00:00.023)       0:00:30.270 ********* 
skipping: [/cache/rhel-9.qcow2.snap] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [Unset facts] *************************************************************
task path: /tmp/tmpfc26zqih/tests/storage/test-verify-volume-mount.yml:65
Thursday 21 July 2022  10:29:07 +0000 (0:00:00.024)       0:00:30.294 ********* 
ok: [/cache/rhel-9.qcow2.snap] => {
    "ansible_facts": {
        "storage_test_mount_device_matches": null,
        "storage_test_mount_expected_match_count": null,
        "storage_test_mount_point_matches": null,
        "storage_test_swap_expected_matches": null,
        "storage_test_swaps": null,
        "storage_test_sys_node": null
    },
    "changed": false
}

TASK [Set some variables for fstab checking] ***********************************
task path: /tmp/tmpfc26zqih/tests/storage/test-verify-volume-fstab.yml:2
Thursday 21 July 2022  10:29:07 +0000 (0:00:00.032)       0:00:30.326 ********* 
ok: [/cache/rhel-9.qcow2.snap] => {
    "ansible_facts": {
        "storage_test_fstab_expected_id_matches": "1",
        "storage_test_fstab_expected_mount_options_matches": "1",
        "storage_test_fstab_expected_mount_point_matches": "1",
        "storage_test_fstab_id_matches": [
            "UUID=dc31169e-edd9-4a64-ab59-3286f5673c34 "
        ],
        "storage_test_fstab_mount_options_matches": [
            " /opt/test2 xfs defaults "
        ],
        "storage_test_fstab_mount_point_matches": [
            " /opt/test2 "
        ]
    },
    "changed": false
}
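
Note: the expected counts of "1" are checked against these match lists by the assertions that follow. A sketch of the kind of comparison behind "Verify that the device identifier appears in /etc/fstab" (illustrative, the actual assertion lives in test-verify-volume-fstab.yml):

- name: Verify that the device identifier appears in /etc/fstab
  ansible.builtin.assert:
    that:
      - storage_test_fstab_id_matches | length == storage_test_fstab_expected_id_matches | int
    msg: Expected UUID=dc31169e-edd9-4a64-ab59-3286f5673c34 to appear exactly once in /etc/fstab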

TASK [Verify that the device identifier appears in /etc/fstab] *****************
task path: /tmp/tmpfc26zqih/tests/storage/test-verify-volume-fstab.yml:12
Thursday 21 July 2022  10:29:07 +0000 (0:00:00.059)       0:00:30.386 ********* 
ok: [/cache/rhel-9.qcow2.snap] => {
    "changed": false
}

MSG:

All assertions passed

TASK [Verify the fstab mount point] ********************************************
task path: /tmp/tmpfc26zqih/tests/storage/test-verify-volume-fstab.yml:19
Thursday 21 July 2022  10:29:07 +0000 (0:00:00.049)       0:00:30.436 ********* 
ok: [/cache/rhel-9.qcow2.snap] => {
    "changed": false
}

MSG:

All assertions passed

TASK [Verify mount_options] ****************************************************
task path: /tmp/tmpfc26zqih/tests/storage/test-verify-volume-fstab.yml:25
Thursday 21 July 2022  10:29:07 +0000 (0:00:00.047)       0:00:30.483 ********* 
skipping: [/cache/rhel-9.qcow2.snap] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [Clean up variables] ******************************************************
task path: /tmp/tmpfc26zqih/tests/storage/test-verify-volume-fstab.yml:34
Thursday 21 July 2022  10:29:07 +0000 (0:00:00.035)       0:00:30.518 ********* 
ok: [/cache/rhel-9.qcow2.snap] => {
    "ansible_facts": {
        "storage_test_fstab_expected_id_matches": null,
        "storage_test_fstab_expected_mount_options_matches": null,
        "storage_test_fstab_expected_mount_point_matches": null,
        "storage_test_fstab_id_matches": null,
        "storage_test_fstab_mount_options_matches": null,
        "storage_test_fstab_mount_point_matches": null
    },
    "changed": false
}

TASK [Verify fs type] **********************************************************
task path: /tmp/tmpfc26zqih/tests/storage/test-verify-volume-fs.yml:4
Thursday 21 July 2022  10:29:07 +0000 (0:00:00.034)       0:00:30.553 ********* 
ok: [/cache/rhel-9.qcow2.snap] => {
    "changed": false
}

MSG:

All assertions passed

TASK [Verify fs label] *********************************************************
task path: /tmp/tmpfc26zqih/tests/storage/test-verify-volume-fs.yml:10
Thursday 21 July 2022  10:29:08 +0000 (0:00:00.041)       0:00:30.594 ********* 
ok: [/cache/rhel-9.qcow2.snap] => {
    "changed": false
}

MSG:

All assertions passed

TASK [See whether the device node is present] **********************************
task path: /tmp/tmpfc26zqih/tests/storage/test-verify-volume-device.yml:4
Thursday 21 July 2022  10:29:08 +0000 (0:00:00.038)       0:00:30.633 ********* 
ok: [/cache/rhel-9.qcow2.snap] => {
    "changed": false,
    "stat": {
        "atime": 1658399330.874754,
        "attr_flags": "",
        "attributes": [],
        "block_size": 4096,
        "blocks": 0,
        "charset": "binary",
        "ctime": 1658399330.874754,
        "dev": 5,
        "device_type": 2048,
        "executable": false,
        "exists": true,
        "gid": 6,
        "gr_name": "disk",
        "inode": 356,
        "isblk": true,
        "ischr": false,
        "isdir": false,
        "isfifo": false,
        "isgid": false,
        "islnk": false,
        "isreg": false,
        "issock": false,
        "isuid": false,
        "mimetype": "inode/blockdevice",
        "mode": "0660",
        "mtime": 1658399330.874754,
        "nlink": 1,
        "path": "/dev/sda",
        "pw_name": "root",
        "readable": true,
        "rgrp": true,
        "roth": false,
        "rusr": true,
        "size": 0,
        "uid": 0,
        "version": null,
        "wgrp": true,
        "woth": false,
        "writeable": true,
        "wusr": true,
        "xgrp": false,
        "xoth": false,
        "xusr": false
    }
}
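
The stat output above is what a plain stat task against the volume's _device path returns for a block device node. A sketch of the check that this task and the next one suggest; the register name storage_test_dev and the exact assertion are assumptions:

    - name: See whether the device node is present
      stat:
        path: "{{ storage_test_volume._device }}"   # /dev/sda in this run
      register: storage_test_dev                    # register name is an assumption

    - name: Verify the presence/absence of the device node
      assert:
        that:
          - storage_test_dev.stat.exists and storage_test_dev.stat.isblk
        msg: "Unexpected device node state for {{ storage_test_volume._device }}"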

TASK [Verify the presence/absence of the device node] **************************
task path: /tmp/tmpfc26zqih/tests/storage/test-verify-volume-device.yml:10
Thursday 21 July 2022  10:29:08 +0000 (0:00:00.372)       0:00:31.006 ********* 
ok: [/cache/rhel-9.qcow2.snap] => {
    "changed": false
}

MSG:

All assertions passed

TASK [Make sure we got info about this volume] *********************************
task path: /tmp/tmpfc26zqih/tests/storage/test-verify-volume-device.yml:15
Thursday 21 July 2022  10:29:08 +0000 (0:00:00.037)       0:00:31.043 ********* 
ok: [/cache/rhel-9.qcow2.snap] => {
    "changed": false
}

MSG:

All assertions passed

TASK [(1/2) Process volume type (set initial value)] ***************************
task path: /tmp/tmpfc26zqih/tests/storage/test-verify-volume-device.yml:21
Thursday 21 July 2022  10:29:08 +0000 (0:00:00.038)       0:00:31.081 ********* 
ok: [/cache/rhel-9.qcow2.snap] => {
    "ansible_facts": {
        "st_volume_type": "disk"
    },
    "changed": false
}

TASK [(2/2) Process volume type (get RAID value)] ******************************
task path: /tmp/tmpfc26zqih/tests/storage/test-verify-volume-device.yml:25
Thursday 21 July 2022  10:29:08 +0000 (0:00:00.034)       0:00:31.116 ********* 
skipping: [/cache/rhel-9.qcow2.snap] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [Verify the volume's device type] *****************************************
task path: /tmp/tmpfc26zqih/tests/storage/test-verify-volume-device.yml:30
Thursday 21 July 2022  10:29:08 +0000 (0:00:00.023)       0:00:31.139 ********* 
ok: [/cache/rhel-9.qcow2.snap] => {
    "changed": false
}

MSG:

All assertions passed

TASK [Stat the LUKS device, if encrypted] **************************************
task path: /tmp/tmpfc26zqih/tests/storage/test-verify-volume-encryption.yml:3
Thursday 21 July 2022  10:29:08 +0000 (0:00:00.037)       0:00:31.176 ********* 
skipping: [/cache/rhel-9.qcow2.snap] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [Ensure cryptsetup is present] ********************************************
task path: /tmp/tmpfc26zqih/tests/storage/test-verify-volume-encryption.yml:10
Thursday 21 July 2022  10:29:08 +0000 (0:00:00.023)       0:00:31.200 ********* 
ok: [/cache/rhel-9.qcow2.snap] => {
    "changed": false,
    "rc": 0,
    "results": []
}

MSG:

Nothing to do
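
"Nothing to do" with rc 0 and an empty results list is the usual dnf response when the package is already installed. A sketch of what this task likely amounts to; the choice of the generic package module is an assumption:

    - name: Ensure cryptsetup is present
      package:
        name: cryptsetup
        state: present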

TASK [Collect LUKS info for this volume] ***************************************
task path: /tmp/tmpfc26zqih/tests/storage/test-verify-volume-encryption.yml:15
Thursday 21 July 2022  10:29:09 +0000 (0:00:00.865)       0:00:32.065 ********* 
skipping: [/cache/rhel-9.qcow2.snap] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [Verify the presence/absence of the LUKS device node] *********************
task path: /tmp/tmpfc26zqih/tests/storage/test-verify-volume-encryption.yml:21
Thursday 21 July 2022  10:29:09 +0000 (0:00:00.025)       0:00:32.090 ********* 
skipping: [/cache/rhel-9.qcow2.snap] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [Verify that the raw device is the same as the device if not encrypted] ***
task path: /tmp/tmpfc26zqih/tests/storage/test-verify-volume-encryption.yml:27
Thursday 21 July 2022  10:29:09 +0000 (0:00:00.024)       0:00:32.115 ********* 
ok: [/cache/rhel-9.qcow2.snap] => {
    "changed": false
}

MSG:

All assertions passed

TASK [Make sure we got info about the LUKS volume if encrypted] ****************
task path: /tmp/tmpfc26zqih/tests/storage/test-verify-volume-encryption.yml:33
Thursday 21 July 2022  10:29:09 +0000 (0:00:00.054)       0:00:32.169 ********* 
skipping: [/cache/rhel-9.qcow2.snap] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [Verify the LUKS volume's device type if encrypted] ***********************
task path: /tmp/tmpfc26zqih/tests/storage/test-verify-volume-encryption.yml:39
Thursday 21 July 2022  10:29:09 +0000 (0:00:00.035)       0:00:32.204 ********* 
skipping: [/cache/rhel-9.qcow2.snap] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [Check LUKS version] ******************************************************
task path: /tmp/tmpfc26zqih/tests/storage/test-verify-volume-encryption.yml:44
Thursday 21 July 2022  10:29:09 +0000 (0:00:00.032)       0:00:32.237 ********* 
skipping: [/cache/rhel-9.qcow2.snap] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [Check LUKS key size] *****************************************************
task path: /tmp/tmpfc26zqih/tests/storage/test-verify-volume-encryption.yml:50
Thursday 21 July 2022  10:29:09 +0000 (0:00:00.033)       0:00:32.271 ********* 
skipping: [/cache/rhel-9.qcow2.snap] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [Check LUKS cipher] *******************************************************
task path: /tmp/tmpfc26zqih/tests/storage/test-verify-volume-encryption.yml:56
Thursday 21 July 2022  10:29:09 +0000 (0:00:00.032)       0:00:32.304 ********* 
skipping: [/cache/rhel-9.qcow2.snap] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [set_fact] ****************************************************************
task path: /tmp/tmpfc26zqih/tests/storage/test-verify-volume-encryption.yml:62
Thursday 21 July 2022  10:29:09 +0000 (0:00:00.034)       0:00:32.339 ********* 
ok: [/cache/rhel-9.qcow2.snap] => {
    "ansible_facts": {
        "_storage_test_crypttab_entries": [],
        "_storage_test_expected_crypttab_entries": "0",
        "_storage_test_expected_crypttab_key_file": "-"
    },
    "changed": false
}

TASK [Check for /etc/crypttab entry] *******************************************
task path: /tmp/tmpfc26zqih/tests/storage/test-verify-volume-encryption.yml:67
Thursday 21 July 2022  10:29:09 +0000 (0:00:00.048)       0:00:32.387 ********* 
ok: [/cache/rhel-9.qcow2.snap] => {
    "changed": false
}

MSG:

All assertions passed

TASK [Validate the format of the crypttab entry] *******************************
task path: /tmp/tmpfc26zqih/tests/storage/test-verify-volume-encryption.yml:72
Thursday 21 July 2022  10:29:09 +0000 (0:00:00.048)       0:00:32.435 ********* 
skipping: [/cache/rhel-9.qcow2.snap] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [Check backing device of crypttab entry] **********************************
task path: /tmp/tmpfc26zqih/tests/storage/test-verify-volume-encryption.yml:78
Thursday 21 July 2022  10:29:09 +0000 (0:00:00.038)       0:00:32.473 ********* 
skipping: [/cache/rhel-9.qcow2.snap] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [Check key file of crypttab entry] ****************************************
task path: /tmp/tmpfc26zqih/tests/storage/test-verify-volume-encryption.yml:84
Thursday 21 July 2022  10:29:09 +0000 (0:00:00.038)       0:00:32.511 ********* 
skipping: [/cache/rhel-9.qcow2.snap] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [set_fact] ****************************************************************
task path: /tmp/tmpfc26zqih/tests/storage/test-verify-volume-encryption.yml:90
Thursday 21 July 2022  10:29:09 +0000 (0:00:00.036)       0:00:32.547 ********* 
ok: [/cache/rhel-9.qcow2.snap] => {
    "ansible_facts": {
        "_storage_test_crypttab_entries": null,
        "_storage_test_expected_crypttab_entries": null,
        "_storage_test_expected_crypttab_key_file": null
    },
    "changed": false
}

TASK [get information about RAID] **********************************************
task path: /tmp/tmpfc26zqih/tests/storage/test-verify-volume-md.yml:7
Thursday 21 July 2022  10:29:10 +0000 (0:00:00.034)       0:00:32.581 ********* 
skipping: [/cache/rhel-9.qcow2.snap] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [set_fact] ****************************************************************
task path: /tmp/tmpfc26zqih/tests/storage/test-verify-volume-md.yml:13
Thursday 21 July 2022  10:29:10 +0000 (0:00:00.035)       0:00:32.617 ********* 
skipping: [/cache/rhel-9.qcow2.snap] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [set_fact] ****************************************************************
task path: /tmp/tmpfc26zqih/tests/storage/test-verify-volume-md.yml:17
Thursday 21 July 2022  10:29:10 +0000 (0:00:00.034)       0:00:32.651 ********* 
skipping: [/cache/rhel-9.qcow2.snap] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [set_fact] ****************************************************************
task path: /tmp/tmpfc26zqih/tests/storage/test-verify-volume-md.yml:21
Thursday 21 July 2022  10:29:10 +0000 (0:00:00.034)       0:00:32.685 ********* 
skipping: [/cache/rhel-9.qcow2.snap] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [check RAID active devices count] *****************************************
task path: /tmp/tmpfc26zqih/tests/storage/test-verify-volume-md.yml:25
Thursday 21 July 2022  10:29:10 +0000 (0:00:00.033)       0:00:32.719 ********* 
skipping: [/cache/rhel-9.qcow2.snap] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [check RAID spare devices count] ******************************************
task path: /tmp/tmpfc26zqih/tests/storage/test-verify-volume-md.yml:31
Thursday 21 July 2022  10:29:10 +0000 (0:00:00.035)       0:00:32.755 ********* 
skipping: [/cache/rhel-9.qcow2.snap] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [check RAID metadata version] *********************************************
task path: /tmp/tmpfc26zqih/tests/storage/test-verify-volume-md.yml:37
Thursday 21 July 2022  10:29:10 +0000 (0:00:00.036)       0:00:32.792 ********* 
skipping: [/cache/rhel-9.qcow2.snap] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [parse the actual size of the volume] *************************************
task path: /tmp/tmpfc26zqih/tests/storage/test-verify-volume-size.yml:3
Thursday 21 July 2022  10:29:10 +0000 (0:00:00.036)       0:00:32.828 ********* 
skipping: [/cache/rhel-9.qcow2.snap] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [parse the requested size of the volume] **********************************
task path: /tmp/tmpfc26zqih/tests/storage/test-verify-volume-size.yml:9
Thursday 21 July 2022  10:29:10 +0000 (0:00:00.024)       0:00:32.853 ********* 
skipping: [/cache/rhel-9.qcow2.snap] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [Establish base value for expected size] **********************************
task path: /tmp/tmpfc26zqih/tests/storage/test-verify-volume-size.yml:15
Thursday 21 July 2022  10:29:10 +0000 (0:00:00.037)       0:00:32.890 ********* 
skipping: [/cache/rhel-9.qcow2.snap] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [debug] *******************************************************************
task path: /tmp/tmpfc26zqih/tests/storage/test-verify-volume-size.yml:20
Thursday 21 July 2022  10:29:10 +0000 (0:00:00.035)       0:00:32.926 ********* 
ok: [/cache/rhel-9.qcow2.snap] => {
    "storage_test_expected_size": "VARIABLE IS NOT DEFINED!: 'storage_test_expected_size' is undefined"
}

TASK [debug] *******************************************************************
task path: /tmp/tmpfc26zqih/tests/storage/test-verify-volume-size.yml:25
Thursday 21 July 2022  10:29:10 +0000 (0:00:00.034)       0:00:32.961 ********* 
skipping: [/cache/rhel-9.qcow2.snap] => {}

TASK [debug] *******************************************************************
task path: /tmp/tmpfc26zqih/tests/storage/test-verify-volume-size.yml:28
Thursday 21 July 2022  10:29:10 +0000 (0:00:00.035)       0:00:32.996 ********* 
skipping: [/cache/rhel-9.qcow2.snap] => {}

TASK [Get the size of parent/pool device] **************************************
task path: /tmp/tmpfc26zqih/tests/storage/test-verify-volume-size.yml:31
Thursday 21 July 2022  10:29:10 +0000 (0:00:00.037)       0:00:33.034 ********* 
skipping: [/cache/rhel-9.qcow2.snap] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [debug] *******************************************************************
task path: /tmp/tmpfc26zqih/tests/storage/test-verify-volume-size.yml:36
Thursday 21 July 2022  10:29:10 +0000 (0:00:00.036)       0:00:33.071 ********* 
skipping: [/cache/rhel-9.qcow2.snap] => {}

TASK [Calculate the expected size based on pool size and percentage value] *****
task path: /tmp/tmpfc26zqih/tests/storage/test-verify-volume-size.yml:39
Thursday 21 July 2022  10:29:10 +0000 (0:00:00.035)       0:00:33.106 ********* 
skipping: [/cache/rhel-9.qcow2.snap] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [debug] *******************************************************************
task path: /tmp/tmpfc26zqih/tests/storage/test-verify-volume-size.yml:44
Thursday 21 July 2022  10:29:10 +0000 (0:00:00.038)       0:00:33.145 ********* 
ok: [/cache/rhel-9.qcow2.snap] => {
    "storage_test_actual_size": {
        "changed": false,
        "skip_reason": "Conditional result was False",
        "skipped": true
    }
}

TASK [debug] *******************************************************************
task path: /tmp/tmpfc26zqih/tests/storage/test-verify-volume-size.yml:47
Thursday 21 July 2022  10:29:10 +0000 (0:00:00.040)       0:00:33.186 ********* 
ok: [/cache/rhel-9.qcow2.snap] => {
    "storage_test_expected_size": "VARIABLE IS NOT DEFINED!: 'storage_test_expected_size' is undefined"
}

TASK [assert] ******************************************************************
task path: /tmp/tmpfc26zqih/tests/storage/test-verify-volume-size.yml:50
Thursday 21 July 2022  10:29:10 +0000 (0:00:00.034)       0:00:33.221 ********* 
skipping: [/cache/rhel-9.qcow2.snap] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [Get information about the LV] ********************************************
task path: /tmp/tmpfc26zqih/tests/storage/test-verify-volume-cache.yml:6
Thursday 21 July 2022  10:29:10 +0000 (0:00:00.025)       0:00:33.246 ********* 
skipping: [/cache/rhel-9.qcow2.snap] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [set_fact] ****************************************************************
task path: /tmp/tmpfc26zqih/tests/storage/test-verify-volume-cache.yml:14
Thursday 21 July 2022  10:29:10 +0000 (0:00:00.024)       0:00:33.271 ********* 
skipping: [/cache/rhel-9.qcow2.snap] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [check segment type] ******************************************************
task path: /tmp/tmpfc26zqih/tests/storage/test-verify-volume-cache.yml:17
Thursday 21 July 2022  10:29:10 +0000 (0:00:00.026)       0:00:33.297 ********* 
skipping: [/cache/rhel-9.qcow2.snap] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [set_fact] ****************************************************************
task path: /tmp/tmpfc26zqih/tests/storage/test-verify-volume-cache.yml:22
Thursday 21 July 2022  10:29:10 +0000 (0:00:00.023)       0:00:33.321 ********* 
skipping: [/cache/rhel-9.qcow2.snap] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [parse the requested cache size] ******************************************
task path: /tmp/tmpfc26zqih/tests/storage/test-verify-volume-cache.yml:26
Thursday 21 July 2022  10:29:10 +0000 (0:00:00.024)       0:00:33.345 ********* 
skipping: [/cache/rhel-9.qcow2.snap] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [set_fact] ****************************************************************
task path: /tmp/tmpfc26zqih/tests/storage/test-verify-volume-cache.yml:32
Thursday 21 July 2022  10:29:10 +0000 (0:00:00.023)       0:00:33.369 ********* 
skipping: [/cache/rhel-9.qcow2.snap] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [Check cache size] ********************************************************
task path: /tmp/tmpfc26zqih/tests/storage/test-verify-volume-cache.yml:36
Thursday 21 July 2022  10:29:10 +0000 (0:00:00.025)       0:00:33.394 ********* 
skipping: [/cache/rhel-9.qcow2.snap] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [Clean up facts] **********************************************************
task path: /tmp/tmpfc26zqih/tests/storage/test-verify-volume.yml:16
Thursday 21 July 2022  10:29:10 +0000 (0:00:00.054)       0:00:33.449 ********* 
ok: [/cache/rhel-9.qcow2.snap] => {
    "ansible_facts": {
        "_storage_test_volume_present": null
    },
    "changed": false
}

TASK [Clean up variable namespace] *********************************************
task path: /tmp/tmpfc26zqih/tests/storage/verify-role-results.yml:53
Thursday 21 July 2022  10:29:10 +0000 (0:00:00.032)       0:00:33.481 ********* 
ok: [/cache/rhel-9.qcow2.snap] => {
    "ansible_facts": {
        "storage_test_blkinfo": null,
        "storage_test_crypttab": null,
        "storage_test_fstab": null
    },
    "changed": false
}

TASK [Repeat the previous invocation to verify idempotence] ********************
task path: /tmp/tmpfc26zqih/tests/storage/tests_change_disk_mount.yml:49
Thursday 21 July 2022  10:29:10 +0000 (0:00:00.033)       0:00:33.514 ********* 

TASK [fedora.linux_system_roles.storage : set platform/version specific variables] ***
task path: /tmp/tmpi19f9hzy/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:2
Thursday 21 July 2022  10:29:10 +0000 (0:00:00.049)       0:00:33.564 ********* 
included: /tmp/tmpi19f9hzy/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml for /cache/rhel-9.qcow2.snap

TASK [fedora.linux_system_roles.storage : Ensure ansible_facts used by role] ***
task path: /tmp/tmpi19f9hzy/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:2
Thursday 21 July 2022  10:29:11 +0000 (0:00:00.034)       0:00:33.599 ********* 
ok: [/cache/rhel-9.qcow2.snap]

TASK [fedora.linux_system_roles.storage : Set platform/version specific variables] ***
task path: /tmp/tmpi19f9hzy/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:7
Thursday 21 July 2022  10:29:11 +0000 (0:00:00.517)       0:00:34.117 ********* 
skipping: [/cache/rhel-9.qcow2.snap] => (item=RedHat.yml)  => {
    "ansible_loop_var": "item",
    "changed": false,
    "item": "RedHat.yml",
    "skip_reason": "Conditional result was False"
}
skipping: [/cache/rhel-9.qcow2.snap] => (item=RedHat.yml)  => {
    "ansible_loop_var": "item",
    "changed": false,
    "item": "RedHat.yml",
    "skip_reason": "Conditional result was False"
}
ok: [/cache/rhel-9.qcow2.snap] => (item=RedHat_9.yml) => {
    "ansible_facts": {
        "blivet_package_list": [
            "python3-blivet",
            "libblockdev-crypto",
            "libblockdev-dm",
            "libblockdev-lvm",
            "libblockdev-mdraid",
            "libblockdev-swap",
            "vdo",
            "kmod-kvdo",
            "xfsprogs"
        ]
    },
    "ansible_included_var_files": [
        "/tmp/tmpi19f9hzy/ansible_collections/fedora/linux_system_roles/roles/storage/vars/RedHat_9.yml"
    ],
    "ansible_loop_var": "item",
    "changed": false,
    "item": "RedHat_9.yml"
}
skipping: [/cache/rhel-9.qcow2.snap] => (item=RedHat_9.0.yml)  => {
    "ansible_loop_var": "item",
    "changed": false,
    "item": "RedHat_9.0.yml",
    "skip_reason": "Conditional result was False"
}

TASK [fedora.linux_system_roles.storage : define an empty list of pools to be used in testing] ***
task path: /tmp/tmpi19f9hzy/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:5
Thursday 21 July 2022  10:29:11 +0000 (0:00:00.065)       0:00:34.182 ********* 
ok: [/cache/rhel-9.qcow2.snap] => {
    "ansible_facts": {
        "_storage_pools_list": []
    },
    "changed": false
}

TASK [fedora.linux_system_roles.storage : define an empty list of volumes to be used in testing] ***
task path: /tmp/tmpi19f9hzy/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:9
Thursday 21 July 2022  10:29:11 +0000 (0:00:00.033)       0:00:34.216 ********* 
ok: [/cache/rhel-9.qcow2.snap] => {
    "ansible_facts": {
        "_storage_volumes_list": []
    },
    "changed": false
}

TASK [fedora.linux_system_roles.storage : include the appropriate provider tasks] ***
task path: /tmp/tmpi19f9hzy/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:13
Thursday 21 July 2022  10:29:11 +0000 (0:00:00.033)       0:00:34.249 ********* 
redirecting (type: modules) ansible.builtin.mount to ansible.posix.mount
redirecting (type: modules) ansible.builtin.mount to ansible.posix.mount
included: /tmp/tmpi19f9hzy/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml for /cache/rhel-9.qcow2.snap

TASK [fedora.linux_system_roles.storage : get a list of rpm packages installed on host machine] ***
task path: /tmp/tmpi19f9hzy/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:2
Thursday 21 July 2022  10:29:11 +0000 (0:00:00.053)       0:00:34.303 ********* 
skipping: [/cache/rhel-9.qcow2.snap] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [fedora.linux_system_roles.storage : make sure blivet is available] *******
task path: /tmp/tmpi19f9hzy/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:7
Thursday 21 July 2022  10:29:11 +0000 (0:00:00.021)       0:00:34.324 ********* 
skipping: [/cache/rhel-9.qcow2.snap] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [fedora.linux_system_roles.storage : show storage_pools] ******************
task path: /tmp/tmpi19f9hzy/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:13
Thursday 21 July 2022  10:29:11 +0000 (0:00:00.026)       0:00:34.351 ********* 
ok: [/cache/rhel-9.qcow2.snap] => {
    "storage_pools": "VARIABLE IS NOT DEFINED!: 'storage_pools' is undefined"
}

TASK [fedora.linux_system_roles.storage : show storage_volumes] ****************
task path: /tmp/tmpi19f9hzy/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:18
Thursday 21 July 2022  10:29:11 +0000 (0:00:00.037)       0:00:34.388 ********* 
ok: [/cache/rhel-9.qcow2.snap] => {
    "storage_volumes": [
        {
            "disks": [
                "sda"
            ],
            "mount_point": "/opt/test2",
            "name": "test1",
            "type": "disk"
        }
    ]
}
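
This is the same single-disk volume spec the test has been applying; a minimal sketch of the re-invocation at tests_change_disk_mount.yml:49 that produces it. The role name and volume parameters are taken from the log, while the exact task layout in the test file is an assumption:

    - name: Repeat the previous invocation to verify idempotence
      include_role:
        name: fedora.linux_system_roles.storage
      vars:
        storage_volumes:
          - name: test1
            type: disk
            disks:
              - sda
            mount_point: /opt/test2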

TASK [fedora.linux_system_roles.storage : get required packages] ***************
task path: /tmp/tmpi19f9hzy/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:23
Thursday 21 July 2022  10:29:11 +0000 (0:00:00.035)       0:00:34.423 ********* 
skipping: [/cache/rhel-9.qcow2.snap] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [fedora.linux_system_roles.storage : enable copr repositories if needed] ***
task path: /tmp/tmpi19f9hzy/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:35
Thursday 21 July 2022  10:29:11 +0000 (0:00:00.026)       0:00:34.450 ********* 
skipping: [/cache/rhel-9.qcow2.snap] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [fedora.linux_system_roles.storage : make sure required packages are installed] ***
task path: /tmp/tmpi19f9hzy/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:41
Thursday 21 July 2022  10:29:11 +0000 (0:00:00.026)       0:00:34.477 ********* 
skipping: [/cache/rhel-9.qcow2.snap] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [fedora.linux_system_roles.storage : get service facts] *******************
task path: /tmp/tmpi19f9hzy/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:47
Thursday 21 July 2022  10:29:11 +0000 (0:00:00.027)       0:00:34.505 ********* 
skipping: [/cache/rhel-9.qcow2.snap] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [fedora.linux_system_roles.storage : Set storage_cryptsetup_services] *****
task path: /tmp/tmpi19f9hzy/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:53
Thursday 21 July 2022  10:29:11 +0000 (0:00:00.030)       0:00:34.535 ********* 
ok: [/cache/rhel-9.qcow2.snap] => {
    "ansible_facts": {
        "storage_cryptsetup_services": []
    },
    "changed": false
}

TASK [fedora.linux_system_roles.storage : Mask the systemd cryptsetup services] ***
task path: /tmp/tmpi19f9hzy/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:58
Thursday 21 July 2022  10:29:12 +0000 (0:00:00.058)       0:00:34.594 ********* 

TASK [fedora.linux_system_roles.storage : manage the pools and volumes to match the specified state] ***
task path: /tmp/tmpi19f9hzy/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:64
Thursday 21 July 2022  10:29:12 +0000 (0:00:00.020)       0:00:34.615 ********* 
ok: [/cache/rhel-9.qcow2.snap] => {
    "actions": [],
    "changed": false,
    "crypts": [],
    "leaves": [
        "/dev/sr0",
        "/dev/vda1",
        "/dev/vda2",
        "/dev/vda3",
        "/dev/vda4",
        "/dev/sda",
        "/dev/sdb",
        "/dev/sdc",
        "/dev/nvme0n1",
        "/dev/nvme1n1",
        "/dev/nvme2n1",
        "/dev/vdb1",
        "/dev/vdc1",
        "/dev/vdd",
        "/dev/vde",
        "/dev/vdf"
    ],
    "mounts": [
        {
            "dump": 0,
            "fstype": "xfs",
            "opts": "defaults",
            "passno": 0,
            "path": "/opt/test2",
            "src": "UUID=dc31169e-edd9-4a64-ab59-3286f5673c34",
            "state": "mounted"
        }
    ],
    "packages": [
        "dosfstools",
        "e2fsprogs",
        "xfsprogs"
    ],
    "pools": [],
    "volumes": [
        {
            "_device": "/dev/sda",
            "_kernel_device": "/dev/sda",
            "_mount_id": "UUID=dc31169e-edd9-4a64-ab59-3286f5673c34",
            "_raw_device": "/dev/sda",
            "_raw_kernel_device": "/dev/sda",
            "cache_devices": [],
            "cache_mode": null,
            "cache_size": 0,
            "cached": false,
            "compression": null,
            "deduplication": null,
            "disks": [
                "sda"
            ],
            "encryption": false,
            "encryption_cipher": null,
            "encryption_key": null,
            "encryption_key_size": null,
            "encryption_luks_version": null,
            "encryption_password": null,
            "fs_create_options": "",
            "fs_label": "",
            "fs_overwrite_existing": true,
            "fs_type": "xfs",
            "mount_check": 0,
            "mount_device_identifier": "uuid",
            "mount_options": "defaults",
            "mount_passno": 0,
            "mount_point": "/opt/test2",
            "name": "test1",
            "raid_chunk_size": null,
            "raid_device_count": null,
            "raid_level": null,
            "raid_metadata_version": null,
            "raid_spare_count": null,
            "size": 10737418240,
            "state": "present",
            "thin": null,
            "thin_pool_name": null,
            "thin_pool_size": null,
            "type": "disk",
            "vdo_pool_size": null
        }
    ]
}

TASK [fedora.linux_system_roles.storage : Workaround for udev issue on some platforms] ***
task path: /tmp/tmpi19f9hzy/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:78
Thursday 21 July 2022  10:29:13 +0000 (0:00:01.298)       0:00:35.913 ********* 
skipping: [/cache/rhel-9.qcow2.snap] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [fedora.linux_system_roles.storage : Unmask the systemd cryptsetup services] ***
task path: /tmp/tmpi19f9hzy/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:90
Thursday 21 July 2022  10:29:13 +0000 (0:00:00.037)       0:00:35.950 ********* 

TASK [fedora.linux_system_roles.storage : show blivet_output] ******************
task path: /tmp/tmpi19f9hzy/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:96
Thursday 21 July 2022  10:29:13 +0000 (0:00:00.023)       0:00:35.974 ********* 
ok: [/cache/rhel-9.qcow2.snap] => {
    "blivet_output": {
        "actions": [],
        "changed": false,
        "crypts": [],
        "failed": false,
        "leaves": [
            "/dev/sr0",
            "/dev/vda1",
            "/dev/vda2",
            "/dev/vda3",
            "/dev/vda4",
            "/dev/sda",
            "/dev/sdb",
            "/dev/sdc",
            "/dev/nvme0n1",
            "/dev/nvme1n1",
            "/dev/nvme2n1",
            "/dev/vdb1",
            "/dev/vdc1",
            "/dev/vdd",
            "/dev/vde",
            "/dev/vdf"
        ],
        "mounts": [
            {
                "dump": 0,
                "fstype": "xfs",
                "opts": "defaults",
                "passno": 0,
                "path": "/opt/test2",
                "src": "UUID=dc31169e-edd9-4a64-ab59-3286f5673c34",
                "state": "mounted"
            }
        ],
        "packages": [
            "dosfstools",
            "e2fsprogs",
            "xfsprogs"
        ],
        "pools": [],
        "volumes": [
            {
                "_device": "/dev/sda",
                "_kernel_device": "/dev/sda",
                "_mount_id": "UUID=dc31169e-edd9-4a64-ab59-3286f5673c34",
                "_raw_device": "/dev/sda",
                "_raw_kernel_device": "/dev/sda",
                "cache_devices": [],
                "cache_mode": null,
                "cache_size": 0,
                "cached": false,
                "compression": null,
                "deduplication": null,
                "disks": [
                    "sda"
                ],
                "encryption": false,
                "encryption_cipher": null,
                "encryption_key": null,
                "encryption_key_size": null,
                "encryption_luks_version": null,
                "encryption_password": null,
                "fs_create_options": "",
                "fs_label": "",
                "fs_overwrite_existing": true,
                "fs_type": "xfs",
                "mount_check": 0,
                "mount_device_identifier": "uuid",
                "mount_options": "defaults",
                "mount_passno": 0,
                "mount_point": "/opt/test2",
                "name": "test1",
                "raid_chunk_size": null,
                "raid_device_count": null,
                "raid_level": null,
                "raid_metadata_version": null,
                "raid_spare_count": null,
                "size": 10737418240,
                "state": "present",
                "thin": null,
                "thin_pool_name": null,
                "thin_pool_size": null,
                "type": "disk",
                "vdo_pool_size": null
            }
        ]
    }
}
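
With actions empty and changed false, the rerun modified nothing, which is the idempotence property this second pass exists to demonstrate. A hypothetical assertion of the kind a caller could layer on top of the registered blivet_output (not part of this log):

    - name: Assert that the second run changed nothing
      assert:
        that:
          - not blivet_output.changed
          - blivet_output.actions | length == 0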

TASK [fedora.linux_system_roles.storage : set the list of pools for test verification] ***
task path: /tmp/tmpi19f9hzy/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:101
Thursday 21 July 2022  10:29:13 +0000 (0:00:00.078)       0:00:36.052 ********* 
ok: [/cache/rhel-9.qcow2.snap] => {
    "ansible_facts": {
        "_storage_pools_list": []
    },
    "changed": false
}

TASK [fedora.linux_system_roles.storage : set the list of volumes for test verification] ***
task path: /tmp/tmpi19f9hzy/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:105
Thursday 21 July 2022  10:29:13 +0000 (0:00:00.037)       0:00:36.090 ********* 
ok: [/cache/rhel-9.qcow2.snap] => {
    "ansible_facts": {
        "_storage_volumes_list": [
            {
                "_device": "/dev/sda",
                "_kernel_device": "/dev/sda",
                "_mount_id": "UUID=dc31169e-edd9-4a64-ab59-3286f5673c34",
                "_raw_device": "/dev/sda",
                "_raw_kernel_device": "/dev/sda",
                "cache_devices": [],
                "cache_mode": null,
                "cache_size": 0,
                "cached": false,
                "compression": null,
                "deduplication": null,
                "disks": [
                    "sda"
                ],
                "encryption": false,
                "encryption_cipher": null,
                "encryption_key": null,
                "encryption_key_size": null,
                "encryption_luks_version": null,
                "encryption_password": null,
                "fs_create_options": "",
                "fs_label": "",
                "fs_overwrite_existing": true,
                "fs_type": "xfs",
                "mount_check": 0,
                "mount_device_identifier": "uuid",
                "mount_options": "defaults",
                "mount_passno": 0,
                "mount_point": "/opt/test2",
                "name": "test1",
                "raid_chunk_size": null,
                "raid_device_count": null,
                "raid_level": null,
                "raid_metadata_version": null,
                "raid_spare_count": null,
                "size": 10737418240,
                "state": "present",
                "thin": null,
                "thin_pool_name": null,
                "thin_pool_size": null,
                "type": "disk",
                "vdo_pool_size": null
            }
        ]
    },
    "changed": false
}

TASK [fedora.linux_system_roles.storage : remove obsolete mounts] **************
task path: /tmp/tmpi19f9hzy/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:121
Thursday 21 July 2022  10:29:13 +0000 (0:00:00.076)       0:00:36.166 ********* 

TASK [fedora.linux_system_roles.storage : tell systemd to refresh its view of /etc/fstab] ***
task path: /tmp/tmpi19f9hzy/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:132
Thursday 21 July 2022  10:29:13 +0000 (0:00:00.073)       0:00:36.239 ********* 
ok: [/cache/rhel-9.qcow2.snap] => {
    "changed": false,
    "name": null,
    "status": {}
}

TASK [fedora.linux_system_roles.storage : set up new/current mounts] ***********
task path: /tmp/tmpi19f9hzy/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:137
Thursday 21 July 2022  10:29:14 +0000 (0:00:00.638)       0:00:36.878 ********* 
redirecting (type: modules) ansible.builtin.mount to ansible.posix.mount
ok: [/cache/rhel-9.qcow2.snap] => (item={'src': 'UUID=dc31169e-edd9-4a64-ab59-3286f5673c34', 'path': '/opt/test2', 'fstype': 'xfs', 'opts': 'defaults', 'dump': 0, 'passno': 0, 'state': 'mounted'}) => {
    "ansible_loop_var": "mount_info",
    "backup_file": "",
    "boot": "yes",
    "changed": false,
    "dump": "0",
    "fstab": "/etc/fstab",
    "fstype": "xfs",
    "mount_info": {
        "dump": 0,
        "fstype": "xfs",
        "opts": "defaults",
        "passno": 0,
        "path": "/opt/test2",
        "src": "UUID=dc31169e-edd9-4a64-ab59-3286f5673c34",
        "state": "mounted"
    },
    "name": "/opt/test2",
    "opts": "defaults",
    "passno": "0",
    "src": "UUID=dc31169e-edd9-4a64-ab59-3286f5673c34"
}
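
The loop variable mount_info and the redirect to ansible.posix.mount indicate that each entry of blivet_output.mounts is fed straight into the mount module. A sketch of that task; the field mapping is inferred from the output above and is an assumption:

    - name: Set up new/current mounts
      ansible.posix.mount:
        src: "{{ mount_info['src'] }}"
        path: "{{ mount_info['path'] }}"
        fstype: "{{ mount_info['fstype'] }}"
        opts: "{{ mount_info['opts'] }}"
        state: "{{ mount_info['state'] }}"
      loop: "{{ blivet_output.mounts }}"
      loop_control:
        loop_var: mount_info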

TASK [fedora.linux_system_roles.storage : tell systemd to refresh its view of /etc/fstab] ***
task path: /tmp/tmpi19f9hzy/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:148
Thursday 21 July 2022  10:29:14 +0000 (0:00:00.380)       0:00:37.259 ********* 
ok: [/cache/rhel-9.qcow2.snap] => {
    "changed": false,
    "name": null,
    "status": {}
}

TASK [fedora.linux_system_roles.storage : retrieve facts for the /etc/crypttab file] ***
task path: /tmp/tmpi19f9hzy/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:156
Thursday 21 July 2022  10:29:15 +0000 (0:00:00.636)       0:00:37.896 ********* 
ok: [/cache/rhel-9.qcow2.snap] => {
    "changed": false,
    "stat": {
        "exists": false
    }
}

TASK [fedora.linux_system_roles.storage : manage /etc/crypttab to account for changes we just made] ***
task path: /tmp/tmpi19f9hzy/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:161
Thursday 21 July 2022  10:29:15 +0000 (0:00:00.356)       0:00:38.252 ********* 

TASK [fedora.linux_system_roles.storage : Update facts] ************************
task path: /tmp/tmpi19f9hzy/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:183
Thursday 21 July 2022  10:29:15 +0000 (0:00:00.022)       0:00:38.275 ********* 
ok: [/cache/rhel-9.qcow2.snap]
META: role_complete for /cache/rhel-9.qcow2.snap

TASK [include_tasks] ***********************************************************
task path: /tmp/tmpfc26zqih/tests/storage/tests_change_disk_mount.yml:59
Thursday 21 July 2022  10:29:16 +0000 (0:00:00.947)       0:00:39.223 ********* 
included: /tmp/tmpfc26zqih/tests/storage/verify-role-results.yml for /cache/rhel-9.qcow2.snap

TASK [Print out pool information] **********************************************
task path: /tmp/tmpfc26zqih/tests/storage/verify-role-results.yml:1
Thursday 21 July 2022  10:29:16 +0000 (0:00:00.043)       0:00:39.267 ********* 
skipping: [/cache/rhel-9.qcow2.snap] => {}

TASK [Print out volume information] ********************************************
task path: /tmp/tmpfc26zqih/tests/storage/verify-role-results.yml:6
Thursday 21 July 2022  10:29:16 +0000 (0:00:00.033)       0:00:39.301 ********* 
ok: [/cache/rhel-9.qcow2.snap] => {
    "_storage_volumes_list": [
        {
            "_device": "/dev/sda",
            "_kernel_device": "/dev/sda",
            "_mount_id": "UUID=dc31169e-edd9-4a64-ab59-3286f5673c34",
            "_raw_device": "/dev/sda",
            "_raw_kernel_device": "/dev/sda",
            "cache_devices": [],
            "cache_mode": null,
            "cache_size": 0,
            "cached": false,
            "compression": null,
            "deduplication": null,
            "disks": [
                "sda"
            ],
            "encryption": false,
            "encryption_cipher": null,
            "encryption_key": null,
            "encryption_key_size": null,
            "encryption_luks_version": null,
            "encryption_password": null,
            "fs_create_options": "",
            "fs_label": "",
            "fs_overwrite_existing": true,
            "fs_type": "xfs",
            "mount_check": 0,
            "mount_device_identifier": "uuid",
            "mount_options": "defaults",
            "mount_passno": 0,
            "mount_point": "/opt/test2",
            "name": "test1",
            "raid_chunk_size": null,
            "raid_device_count": null,
            "raid_level": null,
            "raid_metadata_version": null,
            "raid_spare_count": null,
            "size": 10737418240,
            "state": "present",
            "thin": null,
            "thin_pool_name": null,
            "thin_pool_size": null,
            "type": "disk",
            "vdo_pool_size": null
        }
    ]
}

TASK [Collect info about the volumes.] *****************************************
task path: /tmp/tmpfc26zqih/tests/storage/verify-role-results.yml:14
Thursday 21 July 2022  10:29:16 +0000 (0:00:00.047)       0:00:39.348 ********* 
ok: [/cache/rhel-9.qcow2.snap] => {
    "changed": false,
    "info": {
        "/dev/nvme0n1": {
            "fstype": "",
            "label": "",
            "name": "/dev/nvme0n1",
            "size": "10G",
            "type": "disk",
            "uuid": ""
        },
        "/dev/nvme1n1": {
            "fstype": "",
            "label": "",
            "name": "/dev/nvme1n1",
            "size": "10G",
            "type": "disk",
            "uuid": ""
        },
        "/dev/nvme2n1": {
            "fstype": "",
            "label": "",
            "name": "/dev/nvme2n1",
            "size": "10G",
            "type": "disk",
            "uuid": ""
        },
        "/dev/sda": {
            "fstype": "xfs",
            "label": "",
            "name": "/dev/sda",
            "size": "10G",
            "type": "disk",
            "uuid": "dc31169e-edd9-4a64-ab59-3286f5673c34"
        },
        "/dev/sdb": {
            "fstype": "",
            "label": "",
            "name": "/dev/sdb",
            "size": "10G",
            "type": "disk",
            "uuid": ""
        },
        "/dev/sdc": {
            "fstype": "",
            "label": "",
            "name": "/dev/sdc",
            "size": "10G",
            "type": "disk",
            "uuid": ""
        },
        "/dev/sr0": {
            "fstype": "iso9660",
            "label": "cidata",
            "name": "/dev/sr0",
            "size": "364K",
            "type": "rom",
            "uuid": "2022-07-21-10-28-24-00"
        },
        "/dev/vda": {
            "fstype": "",
            "label": "",
            "name": "/dev/vda",
            "size": "10G",
            "type": "disk",
            "uuid": ""
        },
        "/dev/vda1": {
            "fstype": "",
            "label": "",
            "name": "/dev/vda1",
            "size": "1M",
            "type": "partition",
            "uuid": ""
        },
        "/dev/vda2": {
            "fstype": "vfat",
            "label": "",
            "name": "/dev/vda2",
            "size": "200M",
            "type": "partition",
            "uuid": "7B77-95E7"
        },
        "/dev/vda3": {
            "fstype": "xfs",
            "label": "boot",
            "name": "/dev/vda3",
            "size": "500M",
            "type": "partition",
            "uuid": "6e74e171-0370-451f-8340-f16ad2839183"
        },
        "/dev/vda4": {
            "fstype": "xfs",
            "label": "root",
            "name": "/dev/vda4",
            "size": "9.3G",
            "type": "partition",
            "uuid": "1bb53e4d-984c-4316-908a-59b5a62fa30e"
        },
        "/dev/vdb": {
            "fstype": "",
            "label": "",
            "name": "/dev/vdb",
            "size": "2G",
            "type": "disk",
            "uuid": ""
        },
        "/dev/vdb1": {
            "fstype": "ext4",
            "label": "yumcache",
            "name": "/dev/vdb1",
            "size": "2G",
            "type": "partition",
            "uuid": "12be66f7-7f03-4d54-9a48-a520825d8fa9"
        },
        "/dev/vdc": {
            "fstype": "",
            "label": "",
            "name": "/dev/vdc",
            "size": "2G",
            "type": "disk",
            "uuid": ""
        },
        "/dev/vdc1": {
            "fstype": "ext4",
            "label": "yumvarlib",
            "name": "/dev/vdc1",
            "size": "2G",
            "type": "partition",
            "uuid": "9bd037ed-b1bb-4b9c-a008-9f07d34ab071"
        },
        "/dev/vdd": {
            "fstype": "",
            "label": "",
            "name": "/dev/vdd",
            "size": "10G",
            "type": "disk",
            "uuid": ""
        },
        "/dev/vde": {
            "fstype": "",
            "label": "",
            "name": "/dev/vde",
            "size": "10G",
            "type": "disk",
            "uuid": ""
        },
        "/dev/vdf": {
            "fstype": "",
            "label": "",
            "name": "/dev/vdf",
            "size": "10G",
            "type": "disk",
            "uuid": ""
        }
    }
}

TASK [Read the /etc/fstab file for volume existence] ***************************
task path: /tmp/tmpfc26zqih/tests/storage/verify-role-results.yml:19
Thursday 21 July 2022  10:29:17 +0000 (0:00:00.355)       0:00:39.704 ********* 
ok: [/cache/rhel-9.qcow2.snap] => {
    "changed": false,
    "cmd": [
        "cat",
        "/etc/fstab"
    ],
    "delta": "0:00:00.002442",
    "end": "2022-07-21 06:29:17.923278",
    "rc": 0,
    "start": "2022-07-21 06:29:17.920836"
}

STDOUT:

UUID=6e74e171-0370-451f-8340-f16ad2839183	/boot	xfs	defaults	0	0
UUID=1bb53e4d-984c-4316-908a-59b5a62fa30e	/	xfs	defaults	0	0
UUID=7B77-95E7	/boot/efi	vfat	defaults,uid=0,gid=0,umask=077,shortname=winnt	0	2
/dev/vdb1	/var/cache/dnf	auto	defaults,nofail,x-systemd.requires=cloud-init.service,_netdev,comment=cloudconfig	0	2
/dev/vdc1	/var/lib/dnf	auto	defaults,nofail,x-systemd.requires=cloud-init.service,_netdev,comment=cloudconfig	0	2
UUID=dc31169e-edd9-4a64-ab59-3286f5673c34 /opt/test2 xfs defaults 0 0

TASK [Read the /etc/crypttab file] *********************************************
task path: /tmp/tmpfc26zqih/tests/storage/verify-role-results.yml:24
Thursday 21 July 2022  10:29:17 +0000 (0:00:00.391)       0:00:40.095 ********* 
ok: [/cache/rhel-9.qcow2.snap] => {
    "changed": false,
    "cmd": [
        "cat",
        "/etc/crypttab"
    ],
    "delta": "0:00:00.002556",
    "end": "2022-07-21 06:29:18.272731",
    "failed_when_result": false,
    "rc": 1,
    "start": "2022-07-21 06:29:18.270175"
}

STDERR:

cat: /etc/crypttab: No such file or directory


MSG:

non-zero return code
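
/etc/crypttab does not exist on this host, so cat exits 1, and failed_when_result: false shows the task tolerates that. A sketch of how such a read can be made non-fatal; the register name and the exact condition are assumptions:

    - name: Read the /etc/crypttab file
      command: cat /etc/crypttab
      register: storage_test_crypttab
      failed_when: storage_test_crypttab.rc not in [0, 1]
      changed_when: false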

TASK [Verify the volumes listed in storage_pools were correctly managed] *******
task path: /tmp/tmpfc26zqih/tests/storage/verify-role-results.yml:33
Thursday 21 July 2022  10:29:17 +0000 (0:00:00.348)       0:00:40.444 ********* 

TASK [Verify the volumes with no pool were correctly managed] ******************
task path: /tmp/tmpfc26zqih/tests/storage/verify-role-results.yml:43
Thursday 21 July 2022  10:29:17 +0000 (0:00:00.020)       0:00:40.465 ********* 
included: /tmp/tmpfc26zqih/tests/storage/test-verify-volume.yml for /cache/rhel-9.qcow2.snap => (item={'encryption': False, 'encryption_cipher': None, 'encryption_key': None, 'encryption_key_size': None, 'encryption_luks_version': None, 'encryption_password': None, 'fs_create_options': '', 'fs_label': '', 'fs_type': 'xfs', 'mount_options': 'defaults', 'mount_point': '/opt/test2', 'name': 'test1', 'raid_level': None, 'size': 10737418240, 'state': 'present', 'type': 'disk', 'disks': ['sda'], 'raid_device_count': None, 'raid_spare_count': None, 'raid_metadata_version': None, 'fs_overwrite_existing': True, 'mount_check': 0, 'mount_passno': 0, 'mount_device_identifier': 'uuid', 'raid_chunk_size': None, 'compression': None, 'deduplication': None, 'vdo_pool_size': None, 'thin': None, 'thin_pool_name': None, 'thin_pool_size': None, 'cached': False, 'cache_size': 0, 'cache_mode': None, 'cache_devices': [], '_device': '/dev/sda', '_raw_device': '/dev/sda', '_mount_id': 'UUID=dc31169e-edd9-4a64-ab59-3286f5673c34', '_kernel_device': '/dev/sda', '_raw_kernel_device': '/dev/sda'})

TASK [set_fact] ****************************************************************
task path: /tmp/tmpfc26zqih/tests/storage/test-verify-volume.yml:2
Thursday 21 July 2022  10:29:17 +0000 (0:00:00.084)       0:00:40.550 ********* 
ok: [/cache/rhel-9.qcow2.snap] => {
    "ansible_facts": {
        "_storage_test_volume_present": true,
        "_storage_volume_tests": [
            "mount",
            "fstab",
            "fs",
            "device",
            "encryption",
            "md",
            "size",
            "cache"
        ]
    },
    "changed": false
}

TASK [include_tasks] ***********************************************************
task path: /tmp/tmpfc26zqih/tests/storage/test-verify-volume.yml:10
Thursday 21 July 2022  10:29:18 +0000 (0:00:00.048)       0:00:40.598 ********* 
included: /tmp/tmpfc26zqih/tests/storage/test-verify-volume-mount.yml for /cache/rhel-9.qcow2.snap => (item=mount)
included: /tmp/tmpfc26zqih/tests/storage/test-verify-volume-fstab.yml for /cache/rhel-9.qcow2.snap => (item=fstab)
included: /tmp/tmpfc26zqih/tests/storage/test-verify-volume-fs.yml for /cache/rhel-9.qcow2.snap => (item=fs)
included: /tmp/tmpfc26zqih/tests/storage/test-verify-volume-device.yml for /cache/rhel-9.qcow2.snap => (item=device)
included: /tmp/tmpfc26zqih/tests/storage/test-verify-volume-encryption.yml for /cache/rhel-9.qcow2.snap => (item=encryption)
included: /tmp/tmpfc26zqih/tests/storage/test-verify-volume-md.yml for /cache/rhel-9.qcow2.snap => (item=md)
included: /tmp/tmpfc26zqih/tests/storage/test-verify-volume-size.yml for /cache/rhel-9.qcow2.snap => (item=size)
included: /tmp/tmpfc26zqih/tests/storage/test-verify-volume-cache.yml for /cache/rhel-9.qcow2.snap => (item=cache)
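
The eight includes above correspond one-to-one to the _storage_volume_tests list set two tasks earlier; a sketch of the loop that could drive them, with the loop variable name being an assumption:

    - name: include_tasks
      include_tasks: "test-verify-volume-{{ storage_test_volume_subset }}.yml"
      loop: "{{ _storage_volume_tests }}"
      loop_control:
        loop_var: storage_test_volume_subset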

TASK [Get expected mount device based on device type] **************************
task path: /tmp/tmpfc26zqih/tests/storage/test-verify-volume-mount.yml:6
Thursday 21 July 2022  10:29:18 +0000 (0:00:00.074)       0:00:40.673 ********* 
ok: [/cache/rhel-9.qcow2.snap] => {
    "ansible_facts": {
        "storage_test_device_path": "/dev/sda"
    },
    "changed": false
}

TASK [Set some facts] **********************************************************
task path: /tmp/tmpfc26zqih/tests/storage/test-verify-volume-mount.yml:10
Thursday 21 July 2022  10:29:18 +0000 (0:00:00.039)       0:00:40.713 ********* 
ok: [/cache/rhel-9.qcow2.snap] => {
    "ansible_facts": {
        "storage_test_mount_device_matches": [
            {
                "block_available": 2592358,
                "block_size": 4096,
                "block_total": 2618880,
                "block_used": 26522,
                "device": "/dev/sda",
                "fstype": "xfs",
                "inode_available": 5242877,
                "inode_total": 5242880,
                "inode_used": 3,
                "mount": "/opt/test2",
                "options": "rw,seclabel,relatime,attr2,inode64,logbufs=8,logbsize=32k,noquota",
                "size_available": 10618298368,
                "size_total": 10726932480,
                "uuid": "dc31169e-edd9-4a64-ab59-3286f5673c34"
            }
        ],
        "storage_test_mount_expected_match_count": "1",
        "storage_test_mount_point_matches": [
            {
                "block_available": 2592358,
                "block_size": 4096,
                "block_total": 2618880,
                "block_used": 26522,
                "device": "/dev/sda",
                "fstype": "xfs",
                "inode_available": 5242877,
                "inode_total": 5242880,
                "inode_used": 3,
                "mount": "/opt/test2",
                "options": "rw,seclabel,relatime,attr2,inode64,logbufs=8,logbsize=32k,noquota",
                "size_available": 10618298368,
                "size_total": 10726932480,
                "uuid": "dc31169e-edd9-4a64-ab59-3286f5673c34"
            }
        ],
        "storage_test_swap_expected_matches": "0"
    },
    "changed": false
}

TASK [Verify the current mount state by device] ********************************
task path: /tmp/tmpfc26zqih/tests/storage/test-verify-volume-mount.yml:20
Thursday 21 July 2022  10:29:18 +0000 (0:00:00.066)       0:00:40.779 ********* 
ok: [/cache/rhel-9.qcow2.snap] => {
    "changed": false
}

MSG:

All assertions passed

TASK [Verify the current mount state by mount point] ***************************
task path: /tmp/tmpfc26zqih/tests/storage/test-verify-volume-mount.yml:29
Thursday 21 July 2022  10:29:18 +0000 (0:00:00.056)       0:00:40.835 ********* 
ok: [/cache/rhel-9.qcow2.snap] => {
    "changed": false
}

MSG:

All assertions passed

TASK [Verify the mount fs type] ************************************************
task path: /tmp/tmpfc26zqih/tests/storage/test-verify-volume-mount.yml:37
Thursday 21 July 2022  10:29:18 +0000 (0:00:00.049)       0:00:40.885 ********* 
ok: [/cache/rhel-9.qcow2.snap] => {
    "changed": false
}

MSG:

All assertions passed

TASK [command] *****************************************************************
task path: /tmp/tmpfc26zqih/tests/storage/test-verify-volume-mount.yml:46
Thursday 21 July 2022  10:29:18 +0000 (0:00:00.050)       0:00:40.935 ********* 
skipping: [/cache/rhel-9.qcow2.snap] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [Gather swap info] ********************************************************
task path: /tmp/tmpfc26zqih/tests/storage/test-verify-volume-mount.yml:50
Thursday 21 July 2022  10:29:18 +0000 (0:00:00.024)       0:00:40.960 ********* 
skipping: [/cache/rhel-9.qcow2.snap] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [Verify swap status] ******************************************************
task path: /tmp/tmpfc26zqih/tests/storage/test-verify-volume-mount.yml:55
Thursday 21 July 2022  10:29:18 +0000 (0:00:00.023)       0:00:40.984 ********* 
skipping: [/cache/rhel-9.qcow2.snap] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [Unset facts] *************************************************************
task path: /tmp/tmpfc26zqih/tests/storage/test-verify-volume-mount.yml:65
Thursday 21 July 2022  10:29:18 +0000 (0:00:00.023)       0:00:41.008 ********* 
ok: [/cache/rhel-9.qcow2.snap] => {
    "ansible_facts": {
        "storage_test_mount_device_matches": null,
        "storage_test_mount_expected_match_count": null,
        "storage_test_mount_point_matches": null,
        "storage_test_swap_expected_matches": null,
        "storage_test_swaps": null,
        "storage_test_sys_node": null
    },
    "changed": false
}

TASK [Set some variables for fstab checking] ***********************************
task path: /tmp/tmpfc26zqih/tests/storage/test-verify-volume-fstab.yml:2
Thursday 21 July 2022  10:29:18 +0000 (0:00:00.034)       0:00:41.042 ********* 
ok: [/cache/rhel-9.qcow2.snap] => {
    "ansible_facts": {
        "storage_test_fstab_expected_id_matches": "1",
        "storage_test_fstab_expected_mount_options_matches": "1",
        "storage_test_fstab_expected_mount_point_matches": "1",
        "storage_test_fstab_id_matches": [
            "UUID=dc31169e-edd9-4a64-ab59-3286f5673c34 "
        ],
        "storage_test_fstab_mount_options_matches": [
            " /opt/test2 xfs defaults "
        ],
        "storage_test_fstab_mount_point_matches": [
            " /opt/test2 "
        ]
    },
    "changed": false
}

TASK [Verify that the device identifier appears in /etc/fstab] *****************
task path: /tmp/tmpfc26zqih/tests/storage/test-verify-volume-fstab.yml:12
Thursday 21 July 2022  10:29:18 +0000 (0:00:00.062)       0:00:41.104 ********* 
ok: [/cache/rhel-9.qcow2.snap] => {
    "changed": false
}

MSG:

All assertions passed

TASK [Verify the fstab mount point] ********************************************
task path: /tmp/tmpfc26zqih/tests/storage/test-verify-volume-fstab.yml:19
Thursday 21 July 2022  10:29:18 +0000 (0:00:00.047)       0:00:41.152 ********* 
ok: [/cache/rhel-9.qcow2.snap] => {
    "changed": false
}

MSG:

All assertions passed

TASK [Verify mount_options] ****************************************************
task path: /tmp/tmpfc26zqih/tests/storage/test-verify-volume-fstab.yml:25
Thursday 21 July 2022  10:29:18 +0000 (0:00:00.048)       0:00:41.200 ********* 
skipping: [/cache/rhel-9.qcow2.snap] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [Clean up variables] ******************************************************
task path: /tmp/tmpfc26zqih/tests/storage/test-verify-volume-fstab.yml:34
Thursday 21 July 2022  10:29:18 +0000 (0:00:00.038)       0:00:41.239 ********* 
ok: [/cache/rhel-9.qcow2.snap] => {
    "ansible_facts": {
        "storage_test_fstab_expected_id_matches": null,
        "storage_test_fstab_expected_mount_options_matches": null,
        "storage_test_fstab_expected_mount_point_matches": null,
        "storage_test_fstab_id_matches": null,
        "storage_test_fstab_mount_options_matches": null,
        "storage_test_fstab_mount_point_matches": null
    },
    "changed": false
}

TASK [Verify fs type] **********************************************************
task path: /tmp/tmpfc26zqih/tests/storage/test-verify-volume-fs.yml:4
Thursday 21 July 2022  10:29:18 +0000 (0:00:00.032)       0:00:41.271 ********* 
ok: [/cache/rhel-9.qcow2.snap] => {
    "changed": false
}

MSG:

All assertions passed

TASK [Verify fs label] *********************************************************
task path: /tmp/tmpfc26zqih/tests/storage/test-verify-volume-fs.yml:10
Thursday 21 July 2022  10:29:18 +0000 (0:00:00.039)       0:00:41.310 ********* 
ok: [/cache/rhel-9.qcow2.snap] => {
    "changed": false
}

MSG:

All assertions passed

TASK [See whether the device node is present] **********************************
task path: /tmp/tmpfc26zqih/tests/storage/test-verify-volume-device.yml:4
Thursday 21 July 2022  10:29:18 +0000 (0:00:00.041)       0:00:41.352 ********* 
ok: [/cache/rhel-9.qcow2.snap] => {
    "changed": false,
    "stat": {
        "atime": 1658399330.874754,
        "attr_flags": "",
        "attributes": [],
        "block_size": 4096,
        "blocks": 0,
        "charset": "binary",
        "ctime": 1658399330.874754,
        "dev": 5,
        "device_type": 2048,
        "executable": false,
        "exists": true,
        "gid": 6,
        "gr_name": "disk",
        "inode": 356,
        "isblk": true,
        "ischr": false,
        "isdir": false,
        "isfifo": false,
        "isgid": false,
        "islnk": false,
        "isreg": false,
        "issock": false,
        "isuid": false,
        "mimetype": "inode/blockdevice",
        "mode": "0660",
        "mtime": 1658399330.874754,
        "nlink": 1,
        "path": "/dev/sda",
        "pw_name": "root",
        "readable": true,
        "rgrp": true,
        "roth": false,
        "rusr": true,
        "size": 0,
        "uid": 0,
        "version": null,
        "wgrp": true,
        "woth": false,
        "writeable": true,
        "wusr": true,
        "xgrp": false,
        "xoth": false,
        "xusr": false
    }
}

TASK [Verify the presence/absence of the device node] **************************
task path: /tmp/tmpfc26zqih/tests/storage/test-verify-volume-device.yml:10
Thursday 21 July 2022  10:29:19 +0000 (0:00:00.378)       0:00:41.730 ********* 
ok: [/cache/rhel-9.qcow2.snap] => {
    "changed": false
}

MSG:

All assertions passed

TASK [Make sure we got info about this volume] *********************************
task path: /tmp/tmpfc26zqih/tests/storage/test-verify-volume-device.yml:15
Thursday 21 July 2022  10:29:19 +0000 (0:00:00.037)       0:00:41.768 ********* 
ok: [/cache/rhel-9.qcow2.snap] => {
    "changed": false
}

MSG:

All assertions passed

TASK [(1/2) Process volume type (set initial value)] ***************************
task path: /tmp/tmpfc26zqih/tests/storage/test-verify-volume-device.yml:21
Thursday 21 July 2022  10:29:19 +0000 (0:00:00.039)       0:00:41.808 ********* 
ok: [/cache/rhel-9.qcow2.snap] => {
    "ansible_facts": {
        "st_volume_type": "disk"
    },
    "changed": false
}

TASK [(2/2) Process volume type (get RAID value)] ******************************
task path: /tmp/tmpfc26zqih/tests/storage/test-verify-volume-device.yml:25
Thursday 21 July 2022  10:29:19 +0000 (0:00:00.037)       0:00:41.845 ********* 
skipping: [/cache/rhel-9.qcow2.snap] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [Verify the volume's device type] *****************************************
task path: /tmp/tmpfc26zqih/tests/storage/test-verify-volume-device.yml:30
Thursday 21 July 2022  10:29:19 +0000 (0:00:00.023)       0:00:41.869 ********* 
ok: [/cache/rhel-9.qcow2.snap] => {
    "changed": false
}

MSG:

All assertions passed

TASK [Stat the LUKS device, if encrypted] **************************************
task path: /tmp/tmpfc26zqih/tests/storage/test-verify-volume-encryption.yml:3
Thursday 21 July 2022  10:29:19 +0000 (0:00:00.037)       0:00:41.906 ********* 
skipping: [/cache/rhel-9.qcow2.snap] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [Ensure cryptsetup is present] ********************************************
task path: /tmp/tmpfc26zqih/tests/storage/test-verify-volume-encryption.yml:10
Thursday 21 July 2022  10:29:19 +0000 (0:00:00.022)       0:00:41.929 ********* 
ok: [/cache/rhel-9.qcow2.snap] => {
    "changed": false,
    "rc": 0,
    "results": []
}

MSG:

Nothing to do

TASK [Collect LUKS info for this volume] ***************************************
task path: /tmp/tmpfc26zqih/tests/storage/test-verify-volume-encryption.yml:15
Thursday 21 July 2022  10:29:20 +0000 (0:00:00.847)       0:00:42.777 ********* 
skipping: [/cache/rhel-9.qcow2.snap] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [Verify the presence/absence of the LUKS device node] *********************
task path: /tmp/tmpfc26zqih/tests/storage/test-verify-volume-encryption.yml:21
Thursday 21 July 2022  10:29:20 +0000 (0:00:00.023)       0:00:42.800 ********* 
skipping: [/cache/rhel-9.qcow2.snap] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [Verify that the raw device is the same as the device if not encrypted] ***
task path: /tmp/tmpfc26zqih/tests/storage/test-verify-volume-encryption.yml:27
Thursday 21 July 2022  10:29:20 +0000 (0:00:00.023)       0:00:42.824 ********* 
ok: [/cache/rhel-9.qcow2.snap] => {
    "changed": false
}

MSG:

All assertions passed

TASK [Make sure we got info about the LUKS volume if encrypted] ****************
task path: /tmp/tmpfc26zqih/tests/storage/test-verify-volume-encryption.yml:33
Thursday 21 July 2022  10:29:20 +0000 (0:00:00.090)       0:00:42.915 ********* 
skipping: [/cache/rhel-9.qcow2.snap] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [Verify the LUKS volume's device type if encrypted] ***********************
task path: /tmp/tmpfc26zqih/tests/storage/test-verify-volume-encryption.yml:39
Thursday 21 July 2022  10:29:20 +0000 (0:00:00.023)       0:00:42.939 ********* 
skipping: [/cache/rhel-9.qcow2.snap] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [Check LUKS version] ******************************************************
task path: /tmp/tmpfc26zqih/tests/storage/test-verify-volume-encryption.yml:44
Thursday 21 July 2022  10:29:20 +0000 (0:00:00.050)       0:00:42.989 ********* 
skipping: [/cache/rhel-9.qcow2.snap] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [Check LUKS key size] *****************************************************
task path: /tmp/tmpfc26zqih/tests/storage/test-verify-volume-encryption.yml:50
Thursday 21 July 2022  10:29:20 +0000 (0:00:00.025)       0:00:43.015 ********* 
skipping: [/cache/rhel-9.qcow2.snap] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [Check LUKS cipher] *******************************************************
task path: /tmp/tmpfc26zqih/tests/storage/test-verify-volume-encryption.yml:56
Thursday 21 July 2022  10:29:20 +0000 (0:00:00.024)       0:00:43.039 ********* 
skipping: [/cache/rhel-9.qcow2.snap] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [set_fact] ****************************************************************
task path: /tmp/tmpfc26zqih/tests/storage/test-verify-volume-encryption.yml:62
Thursday 21 July 2022  10:29:20 +0000 (0:00:00.024)       0:00:43.064 ********* 
ok: [/cache/rhel-9.qcow2.snap] => {
    "ansible_facts": {
        "_storage_test_crypttab_entries": [],
        "_storage_test_expected_crypttab_entries": "0",
        "_storage_test_expected_crypttab_key_file": "-"
    },
    "changed": false
}

TASK [Check for /etc/crypttab entry] *******************************************
task path: /tmp/tmpfc26zqih/tests/storage/test-verify-volume-encryption.yml:67
Thursday 21 July 2022  10:29:20 +0000 (0:00:00.059)       0:00:43.124 ********* 
ok: [/cache/rhel-9.qcow2.snap] => {
    "changed": false
}

MSG:

All assertions passed

TASK [Validate the format of the crypttab entry] *******************************
task path: /tmp/tmpfc26zqih/tests/storage/test-verify-volume-encryption.yml:72
Thursday 21 July 2022  10:29:20 +0000 (0:00:00.055)       0:00:43.179 ********* 
skipping: [/cache/rhel-9.qcow2.snap] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [Check backing device of crypttab entry] **********************************
task path: /tmp/tmpfc26zqih/tests/storage/test-verify-volume-encryption.yml:78
Thursday 21 July 2022  10:29:20 +0000 (0:00:00.035)       0:00:43.214 ********* 
skipping: [/cache/rhel-9.qcow2.snap] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [Check key file of crypttab entry] ****************************************
task path: /tmp/tmpfc26zqih/tests/storage/test-verify-volume-encryption.yml:84
Thursday 21 July 2022  10:29:20 +0000 (0:00:00.034)       0:00:43.249 ********* 
skipping: [/cache/rhel-9.qcow2.snap] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [set_fact] ****************************************************************
task path: /tmp/tmpfc26zqih/tests/storage/test-verify-volume-encryption.yml:90
Thursday 21 July 2022  10:29:20 +0000 (0:00:00.037)       0:00:43.287 ********* 
ok: [/cache/rhel-9.qcow2.snap] => {
    "ansible_facts": {
        "_storage_test_crypttab_entries": null,
        "_storage_test_expected_crypttab_entries": null,
        "_storage_test_expected_crypttab_key_file": null
    },
    "changed": false
}

TASK [get information about RAID] **********************************************
task path: /tmp/tmpfc26zqih/tests/storage/test-verify-volume-md.yml:7
Thursday 21 July 2022  10:29:20 +0000 (0:00:00.032)       0:00:43.319 ********* 
skipping: [/cache/rhel-9.qcow2.snap] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [set_fact] ****************************************************************
task path: /tmp/tmpfc26zqih/tests/storage/test-verify-volume-md.yml:13
Thursday 21 July 2022  10:29:20 +0000 (0:00:00.041)       0:00:43.360 ********* 
skipping: [/cache/rhel-9.qcow2.snap] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [set_fact] ****************************************************************
task path: /tmp/tmpfc26zqih/tests/storage/test-verify-volume-md.yml:17
Thursday 21 July 2022  10:29:20 +0000 (0:00:00.035)       0:00:43.395 ********* 
skipping: [/cache/rhel-9.qcow2.snap] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [set_fact] ****************************************************************
task path: /tmp/tmpfc26zqih/tests/storage/test-verify-volume-md.yml:21
Thursday 21 July 2022  10:29:20 +0000 (0:00:00.044)       0:00:43.439 ********* 
skipping: [/cache/rhel-9.qcow2.snap] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [check RAID active devices count] *****************************************
task path: /tmp/tmpfc26zqih/tests/storage/test-verify-volume-md.yml:25
Thursday 21 July 2022  10:29:20 +0000 (0:00:00.037)       0:00:43.476 ********* 
skipping: [/cache/rhel-9.qcow2.snap] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [check RAID spare devices count] ******************************************
task path: /tmp/tmpfc26zqih/tests/storage/test-verify-volume-md.yml:31
Thursday 21 July 2022  10:29:20 +0000 (0:00:00.041)       0:00:43.518 ********* 
skipping: [/cache/rhel-9.qcow2.snap] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [check RAID metadata version] *********************************************
task path: /tmp/tmpfc26zqih/tests/storage/test-verify-volume-md.yml:37
Thursday 21 July 2022  10:29:20 +0000 (0:00:00.038)       0:00:43.556 ********* 
skipping: [/cache/rhel-9.qcow2.snap] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [parse the actual size of the volume] *************************************
task path: /tmp/tmpfc26zqih/tests/storage/test-verify-volume-size.yml:3
Thursday 21 July 2022  10:29:21 +0000 (0:00:00.045)       0:00:43.601 ********* 
skipping: [/cache/rhel-9.qcow2.snap] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [parse the requested size of the volume] **********************************
task path: /tmp/tmpfc26zqih/tests/storage/test-verify-volume-size.yml:9
Thursday 21 July 2022  10:29:21 +0000 (0:00:00.024)       0:00:43.626 ********* 
skipping: [/cache/rhel-9.qcow2.snap] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [Establish base value for expected size] **********************************
task path: /tmp/tmpfc26zqih/tests/storage/test-verify-volume-size.yml:15
Thursday 21 July 2022  10:29:21 +0000 (0:00:00.036)       0:00:43.663 ********* 
skipping: [/cache/rhel-9.qcow2.snap] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [debug] *******************************************************************
task path: /tmp/tmpfc26zqih/tests/storage/test-verify-volume-size.yml:20
Thursday 21 July 2022  10:29:21 +0000 (0:00:00.037)       0:00:43.700 ********* 
ok: [/cache/rhel-9.qcow2.snap] => {
    "storage_test_expected_size": "VARIABLE IS NOT DEFINED!: 'storage_test_expected_size' is undefined"
}

TASK [debug] *******************************************************************
task path: /tmp/tmpfc26zqih/tests/storage/test-verify-volume-size.yml:25
Thursday 21 July 2022  10:29:21 +0000 (0:00:00.038)       0:00:43.739 ********* 
skipping: [/cache/rhel-9.qcow2.snap] => {}

TASK [debug] *******************************************************************
task path: /tmp/tmpfc26zqih/tests/storage/test-verify-volume-size.yml:28
Thursday 21 July 2022  10:29:21 +0000 (0:00:00.037)       0:00:43.776 ********* 
skipping: [/cache/rhel-9.qcow2.snap] => {}

TASK [Get the size of parent/pool device] **************************************
task path: /tmp/tmpfc26zqih/tests/storage/test-verify-volume-size.yml:31
Thursday 21 July 2022  10:29:21 +0000 (0:00:00.037)       0:00:43.813 ********* 
skipping: [/cache/rhel-9.qcow2.snap] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [debug] *******************************************************************
task path: /tmp/tmpfc26zqih/tests/storage/test-verify-volume-size.yml:36
Thursday 21 July 2022  10:29:21 +0000 (0:00:00.035)       0:00:43.849 ********* 
skipping: [/cache/rhel-9.qcow2.snap] => {}

TASK [Calculate the expected size based on pool size and percentage value] *****
task path: /tmp/tmpfc26zqih/tests/storage/test-verify-volume-size.yml:39
Thursday 21 July 2022  10:29:21 +0000 (0:00:00.040)       0:00:43.889 ********* 
skipping: [/cache/rhel-9.qcow2.snap] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [debug] *******************************************************************
task path: /tmp/tmpfc26zqih/tests/storage/test-verify-volume-size.yml:44
Thursday 21 July 2022  10:29:21 +0000 (0:00:00.036)       0:00:43.926 ********* 
ok: [/cache/rhel-9.qcow2.snap] => {
    "storage_test_actual_size": {
        "changed": false,
        "skip_reason": "Conditional result was False",
        "skipped": true
    }
}

TASK [debug] *******************************************************************
task path: /tmp/tmpfc26zqih/tests/storage/test-verify-volume-size.yml:47
Thursday 21 July 2022  10:29:21 +0000 (0:00:00.037)       0:00:43.963 ********* 
ok: [/cache/rhel-9.qcow2.snap] => {
    "storage_test_expected_size": "VARIABLE IS NOT DEFINED!: 'storage_test_expected_size' is undefined"
}

TASK [assert] ******************************************************************
task path: /tmp/tmpfc26zqih/tests/storage/test-verify-volume-size.yml:50
Thursday 21 July 2022  10:29:21 +0000 (0:00:00.035)       0:00:43.999 ********* 
skipping: [/cache/rhel-9.qcow2.snap] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [Get information about the LV] ********************************************
task path: /tmp/tmpfc26zqih/tests/storage/test-verify-volume-cache.yml:6
Thursday 21 July 2022  10:29:21 +0000 (0:00:00.024)       0:00:44.024 ********* 
skipping: [/cache/rhel-9.qcow2.snap] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [set_fact] ****************************************************************
task path: /tmp/tmpfc26zqih/tests/storage/test-verify-volume-cache.yml:14
Thursday 21 July 2022  10:29:21 +0000 (0:00:00.024)       0:00:44.048 ********* 
skipping: [/cache/rhel-9.qcow2.snap] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [check segment type] ******************************************************
task path: /tmp/tmpfc26zqih/tests/storage/test-verify-volume-cache.yml:17
Thursday 21 July 2022  10:29:21 +0000 (0:00:00.023)       0:00:44.072 ********* 
skipping: [/cache/rhel-9.qcow2.snap] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [set_fact] ****************************************************************
task path: /tmp/tmpfc26zqih/tests/storage/test-verify-volume-cache.yml:22
Thursday 21 July 2022  10:29:21 +0000 (0:00:00.023)       0:00:44.095 ********* 
skipping: [/cache/rhel-9.qcow2.snap] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [parse the requested cache size] ******************************************
task path: /tmp/tmpfc26zqih/tests/storage/test-verify-volume-cache.yml:26
Thursday 21 July 2022  10:29:21 +0000 (0:00:00.026)       0:00:44.121 ********* 
skipping: [/cache/rhel-9.qcow2.snap] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [set_fact] ****************************************************************
task path: /tmp/tmpfc26zqih/tests/storage/test-verify-volume-cache.yml:32
Thursday 21 July 2022  10:29:21 +0000 (0:00:00.024)       0:00:44.145 ********* 
skipping: [/cache/rhel-9.qcow2.snap] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [Check cache size] ********************************************************
task path: /tmp/tmpfc26zqih/tests/storage/test-verify-volume-cache.yml:36
Thursday 21 July 2022  10:29:21 +0000 (0:00:00.023)       0:00:44.168 ********* 
skipping: [/cache/rhel-9.qcow2.snap] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [Clean up facts] **********************************************************
task path: /tmp/tmpfc26zqih/tests/storage/test-verify-volume.yml:16
Thursday 21 July 2022  10:29:21 +0000 (0:00:00.023)       0:00:44.192 ********* 
ok: [/cache/rhel-9.qcow2.snap] => {
    "ansible_facts": {
        "_storage_test_volume_present": null
    },
    "changed": false
}

TASK [Clean up variable namespace] *********************************************
task path: /tmp/tmpfc26zqih/tests/storage/verify-role-results.yml:53
Thursday 21 July 2022  10:29:21 +0000 (0:00:00.036)       0:00:44.228 ********* 
ok: [/cache/rhel-9.qcow2.snap] => {
    "ansible_facts": {
        "storage_test_blkinfo": null,
        "storage_test_crypttab": null,
        "storage_test_fstab": null
    },
    "changed": false
}

TASK [Clean up] ****************************************************************
task path: /tmp/tmpfc26zqih/tests/storage/tests_change_disk_mount.yml:61
Thursday 21 July 2022  10:29:21 +0000 (0:00:00.033)       0:00:44.262 ********* 

TASK [fedora.linux_system_roles.storage : set platform/version specific variables] ***
task path: /tmp/tmpi19f9hzy/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:2
Thursday 21 July 2022  10:29:21 +0000 (0:00:00.059)       0:00:44.321 ********* 
included: /tmp/tmpi19f9hzy/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml for /cache/rhel-9.qcow2.snap

TASK [fedora.linux_system_roles.storage : Ensure ansible_facts used by role] ***
task path: /tmp/tmpi19f9hzy/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:2
Thursday 21 July 2022  10:29:21 +0000 (0:00:00.035)       0:00:44.357 ********* 
ok: [/cache/rhel-9.qcow2.snap]

TASK [fedora.linux_system_roles.storage : Set platform/version specific variables] ***
task path: /tmp/tmpi19f9hzy/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:7
Thursday 21 July 2022  10:29:22 +0000 (0:00:00.539)       0:00:44.897 ********* 
skipping: [/cache/rhel-9.qcow2.snap] => (item=RedHat.yml)  => {
    "ansible_loop_var": "item",
    "changed": false,
    "item": "RedHat.yml",
    "skip_reason": "Conditional result was False"
}
skipping: [/cache/rhel-9.qcow2.snap] => (item=RedHat.yml)  => {
    "ansible_loop_var": "item",
    "changed": false,
    "item": "RedHat.yml",
    "skip_reason": "Conditional result was False"
}
ok: [/cache/rhel-9.qcow2.snap] => (item=RedHat_9.yml) => {
    "ansible_facts": {
        "blivet_package_list": [
            "python3-blivet",
            "libblockdev-crypto",
            "libblockdev-dm",
            "libblockdev-lvm",
            "libblockdev-mdraid",
            "libblockdev-swap",
            "vdo",
            "kmod-kvdo",
            "xfsprogs"
        ]
    },
    "ansible_included_var_files": [
        "/tmp/tmpi19f9hzy/ansible_collections/fedora/linux_system_roles/roles/storage/vars/RedHat_9.yml"
    ],
    "ansible_loop_var": "item",
    "changed": false,
    "item": "RedHat_9.yml"
}
skipping: [/cache/rhel-9.qcow2.snap] => (item=RedHat_9.0.yml)  => {
    "ansible_loop_var": "item",
    "changed": false,
    "item": "RedHat_9.0.yml",
    "skip_reason": "Conditional result was False"
}

TASK [fedora.linux_system_roles.storage : define an empty list of pools to be used in testing] ***
task path: /tmp/tmpi19f9hzy/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:5
Thursday 21 July 2022  10:29:22 +0000 (0:00:00.127)       0:00:45.025 ********* 
ok: [/cache/rhel-9.qcow2.snap] => {
    "ansible_facts": {
        "_storage_pools_list": []
    },
    "changed": false
}

TASK [fedora.linux_system_roles.storage : define an empty list of volumes to be used in testing] ***
task path: /tmp/tmpi19f9hzy/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:9
Thursday 21 July 2022  10:29:22 +0000 (0:00:00.034)       0:00:45.059 ********* 
ok: [/cache/rhel-9.qcow2.snap] => {
    "ansible_facts": {
        "_storage_volumes_list": []
    },
    "changed": false
}

TASK [fedora.linux_system_roles.storage : include the appropriate provider tasks] ***
task path: /tmp/tmpi19f9hzy/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:13
Thursday 21 July 2022  10:29:22 +0000 (0:00:00.035)       0:00:45.094 ********* 
redirecting (type: modules) ansible.builtin.mount to ansible.posix.mount
redirecting (type: modules) ansible.builtin.mount to ansible.posix.mount
included: /tmp/tmpi19f9hzy/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml for /cache/rhel-9.qcow2.snap

TASK [fedora.linux_system_roles.storage : get a list of rpm packages installed on host machine] ***
task path: /tmp/tmpi19f9hzy/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:2
Thursday 21 July 2022  10:29:22 +0000 (0:00:00.056)       0:00:45.151 ********* 
skipping: [/cache/rhel-9.qcow2.snap] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [fedora.linux_system_roles.storage : make sure blivet is available] *******
task path: /tmp/tmpi19f9hzy/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:7
Thursday 21 July 2022  10:29:22 +0000 (0:00:00.021)       0:00:45.172 ********* 
skipping: [/cache/rhel-9.qcow2.snap] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [fedora.linux_system_roles.storage : show storage_pools] ******************
task path: /tmp/tmpi19f9hzy/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:13
Thursday 21 July 2022  10:29:22 +0000 (0:00:00.028)       0:00:45.201 ********* 
ok: [/cache/rhel-9.qcow2.snap] => {
    "storage_pools": "VARIABLE IS NOT DEFINED!: 'storage_pools' is undefined"
}

TASK [fedora.linux_system_roles.storage : show storage_volumes] ****************
task path: /tmp/tmpi19f9hzy/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:18
Thursday 21 July 2022  10:29:22 +0000 (0:00:00.038)       0:00:45.239 ********* 
ok: [/cache/rhel-9.qcow2.snap] => {
    "storage_volumes": [
        {
            "disks": [
                "sda"
            ],
            "mount_point": "/opt/test2",
            "name": "test1",
            "state": "absent",
            "type": "disk"
        }
    ]
}
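
The storage_volumes value printed above is the cleanup request handed to the role for this play. As context only, an invocation that would produce this variable looks roughly like the sketch below; it is reconstructed from the logged value, not copied from tests_change_disk_mount.yml, and the surrounding task layout is an assumption.

# Illustrative sketch only, reconstructed from the storage_volumes value
# logged above; the real task lives in tests_change_disk_mount.yml and may
# differ in layout and surrounding vars.
- name: Clean up (remove the volume and its mount)
  include_role:
    name: fedora.linux_system_roles.storage
  vars:
    storage_volumes:
      - name: test1
        type: disk
        disks:
          - sda
        mount_point: /opt/test2
        state: absent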

TASK [fedora.linux_system_roles.storage : get required packages] ***************
task path: /tmp/tmpi19f9hzy/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:23
Thursday 21 July 2022  10:29:22 +0000 (0:00:00.037)       0:00:45.277 ********* 
skipping: [/cache/rhel-9.qcow2.snap] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [fedora.linux_system_roles.storage : enable copr repositories if needed] ***
task path: /tmp/tmpi19f9hzy/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:35
Thursday 21 July 2022  10:29:22 +0000 (0:00:00.027)       0:00:45.305 ********* 
skipping: [/cache/rhel-9.qcow2.snap] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [fedora.linux_system_roles.storage : make sure required packages are installed] ***
task path: /tmp/tmpi19f9hzy/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:41
Thursday 21 July 2022  10:29:22 +0000 (0:00:00.028)       0:00:45.333 ********* 
skipping: [/cache/rhel-9.qcow2.snap] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [fedora.linux_system_roles.storage : get service facts] *******************
task path: /tmp/tmpi19f9hzy/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:47
Thursday 21 July 2022  10:29:22 +0000 (0:00:00.028)       0:00:45.362 ********* 
skipping: [/cache/rhel-9.qcow2.snap] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [fedora.linux_system_roles.storage : Set storage_cryptsetup_services] *****
task path: /tmp/tmpi19f9hzy/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:53
Thursday 21 July 2022  10:29:22 +0000 (0:00:00.028)       0:00:45.390 ********* 
ok: [/cache/rhel-9.qcow2.snap] => {
    "ansible_facts": {
        "storage_cryptsetup_services": []
    },
    "changed": false
}

TASK [fedora.linux_system_roles.storage : Mask the systemd cryptsetup services] ***
task path: /tmp/tmpi19f9hzy/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:58
Thursday 21 July 2022  10:29:22 +0000 (0:00:00.056)       0:00:45.447 ********* 

TASK [fedora.linux_system_roles.storage : manage the pools and volumes to match the specified state] ***
task path: /tmp/tmpi19f9hzy/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:64
Thursday 21 July 2022  10:29:22 +0000 (0:00:00.021)       0:00:45.469 ********* 
changed: [/cache/rhel-9.qcow2.snap] => {
    "actions": [
        {
            "action": "destroy format",
            "device": "/dev/sda",
            "fs_type": "xfs"
        }
    ],
    "changed": true,
    "crypts": [],
    "leaves": [
        "/dev/sr0",
        "/dev/vda1",
        "/dev/vda2",
        "/dev/vda3",
        "/dev/vda4",
        "/dev/sda",
        "/dev/sdb",
        "/dev/sdc",
        "/dev/nvme0n1",
        "/dev/nvme1n1",
        "/dev/nvme2n1",
        "/dev/vdb1",
        "/dev/vdc1",
        "/dev/vdd",
        "/dev/vde",
        "/dev/vdf"
    ],
    "mounts": [
        {
            "fstype": "xfs",
            "path": "/opt/test2",
            "src": "UUID=dc31169e-edd9-4a64-ab59-3286f5673c34",
            "state": "absent"
        }
    ],
    "packages": [
        "e2fsprogs",
        "xfsprogs",
        "dosfstools"
    ],
    "pools": [],
    "volumes": [
        {
            "_device": "/dev/sda",
            "_mount_id": "UUID=dc31169e-edd9-4a64-ab59-3286f5673c34",
            "_raw_device": "/dev/sda",
            "cache_devices": [],
            "cache_mode": null,
            "cache_size": 0,
            "cached": false,
            "compression": null,
            "deduplication": null,
            "disks": [
                "sda"
            ],
            "encryption": false,
            "encryption_cipher": null,
            "encryption_key": null,
            "encryption_key_size": null,
            "encryption_luks_version": null,
            "encryption_password": null,
            "fs_create_options": "",
            "fs_label": "",
            "fs_overwrite_existing": true,
            "fs_type": "xfs",
            "mount_check": 0,
            "mount_device_identifier": "uuid",
            "mount_options": "defaults",
            "mount_passno": 0,
            "mount_point": "/opt/test2",
            "name": "test1",
            "raid_chunk_size": null,
            "raid_device_count": null,
            "raid_level": null,
            "raid_metadata_version": null,
            "raid_spare_count": null,
            "size": 10737418240,
            "state": "absent",
            "thin": null,
            "thin_pool_name": null,
            "thin_pool_size": null,
            "type": "disk",
            "vdo_pool_size": null
        }
    ]
}

TASK [fedora.linux_system_roles.storage : Workaround for udev issue on some platforms] ***
task path: /tmp/tmpi19f9hzy/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:78
Thursday 21 July 2022  10:29:24 +0000 (0:00:01.698)       0:00:47.168 ********* 
skipping: [/cache/rhel-9.qcow2.snap] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [fedora.linux_system_roles.storage : Unmask the systemd cryptsetup services] ***
task path: /tmp/tmpi19f9hzy/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:90
Thursday 21 July 2022  10:29:24 +0000 (0:00:00.036)       0:00:47.204 ********* 

TASK [fedora.linux_system_roles.storage : show blivet_output] ******************
task path: /tmp/tmpi19f9hzy/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:96
Thursday 21 July 2022  10:29:24 +0000 (0:00:00.022)       0:00:47.227 ********* 
ok: [/cache/rhel-9.qcow2.snap] => {
    "blivet_output": {
        "actions": [
            {
                "action": "destroy format",
                "device": "/dev/sda",
                "fs_type": "xfs"
            }
        ],
        "changed": true,
        "crypts": [],
        "failed": false,
        "leaves": [
            "/dev/sr0",
            "/dev/vda1",
            "/dev/vda2",
            "/dev/vda3",
            "/dev/vda4",
            "/dev/sda",
            "/dev/sdb",
            "/dev/sdc",
            "/dev/nvme0n1",
            "/dev/nvme1n1",
            "/dev/nvme2n1",
            "/dev/vdb1",
            "/dev/vdc1",
            "/dev/vdd",
            "/dev/vde",
            "/dev/vdf"
        ],
        "mounts": [
            {
                "fstype": "xfs",
                "path": "/opt/test2",
                "src": "UUID=dc31169e-edd9-4a64-ab59-3286f5673c34",
                "state": "absent"
            }
        ],
        "packages": [
            "e2fsprogs",
            "xfsprogs",
            "dosfstools"
        ],
        "pools": [],
        "volumes": [
            {
                "_device": "/dev/sda",
                "_mount_id": "UUID=dc31169e-edd9-4a64-ab59-3286f5673c34",
                "_raw_device": "/dev/sda",
                "cache_devices": [],
                "cache_mode": null,
                "cache_size": 0,
                "cached": false,
                "compression": null,
                "deduplication": null,
                "disks": [
                    "sda"
                ],
                "encryption": false,
                "encryption_cipher": null,
                "encryption_key": null,
                "encryption_key_size": null,
                "encryption_luks_version": null,
                "encryption_password": null,
                "fs_create_options": "",
                "fs_label": "",
                "fs_overwrite_existing": true,
                "fs_type": "xfs",
                "mount_check": 0,
                "mount_device_identifier": "uuid",
                "mount_options": "defaults",
                "mount_passno": 0,
                "mount_point": "/opt/test2",
                "name": "test1",
                "raid_chunk_size": null,
                "raid_device_count": null,
                "raid_level": null,
                "raid_metadata_version": null,
                "raid_spare_count": null,
                "size": 10737418240,
                "state": "absent",
                "thin": null,
                "thin_pool_name": null,
                "thin_pool_size": null,
                "type": "disk",
                "vdo_pool_size": null
            }
        ]
    }
}

TASK [fedora.linux_system_roles.storage : set the list of pools for test verification] ***
task path: /tmp/tmpi19f9hzy/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:101
Thursday 21 July 2022  10:29:24 +0000 (0:00:00.038)       0:00:47.266 ********* 
ok: [/cache/rhel-9.qcow2.snap] => {
    "ansible_facts": {
        "_storage_pools_list": []
    },
    "changed": false
}

TASK [fedora.linux_system_roles.storage : set the list of volumes for test verification] ***
task path: /tmp/tmpi19f9hzy/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:105
Thursday 21 July 2022  10:29:24 +0000 (0:00:00.037)       0:00:47.304 ********* 
ok: [/cache/rhel-9.qcow2.snap] => {
    "ansible_facts": {
        "_storage_volumes_list": [
            {
                "_device": "/dev/sda",
                "_mount_id": "UUID=dc31169e-edd9-4a64-ab59-3286f5673c34",
                "_raw_device": "/dev/sda",
                "cache_devices": [],
                "cache_mode": null,
                "cache_size": 0,
                "cached": false,
                "compression": null,
                "deduplication": null,
                "disks": [
                    "sda"
                ],
                "encryption": false,
                "encryption_cipher": null,
                "encryption_key": null,
                "encryption_key_size": null,
                "encryption_luks_version": null,
                "encryption_password": null,
                "fs_create_options": "",
                "fs_label": "",
                "fs_overwrite_existing": true,
                "fs_type": "xfs",
                "mount_check": 0,
                "mount_device_identifier": "uuid",
                "mount_options": "defaults",
                "mount_passno": 0,
                "mount_point": "/opt/test2",
                "name": "test1",
                "raid_chunk_size": null,
                "raid_device_count": null,
                "raid_level": null,
                "raid_metadata_version": null,
                "raid_spare_count": null,
                "size": 10737418240,
                "state": "absent",
                "thin": null,
                "thin_pool_name": null,
                "thin_pool_size": null,
                "type": "disk",
                "vdo_pool_size": null
            }
        ]
    },
    "changed": false
}

TASK [fedora.linux_system_roles.storage : remove obsolete mounts] **************
task path: /tmp/tmpi19f9hzy/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:121
Thursday 21 July 2022  10:29:24 +0000 (0:00:00.039)       0:00:47.344 ********* 
redirecting (type: modules) ansible.builtin.mount to ansible.posix.mount
changed: [/cache/rhel-9.qcow2.snap] => (item={'src': 'UUID=dc31169e-edd9-4a64-ab59-3286f5673c34', 'path': '/opt/test2', 'state': 'absent', 'fstype': 'xfs'}) => {
    "ansible_loop_var": "mount_info",
    "backup_file": "",
    "boot": "yes",
    "changed": true,
    "dump": "0",
    "fstab": "/etc/fstab",
    "fstype": "xfs",
    "mount_info": {
        "fstype": "xfs",
        "path": "/opt/test2",
        "src": "UUID=dc31169e-edd9-4a64-ab59-3286f5673c34",
        "state": "absent"
    },
    "name": "/opt/test2",
    "opts": "defaults",
    "passno": "0",
    "src": "UUID=dc31169e-edd9-4a64-ab59-3286f5673c34"
}
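
The fstab change above was applied through ansible.posix.mount (note the module redirect logged just before it). A rough standalone equivalent of that step, using only the parameter values shown in the logged mount_info item, would be:

# Sketch of the equivalent standalone task, not the role's own task file;
# parameter values are taken from the mount_info item logged above.
- name: Remove the obsolete /opt/test2 entry from /etc/fstab
  ansible.posix.mount:
    src: UUID=dc31169e-edd9-4a64-ab59-3286f5673c34
    path: /opt/test2
    fstype: xfs
    opts: defaults
    state: absent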

TASK [fedora.linux_system_roles.storage : tell systemd to refresh its view of /etc/fstab] ***
task path: /tmp/tmpi19f9hzy/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:132
Thursday 21 July 2022  10:29:25 +0000 (0:00:00.400)       0:00:47.745 ********* 
ok: [/cache/rhel-9.qcow2.snap] => {
    "changed": false,
    "name": null,
    "status": {}
}

TASK [fedora.linux_system_roles.storage : set up new/current mounts] ***********
task path: /tmp/tmpi19f9hzy/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:137
Thursday 21 July 2022  10:29:25 +0000 (0:00:00.676)       0:00:48.421 ********* 

TASK [fedora.linux_system_roles.storage : tell systemd to refresh its view of /etc/fstab] ***
task path: /tmp/tmpi19f9hzy/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:148
Thursday 21 July 2022  10:29:25 +0000 (0:00:00.038)       0:00:48.460 ********* 
ok: [/cache/rhel-9.qcow2.snap] => {
    "changed": false,
    "name": null,
    "status": {}
}

TASK [fedora.linux_system_roles.storage : retrieve facts for the /etc/crypttab file] ***
task path: /tmp/tmpi19f9hzy/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:156
Thursday 21 July 2022  10:29:26 +0000 (0:00:00.716)       0:00:49.176 ********* 
ok: [/cache/rhel-9.qcow2.snap] => {
    "changed": false,
    "stat": {
        "exists": false
    }
}

TASK [fedora.linux_system_roles.storage : manage /etc/crypttab to account for changes we just made] ***
task path: /tmp/tmpi19f9hzy/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:161
Thursday 21 July 2022  10:29:26 +0000 (0:00:00.366)       0:00:49.543 ********* 

TASK [fedora.linux_system_roles.storage : Update facts] ************************
task path: /tmp/tmpi19f9hzy/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:183
Thursday 21 July 2022  10:29:26 +0000 (0:00:00.023)       0:00:49.566 ********* 
ok: [/cache/rhel-9.qcow2.snap]
META: role_complete for /cache/rhel-9.qcow2.snap

TASK [include_tasks] ***********************************************************
task path: /tmp/tmpfc26zqih/tests/storage/tests_change_disk_mount.yml:72
Thursday 21 July 2022  10:29:28 +0000 (0:00:01.030)       0:00:50.597 ********* 
included: /tmp/tmpfc26zqih/tests/storage/verify-role-results.yml for /cache/rhel-9.qcow2.snap

TASK [Print out pool information] **********************************************
task path: /tmp/tmpfc26zqih/tests/storage/verify-role-results.yml:1
Thursday 21 July 2022  10:29:28 +0000 (0:00:00.048)       0:00:50.645 ********* 
skipping: [/cache/rhel-9.qcow2.snap] => {}

TASK [Print out volume information] ********************************************
task path: /tmp/tmpfc26zqih/tests/storage/verify-role-results.yml:6
Thursday 21 July 2022  10:29:28 +0000 (0:00:00.036)       0:00:50.682 ********* 
ok: [/cache/rhel-9.qcow2.snap] => {
    "_storage_volumes_list": [
        {
            "_device": "/dev/sda",
            "_mount_id": "UUID=dc31169e-edd9-4a64-ab59-3286f5673c34",
            "_raw_device": "/dev/sda",
            "cache_devices": [],
            "cache_mode": null,
            "cache_size": 0,
            "cached": false,
            "compression": null,
            "deduplication": null,
            "disks": [
                "sda"
            ],
            "encryption": false,
            "encryption_cipher": null,
            "encryption_key": null,
            "encryption_key_size": null,
            "encryption_luks_version": null,
            "encryption_password": null,
            "fs_create_options": "",
            "fs_label": "",
            "fs_overwrite_existing": true,
            "fs_type": "xfs",
            "mount_check": 0,
            "mount_device_identifier": "uuid",
            "mount_options": "defaults",
            "mount_passno": 0,
            "mount_point": "/opt/test2",
            "name": "test1",
            "raid_chunk_size": null,
            "raid_device_count": null,
            "raid_level": null,
            "raid_metadata_version": null,
            "raid_spare_count": null,
            "size": 10737418240,
            "state": "absent",
            "thin": null,
            "thin_pool_name": null,
            "thin_pool_size": null,
            "type": "disk",
            "vdo_pool_size": null
        }
    ]
}

TASK [Collect info about the volumes.] *****************************************
task path: /tmp/tmpfc26zqih/tests/storage/verify-role-results.yml:14
Thursday 21 July 2022  10:29:28 +0000 (0:00:00.052)       0:00:50.734 ********* 
ok: [/cache/rhel-9.qcow2.snap] => {
    "changed": false,
    "info": {
        "/dev/nvme0n1": {
            "fstype": "",
            "label": "",
            "name": "/dev/nvme0n1",
            "size": "10G",
            "type": "disk",
            "uuid": ""
        },
        "/dev/nvme1n1": {
            "fstype": "",
            "label": "",
            "name": "/dev/nvme1n1",
            "size": "10G",
            "type": "disk",
            "uuid": ""
        },
        "/dev/nvme2n1": {
            "fstype": "",
            "label": "",
            "name": "/dev/nvme2n1",
            "size": "10G",
            "type": "disk",
            "uuid": ""
        },
        "/dev/sda": {
            "fstype": "",
            "label": "",
            "name": "/dev/sda",
            "size": "10G",
            "type": "disk",
            "uuid": ""
        },
        "/dev/sdb": {
            "fstype": "",
            "label": "",
            "name": "/dev/sdb",
            "size": "10G",
            "type": "disk",
            "uuid": ""
        },
        "/dev/sdc": {
            "fstype": "",
            "label": "",
            "name": "/dev/sdc",
            "size": "10G",
            "type": "disk",
            "uuid": ""
        },
        "/dev/sr0": {
            "fstype": "iso9660",
            "label": "cidata",
            "name": "/dev/sr0",
            "size": "364K",
            "type": "rom",
            "uuid": "2022-07-21-10-28-24-00"
        },
        "/dev/vda": {
            "fstype": "",
            "label": "",
            "name": "/dev/vda",
            "size": "10G",
            "type": "disk",
            "uuid": ""
        },
        "/dev/vda1": {
            "fstype": "",
            "label": "",
            "name": "/dev/vda1",
            "size": "1M",
            "type": "partition",
            "uuid": ""
        },
        "/dev/vda2": {
            "fstype": "vfat",
            "label": "",
            "name": "/dev/vda2",
            "size": "200M",
            "type": "partition",
            "uuid": "7B77-95E7"
        },
        "/dev/vda3": {
            "fstype": "xfs",
            "label": "boot",
            "name": "/dev/vda3",
            "size": "500M",
            "type": "partition",
            "uuid": "6e74e171-0370-451f-8340-f16ad2839183"
        },
        "/dev/vda4": {
            "fstype": "xfs",
            "label": "root",
            "name": "/dev/vda4",
            "size": "9.3G",
            "type": "partition",
            "uuid": "1bb53e4d-984c-4316-908a-59b5a62fa30e"
        },
        "/dev/vdb": {
            "fstype": "",
            "label": "",
            "name": "/dev/vdb",
            "size": "2G",
            "type": "disk",
            "uuid": ""
        },
        "/dev/vdb1": {
            "fstype": "ext4",
            "label": "yumcache",
            "name": "/dev/vdb1",
            "size": "2G",
            "type": "partition",
            "uuid": "12be66f7-7f03-4d54-9a48-a520825d8fa9"
        },
        "/dev/vdc": {
            "fstype": "",
            "label": "",
            "name": "/dev/vdc",
            "size": "2G",
            "type": "disk",
            "uuid": ""
        },
        "/dev/vdc1": {
            "fstype": "ext4",
            "label": "yumvarlib",
            "name": "/dev/vdc1",
            "size": "2G",
            "type": "partition",
            "uuid": "9bd037ed-b1bb-4b9c-a008-9f07d34ab071"
        },
        "/dev/vdd": {
            "fstype": "",
            "label": "",
            "name": "/dev/vdd",
            "size": "10G",
            "type": "disk",
            "uuid": ""
        },
        "/dev/vde": {
            "fstype": "",
            "label": "",
            "name": "/dev/vde",
            "size": "10G",
            "type": "disk",
            "uuid": ""
        },
        "/dev/vdf": {
            "fstype": "",
            "label": "",
            "name": "/dev/vdf",
            "size": "10G",
            "type": "disk",
            "uuid": ""
        }
    }
}

TASK [Read the /etc/fstab file for volume existence] ***************************
task path: /tmp/tmpfc26zqih/tests/storage/verify-role-results.yml:19
Thursday 21 July 2022  10:29:28 +0000 (0:00:00.392)       0:00:51.127 ********* 
ok: [/cache/rhel-9.qcow2.snap] => {
    "changed": false,
    "cmd": [
        "cat",
        "/etc/fstab"
    ],
    "delta": "0:00:00.003338",
    "end": "2022-07-21 06:29:29.341099",
    "rc": 0,
    "start": "2022-07-21 06:29:29.337761"
}

STDOUT:

UUID=6e74e171-0370-451f-8340-f16ad2839183	/boot	xfs	defaults	0	0
UUID=1bb53e4d-984c-4316-908a-59b5a62fa30e	/	xfs	defaults	0	0
UUID=7B77-95E7	/boot/efi	vfat	defaults,uid=0,gid=0,umask=077,shortname=winnt	0	2
/dev/vdb1	/var/cache/dnf	auto	defaults,nofail,x-systemd.requires=cloud-init.service,_netdev,comment=cloudconfig	0	2
/dev/vdc1	/var/lib/dnf	auto	defaults,nofail,x-systemd.requires=cloud-init.service,_netdev,comment=cloudconfig	0	2
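
The verification tasks that follow conclude from this output that the removed volume's UUID is gone (storage_test_fstab_expected_id_matches ends up as "0" with an empty match list). A minimal sketch of that kind of check is shown below; it assumes only that the cat output above is registered as storage_test_fstab, a variable name that appears elsewhere in this log, and it is not the test suite's actual matching logic.

# Minimal sketch, assuming the fstab cat output is registered as
# storage_test_fstab; not the test suite's actual implementation.
- name: Verify the removed volume's UUID is gone from /etc/fstab
  ansible.builtin.assert:
    that:
      - storage_test_fstab.stdout is not search('dc31169e-edd9-4a64-ab59-3286f5673c34')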

TASK [Read the /etc/crypttab file] *********************************************
task path: /tmp/tmpfc26zqih/tests/storage/verify-role-results.yml:24
Thursday 21 July 2022  10:29:28 +0000 (0:00:00.390)       0:00:51.518 ********* 
ok: [/cache/rhel-9.qcow2.snap] => {
    "changed": false,
    "cmd": [
        "cat",
        "/etc/crypttab"
    ],
    "delta": "0:00:00.002568",
    "end": "2022-07-21 06:29:29.712309",
    "failed_when_result": false,
    "rc": 1,
    "start": "2022-07-21 06:29:29.709741"
}

STDERR:

cat: /etc/crypttab: No such file or directory


MSG:

non-zero return code

TASK [Verify the volumes listed in storage_pools were correctly managed] *******
task path: /tmp/tmpfc26zqih/tests/storage/verify-role-results.yml:33
Thursday 21 July 2022  10:29:29 +0000 (0:00:00.366)       0:00:51.884 ********* 

TASK [Verify the volumes with no pool were correctly managed] ******************
task path: /tmp/tmpfc26zqih/tests/storage/verify-role-results.yml:43
Thursday 21 July 2022  10:29:29 +0000 (0:00:00.021)       0:00:51.906 ********* 
included: /tmp/tmpfc26zqih/tests/storage/test-verify-volume.yml for /cache/rhel-9.qcow2.snap => (item={'encryption': False, 'encryption_cipher': None, 'encryption_key': None, 'encryption_key_size': None, 'encryption_luks_version': None, 'encryption_password': None, 'fs_create_options': '', 'fs_label': '', 'fs_type': 'xfs', 'mount_options': 'defaults', 'mount_point': '/opt/test2', 'name': 'test1', 'raid_level': None, 'size': 10737418240, 'state': 'absent', 'type': 'disk', 'disks': ['sda'], 'raid_device_count': None, 'raid_spare_count': None, 'raid_metadata_version': None, 'fs_overwrite_existing': True, 'mount_check': 0, 'mount_passno': 0, 'mount_device_identifier': 'uuid', 'raid_chunk_size': None, 'compression': None, 'deduplication': None, 'vdo_pool_size': None, 'thin': None, 'thin_pool_name': None, 'thin_pool_size': None, 'cached': False, 'cache_size': 0, 'cache_mode': None, 'cache_devices': [], '_device': '/dev/sda', '_raw_device': '/dev/sda', '_mount_id': 'UUID=dc31169e-edd9-4a64-ab59-3286f5673c34'})

TASK [set_fact] ****************************************************************
task path: /tmp/tmpfc26zqih/tests/storage/test-verify-volume.yml:2
Thursday 21 July 2022  10:29:29 +0000 (0:00:00.057)       0:00:51.963 ********* 
ok: [/cache/rhel-9.qcow2.snap] => {
    "ansible_facts": {
        "_storage_test_volume_present": false,
        "_storage_volume_tests": [
            "mount",
            "fstab",
            "fs",
            "device",
            "encryption",
            "md",
            "size",
            "cache"
        ]
    },
    "changed": false
}

TASK [include_tasks] ***********************************************************
task path: /tmp/tmpfc26zqih/tests/storage/test-verify-volume.yml:10
Thursday 21 July 2022  10:29:29 +0000 (0:00:00.049)       0:00:52.012 ********* 
included: /tmp/tmpfc26zqih/tests/storage/test-verify-volume-mount.yml for /cache/rhel-9.qcow2.snap => (item=mount)
included: /tmp/tmpfc26zqih/tests/storage/test-verify-volume-fstab.yml for /cache/rhel-9.qcow2.snap => (item=fstab)
included: /tmp/tmpfc26zqih/tests/storage/test-verify-volume-fs.yml for /cache/rhel-9.qcow2.snap => (item=fs)
included: /tmp/tmpfc26zqih/tests/storage/test-verify-volume-device.yml for /cache/rhel-9.qcow2.snap => (item=device)
included: /tmp/tmpfc26zqih/tests/storage/test-verify-volume-encryption.yml for /cache/rhel-9.qcow2.snap => (item=encryption)
included: /tmp/tmpfc26zqih/tests/storage/test-verify-volume-md.yml for /cache/rhel-9.qcow2.snap => (item=md)
included: /tmp/tmpfc26zqih/tests/storage/test-verify-volume-size.yml for /cache/rhel-9.qcow2.snap => (item=size)
included: /tmp/tmpfc26zqih/tests/storage/test-verify-volume-cache.yml for /cache/rhel-9.qcow2.snap => (item=cache)

TASK [Get expected mount device based on device type] **************************
task path: /tmp/tmpfc26zqih/tests/storage/test-verify-volume-mount.yml:6
Thursday 21 July 2022  10:29:29 +0000 (0:00:00.071)       0:00:52.084 ********* 
ok: [/cache/rhel-9.qcow2.snap] => {
    "ansible_facts": {
        "storage_test_device_path": "/dev/sda"
    },
    "changed": false
}

TASK [Set some facts] **********************************************************
task path: /tmp/tmpfc26zqih/tests/storage/test-verify-volume-mount.yml:10
Thursday 21 July 2022  10:29:29 +0000 (0:00:00.093)       0:00:52.177 ********* 
ok: [/cache/rhel-9.qcow2.snap] => {
    "ansible_facts": {
        "storage_test_mount_device_matches": [],
        "storage_test_mount_expected_match_count": "0",
        "storage_test_mount_point_matches": [],
        "storage_test_swap_expected_matches": "0"
    },
    "changed": false
}

TASK [Verify the current mount state by device] ********************************
task path: /tmp/tmpfc26zqih/tests/storage/test-verify-volume-mount.yml:20
Thursday 21 July 2022  10:29:29 +0000 (0:00:00.055)       0:00:52.232 ********* 
skipping: [/cache/rhel-9.qcow2.snap] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [Verify the current mount state by mount point] ***************************
task path: /tmp/tmpfc26zqih/tests/storage/test-verify-volume-mount.yml:29
Thursday 21 July 2022  10:29:29 +0000 (0:00:00.025)       0:00:52.258 ********* 
ok: [/cache/rhel-9.qcow2.snap] => {
    "changed": false
}

MSG:

All assertions passed

TASK [Verify the mount fs type] ************************************************
task path: /tmp/tmpfc26zqih/tests/storage/test-verify-volume-mount.yml:37
Thursday 21 July 2022  10:29:29 +0000 (0:00:00.052)       0:00:52.311 ********* 
skipping: [/cache/rhel-9.qcow2.snap] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [command] *****************************************************************
task path: /tmp/tmpfc26zqih/tests/storage/test-verify-volume-mount.yml:46
Thursday 21 July 2022  10:29:29 +0000 (0:00:00.038)       0:00:52.349 ********* 
skipping: [/cache/rhel-9.qcow2.snap] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [Gather swap info] ********************************************************
task path: /tmp/tmpfc26zqih/tests/storage/test-verify-volume-mount.yml:50
Thursday 21 July 2022  10:29:29 +0000 (0:00:00.024)       0:00:52.373 ********* 
skipping: [/cache/rhel-9.qcow2.snap] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [Verify swap status] ******************************************************
task path: /tmp/tmpfc26zqih/tests/storage/test-verify-volume-mount.yml:55
Thursday 21 July 2022  10:29:29 +0000 (0:00:00.023)       0:00:52.396 ********* 
skipping: [/cache/rhel-9.qcow2.snap] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [Unset facts] *************************************************************
task path: /tmp/tmpfc26zqih/tests/storage/test-verify-volume-mount.yml:65
Thursday 21 July 2022  10:29:29 +0000 (0:00:00.024)       0:00:52.420 ********* 
ok: [/cache/rhel-9.qcow2.snap] => {
    "ansible_facts": {
        "storage_test_mount_device_matches": null,
        "storage_test_mount_expected_match_count": null,
        "storage_test_mount_point_matches": null,
        "storage_test_swap_expected_matches": null,
        "storage_test_swaps": null,
        "storage_test_sys_node": null
    },
    "changed": false
}

TASK [Set some variables for fstab checking] ***********************************
task path: /tmp/tmpfc26zqih/tests/storage/test-verify-volume-fstab.yml:2
Thursday 21 July 2022  10:29:29 +0000 (0:00:00.036)       0:00:52.456 ********* 
ok: [/cache/rhel-9.qcow2.snap] => {
    "ansible_facts": {
        "storage_test_fstab_expected_id_matches": "0",
        "storage_test_fstab_expected_mount_options_matches": "0",
        "storage_test_fstab_expected_mount_point_matches": "0",
        "storage_test_fstab_id_matches": [],
        "storage_test_fstab_mount_options_matches": [],
        "storage_test_fstab_mount_point_matches": []
    },
    "changed": false
}

TASK [Verify that the device identifier appears in /etc/fstab] *****************
task path: /tmp/tmpfc26zqih/tests/storage/test-verify-volume-fstab.yml:12
Thursday 21 July 2022  10:29:29 +0000 (0:00:00.062)       0:00:52.519 ********* 
skipping: [/cache/rhel-9.qcow2.snap] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [Verify the fstab mount point] ********************************************
task path: /tmp/tmpfc26zqih/tests/storage/test-verify-volume-fstab.yml:19
Thursday 21 July 2022  10:29:29 +0000 (0:00:00.023)       0:00:52.543 ********* 
ok: [/cache/rhel-9.qcow2.snap] => {
    "changed": false
}

MSG:

All assertions passed

TASK [Verify mount_options] ****************************************************
task path: /tmp/tmpfc26zqih/tests/storage/test-verify-volume-fstab.yml:25
Thursday 21 July 2022  10:29:30 +0000 (0:00:00.047)       0:00:52.590 ********* 
skipping: [/cache/rhel-9.qcow2.snap] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [Clean up variables] ******************************************************
task path: /tmp/tmpfc26zqih/tests/storage/test-verify-volume-fstab.yml:34
Thursday 21 July 2022  10:29:30 +0000 (0:00:00.035)       0:00:52.626 ********* 
ok: [/cache/rhel-9.qcow2.snap] => {
    "ansible_facts": {
        "storage_test_fstab_expected_id_matches": null,
        "storage_test_fstab_expected_mount_options_matches": null,
        "storage_test_fstab_expected_mount_point_matches": null,
        "storage_test_fstab_id_matches": null,
        "storage_test_fstab_mount_options_matches": null,
        "storage_test_fstab_mount_point_matches": null
    },
    "changed": false
}

TASK [Verify fs type] **********************************************************
task path: /tmp/tmpfc26zqih/tests/storage/test-verify-volume-fs.yml:4
Thursday 21 July 2022  10:29:30 +0000 (0:00:00.034)       0:00:52.660 ********* 
skipping: [/cache/rhel-9.qcow2.snap] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [Verify fs label] *********************************************************
task path: /tmp/tmpfc26zqih/tests/storage/test-verify-volume-fs.yml:10
Thursday 21 July 2022  10:29:30 +0000 (0:00:00.024)       0:00:52.685 ********* 
skipping: [/cache/rhel-9.qcow2.snap] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [See whether the device node is present] **********************************
task path: /tmp/tmpfc26zqih/tests/storage/test-verify-volume-device.yml:4
Thursday 21 July 2022  10:29:30 +0000 (0:00:00.022)       0:00:52.708 ********* 
ok: [/cache/rhel-9.qcow2.snap] => {
    "changed": false,
    "stat": {
        "atime": 1658399364.9317539,
        "attr_flags": "",
        "attributes": [],
        "block_size": 4096,
        "blocks": 0,
        "charset": "binary",
        "ctime": 1658399364.9317539,
        "dev": 5,
        "device_type": 2048,
        "executable": false,
        "exists": true,
        "gid": 6,
        "gr_name": "disk",
        "inode": 356,
        "isblk": true,
        "ischr": false,
        "isdir": false,
        "isfifo": false,
        "isgid": false,
        "islnk": false,
        "isreg": false,
        "issock": false,
        "isuid": false,
        "mimetype": "inode/blockdevice",
        "mode": "0660",
        "mtime": 1658399364.9317539,
        "nlink": 1,
        "path": "/dev/sda",
        "pw_name": "root",
        "readable": true,
        "rgrp": true,
        "roth": false,
        "rusr": true,
        "size": 0,
        "uid": 0,
        "version": null,
        "wgrp": true,
        "woth": false,
        "writeable": true,
        "wusr": true,
        "xgrp": false,
        "xoth": false,
        "xusr": false
    }
}

TASK [Verify the presence/absence of the device node] **************************
task path: /tmp/tmpfc26zqih/tests/storage/test-verify-volume-device.yml:10
Thursday 21 July 2022  10:29:30 +0000 (0:00:00.382)       0:00:53.091 ********* 
ok: [/cache/rhel-9.qcow2.snap] => {
    "changed": false
}

MSG:

All assertions passed

TASK [Make sure we got info about this volume] *********************************
task path: /tmp/tmpfc26zqih/tests/storage/test-verify-volume-device.yml:15
Thursday 21 July 2022  10:29:30 +0000 (0:00:00.038)       0:00:53.129 ********* 
skipping: [/cache/rhel-9.qcow2.snap] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [(1/2) Process volume type (set initial value)] ***************************
task path: /tmp/tmpfc26zqih/tests/storage/test-verify-volume-device.yml:21
Thursday 21 July 2022  10:29:30 +0000 (0:00:00.024)       0:00:53.153 ********* 
ok: [/cache/rhel-9.qcow2.snap] => {
    "ansible_facts": {
        "st_volume_type": "disk"
    },
    "changed": false
}

TASK [(2/2) Process volume type (get RAID value)] ******************************
task path: /tmp/tmpfc26zqih/tests/storage/test-verify-volume-device.yml:25
Thursday 21 July 2022  10:29:30 +0000 (0:00:00.035)       0:00:53.189 ********* 
skipping: [/cache/rhel-9.qcow2.snap] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [Verify the volume's device type] *****************************************
task path: /tmp/tmpfc26zqih/tests/storage/test-verify-volume-device.yml:30
Thursday 21 July 2022  10:29:30 +0000 (0:00:00.023)       0:00:53.212 ********* 
skipping: [/cache/rhel-9.qcow2.snap] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [Stat the LUKS device, if encrypted] **************************************
task path: /tmp/tmpfc26zqih/tests/storage/test-verify-volume-encryption.yml:3
Thursday 21 July 2022  10:29:30 +0000 (0:00:00.024)       0:00:53.237 ********* 
skipping: [/cache/rhel-9.qcow2.snap] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [Ensure cryptsetup is present] ********************************************
task path: /tmp/tmpfc26zqih/tests/storage/test-verify-volume-encryption.yml:10
Thursday 21 July 2022  10:29:30 +0000 (0:00:00.023)       0:00:53.260 ********* 
ok: [/cache/rhel-9.qcow2.snap] => {
    "changed": false,
    "rc": 0,
    "results": []
}

MSG:

Nothing to do

TASK [Collect LUKS info for this volume] ***************************************
task path: /tmp/tmpfc26zqih/tests/storage/test-verify-volume-encryption.yml:15
Thursday 21 July 2022  10:29:31 +0000 (0:00:00.890)       0:00:54.151 ********* 
skipping: [/cache/rhel-9.qcow2.snap] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [Verify the presence/absence of the LUKS device node] *********************
task path: /tmp/tmpfc26zqih/tests/storage/test-verify-volume-encryption.yml:21
Thursday 21 July 2022  10:29:31 +0000 (0:00:00.023)       0:00:54.174 ********* 
skipping: [/cache/rhel-9.qcow2.snap] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [Verify that the raw device is the same as the device if not encrypted] ***
task path: /tmp/tmpfc26zqih/tests/storage/test-verify-volume-encryption.yml:27
Thursday 21 July 2022  10:29:31 +0000 (0:00:00.023)       0:00:54.197 ********* 
skipping: [/cache/rhel-9.qcow2.snap] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [Make sure we got info about the LUKS volume if encrypted] ****************
task path: /tmp/tmpfc26zqih/tests/storage/test-verify-volume-encryption.yml:33
Thursday 21 July 2022  10:29:31 +0000 (0:00:00.023)       0:00:54.221 ********* 
skipping: [/cache/rhel-9.qcow2.snap] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [Verify the LUKS volume's device type if encrypted] ***********************
task path: /tmp/tmpfc26zqih/tests/storage/test-verify-volume-encryption.yml:39
Thursday 21 July 2022  10:29:31 +0000 (0:00:00.022)       0:00:54.243 ********* 
skipping: [/cache/rhel-9.qcow2.snap] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [Check LUKS version] ******************************************************
task path: /tmp/tmpfc26zqih/tests/storage/test-verify-volume-encryption.yml:44
Thursday 21 July 2022  10:29:31 +0000 (0:00:00.021)       0:00:54.265 ********* 
skipping: [/cache/rhel-9.qcow2.snap] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [Check LUKS key size] *****************************************************
task path: /tmp/tmpfc26zqih/tests/storage/test-verify-volume-encryption.yml:50
Thursday 21 July 2022  10:29:31 +0000 (0:00:00.021)       0:00:54.287 ********* 
skipping: [/cache/rhel-9.qcow2.snap] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [Check LUKS cipher] *******************************************************
task path: /tmp/tmpfc26zqih/tests/storage/test-verify-volume-encryption.yml:56
Thursday 21 July 2022  10:29:31 +0000 (0:00:00.048)       0:00:54.335 ********* 
skipping: [/cache/rhel-9.qcow2.snap] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [set_fact] ****************************************************************
task path: /tmp/tmpfc26zqih/tests/storage/test-verify-volume-encryption.yml:62
Thursday 21 July 2022  10:29:31 +0000 (0:00:00.024)       0:00:54.360 ********* 
ok: [/cache/rhel-9.qcow2.snap] => {
    "ansible_facts": {
        "_storage_test_crypttab_entries": [],
        "_storage_test_expected_crypttab_entries": "0",
        "_storage_test_expected_crypttab_key_file": "-"
    },
    "changed": false
}

TASK [Check for /etc/crypttab entry] *******************************************
task path: /tmp/tmpfc26zqih/tests/storage/test-verify-volume-encryption.yml:67
Thursday 21 July 2022  10:29:31 +0000 (0:00:00.052)       0:00:54.413 ********* 
ok: [/cache/rhel-9.qcow2.snap] => {
    "changed": false
}

MSG:

All assertions passed

TASK [Validate the format of the crypttab entry] *******************************
task path: /tmp/tmpfc26zqih/tests/storage/test-verify-volume-encryption.yml:72
Thursday 21 July 2022  10:29:31 +0000 (0:00:00.049)       0:00:54.463 ********* 
skipping: [/cache/rhel-9.qcow2.snap] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [Check backing device of crypttab entry] **********************************
task path: /tmp/tmpfc26zqih/tests/storage/test-verify-volume-encryption.yml:78
Thursday 21 July 2022  10:29:31 +0000 (0:00:00.039)       0:00:54.502 ********* 
skipping: [/cache/rhel-9.qcow2.snap] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [Check key file of crypttab entry] ****************************************
task path: /tmp/tmpfc26zqih/tests/storage/test-verify-volume-encryption.yml:84
Thursday 21 July 2022  10:29:31 +0000 (0:00:00.037)       0:00:54.539 ********* 
skipping: [/cache/rhel-9.qcow2.snap] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [set_fact] ****************************************************************
task path: /tmp/tmpfc26zqih/tests/storage/test-verify-volume-encryption.yml:90
Thursday 21 July 2022  10:29:32 +0000 (0:00:00.035)       0:00:54.575 ********* 
ok: [/cache/rhel-9.qcow2.snap] => {
    "ansible_facts": {
        "_storage_test_crypttab_entries": null,
        "_storage_test_expected_crypttab_entries": null,
        "_storage_test_expected_crypttab_key_file": null
    },
    "changed": false
}

TASK [get information about RAID] **********************************************
task path: /tmp/tmpfc26zqih/tests/storage/test-verify-volume-md.yml:7
Thursday 21 July 2022  10:29:32 +0000 (0:00:00.034)       0:00:54.609 ********* 
skipping: [/cache/rhel-9.qcow2.snap] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [set_fact] ****************************************************************
task path: /tmp/tmpfc26zqih/tests/storage/test-verify-volume-md.yml:13
Thursday 21 July 2022  10:29:32 +0000 (0:00:00.038)       0:00:54.648 ********* 
skipping: [/cache/rhel-9.qcow2.snap] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [set_fact] ****************************************************************
task path: /tmp/tmpfc26zqih/tests/storage/test-verify-volume-md.yml:17
Thursday 21 July 2022  10:29:32 +0000 (0:00:00.035)       0:00:54.684 ********* 
skipping: [/cache/rhel-9.qcow2.snap] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [set_fact] ****************************************************************
task path: /tmp/tmpfc26zqih/tests/storage/test-verify-volume-md.yml:21
Thursday 21 July 2022  10:29:32 +0000 (0:00:00.035)       0:00:54.719 ********* 
skipping: [/cache/rhel-9.qcow2.snap] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [check RAID active devices count] *****************************************
task path: /tmp/tmpfc26zqih/tests/storage/test-verify-volume-md.yml:25
Thursday 21 July 2022  10:29:32 +0000 (0:00:00.036)       0:00:54.755 ********* 
skipping: [/cache/rhel-9.qcow2.snap] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [check RAID spare devices count] ******************************************
task path: /tmp/tmpfc26zqih/tests/storage/test-verify-volume-md.yml:31
Thursday 21 July 2022  10:29:32 +0000 (0:00:00.040)       0:00:54.796 ********* 
skipping: [/cache/rhel-9.qcow2.snap] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [check RAID metadata version] *********************************************
task path: /tmp/tmpfc26zqih/tests/storage/test-verify-volume-md.yml:37
Thursday 21 July 2022  10:29:32 +0000 (0:00:00.036)       0:00:54.832 ********* 
skipping: [/cache/rhel-9.qcow2.snap] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [parse the actual size of the volume] *************************************
task path: /tmp/tmpfc26zqih/tests/storage/test-verify-volume-size.yml:3
Thursday 21 July 2022  10:29:32 +0000 (0:00:00.036)       0:00:54.869 ********* 
skipping: [/cache/rhel-9.qcow2.snap] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [parse the requested size of the volume] **********************************
task path: /tmp/tmpfc26zqih/tests/storage/test-verify-volume-size.yml:9
Thursday 21 July 2022  10:29:32 +0000 (0:00:00.024)       0:00:54.893 ********* 
skipping: [/cache/rhel-9.qcow2.snap] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [Establish base value for expected size] **********************************
task path: /tmp/tmpfc26zqih/tests/storage/test-verify-volume-size.yml:15
Thursday 21 July 2022  10:29:32 +0000 (0:00:00.040)       0:00:54.934 ********* 
skipping: [/cache/rhel-9.qcow2.snap] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [debug] *******************************************************************
task path: /tmp/tmpfc26zqih/tests/storage/test-verify-volume-size.yml:20
Thursday 21 July 2022  10:29:32 +0000 (0:00:00.037)       0:00:54.972 ********* 
ok: [/cache/rhel-9.qcow2.snap] => {
    "storage_test_expected_size": "VARIABLE IS NOT DEFINED!: 'storage_test_expected_size' is undefined"
}

TASK [debug] *******************************************************************
task path: /tmp/tmpfc26zqih/tests/storage/test-verify-volume-size.yml:25
Thursday 21 July 2022  10:29:32 +0000 (0:00:00.037)       0:00:55.010 ********* 
skipping: [/cache/rhel-9.qcow2.snap] => {}

TASK [debug] *******************************************************************
task path: /tmp/tmpfc26zqih/tests/storage/test-verify-volume-size.yml:28
Thursday 21 July 2022  10:29:32 +0000 (0:00:00.038)       0:00:55.048 ********* 
skipping: [/cache/rhel-9.qcow2.snap] => {}

TASK [Get the size of parent/pool device] **************************************
task path: /tmp/tmpfc26zqih/tests/storage/test-verify-volume-size.yml:31
Thursday 21 July 2022  10:29:32 +0000 (0:00:00.040)       0:00:55.089 ********* 
skipping: [/cache/rhel-9.qcow2.snap] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [debug] *******************************************************************
task path: /tmp/tmpfc26zqih/tests/storage/test-verify-volume-size.yml:36
Thursday 21 July 2022  10:29:32 +0000 (0:00:00.036)       0:00:55.126 ********* 
skipping: [/cache/rhel-9.qcow2.snap] => {}

TASK [Calculate the expected size based on pool size and percentage value] *****
task path: /tmp/tmpfc26zqih/tests/storage/test-verify-volume-size.yml:39
Thursday 21 July 2022  10:29:32 +0000 (0:00:00.035)       0:00:55.161 ********* 
skipping: [/cache/rhel-9.qcow2.snap] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [debug] *******************************************************************
task path: /tmp/tmpfc26zqih/tests/storage/test-verify-volume-size.yml:44
Thursday 21 July 2022  10:29:32 +0000 (0:00:00.035)       0:00:55.197 ********* 
ok: [/cache/rhel-9.qcow2.snap] => {
    "storage_test_actual_size": {
        "changed": false,
        "skip_reason": "Conditional result was False",
        "skipped": true
    }
}

TASK [debug] *******************************************************************
task path: /tmp/tmpfc26zqih/tests/storage/test-verify-volume-size.yml:47
Thursday 21 July 2022  10:29:32 +0000 (0:00:00.037)       0:00:55.234 ********* 
ok: [/cache/rhel-9.qcow2.snap] => {
    "storage_test_expected_size": "VARIABLE IS NOT DEFINED!: 'storage_test_expected_size' is undefined"
}

TASK [assert] ******************************************************************
task path: /tmp/tmpfc26zqih/tests/storage/test-verify-volume-size.yml:50
Thursday 21 July 2022  10:29:32 +0000 (0:00:00.034)       0:00:55.269 ********* 
skipping: [/cache/rhel-9.qcow2.snap] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [Get information about the LV] ********************************************
task path: /tmp/tmpfc26zqih/tests/storage/test-verify-volume-cache.yml:6
Thursday 21 July 2022  10:29:32 +0000 (0:00:00.023)       0:00:55.292 ********* 
skipping: [/cache/rhel-9.qcow2.snap] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [set_fact] ****************************************************************
task path: /tmp/tmpfc26zqih/tests/storage/test-verify-volume-cache.yml:14
Thursday 21 July 2022  10:29:32 +0000 (0:00:00.023)       0:00:55.316 ********* 
skipping: [/cache/rhel-9.qcow2.snap] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [check segment type] ******************************************************
task path: /tmp/tmpfc26zqih/tests/storage/test-verify-volume-cache.yml:17
Thursday 21 July 2022  10:29:32 +0000 (0:00:00.025)       0:00:55.341 ********* 
skipping: [/cache/rhel-9.qcow2.snap] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [set_fact] ****************************************************************
task path: /tmp/tmpfc26zqih/tests/storage/test-verify-volume-cache.yml:22
Thursday 21 July 2022  10:29:32 +0000 (0:00:00.023)       0:00:55.365 ********* 
skipping: [/cache/rhel-9.qcow2.snap] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [parse the requested cache size] ******************************************
task path: /tmp/tmpfc26zqih/tests/storage/test-verify-volume-cache.yml:26
Thursday 21 July 2022  10:29:32 +0000 (0:00:00.024)       0:00:55.389 ********* 
skipping: [/cache/rhel-9.qcow2.snap] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [set_fact] ****************************************************************
task path: /tmp/tmpfc26zqih/tests/storage/test-verify-volume-cache.yml:32
Thursday 21 July 2022  10:29:32 +0000 (0:00:00.023)       0:00:55.413 ********* 
skipping: [/cache/rhel-9.qcow2.snap] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [Check cache size] ********************************************************
task path: /tmp/tmpfc26zqih/tests/storage/test-verify-volume-cache.yml:36
Thursday 21 July 2022  10:29:32 +0000 (0:00:00.024)       0:00:55.437 ********* 
skipping: [/cache/rhel-9.qcow2.snap] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [Clean up facts] **********************************************************
task path: /tmp/tmpfc26zqih/tests/storage/test-verify-volume.yml:16
Thursday 21 July 2022  10:29:32 +0000 (0:00:00.023)       0:00:55.461 ********* 
ok: [/cache/rhel-9.qcow2.snap] => {
    "ansible_facts": {
        "_storage_test_volume_present": null
    },
    "changed": false
}

TASK [Clean up variable namespace] *********************************************
task path: /tmp/tmpfc26zqih/tests/storage/verify-role-results.yml:53
Thursday 21 July 2022  10:29:32 +0000 (0:00:00.034)       0:00:55.495 ********* 
ok: [/cache/rhel-9.qcow2.snap] => {
    "ansible_facts": {
        "storage_test_blkinfo": null,
        "storage_test_crypttab": null,
        "storage_test_fstab": null
    },
    "changed": false
}
META: ran handlers
META: ran handlers

PLAY RECAP *********************************************************************
/cache/rhel-9.qcow2.snap   : ok=259  changed=6    unreachable=0    failed=0    skipped=228  rescued=0    ignored=0   

Thursday 21 July 2022  10:29:32 +0000 (0:00:00.049)       0:00:55.544 ********* 
=============================================================================== 
fedora.linux_system_roles.storage : get service facts ------------------- 1.80s
/tmp/tmpi19f9hzy/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:47 
fedora.linux_system_roles.storage : manage the pools and volumes to match the specified state --- 1.70s
/tmp/tmpi19f9hzy/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:64 
fedora.linux_system_roles.storage : manage the pools and volumes to match the specified state --- 1.65s
/tmp/tmpi19f9hzy/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:64 
Gathering Facts --------------------------------------------------------- 1.36s
/tmp/tmpfc26zqih/tests/storage/tests_change_disk_mount.yml:2 ------------------
fedora.linux_system_roles.storage : manage the pools and volumes to match the specified state --- 1.31s
/tmp/tmpi19f9hzy/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:64 
fedora.linux_system_roles.storage : manage the pools and volumes to match the specified state --- 1.30s
/tmp/tmpi19f9hzy/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:64 
fedora.linux_system_roles.storage : make sure blivet is available ------- 1.29s
/tmp/tmpi19f9hzy/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:7 
fedora.linux_system_roles.storage : Update facts ------------------------ 1.03s
/tmp/tmpi19f9hzy/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:183 
fedora.linux_system_roles.storage : Update facts ------------------------ 0.99s
/tmp/tmpi19f9hzy/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:183 
fedora.linux_system_roles.storage : Update facts ------------------------ 0.98s
/tmp/tmpi19f9hzy/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:183 
fedora.linux_system_roles.storage : Update facts ------------------------ 0.96s
/tmp/tmpi19f9hzy/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:183 
fedora.linux_system_roles.storage : Update facts ------------------------ 0.95s
/tmp/tmpi19f9hzy/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:183 
fedora.linux_system_roles.storage : tell systemd to refresh its view of /etc/fstab --- 0.92s
/tmp/tmpi19f9hzy/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:132 
fedora.linux_system_roles.storage : make sure required packages are installed --- 0.89s
/tmp/tmpi19f9hzy/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:41 
Ensure cryptsetup is present -------------------------------------------- 0.89s
/tmp/tmpfc26zqih/tests/storage/test-verify-volume-encryption.yml:10 -----------
Ensure cryptsetup is present -------------------------------------------- 0.87s
/tmp/tmpfc26zqih/tests/storage/test-verify-volume-encryption.yml:10 -----------
Ensure cryptsetup is present -------------------------------------------- 0.87s
/tmp/tmpfc26zqih/tests/storage/test-verify-volume-encryption.yml:10 -----------
Ensure cryptsetup is present -------------------------------------------- 0.85s
/tmp/tmpfc26zqih/tests/storage/test-verify-volume-encryption.yml:10 -----------
fedora.linux_system_roles.storage : get required packages --------------- 0.75s
/tmp/tmpi19f9hzy/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:23 
fedora.linux_system_roles.storage : tell systemd to refresh its view of /etc/fstab --- 0.73s
/tmp/tmpi19f9hzy/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:132