ansible-playbook 2.9.27
  config file = /etc/ansible/ansible.cfg
  configured module search path = [u'/root/.ansible/plugins/modules', u'/usr/share/ansible/plugins/modules']
  ansible python module location = /usr/lib/python2.7/site-packages/ansible
  executable location = /usr/bin/ansible-playbook
  python version = 2.7.5 (default, Nov 14 2023, 16:14:06) [GCC 4.8.5 20150623 (Red Hat 4.8.5-44)]
Using /etc/ansible/ansible.cfg as config file
[WARNING]: running playbook inside collection fedora.linux_system_roles
statically imported: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/create-test-file.yml
statically imported: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-data-preservation.yml
statically imported: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/create-test-file.yml
statically imported: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-data-preservation.yml
statically imported: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/create-test-file.yml
statically imported: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-data-preservation.yml
statically imported: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/create-test-file.yml
statically imported: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-data-preservation.yml
statically imported: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/create-test-file.yml
statically imported: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-data-preservation.yml
statically imported: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/create-test-file.yml
statically imported: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-data-preservation.yml
Skipping callback 'actionable', as we already have a stdout callback.
Skipping callback 'counter_enabled', as we already have a stdout callback.
Skipping callback 'debug', as we already have a stdout callback.
Skipping callback 'dense', as we already have a stdout callback.
Skipping callback 'dense', as we already have a stdout callback.
Skipping callback 'full_skip', as we already have a stdout callback.
Skipping callback 'json', as we already have a stdout callback.
Skipping callback 'minimal', as we already have a stdout callback.
Skipping callback 'null', as we already have a stdout callback.
Skipping callback 'oneline', as we already have a stdout callback.
Skipping callback 'selective', as we already have a stdout callback.
Skipping callback 'skippy', as we already have a stdout callback.
Skipping callback 'stderr', as we already have a stdout callback.
Skipping callback 'unixy', as we already have a stdout callback.
Skipping callback 'yaml', as we already have a stdout callback.
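The "statically imported" lines above are printed once per import site because import_tasks/import_playbook are resolved at parse time, before any task runs; the same two helper files are imported six times by the test. A minimal sketch of the pattern (the helper file names come from the log; the wrapping task names are illustrative):

- name: Create a test file before the operation under test
  import_tasks: create-test-file.yml

- name: Verify the data survived the operation under test
  import_tasks: verify-data-preservation.yml
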
PLAYBOOK: tests_luks2.yml ******************************************************
1 plays in /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/tests_luks2.yml

PLAY [Test LUKS2] **************************************************************

TASK [Gathering Facts] *********************************************************
task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/tests_luks2.yml:2
Friday 17 January 2025 04:10:07 -0500 (0:00:00.028) 0:00:00.028 ********
ok: [managed-node1]
META: ran handlers

TASK [Enable FIPS mode] ********************************************************
task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/tests_luks2.yml:20
Friday 17 January 2025 04:10:08 -0500 (0:00:01.210) 0:00:01.240 ********
skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Reboot] ******************************************************************
task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/tests_luks2.yml:28
Friday 17 January 2025 04:10:08 -0500 (0:00:00.047) 0:00:01.287 ********
skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Enable FIPS mode] ********************************************************
task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/tests_luks2.yml:39
Friday 17 January 2025 04:10:08 -0500 (0:00:00.045) 0:00:01.332 ********
skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Reboot] ******************************************************************
task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/tests_luks2.yml:43
Friday 17 January 2025 04:10:08 -0500 (0:00:00.046) 0:00:01.379 ********
skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Ensure dracut-fips] ******************************************************
task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/tests_luks2.yml:53
Friday 17 January 2025 04:10:08 -0500 (0:00:00.048) 0:00:01.427 ********
skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Configure boot for FIPS] *************************************************
task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/tests_luks2.yml:59
Friday 17 January 2025 04:10:08 -0500 (0:00:00.073) 0:00:01.501 ********
skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Reboot] ******************************************************************
task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/tests_luks2.yml:68
Friday 17 January 2025 04:10:08 -0500 (0:00:00.075) 0:00:01.576 ********
skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Run the role] ************************************************************
task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/tests_luks2.yml:72
Friday 17 January 2025 04:10:08 -0500 (0:00:00.075) 0:00:01.652 ********
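Every FIPS task above is skipped with "Conditional result was False": the play only exercises the FIPS code path when a test toggle is set, and this run leaves it off. A minimal sketch of that gating, assuming a hypothetical toggle variable fips_enabled (the real condition and commands live in tests_luks2.yml and are not shown in the log):

- name: Enable FIPS mode
  command: fips-mode-setup --enable  # illustrative command, not taken from the test
  when: fips_enabled | d(false) | bool

- name: Reboot
  reboot:
  when: fips_enabled | d(false) | bool
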
TASK [fedora.linux_system_roles.storage : Set platform/version specific variables] ***
task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:2
Friday 17 January 2025 04:10:09 -0500 (0:00:00.091) 0:00:01.743 ********
included: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml for managed-node1

TASK [fedora.linux_system_roles.storage : Ensure ansible_facts used by role] ***
task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:2
Friday 17 January 2025 04:10:09 -0500 (0:00:00.065) 0:00:01.809 ********
skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [fedora.linux_system_roles.storage : Set platform/version specific variables] ***
task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:7
Friday 17 January 2025 04:10:09 -0500 (0:00:00.072) 0:00:01.881 ********
skipping: [managed-node1] => (item=RedHat.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat.yml", "skip_reason": "Conditional result was False" }
skipping: [managed-node1] => (item=CentOS.yml) => { "ansible_loop_var": "item", "changed": false, "item": "CentOS.yml", "skip_reason": "Conditional result was False" }
ok: [managed-node1] => (item=CentOS_7.yml) => { "ansible_facts": { "__storage_blivet_diskvolume_mkfs_option_map": { "ext2": "-F", "ext3": "-F", "ext4": "-F" }, "blivet_package_list": [ "python-enum34", "python-blivet3", "libblockdev-crypto", "libblockdev-dm", "libblockdev-lvm", "libblockdev-mdraid", "libblockdev-swap", "{{ 'libblockdev-s390' if ansible_architecture == 's390x' else 'libblockdev' }}" ] }, "ansible_included_var_files": [ "/tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/vars/CentOS_7.yml" ], "ansible_loop_var": "item", "changed": false, "item": "CentOS_7.yml" }
skipping: [managed-node1] => (item=CentOS_7.9.yml) => { "ansible_loop_var": "item", "changed": false, "item": "CentOS_7.9.yml", "skip_reason": "Conditional result was False" }

TASK [fedora.linux_system_roles.storage : Check if system is ostree] ***********
task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:25
Friday 17 January 2025 04:10:09 -0500 (0:00:00.130) 0:00:02.011 ********
ok: [managed-node1] => { "changed": false, "stat": { "exists": false } }

TASK [fedora.linux_system_roles.storage : Set flag to indicate system is ostree] ***
task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:30
Friday 17 January 2025 04:10:09 -0500 (0:00:00.469) 0:00:02.480 ********
ok: [managed-node1] => { "ansible_facts": { "__storage_is_ostree": false }, "changed": false }

TASK [fedora.linux_system_roles.storage : Define an empty list of pools to be used in testing] ***
task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:5
Friday 17 January 2025 04:10:09 -0500 (0:00:00.051) 0:00:02.532 ********
ok: [managed-node1] => { "ansible_facts": { "_storage_pools_list": [] }, "changed": false }

TASK [fedora.linux_system_roles.storage : Define an empty list of volumes to be used in testing] ***
task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:9
Friday 17 January 2025 04:10:09 -0500 (0:00:00.025) 0:00:02.557 ********
ok: [managed-node1] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false }
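The four-item loop in "Set platform/version specific variables" above is the usual most-specific-wins vars lookup: the role tries an os-family file, then distribution, then major version, then full version, and loads whichever exist on disk, so CentOS_7.yml supplies blivet_package_list here (including the Jinja switch that substitutes libblockdev-s390 on s390x). A sketch of the pattern; the exact when test and loop expressions are assumptions, while the four file names match the log:

- name: Set platform/version specific variables
  include_vars: "{{ role_path }}/vars/{{ item }}"
  loop:
    - "{{ ansible_facts['os_family'] }}.yml"          # RedHat.yml (skipped, absent)
    - "{{ ansible_facts['distribution'] }}.yml"       # CentOS.yml (skipped, absent)
    - "{{ ansible_facts['distribution'] }}_{{ ansible_facts['distribution_major_version'] }}.yml"  # CentOS_7.yml (loaded)
    - "{{ ansible_facts['distribution'] }}_{{ ansible_facts['distribution_version'] }}.yml"        # CentOS_7.9.yml (skipped, absent)
  when: (role_path + '/vars/' + item) is file
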
TASK [fedora.linux_system_roles.storage : Include the appropriate provider tasks] ***
task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:13
Friday 17 January 2025 04:10:09 -0500 (0:00:00.031) 0:00:02.588 ********
included: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml for managed-node1

TASK [fedora.linux_system_roles.storage : Make sure blivet is available] *******
task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:2
Friday 17 January 2025 04:10:10 -0500 (0:00:00.131) 0:00:02.719 ********
changed: [managed-node1] => { "changed": true, "changes": { "installed": [ "python-enum34", "python-blivet3", "libblockdev-crypto", "libblockdev-dm", "libblockdev-lvm", "libblockdev-mdraid", "libblockdev-swap", "libblockdev" ] }, "rc": 0, "results": [ "Loaded plugins: fastestmirror\nLoading mirror speeds from cached hostfile\n * epel: d2lzkl7pfhq30w.cloudfront.net\n * epel-debuginfo: d2lzkl7pfhq30w.cloudfront.net\n * epel-source: d2lzkl7pfhq30w.cloudfront.net\nResolving Dependencies\n--> Running transaction check\n---> Package libblockdev.x86_64 0:2.18-5.el7 will be installed\n--> Processing Dependency: libblockdev-utils(x86-64) = 2.18-5.el7 for package: libblockdev-2.18-5.el7.x86_64\n--> Processing Dependency: libbd_utils.so.2()(64bit) for package: libblockdev-2.18-5.el7.x86_64\n---> Package libblockdev-crypto.x86_64 0:2.18-5.el7 will be installed\n--> Processing Dependency: libvolume_key.so.1()(64bit) for package: libblockdev-crypto-2.18-5.el7.x86_64\n---> Package libblockdev-dm.x86_64 0:2.18-5.el7 will be installed\n--> Processing Dependency: libdmraid.so.1(Base)(64bit) for package: libblockdev-dm-2.18-5.el7.x86_64\n--> Processing Dependency: dmraid for package: libblockdev-dm-2.18-5.el7.x86_64\n--> Processing Dependency: libdmraid.so.1()(64bit) for package: libblockdev-dm-2.18-5.el7.x86_64\n---> Package libblockdev-lvm.x86_64 0:2.18-5.el7 will be installed\n--> Processing Dependency: lvm2 for package: libblockdev-lvm-2.18-5.el7.x86_64\n--> Processing Dependency: device-mapper-persistent-data for package: libblockdev-lvm-2.18-5.el7.x86_64\n---> Package libblockdev-mdraid.x86_64 0:2.18-5.el7 will be installed\n--> Processing Dependency: mdadm for package: libblockdev-mdraid-2.18-5.el7.x86_64\n--> Processing Dependency: libbytesize.so.1()(64bit) for package: libblockdev-mdraid-2.18-5.el7.x86_64\n---> Package libblockdev-swap.x86_64 0:2.18-5.el7 will be installed\n---> Package python-enum34.noarch 0:1.0.4-1.el7 will be installed\n---> Package python2-blivet3.noarch 1:3.1.3-3.el7 will be installed\n--> Processing Dependency: blivet3-data = 1:3.1.3-3.el7 for package: 1:python2-blivet3-3.1.3-3.el7.noarch\n--> Processing Dependency: python2-bytesize >= 0.3 for package: 1:python2-blivet3-3.1.3-3.el7.noarch\n--> Processing Dependency: python2-blockdev >= 2.17 for package: 1:python2-blivet3-3.1.3-3.el7.noarch\n--> Processing Dependency: pyparted >= 3.9 for package: 1:python2-blivet3-3.1.3-3.el7.noarch\n--> Processing Dependency: python2-hawkey for package: 1:python2-blivet3-3.1.3-3.el7.noarch\n--> Processing Dependency: lsof for package: 1:python2-blivet3-3.1.3-3.el7.noarch\n--> Running transaction check\n---> Package blivet3-data.noarch 1:3.1.3-3.el7 will be installed\n---> Package device-mapper-persistent-data.x86_64 0:0.8.5-3.el7_9.2 will be installed\n--> Processing Dependency: libaio.so.1(LIBAIO_0.4)(64bit) for package: 
device-mapper-persistent-data-0.8.5-3.el7_9.2.x86_64\n--> Processing Dependency: libaio.so.1(LIBAIO_0.1)(64bit) for package: device-mapper-persistent-data-0.8.5-3.el7_9.2.x86_64\n--> Processing Dependency: libaio.so.1()(64bit) for package: device-mapper-persistent-data-0.8.5-3.el7_9.2.x86_64\n---> Package dmraid.x86_64 0:1.0.0.rc16-28.el7 will be installed\n--> Processing Dependency: libdevmapper-event.so.1.02(Base)(64bit) for package: dmraid-1.0.0.rc16-28.el7.x86_64\n--> Processing Dependency: dmraid-events for package: dmraid-1.0.0.rc16-28.el7.x86_64\n--> Processing Dependency: libdevmapper-event.so.1.02()(64bit) for package: dmraid-1.0.0.rc16-28.el7.x86_64\n---> Package libblockdev-utils.x86_64 0:2.18-5.el7 will be installed\n---> Package libbytesize.x86_64 0:1.2-1.el7 will be installed\n---> Package lsof.x86_64 0:4.87-6.el7 will be installed\n---> Package lvm2.x86_64 7:2.02.187-6.el7_9.5 will be installed\n--> Processing Dependency: lvm2-libs = 7:2.02.187-6.el7_9.5 for package: 7:lvm2-2.02.187-6.el7_9.5.x86_64\n--> Processing Dependency: liblvm2app.so.2.2(Base)(64bit) for package: 7:lvm2-2.02.187-6.el7_9.5.x86_64\n--> Processing Dependency: liblvm2app.so.2.2()(64bit) for package: 7:lvm2-2.02.187-6.el7_9.5.x86_64\n---> Package mdadm.x86_64 0:4.1-9.el7_9 will be installed\n--> Processing Dependency: libreport-filesystem for package: mdadm-4.1-9.el7_9.x86_64\n---> Package pyparted.x86_64 1:3.9-15.el7 will be installed\n---> Package python2-blockdev.x86_64 0:2.18-5.el7 will be installed\n---> Package python2-bytesize.x86_64 0:1.2-1.el7 will be installed\n---> Package python2-hawkey.x86_64 0:0.22.5-2.el7_9 will be installed\n--> Processing Dependency: python2-libdnf = 0.22.5-2.el7_9 for package: python2-hawkey-0.22.5-2.el7_9.x86_64\n--> Processing Dependency: libdnf(x86-64) = 0.22.5-2.el7_9 for package: python2-hawkey-0.22.5-2.el7_9.x86_64\n--> Processing Dependency: libsolvext.so.0(SOLV_1.0)(64bit) for package: python2-hawkey-0.22.5-2.el7_9.x86_64\n--> Processing Dependency: libsolv.so.0(SOLV_1.0)(64bit) for package: python2-hawkey-0.22.5-2.el7_9.x86_64\n--> Processing Dependency: libsolvext.so.0()(64bit) for package: python2-hawkey-0.22.5-2.el7_9.x86_64\n--> Processing Dependency: libsolv.so.0()(64bit) for package: python2-hawkey-0.22.5-2.el7_9.x86_64\n--> Processing Dependency: librepo.so.0()(64bit) for package: python2-hawkey-0.22.5-2.el7_9.x86_64\n--> Processing Dependency: libmodulemd.so.1()(64bit) for package: python2-hawkey-0.22.5-2.el7_9.x86_64\n--> Processing Dependency: libdnf.so.2()(64bit) for package: python2-hawkey-0.22.5-2.el7_9.x86_64\n---> Package volume_key-libs.x86_64 0:0.3.9-9.el7 will be installed\n--> Running transaction check\n---> Package device-mapper-event-libs.x86_64 7:1.02.170-6.el7_9.5 will be installed\n---> Package dmraid-events.x86_64 0:1.0.0.rc16-28.el7 will be installed\n--> Processing Dependency: sgpio for package: dmraid-events-1.0.0.rc16-28.el7.x86_64\n--> Processing Dependency: device-mapper-event for package: dmraid-events-1.0.0.rc16-28.el7.x86_64\n---> Package libaio.x86_64 0:0.3.109-13.el7 will be installed\n---> Package libdnf.x86_64 0:0.22.5-2.el7_9 will be installed\n---> Package libmodulemd.x86_64 0:1.6.3-1.el7 will be installed\n---> Package librepo.x86_64 0:1.8.1-8.el7_9 will be installed\n---> Package libreport-filesystem.x86_64 0:2.1.11-53.el7.centos will be installed\n---> Package libsolv.x86_64 0:0.6.34-4.el7 will be installed\n---> Package lvm2-libs.x86_64 7:2.02.187-6.el7_9.5 will be installed\n---> Package python2-libdnf.x86_64 
0:0.22.5-2.el7_9 will be installed\n--> Running transaction check\n---> Package device-mapper-event.x86_64 7:1.02.170-6.el7_9.5 will be installed\n---> Package sgpio.x86_64 0:1.2.0.10-13.el7 will be installed\n--> Finished Dependency Resolution\n\nDependencies Resolved\n\n================================================================================\n Package Arch Version Repository\n Size\n================================================================================\nInstalling:\n libblockdev x86_64 2.18-5.el7 base 119 k\n libblockdev-crypto x86_64 2.18-5.el7 base 60 k\n libblockdev-dm x86_64 2.18-5.el7 base 54 k\n libblockdev-lvm x86_64 2.18-5.el7 base 62 k\n libblockdev-mdraid x86_64 2.18-5.el7 base 57 k\n libblockdev-swap x86_64 2.18-5.el7 base 52 k\n python-enum34 noarch 1.0.4-1.el7 base 52 k\n python2-blivet3 noarch 1:3.1.3-3.el7 base 851 k\nInstalling for dependencies:\n blivet3-data noarch 1:3.1.3-3.el7 base 77 k\n device-mapper-event x86_64 7:1.02.170-6.el7_9.5 updates 192 k\n device-mapper-event-libs x86_64 7:1.02.170-6.el7_9.5 updates 192 k\n device-mapper-persistent-data x86_64 0.8.5-3.el7_9.2 updates 423 k\n dmraid x86_64 1.0.0.rc16-28.el7 base 151 k\n dmraid-events x86_64 1.0.0.rc16-28.el7 base 21 k\n libaio x86_64 0.3.109-13.el7 base 24 k\n libblockdev-utils x86_64 2.18-5.el7 base 58 k\n libbytesize x86_64 1.2-1.el7 base 52 k\n libdnf x86_64 0.22.5-2.el7_9 extras 535 k\n libmodulemd x86_64 1.6.3-1.el7 extras 141 k\n librepo x86_64 1.8.1-8.el7_9 updates 82 k\n libreport-filesystem x86_64 2.1.11-53.el7.centos base 41 k\n libsolv x86_64 0.6.34-4.el7 base 329 k\n lsof x86_64 4.87-6.el7 base 331 k\n lvm2 x86_64 7:2.02.187-6.el7_9.5 updates 1.3 M\n lvm2-libs x86_64 7:2.02.187-6.el7_9.5 updates 1.1 M\n mdadm x86_64 4.1-9.el7_9 updates 439 k\n pyparted x86_64 1:3.9-15.el7 base 195 k\n python2-blockdev x86_64 2.18-5.el7 base 61 k\n python2-bytesize x86_64 1.2-1.el7 base 22 k\n python2-hawkey x86_64 0.22.5-2.el7_9 extras 71 k\n python2-libdnf x86_64 0.22.5-2.el7_9 extras 611 k\n sgpio x86_64 1.2.0.10-13.el7 base 13 k\n volume_key-libs x86_64 0.3.9-9.el7 base 141 k\n\nTransaction Summary\n================================================================================\nInstall 8 Packages (+25 Dependent packages)\n\nTotal download size: 7.8 M\nInstalled size: 23 M\nDownloading packages:\n--------------------------------------------------------------------------------\nTotal 5.1 MB/s | 7.8 MB 00:01 \nRunning transaction check\nRunning transaction test\nTransaction test succeeded\nRunning transaction\n Installing : libblockdev-utils-2.18-5.el7.x86_64 1/33 \n Installing : 7:device-mapper-event-libs-1.02.170-6.el7_9.5.x86_64 2/33 \n Installing : libsolv-0.6.34-4.el7.x86_64 3/33 \n Installing : libaio-0.3.109-13.el7.x86_64 4/33 \n Installing : librepo-1.8.1-8.el7_9.x86_64 5/33 \n Installing : libmodulemd-1.6.3-1.el7.x86_64 6/33 \n Installing : libdnf-0.22.5-2.el7_9.x86_64 7/33 \n Installing : device-mapper-persistent-data-0.8.5-3.el7_9.2.x86_64 8/33 \n Installing : 7:device-mapper-event-1.02.170-6.el7_9.5.x86_64 9/33 \n Installing : libbytesize-1.2-1.el7.x86_64 10/33 \n Installing : python2-bytesize-1.2-1.el7.x86_64 11/33 \n Installing : 7:lvm2-libs-2.02.187-6.el7_9.5.x86_64 12/33 \n Installing : 7:lvm2-2.02.187-6.el7_9.5.x86_64 13/33 \n Installing : python2-libdnf-0.22.5-2.el7_9.x86_64 14/33 \n Installing : python2-hawkey-0.22.5-2.el7_9.x86_64 15/33 \n Installing : libblockdev-2.18-5.el7.x86_64 16/33 \n Installing : python2-blockdev-2.18-5.el7.x86_64 17/33 \n Installing : 
1:pyparted-3.9-15.el7.x86_64 18/33 \n Installing : sgpio-1.2.0.10-13.el7.x86_64 19/33 \n Installing : dmraid-1.0.0.rc16-28.el7.x86_64 20/33 \n Installing : dmraid-events-1.0.0.rc16-28.el7.x86_64 21/33 \n Installing : volume_key-libs-0.3.9-9.el7.x86_64 22/33 \n Installing : libreport-filesystem-2.1.11-53.el7.centos.x86_64 23/33 \n Installing : mdadm-4.1-9.el7_9.x86_64 24/33 \n Installing : 1:blivet3-data-3.1.3-3.el7.noarch 25/33 \n Installing : lsof-4.87-6.el7.x86_64 26/33 \n Installing : 1:python2-blivet3-3.1.3-3.el7.noarch 27/33 \n Installing : libblockdev-mdraid-2.18-5.el7.x86_64 28/33 \n Installing : libblockdev-crypto-2.18-5.el7.x86_64 29/33 \n Installing : libblockdev-dm-2.18-5.el7.x86_64 30/33 \n Installing : libblockdev-lvm-2.18-5.el7.x86_64 31/33 \n Installing : libblockdev-swap-2.18-5.el7.x86_64 32/33 \n Installing : python-enum34-1.0.4-1.el7.noarch 33/33 \n Verifying : 7:device-mapper-event-1.02.170-6.el7_9.5.x86_64 1/33 \n Verifying : libblockdev-swap-2.18-5.el7.x86_64 2/33 \n Verifying : libblockdev-lvm-2.18-5.el7.x86_64 3/33 \n Verifying : lsof-4.87-6.el7.x86_64 4/33 \n Verifying : libblockdev-mdraid-2.18-5.el7.x86_64 5/33 \n Verifying : libdnf-0.22.5-2.el7_9.x86_64 6/33 \n Verifying : python-enum34-1.0.4-1.el7.noarch 7/33 \n Verifying : 1:blivet3-data-3.1.3-3.el7.noarch 8/33 \n Verifying : dmraid-events-1.0.0.rc16-28.el7.x86_64 9/33 \n Verifying : python2-blockdev-2.18-5.el7.x86_64 10/33 \n Verifying : libmodulemd-1.6.3-1.el7.x86_64 11/33 \n Verifying : librepo-1.8.1-8.el7_9.x86_64 12/33 \n Verifying : libblockdev-dm-2.18-5.el7.x86_64 13/33 \n Verifying : libaio-0.3.109-13.el7.x86_64 14/33 \n Verifying : libreport-filesystem-2.1.11-53.el7.centos.x86_64 15/33 \n Verifying : 7:lvm2-libs-2.02.187-6.el7_9.5.x86_64 16/33 \n Verifying : python2-hawkey-0.22.5-2.el7_9.x86_64 17/33 \n Verifying : python2-bytesize-1.2-1.el7.x86_64 18/33 \n Verifying : libblockdev-2.18-5.el7.x86_64 19/33 \n Verifying : libbytesize-1.2-1.el7.x86_64 20/33 \n Verifying : 7:device-mapper-event-libs-1.02.170-6.el7_9.5.x86_64 21/33 \n Verifying : python2-libdnf-0.22.5-2.el7_9.x86_64 22/33 \n Verifying : 7:lvm2-2.02.187-6.el7_9.5.x86_64 23/33 \n Verifying : libblockdev-utils-2.18-5.el7.x86_64 24/33 \n Verifying : volume_key-libs-0.3.9-9.el7.x86_64 25/33 \n Verifying : libsolv-0.6.34-4.el7.x86_64 26/33 \n Verifying : device-mapper-persistent-data-0.8.5-3.el7_9.2.x86_64 27/33 \n Verifying : 1:python2-blivet3-3.1.3-3.el7.noarch 28/33 \n Verifying : dmraid-1.0.0.rc16-28.el7.x86_64 29/33 \n Verifying : mdadm-4.1-9.el7_9.x86_64 30/33 \n Verifying : sgpio-1.2.0.10-13.el7.x86_64 31/33 \n Verifying : libblockdev-crypto-2.18-5.el7.x86_64 32/33 \n Verifying : 1:pyparted-3.9-15.el7.x86_64 33/33 \n\nInstalled:\n libblockdev.x86_64 0:2.18-5.el7 libblockdev-crypto.x86_64 0:2.18-5.el7\n libblockdev-dm.x86_64 0:2.18-5.el7 libblockdev-lvm.x86_64 0:2.18-5.el7 \n libblockdev-mdraid.x86_64 0:2.18-5.el7 libblockdev-swap.x86_64 0:2.18-5.el7 \n python-enum34.noarch 0:1.0.4-1.el7 python2-blivet3.noarch 1:3.1.3-3.el7 \n\nDependency Installed:\n blivet3-data.noarch 1:3.1.3-3.el7 \n device-mapper-event.x86_64 7:1.02.170-6.el7_9.5 \n device-mapper-event-libs.x86_64 7:1.02.170-6.el7_9.5 \n device-mapper-persistent-data.x86_64 0:0.8.5-3.el7_9.2 \n dmraid.x86_64 0:1.0.0.rc16-28.el7 \n dmraid-events.x86_64 0:1.0.0.rc16-28.el7 \n libaio.x86_64 0:0.3.109-13.el7 \n libblockdev-utils.x86_64 0:2.18-5.el7 \n libbytesize.x86_64 0:1.2-1.el7 \n libdnf.x86_64 0:0.22.5-2.el7_9 \n libmodulemd.x86_64 0:1.6.3-1.el7 \n librepo.x86_64 0:1.8.1-8.el7_9 \n 
libreport-filesystem.x86_64 0:2.1.11-53.el7.centos \n libsolv.x86_64 0:0.6.34-4.el7 \n lsof.x86_64 0:4.87-6.el7 \n lvm2.x86_64 7:2.02.187-6.el7_9.5 \n lvm2-libs.x86_64 7:2.02.187-6.el7_9.5 \n mdadm.x86_64 0:4.1-9.el7_9 \n pyparted.x86_64 1:3.9-15.el7 \n python2-blockdev.x86_64 0:2.18-5.el7 \n python2-bytesize.x86_64 0:1.2-1.el7 \n python2-hawkey.x86_64 0:0.22.5-2.el7_9 \n python2-libdnf.x86_64 0:0.22.5-2.el7_9 \n sgpio.x86_64 0:1.2.0.10-13.el7 \n volume_key-libs.x86_64 0:0.3.9-9.el7 \n\nComplete!\n" ] }

TASK [fedora.linux_system_roles.storage : Show storage_pools] ******************
task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:9
Friday 17 January 2025 04:10:20 -0500 (0:00:10.296) 0:00:13.016 ********
ok: [managed-node1] => { "storage_pools": "VARIABLE IS NOT DEFINED!: 'storage_pools' is undefined" }

TASK [fedora.linux_system_roles.storage : Show storage_volumes] ****************
task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:14
Friday 17 January 2025 04:10:20 -0500 (0:00:00.050) 0:00:13.067 ********
ok: [managed-node1] => { "storage_volumes": "VARIABLE IS NOT DEFINED!: 'storage_volumes' is undefined" }

TASK [fedora.linux_system_roles.storage : Get required packages] ***************
task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:19
Friday 17 January 2025 04:10:20 -0500 (0:00:00.050) 0:00:13.117 ********
ok: [managed-node1] => { "actions": [], "changed": false, "crypts": [], "leaves": [], "mounts": [], "packages": [], "pools": [], "volumes": [] }

TASK [fedora.linux_system_roles.storage : Enable copr repositories if needed] ***
task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:31
Friday 17 January 2025 04:10:21 -0500 (0:00:00.655) 0:00:13.773 ********
included: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml for managed-node1

TASK [fedora.linux_system_roles.storage : Check if the COPR support packages should be installed] ***
task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:2
Friday 17 January 2025 04:10:21 -0500 (0:00:00.085) 0:00:13.859 ********

TASK [fedora.linux_system_roles.storage : Make sure COPR support packages are present] ***
task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:13
Friday 17 January 2025 04:10:21 -0500 (0:00:00.023) 0:00:13.882 ********
skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [fedora.linux_system_roles.storage : Enable COPRs] ************************
task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:19
Friday 17 January 2025 04:10:21 -0500 (0:00:00.025) 0:00:13.907 ********

TASK [fedora.linux_system_roles.storage : Make sure required packages are installed] ***
task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:37
Friday 17 January 2025 04:10:21 -0500 (0:00:00.023) 0:00:13.930 ********
ok: [managed-node1] => { "changed": false, "rc": 0, "results": [ "kpartx-0.4.9-136.el7_9.x86_64 providing kpartx is already installed" ] }
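The ten-second "Make sure blivet is available" step above is a plain package install of blivet_package_list, whose value comes from the CentOS_7.yml vars loaded earlier (with the s390x Jinja conditional already rendered to libblockdev for this x86_64 host). A plausible shape for that task, assuming the role passes the list straight to the package module:

- name: Make sure blivet is available
  package:
    name: "{{ blivet_package_list }}"
    state: present
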
TASK [fedora.linux_system_roles.storage : Get service facts] *******************
task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:51
Friday 17 January 2025 04:10:21 -0500 (0:00:00.591) 0:00:14.521 ********
ok: [managed-node1] => { "ansible_facts": { "services": { "NetworkManager-dispatcher.service": { "name": "NetworkManager-dispatcher.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "NetworkManager-wait-online.service": { "name": "NetworkManager-wait-online.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "NetworkManager.service": { "name": "NetworkManager.service", "source": "systemd", "state": "running", "status": "enabled" }, "arp-ethers.service": { "name": "arp-ethers.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "auditd.service": { "name": "auditd.service", "source": "systemd", "state": "running", "status": "enabled" }, "auth-rpcgss-module.service": { "name": "auth-rpcgss-module.service", "source": "systemd", "state": "stopped", "status": "static" }, "autovt@.service": { "name": "autovt@.service", "source": "systemd", "state": "unknown", "status": "enabled" }, "blivet.service": { "name": "blivet.service", "source": "systemd", "state": "inactive", "status": "static" }, "blk-availability.service": { "name": "blk-availability.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "brandbot.service": { "name": "brandbot.service", "source": "systemd", "state": "inactive", "status": "static" }, "chrony-dnssrv@.service": { "name": "chrony-dnssrv@.service", "source": "systemd", "state": "unknown", "status": "static" }, "chrony-wait.service": { "name": "chrony-wait.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "chronyd.service": { "name": "chronyd.service", "source": "systemd", "state": "running", "status": "enabled" }, "cloud-config.service": { "name": "cloud-config.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-final.service": { "name": "cloud-final.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-init-local.service": { "name": "cloud-init-local.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-init.service": { "name": "cloud-init.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "console-getty.service": { "name": "console-getty.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "console-shell.service": { "name": "console-shell.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "container-getty@.service": { "name": "container-getty@.service", "source": "systemd", "state": "unknown", "status": "static" }, "cpupower.service": { "name": "cpupower.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "crond.service": { "name": "crond.service", "source": "systemd", "state": "running", "status": "enabled" }, "dbus-org.freedesktop.hostname1.service": { "name": "dbus-org.freedesktop.hostname1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus-org.freedesktop.import1.service": { "name": "dbus-org.freedesktop.import1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus-org.freedesktop.locale1.service": { "name": "dbus-org.freedesktop.locale1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus-org.freedesktop.login1.service": { "name": "dbus-org.freedesktop.login1.service", "source": "systemd", "state": "active", "status": 
"static" }, "dbus-org.freedesktop.machine1.service": { "name": "dbus-org.freedesktop.machine1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus-org.freedesktop.nm-dispatcher.service": { "name": "dbus-org.freedesktop.nm-dispatcher.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "dbus-org.freedesktop.timedate1.service": { "name": "dbus-org.freedesktop.timedate1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus.service": { "name": "dbus.service", "source": "systemd", "state": "running", "status": "static" }, "debug-shell.service": { "name": "debug-shell.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "dm-event.service": { "name": "dm-event.service", "source": "systemd", "state": "stopped", "status": "static" }, "dmraid-activation.service": { "name": "dmraid-activation.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "dracut-cmdline.service": { "name": "dracut-cmdline.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-initqueue.service": { "name": "dracut-initqueue.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-mount.service": { "name": "dracut-mount.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-mount.service": { "name": "dracut-pre-mount.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-pivot.service": { "name": "dracut-pre-pivot.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-trigger.service": { "name": "dracut-pre-trigger.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-udev.service": { "name": "dracut-pre-udev.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-shutdown.service": { "name": "dracut-shutdown.service", "source": "systemd", "state": "stopped", "status": "static" }, "ebtables.service": { "name": "ebtables.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "emergency.service": { "name": "emergency.service", "source": "systemd", "state": "stopped", "status": "static" }, "firewalld.service": { "name": "firewalld.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "fstrim.service": { "name": "fstrim.service", "source": "systemd", "state": "inactive", "status": "static" }, "getty@.service": { "name": "getty@.service", "source": "systemd", "state": "unknown", "status": "enabled" }, "getty@tty1.service": { "name": "getty@tty1.service", "source": "systemd", "state": "running", "status": "unknown" }, "gssproxy.service": { "name": "gssproxy.service", "source": "systemd", "state": "running", "status": "disabled" }, "halt-local.service": { "name": "halt-local.service", "source": "systemd", "state": "inactive", "status": "static" }, "initrd-cleanup.service": { "name": "initrd-cleanup.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-parse-etc.service": { "name": "initrd-parse-etc.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-switch-root.service": { "name": "initrd-switch-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-udevadm-cleanup-db.service": { "name": "initrd-udevadm-cleanup-db.service", "source": "systemd", "state": "stopped", "status": "static" }, "iprdump.service": { "name": "iprdump.service", "source": "systemd", "state": "inactive", "status": "disabled" }, 
"iprinit.service": { "name": "iprinit.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "iprupdate.service": { "name": "iprupdate.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "irqbalance.service": { "name": "irqbalance.service", "source": "systemd", "state": "running", "status": "enabled" }, "kdump.service": { "name": "kdump.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "kmod-static-nodes.service": { "name": "kmod-static-nodes.service", "source": "systemd", "state": "stopped", "status": "static" }, "lvm2-lvmetad.service": { "name": "lvm2-lvmetad.service", "source": "systemd", "state": "running", "status": "static" }, "lvm2-lvmpolld.service": { "name": "lvm2-lvmpolld.service", "source": "systemd", "state": "stopped", "status": "static" }, "lvm2-monitor.service": { "name": "lvm2-monitor.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "lvm2-pvscan@.service": { "name": "lvm2-pvscan@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdadm-grow-continue@.service": { "name": "mdadm-grow-continue@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdadm-last-resort@.service": { "name": "mdadm-last-resort@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdcheck_continue.service": { "name": "mdcheck_continue.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdcheck_start.service": { "name": "mdcheck_start.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdmon@.service": { "name": "mdmon@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdmonitor-oneshot.service": { "name": "mdmonitor-oneshot.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdmonitor.service": { "name": "mdmonitor.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "messagebus.service": { "name": "messagebus.service", "source": "systemd", "state": "active", "status": "static" }, "microcode.service": { "name": "microcode.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "netconsole": { "name": "netconsole", "source": "sysv", "state": "stopped", "status": "disabled" }, "network": { "name": "network", "source": "sysv", "state": "running", "status": "enabled" }, "network.service": { "name": "network.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "nfs-blkmap.service": { "name": "nfs-blkmap.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfs-config.service": { "name": "nfs-config.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-idmap.service": { "name": "nfs-idmap.service", "source": "systemd", "state": "inactive", "status": "static" }, "nfs-idmapd.service": { "name": "nfs-idmapd.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-lock.service": { "name": "nfs-lock.service", "source": "systemd", "state": "inactive", "status": "static" }, "nfs-mountd.service": { "name": "nfs-mountd.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-rquotad.service": { "name": "nfs-rquotad.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfs-secure.service": { "name": "nfs-secure.service", "source": "systemd", "state": "inactive", "status": "static" }, "nfs-server.service": { "name": "nfs-server.service", "source": "systemd", "state": "stopped", "status": "disabled" }, 
"nfs-utils.service": { "name": "nfs-utils.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs.service": { "name": "nfs.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfslock.service": { "name": "nfslock.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-halt.service": { "name": "plymouth-halt.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "plymouth-kexec.service": { "name": "plymouth-kexec.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "plymouth-poweroff.service": { "name": "plymouth-poweroff.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "plymouth-quit-wait.service": { "name": "plymouth-quit-wait.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "plymouth-quit.service": { "name": "plymouth-quit.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "plymouth-read-write.service": { "name": "plymouth-read-write.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "plymouth-reboot.service": { "name": "plymouth-reboot.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "plymouth-start.service": { "name": "plymouth-start.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "plymouth-switch-root.service": { "name": "plymouth-switch-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "polkit.service": { "name": "polkit.service", "source": "systemd", "state": "running", "status": "static" }, "postfix.service": { "name": "postfix.service", "source": "systemd", "state": "running", "status": "enabled" }, "qemu-guest-agent.service": { "name": "qemu-guest-agent.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "quotaon.service": { "name": "quotaon.service", "source": "systemd", "state": "inactive", "status": "static" }, "rc-local.service": { "name": "rc-local.service", "source": "systemd", "state": "stopped", "status": "static" }, "rdisc.service": { "name": "rdisc.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rescue.service": { "name": "rescue.service", "source": "systemd", "state": "stopped", "status": "static" }, "restraintd.service": { "name": "restraintd.service", "source": "systemd", "state": "running", "status": "enabled" }, "rhel-autorelabel-mark.service": { "name": "rhel-autorelabel-mark.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "rhel-autorelabel.service": { "name": "rhel-autorelabel.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "rhel-configure.service": { "name": "rhel-configure.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "rhel-dmesg.service": { "name": "rhel-dmesg.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "rhel-domainname.service": { "name": "rhel-domainname.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "rhel-import-state.service": { "name": "rhel-import-state.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "rhel-loadmodules.service": { "name": "rhel-loadmodules.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "rhel-readonly.service": { "name": "rhel-readonly.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "rngd.service": { "name": "rngd.service", "source": "systemd", "state": "running", "status": 
"enabled" }, "rpc-gssd.service": { "name": "rpc-gssd.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-rquotad.service": { "name": "rpc-rquotad.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rpc-statd-notify.service": { "name": "rpc-statd-notify.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-statd.service": { "name": "rpc-statd.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpcbind.service": { "name": "rpcbind.service", "source": "systemd", "state": "running", "status": "enabled" }, "rpcgssd.service": { "name": "rpcgssd.service", "source": "systemd", "state": "inactive", "status": "static" }, "rpcidmapd.service": { "name": "rpcidmapd.service", "source": "systemd", "state": "inactive", "status": "static" }, "rsyncd.service": { "name": "rsyncd.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rsyncd@.service": { "name": "rsyncd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "rsyslog.service": { "name": "rsyslog.service", "source": "systemd", "state": "running", "status": "enabled" }, "selinux-policy-migrate-local-changes@.service": { "name": "selinux-policy-migrate-local-changes@.service", "source": "systemd", "state": "unknown", "status": "static" }, "selinux-policy-migrate-local-changes@targeted.service": { "name": "selinux-policy-migrate-local-changes@targeted.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "serial-getty@.service": { "name": "serial-getty@.service", "source": "systemd", "state": "unknown", "status": "disabled" }, "serial-getty@ttyS0.service": { "name": "serial-getty@ttyS0.service", "source": "systemd", "state": "running", "status": "unknown" }, "sshd-keygen.service": { "name": "sshd-keygen.service", "source": "systemd", "state": "stopped", "status": "static" }, "sshd.service": { "name": "sshd.service", "source": "systemd", "state": "running", "status": "enabled" }, "sshd@.service": { "name": "sshd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-ask-password-console.service": { "name": "systemd-ask-password-console.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-ask-password-plymouth.service": { "name": "systemd-ask-password-plymouth.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-ask-password-wall.service": { "name": "systemd-ask-password-wall.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-backlight@.service": { "name": "systemd-backlight@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-binfmt.service": { "name": "systemd-binfmt.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-bootchart.service": { "name": "systemd-bootchart.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "systemd-firstboot.service": { "name": "systemd-firstboot.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-fsck-root.service": { "name": "systemd-fsck-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-fsck@.service": { "name": "systemd-fsck@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-halt.service": { "name": "systemd-halt.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hibernate-resume@.service": { "name": "systemd-hibernate-resume@.service", 
"source": "systemd", "state": "unknown", "status": "static" }, "systemd-hibernate.service": { "name": "systemd-hibernate.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hostnamed.service": { "name": "systemd-hostnamed.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hwdb-update.service": { "name": "systemd-hwdb-update.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-hybrid-sleep.service": { "name": "systemd-hybrid-sleep.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-importd.service": { "name": "systemd-importd.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-initctl.service": { "name": "systemd-initctl.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journal-catalog-update.service": { "name": "systemd-journal-catalog-update.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journal-flush.service": { "name": "systemd-journal-flush.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journald.service": { "name": "systemd-journald.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-kexec.service": { "name": "systemd-kexec.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-localed.service": { "name": "systemd-localed.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-logind.service": { "name": "systemd-logind.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-machine-id-commit.service": { "name": "systemd-machine-id-commit.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-machined.service": { "name": "systemd-machined.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-modules-load.service": { "name": "systemd-modules-load.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-nspawn@.service": { "name": "systemd-nspawn@.service", "source": "systemd", "state": "unknown", "status": "disabled" }, "systemd-poweroff.service": { "name": "systemd-poweroff.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-quotacheck.service": { "name": "systemd-quotacheck.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-random-seed.service": { "name": "systemd-random-seed.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-readahead-collect.service": { "name": "systemd-readahead-collect.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "systemd-readahead-done.service": { "name": "systemd-readahead-done.service", "source": "systemd", "state": "stopped", "status": "indirect" }, "systemd-readahead-drop.service": { "name": "systemd-readahead-drop.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "systemd-readahead-replay.service": { "name": "systemd-readahead-replay.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "systemd-reboot.service": { "name": "systemd-reboot.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-remount-fs.service": { "name": "systemd-remount-fs.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-rfkill@.service": { "name": "systemd-rfkill@.service", "source": "systemd", "state": "unknown", "status": 
"static" }, "systemd-shutdownd.service": { "name": "systemd-shutdownd.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-suspend.service": { "name": "systemd-suspend.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-sysctl.service": { "name": "systemd-sysctl.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-timedated.service": { "name": "systemd-timedated.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-tmpfiles-clean.service": { "name": "systemd-tmpfiles-clean.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles-setup-dev.service": { "name": "systemd-tmpfiles-setup-dev.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles-setup.service": { "name": "systemd-tmpfiles-setup.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udev-settle.service": { "name": "systemd-udev-settle.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udev-trigger.service": { "name": "systemd-udev-trigger.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udevd.service": { "name": "systemd-udevd.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-update-done.service": { "name": "systemd-update-done.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-update-utmp-runlevel.service": { "name": "systemd-update-utmp-runlevel.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-update-utmp.service": { "name": "systemd-update-utmp.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-user-sessions.service": { "name": "systemd-user-sessions.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-vconsole-setup.service": { "name": "systemd-vconsole-setup.service", "source": "systemd", "state": "stopped", "status": "static" }, "target.service": { "name": "target.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "targetclid.service": { "name": "targetclid.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "teamd@.service": { "name": "teamd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "tuned.service": { "name": "tuned.service", "source": "systemd", "state": "running", "status": "enabled" }, "wpa_supplicant.service": { "name": "wpa_supplicant.service", "source": "systemd", "state": "inactive", "status": "disabled" } } }, "changed": false } TASK [fedora.linux_system_roles.storage : Set storage_cryptsetup_services] ***** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:57 Friday 17 January 2025 04:10:22 -0500 (0:00:01.129) 0:00:15.651 ******** ok: [managed-node1] => { "ansible_facts": { "storage_cryptsetup_services": [] }, "changed": false } TASK [fedora.linux_system_roles.storage : Mask the systemd cryptsetup services] *** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:63 Friday 17 January 2025 04:10:23 -0500 (0:00:00.085) 0:00:15.737 ******** TASK [fedora.linux_system_roles.storage : Manage the pools and volumes to match the specified state] *** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:69 Friday 17 January 2025 
Friday 17 January 2025 04:10:23 -0500 (0:00:00.044) 0:00:15.782 ********
ok: [managed-node1] => { "actions": [], "changed": false, "crypts": [], "leaves": [], "mounts": [], "packages": [], "pools": [], "volumes": [] }

TASK [fedora.linux_system_roles.storage : Workaround for udev issue on some platforms] ***
task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:83
Friday 17 January 2025 04:10:23 -0500 (0:00:00.457) 0:00:16.239 ********
skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [fedora.linux_system_roles.storage : Check if /etc/fstab is present] ******
task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:90
Friday 17 January 2025 04:10:23 -0500 (0:00:00.033) 0:00:16.273 ********
ok: [managed-node1] => { "changed": false, "stat": { "atime": 1737104722.005961, "attr_flags": "e", "attributes": [ "extents" ], "block_size": 4096, "blocks": 8, "charset": "us-ascii", "checksum": "4db69458c23204aa354c1fce8c724ba0713d6623", "ctime": 1718881114.40265, "dev": 51713, "device_type": 0, "executable": false, "exists": true, "gid": 0, "gr_name": "root", "inode": 131078, "isblk": false, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": true, "issock": false, "isuid": false, "mimetype": "text/plain", "mode": "0644", "mtime": 1718881114.40265, "nlink": 1, "path": "/etc/fstab", "pw_name": "root", "readable": true, "rgrp": true, "roth": true, "rusr": true, "size": 1207, "uid": 0, "version": "18446744072852913878", "wgrp": false, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } }

TASK [fedora.linux_system_roles.storage : Add fingerprint to /etc/fstab if present] ***
task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:95
Friday 17 January 2025 04:10:23 -0500 (0:00:00.314) 0:00:16.588 ********
skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [fedora.linux_system_roles.storage : Unmask the systemd cryptsetup services] ***
task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:113
Friday 17 January 2025 04:10:23 -0500 (0:00:00.037) 0:00:16.625 ********

TASK [fedora.linux_system_roles.storage : Show blivet_output] ******************
task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:119
Friday 17 January 2025 04:10:23 -0500 (0:00:00.031) 0:00:16.656 ********
ok: [managed-node1] => { "blivet_output": { "actions": [], "changed": false, "crypts": [], "failed": false, "leaves": [], "mounts": [], "packages": [], "pools": [], "volumes": [] } }

TASK [fedora.linux_system_roles.storage : Set the list of pools for test verification] ***
task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:128
Friday 17 January 2025 04:10:23 -0500 (0:00:00.038) 0:00:16.694 ********
ok: [managed-node1] => { "ansible_facts": { "_storage_pools_list": [] }, "changed": false }

TASK [fedora.linux_system_roles.storage : Set the list of volumes for test verification] ***
task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:132
Friday 17 January 2025 04:10:24 -0500 (0:00:00.036) 0:00:16.731 ********
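The stat result above is what gates the task that follows it: a fingerprint comment is only written when /etc/fstab already exists. A minimal sketch of the check, with the register name and the lineinfile shape invented for illustration:

- name: Check if /etc/fstab is present
  stat:
    path: /etc/fstab
  register: __storage_fstab  # hypothetical register name

- name: Add fingerprint to /etc/fstab if present
  lineinfile:  # illustrative; the role's actual mechanism may differ
    path: /etc/fstab
    line: "# system_role:storage"
    insertbefore: BOF
  when: __storage_fstab.stat.exists
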
"_storage_volumes_list": [] }, "changed": false } TASK [fedora.linux_system_roles.storage : Remove obsolete mounts] ************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:148 Friday 17 January 2025 04:10:24 -0500 (0:00:00.038) 0:00:16.769 ******** TASK [fedora.linux_system_roles.storage : Tell systemd to refresh its view of /etc/fstab] *** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:159 Friday 17 January 2025 04:10:24 -0500 (0:00:00.043) 0:00:16.813 ******** skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Set up new/current mounts] *********** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:164 Friday 17 January 2025 04:10:24 -0500 (0:00:00.033) 0:00:16.847 ******** TASK [fedora.linux_system_roles.storage : Manage mount ownership/permissions] *** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:175 Friday 17 January 2025 04:10:24 -0500 (0:00:00.032) 0:00:16.880 ******** TASK [fedora.linux_system_roles.storage : Tell systemd to refresh its view of /etc/fstab] *** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:187 Friday 17 January 2025 04:10:24 -0500 (0:00:00.032) 0:00:16.912 ******** skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Retrieve facts for the /etc/crypttab file] *** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:195 Friday 17 January 2025 04:10:24 -0500 (0:00:00.032) 0:00:16.945 ******** ok: [managed-node1] => { "changed": false, "stat": { "atime": 1737104945.097923, "attr_flags": "e", "attributes": [ "extents" ], "block_size": 4096, "blocks": 0, "charset": "binary", "checksum": "da39a3ee5e6b4b0d3255bfef95601890afd80709", "ctime": 1718879272.062, "dev": 51713, "device_type": 0, "executable": false, "exists": true, "gid": 0, "gr_name": "root", "inode": 131079, "isblk": false, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": true, "issock": false, "isuid": false, "mimetype": "inode/x-empty", "mode": "0600", "mtime": 1718879026.308, "nlink": 1, "path": "/etc/crypttab", "pw_name": "root", "readable": true, "rgrp": false, "roth": false, "rusr": true, "size": 0, "uid": 0, "version": "18446744072852913879", "wgrp": false, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [fedora.linux_system_roles.storage : Manage /etc/crypttab to account for changes we just made] *** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:200 Friday 17 January 2025 04:10:24 -0500 (0:00:00.311) 0:00:17.256 ******** TASK [fedora.linux_system_roles.storage : Update facts] ************************ task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:222 Friday 17 January 2025 04:10:24 -0500 (0:00:00.032) 0:00:17.289 ******** ok: [managed-node1] TASK [Get unused disks] ******************************************************** task path: 
task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/tests_luks2.yml:76
Friday 17 January 2025 04:10:25 -0500 (0:00:00.654) 0:00:17.943 ********
included: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/get_unused_disk.yml for managed-node1
TASK [Ensure test packages] ****************************************************
task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/get_unused_disk.yml:2
Friday 17 January 2025 04:10:25 -0500 (0:00:00.064) 0:00:18.007 ********
ok: [managed-node1] => { "changed": false, "rc": 0, "results": [ "util-linux-2.23.2-65.el7_9.1.x86_64 providing util-linux is already installed" ] }
TASK [Find unused disks in the system] *****************************************
task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/get_unused_disk.yml:11
Friday 17 January 2025 04:10:25 -0500 (0:00:00.633) 0:00:18.641 ********
ok: [managed-node1] => { "changed": false, "disks": [ "sda" ], "info": [ "Line: NAME=\"/dev/sda\" TYPE=\"disk\" SIZE=\"10737418240\" FSTYPE=\"\" LOG-SEC=\"512\"", "Line: NAME=\"/dev/sdb\" TYPE=\"disk\" SIZE=\"10737418240\" FSTYPE=\"\" LOG-SEC=\"512\"", "Line: NAME=\"/dev/sdc\" TYPE=\"disk\" SIZE=\"10737418240\" FSTYPE=\"\" LOG-SEC=\"512\"", "Line: NAME=\"/dev/sdd\" TYPE=\"disk\" SIZE=\"1099511627776\" FSTYPE=\"\" LOG-SEC=\"512\"", "Line: NAME=\"/dev/sde\" TYPE=\"disk\" SIZE=\"1099511627776\" FSTYPE=\"\" LOG-SEC=\"512\"", "Line: NAME=\"/dev/sdf\" TYPE=\"disk\" SIZE=\"10737418240\" FSTYPE=\"\" LOG-SEC=\"512\"", "Line: NAME=\"/dev/sdg\" TYPE=\"disk\" SIZE=\"1099511627776\" FSTYPE=\"\" LOG-SEC=\"512\"", "Line: NAME=\"/dev/sdh\" TYPE=\"disk\" SIZE=\"10737418240\" FSTYPE=\"\" LOG-SEC=\"512\"", "Line: NAME=\"/dev/sdi\" TYPE=\"disk\" SIZE=\"10737418240\" FSTYPE=\"\" LOG-SEC=\"512\"", "Line: NAME=\"/dev/xvda\" TYPE=\"disk\" SIZE=\"268435456000\" FSTYPE=\"\" LOG-SEC=\"512\"", "Line: NAME=\"/dev/xvda1\" TYPE=\"part\" SIZE=\"268434390528\" FSTYPE=\"ext4\" LOG-SEC=\"512\"", "Line type [part] is not disk: NAME=\"/dev/xvda1\" TYPE=\"part\" SIZE=\"268434390528\" FSTYPE=\"ext4\" LOG-SEC=\"512\"", "filename [xvda1] is a partition", "Disk [/dev/xvda] attrs [{'fstype': '', 'type': 'disk', 'ssize': '512', 'size': '268435456000'}] has partitions" ] }
TASK [Debug why there are no unused disks] *************************************
task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/get_unused_disk.yml:20
Friday 17 January 2025 04:10:26 -0500 (0:00:00.549) 0:00:19.190 ********
skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" }
TASK [Set unused_disks if necessary] *******************************************
task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/get_unused_disk.yml:29
Friday 17 January 2025 04:10:26 -0500 (0:00:00.051) 0:00:19.241 ********
ok: [managed-node1] => { "ansible_facts": { "unused_disks": [ "sda" ] }, "changed": false }
TASK [Exit playbook when there's not enough unused disks in the system] ********
task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/get_unused_disk.yml:34
Friday 17 January 2025 04:10:26 -0500 (0:00:00.058) 0:00:19.300 ********
skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" }
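Each "Line:" entry above is one KEY="value" record per block device, the pairs format that lsblk emits; the test then discards anything that is not a bare, empty disk, so /dev/xvda1 is rejected as a partition, /dev/xvda is rejected for holding it, and sda becomes the first acceptable candidate. A rough standalone equivalent of the listing step (the test uses its own helper module, so the exact flags here are an assumption):

- hosts: managed-node1
  tasks:
    # -P prints KEY="value" pairs, -p prints full /dev paths, -b sizes
    # in bytes; LOG-SEC is the logical sector size column.
    - name: Show the block-device view the test parses (sketch)
      command: lsblk -p -P -b -o NAME,TYPE,SIZE,FSTYPE,LOG-SEC
      register: lsblk_out
      changed_when: false

TASK [Print unused disks] ******************************************************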
task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/get_unused_disk.yml:39
Friday 17 January 2025 04:10:26 -0500 (0:00:00.050) 0:00:19.351 ********
ok: [managed-node1] => { "unused_disks": [ "sda" ] }
TASK [Test for correct handling of new encrypted volume w/ no key] *************
task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/tests_luks2.yml:85
Friday 17 January 2025 04:10:26 -0500 (0:00:00.055) 0:00:19.406 ********
included: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-failed.yml for managed-node1
TASK [Store global variable value copy] ****************************************
task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-failed.yml:4
Friday 17 January 2025 04:10:26 -0500 (0:00:00.105) 0:00:19.511 ********
ok: [managed-node1] => { "ansible_facts": { "storage_pools_global": [], "storage_safe_mode_global": true, "storage_volumes_global": [] }, "changed": false }
TASK [Verify role raises correct error] ****************************************
task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-failed.yml:10
Friday 17 January 2025 04:10:26 -0500 (0:00:00.050) 0:00:19.561 ********
TASK [fedora.linux_system_roles.storage : Set platform/version specific variables] ***
task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:2
Friday 17 January 2025 04:10:26 -0500 (0:00:00.069) 0:00:19.631 ********
included: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml for managed-node1
TASK [fedora.linux_system_roles.storage : Ensure ansible_facts used by role] ***
task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:2
Friday 17 January 2025 04:10:27 -0500 (0:00:00.111) 0:00:19.742 ********
skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" }
TASK [fedora.linux_system_roles.storage : Set platform/version specific variables] ***
task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:7
Friday 17 January 2025 04:10:27 -0500 (0:00:00.042) 0:00:19.785 ********
skipping: [managed-node1] => (item=RedHat.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat.yml", "skip_reason": "Conditional result was False" }
skipping: [managed-node1] => (item=CentOS.yml) => { "ansible_loop_var": "item", "changed": false, "item": "CentOS.yml", "skip_reason": "Conditional result was False" }
ok: [managed-node1] => (item=CentOS_7.yml) => { "ansible_facts": { "__storage_blivet_diskvolume_mkfs_option_map": { "ext2": "-F", "ext3": "-F", "ext4": "-F" }, "blivet_package_list": [ "python-enum34", "python-blivet3", "libblockdev-crypto", "libblockdev-dm", "libblockdev-lvm", "libblockdev-mdraid", "libblockdev-swap", "{{ 'libblockdev-s390' if ansible_architecture == 's390x' else 'libblockdev' }}" ] }, "ansible_included_var_files": [ "/tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/vars/CentOS_7.yml" ], "ansible_loop_var": "item", "changed": false, "item": "CentOS_7.yml" }
skipping: [managed-node1] => (item=CentOS_7.9.yml) => { "ansible_loop_var": "item", "changed": false, "item": "CentOS_7.9.yml", "skip_reason": "Conditional result was False" }
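The loop above is the role's platform-variable dispatch: candidate vars files are tried from generic to specific (RedHat.yml, CentOS.yml, CentOS_7.yml, CentOS_7.9.yml), and on this CentOS 7.9 host only CentOS_7.yml matches, supplying blivet_package_list and the mkfs option map. One way to approximate that precedence in a standalone play is with_first_found, most specific first (an illustration of the pattern, not the role's exact logic):

- hosts: managed-node1
  tasks:
    # first_found returns the first file in the list that exists;
    # skip: true turns "none found" into a no-op instead of a failure.
    - name: Load the most specific platform vars available (sketch)
      include_vars: "{{ item }}"
      with_first_found:
        - files:
            - CentOS_7.9.yml
            - CentOS_7.yml
            - CentOS.yml
            - RedHat.yml
          paths:
            - vars
          skip: true

TASK [fedora.linux_system_roles.storage : Check if system is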
ostree] *********** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:25 Friday 17 January 2025 04:10:27 -0500 (0:00:00.090) 0:00:19.875 ******** skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Set flag to indicate system is ostree] *** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:30 Friday 17 January 2025 04:10:27 -0500 (0:00:00.044) 0:00:19.920 ******** skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Define an empty list of pools to be used in testing] *** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:5 Friday 17 January 2025 04:10:27 -0500 (0:00:00.035) 0:00:19.955 ******** ok: [managed-node1] => { "ansible_facts": { "_storage_pools_list": [] }, "changed": false } TASK [fedora.linux_system_roles.storage : Define an empty list of volumes to be used in testing] *** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:9 Friday 17 January 2025 04:10:27 -0500 (0:00:00.037) 0:00:19.992 ******** ok: [managed-node1] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false } TASK [fedora.linux_system_roles.storage : Include the appropriate provider tasks] *** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:13 Friday 17 January 2025 04:10:27 -0500 (0:00:00.039) 0:00:20.032 ******** included: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml for managed-node1 TASK [fedora.linux_system_roles.storage : Make sure blivet is available] ******* task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:2 Friday 17 January 2025 04:10:27 -0500 (0:00:00.114) 0:00:20.146 ******** ok: [managed-node1] => { "changed": false, "rc": 0, "results": [ "python-enum34-1.0.4-1.el7.noarch providing python-enum34 is already installed", "1:python2-blivet3-3.1.3-3.el7.noarch providing python-blivet3 is already installed", "libblockdev-crypto-2.18-5.el7.x86_64 providing libblockdev-crypto is already installed", "libblockdev-dm-2.18-5.el7.x86_64 providing libblockdev-dm is already installed", "libblockdev-lvm-2.18-5.el7.x86_64 providing libblockdev-lvm is already installed", "libblockdev-mdraid-2.18-5.el7.x86_64 providing libblockdev-mdraid is already installed", "libblockdev-swap-2.18-5.el7.x86_64 providing libblockdev-swap is already installed", "libblockdev-2.18-5.el7.x86_64 providing libblockdev is already installed" ] } TASK [fedora.linux_system_roles.storage : Show storage_pools] ****************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:9 Friday 17 January 2025 04:10:31 -0500 (0:00:03.949) 0:00:24.096 ******** ok: [managed-node1] => { "storage_pools": [] } TASK [fedora.linux_system_roles.storage : Show storage_volumes] **************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:14 Friday 17 January 2025 04:10:31 -0500 (0:00:00.070) 0:00:24.166 ******** ok: [managed-node1] => { "storage_volumes": [ { "disks": [ "sda" ], "encryption": true, "encryption_luks_version": "luks2", 
"mount_point": "/opt/test1", "name": "foo", "type": "disk" } ] } TASK [fedora.linux_system_roles.storage : Get required packages] *************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:19 Friday 17 January 2025 04:10:31 -0500 (0:00:00.068) 0:00:24.235 ******** ok: [managed-node1] => { "actions": [], "changed": false, "crypts": [], "leaves": [], "mounts": [], "packages": [ "cryptsetup" ], "pools": [], "volumes": [] } TASK [fedora.linux_system_roles.storage : Enable copr repositories if needed] *** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:31 Friday 17 January 2025 04:10:35 -0500 (0:00:04.281) 0:00:28.517 ******** included: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml for managed-node1 TASK [fedora.linux_system_roles.storage : Check if the COPR support packages should be installed] *** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:2 Friday 17 January 2025 04:10:35 -0500 (0:00:00.086) 0:00:28.604 ******** TASK [fedora.linux_system_roles.storage : Make sure COPR support packages are present] *** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:13 Friday 17 January 2025 04:10:35 -0500 (0:00:00.043) 0:00:28.648 ******** skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Enable COPRs] ************************ task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:19 Friday 17 January 2025 04:10:35 -0500 (0:00:00.045) 0:00:28.694 ******** TASK [fedora.linux_system_roles.storage : Make sure required packages are installed] *** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:37 Friday 17 January 2025 04:10:36 -0500 (0:00:00.039) 0:00:28.734 ******** changed: [managed-node1] => { "changed": true, "changes": { "installed": [ "cryptsetup" ] }, "rc": 0, "results": [ "kpartx-0.4.9-136.el7_9.x86_64 providing kpartx is already installed", "Loaded plugins: fastestmirror\nLoading mirror speeds from cached hostfile\n * epel: d2lzkl7pfhq30w.cloudfront.net\n * epel-debuginfo: d2lzkl7pfhq30w.cloudfront.net\n * epel-source: d2lzkl7pfhq30w.cloudfront.net\nResolving Dependencies\n--> Running transaction check\n---> Package cryptsetup.x86_64 0:2.0.3-6.el7 will be installed\n--> Finished Dependency Resolution\n\nDependencies Resolved\n\n================================================================================\n Package Arch Version Repository Size\n================================================================================\nInstalling:\n cryptsetup x86_64 2.0.3-6.el7 base 154 k\n\nTransaction Summary\n================================================================================\nInstall 1 Package\n\nTotal download size: 154 k\nInstalled size: 354 k\nDownloading packages:\nRunning transaction check\nRunning transaction test\nTransaction test succeeded\nRunning transaction\n Installing : cryptsetup-2.0.3-6.el7.x86_64 1/1 \n Verifying : cryptsetup-2.0.3-6.el7.x86_64 1/1 \n\nInstalled:\n cryptsetup.x86_64 0:2.0.3-6.el7 \n\nComplete!\n" ] } TASK [fedora.linux_system_roles.storage : Get service facts] ******************* task path: 
/tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:51 Friday 17 January 2025 04:10:44 -0500 (0:00:08.141) 0:00:36.876 ******** ok: [managed-node1] => { "ansible_facts": { "services": { "NetworkManager-dispatcher.service": { "name": "NetworkManager-dispatcher.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "NetworkManager-wait-online.service": { "name": "NetworkManager-wait-online.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "NetworkManager.service": { "name": "NetworkManager.service", "source": "systemd", "state": "running", "status": "enabled" }, "arp-ethers.service": { "name": "arp-ethers.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "auditd.service": { "name": "auditd.service", "source": "systemd", "state": "running", "status": "enabled" }, "auth-rpcgss-module.service": { "name": "auth-rpcgss-module.service", "source": "systemd", "state": "stopped", "status": "static" }, "autovt@.service": { "name": "autovt@.service", "source": "systemd", "state": "unknown", "status": "enabled" }, "blivet.service": { "name": "blivet.service", "source": "systemd", "state": "inactive", "status": "static" }, "blk-availability.service": { "name": "blk-availability.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "brandbot.service": { "name": "brandbot.service", "source": "systemd", "state": "inactive", "status": "static" }, "chrony-dnssrv@.service": { "name": "chrony-dnssrv@.service", "source": "systemd", "state": "unknown", "status": "static" }, "chrony-wait.service": { "name": "chrony-wait.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "chronyd.service": { "name": "chronyd.service", "source": "systemd", "state": "running", "status": "enabled" }, "cloud-config.service": { "name": "cloud-config.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-final.service": { "name": "cloud-final.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-init-local.service": { "name": "cloud-init-local.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-init.service": { "name": "cloud-init.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "console-getty.service": { "name": "console-getty.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "console-shell.service": { "name": "console-shell.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "container-getty@.service": { "name": "container-getty@.service", "source": "systemd", "state": "unknown", "status": "static" }, "cpupower.service": { "name": "cpupower.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "crond.service": { "name": "crond.service", "source": "systemd", "state": "running", "status": "enabled" }, "dbus-org.freedesktop.hostname1.service": { "name": "dbus-org.freedesktop.hostname1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus-org.freedesktop.import1.service": { "name": "dbus-org.freedesktop.import1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus-org.freedesktop.locale1.service": { "name": "dbus-org.freedesktop.locale1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus-org.freedesktop.login1.service": { "name": "dbus-org.freedesktop.login1.service", "source": "systemd", "state": "active", "status": 
"static" }, "dbus-org.freedesktop.machine1.service": { "name": "dbus-org.freedesktop.machine1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus-org.freedesktop.nm-dispatcher.service": { "name": "dbus-org.freedesktop.nm-dispatcher.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "dbus-org.freedesktop.timedate1.service": { "name": "dbus-org.freedesktop.timedate1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus.service": { "name": "dbus.service", "source": "systemd", "state": "running", "status": "static" }, "debug-shell.service": { "name": "debug-shell.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "dm-event.service": { "name": "dm-event.service", "source": "systemd", "state": "stopped", "status": "static" }, "dmraid-activation.service": { "name": "dmraid-activation.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "dracut-cmdline.service": { "name": "dracut-cmdline.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-initqueue.service": { "name": "dracut-initqueue.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-mount.service": { "name": "dracut-mount.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-mount.service": { "name": "dracut-pre-mount.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-pivot.service": { "name": "dracut-pre-pivot.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-trigger.service": { "name": "dracut-pre-trigger.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-udev.service": { "name": "dracut-pre-udev.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-shutdown.service": { "name": "dracut-shutdown.service", "source": "systemd", "state": "stopped", "status": "static" }, "ebtables.service": { "name": "ebtables.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "emergency.service": { "name": "emergency.service", "source": "systemd", "state": "stopped", "status": "static" }, "firewalld.service": { "name": "firewalld.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "fstrim.service": { "name": "fstrim.service", "source": "systemd", "state": "inactive", "status": "static" }, "getty@.service": { "name": "getty@.service", "source": "systemd", "state": "unknown", "status": "enabled" }, "getty@tty1.service": { "name": "getty@tty1.service", "source": "systemd", "state": "running", "status": "unknown" }, "gssproxy.service": { "name": "gssproxy.service", "source": "systemd", "state": "running", "status": "disabled" }, "halt-local.service": { "name": "halt-local.service", "source": "systemd", "state": "inactive", "status": "static" }, "initrd-cleanup.service": { "name": "initrd-cleanup.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-parse-etc.service": { "name": "initrd-parse-etc.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-switch-root.service": { "name": "initrd-switch-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-udevadm-cleanup-db.service": { "name": "initrd-udevadm-cleanup-db.service", "source": "systemd", "state": "stopped", "status": "static" }, "iprdump.service": { "name": "iprdump.service", "source": "systemd", "state": "inactive", "status": "disabled" }, 
"iprinit.service": { "name": "iprinit.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "iprupdate.service": { "name": "iprupdate.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "irqbalance.service": { "name": "irqbalance.service", "source": "systemd", "state": "running", "status": "enabled" }, "kdump.service": { "name": "kdump.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "kmod-static-nodes.service": { "name": "kmod-static-nodes.service", "source": "systemd", "state": "stopped", "status": "static" }, "lvm2-lvmetad.service": { "name": "lvm2-lvmetad.service", "source": "systemd", "state": "running", "status": "static" }, "lvm2-lvmpolld.service": { "name": "lvm2-lvmpolld.service", "source": "systemd", "state": "stopped", "status": "static" }, "lvm2-monitor.service": { "name": "lvm2-monitor.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "lvm2-pvscan@.service": { "name": "lvm2-pvscan@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdadm-grow-continue@.service": { "name": "mdadm-grow-continue@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdadm-last-resort@.service": { "name": "mdadm-last-resort@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdcheck_continue.service": { "name": "mdcheck_continue.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdcheck_start.service": { "name": "mdcheck_start.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdmon@.service": { "name": "mdmon@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdmonitor-oneshot.service": { "name": "mdmonitor-oneshot.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdmonitor.service": { "name": "mdmonitor.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "messagebus.service": { "name": "messagebus.service", "source": "systemd", "state": "active", "status": "static" }, "microcode.service": { "name": "microcode.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "netconsole": { "name": "netconsole", "source": "sysv", "state": "stopped", "status": "disabled" }, "network": { "name": "network", "source": "sysv", "state": "running", "status": "enabled" }, "network.service": { "name": "network.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "nfs-blkmap.service": { "name": "nfs-blkmap.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfs-config.service": { "name": "nfs-config.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-idmap.service": { "name": "nfs-idmap.service", "source": "systemd", "state": "inactive", "status": "static" }, "nfs-idmapd.service": { "name": "nfs-idmapd.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-lock.service": { "name": "nfs-lock.service", "source": "systemd", "state": "inactive", "status": "static" }, "nfs-mountd.service": { "name": "nfs-mountd.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-rquotad.service": { "name": "nfs-rquotad.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfs-secure.service": { "name": "nfs-secure.service", "source": "systemd", "state": "inactive", "status": "static" }, "nfs-server.service": { "name": "nfs-server.service", "source": "systemd", "state": "stopped", "status": "disabled" }, 
"nfs-utils.service": { "name": "nfs-utils.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs.service": { "name": "nfs.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfslock.service": { "name": "nfslock.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-halt.service": { "name": "plymouth-halt.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "plymouth-kexec.service": { "name": "plymouth-kexec.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "plymouth-poweroff.service": { "name": "plymouth-poweroff.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "plymouth-quit-wait.service": { "name": "plymouth-quit-wait.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "plymouth-quit.service": { "name": "plymouth-quit.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "plymouth-read-write.service": { "name": "plymouth-read-write.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "plymouth-reboot.service": { "name": "plymouth-reboot.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "plymouth-start.service": { "name": "plymouth-start.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "plymouth-switch-root.service": { "name": "plymouth-switch-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "polkit.service": { "name": "polkit.service", "source": "systemd", "state": "running", "status": "static" }, "postfix.service": { "name": "postfix.service", "source": "systemd", "state": "running", "status": "enabled" }, "qemu-guest-agent.service": { "name": "qemu-guest-agent.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "quotaon.service": { "name": "quotaon.service", "source": "systemd", "state": "inactive", "status": "static" }, "rc-local.service": { "name": "rc-local.service", "source": "systemd", "state": "stopped", "status": "static" }, "rdisc.service": { "name": "rdisc.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rescue.service": { "name": "rescue.service", "source": "systemd", "state": "stopped", "status": "static" }, "restraintd.service": { "name": "restraintd.service", "source": "systemd", "state": "running", "status": "enabled" }, "rhel-autorelabel-mark.service": { "name": "rhel-autorelabel-mark.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "rhel-autorelabel.service": { "name": "rhel-autorelabel.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "rhel-configure.service": { "name": "rhel-configure.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "rhel-dmesg.service": { "name": "rhel-dmesg.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "rhel-domainname.service": { "name": "rhel-domainname.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "rhel-import-state.service": { "name": "rhel-import-state.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "rhel-loadmodules.service": { "name": "rhel-loadmodules.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "rhel-readonly.service": { "name": "rhel-readonly.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "rngd.service": { "name": "rngd.service", "source": "systemd", "state": "running", "status": 
"enabled" }, "rpc-gssd.service": { "name": "rpc-gssd.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-rquotad.service": { "name": "rpc-rquotad.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rpc-statd-notify.service": { "name": "rpc-statd-notify.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-statd.service": { "name": "rpc-statd.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpcbind.service": { "name": "rpcbind.service", "source": "systemd", "state": "running", "status": "enabled" }, "rpcgssd.service": { "name": "rpcgssd.service", "source": "systemd", "state": "inactive", "status": "static" }, "rpcidmapd.service": { "name": "rpcidmapd.service", "source": "systemd", "state": "inactive", "status": "static" }, "rsyncd.service": { "name": "rsyncd.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rsyncd@.service": { "name": "rsyncd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "rsyslog.service": { "name": "rsyslog.service", "source": "systemd", "state": "running", "status": "enabled" }, "selinux-policy-migrate-local-changes@.service": { "name": "selinux-policy-migrate-local-changes@.service", "source": "systemd", "state": "unknown", "status": "static" }, "selinux-policy-migrate-local-changes@targeted.service": { "name": "selinux-policy-migrate-local-changes@targeted.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "serial-getty@.service": { "name": "serial-getty@.service", "source": "systemd", "state": "unknown", "status": "disabled" }, "serial-getty@ttyS0.service": { "name": "serial-getty@ttyS0.service", "source": "systemd", "state": "running", "status": "unknown" }, "sshd-keygen.service": { "name": "sshd-keygen.service", "source": "systemd", "state": "stopped", "status": "static" }, "sshd.service": { "name": "sshd.service", "source": "systemd", "state": "running", "status": "enabled" }, "sshd@.service": { "name": "sshd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-ask-password-console.service": { "name": "systemd-ask-password-console.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-ask-password-plymouth.service": { "name": "systemd-ask-password-plymouth.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-ask-password-wall.service": { "name": "systemd-ask-password-wall.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-backlight@.service": { "name": "systemd-backlight@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-binfmt.service": { "name": "systemd-binfmt.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-bootchart.service": { "name": "systemd-bootchart.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "systemd-firstboot.service": { "name": "systemd-firstboot.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-fsck-root.service": { "name": "systemd-fsck-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-fsck@.service": { "name": "systemd-fsck@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-halt.service": { "name": "systemd-halt.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hibernate-resume@.service": { "name": "systemd-hibernate-resume@.service", 
"source": "systemd", "state": "unknown", "status": "static" }, "systemd-hibernate.service": { "name": "systemd-hibernate.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hostnamed.service": { "name": "systemd-hostnamed.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hwdb-update.service": { "name": "systemd-hwdb-update.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-hybrid-sleep.service": { "name": "systemd-hybrid-sleep.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-importd.service": { "name": "systemd-importd.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-initctl.service": { "name": "systemd-initctl.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journal-catalog-update.service": { "name": "systemd-journal-catalog-update.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journal-flush.service": { "name": "systemd-journal-flush.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journald.service": { "name": "systemd-journald.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-kexec.service": { "name": "systemd-kexec.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-localed.service": { "name": "systemd-localed.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-logind.service": { "name": "systemd-logind.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-machine-id-commit.service": { "name": "systemd-machine-id-commit.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-machined.service": { "name": "systemd-machined.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-modules-load.service": { "name": "systemd-modules-load.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-nspawn@.service": { "name": "systemd-nspawn@.service", "source": "systemd", "state": "unknown", "status": "disabled" }, "systemd-poweroff.service": { "name": "systemd-poweroff.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-quotacheck.service": { "name": "systemd-quotacheck.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-random-seed.service": { "name": "systemd-random-seed.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-readahead-collect.service": { "name": "systemd-readahead-collect.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "systemd-readahead-done.service": { "name": "systemd-readahead-done.service", "source": "systemd", "state": "stopped", "status": "indirect" }, "systemd-readahead-drop.service": { "name": "systemd-readahead-drop.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "systemd-readahead-replay.service": { "name": "systemd-readahead-replay.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "systemd-reboot.service": { "name": "systemd-reboot.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-remount-fs.service": { "name": "systemd-remount-fs.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-rfkill@.service": { "name": "systemd-rfkill@.service", "source": "systemd", "state": "unknown", "status": 
"static" }, "systemd-shutdownd.service": { "name": "systemd-shutdownd.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-suspend.service": { "name": "systemd-suspend.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-sysctl.service": { "name": "systemd-sysctl.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-timedated.service": { "name": "systemd-timedated.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-tmpfiles-clean.service": { "name": "systemd-tmpfiles-clean.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles-setup-dev.service": { "name": "systemd-tmpfiles-setup-dev.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles-setup.service": { "name": "systemd-tmpfiles-setup.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udev-settle.service": { "name": "systemd-udev-settle.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udev-trigger.service": { "name": "systemd-udev-trigger.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udevd.service": { "name": "systemd-udevd.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-update-done.service": { "name": "systemd-update-done.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-update-utmp-runlevel.service": { "name": "systemd-update-utmp-runlevel.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-update-utmp.service": { "name": "systemd-update-utmp.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-user-sessions.service": { "name": "systemd-user-sessions.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-vconsole-setup.service": { "name": "systemd-vconsole-setup.service", "source": "systemd", "state": "stopped", "status": "static" }, "target.service": { "name": "target.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "targetclid.service": { "name": "targetclid.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "teamd@.service": { "name": "teamd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "tuned.service": { "name": "tuned.service", "source": "systemd", "state": "running", "status": "enabled" }, "wpa_supplicant.service": { "name": "wpa_supplicant.service", "source": "systemd", "state": "inactive", "status": "disabled" } } }, "changed": false } TASK [fedora.linux_system_roles.storage : Set storage_cryptsetup_services] ***** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:57 Friday 17 January 2025 04:10:45 -0500 (0:00:01.025) 0:00:37.901 ******** ok: [managed-node1] => { "ansible_facts": { "storage_cryptsetup_services": [] }, "changed": false } TASK [fedora.linux_system_roles.storage : Mask the systemd cryptsetup services] *** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:63 Friday 17 January 2025 04:10:45 -0500 (0:00:00.114) 0:00:38.015 ******** TASK [fedora.linux_system_roles.storage : Manage the pools and volumes to match the specified state] *** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:69 Friday 17 January 2025 
04:10:45 -0500 (0:00:00.055) 0:00:38.071 ********
fatal: [managed-node1]: FAILED! => { "actions": [], "changed": false, "crypts": [], "leaves": [], "mounts": [], "packages": [], "pools": [], "volumes": [] }
MSG:
encrypted volume 'foo' missing key/password
TASK [fedora.linux_system_roles.storage : Failed message] **********************
task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:109
Friday 17 January 2025 04:10:49 -0500 (0:00:04.104) 0:00:42.175 ********
fatal: [managed-node1]: FAILED! => { "changed": false }
MSG:
{u'_ansible_no_log': False, u'crypts': [], u'pools': [], u'leaves': [], u'changed': False, u'actions': [], u'failed': True, u'volumes': [], u'invocation': {u'module_args': {u'packages_only': False, u'disklabel_type': None, u'diskvolume_mkfs_option_map': {u'ext4': u'-F', u'ext3': u'-F', u'ext2': u'-F'}, u'safe_mode': True, u'pools': [], u'volumes': [{u'raid_metadata_version': None, u'raid_level': None, u'fs_type': u'xfs', u'mount_options': u'defaults', u'size': 10737418240, u'mount_point': u'/opt/test1', u'compression': None, u'encryption_password': None, u'encryption': True, u'mount_device_identifier': u'uuid', u'raid_device_count': None, u'state': u'present', u'vdo_pool_size': None, u'mount_mode': None, u'thin_pool_name': None, u'fs_overwrite_existing': True, u'encryption_key_size': None, u'deduplication': None, u'encryption_cipher': None, u'encryption_key': None, u'fs_label': u'', u'encryption_luks_version': u'luks2', u'raid_stripe_size': None, u'cache_size': 0, u'mount_user': None, u'raid_spare_count': None, u'cache_mode': None, u'name': u'foo', u'mount_group': None, u'type': u'disk', u'disks': [u'sda'], u'cached': False, u'thin_pool_size': None, u'thin': None, u'mount_check': 0, u'mount_passno': 0, u'raid_chunk_size': None, u'cache_devices': [], u'fs_create_options': u''}], u'pool_defaults': {u'raid_metadata_version': None, u'encryption_cipher': None, u'encryption_key': None, u'encryption_luks_version': None, u'raid_spare_count': None, u'grow_to_fill': False, u'encryption_password': None, u'encryption': False, u'disks': [], u'raid_level': None, u'raid_device_count': None, u'state': u'present', u'volumes': [], u'shared': False, u'type': u'lvm', u'encryption_key_size': None, u'raid_chunk_size': None}, u'volume_defaults': {u'raid_metadata_version': None, u'raid_level': None, u'fs_type': u'xfs', u'mount_options': u'defaults', u'size': 0, u'mount_point': u'', u'compression': None, u'encryption_password': None, u'encryption': False, u'mount_device_identifier': u'uuid', u'raid_device_count': None, u'state': u'present', u'vdo_pool_size': None, u'thin_pool_name': None, u'type': u'lvm', u'encryption_key_size': None, u'encryption_cipher': None, u'encryption_key': None, u'fs_label': u'', u'encryption_luks_version': None, u'raid_stripe_size': None, u'cache_size': 0, u'raid_spare_count': None, u'cache_mode': None, u'deduplication': None, u'cached': False, u'fs_overwrite_existing': True, u'disks': [], u'thin_pool_size': None, u'thin': None, u'mount_check': 0, u'mount_passno': 0, u'raid_chunk_size': None, u'cache_devices': [], u'fs_create_options': u''}, u'use_partitions': None}}, u'mounts': [], u'packages': [], u'msg': u"encrypted volume 'foo' missing key/password"}
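This is the failure the test was fishing for: a LUKS container cannot be created without a passphrase or keyfile, and with safe_mode: True (visible in the module_args above) the blivet module aborts before doing anything destructive. verify-role-failed.yml captures this result and, in the assertions that follow, checks both that the role failed and that the error message matches. The corrected invocation that comes next (tests_luks2.yml:101) differs only by supplying the password; as a sketch, with the test value taken from the run below (a real playbook would pull this from Ansible Vault rather than hard-coding it):

- hosts: managed-node1
  tasks:
    - name: Same volume spec, now with the required password (sketch)
      include_role:
        name: fedora.linux_system_roles.storage
      vars:
        storage_volumes:
          - name: foo
            type: disk
            disks:
              - sda
            mount_point: /opt/test1
            encryption: true
            encryption_luks_version: luks2
            encryption_password: yabbadabbadoo  # test-only value; use vault in practice

TASK [fedora.linux_system_roles.storage : Unmask the systemd cryptsetup services] ***
task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:113
Friday 17 January 2025 04:10:49 -0500 (0:00:00.069)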
0:00:42.245 ******** TASK [Check that we failed in the role] **************************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-failed.yml:23 Friday 17 January 2025 04:10:49 -0500 (0:00:00.049) 0:00:42.294 ******** ok: [managed-node1] => { "changed": false } MSG: All assertions passed TASK [Verify the blivet output and error message are correct] ****************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-failed.yml:28 Friday 17 January 2025 04:10:49 -0500 (0:00:00.065) 0:00:42.360 ******** ok: [managed-node1] => { "changed": false } MSG: All assertions passed TASK [Verify correct exception or error message] ******************************* task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-failed.yml:39 Friday 17 January 2025 04:10:49 -0500 (0:00:00.074) 0:00:42.435 ******** skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Create an encrypted disk volume w/ default fs] *************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/tests_luks2.yml:101 Friday 17 January 2025 04:10:49 -0500 (0:00:00.051) 0:00:42.486 ******** TASK [fedora.linux_system_roles.storage : Set platform/version specific variables] *** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:2 Friday 17 January 2025 04:10:49 -0500 (0:00:00.142) 0:00:42.629 ******** included: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml for managed-node1 TASK [fedora.linux_system_roles.storage : Ensure ansible_facts used by role] *** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:2 Friday 17 January 2025 04:10:50 -0500 (0:00:00.101) 0:00:42.731 ******** skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Set platform/version specific variables] *** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:7 Friday 17 January 2025 04:10:50 -0500 (0:00:00.113) 0:00:42.844 ******** skipping: [managed-node1] => (item=RedHat.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat.yml", "skip_reason": "Conditional result was False" } skipping: [managed-node1] => (item=CentOS.yml) => { "ansible_loop_var": "item", "changed": false, "item": "CentOS.yml", "skip_reason": "Conditional result was False" } ok: [managed-node1] => (item=CentOS_7.yml) => { "ansible_facts": { "__storage_blivet_diskvolume_mkfs_option_map": { "ext2": "-F", "ext3": "-F", "ext4": "-F" }, "blivet_package_list": [ "python-enum34", "python-blivet3", "libblockdev-crypto", "libblockdev-dm", "libblockdev-lvm", "libblockdev-mdraid", "libblockdev-swap", "{{ 'libblockdev-s390' if ansible_architecture == 's390x' else 'libblockdev' }}" ] }, "ansible_included_var_files": [ "/tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/vars/CentOS_7.yml" ], "ansible_loop_var": "item", "changed": false, "item": "CentOS_7.yml" } skipping: [managed-node1] => (item=CentOS_7.9.yml) => { "ansible_loop_var": "item", "changed": false, "item": "CentOS_7.9.yml", "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Check if 
system is ostree] *********** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:25 Friday 17 January 2025 04:10:50 -0500 (0:00:00.095) 0:00:42.940 ******** skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Set flag to indicate system is ostree] *** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:30 Friday 17 January 2025 04:10:50 -0500 (0:00:00.036) 0:00:42.976 ******** skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Define an empty list of pools to be used in testing] *** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:5 Friday 17 January 2025 04:10:50 -0500 (0:00:00.034) 0:00:43.010 ******** ok: [managed-node1] => { "ansible_facts": { "_storage_pools_list": [] }, "changed": false } TASK [fedora.linux_system_roles.storage : Define an empty list of volumes to be used in testing] *** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:9 Friday 17 January 2025 04:10:50 -0500 (0:00:00.035) 0:00:43.046 ******** ok: [managed-node1] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false } TASK [fedora.linux_system_roles.storage : Include the appropriate provider tasks] *** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:13 Friday 17 January 2025 04:10:50 -0500 (0:00:00.035) 0:00:43.082 ******** included: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml for managed-node1 TASK [fedora.linux_system_roles.storage : Make sure blivet is available] ******* task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:2 Friday 17 January 2025 04:10:50 -0500 (0:00:00.082) 0:00:43.165 ******** ok: [managed-node1] => { "changed": false, "rc": 0, "results": [ "python-enum34-1.0.4-1.el7.noarch providing python-enum34 is already installed", "1:python2-blivet3-3.1.3-3.el7.noarch providing python-blivet3 is already installed", "libblockdev-crypto-2.18-5.el7.x86_64 providing libblockdev-crypto is already installed", "libblockdev-dm-2.18-5.el7.x86_64 providing libblockdev-dm is already installed", "libblockdev-lvm-2.18-5.el7.x86_64 providing libblockdev-lvm is already installed", "libblockdev-mdraid-2.18-5.el7.x86_64 providing libblockdev-mdraid is already installed", "libblockdev-swap-2.18-5.el7.x86_64 providing libblockdev-swap is already installed", "libblockdev-2.18-5.el7.x86_64 providing libblockdev is already installed" ] } TASK [fedora.linux_system_roles.storage : Show storage_pools] ****************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:9 Friday 17 January 2025 04:10:54 -0500 (0:00:03.925) 0:00:47.090 ******** ok: [managed-node1] => { "storage_pools": "VARIABLE IS NOT DEFINED!: 'storage_pools' is undefined" } TASK [fedora.linux_system_roles.storage : Show storage_volumes] **************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:14 Friday 17 January 2025 04:10:54 -0500 (0:00:00.120) 0:00:47.211 ******** ok: [managed-node1] => { "storage_volumes": [ { "disks": [ "sda" 
], "encryption": true, "encryption_luks_version": "luks2", "encryption_password": "yabbadabbadoo", "mount_point": "/opt/test1", "name": "foo", "type": "disk" } ] } TASK [fedora.linux_system_roles.storage : Get required packages] *************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:19 Friday 17 January 2025 04:10:54 -0500 (0:00:00.080) 0:00:47.291 ******** ok: [managed-node1] => { "actions": [], "changed": false, "crypts": [], "leaves": [], "mounts": [], "packages": [ "cryptsetup" ], "pools": [], "volumes": [] } TASK [fedora.linux_system_roles.storage : Enable copr repositories if needed] *** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:31 Friday 17 January 2025 04:10:58 -0500 (0:00:03.691) 0:00:50.982 ******** included: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml for managed-node1 TASK [fedora.linux_system_roles.storage : Check if the COPR support packages should be installed] *** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:2 Friday 17 January 2025 04:10:58 -0500 (0:00:00.063) 0:00:51.045 ******** TASK [fedora.linux_system_roles.storage : Make sure COPR support packages are present] *** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:13 Friday 17 January 2025 04:10:58 -0500 (0:00:00.033) 0:00:51.079 ******** skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Enable COPRs] ************************ task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:19 Friday 17 January 2025 04:10:58 -0500 (0:00:00.035) 0:00:51.114 ******** TASK [fedora.linux_system_roles.storage : Make sure required packages are installed] *** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:37 Friday 17 January 2025 04:10:58 -0500 (0:00:00.034) 0:00:51.148 ******** ok: [managed-node1] => { "changed": false, "rc": 0, "results": [ "cryptsetup-2.0.3-6.el7.x86_64 providing cryptsetup is already installed", "kpartx-0.4.9-136.el7_9.x86_64 providing kpartx is already installed" ] } TASK [fedora.linux_system_roles.storage : Get service facts] ******************* task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:51 Friday 17 January 2025 04:10:59 -0500 (0:00:00.708) 0:00:51.857 ******** ok: [managed-node1] => { "ansible_facts": { "services": { "NetworkManager-dispatcher.service": { "name": "NetworkManager-dispatcher.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "NetworkManager-wait-online.service": { "name": "NetworkManager-wait-online.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "NetworkManager.service": { "name": "NetworkManager.service", "source": "systemd", "state": "running", "status": "enabled" }, "arp-ethers.service": { "name": "arp-ethers.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "auditd.service": { "name": "auditd.service", "source": "systemd", "state": "running", "status": "enabled" }, "auth-rpcgss-module.service": { "name": "auth-rpcgss-module.service", "source": "systemd", "state": "stopped", "status": "static" }, 
"autovt@.service": { "name": "autovt@.service", "source": "systemd", "state": "unknown", "status": "enabled" }, "blivet.service": { "name": "blivet.service", "source": "systemd", "state": "inactive", "status": "static" }, "blk-availability.service": { "name": "blk-availability.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "brandbot.service": { "name": "brandbot.service", "source": "systemd", "state": "inactive", "status": "static" }, "chrony-dnssrv@.service": { "name": "chrony-dnssrv@.service", "source": "systemd", "state": "unknown", "status": "static" }, "chrony-wait.service": { "name": "chrony-wait.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "chronyd.service": { "name": "chronyd.service", "source": "systemd", "state": "running", "status": "enabled" }, "cloud-config.service": { "name": "cloud-config.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-final.service": { "name": "cloud-final.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-init-local.service": { "name": "cloud-init-local.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-init.service": { "name": "cloud-init.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "console-getty.service": { "name": "console-getty.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "console-shell.service": { "name": "console-shell.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "container-getty@.service": { "name": "container-getty@.service", "source": "systemd", "state": "unknown", "status": "static" }, "cpupower.service": { "name": "cpupower.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "crond.service": { "name": "crond.service", "source": "systemd", "state": "running", "status": "enabled" }, "dbus-org.freedesktop.hostname1.service": { "name": "dbus-org.freedesktop.hostname1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus-org.freedesktop.import1.service": { "name": "dbus-org.freedesktop.import1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus-org.freedesktop.locale1.service": { "name": "dbus-org.freedesktop.locale1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus-org.freedesktop.login1.service": { "name": "dbus-org.freedesktop.login1.service", "source": "systemd", "state": "active", "status": "static" }, "dbus-org.freedesktop.machine1.service": { "name": "dbus-org.freedesktop.machine1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus-org.freedesktop.nm-dispatcher.service": { "name": "dbus-org.freedesktop.nm-dispatcher.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "dbus-org.freedesktop.timedate1.service": { "name": "dbus-org.freedesktop.timedate1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus.service": { "name": "dbus.service", "source": "systemd", "state": "running", "status": "static" }, "debug-shell.service": { "name": "debug-shell.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "dm-event.service": { "name": "dm-event.service", "source": "systemd", "state": "stopped", "status": "static" }, "dmraid-activation.service": { "name": "dmraid-activation.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "dracut-cmdline.service": { "name": "dracut-cmdline.service", 
"source": "systemd", "state": "stopped", "status": "static" }, "dracut-initqueue.service": { "name": "dracut-initqueue.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-mount.service": { "name": "dracut-mount.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-mount.service": { "name": "dracut-pre-mount.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-pivot.service": { "name": "dracut-pre-pivot.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-trigger.service": { "name": "dracut-pre-trigger.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-udev.service": { "name": "dracut-pre-udev.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-shutdown.service": { "name": "dracut-shutdown.service", "source": "systemd", "state": "stopped", "status": "static" }, "ebtables.service": { "name": "ebtables.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "emergency.service": { "name": "emergency.service", "source": "systemd", "state": "stopped", "status": "static" }, "firewalld.service": { "name": "firewalld.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "fstrim.service": { "name": "fstrim.service", "source": "systemd", "state": "inactive", "status": "static" }, "getty@.service": { "name": "getty@.service", "source": "systemd", "state": "unknown", "status": "enabled" }, "getty@tty1.service": { "name": "getty@tty1.service", "source": "systemd", "state": "running", "status": "unknown" }, "gssproxy.service": { "name": "gssproxy.service", "source": "systemd", "state": "running", "status": "disabled" }, "halt-local.service": { "name": "halt-local.service", "source": "systemd", "state": "inactive", "status": "static" }, "initrd-cleanup.service": { "name": "initrd-cleanup.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-parse-etc.service": { "name": "initrd-parse-etc.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-switch-root.service": { "name": "initrd-switch-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-udevadm-cleanup-db.service": { "name": "initrd-udevadm-cleanup-db.service", "source": "systemd", "state": "stopped", "status": "static" }, "iprdump.service": { "name": "iprdump.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "iprinit.service": { "name": "iprinit.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "iprupdate.service": { "name": "iprupdate.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "irqbalance.service": { "name": "irqbalance.service", "source": "systemd", "state": "running", "status": "enabled" }, "kdump.service": { "name": "kdump.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "kmod-static-nodes.service": { "name": "kmod-static-nodes.service", "source": "systemd", "state": "stopped", "status": "static" }, "lvm2-lvmetad.service": { "name": "lvm2-lvmetad.service", "source": "systemd", "state": "running", "status": "static" }, "lvm2-lvmpolld.service": { "name": "lvm2-lvmpolld.service", "source": "systemd", "state": "stopped", "status": "static" }, "lvm2-monitor.service": { "name": "lvm2-monitor.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "lvm2-pvscan@.service": { "name": "lvm2-pvscan@.service", "source": 
"systemd", "state": "unknown", "status": "static" }, "mdadm-grow-continue@.service": { "name": "mdadm-grow-continue@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdadm-last-resort@.service": { "name": "mdadm-last-resort@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdcheck_continue.service": { "name": "mdcheck_continue.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdcheck_start.service": { "name": "mdcheck_start.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdmon@.service": { "name": "mdmon@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdmonitor-oneshot.service": { "name": "mdmonitor-oneshot.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdmonitor.service": { "name": "mdmonitor.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "messagebus.service": { "name": "messagebus.service", "source": "systemd", "state": "active", "status": "static" }, "microcode.service": { "name": "microcode.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "netconsole": { "name": "netconsole", "source": "sysv", "state": "stopped", "status": "disabled" }, "network": { "name": "network", "source": "sysv", "state": "running", "status": "enabled" }, "network.service": { "name": "network.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "nfs-blkmap.service": { "name": "nfs-blkmap.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfs-config.service": { "name": "nfs-config.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-idmap.service": { "name": "nfs-idmap.service", "source": "systemd", "state": "inactive", "status": "static" }, "nfs-idmapd.service": { "name": "nfs-idmapd.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-lock.service": { "name": "nfs-lock.service", "source": "systemd", "state": "inactive", "status": "static" }, "nfs-mountd.service": { "name": "nfs-mountd.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-rquotad.service": { "name": "nfs-rquotad.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfs-secure.service": { "name": "nfs-secure.service", "source": "systemd", "state": "inactive", "status": "static" }, "nfs-server.service": { "name": "nfs-server.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "nfs-utils.service": { "name": "nfs-utils.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs.service": { "name": "nfs.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfslock.service": { "name": "nfslock.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-halt.service": { "name": "plymouth-halt.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "plymouth-kexec.service": { "name": "plymouth-kexec.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "plymouth-poweroff.service": { "name": "plymouth-poweroff.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "plymouth-quit-wait.service": { "name": "plymouth-quit-wait.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "plymouth-quit.service": { "name": "plymouth-quit.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "plymouth-read-write.service": { "name": 
"plymouth-read-write.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "plymouth-reboot.service": { "name": "plymouth-reboot.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "plymouth-start.service": { "name": "plymouth-start.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "plymouth-switch-root.service": { "name": "plymouth-switch-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "polkit.service": { "name": "polkit.service", "source": "systemd", "state": "running", "status": "static" }, "postfix.service": { "name": "postfix.service", "source": "systemd", "state": "running", "status": "enabled" }, "qemu-guest-agent.service": { "name": "qemu-guest-agent.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "quotaon.service": { "name": "quotaon.service", "source": "systemd", "state": "inactive", "status": "static" }, "rc-local.service": { "name": "rc-local.service", "source": "systemd", "state": "stopped", "status": "static" }, "rdisc.service": { "name": "rdisc.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rescue.service": { "name": "rescue.service", "source": "systemd", "state": "stopped", "status": "static" }, "restraintd.service": { "name": "restraintd.service", "source": "systemd", "state": "running", "status": "enabled" }, "rhel-autorelabel-mark.service": { "name": "rhel-autorelabel-mark.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "rhel-autorelabel.service": { "name": "rhel-autorelabel.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "rhel-configure.service": { "name": "rhel-configure.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "rhel-dmesg.service": { "name": "rhel-dmesg.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "rhel-domainname.service": { "name": "rhel-domainname.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "rhel-import-state.service": { "name": "rhel-import-state.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "rhel-loadmodules.service": { "name": "rhel-loadmodules.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "rhel-readonly.service": { "name": "rhel-readonly.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "rngd.service": { "name": "rngd.service", "source": "systemd", "state": "running", "status": "enabled" }, "rpc-gssd.service": { "name": "rpc-gssd.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-rquotad.service": { "name": "rpc-rquotad.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rpc-statd-notify.service": { "name": "rpc-statd-notify.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-statd.service": { "name": "rpc-statd.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpcbind.service": { "name": "rpcbind.service", "source": "systemd", "state": "running", "status": "enabled" }, "rpcgssd.service": { "name": "rpcgssd.service", "source": "systemd", "state": "inactive", "status": "static" }, "rpcidmapd.service": { "name": "rpcidmapd.service", "source": "systemd", "state": "inactive", "status": "static" }, "rsyncd.service": { "name": "rsyncd.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rsyncd@.service": { "name": "rsyncd@.service", "source": "systemd", "state": 
"unknown", "status": "static" }, "rsyslog.service": { "name": "rsyslog.service", "source": "systemd", "state": "running", "status": "enabled" }, "selinux-policy-migrate-local-changes@.service": { "name": "selinux-policy-migrate-local-changes@.service", "source": "systemd", "state": "unknown", "status": "static" }, "selinux-policy-migrate-local-changes@targeted.service": { "name": "selinux-policy-migrate-local-changes@targeted.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "serial-getty@.service": { "name": "serial-getty@.service", "source": "systemd", "state": "unknown", "status": "disabled" }, "serial-getty@ttyS0.service": { "name": "serial-getty@ttyS0.service", "source": "systemd", "state": "running", "status": "unknown" }, "sshd-keygen.service": { "name": "sshd-keygen.service", "source": "systemd", "state": "stopped", "status": "static" }, "sshd.service": { "name": "sshd.service", "source": "systemd", "state": "running", "status": "enabled" }, "sshd@.service": { "name": "sshd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-ask-password-console.service": { "name": "systemd-ask-password-console.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-ask-password-plymouth.service": { "name": "systemd-ask-password-plymouth.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-ask-password-wall.service": { "name": "systemd-ask-password-wall.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-backlight@.service": { "name": "systemd-backlight@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-binfmt.service": { "name": "systemd-binfmt.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-bootchart.service": { "name": "systemd-bootchart.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "systemd-firstboot.service": { "name": "systemd-firstboot.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-fsck-root.service": { "name": "systemd-fsck-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-fsck@.service": { "name": "systemd-fsck@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-halt.service": { "name": "systemd-halt.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hibernate-resume@.service": { "name": "systemd-hibernate-resume@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-hibernate.service": { "name": "systemd-hibernate.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hostnamed.service": { "name": "systemd-hostnamed.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hwdb-update.service": { "name": "systemd-hwdb-update.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-hybrid-sleep.service": { "name": "systemd-hybrid-sleep.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-importd.service": { "name": "systemd-importd.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-initctl.service": { "name": "systemd-initctl.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journal-catalog-update.service": { "name": "systemd-journal-catalog-update.service", "source": "systemd", "state": "stopped", "status": "static" }, 
"systemd-journal-flush.service": { "name": "systemd-journal-flush.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journald.service": { "name": "systemd-journald.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-kexec.service": { "name": "systemd-kexec.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-localed.service": { "name": "systemd-localed.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-logind.service": { "name": "systemd-logind.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-machine-id-commit.service": { "name": "systemd-machine-id-commit.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-machined.service": { "name": "systemd-machined.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-modules-load.service": { "name": "systemd-modules-load.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-nspawn@.service": { "name": "systemd-nspawn@.service", "source": "systemd", "state": "unknown", "status": "disabled" }, "systemd-poweroff.service": { "name": "systemd-poweroff.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-quotacheck.service": { "name": "systemd-quotacheck.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-random-seed.service": { "name": "systemd-random-seed.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-readahead-collect.service": { "name": "systemd-readahead-collect.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "systemd-readahead-done.service": { "name": "systemd-readahead-done.service", "source": "systemd", "state": "stopped", "status": "indirect" }, "systemd-readahead-drop.service": { "name": "systemd-readahead-drop.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "systemd-readahead-replay.service": { "name": "systemd-readahead-replay.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "systemd-reboot.service": { "name": "systemd-reboot.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-remount-fs.service": { "name": "systemd-remount-fs.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-rfkill@.service": { "name": "systemd-rfkill@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-shutdownd.service": { "name": "systemd-shutdownd.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-suspend.service": { "name": "systemd-suspend.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-sysctl.service": { "name": "systemd-sysctl.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-timedated.service": { "name": "systemd-timedated.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-tmpfiles-clean.service": { "name": "systemd-tmpfiles-clean.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles-setup-dev.service": { "name": "systemd-tmpfiles-setup-dev.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles-setup.service": { "name": "systemd-tmpfiles-setup.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udev-settle.service": { "name": 
"systemd-udev-settle.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udev-trigger.service": { "name": "systemd-udev-trigger.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udevd.service": { "name": "systemd-udevd.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-update-done.service": { "name": "systemd-update-done.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-update-utmp-runlevel.service": { "name": "systemd-update-utmp-runlevel.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-update-utmp.service": { "name": "systemd-update-utmp.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-user-sessions.service": { "name": "systemd-user-sessions.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-vconsole-setup.service": { "name": "systemd-vconsole-setup.service", "source": "systemd", "state": "stopped", "status": "static" }, "target.service": { "name": "target.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "targetclid.service": { "name": "targetclid.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "teamd@.service": { "name": "teamd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "tuned.service": { "name": "tuned.service", "source": "systemd", "state": "running", "status": "enabled" }, "wpa_supplicant.service": { "name": "wpa_supplicant.service", "source": "systemd", "state": "inactive", "status": "disabled" } } }, "changed": false } TASK [fedora.linux_system_roles.storage : Set storage_cryptsetup_services] ***** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:57 Friday 17 January 2025 04:11:00 -0500 (0:00:01.272) 0:00:53.130 ******** ok: [managed-node1] => { "ansible_facts": { "storage_cryptsetup_services": [] }, "changed": false } TASK [fedora.linux_system_roles.storage : Mask the systemd cryptsetup services] *** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:63 Friday 17 January 2025 04:11:00 -0500 (0:00:00.130) 0:00:53.261 ******** TASK [fedora.linux_system_roles.storage : Manage the pools and volumes to match the specified state] *** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:69 Friday 17 January 2025 04:11:00 -0500 (0:00:00.089) 0:00:53.351 ******** changed: [managed-node1] => { "actions": [ { "action": "create format", "device": "/dev/sda", "fs_type": "luks" }, { "action": "create device", "device": "/dev/mapper/luks-688ee6a7-12e5-4393-bfd5-48ed7d374b8d", "fs_type": null }, { "action": "create format", "device": "/dev/mapper/luks-688ee6a7-12e5-4393-bfd5-48ed7d374b8d", "fs_type": "xfs" } ], "changed": true, "crypts": [ { "backing_device": "/dev/sda", "name": "luks-688ee6a7-12e5-4393-bfd5-48ed7d374b8d", "password": "-", "state": "present" } ], "leaves": [ "/dev/sdb", "/dev/sdc", "/dev/sdd", "/dev/sde", "/dev/sdf", "/dev/sdg", "/dev/sdh", "/dev/sdi", "/dev/xvda1", "/dev/mapper/luks-688ee6a7-12e5-4393-bfd5-48ed7d374b8d" ], "mounts": [ { "dump": 0, "fstype": "xfs", "group": null, "mode": null, "opts": "defaults", "owner": null, "passno": 0, "path": "/opt/test1", "src": "/dev/mapper/luks-688ee6a7-12e5-4393-bfd5-48ed7d374b8d", "state": "mounted" } ], "packages": [ "cryptsetup", "xfsprogs", 
"e2fsprogs" ], "pools": [], "volumes": [ { "_device": "/dev/mapper/luks-688ee6a7-12e5-4393-bfd5-48ed7d374b8d", "_kernel_device": "/dev/dm-0", "_mount_id": "/dev/mapper/luks-688ee6a7-12e5-4393-bfd5-48ed7d374b8d", "_raw_device": "/dev/sda", "_raw_kernel_device": "/dev/sda", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": true, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": "luks2", "encryption_password": "VALUE_SPECIFIED_IN_NO_LOG_PARAMETER", "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "mount_user": null, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": 10737418240, "state": "present", "thin": null, "thin_pool_name": null, "thin_pool_size": null, "type": "disk", "vdo_pool_size": null } ] } TASK [fedora.linux_system_roles.storage : Workaround for udev issue on some platforms] *** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:83 Friday 17 January 2025 04:11:14 -0500 (0:00:13.438) 0:01:06.789 ******** skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Check if /etc/fstab is present] ****** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:90 Friday 17 January 2025 04:11:14 -0500 (0:00:00.043) 0:01:06.832 ******** ok: [managed-node1] => { "changed": false, "stat": { "atime": 1737104722.005961, "attr_flags": "e", "attributes": [ "extents" ], "block_size": 4096, "blocks": 8, "charset": "us-ascii", "checksum": "4db69458c23204aa354c1fce8c724ba0713d6623", "ctime": 1718881114.40265, "dev": 51713, "device_type": 0, "executable": false, "exists": true, "gid": 0, "gr_name": "root", "inode": 131078, "isblk": false, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": true, "issock": false, "isuid": false, "mimetype": "text/plain", "mode": "0644", "mtime": 1718881114.40265, "nlink": 1, "path": "/etc/fstab", "pw_name": "root", "readable": true, "rgrp": true, "roth": true, "rusr": true, "size": 1207, "uid": 0, "version": "18446744072852913878", "wgrp": false, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [fedora.linux_system_roles.storage : Add fingerprint to /etc/fstab if present] *** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:95 Friday 17 January 2025 04:11:14 -0500 (0:00:00.331) 0:01:07.164 ******** changed: [managed-node1] => { "backup": "", "changed": true } MSG: line added TASK [fedora.linux_system_roles.storage : Unmask the systemd cryptsetup services] *** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:113 Friday 17 January 2025 04:11:14 -0500 (0:00:00.494) 0:01:07.658 ******** TASK [fedora.linux_system_roles.storage : Show blivet_output] ****************** task path: 
TASK [fedora.linux_system_roles.storage : Workaround for udev issue on some platforms] ***
task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:83
Friday 17 January 2025 04:11:14 -0500 (0:00:13.438) 0:01:06.789 ********
skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [fedora.linux_system_roles.storage : Check if /etc/fstab is present] ******
task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:90
Friday 17 January 2025 04:11:14 -0500 (0:00:00.043) 0:01:06.832 ********
ok: [managed-node1] => { "changed": false, "stat": { "atime": 1737104722.005961, "attr_flags": "e", "attributes": [ "extents" ], "block_size": 4096, "blocks": 8, "charset": "us-ascii", "checksum": "4db69458c23204aa354c1fce8c724ba0713d6623", "ctime": 1718881114.40265, "dev": 51713, "device_type": 0, "executable": false, "exists": true, "gid": 0, "gr_name": "root", "inode": 131078, "isblk": false, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": true, "issock": false, "isuid": false, "mimetype": "text/plain", "mode": "0644", "mtime": 1718881114.40265, "nlink": 1, "path": "/etc/fstab", "pw_name": "root", "readable": true, "rgrp": true, "roth": true, "rusr": true, "size": 1207, "uid": 0, "version": "18446744072852913878", "wgrp": false, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } }

TASK [fedora.linux_system_roles.storage : Add fingerprint to /etc/fstab if present] ***
task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:95
Friday 17 January 2025 04:11:14 -0500 (0:00:00.331) 0:01:07.164 ********
changed: [managed-node1] => { "backup": "", "changed": true }
MSG: line added

TASK [fedora.linux_system_roles.storage : Unmask the systemd cryptsetup services] ***
task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:113
Friday 17 January 2025 04:11:14 -0500 (0:00:00.494) 0:01:07.658 ********

TASK [fedora.linux_system_roles.storage : Show blivet_output] ******************
task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:119
Friday 17 January 2025 04:11:14 -0500 (0:00:00.032) 0:01:07.690 ********
ok: [managed-node1] => { "blivet_output": { "actions": [ { "action": "create format", "device": "/dev/sda", "fs_type": "luks" }, { "action": "create device", "device": "/dev/mapper/luks-688ee6a7-12e5-4393-bfd5-48ed7d374b8d", "fs_type": null }, { "action": "create format", "device": "/dev/mapper/luks-688ee6a7-12e5-4393-bfd5-48ed7d374b8d", "fs_type": "xfs" } ], "changed": true, "crypts": [ { "backing_device": "/dev/sda", "name": "luks-688ee6a7-12e5-4393-bfd5-48ed7d374b8d", "password": "-", "state": "present" } ], "failed": false, "leaves": [ "/dev/sdb", "/dev/sdc", "/dev/sdd", "/dev/sde", "/dev/sdf", "/dev/sdg", "/dev/sdh", "/dev/sdi", "/dev/xvda1", "/dev/mapper/luks-688ee6a7-12e5-4393-bfd5-48ed7d374b8d" ], "mounts": [ { "dump": 0, "fstype": "xfs", "group": null, "mode": null, "opts": "defaults", "owner": null, "passno": 0, "path": "/opt/test1", "src": "/dev/mapper/luks-688ee6a7-12e5-4393-bfd5-48ed7d374b8d", "state": "mounted" } ], "packages": [ "cryptsetup", "xfsprogs", "e2fsprogs" ], "pools": [], "volumes": [ { "_device": "/dev/mapper/luks-688ee6a7-12e5-4393-bfd5-48ed7d374b8d", "_kernel_device": "/dev/dm-0", "_mount_id": "/dev/mapper/luks-688ee6a7-12e5-4393-bfd5-48ed7d374b8d", "_raw_device": "/dev/sda", "_raw_kernel_device": "/dev/sda", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": true, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": "luks2", "encryption_password": "VALUE_SPECIFIED_IN_NO_LOG_PARAMETER", "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "mount_user": null, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": 10737418240, "state": "present", "thin": null, "thin_pool_name": null, "thin_pool_size": null, "type": "disk", "vdo_pool_size": null } ] } }

TASK [fedora.linux_system_roles.storage : Set the list of pools for test verification] ***
task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:128
Friday 17 January 2025 04:11:15 -0500 (0:00:00.048) 0:01:07.739 ********
ok: [managed-node1] => { "ansible_facts": { "_storage_pools_list": [] }, "changed": false }

TASK [fedora.linux_system_roles.storage : Set the list of volumes for test verification] ***
task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:132
Friday 17 January 2025 04:11:15 -0500 (0:00:00.041) 0:01:07.780 ********
ok: [managed-node1] => { "ansible_facts": { "_storage_volumes_list": [ { "_device": "/dev/mapper/luks-688ee6a7-12e5-4393-bfd5-48ed7d374b8d", "_kernel_device": "/dev/dm-0", "_mount_id": "/dev/mapper/luks-688ee6a7-12e5-4393-bfd5-48ed7d374b8d", "_raw_device": "/dev/sda", "_raw_kernel_device": "/dev/sda", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": true, "encryption_cipher": null, "encryption_key": null,
"encryption_key_size": null, "encryption_luks_version": "luks2", "encryption_password": "VALUE_SPECIFIED_IN_NO_LOG_PARAMETER", "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "mount_user": null, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": 10737418240, "state": "present", "thin": null, "thin_pool_name": null, "thin_pool_size": null, "type": "disk", "vdo_pool_size": null } ] }, "changed": false } TASK [fedora.linux_system_roles.storage : Remove obsolete mounts] ************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:148 Friday 17 January 2025 04:11:15 -0500 (0:00:00.043) 0:01:07.823 ******** TASK [fedora.linux_system_roles.storage : Tell systemd to refresh its view of /etc/fstab] *** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:159 Friday 17 January 2025 04:11:15 -0500 (0:00:00.034) 0:01:07.859 ******** ok: [managed-node1] => { "changed": false, "name": null, "status": {} } TASK [fedora.linux_system_roles.storage : Set up new/current mounts] *********** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:164 Friday 17 January 2025 04:11:15 -0500 (0:00:00.816) 0:01:08.675 ******** changed: [managed-node1] => (item={u'src': u'/dev/mapper/luks-688ee6a7-12e5-4393-bfd5-48ed7d374b8d', u'group': None, u'dump': 0, u'passno': 0, u'fstype': u'xfs', u'state': u'mounted', u'mode': None, u'owner': None, u'path': u'/opt/test1', u'opts': u'defaults'}) => { "ansible_loop_var": "mount_info", "changed": true, "dump": "0", "fstab": "/etc/fstab", "fstype": "xfs", "mount_info": { "dump": 0, "fstype": "xfs", "group": null, "mode": null, "opts": "defaults", "owner": null, "passno": 0, "path": "/opt/test1", "src": "/dev/mapper/luks-688ee6a7-12e5-4393-bfd5-48ed7d374b8d", "state": "mounted" }, "name": "/opt/test1", "opts": "defaults", "passno": "0", "src": "/dev/mapper/luks-688ee6a7-12e5-4393-bfd5-48ed7d374b8d" } TASK [fedora.linux_system_roles.storage : Manage mount ownership/permissions] *** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:175 Friday 17 January 2025 04:11:16 -0500 (0:00:00.456) 0:01:09.131 ******** skipping: [managed-node1] => (item={u'src': u'/dev/mapper/luks-688ee6a7-12e5-4393-bfd5-48ed7d374b8d', u'group': None, u'dump': 0, u'passno': 0, u'fstype': u'xfs', u'state': u'mounted', u'mode': None, u'owner': None, u'path': u'/opt/test1', u'opts': u'defaults'}) => { "ansible_loop_var": "mount_info", "changed": false, "mount_info": { "dump": 0, "fstype": "xfs", "group": null, "mode": null, "opts": "defaults", "owner": null, "passno": 0, "path": "/opt/test1", "src": "/dev/mapper/luks-688ee6a7-12e5-4393-bfd5-48ed7d374b8d", "state": "mounted" }, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Tell systemd to refresh its view of /etc/fstab] *** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:187 Friday 17 January 2025 04:11:16 -0500 (0:00:00.047) 0:01:09.178 ******** ok: [managed-node1] => { 
"changed": false, "name": null, "status": {} } TASK [fedora.linux_system_roles.storage : Retrieve facts for the /etc/crypttab file] *** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:195 Friday 17 January 2025 04:11:16 -0500 (0:00:00.452) 0:01:09.631 ******** ok: [managed-node1] => { "changed": false, "stat": { "atime": 1737104945.097923, "attr_flags": "e", "attributes": [ "extents" ], "block_size": 4096, "blocks": 0, "charset": "binary", "checksum": "da39a3ee5e6b4b0d3255bfef95601890afd80709", "ctime": 1718879272.062, "dev": 51713, "device_type": 0, "executable": false, "exists": true, "gid": 0, "gr_name": "root", "inode": 131079, "isblk": false, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": true, "issock": false, "isuid": false, "mimetype": "inode/x-empty", "mode": "0600", "mtime": 1718879026.308, "nlink": 1, "path": "/etc/crypttab", "pw_name": "root", "readable": true, "rgrp": false, "roth": false, "rusr": true, "size": 0, "uid": 0, "version": "18446744072852913879", "wgrp": false, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [fedora.linux_system_roles.storage : Manage /etc/crypttab to account for changes we just made] *** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:200 Friday 17 January 2025 04:11:17 -0500 (0:00:00.370) 0:01:10.002 ******** changed: [managed-node1] => (item={u'state': u'present', u'password': u'-', u'name': u'luks-688ee6a7-12e5-4393-bfd5-48ed7d374b8d', u'backing_device': u'/dev/sda'}) => { "ansible_loop_var": "entry", "backup": "", "changed": true, "entry": { "backing_device": "/dev/sda", "name": "luks-688ee6a7-12e5-4393-bfd5-48ed7d374b8d", "password": "-", "state": "present" } } MSG: line added TASK [fedora.linux_system_roles.storage : Update facts] ************************ task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:222 Friday 17 January 2025 04:11:17 -0500 (0:00:00.382) 0:01:10.384 ******** ok: [managed-node1] TASK [Verify role results] ***************************************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/tests_luks2.yml:114 Friday 17 January 2025 04:11:18 -0500 (0:00:00.829) 0:01:11.213 ******** included: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml for managed-node1 TASK [Print out pool information] ********************************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:2 Friday 17 January 2025 04:11:18 -0500 (0:00:00.124) 0:01:11.338 ******** skipping: [managed-node1] => {} TASK [Print out volume information] ******************************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:7 Friday 17 January 2025 04:11:18 -0500 (0:00:00.054) 0:01:11.392 ******** ok: [managed-node1] => { "_storage_volumes_list": [ { "_device": "/dev/mapper/luks-688ee6a7-12e5-4393-bfd5-48ed7d374b8d", "_kernel_device": "/dev/dm-0", "_mount_id": "/dev/mapper/luks-688ee6a7-12e5-4393-bfd5-48ed7d374b8d", "_raw_device": "/dev/sda", "_raw_kernel_device": "/dev/sda", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, 
"disks": [ "sda" ], "encryption": true, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": "luks2", "encryption_password": "VALUE_SPECIFIED_IN_NO_LOG_PARAMETER", "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "mount_user": null, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": 10737418240, "state": "present", "thin": null, "thin_pool_name": null, "thin_pool_size": null, "type": "disk", "vdo_pool_size": null } ] } TASK [Collect info about the volumes.] ***************************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:15 Friday 17 January 2025 04:11:18 -0500 (0:00:00.070) 0:01:11.463 ******** ok: [managed-node1] => { "changed": false, "info": { "/dev/mapper/luks-688ee6a7-12e5-4393-bfd5-48ed7d374b8d": { "fstype": "xfs", "label": "", "mountpoint": "/opt/test1", "name": "/dev/mapper/luks-688ee6a7-12e5-4393-bfd5-48ed7d374b8d", "size": "10G", "type": "crypt", "uuid": "32dc73e0-b52c-41da-84f5-65ffec30cc34" }, "/dev/sda": { "fstype": "crypto_LUKS", "label": "", "mountpoint": "", "name": "/dev/sda", "size": "10G", "type": "disk", "uuid": "688ee6a7-12e5-4393-bfd5-48ed7d374b8d" }, "/dev/sdb": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdb", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdc": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdc", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdd": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdd", "size": "1T", "type": "disk", "uuid": "" }, "/dev/sde": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sde", "size": "1T", "type": "disk", "uuid": "" }, "/dev/sdf": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdf", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdg": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdg", "size": "1T", "type": "disk", "uuid": "" }, "/dev/sdh": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdh", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdi": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdi", "size": "10G", "type": "disk", "uuid": "" }, "/dev/xvda": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/xvda", "size": "250G", "type": "disk", "uuid": "" }, "/dev/xvda1": { "fstype": "ext4", "label": "", "mountpoint": "/", "name": "/dev/xvda1", "size": "250G", "type": "partition", "uuid": "c7b7d6a5-fd01-4b9b-bcca-153eaff9d312" } } } TASK [Read the /etc/fstab file for volume existence] *************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:20 Friday 17 January 2025 04:11:20 -0500 (0:00:01.938) 0:01:13.401 ******** ok: [managed-node1] => { "changed": false, "cmd": [ "cat", "/etc/fstab" ], "delta": "0:00:00.002927", "end": "2025-01-17 04:11:21.119225", "rc": 0, "start": "2025-01-17 04:11:21.116298" } STDOUT: # system_role:storage # # /etc/fstab # Created by anaconda on Thu Jun 20 10:23:46 2024 # # Accessible filesystems, by reference, are maintained under '/dev/disk' # See man pages fstab(5), findfs(8), mount(8) and/or 
blkid(8) for more info
#
UUID=c7b7d6a5-fd01-4b9b-bcca-153eaff9d312 / ext4 defaults 1 1
ntap-bos-c01-eng01-nfs01b.storage.bos.redhat.com:/devops_engineering_nfs/devarchive/redhat /mnt/redhat nfs ro,rsize=32768,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0
nest.test.redhat.com:/mnt/qa /mnt/qa nfs defaults,rsize=8192,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0
vtap-eng01.storage.rdu2.redhat.com:/vol/engarchive /mnt/engarchive nfs ro,rsize=32768,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0
nest.test.redhat.com:/mnt/tpsdist /mnt/tpsdist nfs defaults,rsize=8192,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0
ntap-bos-c01-eng01-nfs01b.storage.bos.redhat.com:/devops_engineering_nfs/devarchive/redhat/brewroot /mnt/brew nfs ro,rsize=32768,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0
ntap-bos-c01-eng01-nfs01b.storage.bos.redhat.com:/devops_brew_scratch_nfs/scratch /mnt/brew_scratch nfs ro,rsize=32768,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0
/dev/mapper/luks-688ee6a7-12e5-4393-bfd5-48ed7d374b8d /opt/test1 xfs defaults 0 0

TASK [Read the /etc/crypttab file] *********************************************
task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:25
Friday 17 January 2025 04:11:21 -0500 (0:00:00.483) 0:01:13.884 ********
ok: [managed-node1] => { "changed": false, "cmd": [ "cat", "/etc/crypttab" ], "delta": "0:00:00.002934", "end": "2025-01-17 04:11:21.442635", "failed_when_result": false, "rc": 0, "start": "2025-01-17 04:11:21.439701" }

STDOUT:

luks-688ee6a7-12e5-4393-bfd5-48ed7d374b8d /dev/sda -

TASK [Verify the volumes listed in storage_pools were correctly managed] *******
task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:34
Friday 17 January 2025 04:11:21 -0500 (0:00:00.316) 0:01:14.201 ********

TASK [Verify the volumes with no pool were correctly managed] ******************
task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:44
Friday 17 January 2025 04:11:21 -0500 (0:00:00.032) 0:01:14.233 ********
included: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume.yml for managed-node1

TASK [Set storage volume test variables] ***************************************
task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume.yml:2
Friday 17 January 2025 04:11:21 -0500 (0:00:00.074) 0:01:14.308 ********
ok: [managed-node1] => { "ansible_facts": { "_storage_test_volume_present": true, "_storage_volume_tests": [ "mount", "fstab", "fs", "device", "encryption", "md", "size", "cache" ] }, "changed": false }

TASK [Run test verify for {{ storage_test_volume_subset }}] ********************
task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume.yml:19
Friday 17 January 2025 04:11:21 -0500 (0:00:00.043) 0:01:14.351 ********
included: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml for managed-node1
included: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml for managed-node1
included: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fs.yml for managed-node1
included:
/tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml for managed-node1 included: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml for managed-node1 included: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml for managed-node1 included: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml for managed-node1 included: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml for managed-node1 TASK [Get expected mount device based on device type] ************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:7 Friday 17 January 2025 04:11:21 -0500 (0:00:00.182) 0:01:14.534 ******** ok: [managed-node1] => { "ansible_facts": { "storage_test_device_path": "/dev/mapper/luks-688ee6a7-12e5-4393-bfd5-48ed7d374b8d" }, "changed": false } TASK [Set some facts] ********************************************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:11 Friday 17 January 2025 04:11:21 -0500 (0:00:00.089) 0:01:14.623 ******** ok: [managed-node1] => { "ansible_facts": { "storage_test_mount_expected_mount_point": "/opt/test1", "storage_test_swap_expected_matches": "0" }, "changed": false } TASK [Get information about the mountpoint directory] ************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:19 Friday 17 January 2025 04:11:21 -0500 (0:00:00.045) 0:01:14.669 ******** skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify the current mount state by device] ******************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:28 Friday 17 January 2025 04:11:22 -0500 (0:00:00.037) 0:01:14.706 ******** ok: [managed-node1] => { "changed": false } MSG: All assertions passed TASK [Verify mount directory user] ********************************************* task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:36 Friday 17 January 2025 04:11:22 -0500 (0:00:00.044) 0:01:14.751 ******** skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify mount directory group] ******************************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:42 Friday 17 January 2025 04:11:22 -0500 (0:00:00.041) 0:01:14.792 ******** skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify mount directory permissions] ************************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:48 Friday 17 January 2025 04:11:22 -0500 (0:00:00.046) 0:01:14.839 ******** skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Get path of test volume device] ****************************************** task path: 
/tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:57 Friday 17 January 2025 04:11:22 -0500 (0:00:00.039) 0:01:14.878 ******** skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Gather swap info] ******************************************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:63 Friday 17 January 2025 04:11:22 -0500 (0:00:00.037) 0:01:14.915 ******** skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify swap status] ****************************************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:69 Friday 17 January 2025 04:11:22 -0500 (0:00:00.037) 0:01:14.953 ******** skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Unset facts] ************************************************************* task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:79 Friday 17 January 2025 04:11:22 -0500 (0:00:00.037) 0:01:14.990 ******** ok: [managed-node1] => { "ansible_facts": { "storage_test_found_mount_stat": null, "storage_test_mount_expected_mount_point": null, "storage_test_swap_expected_matches": null, "storage_test_swaps": null, "storage_test_sys_node": null }, "changed": false } TASK [Set some variables for fstab checking] *********************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:2 Friday 17 January 2025 04:11:22 -0500 (0:00:00.036) 0:01:15.027 ******** ok: [managed-node1] => { "ansible_facts": { "storage_test_fstab_expected_id_matches": "1", "storage_test_fstab_expected_mount_options_matches": "1", "storage_test_fstab_expected_mount_point_matches": "1", "storage_test_fstab_id_matches": [ "/dev/mapper/luks-688ee6a7-12e5-4393-bfd5-48ed7d374b8d " ], "storage_test_fstab_mount_options_matches": [ " /opt/test1 xfs defaults " ], "storage_test_fstab_mount_point_matches": [ " /opt/test1 " ] }, "changed": false } TASK [Verify that the device identifier appears in /etc/fstab] ***************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:17 Friday 17 January 2025 04:11:22 -0500 (0:00:00.066) 0:01:15.094 ******** ok: [managed-node1] => { "changed": false } MSG: All assertions passed TASK [Verify the fstab mount point] ******************************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:24 Friday 17 January 2025 04:11:22 -0500 (0:00:00.046) 0:01:15.141 ******** ok: [managed-node1] => { "changed": false } MSG: All assertions passed TASK [Verify mount_options] **************************************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:33 Friday 17 January 2025 04:11:22 -0500 (0:00:00.044) 0:01:15.185 ******** skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify fingerprint] ****************************************************** task path: 
/tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:45 Friday 17 January 2025 04:11:22 -0500 (0:00:00.036) 0:01:15.222 ******** ok: [managed-node1] => { "changed": false } MSG: All assertions passed TASK [Clean up variables] ****************************************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:52 Friday 17 January 2025 04:11:22 -0500 (0:00:00.045) 0:01:15.267 ******** ok: [managed-node1] => { "ansible_facts": { "storage_test_fstab_expected_id_matches": null, "storage_test_fstab_expected_mount_options_matches": null, "storage_test_fstab_expected_mount_point_matches": null, "storage_test_fstab_id_matches": null, "storage_test_fstab_mount_options_matches": null, "storage_test_fstab_mount_point_matches": null }, "changed": false } TASK [Verify fs type] ********************************************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fs.yml:6 Friday 17 January 2025 04:11:22 -0500 (0:00:00.037) 0:01:15.305 ******** ok: [managed-node1] => { "changed": false } MSG: All assertions passed TASK [Verify fs label] ********************************************************* task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fs.yml:14 Friday 17 January 2025 04:11:22 -0500 (0:00:00.051) 0:01:15.357 ******** ok: [managed-node1] => { "changed": false } MSG: All assertions passed TASK [See whether the device node is present] ********************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:3 Friday 17 January 2025 04:11:22 -0500 (0:00:00.053) 0:01:15.411 ******** ok: [managed-node1] => { "changed": false, "stat": { "atime": 1737105073.8549068, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 0, "charset": "binary", "ctime": 1737105073.8549068, "dev": 5, "device_type": 2048, "executable": false, "exists": true, "gid": 6, "gr_name": "disk", "inode": 28267, "isblk": true, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": false, "issock": false, "isuid": false, "mimetype": "inode/blockdevice", "mode": "0660", "mtime": 1737105073.8549068, "nlink": 1, "path": "/dev/sda", "pw_name": "root", "readable": true, "rgrp": true, "roth": false, "rusr": true, "size": 0, "uid": 0, "version": null, "wgrp": true, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [Verify the presence/absence of the device node] ************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:9 Friday 17 January 2025 04:11:23 -0500 (0:00:00.321) 0:01:15.733 ******** ok: [managed-node1] => { "changed": false } MSG: All assertions passed TASK [Verify the presence/absence of the device node] ************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:16 Friday 17 January 2025 04:11:23 -0500 (0:00:00.047) 0:01:15.780 ******** skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Make sure we got info about this volume] ********************************* task path: 
/tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:23
Friday 17 January 2025 04:11:23 -0500 (0:00:00.036) 0:01:15.817 ********
ok: [managed-node1] => { "changed": false }
MSG: All assertions passed

TASK [Process volume type (set initial value) (1/2)] ***************************
task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:29
Friday 17 January 2025 04:11:23 -0500 (0:00:00.056) 0:01:15.873 ********
ok: [managed-node1] => { "ansible_facts": { "st_volume_type": "disk" }, "changed": false }

TASK [Process volume type (get RAID value) (2/2)] ******************************
task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:33
Friday 17 January 2025 04:11:23 -0500 (0:00:00.040) 0:01:15.914 ********
skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Verify the volume's device type] *****************************************
task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:38
Friday 17 January 2025 04:11:23 -0500 (0:00:00.036) 0:01:15.950 ********
ok: [managed-node1] => { "changed": false }
MSG: All assertions passed

TASK [Stat the LUKS device, if encrypted] **************************************
task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:3
Friday 17 January 2025 04:11:23 -0500 (0:00:00.042) 0:01:15.992 ********
ok: [managed-node1] => { "changed": false, "stat": { "atime": 1737105073.9749067, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 0, "charset": "binary", "ctime": 1737105073.9749067, "dev": 5, "device_type": 64768, "executable": false, "exists": true, "gid": 6, "gr_name": "disk", "inode": 42615, "isblk": true, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": false, "issock": false, "isuid": false, "mimetype": "inode/symlink", "mode": "0660", "mtime": 1737105073.9749067, "nlink": 1, "path": "/dev/mapper/luks-688ee6a7-12e5-4393-bfd5-48ed7d374b8d", "pw_name": "root", "readable": true, "rgrp": true, "roth": false, "rusr": true, "size": 0, "uid": 0, "version": null, "wgrp": true, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } }

TASK [Ensure cryptsetup is present] ********************************************
task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:10
Friday 17 January 2025 04:11:23 -0500 (0:00:00.328) 0:01:16.321 ********
ok: [managed-node1] => { "changed": false, "rc": 0, "results": [ "cryptsetup-2.0.3-6.el7.x86_64 providing cryptsetup is already installed" ] }
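Note: the next task inspects the on-disk header with cryptsetup luksDump. To reproduce that check by hand in a play of your own, a sketch like the following would work (hypothetical task names; assumes the same device path):

    - name: Dump the LUKS header
      command: cryptsetup luksDump /dev/sda
      register: luks_dump
      changed_when: false

    - name: Assert that the header is LUKS2
      assert:
        that:
          - luks_dump.stdout is search('Version:\s+2')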
TASK [Collect LUKS info for this volume] ***************************************
task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:16
Friday 17 January 2025 04:11:24 -0500 (0:00:00.614) 0:01:16.936 ********
ok: [managed-node1] => { "changed": false, "cmd": [ "cryptsetup", "luksDump", "/dev/sda" ], "delta": "0:00:00.651096", "end": "2025-01-17 04:11:25.157244", "rc": 0, "start": "2025-01-17 04:11:24.506148" }

STDOUT:

LUKS header information
Version: 2
Epoch: 3
Metadata area: 12288 bytes
UUID: 688ee6a7-12e5-4393-bfd5-48ed7d374b8d
Label: (no label)
Subsystem: (no subsystem)
Flags: (no flags)

Data segments:
  0: crypt
     offset: 4194304 [bytes]
     length: (whole device)
     cipher: aes-xts-plain64
     sector: 512 [bytes]

Keyslots:
  0: luks2
     Key: 512 bits
     Priority: normal
     Cipher: aes-xts-plain64
     PBKDF: argon2i
     Time cost: 4
     Memory: 667079
     Threads: 2
     Salt: d7 73 93 31 af 08 fc 5b 32 a2 f6 d7 97 8e 20 9c da 37 93 07 07 1a 4f b0 49 7b 3f ac 5d af 3b 38
     AF stripes: 4000
     Area offset:32768 [bytes]
     Area length:258048 [bytes]
     Digest ID: 0
Tokens:
Digests:
  0: pbkdf2
     Hash: sha256
     Iterations: 22882
     Salt: e1 ad d6 52 d1 92 c3 6a c7 b9 d1 47 24 18 90 b2 74 a9 5d d2 31 43 cc d7 d6 8c 4b e0 9c d7 69 97
     Digest: fe d3 34 9e f7 66 27 12 83 86 3a dd 9e 22 ee 16 11 d5 05 9d d4 88 32 d7 48 4c d9 22 ae 39 6f 3e

TASK [Verify the presence/absence of the LUKS device node] *********************
task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:22
Friday 17 January 2025 04:11:25 -0500 (0:00:01.016) 0:01:17.952 ********
ok: [managed-node1] => { "changed": false }
MSG: All assertions passed

TASK [Verify that the raw device is the same as the device if not encrypted] ***
task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:29
Friday 17 January 2025 04:11:25 -0500 (0:00:00.069) 0:01:18.022 ********
ok: [managed-node1] => { "changed": false }
MSG: All assertions passed

TASK [Make sure we got info about the LUKS volume if encrypted] ****************
task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:40
Friday 17 January 2025 04:11:25 -0500 (0:00:00.076) 0:01:18.099 ********
ok: [managed-node1] => { "changed": false }
MSG: All assertions passed

TASK [Verify the LUKS volume's device type if encrypted] ***********************
task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:46
Friday 17 January 2025 04:11:25 -0500 (0:00:00.069) 0:01:18.169 ********
ok: [managed-node1] => { "changed": false }
MSG: All assertions passed

TASK [Check LUKS version] ******************************************************
task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:51
Friday 17 January 2025 04:11:25 -0500 (0:00:00.066) 0:01:18.235 ********
ok: [managed-node1] => { "changed": false }
MSG: All assertions passed

TASK [Check LUKS key size] *****************************************************
task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:63
Friday 17 January 2025 04:11:25 -0500 (0:00:00.074) 0:01:18.309 ********
skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Check LUKS cipher] *******************************************************
task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:75
Friday 17 January 2025 04:11:25 -0500 (0:00:00.057) 0:01:18.368 ********
skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Set test variables] ******************************************************
task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:87
Friday 17 January 2025 04:11:25 -0500 (0:00:00.058) 0:01:18.426 ********
ok: [managed-node1] => {
"ansible_facts": { "_storage_test_crypttab_entries": [ "luks-688ee6a7-12e5-4393-bfd5-48ed7d374b8d /dev/sda -" ], "_storage_test_expected_crypttab_entries": "1", "_storage_test_expected_crypttab_key_file": "-" }, "changed": false } TASK [Check for /etc/crypttab entry] ******************************************* task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:93 Friday 17 January 2025 04:11:25 -0500 (0:00:00.070) 0:01:18.497 ******** ok: [managed-node1] => { "changed": false } MSG: All assertions passed TASK [Validate the format of the crypttab entry] ******************************* task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:100 Friday 17 January 2025 04:11:25 -0500 (0:00:00.067) 0:01:18.564 ******** ok: [managed-node1] => { "changed": false } MSG: All assertions passed TASK [Check backing device of crypttab entry] ********************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:108 Friday 17 January 2025 04:11:25 -0500 (0:00:00.065) 0:01:18.630 ******** ok: [managed-node1] => { "changed": false } MSG: All assertions passed TASK [Check key file of crypttab entry] **************************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:116 Friday 17 January 2025 04:11:25 -0500 (0:00:00.057) 0:01:18.687 ******** ok: [managed-node1] => { "changed": false } MSG: All assertions passed TASK [Clear test variables] **************************************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:124 Friday 17 January 2025 04:11:26 -0500 (0:00:00.053) 0:01:18.741 ******** ok: [managed-node1] => { "ansible_facts": { "_storage_test_crypttab_entries": null, "_storage_test_expected_crypttab_entries": null, "_storage_test_expected_crypttab_key_file": null }, "changed": false } TASK [Get information about RAID] ********************************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:8 Friday 17 January 2025 04:11:26 -0500 (0:00:00.044) 0:01:18.785 ******** skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set active devices regex] ************************************************ task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:14 Friday 17 January 2025 04:11:26 -0500 (0:00:00.043) 0:01:18.829 ******** skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set spare devices regex] ************************************************* task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:19 Friday 17 January 2025 04:11:26 -0500 (0:00:00.036) 0:01:18.866 ******** skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set md version regex] **************************************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:24 Friday 17 January 2025 04:11:26 -0500 (0:00:00.035) 0:01:18.902 ******** skipping: [managed-node1] 
=> { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set chunk size regex] **************************************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:29 Friday 17 January 2025 04:11:26 -0500 (0:00:00.040) 0:01:18.942 ******** skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Parse the chunk size] **************************************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:37 Friday 17 January 2025 04:11:26 -0500 (0:00:00.044) 0:01:18.986 ******** skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check RAID active devices count] ***************************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:46 Friday 17 January 2025 04:11:26 -0500 (0:00:00.035) 0:01:19.022 ******** skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check RAID spare devices count] ****************************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:54 Friday 17 January 2025 04:11:26 -0500 (0:00:00.038) 0:01:19.060 ******** skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check RAID metadata version] ********************************************* task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:62 Friday 17 January 2025 04:11:26 -0500 (0:00:00.036) 0:01:19.097 ******** skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check RAID chunk size] *************************************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:70 Friday 17 January 2025 04:11:26 -0500 (0:00:00.036) 0:01:19.133 ******** skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Parse the actual size of the volume] ************************************* task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:3 Friday 17 January 2025 04:11:26 -0500 (0:00:00.035) 0:01:19.168 ******** skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Parse the requested size of the volume] ********************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:11 Friday 17 January 2025 04:11:26 -0500 (0:00:00.039) 0:01:19.208 ******** skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Establish base value for expected size] ********************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:20 Friday 17 January 2025 04:11:26 -0500 (0:00:00.038) 0:01:19.247 ******** skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Show expected size] ****************************************************** task path: 
/tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:28 Friday 17 January 2025 04:11:26 -0500 (0:00:00.041) 0:01:19.289 ******** ok: [managed-node1] => { "storage_test_expected_size": "VARIABLE IS NOT DEFINED!: 'storage_test_expected_size' is undefined" } TASK [Get the size of parent/pool device] ************************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:32 Friday 17 January 2025 04:11:26 -0500 (0:00:00.039) 0:01:19.328 ******** skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Show test pool] ********************************************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:46 Friday 17 January 2025 04:11:26 -0500 (0:00:00.040) 0:01:19.368 ******** skipping: [managed-node1] => {} TASK [Show test blockinfo] ***************************************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:50 Friday 17 January 2025 04:11:26 -0500 (0:00:00.039) 0:01:19.408 ******** skipping: [managed-node1] => {} TASK [Show test pool size] ***************************************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:54 Friday 17 January 2025 04:11:26 -0500 (0:00:00.039) 0:01:19.447 ******** skipping: [managed-node1] => {} TASK [Calculate the expected size based on pool size and percentage value] ***** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:58 Friday 17 January 2025 04:11:26 -0500 (0:00:00.038) 0:01:19.485 ******** skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Default thin pool reserved space values] ********************************* task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:67 Friday 17 January 2025 04:11:26 -0500 (0:00:00.087) 0:01:19.573 ******** skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Default minimal thin pool reserved space size] *************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:71 Friday 17 January 2025 04:11:26 -0500 (0:00:00.036) 0:01:19.610 ******** skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Default maximal thin pool reserved space size] *************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:76 Friday 17 January 2025 04:11:26 -0500 (0:00:00.037) 0:01:19.647 ******** skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Calculate maximum usable space in thin pool] ***************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:82 Friday 17 January 2025 04:11:26 -0500 (0:00:00.035) 0:01:19.683 ******** skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Apply upper size limit to max usable thin pool space] 
******************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:86 Friday 17 January 2025 04:11:27 -0500 (0:00:00.036) 0:01:19.720 ******** skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Apply lower size limit to max usable thin pool space] ******************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:91 Friday 17 January 2025 04:11:27 -0500 (0:00:00.037) 0:01:19.757 ******** skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Convert maximum usable thin pool space from int to Size] ***************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:96 Friday 17 January 2025 04:11:27 -0500 (0:00:00.038) 0:01:19.796 ******** skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Show max thin pool size] ************************************************* task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:101 Friday 17 January 2025 04:11:27 -0500 (0:00:00.037) 0:01:19.833 ******** skipping: [managed-node1] => {} TASK [Show volume thin pool size] ********************************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:105 Friday 17 January 2025 04:11:27 -0500 (0:00:00.037) 0:01:19.870 ******** skipping: [managed-node1] => {} TASK [Show test volume size] *************************************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:109 Friday 17 January 2025 04:11:27 -0500 (0:00:00.037) 0:01:19.908 ******** skipping: [managed-node1] => {} TASK [Establish base value for expected thin pool size] ************************ task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:113 Friday 17 January 2025 04:11:27 -0500 (0:00:00.035) 0:01:19.943 ******** skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Calculate the expected size based on pool size and percentage value] ***** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:120 Friday 17 January 2025 04:11:27 -0500 (0:00:00.044) 0:01:19.988 ******** skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Establish base value for expected thin pool volume size] ***************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:127 Friday 17 January 2025 04:11:27 -0500 (0:00:00.043) 0:01:20.032 ******** skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Calculate the expected thin pool volume size based on percentage value] *** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:131 Friday 17 January 2025 04:11:27 -0500 (0:00:00.036) 0:01:20.068 ******** skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Replace expected volume size with calculated value] 
********************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:137 Friday 17 January 2025 04:11:27 -0500 (0:00:00.036) 0:01:20.105 ******** skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Show actual size] ******************************************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:143 Friday 17 January 2025 04:11:27 -0500 (0:00:00.036) 0:01:20.142 ******** ok: [managed-node1] => { "storage_test_actual_size": { "changed": false, "skip_reason": "Conditional result was False", "skipped": true } } TASK [Show expected size] ****************************************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:147 Friday 17 January 2025 04:11:27 -0500 (0:00:00.041) 0:01:20.183 ******** ok: [managed-node1] => { "storage_test_expected_size": "VARIABLE IS NOT DEFINED!: 'storage_test_expected_size' is undefined" } TASK [Assert expected size is actual size] ************************************* task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:151 Friday 17 January 2025 04:11:27 -0500 (0:00:00.039) 0:01:20.223 ******** skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Get information about the LV] ******************************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:5 Friday 17 January 2025 04:11:27 -0500 (0:00:00.041) 0:01:20.265 ******** skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set LV segment type] ***************************************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:13 Friday 17 January 2025 04:11:27 -0500 (0:00:00.037) 0:01:20.302 ******** skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check segment type] ****************************************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:17 Friday 17 January 2025 04:11:27 -0500 (0:00:00.036) 0:01:20.339 ******** skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set LV cache size] ******************************************************* task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:24 Friday 17 January 2025 04:11:27 -0500 (0:00:00.037) 0:01:20.377 ******** skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Parse the requested cache size] ****************************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:31 Friday 17 January 2025 04:11:27 -0500 (0:00:00.037) 0:01:20.415 ******** skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set expected cache size] ************************************************* task path: 
/tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:37 Friday 17 January 2025 04:11:27 -0500 (0:00:00.037) 0:01:20.452 ******** skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check cache size] ******************************************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:42 Friday 17 January 2025 04:11:27 -0500 (0:00:00.040) 0:01:20.492 ******** skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Clean up facts] ********************************************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume.yml:25 Friday 17 January 2025 04:11:27 -0500 (0:00:00.036) 0:01:20.529 ******** ok: [managed-node1] => { "ansible_facts": { "_storage_test_volume_present": null }, "changed": false } TASK [Clean up variable namespace] ********************************************* task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:54 Friday 17 January 2025 04:11:27 -0500 (0:00:00.038) 0:01:20.568 ******** ok: [managed-node1] => { "ansible_facts": { "storage_test_blkinfo": null, "storage_test_crypttab": null, "storage_test_fstab": null }, "changed": false } TASK [Create a file] *********************************************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/create-test-file.yml:12 Friday 17 January 2025 04:11:27 -0500 (0:00:00.038) 0:01:20.606 ******** changed: [managed-node1] => { "changed": true, "dest": "/opt/test1/quux", "gid": 0, "group": "root", "mode": "0644", "owner": "root", "secontext": "unconfined_u:object_r:unlabeled_t:s0", "size": 0, "state": "file", "uid": 0 } TASK [Test for correct handling of safe_mode] ********************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/tests_luks2.yml:120 Friday 17 January 2025 04:11:28 -0500 (0:00:00.511) 0:01:21.117 ******** included: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-failed.yml for managed-node1 TASK [Store global variable value copy] **************************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-failed.yml:4 Friday 17 January 2025 04:11:28 -0500 (0:00:00.076) 0:01:21.194 ******** ok: [managed-node1] => { "ansible_facts": { "storage_pools_global": [], "storage_safe_mode_global": true, "storage_volumes_global": [] }, "changed": false } TASK [Verify role raises correct error] **************************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-failed.yml:10 Friday 17 January 2025 04:11:28 -0500 (0:00:00.044) 0:01:21.239 ******** TASK [fedora.linux_system_roles.storage : Set platform/version specific variables] *** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:2 Friday 17 January 2025 04:11:28 -0500 (0:00:00.056) 0:01:21.295 ******** included: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml for managed-node1 TASK [fedora.linux_system_roles.storage : Ensure ansible_facts used by role] *** task path: 
/tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:2 Friday 17 January 2025 04:11:28 -0500 (0:00:00.062) 0:01:21.358 ******** skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Set platform/version specific variables] *** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:7 Friday 17 January 2025 04:11:28 -0500 (0:00:00.045) 0:01:21.403 ******** skipping: [managed-node1] => (item=RedHat.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat.yml", "skip_reason": "Conditional result was False" } skipping: [managed-node1] => (item=CentOS.yml) => { "ansible_loop_var": "item", "changed": false, "item": "CentOS.yml", "skip_reason": "Conditional result was False" } ok: [managed-node1] => (item=CentOS_7.yml) => { "ansible_facts": { "__storage_blivet_diskvolume_mkfs_option_map": { "ext2": "-F", "ext3": "-F", "ext4": "-F" }, "blivet_package_list": [ "python-enum34", "python-blivet3", "libblockdev-crypto", "libblockdev-dm", "libblockdev-lvm", "libblockdev-mdraid", "libblockdev-swap", "{{ 'libblockdev-s390' if ansible_architecture == 's390x' else 'libblockdev' }}" ] }, "ansible_included_var_files": [ "/tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/vars/CentOS_7.yml" ], "ansible_loop_var": "item", "changed": false, "item": "CentOS_7.yml" } skipping: [managed-node1] => (item=CentOS_7.9.yml) => { "ansible_loop_var": "item", "changed": false, "item": "CentOS_7.9.yml", "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Check if system is ostree] *********** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:25 Friday 17 January 2025 04:11:28 -0500 (0:00:00.090) 0:01:21.494 ******** skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Set flag to indicate system is ostree] *** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:30 Friday 17 January 2025 04:11:28 -0500 (0:00:00.036) 0:01:21.530 ******** skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Define an empty list of pools to be used in testing] *** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:5 Friday 17 January 2025 04:11:28 -0500 (0:00:00.036) 0:01:21.567 ******** ok: [managed-node1] => { "ansible_facts": { "_storage_pools_list": [] }, "changed": false } TASK [fedora.linux_system_roles.storage : Define an empty list of volumes to be used in testing] *** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:9 Friday 17 January 2025 04:11:28 -0500 (0:00:00.039) 0:01:21.606 ******** ok: [managed-node1] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false } TASK [fedora.linux_system_roles.storage : Include the appropriate provider tasks] *** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:13 Friday 17 January 2025 04:11:28 -0500 (0:00:00.038) 0:01:21.644 ******** included: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml for managed-node1 
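For context, the "Verify role raises correct error" block running here re-invokes the storage role against the existing LUKS2 volume with safe mode enabled (the test stored storage_safe_mode_global: true above) while asking for the encryption layer to be removed. A minimal sketch of the invocation under test, assuming it uses the same include_role pattern as the rest of tests_luks2.yml; the variable values are taken verbatim from the Show storage_volumes output below, but the exact task layout is an assumption, not shown in this log:

- name: Attempt to remove encryption in safe mode (expected to fail)
  include_role:
    name: fedora.linux_system_roles.storage
  vars:
    # storage_safe_mode defaults to true; it forbids destructive reformatting
    storage_safe_mode: true
    storage_pools: []
    storage_volumes:
      - name: foo
        type: disk
        disks:
          - sda
        mount_point: /opt/test1
        # the device currently carries LUKS2; requesting encryption: false
        # means the LUKS layer and its formatting would have to be destroyed
        encryption: false
        encryption_luks_version: luks2
        encryption_password: yabbadabbadoo

Because honoring encryption: false would wipe the existing LUKS formatting on sda, blivet refuses while safe mode is in effect, and the role is expected to fail with the "cannot remove existing formatting ... in safe mode due to encryption removal" message that the assertions further below check for.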
TASK [fedora.linux_system_roles.storage : Make sure blivet is available] ******* task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:2 Friday 17 January 2025 04:11:29 -0500 (0:00:00.086) 0:01:21.730 ******** ok: [managed-node1] => { "changed": false, "rc": 0, "results": [ "python-enum34-1.0.4-1.el7.noarch providing python-enum34 is already installed", "1:python2-blivet3-3.1.3-3.el7.noarch providing python-blivet3 is already installed", "libblockdev-crypto-2.18-5.el7.x86_64 providing libblockdev-crypto is already installed", "libblockdev-dm-2.18-5.el7.x86_64 providing libblockdev-dm is already installed", "libblockdev-lvm-2.18-5.el7.x86_64 providing libblockdev-lvm is already installed", "libblockdev-mdraid-2.18-5.el7.x86_64 providing libblockdev-mdraid is already installed", "libblockdev-swap-2.18-5.el7.x86_64 providing libblockdev-swap is already installed", "libblockdev-2.18-5.el7.x86_64 providing libblockdev is already installed" ] } TASK [fedora.linux_system_roles.storage : Show storage_pools] ****************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:9 Friday 17 January 2025 04:11:32 -0500 (0:00:03.826) 0:01:25.557 ******** ok: [managed-node1] => { "storage_pools": [] } TASK [fedora.linux_system_roles.storage : Show storage_volumes] **************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:14 Friday 17 January 2025 04:11:32 -0500 (0:00:00.045) 0:01:25.603 ******** ok: [managed-node1] => { "storage_volumes": [ { "disks": [ "sda" ], "encryption": false, "encryption_luks_version": "luks2", "encryption_password": "yabbadabbadoo", "mount_point": "/opt/test1", "name": "foo", "type": "disk" } ] } TASK [fedora.linux_system_roles.storage : Get required packages] *************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:19 Friday 17 January 2025 04:11:32 -0500 (0:00:00.045) 0:01:25.648 ******** ok: [managed-node1] => { "actions": [], "changed": false, "crypts": [], "leaves": [], "mounts": [], "packages": [], "pools": [], "volumes": [] } TASK [fedora.linux_system_roles.storage : Enable copr repositories if needed] *** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:31 Friday 17 January 2025 04:11:36 -0500 (0:00:03.821) 0:01:29.469 ******** included: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml for managed-node1 TASK [fedora.linux_system_roles.storage : Check if the COPR support packages should be installed] *** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:2 Friday 17 January 2025 04:11:36 -0500 (0:00:00.105) 0:01:29.575 ******** TASK [fedora.linux_system_roles.storage : Make sure COPR support packages are present] *** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:13 Friday 17 January 2025 04:11:36 -0500 (0:00:00.078) 0:01:29.654 ******** skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Enable COPRs] ************************ task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:19 Friday 17 January 2025 04:11:37 
-0500 (0:00:00.076) 0:01:29.730 ******** TASK [fedora.linux_system_roles.storage : Make sure required packages are installed] *** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:37 Friday 17 January 2025 04:11:37 -0500 (0:00:00.118) 0:01:29.849 ******** ok: [managed-node1] => { "changed": false, "rc": 0, "results": [ "kpartx-0.4.9-136.el7_9.x86_64 providing kpartx is already installed" ] } TASK [fedora.linux_system_roles.storage : Get service facts] ******************* task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:51 Friday 17 January 2025 04:11:37 -0500 (0:00:00.774) 0:01:30.624 ******** ok: [managed-node1] => { "ansible_facts": { "services": { "NetworkManager-dispatcher.service": { "name": "NetworkManager-dispatcher.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "NetworkManager-wait-online.service": { "name": "NetworkManager-wait-online.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "NetworkManager.service": { "name": "NetworkManager.service", "source": "systemd", "state": "running", "status": "enabled" }, "arp-ethers.service": { "name": "arp-ethers.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "auditd.service": { "name": "auditd.service", "source": "systemd", "state": "running", "status": "enabled" }, "auth-rpcgss-module.service": { "name": "auth-rpcgss-module.service", "source": "systemd", "state": "stopped", "status": "static" }, "autovt@.service": { "name": "autovt@.service", "source": "systemd", "state": "unknown", "status": "enabled" }, "blivet.service": { "name": "blivet.service", "source": "systemd", "state": "inactive", "status": "static" }, "blk-availability.service": { "name": "blk-availability.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "brandbot.service": { "name": "brandbot.service", "source": "systemd", "state": "inactive", "status": "static" }, "chrony-dnssrv@.service": { "name": "chrony-dnssrv@.service", "source": "systemd", "state": "unknown", "status": "static" }, "chrony-wait.service": { "name": "chrony-wait.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "chronyd.service": { "name": "chronyd.service", "source": "systemd", "state": "running", "status": "enabled" }, "cloud-config.service": { "name": "cloud-config.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-final.service": { "name": "cloud-final.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-init-local.service": { "name": "cloud-init-local.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-init.service": { "name": "cloud-init.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "console-getty.service": { "name": "console-getty.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "console-shell.service": { "name": "console-shell.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "container-getty@.service": { "name": "container-getty@.service", "source": "systemd", "state": "unknown", "status": "static" }, "cpupower.service": { "name": "cpupower.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "crond.service": { "name": "crond.service", "source": "systemd", "state": "running", "status": "enabled" }, "dbus-org.freedesktop.hostname1.service": { "name": 
"dbus-org.freedesktop.hostname1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus-org.freedesktop.import1.service": { "name": "dbus-org.freedesktop.import1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus-org.freedesktop.locale1.service": { "name": "dbus-org.freedesktop.locale1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus-org.freedesktop.login1.service": { "name": "dbus-org.freedesktop.login1.service", "source": "systemd", "state": "active", "status": "static" }, "dbus-org.freedesktop.machine1.service": { "name": "dbus-org.freedesktop.machine1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus-org.freedesktop.nm-dispatcher.service": { "name": "dbus-org.freedesktop.nm-dispatcher.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "dbus-org.freedesktop.timedate1.service": { "name": "dbus-org.freedesktop.timedate1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus.service": { "name": "dbus.service", "source": "systemd", "state": "running", "status": "static" }, "debug-shell.service": { "name": "debug-shell.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "dm-event.service": { "name": "dm-event.service", "source": "systemd", "state": "stopped", "status": "static" }, "dmraid-activation.service": { "name": "dmraid-activation.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "dracut-cmdline.service": { "name": "dracut-cmdline.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-initqueue.service": { "name": "dracut-initqueue.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-mount.service": { "name": "dracut-mount.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-mount.service": { "name": "dracut-pre-mount.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-pivot.service": { "name": "dracut-pre-pivot.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-trigger.service": { "name": "dracut-pre-trigger.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-udev.service": { "name": "dracut-pre-udev.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-shutdown.service": { "name": "dracut-shutdown.service", "source": "systemd", "state": "stopped", "status": "static" }, "ebtables.service": { "name": "ebtables.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "emergency.service": { "name": "emergency.service", "source": "systemd", "state": "stopped", "status": "static" }, "firewalld.service": { "name": "firewalld.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "fstrim.service": { "name": "fstrim.service", "source": "systemd", "state": "inactive", "status": "static" }, "getty@.service": { "name": "getty@.service", "source": "systemd", "state": "unknown", "status": "enabled" }, "getty@tty1.service": { "name": "getty@tty1.service", "source": "systemd", "state": "running", "status": "unknown" }, "gssproxy.service": { "name": "gssproxy.service", "source": "systemd", "state": "running", "status": "disabled" }, "halt-local.service": { "name": "halt-local.service", "source": "systemd", "state": "inactive", "status": "static" }, "initrd-cleanup.service": { "name": "initrd-cleanup.service", "source": "systemd", "state": 
"stopped", "status": "static" }, "initrd-parse-etc.service": { "name": "initrd-parse-etc.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-switch-root.service": { "name": "initrd-switch-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-udevadm-cleanup-db.service": { "name": "initrd-udevadm-cleanup-db.service", "source": "systemd", "state": "stopped", "status": "static" }, "iprdump.service": { "name": "iprdump.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "iprinit.service": { "name": "iprinit.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "iprupdate.service": { "name": "iprupdate.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "irqbalance.service": { "name": "irqbalance.service", "source": "systemd", "state": "running", "status": "enabled" }, "kdump.service": { "name": "kdump.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "kmod-static-nodes.service": { "name": "kmod-static-nodes.service", "source": "systemd", "state": "stopped", "status": "static" }, "lvm2-lvmetad.service": { "name": "lvm2-lvmetad.service", "source": "systemd", "state": "running", "status": "static" }, "lvm2-lvmpolld.service": { "name": "lvm2-lvmpolld.service", "source": "systemd", "state": "stopped", "status": "static" }, "lvm2-monitor.service": { "name": "lvm2-monitor.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "lvm2-pvscan@.service": { "name": "lvm2-pvscan@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdadm-grow-continue@.service": { "name": "mdadm-grow-continue@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdadm-last-resort@.service": { "name": "mdadm-last-resort@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdcheck_continue.service": { "name": "mdcheck_continue.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdcheck_start.service": { "name": "mdcheck_start.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdmon@.service": { "name": "mdmon@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdmonitor-oneshot.service": { "name": "mdmonitor-oneshot.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdmonitor.service": { "name": "mdmonitor.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "messagebus.service": { "name": "messagebus.service", "source": "systemd", "state": "active", "status": "static" }, "microcode.service": { "name": "microcode.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "netconsole": { "name": "netconsole", "source": "sysv", "state": "stopped", "status": "disabled" }, "network": { "name": "network", "source": "sysv", "state": "running", "status": "enabled" }, "network.service": { "name": "network.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "nfs-blkmap.service": { "name": "nfs-blkmap.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfs-config.service": { "name": "nfs-config.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-idmap.service": { "name": "nfs-idmap.service", "source": "systemd", "state": "inactive", "status": "static" }, "nfs-idmapd.service": { "name": "nfs-idmapd.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-lock.service": { "name": 
"nfs-lock.service", "source": "systemd", "state": "inactive", "status": "static" }, "nfs-mountd.service": { "name": "nfs-mountd.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-rquotad.service": { "name": "nfs-rquotad.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfs-secure.service": { "name": "nfs-secure.service", "source": "systemd", "state": "inactive", "status": "static" }, "nfs-server.service": { "name": "nfs-server.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "nfs-utils.service": { "name": "nfs-utils.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs.service": { "name": "nfs.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfslock.service": { "name": "nfslock.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-halt.service": { "name": "plymouth-halt.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "plymouth-kexec.service": { "name": "plymouth-kexec.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "plymouth-poweroff.service": { "name": "plymouth-poweroff.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "plymouth-quit-wait.service": { "name": "plymouth-quit-wait.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "plymouth-quit.service": { "name": "plymouth-quit.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "plymouth-read-write.service": { "name": "plymouth-read-write.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "plymouth-reboot.service": { "name": "plymouth-reboot.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "plymouth-start.service": { "name": "plymouth-start.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "plymouth-switch-root.service": { "name": "plymouth-switch-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "polkit.service": { "name": "polkit.service", "source": "systemd", "state": "running", "status": "static" }, "postfix.service": { "name": "postfix.service", "source": "systemd", "state": "running", "status": "enabled" }, "qemu-guest-agent.service": { "name": "qemu-guest-agent.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "quotaon.service": { "name": "quotaon.service", "source": "systemd", "state": "inactive", "status": "static" }, "rc-local.service": { "name": "rc-local.service", "source": "systemd", "state": "stopped", "status": "static" }, "rdisc.service": { "name": "rdisc.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rescue.service": { "name": "rescue.service", "source": "systemd", "state": "stopped", "status": "static" }, "restraintd.service": { "name": "restraintd.service", "source": "systemd", "state": "running", "status": "enabled" }, "rhel-autorelabel-mark.service": { "name": "rhel-autorelabel-mark.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "rhel-autorelabel.service": { "name": "rhel-autorelabel.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "rhel-configure.service": { "name": "rhel-configure.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "rhel-dmesg.service": { "name": "rhel-dmesg.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "rhel-domainname.service": { "name": 
"rhel-domainname.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "rhel-import-state.service": { "name": "rhel-import-state.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "rhel-loadmodules.service": { "name": "rhel-loadmodules.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "rhel-readonly.service": { "name": "rhel-readonly.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "rngd.service": { "name": "rngd.service", "source": "systemd", "state": "running", "status": "enabled" }, "rpc-gssd.service": { "name": "rpc-gssd.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-rquotad.service": { "name": "rpc-rquotad.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rpc-statd-notify.service": { "name": "rpc-statd-notify.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-statd.service": { "name": "rpc-statd.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpcbind.service": { "name": "rpcbind.service", "source": "systemd", "state": "running", "status": "enabled" }, "rpcgssd.service": { "name": "rpcgssd.service", "source": "systemd", "state": "inactive", "status": "static" }, "rpcidmapd.service": { "name": "rpcidmapd.service", "source": "systemd", "state": "inactive", "status": "static" }, "rsyncd.service": { "name": "rsyncd.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rsyncd@.service": { "name": "rsyncd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "rsyslog.service": { "name": "rsyslog.service", "source": "systemd", "state": "running", "status": "enabled" }, "selinux-policy-migrate-local-changes@.service": { "name": "selinux-policy-migrate-local-changes@.service", "source": "systemd", "state": "unknown", "status": "static" }, "selinux-policy-migrate-local-changes@targeted.service": { "name": "selinux-policy-migrate-local-changes@targeted.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "serial-getty@.service": { "name": "serial-getty@.service", "source": "systemd", "state": "unknown", "status": "disabled" }, "serial-getty@ttyS0.service": { "name": "serial-getty@ttyS0.service", "source": "systemd", "state": "running", "status": "unknown" }, "sshd-keygen.service": { "name": "sshd-keygen.service", "source": "systemd", "state": "stopped", "status": "static" }, "sshd.service": { "name": "sshd.service", "source": "systemd", "state": "running", "status": "enabled" }, "sshd@.service": { "name": "sshd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-ask-password-console.service": { "name": "systemd-ask-password-console.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-ask-password-plymouth.service": { "name": "systemd-ask-password-plymouth.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-ask-password-wall.service": { "name": "systemd-ask-password-wall.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-backlight@.service": { "name": "systemd-backlight@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-binfmt.service": { "name": "systemd-binfmt.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-bootchart.service": { "name": "systemd-bootchart.service", "source": "systemd", "state": "inactive", "status": "disabled" }, 
"systemd-firstboot.service": { "name": "systemd-firstboot.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-fsck-root.service": { "name": "systemd-fsck-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-fsck@.service": { "name": "systemd-fsck@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-halt.service": { "name": "systemd-halt.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hibernate-resume@.service": { "name": "systemd-hibernate-resume@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-hibernate.service": { "name": "systemd-hibernate.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hostnamed.service": { "name": "systemd-hostnamed.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hwdb-update.service": { "name": "systemd-hwdb-update.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-hybrid-sleep.service": { "name": "systemd-hybrid-sleep.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-importd.service": { "name": "systemd-importd.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-initctl.service": { "name": "systemd-initctl.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journal-catalog-update.service": { "name": "systemd-journal-catalog-update.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journal-flush.service": { "name": "systemd-journal-flush.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journald.service": { "name": "systemd-journald.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-kexec.service": { "name": "systemd-kexec.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-localed.service": { "name": "systemd-localed.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-logind.service": { "name": "systemd-logind.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-machine-id-commit.service": { "name": "systemd-machine-id-commit.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-machined.service": { "name": "systemd-machined.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-modules-load.service": { "name": "systemd-modules-load.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-nspawn@.service": { "name": "systemd-nspawn@.service", "source": "systemd", "state": "unknown", "status": "disabled" }, "systemd-poweroff.service": { "name": "systemd-poweroff.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-quotacheck.service": { "name": "systemd-quotacheck.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-random-seed.service": { "name": "systemd-random-seed.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-readahead-collect.service": { "name": "systemd-readahead-collect.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "systemd-readahead-done.service": { "name": "systemd-readahead-done.service", "source": "systemd", "state": "stopped", "status": "indirect" }, "systemd-readahead-drop.service": { "name": 
"systemd-readahead-drop.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "systemd-readahead-replay.service": { "name": "systemd-readahead-replay.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "systemd-reboot.service": { "name": "systemd-reboot.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-remount-fs.service": { "name": "systemd-remount-fs.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-rfkill@.service": { "name": "systemd-rfkill@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-shutdownd.service": { "name": "systemd-shutdownd.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-suspend.service": { "name": "systemd-suspend.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-sysctl.service": { "name": "systemd-sysctl.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-timedated.service": { "name": "systemd-timedated.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-tmpfiles-clean.service": { "name": "systemd-tmpfiles-clean.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles-setup-dev.service": { "name": "systemd-tmpfiles-setup-dev.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles-setup.service": { "name": "systemd-tmpfiles-setup.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udev-settle.service": { "name": "systemd-udev-settle.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udev-trigger.service": { "name": "systemd-udev-trigger.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udevd.service": { "name": "systemd-udevd.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-update-done.service": { "name": "systemd-update-done.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-update-utmp-runlevel.service": { "name": "systemd-update-utmp-runlevel.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-update-utmp.service": { "name": "systemd-update-utmp.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-user-sessions.service": { "name": "systemd-user-sessions.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-vconsole-setup.service": { "name": "systemd-vconsole-setup.service", "source": "systemd", "state": "stopped", "status": "static" }, "target.service": { "name": "target.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "targetclid.service": { "name": "targetclid.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "teamd@.service": { "name": "teamd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "tuned.service": { "name": "tuned.service", "source": "systemd", "state": "running", "status": "enabled" }, "wpa_supplicant.service": { "name": "wpa_supplicant.service", "source": "systemd", "state": "inactive", "status": "disabled" } } }, "changed": false } TASK [fedora.linux_system_roles.storage : Set storage_cryptsetup_services] ***** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:57 Friday 17 January 2025 04:11:39 -0500 (0:00:01.132) 0:01:31.756 ******** ok: 
[managed-node1] => { "ansible_facts": { "storage_cryptsetup_services": [] }, "changed": false } TASK [fedora.linux_system_roles.storage : Mask the systemd cryptsetup services] *** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:63 Friday 17 January 2025 04:11:39 -0500 (0:00:00.061) 0:01:31.818 ******** TASK [fedora.linux_system_roles.storage : Manage the pools and volumes to match the specified state] *** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:69 Friday 17 January 2025 04:11:39 -0500 (0:00:00.041) 0:01:31.860 ******** fatal: [managed-node1]: FAILED! => { "actions": [], "changed": false, "crypts": [], "leaves": [], "mounts": [], "packages": [], "pools": [], "volumes": [] } MSG: cannot remove existing formatting on device 'luks-688ee6a7-12e5-4393-bfd5-48ed7d374b8d' in safe mode due to encryption removal TASK [fedora.linux_system_roles.storage : Failed message] ********************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:109 Friday 17 January 2025 04:11:43 -0500 (0:00:04.141) 0:01:36.001 ******** fatal: [managed-node1]: FAILED! => { "changed": false } MSG: {u'_ansible_no_log': False, u'crypts': [], u'pools': [], u'leaves': [], u'changed': False, u'actions': [], u'failed': True, u'volumes': [], u'invocation': {u'module_args': {u'packages_only': False, u'disklabel_type': None, u'diskvolume_mkfs_option_map': {u'ext4': u'-F', u'ext3': u'-F', u'ext2': u'-F'}, u'safe_mode': True, u'pools': [], u'volumes': [{u'raid_metadata_version': None, u'raid_level': None, u'fs_type': u'xfs', u'mount_options': u'defaults', u'size': 10733223936, u'mount_point': u'/opt/test1', u'compression': None, u'encryption_password': u'VALUE_SPECIFIED_IN_NO_LOG_PARAMETER', u'encryption': False, u'mount_device_identifier': u'uuid', u'raid_device_count': None, u'state': u'present', u'vdo_pool_size': None, u'mount_mode': None, u'thin_pool_name': None, u'fs_overwrite_existing': True, u'encryption_key_size': 0, u'deduplication': None, u'encryption_cipher': None, u'encryption_key': None, u'fs_label': u'', u'encryption_luks_version': u'luks2', u'raid_stripe_size': None, u'cache_size': 0, u'mount_user': None, u'raid_spare_count': None, u'cache_mode': None, u'name': u'foo', u'mount_group': None, u'type': u'disk', u'disks': [u'sda'], u'cached': False, u'thin_pool_size': None, u'thin': None, u'mount_check': 0, u'mount_passno': 0, u'raid_chunk_size': None, u'cache_devices': [], u'fs_create_options': u''}], u'pool_defaults': {u'raid_metadata_version': None, u'encryption_cipher': None, u'encryption_key': None, u'encryption_luks_version': None, u'raid_spare_count': None, u'grow_to_fill': False, u'encryption_password': None, u'encryption': False, u'disks': [], u'raid_level': None, u'raid_device_count': None, u'state': u'present', u'volumes': [], u'shared': False, u'type': u'lvm', u'encryption_key_size': None, u'raid_chunk_size': None}, u'volume_defaults': {u'raid_metadata_version': None, u'raid_level': None, u'fs_type': u'xfs', u'mount_options': u'defaults', u'size': 0, u'mount_point': u'', u'compression': None, u'encryption_password': None, u'encryption': False, u'mount_device_identifier': u'uuid', u'raid_device_count': None, u'state': u'present', u'vdo_pool_size': None, u'thin_pool_name': None, u'type': u'lvm', u'encryption_key_size': None, u'encryption_cipher': None, u'encryption_key': None, u'fs_label': u'', 
u'encryption_luks_version': None, u'raid_stripe_size': None, u'cache_size': 0, u'raid_spare_count': None, u'cache_mode': None, u'deduplication': None, u'cached': False, u'fs_overwrite_existing': True, u'disks': [], u'thin_pool_size': None, u'thin': None, u'mount_check': 0, u'mount_passno': 0, u'raid_chunk_size': None, u'cache_devices': [], u'fs_create_options': u''}, u'use_partitions': None}}, u'mounts': [], u'packages': [], u'msg': u"cannot remove existing formatting on device 'luks-688ee6a7-12e5-4393-bfd5-48ed7d374b8d' in safe mode due to encryption removal"} TASK [fedora.linux_system_roles.storage : Unmask the systemd cryptsetup services] *** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:113 Friday 17 January 2025 04:11:43 -0500 (0:00:00.086) 0:01:36.087 ******** TASK [Check that we failed in the role] **************************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-failed.yml:23 Friday 17 January 2025 04:11:43 -0500 (0:00:00.075) 0:01:36.163 ******** ok: [managed-node1] => { "changed": false } MSG: All assertions passed TASK [Verify the blivet output and error message are correct] ****************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-failed.yml:28 Friday 17 January 2025 04:11:43 -0500 (0:00:00.076) 0:01:36.239 ******** ok: [managed-node1] => { "changed": false } MSG: All assertions passed TASK [Verify correct exception or error message] ******************************* task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-failed.yml:39 Friday 17 January 2025 04:11:43 -0500 (0:00:00.136) 0:01:36.375 ******** skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Stat the file] *********************************************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-data-preservation.yml:11 Friday 17 January 2025 04:11:43 -0500 (0:00:00.114) 0:01:36.490 ******** ok: [managed-node1] => { "changed": false, "stat": { "atime": 1737105088.3598983, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 0, "charset": "binary", "checksum": "da39a3ee5e6b4b0d3255bfef95601890afd80709", "ctime": 1737105088.3598983, "dev": 64768, "device_type": 0, "executable": false, "exists": true, "gid": 0, "gr_name": "root", "inode": 67, "isblk": false, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": true, "issock": false, "isuid": false, "mimetype": "inode/x-empty", "mode": "0644", "mtime": 1737105088.3598983, "nlink": 1, "path": "/opt/test1/quux", "pw_name": "root", "readable": true, "rgrp": true, "roth": true, "rusr": true, "size": 0, "uid": 0, "version": "18446744073260069219", "wgrp": false, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [Assert file presence] **************************************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-data-preservation.yml:16 Friday 17 January 2025 04:11:44 -0500 (0:00:00.649) 0:01:37.139 ******** ok: [managed-node1] => { "changed": false } MSG: All assertions passed
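The sequence just above is the test's expected-failure pattern: run the destructive request under safe mode, assert the role actually failed with the right blivet message, then prove nothing was destroyed by re-statting the marker file created before the attempt. A rough sketch of those checks, reconstructed from the task names in this log; the actual bodies of verify-role-failed.yml and verify-data-preservation.yml are not shown here, so the variable names blivet_output and __file_status are illustrative assumptions:

- name: Verify the blivet output and error message are correct
  assert:
    that:
      # the role must have failed, and for the expected reason
      - blivet_output.failed | bool
      - blivet_output.msg is search('cannot remove existing formatting')
    msg: Unexpected behavior when removing encryption in safe mode

- name: Stat the file
  stat:
    path: /opt/test1/quux
  register: __file_status

- name: Assert file presence
  assert:
    that:
      # the marker file surviving intact proves no reformat took place
      - __file_status.stat.exists
    msg: Data was not preserved

Had the role ignored safe mode and re-created the filesystem, /opt/test1/quux would be gone and the final assertion would fail instead of passing as it does here.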
TASK [Remove the encryption layer] ********************************************* task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/tests_luks2.yml:141 Friday 17 January 2025 04:11:44 -0500 (0:00:00.077) 0:01:37.217 ******** TASK [fedora.linux_system_roles.storage : Set platform/version specific variables] *** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:2 Friday 17 January 2025 04:11:44 -0500 (0:00:00.175) 0:01:37.393 ******** included: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml for managed-node1 TASK [fedora.linux_system_roles.storage : Ensure ansible_facts used by role] *** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:2 Friday 17 January 2025 04:11:44 -0500 (0:00:00.086) 0:01:37.479 ******** skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Set platform/version specific variables] *** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:7 Friday 17 January 2025 04:11:44 -0500 (0:00:00.067) 0:01:37.547 ******** skipping: [managed-node1] => (item=RedHat.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat.yml", "skip_reason": "Conditional result was False" } skipping: [managed-node1] => (item=CentOS.yml) => { "ansible_loop_var": "item", "changed": false, "item": "CentOS.yml", "skip_reason": "Conditional result was False" } ok: [managed-node1] => (item=CentOS_7.yml) => { "ansible_facts": { "__storage_blivet_diskvolume_mkfs_option_map": { "ext2": "-F", "ext3": "-F", "ext4": "-F" }, "blivet_package_list": [ "python-enum34", "python-blivet3", "libblockdev-crypto", "libblockdev-dm", "libblockdev-lvm", "libblockdev-mdraid", "libblockdev-swap", "{{ 'libblockdev-s390' if ansible_architecture == 's390x' else 'libblockdev' }}" ] }, "ansible_included_var_files": [ "/tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/vars/CentOS_7.yml" ], "ansible_loop_var": "item", "changed": false, "item": "CentOS_7.yml" } skipping: [managed-node1] => (item=CentOS_7.9.yml) => { "ansible_loop_var": "item", "changed": false, "item": "CentOS_7.9.yml", "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Check if system is ostree] *********** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:25 Friday 17 January 2025 04:11:44 -0500 (0:00:00.144) 0:01:37.692 ******** skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Set flag to indicate system is ostree] *** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:30 Friday 17 January 2025 04:11:45 -0500 (0:00:00.058) 0:01:37.750 ******** skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Define an empty list of pools to be used in testing] *** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:5 Friday 17 January 2025 04:11:45 -0500 (0:00:00.055) 0:01:37.806 ******** ok: [managed-node1] => { "ansible_facts": { "_storage_pools_list": [] }, "changed": false } TASK [fedora.linux_system_roles.storage : Define an empty list of volumes to be used in testing] *** task path:
/tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:9 Friday 17 January 2025 04:11:45 -0500 (0:00:00.057) 0:01:37.863 ******** ok: [managed-node1] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false } TASK [fedora.linux_system_roles.storage : Include the appropriate provider tasks] *** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:13 Friday 17 January 2025 04:11:45 -0500 (0:00:00.055) 0:01:37.919 ******** included: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml for managed-node1 TASK [fedora.linux_system_roles.storage : Make sure blivet is available] ******* task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:2 Friday 17 January 2025 04:11:45 -0500 (0:00:00.194) 0:01:38.113 ******** ok: [managed-node1] => { "changed": false, "rc": 0, "results": [ "python-enum34-1.0.4-1.el7.noarch providing python-enum34 is already installed", "1:python2-blivet3-3.1.3-3.el7.noarch providing python-blivet3 is already installed", "libblockdev-crypto-2.18-5.el7.x86_64 providing libblockdev-crypto is already installed", "libblockdev-dm-2.18-5.el7.x86_64 providing libblockdev-dm is already installed", "libblockdev-lvm-2.18-5.el7.x86_64 providing libblockdev-lvm is already installed", "libblockdev-mdraid-2.18-5.el7.x86_64 providing libblockdev-mdraid is already installed", "libblockdev-swap-2.18-5.el7.x86_64 providing libblockdev-swap is already installed", "libblockdev-2.18-5.el7.x86_64 providing libblockdev is already installed" ] } TASK [fedora.linux_system_roles.storage : Show storage_pools] ****************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:9 Friday 17 January 2025 04:11:49 -0500 (0:00:04.000) 0:01:42.113 ******** ok: [managed-node1] => { "storage_pools": "VARIABLE IS NOT DEFINED!: 'storage_pools' is undefined" } TASK [fedora.linux_system_roles.storage : Show storage_volumes] **************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:14 Friday 17 January 2025 04:11:49 -0500 (0:00:00.060) 0:01:42.174 ******** ok: [managed-node1] => { "storage_volumes": [ { "disks": [ "sda" ], "encryption": false, "encryption_luks_version": "luks2", "encryption_password": "yabbadabbadoo", "mount_point": "/opt/test1", "name": "foo", "type": "disk" } ] } TASK [fedora.linux_system_roles.storage : Get required packages] *************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:19 Friday 17 January 2025 04:11:49 -0500 (0:00:00.103) 0:01:42.278 ******** ok: [managed-node1] => { "actions": [], "changed": false, "crypts": [], "leaves": [], "mounts": [], "packages": [], "pools": [], "volumes": [] } TASK [fedora.linux_system_roles.storage : Enable copr repositories if needed] *** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:31 Friday 17 January 2025 04:11:53 -0500 (0:00:04.075) 0:01:46.353 ******** included: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml for managed-node1 TASK [fedora.linux_system_roles.storage : Check if the COPR support packages should be installed] *** task path: 
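The storage_volumes debug above corresponds to this variable as the test passes it in for the "Remove the encryption layer" step: encryption flips to false while the LUKS version and password stay set so the role can still open the existing container. Because this run later succeeds in destroying the LUKS layer, the test presumably also relaxes safe mode here; that setting is an assumption, not visible in this excerpt:

    storage_volumes:
      - name: foo
        type: disk
        disks:
          - sda
        mount_point: /opt/test1
        encryption: false
        encryption_luks_version: luks2
        encryption_password: yabbadabbadoo
    storage_safe_mode: false    # assumed; needed for the destructive change to proceed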
/tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:2 Friday 17 January 2025 04:11:53 -0500 (0:00:00.080) 0:01:46.434 ******** TASK [fedora.linux_system_roles.storage : Make sure COPR support packages are present] *** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:13 Friday 17 January 2025 04:11:53 -0500 (0:00:00.034) 0:01:46.469 ******** skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Enable COPRs] ************************ task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:19 Friday 17 January 2025 04:11:53 -0500 (0:00:00.043) 0:01:46.513 ******** TASK [fedora.linux_system_roles.storage : Make sure required packages are installed] *** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:37 Friday 17 January 2025 04:11:53 -0500 (0:00:00.049) 0:01:46.563 ******** ok: [managed-node1] => { "changed": false, "rc": 0, "results": [ "kpartx-0.4.9-136.el7_9.x86_64 providing kpartx is already installed" ] } TASK [fedora.linux_system_roles.storage : Get service facts] ******************* task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:51 Friday 17 January 2025 04:11:54 -0500 (0:00:00.847) 0:01:47.410 ******** ok: [managed-node1] => { "ansible_facts": { "services": { "NetworkManager-dispatcher.service": { "name": "NetworkManager-dispatcher.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "NetworkManager-wait-online.service": { "name": "NetworkManager-wait-online.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "NetworkManager.service": { "name": "NetworkManager.service", "source": "systemd", "state": "running", "status": "enabled" }, "arp-ethers.service": { "name": "arp-ethers.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "auditd.service": { "name": "auditd.service", "source": "systemd", "state": "running", "status": "enabled" }, "auth-rpcgss-module.service": { "name": "auth-rpcgss-module.service", "source": "systemd", "state": "stopped", "status": "static" }, "autovt@.service": { "name": "autovt@.service", "source": "systemd", "state": "unknown", "status": "enabled" }, "blivet.service": { "name": "blivet.service", "source": "systemd", "state": "inactive", "status": "static" }, "blk-availability.service": { "name": "blk-availability.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "brandbot.service": { "name": "brandbot.service", "source": "systemd", "state": "inactive", "status": "static" }, "chrony-dnssrv@.service": { "name": "chrony-dnssrv@.service", "source": "systemd", "state": "unknown", "status": "static" }, "chrony-wait.service": { "name": "chrony-wait.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "chronyd.service": { "name": "chronyd.service", "source": "systemd", "state": "running", "status": "enabled" }, "cloud-config.service": { "name": "cloud-config.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-final.service": { "name": "cloud-final.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-init-local.service": { "name": "cloud-init-local.service", "source": "systemd", "state": "stopped", "status": 
"enabled" }, "cloud-init.service": { "name": "cloud-init.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "console-getty.service": { "name": "console-getty.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "console-shell.service": { "name": "console-shell.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "container-getty@.service": { "name": "container-getty@.service", "source": "systemd", "state": "unknown", "status": "static" }, "cpupower.service": { "name": "cpupower.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "crond.service": { "name": "crond.service", "source": "systemd", "state": "running", "status": "enabled" }, "dbus-org.freedesktop.hostname1.service": { "name": "dbus-org.freedesktop.hostname1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus-org.freedesktop.import1.service": { "name": "dbus-org.freedesktop.import1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus-org.freedesktop.locale1.service": { "name": "dbus-org.freedesktop.locale1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus-org.freedesktop.login1.service": { "name": "dbus-org.freedesktop.login1.service", "source": "systemd", "state": "active", "status": "static" }, "dbus-org.freedesktop.machine1.service": { "name": "dbus-org.freedesktop.machine1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus-org.freedesktop.nm-dispatcher.service": { "name": "dbus-org.freedesktop.nm-dispatcher.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "dbus-org.freedesktop.timedate1.service": { "name": "dbus-org.freedesktop.timedate1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus.service": { "name": "dbus.service", "source": "systemd", "state": "running", "status": "static" }, "debug-shell.service": { "name": "debug-shell.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "dm-event.service": { "name": "dm-event.service", "source": "systemd", "state": "stopped", "status": "static" }, "dmraid-activation.service": { "name": "dmraid-activation.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "dracut-cmdline.service": { "name": "dracut-cmdline.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-initqueue.service": { "name": "dracut-initqueue.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-mount.service": { "name": "dracut-mount.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-mount.service": { "name": "dracut-pre-mount.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-pivot.service": { "name": "dracut-pre-pivot.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-trigger.service": { "name": "dracut-pre-trigger.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-udev.service": { "name": "dracut-pre-udev.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-shutdown.service": { "name": "dracut-shutdown.service", "source": "systemd", "state": "stopped", "status": "static" }, "ebtables.service": { "name": "ebtables.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "emergency.service": { "name": "emergency.service", "source": "systemd", "state": "stopped", "status": "static" }, 
"firewalld.service": { "name": "firewalld.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "fstrim.service": { "name": "fstrim.service", "source": "systemd", "state": "inactive", "status": "static" }, "getty@.service": { "name": "getty@.service", "source": "systemd", "state": "unknown", "status": "enabled" }, "getty@tty1.service": { "name": "getty@tty1.service", "source": "systemd", "state": "running", "status": "unknown" }, "gssproxy.service": { "name": "gssproxy.service", "source": "systemd", "state": "running", "status": "disabled" }, "halt-local.service": { "name": "halt-local.service", "source": "systemd", "state": "inactive", "status": "static" }, "initrd-cleanup.service": { "name": "initrd-cleanup.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-parse-etc.service": { "name": "initrd-parse-etc.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-switch-root.service": { "name": "initrd-switch-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-udevadm-cleanup-db.service": { "name": "initrd-udevadm-cleanup-db.service", "source": "systemd", "state": "stopped", "status": "static" }, "iprdump.service": { "name": "iprdump.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "iprinit.service": { "name": "iprinit.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "iprupdate.service": { "name": "iprupdate.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "irqbalance.service": { "name": "irqbalance.service", "source": "systemd", "state": "running", "status": "enabled" }, "kdump.service": { "name": "kdump.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "kmod-static-nodes.service": { "name": "kmod-static-nodes.service", "source": "systemd", "state": "stopped", "status": "static" }, "lvm2-lvmetad.service": { "name": "lvm2-lvmetad.service", "source": "systemd", "state": "running", "status": "static" }, "lvm2-lvmpolld.service": { "name": "lvm2-lvmpolld.service", "source": "systemd", "state": "stopped", "status": "static" }, "lvm2-monitor.service": { "name": "lvm2-monitor.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "lvm2-pvscan@.service": { "name": "lvm2-pvscan@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdadm-grow-continue@.service": { "name": "mdadm-grow-continue@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdadm-last-resort@.service": { "name": "mdadm-last-resort@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdcheck_continue.service": { "name": "mdcheck_continue.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdcheck_start.service": { "name": "mdcheck_start.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdmon@.service": { "name": "mdmon@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdmonitor-oneshot.service": { "name": "mdmonitor-oneshot.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdmonitor.service": { "name": "mdmonitor.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "messagebus.service": { "name": "messagebus.service", "source": "systemd", "state": "active", "status": "static" }, "microcode.service": { "name": "microcode.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "netconsole": { "name": 
"netconsole", "source": "sysv", "state": "stopped", "status": "disabled" }, "network": { "name": "network", "source": "sysv", "state": "running", "status": "enabled" }, "network.service": { "name": "network.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "nfs-blkmap.service": { "name": "nfs-blkmap.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfs-config.service": { "name": "nfs-config.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-idmap.service": { "name": "nfs-idmap.service", "source": "systemd", "state": "inactive", "status": "static" }, "nfs-idmapd.service": { "name": "nfs-idmapd.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-lock.service": { "name": "nfs-lock.service", "source": "systemd", "state": "inactive", "status": "static" }, "nfs-mountd.service": { "name": "nfs-mountd.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-rquotad.service": { "name": "nfs-rquotad.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfs-secure.service": { "name": "nfs-secure.service", "source": "systemd", "state": "inactive", "status": "static" }, "nfs-server.service": { "name": "nfs-server.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "nfs-utils.service": { "name": "nfs-utils.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs.service": { "name": "nfs.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfslock.service": { "name": "nfslock.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-halt.service": { "name": "plymouth-halt.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "plymouth-kexec.service": { "name": "plymouth-kexec.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "plymouth-poweroff.service": { "name": "plymouth-poweroff.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "plymouth-quit-wait.service": { "name": "plymouth-quit-wait.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "plymouth-quit.service": { "name": "plymouth-quit.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "plymouth-read-write.service": { "name": "plymouth-read-write.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "plymouth-reboot.service": { "name": "plymouth-reboot.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "plymouth-start.service": { "name": "plymouth-start.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "plymouth-switch-root.service": { "name": "plymouth-switch-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "polkit.service": { "name": "polkit.service", "source": "systemd", "state": "running", "status": "static" }, "postfix.service": { "name": "postfix.service", "source": "systemd", "state": "running", "status": "enabled" }, "qemu-guest-agent.service": { "name": "qemu-guest-agent.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "quotaon.service": { "name": "quotaon.service", "source": "systemd", "state": "inactive", "status": "static" }, "rc-local.service": { "name": "rc-local.service", "source": "systemd", "state": "stopped", "status": "static" }, "rdisc.service": { "name": "rdisc.service", "source": "systemd", "state": "inactive", "status": "disabled" }, 
"rescue.service": { "name": "rescue.service", "source": "systemd", "state": "stopped", "status": "static" }, "restraintd.service": { "name": "restraintd.service", "source": "systemd", "state": "running", "status": "enabled" }, "rhel-autorelabel-mark.service": { "name": "rhel-autorelabel-mark.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "rhel-autorelabel.service": { "name": "rhel-autorelabel.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "rhel-configure.service": { "name": "rhel-configure.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "rhel-dmesg.service": { "name": "rhel-dmesg.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "rhel-domainname.service": { "name": "rhel-domainname.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "rhel-import-state.service": { "name": "rhel-import-state.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "rhel-loadmodules.service": { "name": "rhel-loadmodules.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "rhel-readonly.service": { "name": "rhel-readonly.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "rngd.service": { "name": "rngd.service", "source": "systemd", "state": "running", "status": "enabled" }, "rpc-gssd.service": { "name": "rpc-gssd.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-rquotad.service": { "name": "rpc-rquotad.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rpc-statd-notify.service": { "name": "rpc-statd-notify.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-statd.service": { "name": "rpc-statd.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpcbind.service": { "name": "rpcbind.service", "source": "systemd", "state": "running", "status": "enabled" }, "rpcgssd.service": { "name": "rpcgssd.service", "source": "systemd", "state": "inactive", "status": "static" }, "rpcidmapd.service": { "name": "rpcidmapd.service", "source": "systemd", "state": "inactive", "status": "static" }, "rsyncd.service": { "name": "rsyncd.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rsyncd@.service": { "name": "rsyncd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "rsyslog.service": { "name": "rsyslog.service", "source": "systemd", "state": "running", "status": "enabled" }, "selinux-policy-migrate-local-changes@.service": { "name": "selinux-policy-migrate-local-changes@.service", "source": "systemd", "state": "unknown", "status": "static" }, "selinux-policy-migrate-local-changes@targeted.service": { "name": "selinux-policy-migrate-local-changes@targeted.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "serial-getty@.service": { "name": "serial-getty@.service", "source": "systemd", "state": "unknown", "status": "disabled" }, "serial-getty@ttyS0.service": { "name": "serial-getty@ttyS0.service", "source": "systemd", "state": "running", "status": "unknown" }, "sshd-keygen.service": { "name": "sshd-keygen.service", "source": "systemd", "state": "stopped", "status": "static" }, "sshd.service": { "name": "sshd.service", "source": "systemd", "state": "running", "status": "enabled" }, "sshd@.service": { "name": "sshd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-ask-password-console.service": { "name": "systemd-ask-password-console.service", 
"source": "systemd", "state": "stopped", "status": "static" }, "systemd-ask-password-plymouth.service": { "name": "systemd-ask-password-plymouth.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-ask-password-wall.service": { "name": "systemd-ask-password-wall.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-backlight@.service": { "name": "systemd-backlight@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-binfmt.service": { "name": "systemd-binfmt.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-bootchart.service": { "name": "systemd-bootchart.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "systemd-firstboot.service": { "name": "systemd-firstboot.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-fsck-root.service": { "name": "systemd-fsck-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-fsck@.service": { "name": "systemd-fsck@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-halt.service": { "name": "systemd-halt.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hibernate-resume@.service": { "name": "systemd-hibernate-resume@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-hibernate.service": { "name": "systemd-hibernate.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hostnamed.service": { "name": "systemd-hostnamed.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hwdb-update.service": { "name": "systemd-hwdb-update.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-hybrid-sleep.service": { "name": "systemd-hybrid-sleep.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-importd.service": { "name": "systemd-importd.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-initctl.service": { "name": "systemd-initctl.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journal-catalog-update.service": { "name": "systemd-journal-catalog-update.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journal-flush.service": { "name": "systemd-journal-flush.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journald.service": { "name": "systemd-journald.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-kexec.service": { "name": "systemd-kexec.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-localed.service": { "name": "systemd-localed.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-logind.service": { "name": "systemd-logind.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-machine-id-commit.service": { "name": "systemd-machine-id-commit.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-machined.service": { "name": "systemd-machined.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-modules-load.service": { "name": "systemd-modules-load.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-nspawn@.service": { "name": "systemd-nspawn@.service", "source": "systemd", "state": "unknown", "status": "disabled" }, 
"systemd-poweroff.service": { "name": "systemd-poweroff.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-quotacheck.service": { "name": "systemd-quotacheck.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-random-seed.service": { "name": "systemd-random-seed.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-readahead-collect.service": { "name": "systemd-readahead-collect.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "systemd-readahead-done.service": { "name": "systemd-readahead-done.service", "source": "systemd", "state": "stopped", "status": "indirect" }, "systemd-readahead-drop.service": { "name": "systemd-readahead-drop.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "systemd-readahead-replay.service": { "name": "systemd-readahead-replay.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "systemd-reboot.service": { "name": "systemd-reboot.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-remount-fs.service": { "name": "systemd-remount-fs.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-rfkill@.service": { "name": "systemd-rfkill@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-shutdownd.service": { "name": "systemd-shutdownd.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-suspend.service": { "name": "systemd-suspend.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-sysctl.service": { "name": "systemd-sysctl.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-timedated.service": { "name": "systemd-timedated.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-tmpfiles-clean.service": { "name": "systemd-tmpfiles-clean.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles-setup-dev.service": { "name": "systemd-tmpfiles-setup-dev.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles-setup.service": { "name": "systemd-tmpfiles-setup.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udev-settle.service": { "name": "systemd-udev-settle.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udev-trigger.service": { "name": "systemd-udev-trigger.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udevd.service": { "name": "systemd-udevd.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-update-done.service": { "name": "systemd-update-done.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-update-utmp-runlevel.service": { "name": "systemd-update-utmp-runlevel.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-update-utmp.service": { "name": "systemd-update-utmp.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-user-sessions.service": { "name": "systemd-user-sessions.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-vconsole-setup.service": { "name": "systemd-vconsole-setup.service", "source": "systemd", "state": "stopped", "status": "static" }, "target.service": { "name": "target.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "targetclid.service": { "name": 
"targetclid.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "teamd@.service": { "name": "teamd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "tuned.service": { "name": "tuned.service", "source": "systemd", "state": "running", "status": "enabled" }, "wpa_supplicant.service": { "name": "wpa_supplicant.service", "source": "systemd", "state": "inactive", "status": "disabled" } } }, "changed": false } TASK [fedora.linux_system_roles.storage : Set storage_cryptsetup_services] ***** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:57 Friday 17 January 2025 04:11:55 -0500 (0:00:00.983) 0:01:48.393 ******** ok: [managed-node1] => { "ansible_facts": { "storage_cryptsetup_services": [] }, "changed": false } TASK [fedora.linux_system_roles.storage : Mask the systemd cryptsetup services] *** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:63 Friday 17 January 2025 04:11:55 -0500 (0:00:00.064) 0:01:48.458 ******** TASK [fedora.linux_system_roles.storage : Manage the pools and volumes to match the specified state] *** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:69 Friday 17 January 2025 04:11:55 -0500 (0:00:00.049) 0:01:48.508 ******** changed: [managed-node1] => { "actions": [ { "action": "destroy format", "device": "/dev/mapper/luks-688ee6a7-12e5-4393-bfd5-48ed7d374b8d", "fs_type": "xfs" }, { "action": "destroy device", "device": "/dev/mapper/luks-688ee6a7-12e5-4393-bfd5-48ed7d374b8d", "fs_type": null }, { "action": "destroy format", "device": "/dev/sda", "fs_type": "luks" }, { "action": "create format", "device": "/dev/sda", "fs_type": "xfs" } ], "changed": true, "crypts": [ { "backing_device": "/dev/sda", "name": "luks-688ee6a7-12e5-4393-bfd5-48ed7d374b8d", "password": "-", "state": "absent" } ], "leaves": [ "/dev/sda", "/dev/sdb", "/dev/sdc", "/dev/sdd", "/dev/sde", "/dev/sdf", "/dev/sdg", "/dev/sdh", "/dev/sdi", "/dev/xvda1" ], "mounts": [ { "fstype": "xfs", "path": "/opt/test1", "src": "/dev/mapper/luks-688ee6a7-12e5-4393-bfd5-48ed7d374b8d", "state": "absent" }, { "dump": 0, "fstype": "xfs", "group": null, "mode": null, "opts": "defaults", "owner": null, "passno": 0, "path": "/opt/test1", "src": "UUID=e4ad127e-be0f-45ac-a94b-c1f82748bbb5", "state": "mounted" } ], "packages": [ "xfsprogs", "e2fsprogs" ], "pools": [], "volumes": [ { "_device": "/dev/sda", "_kernel_device": "/dev/sda", "_mount_id": "UUID=e4ad127e-be0f-45ac-a94b-c1f82748bbb5", "_raw_device": "/dev/sda", "_raw_kernel_device": "/dev/sda", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": 0, "encryption_luks_version": "luks2", "encryption_password": "VALUE_SPECIFIED_IN_NO_LOG_PARAMETER", "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "mount_user": null, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": 10733223936, "state": "present", "thin": null, "thin_pool_name": 
null, "thin_pool_size": null, "type": "disk", "vdo_pool_size": null } ] } TASK [fedora.linux_system_roles.storage : Workaround for udev issue on some platforms] *** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:83 Friday 17 January 2025 04:12:00 -0500 (0:00:04.287) 0:01:52.795 ******** skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Check if /etc/fstab is present] ****** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:90 Friday 17 January 2025 04:12:00 -0500 (0:00:00.053) 0:01:52.849 ******** ok: [managed-node1] => { "changed": false, "stat": { "atime": 1737105076.3559053, "attr_flags": "e", "attributes": [ "extents" ], "block_size": 4096, "blocks": 8, "charset": "us-ascii", "checksum": "6e40d8f66d0f808aaad192abe735d48296af92f0", "ctime": 1737105076.3529053, "dev": 51713, "device_type": 0, "executable": false, "exists": true, "gid": 0, "gr_name": "root", "inode": 263644, "isblk": false, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": true, "issock": false, "isuid": false, "mimetype": "text/plain", "mode": "0644", "mtime": 1737105076.3529053, "nlink": 1, "path": "/etc/fstab", "pw_name": "root", "readable": true, "rgrp": true, "roth": true, "rusr": true, "size": 1311, "uid": 0, "version": "18446744072031193646", "wgrp": false, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [fedora.linux_system_roles.storage : Add fingerprint to /etc/fstab if present] *** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:95 Friday 17 January 2025 04:12:00 -0500 (0:00:00.412) 0:01:53.261 ******** ok: [managed-node1] => { "backup": "", "changed": false } TASK [fedora.linux_system_roles.storage : Unmask the systemd cryptsetup services] *** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:113 Friday 17 January 2025 04:12:00 -0500 (0:00:00.417) 0:01:53.679 ******** TASK [fedora.linux_system_roles.storage : Show blivet_output] ****************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:119 Friday 17 January 2025 04:12:01 -0500 (0:00:00.074) 0:01:53.753 ******** ok: [managed-node1] => { "blivet_output": { "actions": [ { "action": "destroy format", "device": "/dev/mapper/luks-688ee6a7-12e5-4393-bfd5-48ed7d374b8d", "fs_type": "xfs" }, { "action": "destroy device", "device": "/dev/mapper/luks-688ee6a7-12e5-4393-bfd5-48ed7d374b8d", "fs_type": null }, { "action": "destroy format", "device": "/dev/sda", "fs_type": "luks" }, { "action": "create format", "device": "/dev/sda", "fs_type": "xfs" } ], "changed": true, "crypts": [ { "backing_device": "/dev/sda", "name": "luks-688ee6a7-12e5-4393-bfd5-48ed7d374b8d", "password": "-", "state": "absent" } ], "failed": false, "leaves": [ "/dev/sda", "/dev/sdb", "/dev/sdc", "/dev/sdd", "/dev/sde", "/dev/sdf", "/dev/sdg", "/dev/sdh", "/dev/sdi", "/dev/xvda1" ], "mounts": [ { "fstype": "xfs", "path": "/opt/test1", "src": "/dev/mapper/luks-688ee6a7-12e5-4393-bfd5-48ed7d374b8d", "state": "absent" }, { "dump": 0, "fstype": "xfs", "group": null, "mode": null, "opts": "defaults", "owner": null, "passno": 0, "path": "/opt/test1", "src": "UUID=e4ad127e-be0f-45ac-a94b-c1f82748bbb5", 
"state": "mounted" } ], "packages": [ "xfsprogs", "e2fsprogs" ], "pools": [], "volumes": [ { "_device": "/dev/sda", "_kernel_device": "/dev/sda", "_mount_id": "UUID=e4ad127e-be0f-45ac-a94b-c1f82748bbb5", "_raw_device": "/dev/sda", "_raw_kernel_device": "/dev/sda", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": 0, "encryption_luks_version": "luks2", "encryption_password": "VALUE_SPECIFIED_IN_NO_LOG_PARAMETER", "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "mount_user": null, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": 10733223936, "state": "present", "thin": null, "thin_pool_name": null, "thin_pool_size": null, "type": "disk", "vdo_pool_size": null } ] } } TASK [fedora.linux_system_roles.storage : Set the list of pools for test verification] *** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:128 Friday 17 January 2025 04:12:01 -0500 (0:00:00.079) 0:01:53.832 ******** ok: [managed-node1] => { "ansible_facts": { "_storage_pools_list": [] }, "changed": false } TASK [fedora.linux_system_roles.storage : Set the list of volumes for test verification] *** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:132 Friday 17 January 2025 04:12:01 -0500 (0:00:00.059) 0:01:53.892 ******** ok: [managed-node1] => { "ansible_facts": { "_storage_volumes_list": [ { "_device": "/dev/sda", "_kernel_device": "/dev/sda", "_mount_id": "UUID=e4ad127e-be0f-45ac-a94b-c1f82748bbb5", "_raw_device": "/dev/sda", "_raw_kernel_device": "/dev/sda", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": 0, "encryption_luks_version": "luks2", "encryption_password": "VALUE_SPECIFIED_IN_NO_LOG_PARAMETER", "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "mount_user": null, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": 10733223936, "state": "present", "thin": null, "thin_pool_name": null, "thin_pool_size": null, "type": "disk", "vdo_pool_size": null } ] }, "changed": false } TASK [fedora.linux_system_roles.storage : Remove obsolete mounts] ************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:148 Friday 17 January 2025 04:12:01 -0500 (0:00:00.076) 0:01:53.968 ******** changed: [managed-node1] => (item={u'src': u'/dev/mapper/luks-688ee6a7-12e5-4393-bfd5-48ed7d374b8d', u'state': u'absent', u'fstype': u'xfs', u'path': u'/opt/test1'}) => { "ansible_loop_var": "mount_info", "changed": true, 
"dump": "0", "fstab": "/etc/fstab", "fstype": "xfs", "mount_info": { "fstype": "xfs", "path": "/opt/test1", "src": "/dev/mapper/luks-688ee6a7-12e5-4393-bfd5-48ed7d374b8d", "state": "absent" }, "name": "/opt/test1", "opts": "defaults", "passno": "0", "src": "/dev/mapper/luks-688ee6a7-12e5-4393-bfd5-48ed7d374b8d" } TASK [fedora.linux_system_roles.storage : Tell systemd to refresh its view of /etc/fstab] *** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:159 Friday 17 January 2025 04:12:01 -0500 (0:00:00.450) 0:01:54.419 ******** ok: [managed-node1] => { "changed": false, "name": null, "status": {} } TASK [fedora.linux_system_roles.storage : Set up new/current mounts] *********** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:164 Friday 17 January 2025 04:12:02 -0500 (0:00:00.556) 0:01:54.976 ******** changed: [managed-node1] => (item={u'src': u'UUID=e4ad127e-be0f-45ac-a94b-c1f82748bbb5', u'group': None, u'dump': 0, u'passno': 0, u'fstype': u'xfs', u'state': u'mounted', u'mode': None, u'owner': None, u'path': u'/opt/test1', u'opts': u'defaults'}) => { "ansible_loop_var": "mount_info", "changed": true, "dump": "0", "fstab": "/etc/fstab", "fstype": "xfs", "mount_info": { "dump": 0, "fstype": "xfs", "group": null, "mode": null, "opts": "defaults", "owner": null, "passno": 0, "path": "/opt/test1", "src": "UUID=e4ad127e-be0f-45ac-a94b-c1f82748bbb5", "state": "mounted" }, "name": "/opt/test1", "opts": "defaults", "passno": "0", "src": "UUID=e4ad127e-be0f-45ac-a94b-c1f82748bbb5" } TASK [fedora.linux_system_roles.storage : Manage mount ownership/permissions] *** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:175 Friday 17 January 2025 04:12:02 -0500 (0:00:00.426) 0:01:55.402 ******** skipping: [managed-node1] => (item={u'src': u'UUID=e4ad127e-be0f-45ac-a94b-c1f82748bbb5', u'group': None, u'dump': 0, u'passno': 0, u'fstype': u'xfs', u'state': u'mounted', u'mode': None, u'owner': None, u'path': u'/opt/test1', u'opts': u'defaults'}) => { "ansible_loop_var": "mount_info", "changed": false, "mount_info": { "dump": 0, "fstype": "xfs", "group": null, "mode": null, "opts": "defaults", "owner": null, "passno": 0, "path": "/opt/test1", "src": "UUID=e4ad127e-be0f-45ac-a94b-c1f82748bbb5", "state": "mounted" }, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Tell systemd to refresh its view of /etc/fstab] *** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:187 Friday 17 January 2025 04:12:02 -0500 (0:00:00.064) 0:01:55.467 ******** ok: [managed-node1] => { "changed": false, "name": null, "status": {} } TASK [fedora.linux_system_roles.storage : Retrieve facts for the /etc/crypttab file] *** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:195 Friday 17 January 2025 04:12:03 -0500 (0:00:00.461) 0:01:55.929 ******** ok: [managed-node1] => { "changed": false, "stat": { "atime": 1737105081.4409022, "attr_flags": "e", "attributes": [ "extents" ], "block_size": 4096, "blocks": 8, "charset": "us-ascii", "checksum": "fe4e66a7e148c8fe3f03b88c54549e8c76b78400", "ctime": 1737105077.6129045, "dev": 51713, "device_type": 0, "executable": false, "exists": true, "gid": 0, "gr_name": "root", "inode": 263646, "isblk": false, "ischr": false, "isdir": false, 
"isfifo": false, "isgid": false, "islnk": false, "isreg": true, "issock": false, "isuid": false, "mimetype": "text/plain", "mode": "0600", "mtime": 1737105077.6129045, "nlink": 1, "path": "/etc/crypttab", "pw_name": "root", "readable": true, "rgrp": false, "roth": false, "rusr": true, "size": 53, "uid": 0, "version": "18446744072031193676", "wgrp": false, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [fedora.linux_system_roles.storage : Manage /etc/crypttab to account for changes we just made] *** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:200 Friday 17 January 2025 04:12:03 -0500 (0:00:00.378) 0:01:56.308 ******** changed: [managed-node1] => (item={u'state': u'absent', u'password': u'-', u'name': u'luks-688ee6a7-12e5-4393-bfd5-48ed7d374b8d', u'backing_device': u'/dev/sda'}) => { "ansible_loop_var": "entry", "backup": "", "changed": true, "entry": { "backing_device": "/dev/sda", "name": "luks-688ee6a7-12e5-4393-bfd5-48ed7d374b8d", "password": "-", "state": "absent" }, "found": 1 } MSG: 1 line(s) removed TASK [fedora.linux_system_roles.storage : Update facts] ************************ task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:222 Friday 17 January 2025 04:12:03 -0500 (0:00:00.343) 0:01:56.652 ******** ok: [managed-node1] TASK [Verify role results] ***************************************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/tests_luks2.yml:155 Friday 17 January 2025 04:12:04 -0500 (0:00:00.788) 0:01:57.440 ******** included: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml for managed-node1 TASK [Print out pool information] ********************************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:2 Friday 17 January 2025 04:12:04 -0500 (0:00:00.095) 0:01:57.536 ******** skipping: [managed-node1] => {} TASK [Print out volume information] ******************************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:7 Friday 17 January 2025 04:12:04 -0500 (0:00:00.034) 0:01:57.570 ******** ok: [managed-node1] => { "_storage_volumes_list": [ { "_device": "/dev/sda", "_kernel_device": "/dev/sda", "_mount_id": "UUID=e4ad127e-be0f-45ac-a94b-c1f82748bbb5", "_raw_device": "/dev/sda", "_raw_kernel_device": "/dev/sda", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": 0, "encryption_luks_version": "luks2", "encryption_password": "VALUE_SPECIFIED_IN_NO_LOG_PARAMETER", "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "mount_user": null, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": 10733223936, "state": "present", "thin": null, "thin_pool_name": null, "thin_pool_size": null, "type": "disk", 
"vdo_pool_size": null } ] } TASK [Collect info about the volumes.] ***************************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:15 Friday 17 January 2025 04:12:04 -0500 (0:00:00.047) 0:01:57.618 ******** ok: [managed-node1] => { "changed": false, "info": { "/dev/sda": { "fstype": "xfs", "label": "", "mountpoint": "/opt/test1", "name": "/dev/sda", "size": "10G", "type": "disk", "uuid": "e4ad127e-be0f-45ac-a94b-c1f82748bbb5" }, "/dev/sdb": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdb", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdc": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdc", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdd": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdd", "size": "1T", "type": "disk", "uuid": "" }, "/dev/sde": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sde", "size": "1T", "type": "disk", "uuid": "" }, "/dev/sdf": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdf", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdg": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdg", "size": "1T", "type": "disk", "uuid": "" }, "/dev/sdh": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdh", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdi": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdi", "size": "10G", "type": "disk", "uuid": "" }, "/dev/xvda": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/xvda", "size": "250G", "type": "disk", "uuid": "" }, "/dev/xvda1": { "fstype": "ext4", "label": "", "mountpoint": "/", "name": "/dev/xvda1", "size": "250G", "type": "partition", "uuid": "c7b7d6a5-fd01-4b9b-bcca-153eaff9d312" } } } TASK [Read the /etc/fstab file for volume existence] *************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:20 Friday 17 January 2025 04:12:05 -0500 (0:00:00.496) 0:01:58.114 ******** ok: [managed-node1] => { "changed": false, "cmd": [ "cat", "/etc/fstab" ], "delta": "0:00:00.002842", "end": "2025-01-17 04:12:05.696878", "rc": 0, "start": "2025-01-17 04:12:05.694036" } STDOUT: # system_role:storage # # /etc/fstab # Created by anaconda on Thu Jun 20 10:23:46 2024 # # Accessible filesystems, by reference, are maintained under '/dev/disk' # See man pages fstab(5), findfs(8), mount(8) and/or blkid(8) for more info # UUID=c7b7d6a5-fd01-4b9b-bcca-153eaff9d312 / ext4 defaults 1 1 ntap-bos-c01-eng01-nfs01b.storage.bos.redhat.com:/devops_engineering_nfs/devarchive/redhat /mnt/redhat nfs ro,rsize=32768,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0 nest.test.redhat.com:/mnt/qa /mnt/qa nfs defaults,rsize=8192,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0 vtap-eng01.storage.rdu2.redhat.com:/vol/engarchive /mnt/engarchive nfs ro,rsize=32768,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0 nest.test.redhat.com:/mnt/tpsdist /mnt/tpsdist nfs defaults,rsize=8192,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0 ntap-bos-c01-eng01-nfs01b.storage.bos.redhat.com:/devops_engineering_nfs/devarchive/redhat/brewroot /mnt/brew nfs ro,rsize=32768,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0 ntap-bos-c01-eng01-nfs01b.storage.bos.redhat.com:/devops_brew_scratch_nfs/scratch /mnt/brew_scratch nfs ro,rsize=32768,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0 UUID=e4ad127e-be0f-45ac-a94b-c1f82748bbb5 /opt/test1 xfs defaults 0 
0 TASK [Read the /etc/crypttab file] ********************************************* task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:25 Friday 17 January 2025 04:12:05 -0500 (0:00:00.364) 0:01:58.478 ******** ok: [managed-node1] => { "changed": false, "cmd": [ "cat", "/etc/crypttab" ], "delta": "0:00:00.002877", "end": "2025-01-17 04:12:06.040392", "failed_when_result": false, "rc": 0, "start": "2025-01-17 04:12:06.037515" } TASK [Verify the volumes listed in storage_pools were correctly managed] ******* task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:34 Friday 17 January 2025 04:12:06 -0500 (0:00:00.322) 0:01:58.801 ******** TASK [Verify the volumes with no pool were correctly managed] ****************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:44 Friday 17 January 2025 04:12:06 -0500 (0:00:00.040) 0:01:58.841 ******** included: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume.yml for managed-node1 TASK [Set storage volume test variables] *************************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume.yml:2 Friday 17 January 2025 04:12:06 -0500 (0:00:00.108) 0:01:58.950 ******** ok: [managed-node1] => { "ansible_facts": { "_storage_test_volume_present": true, "_storage_volume_tests": [ "mount", "fstab", "fs", "device", "encryption", "md", "size", "cache" ] }, "changed": false } TASK [Run test verify for {{ storage_test_volume_subset }}] ******************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume.yml:19 Friday 17 January 2025 04:12:06 -0500 (0:00:00.059) 0:01:59.009 ******** included: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml for managed-node1 included: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml for managed-node1 included: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fs.yml for managed-node1 included: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml for managed-node1 included: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml for managed-node1 included: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml for managed-node1 included: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml for managed-node1 included: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml for managed-node1 TASK [Get expected mount device based on device type] ************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:7 Friday 17 January 2025 04:12:06 -0500 (0:00:00.200) 0:01:59.210 ******** ok: [managed-node1] => { "ansible_facts": { "storage_test_device_path": "/dev/sda" }, "changed": false } TASK [Set some facts] ********************************************************** task path: 
/tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:11 Friday 17 January 2025 04:12:06 -0500 (0:00:00.042) 0:01:59.252 ******** ok: [managed-node1] => { "ansible_facts": { "storage_test_mount_expected_mount_point": "/opt/test1", "storage_test_swap_expected_matches": "0" }, "changed": false } TASK [Get information about the mountpoint directory] ************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:19 Friday 17 January 2025 04:12:06 -0500 (0:00:00.052) 0:01:59.305 ******** skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify the current mount state by device] ******************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:28 Friday 17 January 2025 04:12:06 -0500 (0:00:00.049) 0:01:59.354 ******** ok: [managed-node1] => { "changed": false } MSG: All assertions passed TASK [Verify mount directory user] ********************************************* task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:36 Friday 17 January 2025 04:12:06 -0500 (0:00:00.062) 0:01:59.417 ******** skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify mount directory group] ******************************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:42 Friday 17 January 2025 04:12:06 -0500 (0:00:00.053) 0:01:59.470 ******** skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify mount directory permissions] ************************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:48 Friday 17 January 2025 04:12:06 -0500 (0:00:00.064) 0:01:59.535 ******** skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Get path of test volume device] ****************************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:57 Friday 17 January 2025 04:12:06 -0500 (0:00:00.043) 0:01:59.579 ******** skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Gather swap info] ******************************************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:63 Friday 17 January 2025 04:12:06 -0500 (0:00:00.044) 0:01:59.623 ******** skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify swap status] ****************************************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:69 Friday 17 January 2025 04:12:06 -0500 (0:00:00.037) 0:01:59.661 ******** skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Unset facts] ************************************************************* task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:79 Friday 17 January 2025 
04:12:07 -0500 (0:00:00.037) 0:01:59.698 ******** ok: [managed-node1] => { "ansible_facts": { "storage_test_found_mount_stat": null, "storage_test_mount_expected_mount_point": null, "storage_test_swap_expected_matches": null, "storage_test_swaps": null, "storage_test_sys_node": null }, "changed": false } TASK [Set some variables for fstab checking] *********************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:2 Friday 17 January 2025 04:12:07 -0500 (0:00:00.041) 0:01:59.739 ******** ok: [managed-node1] => { "ansible_facts": { "storage_test_fstab_expected_id_matches": "1", "storage_test_fstab_expected_mount_options_matches": "1", "storage_test_fstab_expected_mount_point_matches": "1", "storage_test_fstab_id_matches": [ "UUID=e4ad127e-be0f-45ac-a94b-c1f82748bbb5 " ], "storage_test_fstab_mount_options_matches": [ " /opt/test1 xfs defaults " ], "storage_test_fstab_mount_point_matches": [ " /opt/test1 " ] }, "changed": false } TASK [Verify that the device identifier appears in /etc/fstab] ***************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:17 Friday 17 January 2025 04:12:07 -0500 (0:00:00.087) 0:01:59.826 ******** ok: [managed-node1] => { "changed": false } MSG: All assertions passed TASK [Verify the fstab mount point] ******************************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:24 Friday 17 January 2025 04:12:07 -0500 (0:00:00.064) 0:01:59.891 ******** ok: [managed-node1] => { "changed": false } MSG: All assertions passed TASK [Verify mount_options] **************************************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:33 Friday 17 January 2025 04:12:07 -0500 (0:00:00.069) 0:01:59.960 ******** skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify fingerprint] ****************************************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:45 Friday 17 January 2025 04:12:07 -0500 (0:00:00.056) 0:02:00.017 ******** ok: [managed-node1] => { "changed": false } MSG: All assertions passed TASK [Clean up variables] ****************************************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:52 Friday 17 January 2025 04:12:07 -0500 (0:00:00.064) 0:02:00.081 ******** ok: [managed-node1] => { "ansible_facts": { "storage_test_fstab_expected_id_matches": null, "storage_test_fstab_expected_mount_options_matches": null, "storage_test_fstab_expected_mount_point_matches": null, "storage_test_fstab_id_matches": null, "storage_test_fstab_mount_options_matches": null, "storage_test_fstab_mount_point_matches": null }, "changed": false } TASK [Verify fs type] ********************************************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fs.yml:6 Friday 17 January 2025 04:12:07 -0500 (0:00:00.048) 0:02:00.129 ******** ok: [managed-node1] => { "changed": false } MSG: All assertions passed TASK [Verify fs label] ********************************************************* task path: 
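The fstab-checking facts set above are simple regex matches against the fstab contents captured earlier, and each *_expected_* counter is then compared with the length of its match list. A hedged sketch using assumed register and variable names (storage_test_fstab, storage_test_volume); the real expressions live in test-verify-volume-fstab.yml and may differ:

    - name: Set some variables for fstab checking (sketch)
      set_fact:
        storage_test_fstab_id_matches: "{{ storage_test_fstab.stdout | regex_findall(storage_test_volume._mount_id + ' ') }}"

    - name: Verify that the device identifier appears in /etc/fstab (sketch)
      assert:
        that:
          - storage_test_fstab_id_matches | length == storage_test_fstab_expected_id_matches | int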
TASK [Verify fs type] **********************************************************
task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fs.yml:6
Friday 17 January 2025 04:12:07 -0500 (0:00:00.048) 0:02:00.129 ********
ok: [managed-node1] => { "changed": false }

MSG: All assertions passed

TASK [Verify fs label] *********************************************************
task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fs.yml:14
Friday 17 January 2025 04:12:07 -0500 (0:00:00.061) 0:02:00.191 ********
ok: [managed-node1] => { "changed": false }

MSG: All assertions passed

TASK [See whether the device node is present] **********************************
task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:3
Friday 17 January 2025 04:12:07 -0500 (0:00:00.061) 0:02:00.252 ********
ok: [managed-node1] => { "changed": false, "stat": { "atime": 1737105119.9728825, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 0, "charset": "binary", "ctime": 1737105119.9728825, "dev": 5, "device_type": 2048, "executable": false, "exists": true, "gid": 6, "gr_name": "disk", "inode": 28267, "isblk": true, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": false, "issock": false, "isuid": false, "mimetype": "inode/blockdevice", "mode": "0660", "mtime": 1737105119.9728825, "nlink": 1, "path": "/dev/sda", "pw_name": "root", "readable": true, "rgrp": true, "roth": false, "rusr": true, "size": 0, "uid": 0, "version": null, "wgrp": true, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } }

TASK [Verify the presence/absence of the device node] **************************
task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:9
Friday 17 January 2025 04:12:07 -0500 (0:00:00.349) 0:02:00.602 ********
ok: [managed-node1] => { "changed": false }

MSG: All assertions passed

TASK [Verify the presence/absence of the device node] **************************
task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:16
Friday 17 January 2025 04:12:07 -0500 (0:00:00.056) 0:02:00.659 ********
skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Make sure we got info about this volume] *********************************
task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:23
Friday 17 January 2025 04:12:07 -0500 (0:00:00.036) 0:02:00.695 ********
ok: [managed-node1] => { "changed": false }

MSG: All assertions passed

TASK [Process volume type (set initial value) (1/2)] ***************************
task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:29
Friday 17 January 2025 04:12:08 -0500 (0:00:00.045) 0:02:00.741 ********
ok: [managed-node1] => { "ansible_facts": { "st_volume_type": "disk" }, "changed": false }

TASK [Process volume type (get RAID value) (2/2)] ******************************
task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:33
Friday 17 January 2025 04:12:08 -0500 (0:00:00.040) 0:02:00.782 ********
skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Verify the volume's device type] *****************************************
task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:38
Friday 17 January 2025 04:12:08 -0500 (0:00:00.036) 0:02:00.819 ********
ok: [managed-node1] => { "changed": false }

MSG: All assertions passed
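The device checks follow the same stat-then-assert shape; a minimal sketch (assumed task bodies, not the actual test-verify-volume-device.yml):

    - name: See whether the device node is present
      stat:
        path: /dev/sda
      register: storage_test_dev

    - name: Verify the presence/absence of the device node
      assert:
        that:
          - storage_test_dev.stat.exists
          - storage_test_dev.stat.isblk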
TASK [Stat the LUKS device, if encrypted] **************************************
task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:3
Friday 17 January 2025 04:12:08 -0500 (0:00:00.043) 0:02:00.862 ********
skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Ensure cryptsetup is present] ********************************************
task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:10
Friday 17 January 2025 04:12:08 -0500 (0:00:00.035) 0:02:00.898 ********
ok: [managed-node1] => { "changed": false, "rc": 0, "results": [ "cryptsetup-2.0.3-6.el7.x86_64 providing cryptsetup is already installed" ] }

TASK [Collect LUKS info for this volume] ***************************************
task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:16
Friday 17 January 2025 04:12:08 -0500 (0:00:00.667) 0:02:01.565 ********
skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Verify the presence/absence of the LUKS device node] *********************
task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:22
Friday 17 January 2025 04:12:08 -0500 (0:00:00.058) 0:02:01.624 ********
skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Verify that the raw device is the same as the device if not encrypted] ***
task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:29
Friday 17 January 2025 04:12:08 -0500 (0:00:00.040) 0:02:01.664 ********
ok: [managed-node1] => { "changed": false }

MSG: All assertions passed

TASK [Make sure we got info about the LUKS volume if encrypted] ****************
task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:40
Friday 17 January 2025 04:12:09 -0500 (0:00:00.049) 0:02:01.714 ********
skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Verify the LUKS volume's device type if encrypted] ***********************
task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:46
Friday 17 January 2025 04:12:09 -0500 (0:00:00.035) 0:02:01.750 ********
skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Check LUKS version] ******************************************************
task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:51
Friday 17 January 2025 04:12:09 -0500 (0:00:00.038) 0:02:01.788 ********
skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Check LUKS key size] *****************************************************
task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:63
Friday 17 January 2025 04:12:09 -0500 (0:00:00.036) 0:02:01.824 ********
skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Check LUKS cipher] *******************************************************
task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:75
Friday 17 January 2025 04:12:09 -0500 (0:00:00.035) 0:02:01.860 ********
skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Set test variables] ******************************************************
task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:87
Friday 17 January 2025 04:12:09 -0500 (0:00:00.102) 0:02:01.963 ********
ok: [managed-node1] => { "ansible_facts": { "_storage_test_crypttab_entries": [], "_storage_test_expected_crypttab_entries": "0", "_storage_test_expected_crypttab_key_file": "-" }, "changed": false }

TASK [Check for /etc/crypttab entry] *******************************************
task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:93
Friday 17 January 2025 04:12:09 -0500 (0:00:00.058) 0:02:02.022 ********
ok: [managed-node1] => { "changed": false }

MSG: All assertions passed

TASK [Validate the format of the crypttab entry] *******************************
task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:100
Friday 17 January 2025 04:12:09 -0500 (0:00:00.063) 0:02:02.085 ********
skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Check backing device of crypttab entry] **********************************
task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:108
Friday 17 January 2025 04:12:09 -0500 (0:00:00.064) 0:02:02.150 ********
skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Check key file of crypttab entry] ****************************************
task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:116
Friday 17 January 2025 04:12:09 -0500 (0:00:00.078) 0:02:02.228 ********
skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Clear test variables] ****************************************************
task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:124
Friday 17 January 2025 04:12:09 -0500 (0:00:00.054) 0:02:02.283 ********
ok: [managed-node1] => { "ansible_facts": { "_storage_test_crypttab_entries": null, "_storage_test_expected_crypttab_entries": null, "_storage_test_expected_crypttab_key_file": null }, "changed": false }
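The crypttab pass mirrors the fstab one; the volume is not encrypted at this point, so the expected entry count is zero. A rough sketch under the same caveats (the 'luks-' filter is an assumption for illustration, not the role's exact expression):

    # Sketch of the crypttab check; /etc/crypttab may be absent, hence failed_when.
    - name: Read /etc/crypttab
      command: cat /etc/crypttab
      register: storage_test_crypttab
      changed_when: false
      failed_when: false

    - name: Check for /etc/crypttab entry
      assert:
        that:
          - storage_test_crypttab.stdout_lines | select('search', 'luks-') | list | length
            == _storage_test_expected_crypttab_entries | int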
TASK [Get information about RAID] **********************************************
task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:8
Friday 17 January 2025 04:12:09 -0500 (0:00:00.064) 0:02:02.348 ********
skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Set active devices regex] ************************************************
task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:14
Friday 17 January 2025 04:12:09 -0500 (0:00:00.056) 0:02:02.405 ********
skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Set spare devices regex] *************************************************
task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:19
Friday 17 January 2025 04:12:09 -0500 (0:00:00.058) 0:02:02.463 ********
skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Set md version regex] ****************************************************
task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:24
Friday 17 January 2025 04:12:09 -0500 (0:00:00.054) 0:02:02.518 ********
skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Set chunk size regex] ****************************************************
task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:29
Friday 17 January 2025 04:12:09 -0500 (0:00:00.106) 0:02:02.625 ********
skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Parse the chunk size] ****************************************************
task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:37
Friday 17 January 2025 04:12:10 -0500 (0:00:00.085) 0:02:02.710 ********
skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Check RAID active devices count] *****************************************
task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:46
Friday 17 January 2025 04:12:10 -0500 (0:00:00.068) 0:02:02.779 ********
skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Check RAID spare devices count] ******************************************
task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:54
Friday 17 January 2025 04:12:10 -0500 (0:00:00.052) 0:02:02.832 ********
skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Check RAID metadata version] *********************************************
task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:62
Friday 17 January 2025 04:12:10 -0500 (0:00:00.052) 0:02:02.884 ********
skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Check RAID chunk size] ***************************************************
task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:70
Friday 17 January 2025 04:12:10 -0500 (0:00:00.038) 0:02:02.923 ********
skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Parse the actual size of the volume] *************************************
task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:3
Friday 17 January 2025 04:12:10 -0500 (0:00:00.037) 0:02:02.960 ********
skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Parse the requested size of the volume] **********************************
task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:11
Friday 17 January 2025 04:12:10
-0500 (0:00:00.039) 0:02:02.999 ******** skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Establish base value for expected size] ********************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:20 Friday 17 January 2025 04:12:10 -0500 (0:00:00.045) 0:02:03.045 ******** skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Show expected size] ****************************************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:28 Friday 17 January 2025 04:12:10 -0500 (0:00:00.050) 0:02:03.095 ******** ok: [managed-node1] => { "storage_test_expected_size": "VARIABLE IS NOT DEFINED!: 'storage_test_expected_size' is undefined" } TASK [Get the size of parent/pool device] ************************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:32 Friday 17 January 2025 04:12:10 -0500 (0:00:00.073) 0:02:03.169 ******** skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Show test pool] ********************************************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:46 Friday 17 January 2025 04:12:10 -0500 (0:00:00.067) 0:02:03.237 ******** skipping: [managed-node1] => {} TASK [Show test blockinfo] ***************************************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:50 Friday 17 January 2025 04:12:10 -0500 (0:00:00.101) 0:02:03.339 ******** skipping: [managed-node1] => {} TASK [Show test pool size] ***************************************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:54 Friday 17 January 2025 04:12:10 -0500 (0:00:00.070) 0:02:03.409 ******** skipping: [managed-node1] => {} TASK [Calculate the expected size based on pool size and percentage value] ***** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:58 Friday 17 January 2025 04:12:10 -0500 (0:00:00.082) 0:02:03.491 ******** skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Default thin pool reserved space values] ********************************* task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:67 Friday 17 January 2025 04:12:10 -0500 (0:00:00.059) 0:02:03.551 ******** skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Default minimal thin pool reserved space size] *************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:71 Friday 17 January 2025 04:12:10 -0500 (0:00:00.055) 0:02:03.606 ******** skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Default maximal thin pool reserved space size] *************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:76 Friday 17 
January 2025 04:12:10 -0500 (0:00:00.056) 0:02:03.663 ******** skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Calculate maximum usable space in thin pool] ***************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:82 Friday 17 January 2025 04:12:11 -0500 (0:00:00.046) 0:02:03.709 ******** skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Apply upper size limit to max usable thin pool space] ******************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:86 Friday 17 January 2025 04:12:11 -0500 (0:00:00.060) 0:02:03.769 ******** skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Apply lower size limit to max usable thin pool space] ******************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:91 Friday 17 January 2025 04:12:11 -0500 (0:00:00.049) 0:02:03.819 ******** skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Convert maximum usable thin pool space from int to Size] ***************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:96 Friday 17 January 2025 04:12:11 -0500 (0:00:00.042) 0:02:03.862 ******** skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Show max thin pool size] ************************************************* task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:101 Friday 17 January 2025 04:12:11 -0500 (0:00:00.043) 0:02:03.905 ******** skipping: [managed-node1] => {} TASK [Show volume thin pool size] ********************************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:105 Friday 17 January 2025 04:12:11 -0500 (0:00:00.046) 0:02:03.952 ******** skipping: [managed-node1] => {} TASK [Show test volume size] *************************************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:109 Friday 17 January 2025 04:12:11 -0500 (0:00:00.057) 0:02:04.009 ******** skipping: [managed-node1] => {} TASK [Establish base value for expected thin pool size] ************************ task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:113 Friday 17 January 2025 04:12:11 -0500 (0:00:00.054) 0:02:04.063 ******** skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Calculate the expected size based on pool size and percentage value] ***** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:120 Friday 17 January 2025 04:12:11 -0500 (0:00:00.058) 0:02:04.122 ******** skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Establish base value for expected thin pool volume size] ***************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:127 Friday 17 
January 2025 04:12:11 -0500 (0:00:00.062) 0:02:04.184 ******** skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Calculate the expected thin pool volume size based on percentage value] *** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:131 Friday 17 January 2025 04:12:11 -0500 (0:00:00.055) 0:02:04.239 ******** skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Replace expected volume size with calculated value] ********************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:137 Friday 17 January 2025 04:12:11 -0500 (0:00:00.106) 0:02:04.346 ******** skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Show actual size] ******************************************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:143 Friday 17 January 2025 04:12:11 -0500 (0:00:00.055) 0:02:04.401 ******** ok: [managed-node1] => { "storage_test_actual_size": { "changed": false, "skip_reason": "Conditional result was False", "skipped": true } } TASK [Show expected size] ****************************************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:147 Friday 17 January 2025 04:12:11 -0500 (0:00:00.063) 0:02:04.465 ******** ok: [managed-node1] => { "storage_test_expected_size": "VARIABLE IS NOT DEFINED!: 'storage_test_expected_size' is undefined" } TASK [Assert expected size is actual size] ************************************* task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:151 Friday 17 January 2025 04:12:11 -0500 (0:00:00.079) 0:02:04.544 ******** skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Get information about the LV] ******************************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:5 Friday 17 January 2025 04:12:11 -0500 (0:00:00.070) 0:02:04.615 ******** skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set LV segment type] ***************************************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:13 Friday 17 January 2025 04:12:11 -0500 (0:00:00.077) 0:02:04.692 ******** skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check segment type] ****************************************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:17 Friday 17 January 2025 04:12:12 -0500 (0:00:00.052) 0:02:04.745 ******** skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set LV cache size] ******************************************************* task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:24 Friday 17 January 2025 04:12:12 -0500 (0:00:00.051) 0:02:04.796 ******** skipping: [managed-node1] => { "changed": 
false, "skip_reason": "Conditional result was False" }

TASK [Parse the requested cache size] ******************************************
task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:31
Friday 17 January 2025 04:12:12 -0500 (0:00:00.050) 0:02:04.846 ********
skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Set expected cache size] *************************************************
task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:37
Friday 17 January 2025 04:12:12 -0500 (0:00:00.037) 0:02:04.884 ********
skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Check cache size] ********************************************************
task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:42
Friday 17 January 2025 04:12:12 -0500 (0:00:00.043) 0:02:04.928 ********
skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Clean up facts] **********************************************************
task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume.yml:25
Friday 17 January 2025 04:12:12 -0500 (0:00:00.055) 0:02:04.983 ********
ok: [managed-node1] => { "ansible_facts": { "_storage_test_volume_present": null }, "changed": false }

TASK [Clean up variable namespace] *********************************************
task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:54
Friday 17 January 2025 04:12:12 -0500 (0:00:00.043) 0:02:05.027 ********
ok: [managed-node1] => { "ansible_facts": { "storage_test_blkinfo": null, "storage_test_crypttab": null, "storage_test_fstab": null }, "changed": false }

TASK [Create a file] ***********************************************************
task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/create-test-file.yml:12
Friday 17 January 2025 04:12:12 -0500 (0:00:00.046) 0:02:05.074 ********
changed: [managed-node1] => { "changed": true, "dest": "/opt/test1/quux", "gid": 0, "group": "root", "mode": "0644", "owner": "root", "secontext": "unconfined_u:object_r:unlabeled_t:s0", "size": 0, "state": "file", "uid": 0 }

TASK [Test for correct handling of safe_mode] **********************************
task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/tests_luks2.yml:161
Friday 17 January 2025 04:12:12 -0500 (0:00:00.347) 0:02:05.421 ********
included: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-failed.yml for managed-node1

TASK [Store global variable value copy] ****************************************
task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-failed.yml:4
Friday 17 January 2025 04:12:12 -0500 (0:00:00.103) 0:02:05.524 ********
ok: [managed-node1] => { "ansible_facts": { "storage_pools_global": [], "storage_safe_mode_global": true, "storage_volumes_global": [] }, "changed": false }
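With a file now in place under /opt/test1, the test re-invokes the role in its default safe mode and asks it to put LUKS2 encryption on the same disk; that would destroy the existing xfs filesystem and its data, so the run below is expected to fail. A minimal sketch of the invocation (the volume values match the storage_volumes output shown further down; the task wording is illustrative, not the test's exact source):

    - name: Ask the role to encrypt a disk that carries data, under safe mode
      include_role:
        name: fedora.linux_system_roles.storage
      vars:
        # safe mode is the default; destructive reformatting requires opting out
        storage_safe_mode: true
        storage_volumes:
          - name: foo
            type: disk
            disks: ["sda"]
            mount_point: /opt/test1
            encryption: true
            encryption_luks_version: luks2
            encryption_password: yabbadabbadoo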
TASK [Verify role raises correct error] ****************************************
task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-failed.yml:10
Friday 17 January 2025 04:12:12 -0500 (0:00:00.055) 0:02:05.580 ********

TASK [fedora.linux_system_roles.storage : Set platform/version specific variables] ***
task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:2
Friday 17 January 2025 04:12:12 -0500 (0:00:00.087) 0:02:05.668 ********
included: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml for managed-node1

TASK [fedora.linux_system_roles.storage : Ensure ansible_facts used by role] ***
task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:2
Friday 17 January 2025 04:12:13 -0500 (0:00:00.136) 0:02:05.805 ********
skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [fedora.linux_system_roles.storage : Set platform/version specific variables] ***
task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:7
Friday 17 January 2025 04:12:13 -0500 (0:00:00.127) 0:02:05.933 ********
skipping: [managed-node1] => (item=RedHat.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat.yml", "skip_reason": "Conditional result was False" }
skipping: [managed-node1] => (item=CentOS.yml) => { "ansible_loop_var": "item", "changed": false, "item": "CentOS.yml", "skip_reason": "Conditional result was False" }
ok: [managed-node1] => (item=CentOS_7.yml) => { "ansible_facts": { "__storage_blivet_diskvolume_mkfs_option_map": { "ext2": "-F", "ext3": "-F", "ext4": "-F" }, "blivet_package_list": [ "python-enum34", "python-blivet3", "libblockdev-crypto", "libblockdev-dm", "libblockdev-lvm", "libblockdev-mdraid", "libblockdev-swap", "{{ 'libblockdev-s390' if ansible_architecture == 's390x' else 'libblockdev' }}" ] }, "ansible_included_var_files": [ "/tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/vars/CentOS_7.yml" ], "ansible_loop_var": "item", "changed": false, "item": "CentOS_7.yml" }
skipping: [managed-node1] => (item=CentOS_7.9.yml) => { "ansible_loop_var": "item", "changed": false, "item": "CentOS_7.9.yml", "skip_reason": "Conditional result was False" }

TASK [fedora.linux_system_roles.storage : Check if system is ostree] ***********
task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:25
Friday 17 January 2025 04:12:13 -0500 (0:00:00.284) 0:02:06.218 ********
skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [fedora.linux_system_roles.storage : Set flag to indicate system is ostree] ***
task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:30
Friday 17 January 2025 04:12:13 -0500 (0:00:00.045) 0:02:06.264 ********
skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [fedora.linux_system_roles.storage : Define an empty list of pools to be used in testing] ***
task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:5
Friday 17 January 2025 04:12:13 -0500 (0:00:00.037) 0:02:06.301 ********
ok: [managed-node1] => { "ansible_facts": { "_storage_pools_list": [] }, "changed": false }

TASK [fedora.linux_system_roles.storage : Define an empty list of volumes to be used in testing] ***
task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:9
Friday 17 January 2025 04:12:13 -0500 (0:00:00.039)
0:02:06.341 ******** ok: [managed-node1] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false } TASK [fedora.linux_system_roles.storage : Include the appropriate provider tasks] *** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:13 Friday 17 January 2025 04:12:13 -0500 (0:00:00.037) 0:02:06.378 ******** included: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml for managed-node1 TASK [fedora.linux_system_roles.storage : Make sure blivet is available] ******* task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:2 Friday 17 January 2025 04:12:13 -0500 (0:00:00.127) 0:02:06.506 ******** ok: [managed-node1] => { "changed": false, "rc": 0, "results": [ "python-enum34-1.0.4-1.el7.noarch providing python-enum34 is already installed", "1:python2-blivet3-3.1.3-3.el7.noarch providing python-blivet3 is already installed", "libblockdev-crypto-2.18-5.el7.x86_64 providing libblockdev-crypto is already installed", "libblockdev-dm-2.18-5.el7.x86_64 providing libblockdev-dm is already installed", "libblockdev-lvm-2.18-5.el7.x86_64 providing libblockdev-lvm is already installed", "libblockdev-mdraid-2.18-5.el7.x86_64 providing libblockdev-mdraid is already installed", "libblockdev-swap-2.18-5.el7.x86_64 providing libblockdev-swap is already installed", "libblockdev-2.18-5.el7.x86_64 providing libblockdev is already installed" ] } TASK [fedora.linux_system_roles.storage : Show storage_pools] ****************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:9 Friday 17 January 2025 04:12:15 -0500 (0:00:01.318) 0:02:07.824 ******** ok: [managed-node1] => { "storage_pools": [] } TASK [fedora.linux_system_roles.storage : Show storage_volumes] **************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:14 Friday 17 January 2025 04:12:15 -0500 (0:00:00.058) 0:02:07.883 ******** ok: [managed-node1] => { "storage_volumes": [ { "disks": [ "sda" ], "encryption": true, "encryption_luks_version": "luks2", "encryption_password": "yabbadabbadoo", "mount_point": "/opt/test1", "name": "foo", "type": "disk" } ] } TASK [fedora.linux_system_roles.storage : Get required packages] *************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:19 Friday 17 January 2025 04:12:15 -0500 (0:00:00.054) 0:02:07.937 ******** ok: [managed-node1] => { "actions": [], "changed": false, "crypts": [], "leaves": [], "mounts": [], "packages": [ "cryptsetup" ], "pools": [], "volumes": [] } TASK [fedora.linux_system_roles.storage : Enable copr repositories if needed] *** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:31 Friday 17 January 2025 04:12:19 -0500 (0:00:03.982) 0:02:11.920 ******** included: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml for managed-node1 TASK [fedora.linux_system_roles.storage : Check if the COPR support packages should be installed] *** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:2 Friday 17 January 2025 04:12:19 -0500 (0:00:00.131) 0:02:12.051 ******** TASK [fedora.linux_system_roles.storage : Make sure COPR support packages are 
present] *** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:13 Friday 17 January 2025 04:12:19 -0500 (0:00:00.059) 0:02:12.111 ******** skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Enable COPRs] ************************ task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:19 Friday 17 January 2025 04:12:19 -0500 (0:00:00.066) 0:02:12.178 ******** TASK [fedora.linux_system_roles.storage : Make sure required packages are installed] *** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:37 Friday 17 January 2025 04:12:19 -0500 (0:00:00.083) 0:02:12.262 ******** ok: [managed-node1] => { "changed": false, "rc": 0, "results": [ "cryptsetup-2.0.3-6.el7.x86_64 providing cryptsetup is already installed", "kpartx-0.4.9-136.el7_9.x86_64 providing kpartx is already installed" ] } TASK [fedora.linux_system_roles.storage : Get service facts] ******************* task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:51 Friday 17 January 2025 04:12:20 -0500 (0:00:00.783) 0:02:13.045 ******** ok: [managed-node1] => { "ansible_facts": { "services": { "NetworkManager-dispatcher.service": { "name": "NetworkManager-dispatcher.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "NetworkManager-wait-online.service": { "name": "NetworkManager-wait-online.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "NetworkManager.service": { "name": "NetworkManager.service", "source": "systemd", "state": "running", "status": "enabled" }, "arp-ethers.service": { "name": "arp-ethers.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "auditd.service": { "name": "auditd.service", "source": "systemd", "state": "running", "status": "enabled" }, "auth-rpcgss-module.service": { "name": "auth-rpcgss-module.service", "source": "systemd", "state": "stopped", "status": "static" }, "autovt@.service": { "name": "autovt@.service", "source": "systemd", "state": "unknown", "status": "enabled" }, "blivet.service": { "name": "blivet.service", "source": "systemd", "state": "inactive", "status": "static" }, "blk-availability.service": { "name": "blk-availability.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "brandbot.service": { "name": "brandbot.service", "source": "systemd", "state": "inactive", "status": "static" }, "chrony-dnssrv@.service": { "name": "chrony-dnssrv@.service", "source": "systemd", "state": "unknown", "status": "static" }, "chrony-wait.service": { "name": "chrony-wait.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "chronyd.service": { "name": "chronyd.service", "source": "systemd", "state": "running", "status": "enabled" }, "cloud-config.service": { "name": "cloud-config.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-final.service": { "name": "cloud-final.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-init-local.service": { "name": "cloud-init-local.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-init.service": { "name": "cloud-init.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "console-getty.service": { "name": 
"console-getty.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "console-shell.service": { "name": "console-shell.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "container-getty@.service": { "name": "container-getty@.service", "source": "systemd", "state": "unknown", "status": "static" }, "cpupower.service": { "name": "cpupower.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "crond.service": { "name": "crond.service", "source": "systemd", "state": "running", "status": "enabled" }, "dbus-org.freedesktop.hostname1.service": { "name": "dbus-org.freedesktop.hostname1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus-org.freedesktop.import1.service": { "name": "dbus-org.freedesktop.import1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus-org.freedesktop.locale1.service": { "name": "dbus-org.freedesktop.locale1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus-org.freedesktop.login1.service": { "name": "dbus-org.freedesktop.login1.service", "source": "systemd", "state": "active", "status": "static" }, "dbus-org.freedesktop.machine1.service": { "name": "dbus-org.freedesktop.machine1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus-org.freedesktop.nm-dispatcher.service": { "name": "dbus-org.freedesktop.nm-dispatcher.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "dbus-org.freedesktop.timedate1.service": { "name": "dbus-org.freedesktop.timedate1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus.service": { "name": "dbus.service", "source": "systemd", "state": "running", "status": "static" }, "debug-shell.service": { "name": "debug-shell.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "dm-event.service": { "name": "dm-event.service", "source": "systemd", "state": "stopped", "status": "static" }, "dmraid-activation.service": { "name": "dmraid-activation.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "dracut-cmdline.service": { "name": "dracut-cmdline.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-initqueue.service": { "name": "dracut-initqueue.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-mount.service": { "name": "dracut-mount.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-mount.service": { "name": "dracut-pre-mount.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-pivot.service": { "name": "dracut-pre-pivot.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-trigger.service": { "name": "dracut-pre-trigger.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-udev.service": { "name": "dracut-pre-udev.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-shutdown.service": { "name": "dracut-shutdown.service", "source": "systemd", "state": "stopped", "status": "static" }, "ebtables.service": { "name": "ebtables.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "emergency.service": { "name": "emergency.service", "source": "systemd", "state": "stopped", "status": "static" }, "firewalld.service": { "name": "firewalld.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "fstrim.service": { "name": "fstrim.service", 
"source": "systemd", "state": "inactive", "status": "static" }, "getty@.service": { "name": "getty@.service", "source": "systemd", "state": "unknown", "status": "enabled" }, "getty@tty1.service": { "name": "getty@tty1.service", "source": "systemd", "state": "running", "status": "unknown" }, "gssproxy.service": { "name": "gssproxy.service", "source": "systemd", "state": "running", "status": "disabled" }, "halt-local.service": { "name": "halt-local.service", "source": "systemd", "state": "inactive", "status": "static" }, "initrd-cleanup.service": { "name": "initrd-cleanup.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-parse-etc.service": { "name": "initrd-parse-etc.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-switch-root.service": { "name": "initrd-switch-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-udevadm-cleanup-db.service": { "name": "initrd-udevadm-cleanup-db.service", "source": "systemd", "state": "stopped", "status": "static" }, "iprdump.service": { "name": "iprdump.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "iprinit.service": { "name": "iprinit.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "iprupdate.service": { "name": "iprupdate.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "irqbalance.service": { "name": "irqbalance.service", "source": "systemd", "state": "running", "status": "enabled" }, "kdump.service": { "name": "kdump.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "kmod-static-nodes.service": { "name": "kmod-static-nodes.service", "source": "systemd", "state": "stopped", "status": "static" }, "lvm2-lvmetad.service": { "name": "lvm2-lvmetad.service", "source": "systemd", "state": "running", "status": "static" }, "lvm2-lvmpolld.service": { "name": "lvm2-lvmpolld.service", "source": "systemd", "state": "stopped", "status": "static" }, "lvm2-monitor.service": { "name": "lvm2-monitor.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "lvm2-pvscan@.service": { "name": "lvm2-pvscan@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdadm-grow-continue@.service": { "name": "mdadm-grow-continue@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdadm-last-resort@.service": { "name": "mdadm-last-resort@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdcheck_continue.service": { "name": "mdcheck_continue.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdcheck_start.service": { "name": "mdcheck_start.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdmon@.service": { "name": "mdmon@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdmonitor-oneshot.service": { "name": "mdmonitor-oneshot.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdmonitor.service": { "name": "mdmonitor.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "messagebus.service": { "name": "messagebus.service", "source": "systemd", "state": "active", "status": "static" }, "microcode.service": { "name": "microcode.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "netconsole": { "name": "netconsole", "source": "sysv", "state": "stopped", "status": "disabled" }, "network": { "name": "network", "source": "sysv", "state": "running", "status": "enabled" }, 
"network.service": { "name": "network.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "nfs-blkmap.service": { "name": "nfs-blkmap.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfs-config.service": { "name": "nfs-config.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-idmap.service": { "name": "nfs-idmap.service", "source": "systemd", "state": "inactive", "status": "static" }, "nfs-idmapd.service": { "name": "nfs-idmapd.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-lock.service": { "name": "nfs-lock.service", "source": "systemd", "state": "inactive", "status": "static" }, "nfs-mountd.service": { "name": "nfs-mountd.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-rquotad.service": { "name": "nfs-rquotad.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfs-secure.service": { "name": "nfs-secure.service", "source": "systemd", "state": "inactive", "status": "static" }, "nfs-server.service": { "name": "nfs-server.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "nfs-utils.service": { "name": "nfs-utils.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs.service": { "name": "nfs.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfslock.service": { "name": "nfslock.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-halt.service": { "name": "plymouth-halt.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "plymouth-kexec.service": { "name": "plymouth-kexec.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "plymouth-poweroff.service": { "name": "plymouth-poweroff.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "plymouth-quit-wait.service": { "name": "plymouth-quit-wait.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "plymouth-quit.service": { "name": "plymouth-quit.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "plymouth-read-write.service": { "name": "plymouth-read-write.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "plymouth-reboot.service": { "name": "plymouth-reboot.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "plymouth-start.service": { "name": "plymouth-start.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "plymouth-switch-root.service": { "name": "plymouth-switch-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "polkit.service": { "name": "polkit.service", "source": "systemd", "state": "running", "status": "static" }, "postfix.service": { "name": "postfix.service", "source": "systemd", "state": "running", "status": "enabled" }, "qemu-guest-agent.service": { "name": "qemu-guest-agent.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "quotaon.service": { "name": "quotaon.service", "source": "systemd", "state": "inactive", "status": "static" }, "rc-local.service": { "name": "rc-local.service", "source": "systemd", "state": "stopped", "status": "static" }, "rdisc.service": { "name": "rdisc.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rescue.service": { "name": "rescue.service", "source": "systemd", "state": "stopped", "status": "static" }, "restraintd.service": { "name": "restraintd.service", "source": "systemd", 
"state": "running", "status": "enabled" }, "rhel-autorelabel-mark.service": { "name": "rhel-autorelabel-mark.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "rhel-autorelabel.service": { "name": "rhel-autorelabel.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "rhel-configure.service": { "name": "rhel-configure.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "rhel-dmesg.service": { "name": "rhel-dmesg.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "rhel-domainname.service": { "name": "rhel-domainname.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "rhel-import-state.service": { "name": "rhel-import-state.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "rhel-loadmodules.service": { "name": "rhel-loadmodules.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "rhel-readonly.service": { "name": "rhel-readonly.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "rngd.service": { "name": "rngd.service", "source": "systemd", "state": "running", "status": "enabled" }, "rpc-gssd.service": { "name": "rpc-gssd.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-rquotad.service": { "name": "rpc-rquotad.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rpc-statd-notify.service": { "name": "rpc-statd-notify.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-statd.service": { "name": "rpc-statd.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpcbind.service": { "name": "rpcbind.service", "source": "systemd", "state": "running", "status": "enabled" }, "rpcgssd.service": { "name": "rpcgssd.service", "source": "systemd", "state": "inactive", "status": "static" }, "rpcidmapd.service": { "name": "rpcidmapd.service", "source": "systemd", "state": "inactive", "status": "static" }, "rsyncd.service": { "name": "rsyncd.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rsyncd@.service": { "name": "rsyncd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "rsyslog.service": { "name": "rsyslog.service", "source": "systemd", "state": "running", "status": "enabled" }, "selinux-policy-migrate-local-changes@.service": { "name": "selinux-policy-migrate-local-changes@.service", "source": "systemd", "state": "unknown", "status": "static" }, "selinux-policy-migrate-local-changes@targeted.service": { "name": "selinux-policy-migrate-local-changes@targeted.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "serial-getty@.service": { "name": "serial-getty@.service", "source": "systemd", "state": "unknown", "status": "disabled" }, "serial-getty@ttyS0.service": { "name": "serial-getty@ttyS0.service", "source": "systemd", "state": "running", "status": "unknown" }, "sshd-keygen.service": { "name": "sshd-keygen.service", "source": "systemd", "state": "stopped", "status": "static" }, "sshd.service": { "name": "sshd.service", "source": "systemd", "state": "running", "status": "enabled" }, "sshd@.service": { "name": "sshd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-ask-password-console.service": { "name": "systemd-ask-password-console.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-ask-password-plymouth.service": { "name": "systemd-ask-password-plymouth.service", "source": "systemd", 
"state": "stopped", "status": "static" }, "systemd-ask-password-wall.service": { "name": "systemd-ask-password-wall.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-backlight@.service": { "name": "systemd-backlight@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-binfmt.service": { "name": "systemd-binfmt.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-bootchart.service": { "name": "systemd-bootchart.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "systemd-cryptsetup@luks\\x2d688ee6a7\\x2d12e5\\x2d4393\\x2dbfd5\\x2d48ed7d374b8d.service": { "name": "systemd-cryptsetup@luks\\x2d688ee6a7\\x2d12e5\\x2d4393\\x2dbfd5\\x2d48ed7d374b8d.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "systemd-firstboot.service": { "name": "systemd-firstboot.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-fsck-root.service": { "name": "systemd-fsck-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-fsck@.service": { "name": "systemd-fsck@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-halt.service": { "name": "systemd-halt.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hibernate-resume@.service": { "name": "systemd-hibernate-resume@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-hibernate.service": { "name": "systemd-hibernate.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hostnamed.service": { "name": "systemd-hostnamed.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hwdb-update.service": { "name": "systemd-hwdb-update.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-hybrid-sleep.service": { "name": "systemd-hybrid-sleep.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-importd.service": { "name": "systemd-importd.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-initctl.service": { "name": "systemd-initctl.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journal-catalog-update.service": { "name": "systemd-journal-catalog-update.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journal-flush.service": { "name": "systemd-journal-flush.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journald.service": { "name": "systemd-journald.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-kexec.service": { "name": "systemd-kexec.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-localed.service": { "name": "systemd-localed.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-logind.service": { "name": "systemd-logind.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-machine-id-commit.service": { "name": "systemd-machine-id-commit.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-machined.service": { "name": "systemd-machined.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-modules-load.service": { "name": "systemd-modules-load.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-nspawn@.service": { "name": 
"systemd-nspawn@.service", "source": "systemd", "state": "unknown", "status": "disabled" }, "systemd-poweroff.service": { "name": "systemd-poweroff.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-quotacheck.service": { "name": "systemd-quotacheck.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-random-seed.service": { "name": "systemd-random-seed.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-readahead-collect.service": { "name": "systemd-readahead-collect.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "systemd-readahead-done.service": { "name": "systemd-readahead-done.service", "source": "systemd", "state": "stopped", "status": "indirect" }, "systemd-readahead-drop.service": { "name": "systemd-readahead-drop.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "systemd-readahead-replay.service": { "name": "systemd-readahead-replay.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "systemd-reboot.service": { "name": "systemd-reboot.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-remount-fs.service": { "name": "systemd-remount-fs.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-rfkill@.service": { "name": "systemd-rfkill@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-shutdownd.service": { "name": "systemd-shutdownd.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-suspend.service": { "name": "systemd-suspend.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-sysctl.service": { "name": "systemd-sysctl.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-timedated.service": { "name": "systemd-timedated.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-tmpfiles-clean.service": { "name": "systemd-tmpfiles-clean.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles-setup-dev.service": { "name": "systemd-tmpfiles-setup-dev.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles-setup.service": { "name": "systemd-tmpfiles-setup.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udev-settle.service": { "name": "systemd-udev-settle.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udev-trigger.service": { "name": "systemd-udev-trigger.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udevd.service": { "name": "systemd-udevd.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-update-done.service": { "name": "systemd-update-done.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-update-utmp-runlevel.service": { "name": "systemd-update-utmp-runlevel.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-update-utmp.service": { "name": "systemd-update-utmp.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-user-sessions.service": { "name": "systemd-user-sessions.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-vconsole-setup.service": { "name": "systemd-vconsole-setup.service", "source": "systemd", "state": "stopped", "status": "static" }, "target.service": { "name": "target.service", 
"source": "systemd", "state": "inactive", "status": "disabled" }, "targetclid.service": { "name": "targetclid.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "teamd@.service": { "name": "teamd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "tuned.service": { "name": "tuned.service", "source": "systemd", "state": "running", "status": "enabled" }, "wpa_supplicant.service": { "name": "wpa_supplicant.service", "source": "systemd", "state": "inactive", "status": "disabled" } } }, "changed": false } TASK [fedora.linux_system_roles.storage : Set storage_cryptsetup_services] ***** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:57 Friday 17 January 2025 04:12:21 -0500 (0:00:01.045) 0:02:14.091 ******** ok: [managed-node1] => { "ansible_facts": { "storage_cryptsetup_services": [ "systemd-cryptsetup@luks\\x2d688ee6a7\\x2d12e5\\x2d4393\\x2dbfd5\\x2d48ed7d374b8d.service" ] }, "changed": false } TASK [fedora.linux_system_roles.storage : Mask the systemd cryptsetup services] *** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:63 Friday 17 January 2025 04:12:21 -0500 (0:00:00.075) 0:02:14.166 ******** changed: [managed-node1] => (item=systemd-cryptsetup@luks\x2d688ee6a7\x2d12e5\x2d4393\x2dbfd5\x2d48ed7d374b8d.service) => { "ansible_loop_var": "item", "changed": true, "item": "systemd-cryptsetup@luks\\x2d688ee6a7\\x2d12e5\\x2d4393\\x2dbfd5\\x2d48ed7d374b8d.service", "name": "systemd-cryptsetup@luks\\x2d688ee6a7\\x2d12e5\\x2d4393\\x2dbfd5\\x2d48ed7d374b8d.service", "status": { "ActiveEnterTimestampMonotonic": "0", "ActiveExitTimestampMonotonic": "0", "ActiveState": "inactive", "After": "systemd-journald.socket systemd-readahead-collect.service cryptsetup-pre.target dev-sda.device systemd-readahead-replay.service system-systemd\\x2dcryptsetup.slice", "AllowIsolate": "no", "AmbientCapabilities": "0", "AssertResult": "no", "AssertTimestampMonotonic": "0", "Before": "cryptsetup.target umount.target", "BindsTo": "dev-sda.device", "BlockIOAccounting": "no", "BlockIOWeight": "18446744073709551615", "CPUAccounting": "no", "CPUQuotaPerSecUSec": "infinity", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUSchedulingResetOnFork": "no", "CPUShares": "18446744073709551615", "CanIsolate": "no", "CanReload": "no", "CanStart": "yes", "CanStop": "yes", "CapabilityBoundingSet": "18446744073709551615", "CollectMode": "inactive", "ConditionResult": "no", "ConditionTimestampMonotonic": "0", "Conflicts": "umount.target", "ControlPID": "0", "DefaultDependencies": "no", "Delegate": "no", "Description": "Cryptography Setup for luks-688ee6a7-12e5-4393-bfd5-48ed7d374b8d", "DevicePolicy": "auto", "Documentation": "man:crypttab(5) man:systemd-cryptsetup-generator(8) man:systemd-cryptsetup@.service(8)", "ExecMainCode": "0", "ExecMainExitTimestampMonotonic": "0", "ExecMainPID": "0", "ExecMainStartTimestampMonotonic": "0", "ExecMainStatus": "0", "ExecStart": "{ path=/usr/lib/systemd/systemd-cryptsetup ; argv[]=/usr/lib/systemd/systemd-cryptsetup attach luks-688ee6a7-12e5-4393-bfd5-48ed7d374b8d /dev/sda - ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecStop": "{ path=/usr/lib/systemd/systemd-cryptsetup ; argv[]=/usr/lib/systemd/systemd-cryptsetup detach luks-688ee6a7-12e5-4393-bfd5-48ed7d374b8d ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", 
"FailureAction": "none", "FileDescriptorStoreMax": "0", "FragmentPath": "/run/systemd/generator/systemd-cryptsetup@luks\\x2d688ee6a7\\x2d12e5\\x2d4393\\x2dbfd5\\x2d48ed7d374b8d.service", "GuessMainPID": "yes", "IOScheduling": "0", "Id": "systemd-cryptsetup@luks\\x2d688ee6a7\\x2d12e5\\x2d4393\\x2dbfd5\\x2d48ed7d374b8d.service", "IgnoreOnIsolate": "yes", "IgnoreOnSnapshot": "no", "IgnoreSIGPIPE": "yes", "InactiveEnterTimestampMonotonic": "0", "InactiveExitTimestampMonotonic": "0", "JobTimeoutAction": "none", "JobTimeoutUSec": "0", "KillMode": "control-group", "KillSignal": "15", "LimitAS": "18446744073709551615", "LimitCORE": "18446744073709551615", "LimitCPU": "18446744073709551615", "LimitDATA": "18446744073709551615", "LimitFSIZE": "18446744073709551615", "LimitLOCKS": "18446744073709551615", "LimitMEMLOCK": "65536", "LimitMSGQUEUE": "819200", "LimitNICE": "0", "LimitNOFILE": "4096", "LimitNPROC": "14311", "LimitRSS": "18446744073709551615", "LimitRTPRIO": "0", "LimitRTTIME": "18446744073709551615", "LimitSIGPENDING": "14311", "LimitSTACK": "18446744073709551615", "LoadState": "loaded", "MainPID": "0", "MemoryAccounting": "no", "MemoryCurrent": "18446744073709551615", "MemoryLimit": "18446744073709551615", "MountFlags": "0", "Names": "systemd-cryptsetup@luks\\x2d688ee6a7\\x2d12e5\\x2d4393\\x2dbfd5\\x2d48ed7d374b8d.service", "NeedDaemonReload": "yes", "Nice": "0", "NoNewPrivileges": "no", "NonBlocking": "no", "NotifyAccess": "none", "OOMScoreAdjust": "0", "OnFailureJobMode": "replace", "PermissionsStartOnly": "no", "PrivateDevices": "no", "PrivateNetwork": "no", "PrivateTmp": "no", "ProtectHome": "no", "ProtectSystem": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "RemainAfterExit": "yes", "RequiredBy": "cryptsetup.target", "Requires": "system-systemd\\x2dcryptsetup.slice", "Restart": "no", "RestartUSec": "100ms", "Result": "success", "RootDirectoryStartOnly": "no", "RuntimeDirectoryMode": "0755", "SameProcessGroup": "no", "SecureBits": "0", "SendSIGHUP": "no", "SendSIGKILL": "yes", "Slice": "system-systemd\\x2dcryptsetup.slice", "SourcePath": "/etc/crypttab", "StandardError": "inherit", "StandardInput": "null", "StandardOutput": "journal", "StartLimitAction": "none", "StartLimitBurst": "5", "StartLimitInterval": "10000000", "StartupBlockIOWeight": "18446744073709551615", "StartupCPUShares": "18446744073709551615", "StatusErrno": "0", "StopWhenUnneeded": "no", "SubState": "dead", "SyslogLevelPrefix": "yes", "SyslogPriority": "30", "SystemCallErrorNumber": "0", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "TasksAccounting": "no", "TasksCurrent": "18446744073709551615", "TasksMax": "18446744073709551615", "TimeoutStartUSec": "0", "TimeoutStopUSec": "0", "TimerSlackNSec": "50000", "Transient": "no", "Type": "oneshot", "UMask": "0022", "UnitFilePreset": "disabled", "UnitFileState": "bad", "WantedBy": "dev-sda.device", "WatchdogTimestampMonotonic": "0", "WatchdogUSec": "0" } } TASK [fedora.linux_system_roles.storage : Manage the pools and volumes to match the specified state] *** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:69 Friday 17 January 2025 04:12:22 -0500 (0:00:00.550) 0:02:14.717 ******** fatal: [managed-node1]: FAILED! 
=> { "actions": [], "changed": false, "crypts": [], "leaves": [], "mounts": [], "packages": [], "pools": [], "volumes": [] } MSG: cannot remove existing formatting on device 'sda' in safe mode due to adding encryption TASK [fedora.linux_system_roles.storage : Failed message] ********************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:109 Friday 17 January 2025 04:12:26 -0500 (0:00:04.017) 0:02:18.735 ******** fatal: [managed-node1]: FAILED! => { "changed": false } MSG: {u'_ansible_no_log': False, u'crypts': [], u'pools': [], u'leaves': [], u'changed': False, u'actions': [], u'failed': True, u'volumes': [], u'invocation': {u'module_args': {u'packages_only': False, u'disklabel_type': None, u'diskvolume_mkfs_option_map': {u'ext4': u'-F', u'ext3': u'-F', u'ext2': u'-F'}, u'safe_mode': True, u'pools': [], u'volumes': [{u'raid_metadata_version': None, u'raid_level': None, u'fs_type': u'xfs', u'mount_options': u'defaults', u'size': 10737418240, u'mount_point': u'/opt/test1', u'compression': None, u'encryption_password': u'VALUE_SPECIFIED_IN_NO_LOG_PARAMETER', u'encryption': True, u'mount_device_identifier': u'uuid', u'raid_device_count': None, u'state': u'present', u'vdo_pool_size': None, u'mount_mode': None, u'thin_pool_name': None, u'fs_overwrite_existing': True, u'encryption_key_size': None, u'deduplication': None, u'encryption_cipher': None, u'encryption_key': None, u'fs_label': u'', u'encryption_luks_version': u'luks2', u'raid_stripe_size': None, u'cache_size': 0, u'mount_user': None, u'raid_spare_count': None, u'cache_mode': None, u'name': u'foo', u'mount_group': None, u'type': u'disk', u'disks': [u'sda'], u'cached': False, u'thin_pool_size': None, u'thin': None, u'mount_check': 0, u'mount_passno': 0, u'raid_chunk_size': None, u'cache_devices': [], u'fs_create_options': u''}], u'pool_defaults': {u'raid_metadata_version': None, u'encryption_cipher': None, u'encryption_key': None, u'encryption_luks_version': None, u'raid_spare_count': None, u'grow_to_fill': False, u'encryption_password': None, u'encryption': False, u'disks': [], u'raid_level': None, u'raid_device_count': None, u'state': u'present', u'volumes': [], u'shared': False, u'type': u'lvm', u'encryption_key_size': None, u'raid_chunk_size': None}, u'volume_defaults': {u'raid_metadata_version': None, u'raid_level': None, u'fs_type': u'xfs', u'mount_options': u'defaults', u'size': 0, u'mount_point': u'', u'compression': None, u'encryption_password': None, u'encryption': False, u'mount_device_identifier': u'uuid', u'raid_device_count': None, u'state': u'present', u'vdo_pool_size': None, u'thin_pool_name': None, u'type': u'lvm', u'encryption_key_size': None, u'encryption_cipher': None, u'encryption_key': None, u'fs_label': u'', u'encryption_luks_version': None, u'raid_stripe_size': None, u'cache_size': 0, u'raid_spare_count': None, u'cache_mode': None, u'deduplication': None, u'cached': False, u'fs_overwrite_existing': True, u'disks': [], u'thin_pool_size': None, u'thin': None, u'mount_check': 0, u'mount_passno': 0, u'raid_chunk_size': None, u'cache_devices': [], u'fs_create_options': u''}, u'use_partitions': None}}, u'mounts': [], u'packages': [], u'msg': u"cannot remove existing formatting on device 'sda' in safe mode due to adding encryption"} TASK [fedora.linux_system_roles.storage : Unmask the systemd cryptsetup services] *** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:113 Friday 17 January 
TASK [fedora.linux_system_roles.storage : Unmask the systemd cryptsetup services] *** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:113 Friday 17 January 2025 04:12:26 -0500 (0:00:00.069) 0:02:18.804 ******** changed: [managed-node1] => (item=systemd-cryptsetup@luks\x2d688ee6a7\x2d12e5\x2d4393\x2dbfd5\x2d48ed7d374b8d.service) => { "ansible_loop_var": "item", "changed": true, "item": "systemd-cryptsetup@luks\\x2d688ee6a7\\x2d12e5\\x2d4393\\x2dbfd5\\x2d48ed7d374b8d.service", "name": "systemd-cryptsetup@luks\\x2d688ee6a7\\x2d12e5\\x2d4393\\x2dbfd5\\x2d48ed7d374b8d.service", "status": { "ActiveEnterTimestampMonotonic": "0", "ActiveExitTimestampMonotonic": "0", "ActiveState": "inactive", "AllowIsolate": "no", "AmbientCapabilities": "0", "AssertResult": "no", "AssertTimestampMonotonic": "0", "BlockIOAccounting": "no", "BlockIOWeight": "18446744073709551615", "CPUAccounting": "no", "CPUQuotaPerSecUSec": "infinity", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUSchedulingResetOnFork": "no", "CPUShares": "18446744073709551615", "CanIsolate": "no", "CanReload": "no", "CanStart": "yes", "CanStop": "yes", "CapabilityBoundingSet": "18446744073709551615", "CollectMode": "inactive", "ConditionResult": "no", "ConditionTimestampMonotonic": "0", "ControlPID": "0", "DefaultDependencies": "yes", "Delegate": "no", "Description": "systemd-cryptsetup@luks\\x2d688ee6a7\\x2d12e5\\x2d4393\\x2dbfd5\\x2d48ed7d374b8d.service", "DevicePolicy": "auto", "ExecMainCode": "0", "ExecMainExitTimestampMonotonic": "0", "ExecMainPID": "0", "ExecMainStartTimestampMonotonic": "0", "ExecMainStatus": "0", "FailureAction": "none", "FileDescriptorStoreMax": "0", "FragmentPath": "/dev/null", "GuessMainPID": "yes", "IOScheduling": "0", "Id": "systemd-cryptsetup@luks\\x2d688ee6a7\\x2d12e5\\x2d4393\\x2dbfd5\\x2d48ed7d374b8d.service", "IgnoreOnIsolate": "no", "IgnoreOnSnapshot": "no", "IgnoreSIGPIPE": "yes", "InactiveEnterTimestampMonotonic": "0", "InactiveExitTimestampMonotonic": "0", "JobTimeoutAction": "none", "JobTimeoutUSec": "0", "KillMode": "control-group", "KillSignal": "15", "LimitAS": "18446744073709551615", "LimitCORE": "18446744073709551615", "LimitCPU": "18446744073709551615", "LimitDATA": "18446744073709551615", "LimitFSIZE": "18446744073709551615", "LimitLOCKS": "18446744073709551615", "LimitMEMLOCK": "65536", "LimitMSGQUEUE": "819200", "LimitNICE": "0", "LimitNOFILE": "1048576", "LimitNPROC": "14311", "LimitRSS": "18446744073709551615", "LimitRTPRIO": "0", "LimitRTTIME": "18446744073709551615", "LimitSIGPENDING": "14311", "LimitSTACK": "18446744073709551615", "LoadState": "masked", "MainPID": "0", "MemoryAccounting": "no", "MemoryCurrent": "18446744073709551615", "MemoryLimit": "18446744073709551615", "MountFlags": "0", "Names": "systemd-cryptsetup@luks\\x2d688ee6a7\\x2d12e5\\x2d4393\\x2dbfd5\\x2d48ed7d374b8d.service", "NeedDaemonReload": "no", "Nice": "0", "NoNewPrivileges": "no", "NonBlocking": "no", "NotifyAccess": "none", "OOMScoreAdjust": "0", "OnFailureJobMode": "replace", "PermissionsStartOnly": "no", "PrivateDevices": "no", "PrivateNetwork": "no", "PrivateTmp": "no", "ProtectHome": "no", "ProtectSystem": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "RemainAfterExit": "no", "Restart": "no", "RestartUSec": "100ms", "Result": "success", "RootDirectoryStartOnly": "no", "RuntimeDirectoryMode": "0755", "SameProcessGroup": "no", "SecureBits": "0", "SendSIGHUP": "no", "SendSIGKILL": "yes", "StandardError": "inherit", "StandardInput": "null", "StandardOutput": "inherit", "StartLimitAction": "none", "StartLimitBurst": "5", "StartLimitInterval": "10000000", "StartupBlockIOWeight": "18446744073709551615", "StartupCPUShares": "18446744073709551615",
"StatusErrno": "0", "StopWhenUnneeded": "no", "SubState": "dead", "SyslogLevelPrefix": "yes", "SyslogPriority": "30", "SystemCallErrorNumber": "0", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "TasksAccounting": "no", "TasksCurrent": "18446744073709551615", "TasksMax": "18446744073709551615", "TimeoutStartUSec": "1min 30s", "TimeoutStopUSec": "1min 30s", "TimerSlackNSec": "50000", "Transient": "no", "UMask": "0022", "UnitFileState": "bad", "WatchdogTimestampMonotonic": "0", "WatchdogUSec": "0" } } TASK [Check that we failed in the role] **************************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-failed.yml:23 Friday 17 January 2025 04:12:26 -0500 (0:00:00.626) 0:02:19.430 ******** ok: [managed-node1] => { "changed": false } MSG: All assertions passed TASK [Verify the blivet output and error message are correct] ****************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-failed.yml:28 Friday 17 January 2025 04:12:26 -0500 (0:00:00.069) 0:02:19.499 ******** ok: [managed-node1] => { "changed": false } MSG: All assertions passed TASK [Verify correct exception or error message] ******************************* task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-failed.yml:39 Friday 17 January 2025 04:12:26 -0500 (0:00:00.125) 0:02:19.625 ******** skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Stat the file] *********************************************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-data-preservation.yml:11 Friday 17 January 2025 04:12:27 -0500 (0:00:00.111) 0:02:19.737 ******** ok: [managed-node1] => { "changed": false, "stat": { "atime": 1737105132.6488855, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 0, "charset": "binary", "checksum": "da39a3ee5e6b4b0d3255bfef95601890afd80709", "ctime": 1737105132.6488855, "dev": 2048, "device_type": 0, "executable": false, "exists": true, "gid": 0, "gr_name": "root", "inode": 67, "isblk": false, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": true, "issock": false, "isuid": false, "mimetype": "inode/x-empty", "mode": "0644", "mtime": 1737105132.6488855, "nlink": 1, "path": "/opt/test1/quux", "pw_name": "root", "readable": true, "rgrp": true, "roth": true, "rusr": true, "size": 0, "uid": 0, "version": "685872589", "wgrp": false, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [Assert file presence] **************************************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-data-preservation.yml:16 Friday 17 January 2025 04:12:27 -0500 (0:00:00.523) 0:02:20.260 ******** ok: [managed-node1] => { "changed": false } MSG: All assertions passed TASK [Add encryption to the volume] ******************************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/tests_luks2.yml:182 Friday 17 January 2025 04:12:27 -0500 (0:00:00.066) 0:02:20.326 ******** TASK [fedora.linux_system_roles.storage : Set platform/version specific variables] *** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:2 Friday 17 January 2025 
04:12:28 -0500 (0:00:00.430) 0:02:20.757 ******** included: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml for managed-node1 TASK [fedora.linux_system_roles.storage : Ensure ansible_facts used by role] *** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:2 Friday 17 January 2025 04:12:28 -0500 (0:00:00.096) 0:02:20.854 ******** skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Set platform/version specific variables] *** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:7 Friday 17 January 2025 04:12:28 -0500 (0:00:00.065) 0:02:20.920 ******** skipping: [managed-node1] => (item=RedHat.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat.yml", "skip_reason": "Conditional result was False" } skipping: [managed-node1] => (item=CentOS.yml) => { "ansible_loop_var": "item", "changed": false, "item": "CentOS.yml", "skip_reason": "Conditional result was False" } ok: [managed-node1] => (item=CentOS_7.yml) => { "ansible_facts": { "__storage_blivet_diskvolume_mkfs_option_map": { "ext2": "-F", "ext3": "-F", "ext4": "-F" }, "blivet_package_list": [ "python-enum34", "python-blivet3", "libblockdev-crypto", "libblockdev-dm", "libblockdev-lvm", "libblockdev-mdraid", "libblockdev-swap", "{{ 'libblockdev-s390' if ansible_architecture == 's390x' else 'libblockdev' }}" ] }, "ansible_included_var_files": [ "/tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/vars/CentOS_7.yml" ], "ansible_loop_var": "item", "changed": false, "item": "CentOS_7.yml" } skipping: [managed-node1] => (item=CentOS_7.9.yml) => { "ansible_loop_var": "item", "changed": false, "item": "CentOS_7.9.yml", "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Check if system is ostree] *********** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:25 Friday 17 January 2025 04:12:28 -0500 (0:00:00.131) 0:02:21.051 ******** skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Set flag to indicate system is ostree] *** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:30 Friday 17 January 2025 04:12:28 -0500 (0:00:00.050) 0:02:21.101 ******** skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Define an empty list of pools to be used in testing] *** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:5 Friday 17 January 2025 04:12:28 -0500 (0:00:00.040) 0:02:21.142 ******** ok: [managed-node1] => { "ansible_facts": { "_storage_pools_list": [] }, "changed": false } TASK [fedora.linux_system_roles.storage : Define an empty list of volumes to be used in testing] *** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:9 Friday 17 January 2025 04:12:28 -0500 (0:00:00.037) 0:02:21.179 ******** ok: [managed-node1] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false } TASK [fedora.linux_system_roles.storage : Include the appropriate provider tasks] *** task path: 
/tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:13 Friday 17 January 2025 04:12:28 -0500 (0:00:00.039) 0:02:21.219 ******** included: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml for managed-node1 TASK [fedora.linux_system_roles.storage : Make sure blivet is available] ******* task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:2 Friday 17 January 2025 04:12:28 -0500 (0:00:00.115) 0:02:21.334 ******** ok: [managed-node1] => { "changed": false, "rc": 0, "results": [ "python-enum34-1.0.4-1.el7.noarch providing python-enum34 is already installed", "1:python2-blivet3-3.1.3-3.el7.noarch providing python-blivet3 is already installed", "libblockdev-crypto-2.18-5.el7.x86_64 providing libblockdev-crypto is already installed", "libblockdev-dm-2.18-5.el7.x86_64 providing libblockdev-dm is already installed", "libblockdev-lvm-2.18-5.el7.x86_64 providing libblockdev-lvm is already installed", "libblockdev-mdraid-2.18-5.el7.x86_64 providing libblockdev-mdraid is already installed", "libblockdev-swap-2.18-5.el7.x86_64 providing libblockdev-swap is already installed", "libblockdev-2.18-5.el7.x86_64 providing libblockdev is already installed" ] } TASK [fedora.linux_system_roles.storage : Show storage_pools] ****************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:9 Friday 17 January 2025 04:12:32 -0500 (0:00:03.951) 0:02:25.285 ******** ok: [managed-node1] => { "storage_pools": "VARIABLE IS NOT DEFINED!: 'storage_pools' is undefined" } TASK [fedora.linux_system_roles.storage : Show storage_volumes] **************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:14 Friday 17 January 2025 04:12:32 -0500 (0:00:00.064) 0:02:25.350 ******** ok: [managed-node1] => { "storage_volumes": [ { "disks": [ "sda" ], "encryption": true, "encryption_luks_version": "luks2", "encryption_password": "yabbadabbadoo", "mount_point": "/opt/test1", "name": "foo", "type": "disk" } ] } TASK [fedora.linux_system_roles.storage : Get required packages] *************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:19 Friday 17 January 2025 04:12:32 -0500 (0:00:00.087) 0:02:25.438 ******** ok: [managed-node1] => { "actions": [], "changed": false, "crypts": [], "leaves": [], "mounts": [], "packages": [ "cryptsetup" ], "pools": [], "volumes": [] } TASK [fedora.linux_system_roles.storage : Enable copr repositories if needed] *** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:31 Friday 17 January 2025 04:12:36 -0500 (0:00:04.157) 0:02:29.596 ******** included: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml for managed-node1 TASK [fedora.linux_system_roles.storage : Check if the COPR support packages should be installed] *** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:2 Friday 17 January 2025 04:12:36 -0500 (0:00:00.084) 0:02:29.681 ******** TASK [fedora.linux_system_roles.storage : Make sure COPR support packages are present] *** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:13 Friday 17 January 2025 04:12:37 
-0500 (0:00:00.040) 0:02:29.721 ******** skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Enable COPRs] ************************ task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:19 Friday 17 January 2025 04:12:37 -0500 (0:00:00.039) 0:02:29.761 ******** TASK [fedora.linux_system_roles.storage : Make sure required packages are installed] *** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:37 Friday 17 January 2025 04:12:37 -0500 (0:00:00.034) 0:02:29.795 ******** ok: [managed-node1] => { "changed": false, "rc": 0, "results": [ "cryptsetup-2.0.3-6.el7.x86_64 providing cryptsetup is already installed", "kpartx-0.4.9-136.el7_9.x86_64 providing kpartx is already installed" ] } TASK [fedora.linux_system_roles.storage : Get service facts] ******************* task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:51 Friday 17 January 2025 04:12:37 -0500 (0:00:00.620) 0:02:30.416 ******** ok: [managed-node1] => { "ansible_facts": { "services": { "NetworkManager-dispatcher.service": { "name": "NetworkManager-dispatcher.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "NetworkManager-wait-online.service": { "name": "NetworkManager-wait-online.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "NetworkManager.service": { "name": "NetworkManager.service", "source": "systemd", "state": "running", "status": "enabled" }, "arp-ethers.service": { "name": "arp-ethers.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "auditd.service": { "name": "auditd.service", "source": "systemd", "state": "running", "status": "enabled" }, "auth-rpcgss-module.service": { "name": "auth-rpcgss-module.service", "source": "systemd", "state": "stopped", "status": "static" }, "autovt@.service": { "name": "autovt@.service", "source": "systemd", "state": "unknown", "status": "enabled" }, "blivet.service": { "name": "blivet.service", "source": "systemd", "state": "inactive", "status": "static" }, "blk-availability.service": { "name": "blk-availability.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "brandbot.service": { "name": "brandbot.service", "source": "systemd", "state": "inactive", "status": "static" }, "chrony-dnssrv@.service": { "name": "chrony-dnssrv@.service", "source": "systemd", "state": "unknown", "status": "static" }, "chrony-wait.service": { "name": "chrony-wait.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "chronyd.service": { "name": "chronyd.service", "source": "systemd", "state": "running", "status": "enabled" }, "cloud-config.service": { "name": "cloud-config.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-final.service": { "name": "cloud-final.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-init-local.service": { "name": "cloud-init-local.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-init.service": { "name": "cloud-init.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "console-getty.service": { "name": "console-getty.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "console-shell.service": { "name": "console-shell.service", "source": "systemd", "state": 
"inactive", "status": "disabled" }, "container-getty@.service": { "name": "container-getty@.service", "source": "systemd", "state": "unknown", "status": "static" }, "cpupower.service": { "name": "cpupower.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "crond.service": { "name": "crond.service", "source": "systemd", "state": "running", "status": "enabled" }, "dbus-org.freedesktop.hostname1.service": { "name": "dbus-org.freedesktop.hostname1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus-org.freedesktop.import1.service": { "name": "dbus-org.freedesktop.import1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus-org.freedesktop.locale1.service": { "name": "dbus-org.freedesktop.locale1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus-org.freedesktop.login1.service": { "name": "dbus-org.freedesktop.login1.service", "source": "systemd", "state": "active", "status": "static" }, "dbus-org.freedesktop.machine1.service": { "name": "dbus-org.freedesktop.machine1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus-org.freedesktop.nm-dispatcher.service": { "name": "dbus-org.freedesktop.nm-dispatcher.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "dbus-org.freedesktop.timedate1.service": { "name": "dbus-org.freedesktop.timedate1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus.service": { "name": "dbus.service", "source": "systemd", "state": "running", "status": "static" }, "debug-shell.service": { "name": "debug-shell.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "dm-event.service": { "name": "dm-event.service", "source": "systemd", "state": "stopped", "status": "static" }, "dmraid-activation.service": { "name": "dmraid-activation.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "dracut-cmdline.service": { "name": "dracut-cmdline.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-initqueue.service": { "name": "dracut-initqueue.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-mount.service": { "name": "dracut-mount.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-mount.service": { "name": "dracut-pre-mount.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-pivot.service": { "name": "dracut-pre-pivot.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-trigger.service": { "name": "dracut-pre-trigger.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-udev.service": { "name": "dracut-pre-udev.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-shutdown.service": { "name": "dracut-shutdown.service", "source": "systemd", "state": "stopped", "status": "static" }, "ebtables.service": { "name": "ebtables.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "emergency.service": { "name": "emergency.service", "source": "systemd", "state": "stopped", "status": "static" }, "firewalld.service": { "name": "firewalld.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "fstrim.service": { "name": "fstrim.service", "source": "systemd", "state": "inactive", "status": "static" }, "getty@.service": { "name": "getty@.service", "source": "systemd", "state": "unknown", "status": "enabled" }, 
"getty@tty1.service": { "name": "getty@tty1.service", "source": "systemd", "state": "running", "status": "unknown" }, "gssproxy.service": { "name": "gssproxy.service", "source": "systemd", "state": "running", "status": "disabled" }, "halt-local.service": { "name": "halt-local.service", "source": "systemd", "state": "inactive", "status": "static" }, "initrd-cleanup.service": { "name": "initrd-cleanup.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-parse-etc.service": { "name": "initrd-parse-etc.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-switch-root.service": { "name": "initrd-switch-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-udevadm-cleanup-db.service": { "name": "initrd-udevadm-cleanup-db.service", "source": "systemd", "state": "stopped", "status": "static" }, "iprdump.service": { "name": "iprdump.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "iprinit.service": { "name": "iprinit.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "iprupdate.service": { "name": "iprupdate.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "irqbalance.service": { "name": "irqbalance.service", "source": "systemd", "state": "running", "status": "enabled" }, "kdump.service": { "name": "kdump.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "kmod-static-nodes.service": { "name": "kmod-static-nodes.service", "source": "systemd", "state": "stopped", "status": "static" }, "lvm2-lvmetad.service": { "name": "lvm2-lvmetad.service", "source": "systemd", "state": "running", "status": "static" }, "lvm2-lvmpolld.service": { "name": "lvm2-lvmpolld.service", "source": "systemd", "state": "stopped", "status": "static" }, "lvm2-monitor.service": { "name": "lvm2-monitor.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "lvm2-pvscan@.service": { "name": "lvm2-pvscan@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdadm-grow-continue@.service": { "name": "mdadm-grow-continue@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdadm-last-resort@.service": { "name": "mdadm-last-resort@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdcheck_continue.service": { "name": "mdcheck_continue.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdcheck_start.service": { "name": "mdcheck_start.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdmon@.service": { "name": "mdmon@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdmonitor-oneshot.service": { "name": "mdmonitor-oneshot.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdmonitor.service": { "name": "mdmonitor.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "messagebus.service": { "name": "messagebus.service", "source": "systemd", "state": "active", "status": "static" }, "microcode.service": { "name": "microcode.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "netconsole": { "name": "netconsole", "source": "sysv", "state": "stopped", "status": "disabled" }, "network": { "name": "network", "source": "sysv", "state": "running", "status": "enabled" }, "network.service": { "name": "network.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "nfs-blkmap.service": { "name": "nfs-blkmap.service", 
"source": "systemd", "state": "inactive", "status": "disabled" }, "nfs-config.service": { "name": "nfs-config.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-idmap.service": { "name": "nfs-idmap.service", "source": "systemd", "state": "inactive", "status": "static" }, "nfs-idmapd.service": { "name": "nfs-idmapd.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-lock.service": { "name": "nfs-lock.service", "source": "systemd", "state": "inactive", "status": "static" }, "nfs-mountd.service": { "name": "nfs-mountd.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-rquotad.service": { "name": "nfs-rquotad.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfs-secure.service": { "name": "nfs-secure.service", "source": "systemd", "state": "inactive", "status": "static" }, "nfs-server.service": { "name": "nfs-server.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "nfs-utils.service": { "name": "nfs-utils.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs.service": { "name": "nfs.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfslock.service": { "name": "nfslock.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-halt.service": { "name": "plymouth-halt.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "plymouth-kexec.service": { "name": "plymouth-kexec.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "plymouth-poweroff.service": { "name": "plymouth-poweroff.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "plymouth-quit-wait.service": { "name": "plymouth-quit-wait.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "plymouth-quit.service": { "name": "plymouth-quit.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "plymouth-read-write.service": { "name": "plymouth-read-write.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "plymouth-reboot.service": { "name": "plymouth-reboot.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "plymouth-start.service": { "name": "plymouth-start.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "plymouth-switch-root.service": { "name": "plymouth-switch-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "polkit.service": { "name": "polkit.service", "source": "systemd", "state": "running", "status": "static" }, "postfix.service": { "name": "postfix.service", "source": "systemd", "state": "running", "status": "enabled" }, "qemu-guest-agent.service": { "name": "qemu-guest-agent.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "quotaon.service": { "name": "quotaon.service", "source": "systemd", "state": "inactive", "status": "static" }, "rc-local.service": { "name": "rc-local.service", "source": "systemd", "state": "stopped", "status": "static" }, "rdisc.service": { "name": "rdisc.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rescue.service": { "name": "rescue.service", "source": "systemd", "state": "stopped", "status": "static" }, "restraintd.service": { "name": "restraintd.service", "source": "systemd", "state": "running", "status": "enabled" }, "rhel-autorelabel-mark.service": { "name": "rhel-autorelabel-mark.service", "source": "systemd", "state": "stopped", 
"status": "enabled" }, "rhel-autorelabel.service": { "name": "rhel-autorelabel.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "rhel-configure.service": { "name": "rhel-configure.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "rhel-dmesg.service": { "name": "rhel-dmesg.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "rhel-domainname.service": { "name": "rhel-domainname.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "rhel-import-state.service": { "name": "rhel-import-state.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "rhel-loadmodules.service": { "name": "rhel-loadmodules.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "rhel-readonly.service": { "name": "rhel-readonly.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "rngd.service": { "name": "rngd.service", "source": "systemd", "state": "running", "status": "enabled" }, "rpc-gssd.service": { "name": "rpc-gssd.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-rquotad.service": { "name": "rpc-rquotad.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rpc-statd-notify.service": { "name": "rpc-statd-notify.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-statd.service": { "name": "rpc-statd.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpcbind.service": { "name": "rpcbind.service", "source": "systemd", "state": "running", "status": "enabled" }, "rpcgssd.service": { "name": "rpcgssd.service", "source": "systemd", "state": "inactive", "status": "static" }, "rpcidmapd.service": { "name": "rpcidmapd.service", "source": "systemd", "state": "inactive", "status": "static" }, "rsyncd.service": { "name": "rsyncd.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rsyncd@.service": { "name": "rsyncd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "rsyslog.service": { "name": "rsyslog.service", "source": "systemd", "state": "running", "status": "enabled" }, "selinux-policy-migrate-local-changes@.service": { "name": "selinux-policy-migrate-local-changes@.service", "source": "systemd", "state": "unknown", "status": "static" }, "selinux-policy-migrate-local-changes@targeted.service": { "name": "selinux-policy-migrate-local-changes@targeted.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "serial-getty@.service": { "name": "serial-getty@.service", "source": "systemd", "state": "unknown", "status": "disabled" }, "serial-getty@ttyS0.service": { "name": "serial-getty@ttyS0.service", "source": "systemd", "state": "running", "status": "unknown" }, "sshd-keygen.service": { "name": "sshd-keygen.service", "source": "systemd", "state": "stopped", "status": "static" }, "sshd.service": { "name": "sshd.service", "source": "systemd", "state": "running", "status": "enabled" }, "sshd@.service": { "name": "sshd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-ask-password-console.service": { "name": "systemd-ask-password-console.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-ask-password-plymouth.service": { "name": "systemd-ask-password-plymouth.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-ask-password-wall.service": { "name": "systemd-ask-password-wall.service", "source": "systemd", "state": "stopped", 
"status": "static" }, "systemd-backlight@.service": { "name": "systemd-backlight@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-binfmt.service": { "name": "systemd-binfmt.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-bootchart.service": { "name": "systemd-bootchart.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "systemd-firstboot.service": { "name": "systemd-firstboot.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-fsck-root.service": { "name": "systemd-fsck-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-fsck@.service": { "name": "systemd-fsck@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-halt.service": { "name": "systemd-halt.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hibernate-resume@.service": { "name": "systemd-hibernate-resume@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-hibernate.service": { "name": "systemd-hibernate.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hostnamed.service": { "name": "systemd-hostnamed.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hwdb-update.service": { "name": "systemd-hwdb-update.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-hybrid-sleep.service": { "name": "systemd-hybrid-sleep.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-importd.service": { "name": "systemd-importd.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-initctl.service": { "name": "systemd-initctl.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journal-catalog-update.service": { "name": "systemd-journal-catalog-update.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journal-flush.service": { "name": "systemd-journal-flush.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journald.service": { "name": "systemd-journald.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-kexec.service": { "name": "systemd-kexec.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-localed.service": { "name": "systemd-localed.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-logind.service": { "name": "systemd-logind.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-machine-id-commit.service": { "name": "systemd-machine-id-commit.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-machined.service": { "name": "systemd-machined.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-modules-load.service": { "name": "systemd-modules-load.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-nspawn@.service": { "name": "systemd-nspawn@.service", "source": "systemd", "state": "unknown", "status": "disabled" }, "systemd-poweroff.service": { "name": "systemd-poweroff.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-quotacheck.service": { "name": "systemd-quotacheck.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-random-seed.service": { "name": "systemd-random-seed.service", "source": 
"systemd", "state": "stopped", "status": "static" }, "systemd-readahead-collect.service": { "name": "systemd-readahead-collect.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "systemd-readahead-done.service": { "name": "systemd-readahead-done.service", "source": "systemd", "state": "stopped", "status": "indirect" }, "systemd-readahead-drop.service": { "name": "systemd-readahead-drop.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "systemd-readahead-replay.service": { "name": "systemd-readahead-replay.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "systemd-reboot.service": { "name": "systemd-reboot.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-remount-fs.service": { "name": "systemd-remount-fs.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-rfkill@.service": { "name": "systemd-rfkill@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-shutdownd.service": { "name": "systemd-shutdownd.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-suspend.service": { "name": "systemd-suspend.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-sysctl.service": { "name": "systemd-sysctl.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-timedated.service": { "name": "systemd-timedated.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-tmpfiles-clean.service": { "name": "systemd-tmpfiles-clean.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles-setup-dev.service": { "name": "systemd-tmpfiles-setup-dev.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles-setup.service": { "name": "systemd-tmpfiles-setup.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udev-settle.service": { "name": "systemd-udev-settle.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udev-trigger.service": { "name": "systemd-udev-trigger.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udevd.service": { "name": "systemd-udevd.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-update-done.service": { "name": "systemd-update-done.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-update-utmp-runlevel.service": { "name": "systemd-update-utmp-runlevel.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-update-utmp.service": { "name": "systemd-update-utmp.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-user-sessions.service": { "name": "systemd-user-sessions.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-vconsole-setup.service": { "name": "systemd-vconsole-setup.service", "source": "systemd", "state": "stopped", "status": "static" }, "target.service": { "name": "target.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "targetclid.service": { "name": "targetclid.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "teamd@.service": { "name": "teamd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "tuned.service": { "name": "tuned.service", "source": "systemd", "state": "running", "status": "enabled" }, "wpa_supplicant.service": { "name": 
"wpa_supplicant.service", "source": "systemd", "state": "inactive", "status": "disabled" } } }, "changed": false } TASK [fedora.linux_system_roles.storage : Set storage_cryptsetup_services] ***** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:57 Friday 17 January 2025 04:12:38 -0500 (0:00:00.939) 0:02:31.355 ******** ok: [managed-node1] => { "ansible_facts": { "storage_cryptsetup_services": [] }, "changed": false } TASK [fedora.linux_system_roles.storage : Mask the systemd cryptsetup services] *** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:63 Friday 17 January 2025 04:12:38 -0500 (0:00:00.054) 0:02:31.409 ******** TASK [fedora.linux_system_roles.storage : Manage the pools and volumes to match the specified state] *** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:69 Friday 17 January 2025 04:12:38 -0500 (0:00:00.033) 0:02:31.443 ******** changed: [managed-node1] => { "actions": [ { "action": "destroy format", "device": "/dev/sda", "fs_type": "xfs" }, { "action": "create format", "device": "/dev/sda", "fs_type": "luks" }, { "action": "create device", "device": "/dev/mapper/luks-8c8d336a-42af-4ee5-a371-3bf9cd16f56b", "fs_type": null }, { "action": "create format", "device": "/dev/mapper/luks-8c8d336a-42af-4ee5-a371-3bf9cd16f56b", "fs_type": "xfs" } ], "changed": true, "crypts": [ { "backing_device": "/dev/sda", "name": "luks-8c8d336a-42af-4ee5-a371-3bf9cd16f56b", "password": "-", "state": "present" } ], "leaves": [ "/dev/sdb", "/dev/sdc", "/dev/sdd", "/dev/sde", "/dev/sdf", "/dev/sdg", "/dev/sdh", "/dev/sdi", "/dev/xvda1", "/dev/mapper/luks-8c8d336a-42af-4ee5-a371-3bf9cd16f56b" ], "mounts": [ { "fstype": "xfs", "path": "/opt/test1", "src": "UUID=e4ad127e-be0f-45ac-a94b-c1f82748bbb5", "state": "absent" }, { "dump": 0, "fstype": "xfs", "group": null, "mode": null, "opts": "defaults", "owner": null, "passno": 0, "path": "/opt/test1", "src": "/dev/mapper/luks-8c8d336a-42af-4ee5-a371-3bf9cd16f56b", "state": "mounted" } ], "packages": [ "cryptsetup", "xfsprogs", "e2fsprogs" ], "pools": [], "volumes": [ { "_device": "/dev/mapper/luks-8c8d336a-42af-4ee5-a371-3bf9cd16f56b", "_kernel_device": "/dev/dm-0", "_mount_id": "/dev/mapper/luks-8c8d336a-42af-4ee5-a371-3bf9cd16f56b", "_raw_device": "/dev/sda", "_raw_kernel_device": "/dev/sda", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": true, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": "luks2", "encryption_password": "VALUE_SPECIFIED_IN_NO_LOG_PARAMETER", "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "mount_user": null, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": 10737418240, "state": "present", "thin": null, "thin_pool_name": null, "thin_pool_size": null, "type": "disk", "vdo_pool_size": null } ] } TASK [fedora.linux_system_roles.storage : Workaround for udev issue on some platforms] *** task path: 
TASK [fedora.linux_system_roles.storage : Workaround for udev issue on some platforms] *** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:83 Friday 17 January 2025 04:12:51 -0500 (0:00:12.533) 0:02:43.976 ******** skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Check if /etc/fstab is present] ****** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:90 Friday 17 January 2025 04:12:51 -0500 (0:00:00.051) 0:02:44.028 ******** ok: [managed-node1] => { "changed": false, "stat": { "atime": 1737105122.6098833, "attr_flags": "e", "attributes": [ "extents" ], "block_size": 4096, "blocks": 8, "charset": "us-ascii", "checksum": "1efe0fcc700d2a5830fa1dd36c7e7692e7b2a101", "ctime": 1737105122.6078832, "dev": 51713, "device_type": 0, "executable": false, "exists": true, "gid": 0, "gr_name": "root", "inode": 263644, "isblk": false, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": true, "issock": false, "isuid": false, "mimetype": "text/plain", "mode": "0644", "mtime": 1737105122.6078832, "nlink": 1, "path": "/etc/fstab", "pw_name": "root", "readable": true, "rgrp": true, "roth": true, "rusr": true, "size": 1299, "uid": 0, "version": "18446744072031193646", "wgrp": false, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [fedora.linux_system_roles.storage : Add fingerprint to /etc/fstab if present] *** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:95 Friday 17 January 2025 04:12:51 -0500 (0:00:00.370) 0:02:44.398 ******** ok: [managed-node1] => { "backup": "", "changed": false } TASK [fedora.linux_system_roles.storage : Unmask the systemd cryptsetup services] *** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:113 Friday 17 January 2025 04:12:52 -0500 (0:00:00.361) 0:02:44.759 ******** TASK [fedora.linux_system_roles.storage : Show blivet_output] ****************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:119 Friday 17 January 2025 04:12:52 -0500 (0:00:00.043) 0:02:44.803 ******** ok: [managed-node1] => { "blivet_output": { "actions": [ { "action": "destroy format", "device": "/dev/sda", "fs_type": "xfs" }, { "action": "create format", "device": "/dev/sda", "fs_type": "luks" }, { "action": "create device", "device": "/dev/mapper/luks-8c8d336a-42af-4ee5-a371-3bf9cd16f56b", "fs_type": null }, { "action": "create format", "device": "/dev/mapper/luks-8c8d336a-42af-4ee5-a371-3bf9cd16f56b", "fs_type": "xfs" } ], "changed": true, "crypts": [ { "backing_device": "/dev/sda", "name": "luks-8c8d336a-42af-4ee5-a371-3bf9cd16f56b", "password": "-", "state": "present" } ], "failed": false, "leaves": [ "/dev/sdb", "/dev/sdc", "/dev/sdd", "/dev/sde", "/dev/sdf", "/dev/sdg", "/dev/sdh", "/dev/sdi", "/dev/xvda1", "/dev/mapper/luks-8c8d336a-42af-4ee5-a371-3bf9cd16f56b" ], "mounts": [ { "fstype": "xfs", "path": "/opt/test1", "src": "UUID=e4ad127e-be0f-45ac-a94b-c1f82748bbb5", "state": "absent" }, { "dump": 0, "fstype": "xfs", "group": null, "mode": null, "opts": "defaults", "owner": null, "passno": 0, "path": "/opt/test1", "src": "/dev/mapper/luks-8c8d336a-42af-4ee5-a371-3bf9cd16f56b", "state": "mounted" } ], "packages": [ "cryptsetup", "xfsprogs", "e2fsprogs" ], "pools": [], "volumes": [ { "_device":
"/dev/mapper/luks-8c8d336a-42af-4ee5-a371-3bf9cd16f56b", "_kernel_device": "/dev/dm-0", "_mount_id": "/dev/mapper/luks-8c8d336a-42af-4ee5-a371-3bf9cd16f56b", "_raw_device": "/dev/sda", "_raw_kernel_device": "/dev/sda", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": true, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": "luks2", "encryption_password": "VALUE_SPECIFIED_IN_NO_LOG_PARAMETER", "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "mount_user": null, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": 10737418240, "state": "present", "thin": null, "thin_pool_name": null, "thin_pool_size": null, "type": "disk", "vdo_pool_size": null } ] } } TASK [fedora.linux_system_roles.storage : Set the list of pools for test verification] *** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:128 Friday 17 January 2025 04:12:52 -0500 (0:00:00.060) 0:02:44.863 ******** ok: [managed-node1] => { "ansible_facts": { "_storage_pools_list": [] }, "changed": false } TASK [fedora.linux_system_roles.storage : Set the list of volumes for test verification] *** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:132 Friday 17 January 2025 04:12:52 -0500 (0:00:00.046) 0:02:44.910 ******** ok: [managed-node1] => { "ansible_facts": { "_storage_volumes_list": [ { "_device": "/dev/mapper/luks-8c8d336a-42af-4ee5-a371-3bf9cd16f56b", "_kernel_device": "/dev/dm-0", "_mount_id": "/dev/mapper/luks-8c8d336a-42af-4ee5-a371-3bf9cd16f56b", "_raw_device": "/dev/sda", "_raw_kernel_device": "/dev/sda", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": true, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": "luks2", "encryption_password": "VALUE_SPECIFIED_IN_NO_LOG_PARAMETER", "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "mount_user": null, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": 10737418240, "state": "present", "thin": null, "thin_pool_name": null, "thin_pool_size": null, "type": "disk", "vdo_pool_size": null } ] }, "changed": false } TASK [fedora.linux_system_roles.storage : Remove obsolete mounts] ************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:148 Friday 17 January 2025 04:12:52 -0500 (0:00:00.050) 0:02:44.960 ******** changed: [managed-node1] => (item={u'src': u'UUID=e4ad127e-be0f-45ac-a94b-c1f82748bbb5', u'state': u'absent', u'fstype': u'xfs', u'path': u'/opt/test1'}) => { "ansible_loop_var": "mount_info", "changed": true, 
"dump": "0", "fstab": "/etc/fstab", "fstype": "xfs", "mount_info": { "fstype": "xfs", "path": "/opt/test1", "src": "UUID=e4ad127e-be0f-45ac-a94b-c1f82748bbb5", "state": "absent" }, "name": "/opt/test1", "opts": "defaults", "passno": "0", "src": "UUID=e4ad127e-be0f-45ac-a94b-c1f82748bbb5" } TASK [fedora.linux_system_roles.storage : Tell systemd to refresh its view of /etc/fstab] *** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:159 Friday 17 January 2025 04:12:52 -0500 (0:00:00.385) 0:02:45.345 ******** ok: [managed-node1] => { "changed": false, "name": null, "status": {} } TASK [fedora.linux_system_roles.storage : Set up new/current mounts] *********** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:164 Friday 17 January 2025 04:12:53 -0500 (0:00:00.763) 0:02:46.109 ******** changed: [managed-node1] => (item={u'src': u'/dev/mapper/luks-8c8d336a-42af-4ee5-a371-3bf9cd16f56b', u'group': None, u'dump': 0, u'passno': 0, u'fstype': u'xfs', u'state': u'mounted', u'mode': None, u'owner': None, u'path': u'/opt/test1', u'opts': u'defaults'}) => { "ansible_loop_var": "mount_info", "changed": true, "dump": "0", "fstab": "/etc/fstab", "fstype": "xfs", "mount_info": { "dump": 0, "fstype": "xfs", "group": null, "mode": null, "opts": "defaults", "owner": null, "passno": 0, "path": "/opt/test1", "src": "/dev/mapper/luks-8c8d336a-42af-4ee5-a371-3bf9cd16f56b", "state": "mounted" }, "name": "/opt/test1", "opts": "defaults", "passno": "0", "src": "/dev/mapper/luks-8c8d336a-42af-4ee5-a371-3bf9cd16f56b" } TASK [fedora.linux_system_roles.storage : Manage mount ownership/permissions] *** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:175 Friday 17 January 2025 04:12:53 -0500 (0:00:00.550) 0:02:46.660 ******** skipping: [managed-node1] => (item={u'src': u'/dev/mapper/luks-8c8d336a-42af-4ee5-a371-3bf9cd16f56b', u'group': None, u'dump': 0, u'passno': 0, u'fstype': u'xfs', u'state': u'mounted', u'mode': None, u'owner': None, u'path': u'/opt/test1', u'opts': u'defaults'}) => { "ansible_loop_var": "mount_info", "changed": false, "mount_info": { "dump": 0, "fstype": "xfs", "group": null, "mode": null, "opts": "defaults", "owner": null, "passno": 0, "path": "/opt/test1", "src": "/dev/mapper/luks-8c8d336a-42af-4ee5-a371-3bf9cd16f56b", "state": "mounted" }, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Tell systemd to refresh its view of /etc/fstab] *** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:187 Friday 17 January 2025 04:12:54 -0500 (0:00:00.055) 0:02:46.715 ******** ok: [managed-node1] => { "changed": false, "name": null, "status": {} } TASK [fedora.linux_system_roles.storage : Retrieve facts for the /etc/crypttab file] *** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:195 Friday 17 January 2025 04:12:54 -0500 (0:00:00.540) 0:02:47.255 ******** ok: [managed-node1] => { "changed": false, "stat": { "atime": 1737105126.038884, "attr_flags": "e", "attributes": [ "extents" ], "block_size": 4096, "blocks": 0, "charset": "binary", "checksum": "da39a3ee5e6b4b0d3255bfef95601890afd80709", "ctime": 1737105123.8828835, "dev": 51713, "device_type": 0, "executable": false, "exists": true, "gid": 0, "gr_name": "root", "inode": 263647, "isblk": false, 
"ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": true, "issock": false, "isuid": false, "mimetype": "inode/x-empty", "mode": "0600", "mtime": 1737105123.8808835, "nlink": 1, "path": "/etc/crypttab", "pw_name": "root", "readable": true, "rgrp": false, "roth": false, "rusr": true, "size": 0, "uid": 0, "version": "18446744072031193834", "wgrp": false, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [fedora.linux_system_roles.storage : Manage /etc/crypttab to account for changes we just made] *** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:200 Friday 17 January 2025 04:12:55 -0500 (0:00:00.491) 0:02:47.747 ******** changed: [managed-node1] => (item={u'state': u'present', u'password': u'-', u'name': u'luks-8c8d336a-42af-4ee5-a371-3bf9cd16f56b', u'backing_device': u'/dev/sda'}) => { "ansible_loop_var": "entry", "backup": "", "changed": true, "entry": { "backing_device": "/dev/sda", "name": "luks-8c8d336a-42af-4ee5-a371-3bf9cd16f56b", "password": "-", "state": "present" } } MSG: line added TASK [fedora.linux_system_roles.storage : Update facts] ************************ task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:222 Friday 17 January 2025 04:12:55 -0500 (0:00:00.443) 0:02:48.190 ******** ok: [managed-node1] TASK [Verify role results] ***************************************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/tests_luks2.yml:196 Friday 17 January 2025 04:12:56 -0500 (0:00:00.815) 0:02:49.005 ******** included: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml for managed-node1 TASK [Print out pool information] ********************************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:2 Friday 17 January 2025 04:12:56 -0500 (0:00:00.112) 0:02:49.118 ******** skipping: [managed-node1] => {} TASK [Print out volume information] ******************************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:7 Friday 17 January 2025 04:12:56 -0500 (0:00:00.037) 0:02:49.156 ******** ok: [managed-node1] => { "_storage_volumes_list": [ { "_device": "/dev/mapper/luks-8c8d336a-42af-4ee5-a371-3bf9cd16f56b", "_kernel_device": "/dev/dm-0", "_mount_id": "/dev/mapper/luks-8c8d336a-42af-4ee5-a371-3bf9cd16f56b", "_raw_device": "/dev/sda", "_raw_kernel_device": "/dev/sda", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": true, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": "luks2", "encryption_password": "VALUE_SPECIFIED_IN_NO_LOG_PARAMETER", "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "mount_user": null, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": 10737418240, "state": "present", "thin": 
null, "thin_pool_name": null, "thin_pool_size": null, "type": "disk", "vdo_pool_size": null } ] } TASK [Collect info about the volumes.] ***************************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:15 Friday 17 January 2025 04:12:56 -0500 (0:00:00.046) 0:02:49.203 ******** ok: [managed-node1] => { "changed": false, "info": { "/dev/mapper/luks-8c8d336a-42af-4ee5-a371-3bf9cd16f56b": { "fstype": "xfs", "label": "", "mountpoint": "/opt/test1", "name": "/dev/mapper/luks-8c8d336a-42af-4ee5-a371-3bf9cd16f56b", "size": "10G", "type": "crypt", "uuid": "ac501f98-87bc-49aa-8da0-2bd092cc2162" }, "/dev/sda": { "fstype": "crypto_LUKS", "label": "", "mountpoint": "", "name": "/dev/sda", "size": "10G", "type": "disk", "uuid": "8c8d336a-42af-4ee5-a371-3bf9cd16f56b" }, "/dev/sdb": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdb", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdc": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdc", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdd": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdd", "size": "1T", "type": "disk", "uuid": "" }, "/dev/sde": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sde", "size": "1T", "type": "disk", "uuid": "" }, "/dev/sdf": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdf", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdg": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdg", "size": "1T", "type": "disk", "uuid": "" }, "/dev/sdh": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdh", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdi": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdi", "size": "10G", "type": "disk", "uuid": "" }, "/dev/xvda": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/xvda", "size": "250G", "type": "disk", "uuid": "" }, "/dev/xvda1": { "fstype": "ext4", "label": "", "mountpoint": "/", "name": "/dev/xvda1", "size": "250G", "type": "partition", "uuid": "c7b7d6a5-fd01-4b9b-bcca-153eaff9d312" } } } TASK [Read the /etc/fstab file for volume existence] *************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:20 Friday 17 January 2025 04:12:56 -0500 (0:00:00.453) 0:02:49.657 ******** ok: [managed-node1] => { "changed": false, "cmd": [ "cat", "/etc/fstab" ], "delta": "0:00:00.002905", "end": "2025-01-17 04:12:57.267020", "rc": 0, "start": "2025-01-17 04:12:57.264115" } STDOUT:
# system_role:storage
#
# /etc/fstab
# Created by anaconda on Thu Jun 20 10:23:46 2024
#
# Accessible filesystems, by reference, are maintained under '/dev/disk'
# See man pages fstab(5), findfs(8), mount(8) and/or blkid(8) for more info
#
UUID=c7b7d6a5-fd01-4b9b-bcca-153eaff9d312 / ext4 defaults 1 1
ntap-bos-c01-eng01-nfs01b.storage.bos.redhat.com:/devops_engineering_nfs/devarchive/redhat /mnt/redhat nfs ro,rsize=32768,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0
nest.test.redhat.com:/mnt/qa /mnt/qa nfs defaults,rsize=8192,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0
vtap-eng01.storage.rdu2.redhat.com:/vol/engarchive /mnt/engarchive nfs ro,rsize=32768,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0
nest.test.redhat.com:/mnt/tpsdist /mnt/tpsdist nfs defaults,rsize=8192,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0
ntap-bos-c01-eng01-nfs01b.storage.bos.redhat.com:/devops_engineering_nfs/devarchive/redhat/brewroot /mnt/brew nfs ro,rsize=32768,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0
ntap-bos-c01-eng01-nfs01b.storage.bos.redhat.com:/devops_brew_scratch_nfs/scratch /mnt/brew_scratch nfs ro,rsize=32768,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0
/dev/mapper/luks-8c8d336a-42af-4ee5-a371-3bf9cd16f56b /opt/test1 xfs defaults 0 0
TASK [Read the /etc/crypttab file] ********************************************* task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:25 Friday 17 January 2025 04:12:57 -0500 (0:00:00.374) 0:02:50.031 ******** ok: [managed-node1] => { "changed": false, "cmd": [ "cat", "/etc/crypttab" ], "delta": "0:00:00.002737", "end": "2025-01-17 04:12:57.655872", "failed_when_result": false, "rc": 0, "start": "2025-01-17 04:12:57.653135" } STDOUT:
luks-8c8d336a-42af-4ee5-a371-3bf9cd16f56b /dev/sda -
TASK [Verify the volumes listed in storage_pools were correctly managed] ******* task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:34 Friday 17 January 2025 04:12:57 -0500 (0:00:00.382) 0:02:50.414 ******** TASK [Verify the volumes with no pool were correctly managed] ****************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:44 Friday 17 January 2025 04:12:57 -0500 (0:00:00.037) 0:02:50.451 ******** included: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume.yml for managed-node1 TASK [Set storage volume test variables] *************************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume.yml:2 Friday 17 January 2025 04:12:57 -0500 (0:00:00.115) 0:02:50.567 ******** ok: [managed-node1] => { "ansible_facts": { "_storage_test_volume_present": true, "_storage_volume_tests": [ "mount", "fstab", "fs", "device", "encryption", "md", "size", "cache" ] }, "changed": false } TASK [Run test verify for {{ storage_test_volume_subset }}] ******************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume.yml:19 Friday 17 January 2025 04:12:57 -0500 (0:00:00.062) 0:02:50.629 ******** included: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml for managed-node1 included: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml for managed-node1 included: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fs.yml for managed-node1 included: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml for managed-node1 included: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml for managed-node1 included: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml for managed-node1 included: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml for managed-node1 included: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml for managed-node1 TASK [Get expected mount device based on
device type] ************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:7 Friday 17 January 2025 04:12:58 -0500 (0:00:00.297) 0:02:50.927 ******** ok: [managed-node1] => { "ansible_facts": { "storage_test_device_path": "/dev/mapper/luks-8c8d336a-42af-4ee5-a371-3bf9cd16f56b" }, "changed": false } TASK [Set some facts] ********************************************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:11 Friday 17 January 2025 04:12:58 -0500 (0:00:00.062) 0:02:50.989 ******** ok: [managed-node1] => { "ansible_facts": { "storage_test_mount_expected_mount_point": "/opt/test1", "storage_test_swap_expected_matches": "0" }, "changed": false } TASK [Get information about the mountpoint directory] ************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:19 Friday 17 January 2025 04:12:58 -0500 (0:00:00.082) 0:02:51.072 ******** skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify the current mount state by device] ******************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:28 Friday 17 January 2025 04:12:58 -0500 (0:00:00.059) 0:02:51.132 ******** ok: [managed-node1] => { "changed": false } MSG: All assertions passed TASK [Verify mount directory user] ********************************************* task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:36 Friday 17 January 2025 04:12:58 -0500 (0:00:00.069) 0:02:51.202 ******** skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify mount directory group] ******************************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:42 Friday 17 January 2025 04:12:58 -0500 (0:00:00.092) 0:02:51.294 ******** skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify mount directory permissions] ************************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:48 Friday 17 January 2025 04:12:58 -0500 (0:00:00.075) 0:02:51.370 ******** skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Get path of test volume device] ****************************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:57 Friday 17 January 2025 04:12:58 -0500 (0:00:00.055) 0:02:51.426 ******** skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Gather swap info] ******************************************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:63 Friday 17 January 2025 04:12:58 -0500 (0:00:00.052) 0:02:51.478 ******** skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify swap status] ****************************************************** task path: 
/tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:69 Friday 17 January 2025 04:12:58 -0500 (0:00:00.091) 0:02:51.570 ******** skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Unset facts] ************************************************************* task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:79 Friday 17 January 2025 04:12:58 -0500 (0:00:00.102) 0:02:51.673 ******** ok: [managed-node1] => { "ansible_facts": { "storage_test_found_mount_stat": null, "storage_test_mount_expected_mount_point": null, "storage_test_swap_expected_matches": null, "storage_test_swaps": null, "storage_test_sys_node": null }, "changed": false } TASK [Set some variables for fstab checking] *********************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:2 Friday 17 January 2025 04:12:59 -0500 (0:00:00.054) 0:02:51.727 ******** ok: [managed-node1] => { "ansible_facts": { "storage_test_fstab_expected_id_matches": "1", "storage_test_fstab_expected_mount_options_matches": "1", "storage_test_fstab_expected_mount_point_matches": "1", "storage_test_fstab_id_matches": [ "/dev/mapper/luks-8c8d336a-42af-4ee5-a371-3bf9cd16f56b " ], "storage_test_fstab_mount_options_matches": [ " /opt/test1 xfs defaults " ], "storage_test_fstab_mount_point_matches": [ " /opt/test1 " ] }, "changed": false } TASK [Verify that the device identifier appears in /etc/fstab] ***************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:17 Friday 17 January 2025 04:12:59 -0500 (0:00:00.133) 0:02:51.861 ******** ok: [managed-node1] => { "changed": false } MSG: All assertions passed TASK [Verify the fstab mount point] ******************************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:24 Friday 17 January 2025 04:12:59 -0500 (0:00:00.117) 0:02:51.979 ******** ok: [managed-node1] => { "changed": false } MSG: All assertions passed TASK [Verify mount_options] **************************************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:33 Friday 17 January 2025 04:12:59 -0500 (0:00:00.134) 0:02:52.114 ******** skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify fingerprint] ****************************************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:45 Friday 17 January 2025 04:12:59 -0500 (0:00:00.120) 0:02:52.235 ******** ok: [managed-node1] => { "changed": false } MSG: All assertions passed TASK [Clean up variables] ****************************************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:52 Friday 17 January 2025 04:12:59 -0500 (0:00:00.111) 0:02:52.346 ******** ok: [managed-node1] => { "ansible_facts": { "storage_test_fstab_expected_id_matches": null, "storage_test_fstab_expected_mount_options_matches": null, "storage_test_fstab_expected_mount_point_matches": null, "storage_test_fstab_id_matches": null, "storage_test_fstab_mount_options_matches": null, 
"storage_test_fstab_mount_point_matches": null }, "changed": false } TASK [Verify fs type] ********************************************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fs.yml:6 Friday 17 January 2025 04:12:59 -0500 (0:00:00.094) 0:02:52.440 ******** ok: [managed-node1] => { "changed": false } MSG: All assertions passed TASK [Verify fs label] ********************************************************* task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fs.yml:14 Friday 17 January 2025 04:12:59 -0500 (0:00:00.142) 0:02:52.582 ******** ok: [managed-node1] => { "changed": false } MSG: All assertions passed TASK [See whether the device node is present] ********************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:3 Friday 17 January 2025 04:13:00 -0500 (0:00:00.204) 0:02:52.787 ******** ok: [managed-node1] => { "changed": false, "stat": { "atime": 1737105171.0468948, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 0, "charset": "binary", "ctime": 1737105171.0468948, "dev": 5, "device_type": 2048, "executable": false, "exists": true, "gid": 6, "gr_name": "disk", "inode": 28267, "isblk": true, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": false, "issock": false, "isuid": false, "mimetype": "inode/blockdevice", "mode": "0660", "mtime": 1737105171.0468948, "nlink": 1, "path": "/dev/sda", "pw_name": "root", "readable": true, "rgrp": true, "roth": false, "rusr": true, "size": 0, "uid": 0, "version": null, "wgrp": true, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [Verify the presence/absence of the device node] ************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:9 Friday 17 January 2025 04:13:00 -0500 (0:00:00.568) 0:02:53.356 ******** ok: [managed-node1] => { "changed": false } MSG: All assertions passed TASK [Verify the presence/absence of the device node] ************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:16 Friday 17 January 2025 04:13:00 -0500 (0:00:00.101) 0:02:53.458 ******** skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Make sure we got info about this volume] ********************************* task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:23 Friday 17 January 2025 04:13:00 -0500 (0:00:00.057) 0:02:53.515 ******** ok: [managed-node1] => { "changed": false } MSG: All assertions passed TASK [Process volume type (set initial value) (1/2)] *************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:29 Friday 17 January 2025 04:13:00 -0500 (0:00:00.124) 0:02:53.640 ******** ok: [managed-node1] => { "ansible_facts": { "st_volume_type": "disk" }, "changed": false } TASK [Process volume type (get RAID value) (2/2)] ****************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:33 Friday 17 January 2025 04:13:01 -0500 (0:00:00.097) 0:02:53.737 
******** skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify the volume's device type] ***************************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:38 Friday 17 January 2025 04:13:01 -0500 (0:00:00.055) 0:02:53.793 ******** ok: [managed-node1] => { "changed": false } MSG: All assertions passed TASK [Stat the LUKS device, if encrypted] ************************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:3 Friday 17 January 2025 04:13:01 -0500 (0:00:00.066) 0:02:53.859 ******** ok: [managed-node1] => { "changed": false, "stat": { "atime": 1737105171.1548948, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 0, "charset": "binary", "ctime": 1737105171.1548948, "dev": 5, "device_type": 64768, "executable": false, "exists": true, "gid": 6, "gr_name": "disk", "inode": 60127, "isblk": true, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": false, "issock": false, "isuid": false, "mimetype": "inode/symlink", "mode": "0660", "mtime": 1737105171.1548948, "nlink": 1, "path": "/dev/mapper/luks-8c8d336a-42af-4ee5-a371-3bf9cd16f56b", "pw_name": "root", "readable": true, "rgrp": true, "roth": false, "rusr": true, "size": 0, "uid": 0, "version": null, "wgrp": true, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [Ensure cryptsetup is present] ******************************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:10 Friday 17 January 2025 04:13:01 -0500 (0:00:00.560) 0:02:54.420 ******** ok: [managed-node1] => { "changed": false, "rc": 0, "results": [ "cryptsetup-2.0.3-6.el7.x86_64 providing cryptsetup is already installed" ] } TASK [Collect LUKS info for this volume] *************************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:16 Friday 17 January 2025 04:13:02 -0500 (0:00:01.104) 0:02:55.524 ******** ok: [managed-node1] => { "changed": false, "cmd": [ "cryptsetup", "luksDump", "/dev/sda" ], "delta": "0:00:00.027550", "end": "2025-01-17 04:13:03.256849", "rc": 0, "start": "2025-01-17 04:13:03.229299" } STDOUT:
LUKS header information
Version:        2
Epoch:          3
Metadata area:  12288 bytes
UUID:           8c8d336a-42af-4ee5-a371-3bf9cd16f56b
Label:          (no label)
Subsystem:      (no subsystem)
Flags:          (no flags)

Data segments:
  0: crypt
        offset: 4194304 [bytes]
        length: (whole device)
        cipher: aes-xts-plain64
        sector: 512 [bytes]

Keyslots:
  0: luks2
        Key:        512 bits
        Priority:   normal
        Cipher:     aes-xts-plain64
        PBKDF:      argon2i
        Time cost:  4
        Memory:     667528
        Threads:    2
        Salt:       ee c6 8f 7d 65 b4 18 5e 2a 29 89 e2 e1 37 f9 6a 42 3f 36 f1 02 9e c8 1f 98 8f 78 7c 66 24 40 f0
        AF stripes: 4000
        Area offset:32768 [bytes]
        Area length:258048 [bytes]
        Digest ID: 0
Tokens:
Digests:
  0: pbkdf2
        Hash:       sha256
        Iterations: 23011
        Salt:       cb be bc e1 b0 47 a2 a9 f6 dc 40 f4 d9 61 bd 94 80 d4 a1 2f a2 91 ec fb a4 22 09 ee 3c 14 c1 ec
        Digest:     50 ed ea f2 c2 54 1c 96 1e 24 9c 78 4c c9 9a 6b 78 14 50 a0 56 58 12 47 e5 10 17 e4 09 43 26 cd
TASK [Verify the presence/absence of the LUKS device node] ********************* task path:
/tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:22 Friday 17 January 2025 04:13:03 -0500 (0:00:00.536) 0:02:56.061 ******** ok: [managed-node1] => { "changed": false } MSG: All assertions passed TASK [Verify that the raw device is the same as the device if not encrypted] *** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:29 Friday 17 January 2025 04:13:03 -0500 (0:00:00.078) 0:02:56.139 ******** ok: [managed-node1] => { "changed": false } MSG: All assertions passed TASK [Make sure we got info about the LUKS volume if encrypted] **************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:40 Friday 17 January 2025 04:13:03 -0500 (0:00:00.109) 0:02:56.249 ******** ok: [managed-node1] => { "changed": false } MSG: All assertions passed TASK [Verify the LUKS volume's device type if encrypted] *********************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:46 Friday 17 January 2025 04:13:03 -0500 (0:00:00.096) 0:02:56.346 ******** ok: [managed-node1] => { "changed": false } MSG: All assertions passed TASK [Check LUKS version] ****************************************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:51 Friday 17 January 2025 04:13:03 -0500 (0:00:00.110) 0:02:56.457 ******** ok: [managed-node1] => { "changed": false } MSG: All assertions passed TASK [Check LUKS key size] ***************************************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:63 Friday 17 January 2025 04:13:03 -0500 (0:00:00.154) 0:02:56.611 ******** skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check LUKS cipher] ******************************************************* task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:75 Friday 17 January 2025 04:13:04 -0500 (0:00:00.146) 0:02:56.758 ******** skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set test variables] ****************************************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:87 Friday 17 January 2025 04:13:04 -0500 (0:00:00.102) 0:02:56.861 ******** ok: [managed-node1] => { "ansible_facts": { "_storage_test_crypttab_entries": [ "luks-8c8d336a-42af-4ee5-a371-3bf9cd16f56b /dev/sda -" ], "_storage_test_expected_crypttab_entries": "1", "_storage_test_expected_crypttab_key_file": "-" }, "changed": false } TASK [Check for /etc/crypttab entry] ******************************************* task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:93 Friday 17 January 2025 04:13:04 -0500 (0:00:00.112) 0:02:56.973 ******** ok: [managed-node1] => { "changed": false } MSG: All assertions passed TASK [Validate the format of the crypttab entry] ******************************* task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:100 Friday 17 
January 2025 04:13:04 -0500 (0:00:00.070) 0:02:57.044 ******** ok: [managed-node1] => { "changed": false } MSG: All assertions passed TASK [Check backing device of crypttab entry] ********************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:108 Friday 17 January 2025 04:13:04 -0500 (0:00:00.082) 0:02:57.127 ******** ok: [managed-node1] => { "changed": false } MSG: All assertions passed TASK [Check key file of crypttab entry] **************************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:116 Friday 17 January 2025 04:13:04 -0500 (0:00:00.085) 0:02:57.213 ******** ok: [managed-node1] => { "changed": false } MSG: All assertions passed TASK [Clear test variables] **************************************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:124 Friday 17 January 2025 04:13:04 -0500 (0:00:00.079) 0:02:57.292 ******** ok: [managed-node1] => { "ansible_facts": { "_storage_test_crypttab_entries": null, "_storage_test_expected_crypttab_entries": null, "_storage_test_expected_crypttab_key_file": null }, "changed": false } TASK [Get information about RAID] ********************************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:8 Friday 17 January 2025 04:13:04 -0500 (0:00:00.067) 0:02:57.360 ******** skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set active devices regex] ************************************************ task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:14 Friday 17 January 2025 04:13:04 -0500 (0:00:00.057) 0:02:57.417 ******** skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set spare devices regex] ************************************************* task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:19 Friday 17 January 2025 04:13:04 -0500 (0:00:00.055) 0:02:57.473 ******** skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set md version regex] **************************************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:24 Friday 17 January 2025 04:13:04 -0500 (0:00:00.054) 0:02:57.528 ******** skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set chunk size regex] **************************************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:29 Friday 17 January 2025 04:13:04 -0500 (0:00:00.053) 0:02:57.581 ******** skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Parse the chunk size] **************************************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:37 Friday 17 January 2025 04:13:04 -0500 (0:00:00.056) 0:02:57.638 ******** skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was 
False" } TASK [Check RAID active devices count] ***************************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:46 Friday 17 January 2025 04:13:05 -0500 (0:00:00.068) 0:02:57.707 ******** skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check RAID spare devices count] ****************************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:54 Friday 17 January 2025 04:13:05 -0500 (0:00:00.078) 0:02:57.786 ******** skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check RAID metadata version] ********************************************* task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:62 Friday 17 January 2025 04:13:05 -0500 (0:00:00.088) 0:02:57.874 ******** skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check RAID chunk size] *************************************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:70 Friday 17 January 2025 04:13:05 -0500 (0:00:00.087) 0:02:57.962 ******** skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Parse the actual size of the volume] ************************************* task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:3 Friday 17 January 2025 04:13:05 -0500 (0:00:00.138) 0:02:58.101 ******** skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Parse the requested size of the volume] ********************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:11 Friday 17 January 2025 04:13:05 -0500 (0:00:00.066) 0:02:58.167 ******** skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Establish base value for expected size] ********************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:20 Friday 17 January 2025 04:13:05 -0500 (0:00:00.087) 0:02:58.255 ******** skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Show expected size] ****************************************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:28 Friday 17 January 2025 04:13:05 -0500 (0:00:00.084) 0:02:58.339 ******** ok: [managed-node1] => { "storage_test_expected_size": "VARIABLE IS NOT DEFINED!: 'storage_test_expected_size' is undefined" } TASK [Get the size of parent/pool device] ************************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:32 Friday 17 January 2025 04:13:05 -0500 (0:00:00.114) 0:02:58.454 ******** skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Show test pool] ********************************************************** task path: 
/tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:46 Friday 17 January 2025 04:13:05 -0500 (0:00:00.066) 0:02:58.521 ******** skipping: [managed-node1] => {} TASK [Show test blockinfo] ***************************************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:50 Friday 17 January 2025 04:13:05 -0500 (0:00:00.069) 0:02:58.590 ******** skipping: [managed-node1] => {} TASK [Show test pool size] ***************************************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:54 Friday 17 January 2025 04:13:05 -0500 (0:00:00.057) 0:02:58.648 ******** skipping: [managed-node1] => {} TASK [Calculate the expected size based on pool size and percentage value] ***** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:58 Friday 17 January 2025 04:13:06 -0500 (0:00:00.081) 0:02:58.729 ******** skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Default thin pool reserved space values] ********************************* task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:67 Friday 17 January 2025 04:13:06 -0500 (0:00:00.071) 0:02:58.800 ******** skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Default minimal thin pool reserved space size] *************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:71 Friday 17 January 2025 04:13:06 -0500 (0:00:00.059) 0:02:58.860 ******** skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Default maximal thin pool reserved space size] *************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:76 Friday 17 January 2025 04:13:06 -0500 (0:00:00.056) 0:02:58.916 ******** skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Calculate maximum usable space in thin pool] ***************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:82 Friday 17 January 2025 04:13:06 -0500 (0:00:00.065) 0:02:58.981 ******** skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Apply upper size limit to max usable thin pool space] ******************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:86 Friday 17 January 2025 04:13:06 -0500 (0:00:00.107) 0:02:59.089 ******** skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Apply lower size limit to max usable thin pool space] ******************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:91 Friday 17 January 2025 04:13:06 -0500 (0:00:00.129) 0:02:59.218 ******** skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Convert maximum usable thin pool space from int to Size] ***************** task path: 
/tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:96 Friday 17 January 2025 04:13:06 -0500 (0:00:00.086) 0:02:59.305 ******** skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Show max thin pool size] ************************************************* task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:101 Friday 17 January 2025 04:13:06 -0500 (0:00:00.057) 0:02:59.362 ******** skipping: [managed-node1] => {} TASK [Show volume thin pool size] ********************************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:105 Friday 17 January 2025 04:13:06 -0500 (0:00:00.062) 0:02:59.425 ******** skipping: [managed-node1] => {} TASK [Show test volume size] *************************************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:109 Friday 17 January 2025 04:13:06 -0500 (0:00:00.091) 0:02:59.517 ******** skipping: [managed-node1] => {} TASK [Establish base value for expected thin pool size] ************************ task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:113 Friday 17 January 2025 04:13:06 -0500 (0:00:00.057) 0:02:59.574 ******** skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Calculate the expected size based on pool size and percentage value] ***** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:120 Friday 17 January 2025 04:13:06 -0500 (0:00:00.095) 0:02:59.669 ******** skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Establish base value for expected thin pool volume size] ***************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:127 Friday 17 January 2025 04:13:07 -0500 (0:00:00.108) 0:02:59.778 ******** skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Calculate the expected thin pool volume size based on percentage value] *** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:131 Friday 17 January 2025 04:13:07 -0500 (0:00:00.057) 0:02:59.835 ******** skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Replace expected volume size with calculated value] ********************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:137 Friday 17 January 2025 04:13:07 -0500 (0:00:00.066) 0:02:59.902 ******** skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Show actual size] ******************************************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:143 Friday 17 January 2025 04:13:07 -0500 (0:00:00.079) 0:02:59.982 ******** ok: [managed-node1] => { "storage_test_actual_size": { "changed": false, "skip_reason": "Conditional result was False", "skipped": true } } TASK [Show expected size] 
****************************************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:147 Friday 17 January 2025 04:13:07 -0500 (0:00:00.055) 0:03:00.038 ******** ok: [managed-node1] => { "storage_test_expected_size": "VARIABLE IS NOT DEFINED!: 'storage_test_expected_size' is undefined" } TASK [Assert expected size is actual size] ************************************* task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:151 Friday 17 January 2025 04:13:07 -0500 (0:00:00.052) 0:03:00.090 ******** skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Get information about the LV] ******************************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:5 Friday 17 January 2025 04:13:07 -0500 (0:00:00.124) 0:03:00.215 ******** skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set LV segment type] ***************************************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:13 Friday 17 January 2025 04:13:07 -0500 (0:00:00.054) 0:03:00.270 ******** skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check segment type] ****************************************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:17 Friday 17 January 2025 04:13:07 -0500 (0:00:00.046) 0:03:00.316 ******** skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set LV cache size] ******************************************************* task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:24 Friday 17 January 2025 04:13:07 -0500 (0:00:00.049) 0:03:00.366 ******** skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Parse the requested cache size] ****************************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:31 Friday 17 January 2025 04:13:07 -0500 (0:00:00.047) 0:03:00.413 ******** skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set expected cache size] ************************************************* task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:37 Friday 17 January 2025 04:13:07 -0500 (0:00:00.045) 0:03:00.459 ******** skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check cache size] ******************************************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:42 Friday 17 January 2025 04:13:07 -0500 (0:00:00.051) 0:03:00.510 ******** skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Clean up facts] ********************************************************** task path: 
/tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume.yml:25 Friday 17 January 2025 04:13:07 -0500 (0:00:00.056) 0:03:00.567 ******** ok: [managed-node1] => { "ansible_facts": { "_storage_test_volume_present": null }, "changed": false } TASK [Clean up variable namespace] ********************************************* task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:54 Friday 17 January 2025 04:13:07 -0500 (0:00:00.056) 0:03:00.624 ******** ok: [managed-node1] => { "ansible_facts": { "storage_test_blkinfo": null, "storage_test_crypttab": null, "storage_test_fstab": null }, "changed": false } TASK [Test for correct handling of new encrypted volume w/ no key] ************* task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/tests_luks2.yml:203 Friday 17 January 2025 04:13:07 -0500 (0:00:00.054) 0:03:00.679 ******** included: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-failed.yml for managed-node1 TASK [Store global variable value copy] **************************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-failed.yml:4 Friday 17 January 2025 04:13:08 -0500 (0:00:00.184) 0:03:00.863 ******** ok: [managed-node1] => { "ansible_facts": { "storage_pools_global": [], "storage_safe_mode_global": true, "storage_volumes_global": [] }, "changed": false } TASK [Verify role raises correct error] **************************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-failed.yml:10 Friday 17 January 2025 04:13:08 -0500 (0:00:00.114) 0:03:00.977 ******** TASK [fedora.linux_system_roles.storage : Set platform/version specific variables] *** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:2 Friday 17 January 2025 04:13:08 -0500 (0:00:00.087) 0:03:01.065 ******** included: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml for managed-node1 TASK [fedora.linux_system_roles.storage : Ensure ansible_facts used by role] *** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:2 Friday 17 January 2025 04:13:08 -0500 (0:00:00.097) 0:03:01.162 ******** skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Set platform/version specific variables] *** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:7 Friday 17 January 2025 04:13:08 -0500 (0:00:00.083) 0:03:01.246 ******** skipping: [managed-node1] => (item=RedHat.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat.yml", "skip_reason": "Conditional result was False" } skipping: [managed-node1] => (item=CentOS.yml) => { "ansible_loop_var": "item", "changed": false, "item": "CentOS.yml", "skip_reason": "Conditional result was False" } ok: [managed-node1] => (item=CentOS_7.yml) => { "ansible_facts": { "__storage_blivet_diskvolume_mkfs_option_map": { "ext2": "-F", "ext3": "-F", "ext4": "-F" }, "blivet_package_list": [ "python-enum34", "python-blivet3", "libblockdev-crypto", "libblockdev-dm", "libblockdev-lvm", "libblockdev-mdraid", "libblockdev-swap", "{{ 'libblockdev-s390' if ansible_architecture == 
's390x' else 'libblockdev' }}" ] }, "ansible_included_var_files": [ "/tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/vars/CentOS_7.yml" ], "ansible_loop_var": "item", "changed": false, "item": "CentOS_7.yml" } skipping: [managed-node1] => (item=CentOS_7.9.yml) => { "ansible_loop_var": "item", "changed": false, "item": "CentOS_7.9.yml", "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Check if system is ostree] *********** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:25 Friday 17 January 2025 04:13:08 -0500 (0:00:00.135) 0:03:01.382 ******** skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Set flag to indicate system is ostree] *** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:30 Friday 17 January 2025 04:13:08 -0500 (0:00:00.059) 0:03:01.442 ******** skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Define an empty list of pools to be used in testing] *** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:5 Friday 17 January 2025 04:13:08 -0500 (0:00:00.084) 0:03:01.526 ******** ok: [managed-node1] => { "ansible_facts": { "_storage_pools_list": [] }, "changed": false } TASK [fedora.linux_system_roles.storage : Define an empty list of volumes to be used in testing] *** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:9 Friday 17 January 2025 04:13:08 -0500 (0:00:00.129) 0:03:01.656 ******** ok: [managed-node1] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false } TASK [fedora.linux_system_roles.storage : Include the appropriate provider tasks] *** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:13 Friday 17 January 2025 04:13:09 -0500 (0:00:00.065) 0:03:01.721 ******** included: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml for managed-node1 TASK [fedora.linux_system_roles.storage : Make sure blivet is available] ******* task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:2 Friday 17 January 2025 04:13:09 -0500 (0:00:00.198) 0:03:01.919 ******** ok: [managed-node1] => { "changed": false, "rc": 0, "results": [ "python-enum34-1.0.4-1.el7.noarch providing python-enum34 is already installed", "1:python2-blivet3-3.1.3-3.el7.noarch providing python-blivet3 is already installed", "libblockdev-crypto-2.18-5.el7.x86_64 providing libblockdev-crypto is already installed", "libblockdev-dm-2.18-5.el7.x86_64 providing libblockdev-dm is already installed", "libblockdev-lvm-2.18-5.el7.x86_64 providing libblockdev-lvm is already installed", "libblockdev-mdraid-2.18-5.el7.x86_64 providing libblockdev-mdraid is already installed", "libblockdev-swap-2.18-5.el7.x86_64 providing libblockdev-swap is already installed", "libblockdev-2.18-5.el7.x86_64 providing libblockdev is already installed" ] } TASK [fedora.linux_system_roles.storage : Show storage_pools] ****************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:9 Friday 17 January 2025 04:13:13 -0500 
(0:00:04.266) 0:03:06.186 ******** ok: [managed-node1] => { "storage_pools": [ { "disks": [ "sda" ], "name": "foo", "type": "partition", "volumes": [ { "encryption": true, "encryption_luks_version": "luks2", "mount_point": "/opt/test1", "name": "test1", "size": "4g", "type": "partition" } ] } ] } TASK [fedora.linux_system_roles.storage : Show storage_volumes] **************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:14 Friday 17 January 2025 04:13:13 -0500 (0:00:00.060) 0:03:06.246 ******** ok: [managed-node1] => { "storage_volumes": [] } TASK [fedora.linux_system_roles.storage : Get required packages] *************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:19 Friday 17 January 2025 04:13:13 -0500 (0:00:00.061) 0:03:06.307 ******** ok: [managed-node1] => { "actions": [], "changed": false, "crypts": [], "leaves": [], "mounts": [], "packages": [ "cryptsetup" ], "pools": [], "volumes": [] } TASK [fedora.linux_system_roles.storage : Enable copr repositories if needed] *** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:31 Friday 17 January 2025 04:13:17 -0500 (0:00:04.009) 0:03:10.317 ******** included: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml for managed-node1 TASK [fedora.linux_system_roles.storage : Check if the COPR support packages should be installed] *** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:2 Friday 17 January 2025 04:13:17 -0500 (0:00:00.076) 0:03:10.393 ******** TASK [fedora.linux_system_roles.storage : Make sure COPR support packages are present] *** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:13 Friday 17 January 2025 04:13:17 -0500 (0:00:00.042) 0:03:10.436 ******** skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Enable COPRs] ************************ task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:19 Friday 17 January 2025 04:13:17 -0500 (0:00:00.055) 0:03:10.492 ******** TASK [fedora.linux_system_roles.storage : Make sure required packages are installed] *** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:37 Friday 17 January 2025 04:13:17 -0500 (0:00:00.108) 0:03:10.600 ******** ok: [managed-node1] => { "changed": false, "rc": 0, "results": [ "cryptsetup-2.0.3-6.el7.x86_64 providing cryptsetup is already installed", "kpartx-0.4.9-136.el7_9.x86_64 providing kpartx is already installed" ] } TASK [fedora.linux_system_roles.storage : Get service facts] ******************* task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:51 Friday 17 January 2025 04:13:18 -0500 (0:00:00.794) 0:03:11.395 ******** ok: [managed-node1] => { "ansible_facts": { "services": { "NetworkManager-dispatcher.service": { "name": "NetworkManager-dispatcher.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "NetworkManager-wait-online.service": { "name": "NetworkManager-wait-online.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "NetworkManager.service": { "name": 
"NetworkManager.service", "source": "systemd", "state": "running", "status": "enabled" }, "arp-ethers.service": { "name": "arp-ethers.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "auditd.service": { "name": "auditd.service", "source": "systemd", "state": "running", "status": "enabled" }, "auth-rpcgss-module.service": { "name": "auth-rpcgss-module.service", "source": "systemd", "state": "stopped", "status": "static" }, "autovt@.service": { "name": "autovt@.service", "source": "systemd", "state": "unknown", "status": "enabled" }, "blivet.service": { "name": "blivet.service", "source": "systemd", "state": "inactive", "status": "static" }, "blk-availability.service": { "name": "blk-availability.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "brandbot.service": { "name": "brandbot.service", "source": "systemd", "state": "inactive", "status": "static" }, "chrony-dnssrv@.service": { "name": "chrony-dnssrv@.service", "source": "systemd", "state": "unknown", "status": "static" }, "chrony-wait.service": { "name": "chrony-wait.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "chronyd.service": { "name": "chronyd.service", "source": "systemd", "state": "running", "status": "enabled" }, "cloud-config.service": { "name": "cloud-config.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-final.service": { "name": "cloud-final.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-init-local.service": { "name": "cloud-init-local.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-init.service": { "name": "cloud-init.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "console-getty.service": { "name": "console-getty.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "console-shell.service": { "name": "console-shell.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "container-getty@.service": { "name": "container-getty@.service", "source": "systemd", "state": "unknown", "status": "static" }, "cpupower.service": { "name": "cpupower.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "crond.service": { "name": "crond.service", "source": "systemd", "state": "running", "status": "enabled" }, "dbus-org.freedesktop.hostname1.service": { "name": "dbus-org.freedesktop.hostname1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus-org.freedesktop.import1.service": { "name": "dbus-org.freedesktop.import1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus-org.freedesktop.locale1.service": { "name": "dbus-org.freedesktop.locale1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus-org.freedesktop.login1.service": { "name": "dbus-org.freedesktop.login1.service", "source": "systemd", "state": "active", "status": "static" }, "dbus-org.freedesktop.machine1.service": { "name": "dbus-org.freedesktop.machine1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus-org.freedesktop.nm-dispatcher.service": { "name": "dbus-org.freedesktop.nm-dispatcher.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "dbus-org.freedesktop.timedate1.service": { "name": "dbus-org.freedesktop.timedate1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus.service": { "name": "dbus.service", "source": "systemd", "state": "running", 
"status": "static" }, "debug-shell.service": { "name": "debug-shell.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "dm-event.service": { "name": "dm-event.service", "source": "systemd", "state": "stopped", "status": "static" }, "dmraid-activation.service": { "name": "dmraid-activation.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "dracut-cmdline.service": { "name": "dracut-cmdline.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-initqueue.service": { "name": "dracut-initqueue.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-mount.service": { "name": "dracut-mount.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-mount.service": { "name": "dracut-pre-mount.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-pivot.service": { "name": "dracut-pre-pivot.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-trigger.service": { "name": "dracut-pre-trigger.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-udev.service": { "name": "dracut-pre-udev.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-shutdown.service": { "name": "dracut-shutdown.service", "source": "systemd", "state": "stopped", "status": "static" }, "ebtables.service": { "name": "ebtables.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "emergency.service": { "name": "emergency.service", "source": "systemd", "state": "stopped", "status": "static" }, "firewalld.service": { "name": "firewalld.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "fstrim.service": { "name": "fstrim.service", "source": "systemd", "state": "inactive", "status": "static" }, "getty@.service": { "name": "getty@.service", "source": "systemd", "state": "unknown", "status": "enabled" }, "getty@tty1.service": { "name": "getty@tty1.service", "source": "systemd", "state": "running", "status": "unknown" }, "gssproxy.service": { "name": "gssproxy.service", "source": "systemd", "state": "running", "status": "disabled" }, "halt-local.service": { "name": "halt-local.service", "source": "systemd", "state": "inactive", "status": "static" }, "initrd-cleanup.service": { "name": "initrd-cleanup.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-parse-etc.service": { "name": "initrd-parse-etc.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-switch-root.service": { "name": "initrd-switch-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-udevadm-cleanup-db.service": { "name": "initrd-udevadm-cleanup-db.service", "source": "systemd", "state": "stopped", "status": "static" }, "iprdump.service": { "name": "iprdump.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "iprinit.service": { "name": "iprinit.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "iprupdate.service": { "name": "iprupdate.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "irqbalance.service": { "name": "irqbalance.service", "source": "systemd", "state": "running", "status": "enabled" }, "kdump.service": { "name": "kdump.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "kmod-static-nodes.service": { "name": "kmod-static-nodes.service", "source": "systemd", "state": "stopped", "status": 
"static" }, "lvm2-lvmetad.service": { "name": "lvm2-lvmetad.service", "source": "systemd", "state": "running", "status": "static" }, "lvm2-lvmpolld.service": { "name": "lvm2-lvmpolld.service", "source": "systemd", "state": "stopped", "status": "static" }, "lvm2-monitor.service": { "name": "lvm2-monitor.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "lvm2-pvscan@.service": { "name": "lvm2-pvscan@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdadm-grow-continue@.service": { "name": "mdadm-grow-continue@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdadm-last-resort@.service": { "name": "mdadm-last-resort@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdcheck_continue.service": { "name": "mdcheck_continue.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdcheck_start.service": { "name": "mdcheck_start.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdmon@.service": { "name": "mdmon@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdmonitor-oneshot.service": { "name": "mdmonitor-oneshot.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdmonitor.service": { "name": "mdmonitor.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "messagebus.service": { "name": "messagebus.service", "source": "systemd", "state": "active", "status": "static" }, "microcode.service": { "name": "microcode.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "netconsole": { "name": "netconsole", "source": "sysv", "state": "stopped", "status": "disabled" }, "network": { "name": "network", "source": "sysv", "state": "running", "status": "enabled" }, "network.service": { "name": "network.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "nfs-blkmap.service": { "name": "nfs-blkmap.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfs-config.service": { "name": "nfs-config.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-idmap.service": { "name": "nfs-idmap.service", "source": "systemd", "state": "inactive", "status": "static" }, "nfs-idmapd.service": { "name": "nfs-idmapd.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-lock.service": { "name": "nfs-lock.service", "source": "systemd", "state": "inactive", "status": "static" }, "nfs-mountd.service": { "name": "nfs-mountd.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-rquotad.service": { "name": "nfs-rquotad.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfs-secure.service": { "name": "nfs-secure.service", "source": "systemd", "state": "inactive", "status": "static" }, "nfs-server.service": { "name": "nfs-server.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "nfs-utils.service": { "name": "nfs-utils.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs.service": { "name": "nfs.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfslock.service": { "name": "nfslock.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-halt.service": { "name": "plymouth-halt.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "plymouth-kexec.service": { "name": "plymouth-kexec.service", "source": "systemd", "state": "inactive", "status": 
"disabled" }, "plymouth-poweroff.service": { "name": "plymouth-poweroff.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "plymouth-quit-wait.service": { "name": "plymouth-quit-wait.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "plymouth-quit.service": { "name": "plymouth-quit.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "plymouth-read-write.service": { "name": "plymouth-read-write.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "plymouth-reboot.service": { "name": "plymouth-reboot.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "plymouth-start.service": { "name": "plymouth-start.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "plymouth-switch-root.service": { "name": "plymouth-switch-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "polkit.service": { "name": "polkit.service", "source": "systemd", "state": "running", "status": "static" }, "postfix.service": { "name": "postfix.service", "source": "systemd", "state": "running", "status": "enabled" }, "qemu-guest-agent.service": { "name": "qemu-guest-agent.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "quotaon.service": { "name": "quotaon.service", "source": "systemd", "state": "inactive", "status": "static" }, "rc-local.service": { "name": "rc-local.service", "source": "systemd", "state": "stopped", "status": "static" }, "rdisc.service": { "name": "rdisc.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rescue.service": { "name": "rescue.service", "source": "systemd", "state": "stopped", "status": "static" }, "restraintd.service": { "name": "restraintd.service", "source": "systemd", "state": "running", "status": "enabled" }, "rhel-autorelabel-mark.service": { "name": "rhel-autorelabel-mark.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "rhel-autorelabel.service": { "name": "rhel-autorelabel.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "rhel-configure.service": { "name": "rhel-configure.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "rhel-dmesg.service": { "name": "rhel-dmesg.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "rhel-domainname.service": { "name": "rhel-domainname.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "rhel-import-state.service": { "name": "rhel-import-state.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "rhel-loadmodules.service": { "name": "rhel-loadmodules.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "rhel-readonly.service": { "name": "rhel-readonly.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "rngd.service": { "name": "rngd.service", "source": "systemd", "state": "running", "status": "enabled" }, "rpc-gssd.service": { "name": "rpc-gssd.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-rquotad.service": { "name": "rpc-rquotad.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rpc-statd-notify.service": { "name": "rpc-statd-notify.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-statd.service": { "name": "rpc-statd.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpcbind.service": { "name": "rpcbind.service", "source": "systemd", "state": "running", 
"status": "enabled" }, "rpcgssd.service": { "name": "rpcgssd.service", "source": "systemd", "state": "inactive", "status": "static" }, "rpcidmapd.service": { "name": "rpcidmapd.service", "source": "systemd", "state": "inactive", "status": "static" }, "rsyncd.service": { "name": "rsyncd.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rsyncd@.service": { "name": "rsyncd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "rsyslog.service": { "name": "rsyslog.service", "source": "systemd", "state": "running", "status": "enabled" }, "selinux-policy-migrate-local-changes@.service": { "name": "selinux-policy-migrate-local-changes@.service", "source": "systemd", "state": "unknown", "status": "static" }, "selinux-policy-migrate-local-changes@targeted.service": { "name": "selinux-policy-migrate-local-changes@targeted.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "serial-getty@.service": { "name": "serial-getty@.service", "source": "systemd", "state": "unknown", "status": "disabled" }, "serial-getty@ttyS0.service": { "name": "serial-getty@ttyS0.service", "source": "systemd", "state": "running", "status": "unknown" }, "sshd-keygen.service": { "name": "sshd-keygen.service", "source": "systemd", "state": "stopped", "status": "static" }, "sshd.service": { "name": "sshd.service", "source": "systemd", "state": "running", "status": "enabled" }, "sshd@.service": { "name": "sshd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-ask-password-console.service": { "name": "systemd-ask-password-console.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-ask-password-plymouth.service": { "name": "systemd-ask-password-plymouth.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-ask-password-wall.service": { "name": "systemd-ask-password-wall.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-backlight@.service": { "name": "systemd-backlight@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-binfmt.service": { "name": "systemd-binfmt.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-bootchart.service": { "name": "systemd-bootchart.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "systemd-firstboot.service": { "name": "systemd-firstboot.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-fsck-root.service": { "name": "systemd-fsck-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-fsck@.service": { "name": "systemd-fsck@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-halt.service": { "name": "systemd-halt.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hibernate-resume@.service": { "name": "systemd-hibernate-resume@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-hibernate.service": { "name": "systemd-hibernate.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hostnamed.service": { "name": "systemd-hostnamed.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hwdb-update.service": { "name": "systemd-hwdb-update.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-hybrid-sleep.service": { "name": "systemd-hybrid-sleep.service", "source": "systemd", "state": "inactive", 
"status": "static" }, "systemd-importd.service": { "name": "systemd-importd.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-initctl.service": { "name": "systemd-initctl.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journal-catalog-update.service": { "name": "systemd-journal-catalog-update.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journal-flush.service": { "name": "systemd-journal-flush.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journald.service": { "name": "systemd-journald.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-kexec.service": { "name": "systemd-kexec.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-localed.service": { "name": "systemd-localed.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-logind.service": { "name": "systemd-logind.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-machine-id-commit.service": { "name": "systemd-machine-id-commit.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-machined.service": { "name": "systemd-machined.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-modules-load.service": { "name": "systemd-modules-load.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-nspawn@.service": { "name": "systemd-nspawn@.service", "source": "systemd", "state": "unknown", "status": "disabled" }, "systemd-poweroff.service": { "name": "systemd-poweroff.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-quotacheck.service": { "name": "systemd-quotacheck.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-random-seed.service": { "name": "systemd-random-seed.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-readahead-collect.service": { "name": "systemd-readahead-collect.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "systemd-readahead-done.service": { "name": "systemd-readahead-done.service", "source": "systemd", "state": "stopped", "status": "indirect" }, "systemd-readahead-drop.service": { "name": "systemd-readahead-drop.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "systemd-readahead-replay.service": { "name": "systemd-readahead-replay.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "systemd-reboot.service": { "name": "systemd-reboot.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-remount-fs.service": { "name": "systemd-remount-fs.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-rfkill@.service": { "name": "systemd-rfkill@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-shutdownd.service": { "name": "systemd-shutdownd.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-suspend.service": { "name": "systemd-suspend.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-sysctl.service": { "name": "systemd-sysctl.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-timedated.service": { "name": "systemd-timedated.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-tmpfiles-clean.service": { "name": 
"systemd-tmpfiles-clean.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles-setup-dev.service": { "name": "systemd-tmpfiles-setup-dev.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles-setup.service": { "name": "systemd-tmpfiles-setup.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udev-settle.service": { "name": "systemd-udev-settle.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udev-trigger.service": { "name": "systemd-udev-trigger.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udevd.service": { "name": "systemd-udevd.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-update-done.service": { "name": "systemd-update-done.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-update-utmp-runlevel.service": { "name": "systemd-update-utmp-runlevel.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-update-utmp.service": { "name": "systemd-update-utmp.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-user-sessions.service": { "name": "systemd-user-sessions.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-vconsole-setup.service": { "name": "systemd-vconsole-setup.service", "source": "systemd", "state": "stopped", "status": "static" }, "target.service": { "name": "target.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "targetclid.service": { "name": "targetclid.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "teamd@.service": { "name": "teamd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "tuned.service": { "name": "tuned.service", "source": "systemd", "state": "running", "status": "enabled" }, "wpa_supplicant.service": { "name": "wpa_supplicant.service", "source": "systemd", "state": "inactive", "status": "disabled" } } }, "changed": false } TASK [fedora.linux_system_roles.storage : Set storage_cryptsetup_services] ***** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:57 Friday 17 January 2025 04:13:19 -0500 (0:00:01.109) 0:03:12.504 ******** ok: [managed-node1] => { "ansible_facts": { "storage_cryptsetup_services": [] }, "changed": false } TASK [fedora.linux_system_roles.storage : Mask the systemd cryptsetup services] *** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:63 Friday 17 January 2025 04:13:19 -0500 (0:00:00.077) 0:03:12.582 ******** TASK [fedora.linux_system_roles.storage : Manage the pools and volumes to match the specified state] *** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:69 Friday 17 January 2025 04:13:19 -0500 (0:00:00.074) 0:03:12.657 ******** fatal: [managed-node1]: FAILED! => { "actions": [], "changed": false, "crypts": [], "leaves": [], "mounts": [], "packages": [], "pools": [], "volumes": [] } MSG: encrypted volume 'test1' missing key/password TASK [fedora.linux_system_roles.storage : Failed message] ********************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:109 Friday 17 January 2025 04:13:23 -0500 (0:00:03.995) 0:03:16.653 ******** fatal: [managed-node1]: FAILED! 
=> {
    "changed": false
}

MSG: {u'_ansible_no_log': False, u'crypts': [], u'pools': [], u'leaves': [], u'changed': False, u'actions': [], u'failed': True, u'volumes': [], u'invocation': {u'module_args': {u'packages_only': False, u'disklabel_type': None, u'diskvolume_mkfs_option_map': {u'ext4': u'-F', u'ext3': u'-F', u'ext2': u'-F'}, u'safe_mode': False, u'pools': [{u'raid_metadata_version': None, u'encryption_key_size': None, u'encryption_key': None, u'encryption_luks_version': None, u'encryption_tang_url': None, u'raid_spare_count': None, u'grow_to_fill': False, u'encryption_tang_thumbprint': None, u'name': u'foo', u'encryption_password': None, u'encryption': False, u'disks': [u'sda'], u'raid_level': None, u'raid_device_count': None, u'state': u'present', u'volumes': [{u'raid_metadata_version': None, u'mount_device_identifier': u'uuid', u'fs_type': u'xfs', u'mount_options': u'defaults', u'size': u'4g', u'mount_point': u'/opt/test1', u'compression': None, u'encryption_password': None, u'encryption': True, u'raid_level': None, u'raid_device_count': None, u'state': u'present', u'vdo_pool_size': None, u'mount_mode': None, u'thin_pool_name': None, u'type': u'partition', u'encryption_key_size': None, u'deduplication': None, u'encryption_cipher': None, u'encryption_key': None, u'fs_label': u'', u'encryption_luks_version': u'luks2', u'raid_stripe_size': None, u'mount_passno': 0, u'mount_user': None, u'raid_spare_count': None, u'name': u'test1', u'cache_mode': None, u'raid_disks': [], u'mount_group': None, u'fs_overwrite_existing': True, u'disks': [], u'cached': False, u'thin_pool_size': None, u'thin': False, u'mount_check': 0, u'cache_size': 0, u'raid_chunk_size': None, u'cache_devices': [], u'fs_create_options': u''}], u'shared': False, u'encryption_clevis_pin': None, u'type': u'partition', u'encryption_cipher': None, u'raid_chunk_size': None}], u'volumes': [], u'pool_defaults': {u'raid_metadata_version': None, u'encryption_cipher': None, u'encryption_key': None, u'encryption_luks_version': None, u'raid_spare_count': None, u'grow_to_fill': False, u'encryption_password': None, u'encryption': False, u'disks': [], u'raid_level': None, u'raid_device_count': None, u'state': u'present', u'volumes': [], u'shared': False, u'type': u'lvm', u'encryption_key_size': None, u'raid_chunk_size': None}, u'volume_defaults': {u'raid_metadata_version': None, u'raid_level': None, u'fs_type': u'xfs', u'mount_options': u'defaults', u'size': 0, u'mount_point': u'', u'compression': None, u'encryption_password': None, u'encryption': False, u'mount_device_identifier': u'uuid', u'raid_device_count': None, u'state': u'present', u'vdo_pool_size': None, u'thin_pool_name': None, u'type': u'lvm', u'encryption_key_size': None, u'encryption_cipher': None, u'encryption_key': None, u'fs_label': u'', u'encryption_luks_version': None, u'raid_stripe_size': None, u'cache_size': 0, u'raid_spare_count': None, u'cache_mode': None, u'deduplication': None, u'cached': False, u'fs_overwrite_existing': True, u'disks': [], u'thin_pool_size': None, u'thin': None, u'mount_check': 0, u'mount_passno': 0, u'raid_chunk_size': None, u'cache_devices': [], u'fs_create_options': u''}, u'use_partitions': None}}, u'mounts': [], u'packages': [], u'msg': u"encrypted volume 'test1' missing key/password"}
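This failure is the point of the test case: the volume asks for encryption (encryption: True, encryption_luks_version: luks2) but, as the echoed module_args show, both encryption_password and encryption_key are None, so the blivet provider aborts instead of creating a LUKS device it could never unlock. Below is a sketch of the storage_pools input that reproduces the failure; the variable layout is reconstructed from the Show storage_pools output earlier in the run, and the comments are editorial.

storage_pools:
  - name: foo
    type: partition
    disks:
      - sda
    volumes:
      - name: test1
        type: partition
        size: 4g
        mount_point: /opt/test1
        encryption: true
        encryption_luks_version: luks2
        # Intentionally no encryption_password or encryption_key here; the
        # role then fails with "encrypted volume 'test1' missing key/password".

The next test case, "Create an encrypted partition volume w/ default fs" below, re-runs essentially the same specification with a passphrase supplied, and succeeds.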
TASK [fedora.linux_system_roles.storage : Unmask the systemd cryptsetup services] ***
task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:113
Friday 17 January 2025 04:13:24 -0500 (0:00:00.141) 0:03:16.795 ********

TASK [Check that we failed in the role] ****************************************
task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-failed.yml:23
Friday 17 January 2025 04:13:24 -0500 (0:00:00.067) 0:03:16.863 ********
ok: [managed-node1] => {
    "changed": false
}

MSG: All assertions passed

TASK [Verify the blivet output and error message are correct] ******************
task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-failed.yml:28
Friday 17 January 2025 04:13:24 -0500 (0:00:00.178) 0:03:17.042 ********
ok: [managed-node1] => {
    "changed": false
}

MSG: All assertions passed

TASK [Verify correct exception or error message] *******************************
task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-failed.yml:39
Friday 17 January 2025 04:13:24 -0500 (0:00:00.075) 0:03:17.117 ********
skipping: [managed-node1] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [Create an encrypted partition volume w/ default fs] **********************
task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/tests_luks2.yml:223
Friday 17 January 2025 04:13:24 -0500 (0:00:00.058) 0:03:17.175 ********

TASK [fedora.linux_system_roles.storage : Set platform/version specific variables] ***
task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:2
Friday 17 January 2025 04:13:24 -0500 (0:00:00.298) 0:03:17.473 ********
included: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml for managed-node1

TASK [fedora.linux_system_roles.storage : Ensure ansible_facts used by role] ***
task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:2
Friday 17 January 2025 04:13:24 -0500 (0:00:00.089) 0:03:17.563 ********
skipping: [managed-node1] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [fedora.linux_system_roles.storage : Set platform/version specific variables] ***
task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:7
Friday 17 January 2025 04:13:24 -0500 (0:00:00.069) 0:03:17.633 ********
skipping: [managed-node1] => (item=RedHat.yml) => {
    "ansible_loop_var": "item",
    "changed": false,
    "item": "RedHat.yml",
    "skip_reason": "Conditional result was False"
}
skipping: [managed-node1] => (item=CentOS.yml) => {
    "ansible_loop_var": "item",
    "changed": false,
    "item": "CentOS.yml",
    "skip_reason": "Conditional result was False"
}
ok: [managed-node1] => (item=CentOS_7.yml) => {
    "ansible_facts": {
        "__storage_blivet_diskvolume_mkfs_option_map": {
            "ext2": "-F",
            "ext3": "-F",
            "ext4": "-F"
        },
        "blivet_package_list": [
            "python-enum34",
            "python-blivet3",
            "libblockdev-crypto",
            "libblockdev-dm",
            "libblockdev-lvm",
            "libblockdev-mdraid",
            "libblockdev-swap",
            "{{ 'libblockdev-s390' if ansible_architecture == 's390x' else 'libblockdev' }}"
        ]
    },
    "ansible_included_var_files": [
        "/tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/vars/CentOS_7.yml"
    ],
    "ansible_loop_var": "item",
    "changed": false,
    "item": "CentOS_7.yml"
}
skipping: [managed-node1] => (item=CentOS_7.9.yml) => {
    "ansible_loop_var": "item",
    "changed": false,
    "item": "CentOS_7.9.yml",
    "skip_reason": "Conditional result was False"
}
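The blivet_package_list just included from vars/CentOS_7.yml is worth a note: its final entry is a Jinja2 conditional evaluated per host, so the role installs the s390-specific libblockdev build only on s390x machines. A minimal sketch of the vars-file shape this output implies (the exact file layout is an assumption; the package names and the conditional are copied from the log):

blivet_package_list:
  - python-enum34
  - python-blivet3
  - libblockdev-crypto
  - libblockdev-dm
  - libblockdev-lvm
  - libblockdev-mdraid
  - libblockdev-swap
  - "{{ 'libblockdev-s390' if ansible_architecture == 's390x' else 'libblockdev' }}"

On this x86_64 node the conditional resolves to plain libblockdev, which matches the "libblockdev-2.18-5.el7.x86_64 providing libblockdev is already installed" result in the package task that follows.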
TASK [fedora.linux_system_roles.storage : Check if system is ostree] ***********
task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:25
Friday 17 January 2025 04:13:25 -0500 (0:00:00.163) 0:03:17.796 ********
skipping: [managed-node1] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [fedora.linux_system_roles.storage : Set flag to indicate system is ostree] ***
task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:30
Friday 17 January 2025 04:13:25 -0500 (0:00:00.053) 0:03:17.849 ********
skipping: [managed-node1] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [fedora.linux_system_roles.storage : Define an empty list of pools to be used in testing] ***
task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:5
Friday 17 January 2025 04:13:25 -0500 (0:00:00.104) 0:03:17.954 ********
ok: [managed-node1] => {
    "ansible_facts": {
        "_storage_pools_list": []
    },
    "changed": false
}

TASK [fedora.linux_system_roles.storage : Define an empty list of volumes to be used in testing] ***
task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:9
Friday 17 January 2025 04:13:25 -0500 (0:00:00.067) 0:03:18.021 ********
ok: [managed-node1] => {
    "ansible_facts": {
        "_storage_volumes_list": []
    },
    "changed": false
}

TASK [fedora.linux_system_roles.storage : Include the appropriate provider tasks] ***
task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:13
Friday 17 January 2025 04:13:25 -0500 (0:00:00.085) 0:03:18.107 ********
included: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml for managed-node1

TASK [fedora.linux_system_roles.storage : Make sure blivet is available] *******
task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:2
Friday 17 January 2025 04:13:25 -0500 (0:00:00.144) 0:03:18.251 ********
ok: [managed-node1] => {
    "changed": false,
    "rc": 0,
    "results": [
        "python-enum34-1.0.4-1.el7.noarch providing python-enum34 is already installed",
        "1:python2-blivet3-3.1.3-3.el7.noarch providing python-blivet3 is already installed",
        "libblockdev-crypto-2.18-5.el7.x86_64 providing libblockdev-crypto is already installed",
        "libblockdev-dm-2.18-5.el7.x86_64 providing libblockdev-dm is already installed",
        "libblockdev-lvm-2.18-5.el7.x86_64 providing libblockdev-lvm is already installed",
        "libblockdev-mdraid-2.18-5.el7.x86_64 providing libblockdev-mdraid is already installed",
        "libblockdev-swap-2.18-5.el7.x86_64 providing libblockdev-swap is already installed",
        "libblockdev-2.18-5.el7.x86_64 providing libblockdev is already installed"
    ]
}

TASK [fedora.linux_system_roles.storage : Show storage_pools] ******************
task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:9
Friday 17 January 2025 04:13:29 -0500 (0:00:04.231) 0:03:22.483 ********
ok: [managed-node1] => {
    "storage_pools": [
        {
            "disks": [
                "sda"
            ],
            "name": "foo",
            "type": "partition",
            "volumes": [
                {
                    "encryption": true,
                    "encryption_luks_version": "luks2",
                    "encryption_password": "yabbadabbadoo",
                    "mount_point": "/opt/test1",
                    "name": "test1",
                    "size": "4g",
                    "type": "partition"
                }
            ]
        }
    ]
}
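This is the retry with key material supplied; the specification is otherwise identical to the one that failed. As a YAML sketch of the input variable (reconstructed from the debug output just above):

storage_pools:
  - name: foo
    type: partition
    disks:
      - sda
    volumes:
      - name: test1
        type: partition
        size: 4g
        mount_point: /opt/test1
        encryption: true
        encryption_luks_version: luks2
        encryption_password: yabbadabbadoo  # throwaway test passphrase, copied from the log

Two details of the surrounding output are worth flagging. First, the debug task above prints the passphrase in clear text, while the blivet module result further below reports it as VALUE_SPECIFIED_IN_NO_LOG_PARAMETER: only the module call is no_log-protected, not ad-hoc debugging of the variable. Second, the Show storage_volumes task that follows prints "VARIABLE IS NOT DEFINED!" simply because this test defines storage_pools and never sets storage_volumes.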
TASK [fedora.linux_system_roles.storage : Show storage_volumes] ****************
task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:14
Friday 17 January 2025 04:13:29 -0500 (0:00:00.073) 0:03:22.556 ********
ok: [managed-node1] => {
    "storage_volumes": "VARIABLE IS NOT DEFINED!: 'storage_volumes' is undefined"
}

TASK [fedora.linux_system_roles.storage : Get required packages] ***************
task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:19
Friday 17 January 2025 04:13:29 -0500 (0:00:00.078) 0:03:22.635 ********
ok: [managed-node1] => {
    "actions": [],
    "changed": false,
    "crypts": [],
    "leaves": [],
    "mounts": [],
    "packages": [
        "cryptsetup"
    ],
    "pools": [],
    "volumes": []
}

TASK [fedora.linux_system_roles.storage : Enable copr repositories if needed] ***
task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:31
Friday 17 January 2025 04:13:34 -0500 (0:00:04.247) 0:03:26.882 ********
included: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml for managed-node1

TASK [fedora.linux_system_roles.storage : Check if the COPR support packages should be installed] ***
task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:2
Friday 17 January 2025 04:13:34 -0500 (0:00:00.180) 0:03:27.063 ********

TASK [fedora.linux_system_roles.storage : Make sure COPR support packages are present] ***
task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:13
Friday 17 January 2025 04:13:34 -0500 (0:00:00.053) 0:03:27.117 ********
skipping: [managed-node1] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [fedora.linux_system_roles.storage : Enable COPRs] ************************
task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:19
Friday 17 January 2025 04:13:34 -0500 (0:00:00.082) 0:03:27.199 ********

TASK [fedora.linux_system_roles.storage : Make sure required packages are installed] ***
task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:37
Friday 17 January 2025 04:13:34 -0500 (0:00:00.058) 0:03:27.257 ********
ok: [managed-node1] => {
    "changed": false,
    "rc": 0,
    "results": [
        "cryptsetup-2.0.3-6.el7.x86_64 providing cryptsetup is already installed",
        "kpartx-0.4.9-136.el7_9.x86_64 providing kpartx is already installed"
    ]
}

TASK [fedora.linux_system_roles.storage : Get service facts] *******************
task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:51
Friday 17 January 2025 04:13:35 -0500 (0:00:00.720) 0:03:27.978 ********
ok: [managed-node1] => { "ansible_facts": { "services": { "NetworkManager-dispatcher.service": { "name": "NetworkManager-dispatcher.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "NetworkManager-wait-online.service": { "name": "NetworkManager-wait-online.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "NetworkManager.service": { "name": "NetworkManager.service", "source": "systemd", "state": "running", "status": "enabled" }, "arp-ethers.service": { "name": "arp-ethers.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "auditd.service": { "name": "auditd.service", "source": "systemd", "state": "running", "status": "enabled" },
"auth-rpcgss-module.service": { "name": "auth-rpcgss-module.service", "source": "systemd", "state": "stopped", "status": "static" }, "autovt@.service": { "name": "autovt@.service", "source": "systemd", "state": "unknown", "status": "enabled" }, "blivet.service": { "name": "blivet.service", "source": "systemd", "state": "inactive", "status": "static" }, "blk-availability.service": { "name": "blk-availability.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "brandbot.service": { "name": "brandbot.service", "source": "systemd", "state": "inactive", "status": "static" }, "chrony-dnssrv@.service": { "name": "chrony-dnssrv@.service", "source": "systemd", "state": "unknown", "status": "static" }, "chrony-wait.service": { "name": "chrony-wait.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "chronyd.service": { "name": "chronyd.service", "source": "systemd", "state": "running", "status": "enabled" }, "cloud-config.service": { "name": "cloud-config.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-final.service": { "name": "cloud-final.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-init-local.service": { "name": "cloud-init-local.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-init.service": { "name": "cloud-init.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "console-getty.service": { "name": "console-getty.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "console-shell.service": { "name": "console-shell.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "container-getty@.service": { "name": "container-getty@.service", "source": "systemd", "state": "unknown", "status": "static" }, "cpupower.service": { "name": "cpupower.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "crond.service": { "name": "crond.service", "source": "systemd", "state": "running", "status": "enabled" }, "dbus-org.freedesktop.hostname1.service": { "name": "dbus-org.freedesktop.hostname1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus-org.freedesktop.import1.service": { "name": "dbus-org.freedesktop.import1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus-org.freedesktop.locale1.service": { "name": "dbus-org.freedesktop.locale1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus-org.freedesktop.login1.service": { "name": "dbus-org.freedesktop.login1.service", "source": "systemd", "state": "active", "status": "static" }, "dbus-org.freedesktop.machine1.service": { "name": "dbus-org.freedesktop.machine1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus-org.freedesktop.nm-dispatcher.service": { "name": "dbus-org.freedesktop.nm-dispatcher.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "dbus-org.freedesktop.timedate1.service": { "name": "dbus-org.freedesktop.timedate1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus.service": { "name": "dbus.service", "source": "systemd", "state": "running", "status": "static" }, "debug-shell.service": { "name": "debug-shell.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "dm-event.service": { "name": "dm-event.service", "source": "systemd", "state": "stopped", "status": "static" }, "dmraid-activation.service": { "name": 
"dmraid-activation.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "dracut-cmdline.service": { "name": "dracut-cmdline.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-initqueue.service": { "name": "dracut-initqueue.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-mount.service": { "name": "dracut-mount.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-mount.service": { "name": "dracut-pre-mount.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-pivot.service": { "name": "dracut-pre-pivot.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-trigger.service": { "name": "dracut-pre-trigger.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-udev.service": { "name": "dracut-pre-udev.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-shutdown.service": { "name": "dracut-shutdown.service", "source": "systemd", "state": "stopped", "status": "static" }, "ebtables.service": { "name": "ebtables.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "emergency.service": { "name": "emergency.service", "source": "systemd", "state": "stopped", "status": "static" }, "firewalld.service": { "name": "firewalld.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "fstrim.service": { "name": "fstrim.service", "source": "systemd", "state": "inactive", "status": "static" }, "getty@.service": { "name": "getty@.service", "source": "systemd", "state": "unknown", "status": "enabled" }, "getty@tty1.service": { "name": "getty@tty1.service", "source": "systemd", "state": "running", "status": "unknown" }, "gssproxy.service": { "name": "gssproxy.service", "source": "systemd", "state": "running", "status": "disabled" }, "halt-local.service": { "name": "halt-local.service", "source": "systemd", "state": "inactive", "status": "static" }, "initrd-cleanup.service": { "name": "initrd-cleanup.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-parse-etc.service": { "name": "initrd-parse-etc.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-switch-root.service": { "name": "initrd-switch-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-udevadm-cleanup-db.service": { "name": "initrd-udevadm-cleanup-db.service", "source": "systemd", "state": "stopped", "status": "static" }, "iprdump.service": { "name": "iprdump.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "iprinit.service": { "name": "iprinit.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "iprupdate.service": { "name": "iprupdate.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "irqbalance.service": { "name": "irqbalance.service", "source": "systemd", "state": "running", "status": "enabled" }, "kdump.service": { "name": "kdump.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "kmod-static-nodes.service": { "name": "kmod-static-nodes.service", "source": "systemd", "state": "stopped", "status": "static" }, "lvm2-lvmetad.service": { "name": "lvm2-lvmetad.service", "source": "systemd", "state": "running", "status": "static" }, "lvm2-lvmpolld.service": { "name": "lvm2-lvmpolld.service", "source": "systemd", "state": "stopped", "status": "static" }, "lvm2-monitor.service": { "name": 
"lvm2-monitor.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "lvm2-pvscan@.service": { "name": "lvm2-pvscan@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdadm-grow-continue@.service": { "name": "mdadm-grow-continue@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdadm-last-resort@.service": { "name": "mdadm-last-resort@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdcheck_continue.service": { "name": "mdcheck_continue.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdcheck_start.service": { "name": "mdcheck_start.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdmon@.service": { "name": "mdmon@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdmonitor-oneshot.service": { "name": "mdmonitor-oneshot.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdmonitor.service": { "name": "mdmonitor.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "messagebus.service": { "name": "messagebus.service", "source": "systemd", "state": "active", "status": "static" }, "microcode.service": { "name": "microcode.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "netconsole": { "name": "netconsole", "source": "sysv", "state": "stopped", "status": "disabled" }, "network": { "name": "network", "source": "sysv", "state": "running", "status": "enabled" }, "network.service": { "name": "network.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "nfs-blkmap.service": { "name": "nfs-blkmap.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfs-config.service": { "name": "nfs-config.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-idmap.service": { "name": "nfs-idmap.service", "source": "systemd", "state": "inactive", "status": "static" }, "nfs-idmapd.service": { "name": "nfs-idmapd.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-lock.service": { "name": "nfs-lock.service", "source": "systemd", "state": "inactive", "status": "static" }, "nfs-mountd.service": { "name": "nfs-mountd.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-rquotad.service": { "name": "nfs-rquotad.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfs-secure.service": { "name": "nfs-secure.service", "source": "systemd", "state": "inactive", "status": "static" }, "nfs-server.service": { "name": "nfs-server.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "nfs-utils.service": { "name": "nfs-utils.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs.service": { "name": "nfs.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfslock.service": { "name": "nfslock.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-halt.service": { "name": "plymouth-halt.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "plymouth-kexec.service": { "name": "plymouth-kexec.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "plymouth-poweroff.service": { "name": "plymouth-poweroff.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "plymouth-quit-wait.service": { "name": "plymouth-quit-wait.service", "source": "systemd", "state": "stopped", "status": "disabled" }, 
"plymouth-quit.service": { "name": "plymouth-quit.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "plymouth-read-write.service": { "name": "plymouth-read-write.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "plymouth-reboot.service": { "name": "plymouth-reboot.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "plymouth-start.service": { "name": "plymouth-start.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "plymouth-switch-root.service": { "name": "plymouth-switch-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "polkit.service": { "name": "polkit.service", "source": "systemd", "state": "running", "status": "static" }, "postfix.service": { "name": "postfix.service", "source": "systemd", "state": "running", "status": "enabled" }, "qemu-guest-agent.service": { "name": "qemu-guest-agent.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "quotaon.service": { "name": "quotaon.service", "source": "systemd", "state": "inactive", "status": "static" }, "rc-local.service": { "name": "rc-local.service", "source": "systemd", "state": "stopped", "status": "static" }, "rdisc.service": { "name": "rdisc.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rescue.service": { "name": "rescue.service", "source": "systemd", "state": "stopped", "status": "static" }, "restraintd.service": { "name": "restraintd.service", "source": "systemd", "state": "running", "status": "enabled" }, "rhel-autorelabel-mark.service": { "name": "rhel-autorelabel-mark.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "rhel-autorelabel.service": { "name": "rhel-autorelabel.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "rhel-configure.service": { "name": "rhel-configure.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "rhel-dmesg.service": { "name": "rhel-dmesg.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "rhel-domainname.service": { "name": "rhel-domainname.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "rhel-import-state.service": { "name": "rhel-import-state.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "rhel-loadmodules.service": { "name": "rhel-loadmodules.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "rhel-readonly.service": { "name": "rhel-readonly.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "rngd.service": { "name": "rngd.service", "source": "systemd", "state": "running", "status": "enabled" }, "rpc-gssd.service": { "name": "rpc-gssd.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-rquotad.service": { "name": "rpc-rquotad.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rpc-statd-notify.service": { "name": "rpc-statd-notify.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-statd.service": { "name": "rpc-statd.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpcbind.service": { "name": "rpcbind.service", "source": "systemd", "state": "running", "status": "enabled" }, "rpcgssd.service": { "name": "rpcgssd.service", "source": "systemd", "state": "inactive", "status": "static" }, "rpcidmapd.service": { "name": "rpcidmapd.service", "source": "systemd", "state": "inactive", "status": "static" }, "rsyncd.service": { "name": 
"rsyncd.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rsyncd@.service": { "name": "rsyncd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "rsyslog.service": { "name": "rsyslog.service", "source": "systemd", "state": "running", "status": "enabled" }, "selinux-policy-migrate-local-changes@.service": { "name": "selinux-policy-migrate-local-changes@.service", "source": "systemd", "state": "unknown", "status": "static" }, "selinux-policy-migrate-local-changes@targeted.service": { "name": "selinux-policy-migrate-local-changes@targeted.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "serial-getty@.service": { "name": "serial-getty@.service", "source": "systemd", "state": "unknown", "status": "disabled" }, "serial-getty@ttyS0.service": { "name": "serial-getty@ttyS0.service", "source": "systemd", "state": "running", "status": "unknown" }, "sshd-keygen.service": { "name": "sshd-keygen.service", "source": "systemd", "state": "stopped", "status": "static" }, "sshd.service": { "name": "sshd.service", "source": "systemd", "state": "running", "status": "enabled" }, "sshd@.service": { "name": "sshd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-ask-password-console.service": { "name": "systemd-ask-password-console.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-ask-password-plymouth.service": { "name": "systemd-ask-password-plymouth.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-ask-password-wall.service": { "name": "systemd-ask-password-wall.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-backlight@.service": { "name": "systemd-backlight@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-binfmt.service": { "name": "systemd-binfmt.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-bootchart.service": { "name": "systemd-bootchart.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "systemd-firstboot.service": { "name": "systemd-firstboot.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-fsck-root.service": { "name": "systemd-fsck-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-fsck@.service": { "name": "systemd-fsck@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-halt.service": { "name": "systemd-halt.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hibernate-resume@.service": { "name": "systemd-hibernate-resume@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-hibernate.service": { "name": "systemd-hibernate.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hostnamed.service": { "name": "systemd-hostnamed.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hwdb-update.service": { "name": "systemd-hwdb-update.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-hybrid-sleep.service": { "name": "systemd-hybrid-sleep.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-importd.service": { "name": "systemd-importd.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-initctl.service": { "name": "systemd-initctl.service", "source": "systemd", "state": "stopped", "status": "static" }, 
"systemd-journal-catalog-update.service": { "name": "systemd-journal-catalog-update.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journal-flush.service": { "name": "systemd-journal-flush.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journald.service": { "name": "systemd-journald.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-kexec.service": { "name": "systemd-kexec.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-localed.service": { "name": "systemd-localed.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-logind.service": { "name": "systemd-logind.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-machine-id-commit.service": { "name": "systemd-machine-id-commit.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-machined.service": { "name": "systemd-machined.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-modules-load.service": { "name": "systemd-modules-load.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-nspawn@.service": { "name": "systemd-nspawn@.service", "source": "systemd", "state": "unknown", "status": "disabled" }, "systemd-poweroff.service": { "name": "systemd-poweroff.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-quotacheck.service": { "name": "systemd-quotacheck.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-random-seed.service": { "name": "systemd-random-seed.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-readahead-collect.service": { "name": "systemd-readahead-collect.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "systemd-readahead-done.service": { "name": "systemd-readahead-done.service", "source": "systemd", "state": "stopped", "status": "indirect" }, "systemd-readahead-drop.service": { "name": "systemd-readahead-drop.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "systemd-readahead-replay.service": { "name": "systemd-readahead-replay.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "systemd-reboot.service": { "name": "systemd-reboot.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-remount-fs.service": { "name": "systemd-remount-fs.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-rfkill@.service": { "name": "systemd-rfkill@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-shutdownd.service": { "name": "systemd-shutdownd.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-suspend.service": { "name": "systemd-suspend.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-sysctl.service": { "name": "systemd-sysctl.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-timedated.service": { "name": "systemd-timedated.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-tmpfiles-clean.service": { "name": "systemd-tmpfiles-clean.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles-setup-dev.service": { "name": "systemd-tmpfiles-setup-dev.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles-setup.service": { 
"name": "systemd-tmpfiles-setup.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udev-settle.service": { "name": "systemd-udev-settle.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udev-trigger.service": { "name": "systemd-udev-trigger.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udevd.service": { "name": "systemd-udevd.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-update-done.service": { "name": "systemd-update-done.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-update-utmp-runlevel.service": { "name": "systemd-update-utmp-runlevel.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-update-utmp.service": { "name": "systemd-update-utmp.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-user-sessions.service": { "name": "systemd-user-sessions.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-vconsole-setup.service": { "name": "systemd-vconsole-setup.service", "source": "systemd", "state": "stopped", "status": "static" }, "target.service": { "name": "target.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "targetclid.service": { "name": "targetclid.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "teamd@.service": { "name": "teamd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "tuned.service": { "name": "tuned.service", "source": "systemd", "state": "running", "status": "enabled" }, "wpa_supplicant.service": { "name": "wpa_supplicant.service", "source": "systemd", "state": "inactive", "status": "disabled" } } }, "changed": false } TASK [fedora.linux_system_roles.storage : Set storage_cryptsetup_services] ***** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:57 Friday 17 January 2025 04:13:36 -0500 (0:00:01.215) 0:03:29.193 ******** ok: [managed-node1] => { "ansible_facts": { "storage_cryptsetup_services": [] }, "changed": false } TASK [fedora.linux_system_roles.storage : Mask the systemd cryptsetup services] *** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:63 Friday 17 January 2025 04:13:36 -0500 (0:00:00.079) 0:03:29.272 ******** TASK [fedora.linux_system_roles.storage : Manage the pools and volumes to match the specified state] *** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:69 Friday 17 January 2025 04:13:36 -0500 (0:00:00.049) 0:03:29.322 ******** changed: [managed-node1] => { "actions": [ { "action": "destroy format", "device": "/dev/mapper/luks-8c8d336a-42af-4ee5-a371-3bf9cd16f56b", "fs_type": "xfs" }, { "action": "destroy device", "device": "/dev/mapper/luks-8c8d336a-42af-4ee5-a371-3bf9cd16f56b", "fs_type": null }, { "action": "destroy format", "device": "/dev/sda", "fs_type": "luks" }, { "action": "create format", "device": "/dev/sda", "fs_type": "disklabel" }, { "action": "create device", "device": "/dev/sda1", "fs_type": null }, { "action": "create format", "device": "/dev/sda1", "fs_type": "luks" }, { "action": "create device", "device": "/dev/mapper/luks-35e15af6-60a8-41b2-94e6-05460ad82662", "fs_type": null }, { "action": "create format", "device": "/dev/mapper/luks-35e15af6-60a8-41b2-94e6-05460ad82662", "fs_type": "xfs" } ], 
"changed": true, "crypts": [ { "backing_device": "/dev/sda", "name": "luks-8c8d336a-42af-4ee5-a371-3bf9cd16f56b", "password": "-", "state": "absent" }, { "backing_device": "/dev/sda1", "name": "luks-35e15af6-60a8-41b2-94e6-05460ad82662", "password": "-", "state": "present" } ], "leaves": [ "/dev/sdb", "/dev/sdc", "/dev/sdd", "/dev/sde", "/dev/sdf", "/dev/sdg", "/dev/sdh", "/dev/sdi", "/dev/xvda1", "/dev/mapper/luks-35e15af6-60a8-41b2-94e6-05460ad82662" ], "mounts": [ { "fstype": "xfs", "path": "/opt/test1", "src": "/dev/mapper/luks-8c8d336a-42af-4ee5-a371-3bf9cd16f56b", "state": "absent" }, { "dump": 0, "fstype": "xfs", "group": null, "mode": null, "opts": "defaults", "owner": null, "passno": 0, "path": "/opt/test1", "src": "/dev/mapper/luks-35e15af6-60a8-41b2-94e6-05460ad82662", "state": "mounted" } ], "packages": [ "cryptsetup", "xfsprogs", "e2fsprogs" ], "pools": [ { "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_clevis_pin": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "encryption_tang_thumbprint": null, "encryption_tang_url": null, "grow_to_fill": false, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "shared": false, "state": "present", "type": "partition", "volumes": [ { "_device": "/dev/mapper/luks-35e15af6-60a8-41b2-94e6-05460ad82662", "_kernel_device": "/dev/dm-0", "_mount_id": "/dev/mapper/luks-35e15af6-60a8-41b2-94e6-05460ad82662", "_raw_device": "/dev/sda1", "_raw_kernel_device": "/dev/sda1", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [], "encryption": true, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": "luks2", "encryption_password": "VALUE_SPECIFIED_IN_NO_LOG_PARAMETER", "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "mount_user": null, "name": "test1", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": "4g", "state": "present", "thin": false, "thin_pool_name": null, "thin_pool_size": null, "type": "partition", "vdo_pool_size": null } ] } ], "volumes": [] } TASK [fedora.linux_system_roles.storage : Workaround for udev issue on some platforms] *** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:83 Friday 17 January 2025 04:13:50 -0500 (0:00:13.993) 0:03:43.315 ******** skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Check if /etc/fstab is present] ****** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:90 Friday 17 January 2025 04:13:50 -0500 (0:00:00.088) 0:03:43.404 ******** ok: [managed-node1] => { "changed": false, "stat": { "atime": 1737105173.8648953, "attr_flags": "e", "attributes": [ "extents" ], "block_size": 4096, "blocks": 8, "charset": "us-ascii", "checksum": "386fdb4bd01584d50a85e3e40259b18d82eddb2e", "ctime": 1737105173.8618953, "dev": 51713, "device_type": 0, 
"executable": false, "exists": true, "gid": 0, "gr_name": "root", "inode": 263644, "isblk": false, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": true, "issock": false, "isuid": false, "mimetype": "text/plain", "mode": "0644", "mtime": 1737105173.8618953, "nlink": 1, "path": "/etc/fstab", "pw_name": "root", "readable": true, "rgrp": true, "roth": true, "rusr": true, "size": 1311, "uid": 0, "version": "18446744072031193646", "wgrp": false, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [fedora.linux_system_roles.storage : Add fingerprint to /etc/fstab if present] *** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:95 Friday 17 January 2025 04:13:51 -0500 (0:00:00.489) 0:03:43.893 ******** ok: [managed-node1] => { "backup": "", "changed": false } TASK [fedora.linux_system_roles.storage : Unmask the systemd cryptsetup services] *** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:113 Friday 17 January 2025 04:13:51 -0500 (0:00:00.388) 0:03:44.282 ******** TASK [fedora.linux_system_roles.storage : Show blivet_output] ****************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:119 Friday 17 January 2025 04:13:51 -0500 (0:00:00.040) 0:03:44.322 ******** ok: [managed-node1] => { "blivet_output": { "actions": [ { "action": "destroy format", "device": "/dev/mapper/luks-8c8d336a-42af-4ee5-a371-3bf9cd16f56b", "fs_type": "xfs" }, { "action": "destroy device", "device": "/dev/mapper/luks-8c8d336a-42af-4ee5-a371-3bf9cd16f56b", "fs_type": null }, { "action": "destroy format", "device": "/dev/sda", "fs_type": "luks" }, { "action": "create format", "device": "/dev/sda", "fs_type": "disklabel" }, { "action": "create device", "device": "/dev/sda1", "fs_type": null }, { "action": "create format", "device": "/dev/sda1", "fs_type": "luks" }, { "action": "create device", "device": "/dev/mapper/luks-35e15af6-60a8-41b2-94e6-05460ad82662", "fs_type": null }, { "action": "create format", "device": "/dev/mapper/luks-35e15af6-60a8-41b2-94e6-05460ad82662", "fs_type": "xfs" } ], "changed": true, "crypts": [ { "backing_device": "/dev/sda", "name": "luks-8c8d336a-42af-4ee5-a371-3bf9cd16f56b", "password": "-", "state": "absent" }, { "backing_device": "/dev/sda1", "name": "luks-35e15af6-60a8-41b2-94e6-05460ad82662", "password": "-", "state": "present" } ], "failed": false, "leaves": [ "/dev/sdb", "/dev/sdc", "/dev/sdd", "/dev/sde", "/dev/sdf", "/dev/sdg", "/dev/sdh", "/dev/sdi", "/dev/xvda1", "/dev/mapper/luks-35e15af6-60a8-41b2-94e6-05460ad82662" ], "mounts": [ { "fstype": "xfs", "path": "/opt/test1", "src": "/dev/mapper/luks-8c8d336a-42af-4ee5-a371-3bf9cd16f56b", "state": "absent" }, { "dump": 0, "fstype": "xfs", "group": null, "mode": null, "opts": "defaults", "owner": null, "passno": 0, "path": "/opt/test1", "src": "/dev/mapper/luks-35e15af6-60a8-41b2-94e6-05460ad82662", "state": "mounted" } ], "packages": [ "cryptsetup", "xfsprogs", "e2fsprogs" ], "pools": [ { "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_clevis_pin": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "encryption_tang_thumbprint": null, "encryption_tang_url": null, "grow_to_fill": false, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, 
"raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "shared": false, "state": "present", "type": "partition", "volumes": [ { "_device": "/dev/mapper/luks-35e15af6-60a8-41b2-94e6-05460ad82662", "_kernel_device": "/dev/dm-0", "_mount_id": "/dev/mapper/luks-35e15af6-60a8-41b2-94e6-05460ad82662", "_raw_device": "/dev/sda1", "_raw_kernel_device": "/dev/sda1", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [], "encryption": true, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": "luks2", "encryption_password": "VALUE_SPECIFIED_IN_NO_LOG_PARAMETER", "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "mount_user": null, "name": "test1", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": "4g", "state": "present", "thin": false, "thin_pool_name": null, "thin_pool_size": null, "type": "partition", "vdo_pool_size": null } ] } ], "volumes": [] } } TASK [fedora.linux_system_roles.storage : Set the list of pools for test verification] *** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:128 Friday 17 January 2025 04:13:51 -0500 (0:00:00.067) 0:03:44.389 ******** ok: [managed-node1] => { "ansible_facts": { "_storage_pools_list": [ { "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_clevis_pin": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "encryption_tang_thumbprint": null, "encryption_tang_url": null, "grow_to_fill": false, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "shared": false, "state": "present", "type": "partition", "volumes": [ { "_device": "/dev/mapper/luks-35e15af6-60a8-41b2-94e6-05460ad82662", "_kernel_device": "/dev/dm-0", "_mount_id": "/dev/mapper/luks-35e15af6-60a8-41b2-94e6-05460ad82662", "_raw_device": "/dev/sda1", "_raw_kernel_device": "/dev/sda1", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [], "encryption": true, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": "luks2", "encryption_password": "VALUE_SPECIFIED_IN_NO_LOG_PARAMETER", "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "mount_user": null, "name": "test1", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": "4g", "state": "present", "thin": false, "thin_pool_name": null, "thin_pool_size": null, "type": "partition", "vdo_pool_size": null } ] } ] }, "changed": false } TASK [fedora.linux_system_roles.storage : Set the list of volumes for test verification] *** task path: 
/tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:132 Friday 17 January 2025 04:13:51 -0500 (0:00:00.054) 0:03:44.443 ******** ok: [managed-node1] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false } TASK [fedora.linux_system_roles.storage : Remove obsolete mounts] ************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:148 Friday 17 January 2025 04:13:51 -0500 (0:00:00.041) 0:03:44.485 ******** changed: [managed-node1] => (item={u'src': u'/dev/mapper/luks-8c8d336a-42af-4ee5-a371-3bf9cd16f56b', u'state': u'absent', u'fstype': u'xfs', u'path': u'/opt/test1'}) => { "ansible_loop_var": "mount_info", "changed": true, "dump": "0", "fstab": "/etc/fstab", "fstype": "xfs", "mount_info": { "fstype": "xfs", "path": "/opt/test1", "src": "/dev/mapper/luks-8c8d336a-42af-4ee5-a371-3bf9cd16f56b", "state": "absent" }, "name": "/opt/test1", "opts": "defaults", "passno": "0", "src": "/dev/mapper/luks-8c8d336a-42af-4ee5-a371-3bf9cd16f56b" } TASK [fedora.linux_system_roles.storage : Tell systemd to refresh its view of /etc/fstab] *** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:159 Friday 17 January 2025 04:13:52 -0500 (0:00:00.466) 0:03:44.952 ******** ok: [managed-node1] => { "changed": false, "name": null, "status": {} } TASK [fedora.linux_system_roles.storage : Set up new/current mounts] *********** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:164 Friday 17 January 2025 04:13:53 -0500 (0:00:00.771) 0:03:45.723 ******** changed: [managed-node1] => (item={u'src': u'/dev/mapper/luks-35e15af6-60a8-41b2-94e6-05460ad82662', u'group': None, u'dump': 0, u'passno': 0, u'fstype': u'xfs', u'state': u'mounted', u'mode': None, u'owner': None, u'path': u'/opt/test1', u'opts': u'defaults'}) => { "ansible_loop_var": "mount_info", "changed": true, "dump": "0", "fstab": "/etc/fstab", "fstype": "xfs", "mount_info": { "dump": 0, "fstype": "xfs", "group": null, "mode": null, "opts": "defaults", "owner": null, "passno": 0, "path": "/opt/test1", "src": "/dev/mapper/luks-35e15af6-60a8-41b2-94e6-05460ad82662", "state": "mounted" }, "name": "/opt/test1", "opts": "defaults", "passno": "0", "src": "/dev/mapper/luks-35e15af6-60a8-41b2-94e6-05460ad82662" } TASK [fedora.linux_system_roles.storage : Manage mount ownership/permissions] *** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:175 Friday 17 January 2025 04:13:53 -0500 (0:00:00.662) 0:03:46.386 ******** skipping: [managed-node1] => (item={u'src': u'/dev/mapper/luks-35e15af6-60a8-41b2-94e6-05460ad82662', u'group': None, u'dump': 0, u'passno': 0, u'fstype': u'xfs', u'state': u'mounted', u'mode': None, u'owner': None, u'path': u'/opt/test1', u'opts': u'defaults'}) => { "ansible_loop_var": "mount_info", "changed": false, "mount_info": { "dump": 0, "fstype": "xfs", "group": null, "mode": null, "opts": "defaults", "owner": null, "passno": 0, "path": "/opt/test1", "src": "/dev/mapper/luks-35e15af6-60a8-41b2-94e6-05460ad82662", "state": "mounted" }, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Tell systemd to refresh its view of /etc/fstab] *** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:187 Friday 17 January 2025 04:13:53 -0500 
(0:00:00.098) 0:03:46.484 ******** ok: [managed-node1] => { "changed": false, "name": null, "status": {} } TASK [fedora.linux_system_roles.storage : Retrieve facts for the /etc/crypttab file] *** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:195 Friday 17 January 2025 04:13:54 -0500 (0:00:00.560) 0:03:47.044 ******** ok: [managed-node1] => { "changed": false, "stat": { "atime": 1737105177.6548963, "attr_flags": "e", "attributes": [ "extents" ], "block_size": 4096, "blocks": 8, "charset": "us-ascii", "checksum": "cd89e9295b991324eddbfd9bbfdc013351afb933", "ctime": 1737105175.3718958, "dev": 51713, "device_type": 0, "executable": false, "exists": true, "gid": 0, "gr_name": "root", "inode": 263646, "isblk": false, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": true, "issock": false, "isuid": false, "mimetype": "text/plain", "mode": "0600", "mtime": 1737105175.3708959, "nlink": 1, "path": "/etc/crypttab", "pw_name": "root", "readable": true, "rgrp": false, "roth": false, "rusr": true, "size": 53, "uid": 0, "version": "18446744072031193994", "wgrp": false, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [fedora.linux_system_roles.storage : Manage /etc/crypttab to account for changes we just made] *** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:200 Friday 17 January 2025 04:13:54 -0500 (0:00:00.600) 0:03:47.645 ******** changed: [managed-node1] => (item={u'state': u'absent', u'password': u'-', u'name': u'luks-8c8d336a-42af-4ee5-a371-3bf9cd16f56b', u'backing_device': u'/dev/sda'}) => { "ansible_loop_var": "entry", "backup": "", "changed": true, "entry": { "backing_device": "/dev/sda", "name": "luks-8c8d336a-42af-4ee5-a371-3bf9cd16f56b", "password": "-", "state": "absent" }, "found": 1 } MSG: 1 line(s) removed changed: [managed-node1] => (item={u'state': u'present', u'password': u'-', u'name': u'luks-35e15af6-60a8-41b2-94e6-05460ad82662', u'backing_device': u'/dev/sda1'}) => { "ansible_loop_var": "entry", "backup": "", "changed": true, "entry": { "backing_device": "/dev/sda1", "name": "luks-35e15af6-60a8-41b2-94e6-05460ad82662", "password": "-", "state": "present" } } MSG: line added TASK [fedora.linux_system_roles.storage : Update facts] ************************ task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:222 Friday 17 January 2025 04:13:55 -0500 (0:00:00.858) 0:03:48.504 ******** ok: [managed-node1] TASK [Verify role results] ***************************************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/tests_luks2.yml:241 Friday 17 January 2025 04:13:56 -0500 (0:00:00.853) 0:03:49.358 ******** included: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml for managed-node1 TASK [Print out pool information] ********************************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:2 Friday 17 January 2025 04:13:56 -0500 (0:00:00.155) 0:03:49.513 ******** ok: [managed-node1] => { "_storage_pools_list": [ { "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_clevis_pin": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, 
"encryption_password": null, "encryption_tang_thumbprint": null, "encryption_tang_url": null, "grow_to_fill": false, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "shared": false, "state": "present", "type": "partition", "volumes": [ { "_device": "/dev/mapper/luks-35e15af6-60a8-41b2-94e6-05460ad82662", "_kernel_device": "/dev/dm-0", "_mount_id": "/dev/mapper/luks-35e15af6-60a8-41b2-94e6-05460ad82662", "_raw_device": "/dev/sda1", "_raw_kernel_device": "/dev/sda1", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [], "encryption": true, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": "luks2", "encryption_password": "VALUE_SPECIFIED_IN_NO_LOG_PARAMETER", "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "mount_user": null, "name": "test1", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": "4g", "state": "present", "thin": false, "thin_pool_name": null, "thin_pool_size": null, "type": "partition", "vdo_pool_size": null } ] } ] } TASK [Print out volume information] ******************************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:7 Friday 17 January 2025 04:13:56 -0500 (0:00:00.071) 0:03:49.584 ******** skipping: [managed-node1] => {} TASK [Collect info about the volumes.] 
***************************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:15 Friday 17 January 2025 04:13:56 -0500 (0:00:00.053) 0:03:49.638 ******** ok: [managed-node1] => { "changed": false, "info": { "/dev/mapper/luks-35e15af6-60a8-41b2-94e6-05460ad82662": { "fstype": "xfs", "label": "", "mountpoint": "/opt/test1", "name": "/dev/mapper/luks-35e15af6-60a8-41b2-94e6-05460ad82662", "size": "10G", "type": "crypt", "uuid": "0b911e7b-4467-44f4-83b9-5fef05dcc1e6" }, "/dev/sda": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sda", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sda1": { "fstype": "crypto_LUKS", "label": "", "mountpoint": "", "name": "/dev/sda1", "size": "10G", "type": "partition", "uuid": "35e15af6-60a8-41b2-94e6-05460ad82662" }, "/dev/sdb": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdb", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdc": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdc", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdd": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdd", "size": "1T", "type": "disk", "uuid": "" }, "/dev/sde": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sde", "size": "1T", "type": "disk", "uuid": "" }, "/dev/sdf": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdf", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdg": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdg", "size": "1T", "type": "disk", "uuid": "" }, "/dev/sdh": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdh", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdi": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdi", "size": "10G", "type": "disk", "uuid": "" }, "/dev/xvda": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/xvda", "size": "250G", "type": "disk", "uuid": "" }, "/dev/xvda1": { "fstype": "ext4", "label": "", "mountpoint": "/", "name": "/dev/xvda1", "size": "250G", "type": "partition", "uuid": "c7b7d6a5-fd01-4b9b-bcca-153eaff9d312" } } } TASK [Read the /etc/fstab file for volume existence] *************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:20 Friday 17 January 2025 04:13:57 -0500 (0:00:00.408) 0:03:50.047 ******** ok: [managed-node1] => { "changed": false, "cmd": [ "cat", "/etc/fstab" ], "delta": "0:00:00.003000", "end": "2025-01-17 04:13:57.658965", "rc": 0, "start": "2025-01-17 04:13:57.655965" } STDOUT: # system_role:storage # # /etc/fstab # Created by anaconda on Thu Jun 20 10:23:46 2024 # # Accessible filesystems, by reference, are maintained under '/dev/disk' # See man pages fstab(5), findfs(8), mount(8) and/or blkid(8) for more info # UUID=c7b7d6a5-fd01-4b9b-bcca-153eaff9d312 / ext4 defaults 1 1 ntap-bos-c01-eng01-nfs01b.storage.bos.redhat.com:/devops_engineering_nfs/devarchive/redhat /mnt/redhat nfs ro,rsize=32768,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0 nest.test.redhat.com:/mnt/qa /mnt/qa nfs defaults,rsize=8192,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0 vtap-eng01.storage.rdu2.redhat.com:/vol/engarchive /mnt/engarchive nfs ro,rsize=32768,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0 nest.test.redhat.com:/mnt/tpsdist /mnt/tpsdist nfs defaults,rsize=8192,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0 
ntap-bos-c01-eng01-nfs01b.storage.bos.redhat.com:/devops_engineering_nfs/devarchive/redhat/brewroot /mnt/brew nfs ro,rsize=32768,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0 ntap-bos-c01-eng01-nfs01b.storage.bos.redhat.com:/devops_brew_scratch_nfs/scratch /mnt/brew_scratch nfs ro,rsize=32768,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0 /dev/mapper/luks-35e15af6-60a8-41b2-94e6-05460ad82662 /opt/test1 xfs defaults 0 0 TASK [Read the /etc/crypttab file] ********************************************* task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:25 Friday 17 January 2025 04:13:57 -0500 (0:00:00.390) 0:03:50.438 ******** ok: [managed-node1] => { "changed": false, "cmd": [ "cat", "/etc/crypttab" ], "delta": "0:00:00.002757", "end": "2025-01-17 04:13:58.038735", "failed_when_result": false, "rc": 0, "start": "2025-01-17 04:13:58.035978" } STDOUT: luks-35e15af6-60a8-41b2-94e6-05460ad82662 /dev/sda1 - TASK [Verify the volumes listed in storage_pools were correctly managed] ******* task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:34 Friday 17 January 2025 04:13:58 -0500 (0:00:00.389) 0:03:50.827 ******** included: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool.yml for managed-node1 TASK [Set _storage_pool_tests] ************************************************* task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool.yml:5 Friday 17 January 2025 04:13:58 -0500 (0:00:00.125) 0:03:50.953 ******** ok: [managed-node1] => { "ansible_facts": { "_storage_pool_tests": [ "members", "volumes" ] }, "changed": false } TASK [Get VG shared value status] ********************************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool.yml:18 Friday 17 January 2025 04:13:58 -0500 (0:00:00.054) 0:03:51.007 ******** skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify that VG shared value checks out] ********************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool.yml:24 Friday 17 January 2025 04:13:58 -0500 (0:00:00.076) 0:03:51.083 ******** skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify pool subset] ****************************************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool.yml:34 Friday 17 January 2025 04:13:58 -0500 (0:00:00.127) 0:03:51.210 ******** included: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml for managed-node1 included: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-volumes.yml for managed-node1 TASK [Set test variables] ****************************************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:2 Friday 17 January 2025 04:13:58 -0500 (0:00:00.132) 0:03:51.343 ******** skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Get the canonical device path for each member device] ******************** task path: 
/tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:8 Friday 17 January 2025 04:13:58 -0500 (0:00:00.059) 0:03:51.403 ******** skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set pvs lvm length] ****************************************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:17 Friday 17 January 2025 04:13:58 -0500 (0:00:00.069) 0:03:51.472 ******** skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set pool pvs] ************************************************************ task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:22 Friday 17 January 2025 04:13:58 -0500 (0:00:00.063) 0:03:51.536 ******** skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify PV count] ********************************************************* task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:27 Friday 17 January 2025 04:13:58 -0500 (0:00:00.057) 0:03:51.593 ******** skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set expected pv type] **************************************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:36 Friday 17 January 2025 04:13:58 -0500 (0:00:00.057) 0:03:51.651 ******** skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set expected pv type] **************************************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:41 Friday 17 January 2025 04:13:59 -0500 (0:00:00.057) 0:03:51.709 ******** skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set expected pv type] **************************************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:46 Friday 17 January 2025 04:13:59 -0500 (0:00:00.060) 0:03:51.769 ******** skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check the type of each PV] *********************************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:51 Friday 17 January 2025 04:13:59 -0500 (0:00:00.055) 0:03:51.825 ******** skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check that blivet supports PV grow to fill] ****************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:64 Friday 17 January 2025 04:13:59 -0500 (0:00:00.057) 0:03:51.883 ******** ok: [managed-node1] => { "changed": false, "rc": 0 } STDOUT: False STDERR: Shared connection to 10.31.46.65 closed. 
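For reference, the "Manage the pools and volumes to match the specified state" result above corresponds to a storage_pools specification along the following lines. This is a minimal sketch rather than the test's literal invocation; every pool/volume key shown here appears verbatim in the logged output, but the luks_password variable name is an assumption (the real test supplies the passphrase through a no_log parameter):

    - hosts: managed-node1
      become: true
      vars:
        luks_password: CHANGEME   # assumed variable name; never put a real passphrase in logs
      roles:
        - role: fedora.linux_system_roles.storage
          vars:
            storage_pools:
              - name: foo
                type: partition
                disks:
                  - sda
                volumes:
                  - name: test1
                    type: partition
                    size: 4g
                    fs_type: xfs
                    mount_point: /opt/test1
                    mount_options: defaults
                    encryption: true
                    encryption_luks_version: luks2
                    encryption_password: "{{ luks_password }}"

Applied to a disk that already carries a whole-disk LUKS volume, a spec like this produces the action sequence logged above: the old mapping is destroyed, a disklabel and /dev/sda1 are created, and a new LUKS2 container with an XFS filesystem is opened and mounted at /opt/test1, with /etc/fstab and /etc/crypttab updated to match.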
TASK [Verify that PVs fill the whole devices when they should] ***************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:73 Friday 17 January 2025 04:13:59 -0500 (0:00:00.284) 0:03:52.167 ******** TASK [Check MD RAID] *********************************************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:83 Friday 17 January 2025 04:13:59 -0500 (0:00:00.053) 0:03:52.221 ******** included: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml for managed-node1 TASK [Get information about RAID] ********************************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:8 Friday 17 January 2025 04:13:59 -0500 (0:00:00.127) 0:03:52.348 ******** skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set active devices regex] ************************************************ task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:14 Friday 17 January 2025 04:13:59 -0500 (0:00:00.060) 0:03:52.408 ******** skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set spare devices regex] ************************************************* task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:19 Friday 17 January 2025 04:13:59 -0500 (0:00:00.063) 0:03:52.471 ******** skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set md version regex] **************************************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:24 Friday 17 January 2025 04:13:59 -0500 (0:00:00.080) 0:03:52.551 ******** skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set md chunk size regex] ************************************************* task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:29 Friday 17 January 2025 04:13:59 -0500 (0:00:00.059) 0:03:52.611 ******** skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Parse the chunk size] **************************************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:37 Friday 17 January 2025 04:13:59 -0500 (0:00:00.056) 0:03:52.668 ******** skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check RAID active devices count] ***************************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:46 Friday 17 January 2025 04:14:00 -0500 (0:00:00.100) 0:03:52.768 ******** skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check RAID spare devices count] ****************************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:55 Friday 17 January 2025 04:14:00 -0500 (0:00:00.123) 0:03:52.891 ******** skipping: [managed-node1] => { "changed": false, 
"skip_reason": "Conditional result was False" } TASK [Check RAID metadata version] ********************************************* task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:64 Friday 17 January 2025 04:14:00 -0500 (0:00:00.065) 0:03:52.957 ******** skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check RAID chunk size] *************************************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:74 Friday 17 January 2025 04:14:00 -0500 (0:00:00.066) 0:03:53.024 ******** skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Reset variables used by tests] ******************************************* task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:83 Friday 17 January 2025 04:14:00 -0500 (0:00:00.086) 0:03:53.111 ******** ok: [managed-node1] => { "ansible_facts": { "storage_test_md_active_devices_re": null, "storage_test_md_chunk_size_re": null, "storage_test_md_metadata_version_re": null, "storage_test_md_spare_devices_re": null }, "changed": false } TASK [Check LVM RAID] ********************************************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:86 Friday 17 January 2025 04:14:00 -0500 (0:00:00.085) 0:03:53.196 ******** included: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-lvmraid.yml for managed-node1 TASK [Validate pool member LVM RAID settings] ********************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-lvmraid.yml:2 Friday 17 January 2025 04:14:00 -0500 (0:00:00.130) 0:03:53.327 ******** skipping: [managed-node1] => (item={u'_raw_device': u'/dev/sda1', u'raid_metadata_version': None, u'mount_device_identifier': u'uuid', u'fs_type': u'xfs', u'mount_options': u'defaults', u'_device': u'/dev/mapper/luks-35e15af6-60a8-41b2-94e6-05460ad82662', u'size': u'4g', u'mount_point': u'/opt/test1', u'compression': None, u'encryption_password': u'VALUE_SPECIFIED_IN_NO_LOG_PARAMETER', u'_kernel_device': u'/dev/dm-0', u'encryption': True, u'raid_level': None, u'name': u'test1', u'raid_device_count': None, u'state': u'present', u'vdo_pool_size': None, u'mount_mode': None, u'thin_pool_name': None, u'type': u'partition', u'encryption_key_size': None, u'encryption_cipher': None, u'encryption_key': None, u'fs_label': u'', u'encryption_luks_version': u'luks2', u'raid_stripe_size': None, u'mount_passno': 0, u'_mount_id': u'/dev/mapper/luks-35e15af6-60a8-41b2-94e6-05460ad82662', u'mount_user': None, u'raid_spare_count': None, u'raid_disks': [], u'_raw_kernel_device': u'/dev/sda1', u'cache_mode': None, u'cache_devices': [], u'deduplication': None, u'mount_group': None, u'thin_pool_size': None, u'disks': [], u'cached': False, u'thin': False, u'mount_check': 0, u'cache_size': 0, u'raid_chunk_size': None, u'fs_overwrite_existing': True, u'fs_create_options': u''}) => { "ansible_loop_var": "storage_test_lvmraid_volume", "changed": false, "skip_reason": "Conditional result was False", "storage_test_lvmraid_volume": { "_device": "/dev/mapper/luks-35e15af6-60a8-41b2-94e6-05460ad82662", "_kernel_device": "/dev/dm-0", "_mount_id": 
"/dev/mapper/luks-35e15af6-60a8-41b2-94e6-05460ad82662", "_raw_device": "/dev/sda1", "_raw_kernel_device": "/dev/sda1", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [], "encryption": true, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": "luks2", "encryption_password": "VALUE_SPECIFIED_IN_NO_LOG_PARAMETER", "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "mount_user": null, "name": "test1", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": "4g", "state": "present", "thin": false, "thin_pool_name": null, "thin_pool_size": null, "type": "partition", "vdo_pool_size": null } } TASK [Check Thin Pools] ******************************************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:89 Friday 17 January 2025 04:14:00 -0500 (0:00:00.079) 0:03:53.406 ******** included: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-thin.yml for managed-node1 TASK [Validate pool member thinpool settings] ********************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-thin.yml:2 Friday 17 January 2025 04:14:00 -0500 (0:00:00.124) 0:03:53.531 ******** skipping: [managed-node1] => (item={u'_raw_device': u'/dev/sda1', u'raid_metadata_version': None, u'mount_device_identifier': u'uuid', u'fs_type': u'xfs', u'mount_options': u'defaults', u'_device': u'/dev/mapper/luks-35e15af6-60a8-41b2-94e6-05460ad82662', u'size': u'4g', u'mount_point': u'/opt/test1', u'compression': None, u'encryption_password': u'VALUE_SPECIFIED_IN_NO_LOG_PARAMETER', u'_kernel_device': u'/dev/dm-0', u'encryption': True, u'raid_level': None, u'name': u'test1', u'raid_device_count': None, u'state': u'present', u'vdo_pool_size': None, u'mount_mode': None, u'thin_pool_name': None, u'type': u'partition', u'encryption_key_size': None, u'encryption_cipher': None, u'encryption_key': None, u'fs_label': u'', u'encryption_luks_version': u'luks2', u'raid_stripe_size': None, u'mount_passno': 0, u'_mount_id': u'/dev/mapper/luks-35e15af6-60a8-41b2-94e6-05460ad82662', u'mount_user': None, u'raid_spare_count': None, u'raid_disks': [], u'_raw_kernel_device': u'/dev/sda1', u'cache_mode': None, u'cache_devices': [], u'deduplication': None, u'mount_group': None, u'thin_pool_size': None, u'disks': [], u'cached': False, u'thin': False, u'mount_check': 0, u'cache_size': 0, u'raid_chunk_size': None, u'fs_overwrite_existing': True, u'fs_create_options': u''}) => { "ansible_loop_var": "storage_test_thin_volume", "changed": false, "skip_reason": "Conditional result was False", "storage_test_thin_volume": { "_device": "/dev/mapper/luks-35e15af6-60a8-41b2-94e6-05460ad82662", "_kernel_device": "/dev/dm-0", "_mount_id": "/dev/mapper/luks-35e15af6-60a8-41b2-94e6-05460ad82662", "_raw_device": "/dev/sda1", "_raw_kernel_device": "/dev/sda1", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [], 
"encryption": true, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": "luks2", "encryption_password": "VALUE_SPECIFIED_IN_NO_LOG_PARAMETER", "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "mount_user": null, "name": "test1", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": "4g", "state": "present", "thin": false, "thin_pool_name": null, "thin_pool_size": null, "type": "partition", "vdo_pool_size": null } } TASK [Check member encryption] ************************************************* task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:92 Friday 17 January 2025 04:14:00 -0500 (0:00:00.078) 0:03:53.610 ******** included: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-encryption.yml for managed-node1 TASK [Set test variables] ****************************************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-encryption.yml:5 Friday 17 January 2025 04:14:01 -0500 (0:00:00.187) 0:03:53.797 ******** ok: [managed-node1] => { "ansible_facts": { "_storage_test_expected_crypttab_entries": "0", "_storage_test_expected_crypttab_key_file": "-" }, "changed": false } TASK [Validate pool member LUKS settings] ************************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-encryption.yml:10 Friday 17 January 2025 04:14:01 -0500 (0:00:00.060) 0:03:53.857 ******** skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Validate pool member crypttab entries] *********************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-encryption.yml:17 Friday 17 January 2025 04:14:01 -0500 (0:00:00.046) 0:03:53.904 ******** skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Clear test variables] **************************************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-encryption.yml:24 Friday 17 January 2025 04:14:01 -0500 (0:00:00.040) 0:03:53.944 ******** ok: [managed-node1] => { "ansible_facts": { "_storage_test_crypttab_entries": null, "_storage_test_crypttab_key_file": null }, "changed": false } TASK [Check VDO] *************************************************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:95 Friday 17 January 2025 04:14:01 -0500 (0:00:00.039) 0:03:53.984 ******** included: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-vdo.yml for managed-node1 TASK [Validate pool member VDO settings] *************************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-vdo.yml:2 Friday 17 January 2025 04:14:01 -0500 (0:00:00.085) 0:03:54.069 
******** skipping: [managed-node1] => (item={u'_raw_device': u'/dev/sda1', u'raid_metadata_version': None, u'mount_device_identifier': u'uuid', u'fs_type': u'xfs', u'mount_options': u'defaults', u'_device': u'/dev/mapper/luks-35e15af6-60a8-41b2-94e6-05460ad82662', u'size': u'4g', u'mount_point': u'/opt/test1', u'compression': None, u'encryption_password': u'VALUE_SPECIFIED_IN_NO_LOG_PARAMETER', u'_kernel_device': u'/dev/dm-0', u'encryption': True, u'raid_level': None, u'name': u'test1', u'raid_device_count': None, u'state': u'present', u'vdo_pool_size': None, u'mount_mode': None, u'thin_pool_name': None, u'type': u'partition', u'encryption_key_size': None, u'encryption_cipher': None, u'encryption_key': None, u'fs_label': u'', u'encryption_luks_version': u'luks2', u'raid_stripe_size': None, u'mount_passno': 0, u'_mount_id': u'/dev/mapper/luks-35e15af6-60a8-41b2-94e6-05460ad82662', u'mount_user': None, u'raid_spare_count': None, u'raid_disks': [], u'_raw_kernel_device': u'/dev/sda1', u'cache_mode': None, u'cache_devices': [], u'deduplication': None, u'mount_group': None, u'thin_pool_size': None, u'disks': [], u'cached': False, u'thin': False, u'mount_check': 0, u'cache_size': 0, u'raid_chunk_size': None, u'fs_overwrite_existing': True, u'fs_create_options': u''}) => { "ansible_loop_var": "storage_test_vdo_volume", "changed": false, "skip_reason": "Conditional result was False", "storage_test_vdo_volume": { "_device": "/dev/mapper/luks-35e15af6-60a8-41b2-94e6-05460ad82662", "_kernel_device": "/dev/dm-0", "_mount_id": "/dev/mapper/luks-35e15af6-60a8-41b2-94e6-05460ad82662", "_raw_device": "/dev/sda1", "_raw_kernel_device": "/dev/sda1", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [], "encryption": true, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": "luks2", "encryption_password": "VALUE_SPECIFIED_IN_NO_LOG_PARAMETER", "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "mount_user": null, "name": "test1", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": "4g", "state": "present", "thin": false, "thin_pool_name": null, "thin_pool_size": null, "type": "partition", "vdo_pool_size": null } } TASK [Check Stratis] *********************************************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:98 Friday 17 January 2025 04:14:01 -0500 (0:00:00.059) 0:03:54.129 ******** included: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-stratis.yml for managed-node1 TASK [Run 'stratis report'] **************************************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-stratis.yml:6 Friday 17 January 2025 04:14:01 -0500 (0:00:00.196) 0:03:54.325 ******** skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Get information about Stratis] ******************************************* task path: 
/tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-stratis.yml:11 Friday 17 January 2025 04:14:01 -0500 (0:00:00.077) 0:03:54.402 ******** skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify that the pool was created] **************************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-stratis.yml:15 Friday 17 January 2025 04:14:01 -0500 (0:00:00.103) 0:03:54.505 ******** skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify that encryption is correctly set] ********************************* task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-stratis.yml:25 Friday 17 January 2025 04:14:01 -0500 (0:00:00.099) 0:03:54.605 ******** skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify that Clevis/Tang encryption is correctly set] ********************* task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-stratis.yml:34 Friday 17 January 2025 04:14:01 -0500 (0:00:00.075) 0:03:54.680 ******** skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Reset variable used by test] ********************************************* task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-stratis.yml:44 Friday 17 January 2025 04:14:02 -0500 (0:00:00.142) 0:03:54.823 ******** ok: [managed-node1] => { "ansible_facts": { "storage_test_stratis_report": null }, "changed": false } TASK [Clean up test variables] ************************************************* task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:101 Friday 17 January 2025 04:14:02 -0500 (0:00:00.099) 0:03:54.936 ******** ok: [managed-node1] => { "ansible_facts": { "__pvs_lvm_len": null, "_storage_test_expected_pv_count": null, "_storage_test_expected_pv_type": null, "_storage_test_pool_pvs": [], "_storage_test_pool_pvs_lvm": [] }, "changed": false } TASK [Verify the volumes] ****************************************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-volumes.yml:3 Friday 17 January 2025 04:14:02 -0500 (0:00:00.123) 0:03:55.035 ******** included: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume.yml for managed-node1 TASK [Set storage volume test variables] *************************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume.yml:2 Friday 17 January 2025 04:14:02 -0500 (0:00:00.090) 0:03:55.158 ******** ok: [managed-node1] => { "ansible_facts": { "_storage_test_volume_present": true, "_storage_volume_tests": [ "mount", "fstab", "fs", "device", "encryption", "md", "size", "cache" ] }, "changed": false } TASK [Run test verify for {{ storage_test_volume_subset }}] ******************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume.yml:19 Friday 17 January 2025 04:14:02 -0500 (0:00:00.090) 0:03:55.249 ******** included: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml for
managed-node1 included: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml for managed-node1 included: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fs.yml for managed-node1 included: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml for managed-node1 included: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml for managed-node1 included: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml for managed-node1 included: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml for managed-node1 included: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml for managed-node1 TASK [Get expected mount device based on device type] ************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:7 Friday 17 January 2025 04:14:03 -0500 (0:00:00.460) 0:03:55.709 ******** ok: [managed-node1] => { "ansible_facts": { "storage_test_device_path": "/dev/mapper/luks-35e15af6-60a8-41b2-94e6-05460ad82662" }, "changed": false } TASK [Set some facts] ********************************************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:11 Friday 17 January 2025 04:14:03 -0500 (0:00:00.059) 0:03:55.768 ******** ok: [managed-node1] => { "ansible_facts": { "storage_test_mount_expected_mount_point": "/opt/test1", "storage_test_swap_expected_matches": "0" }, "changed": false } TASK [Get information about the mountpoint directory] ************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:19 Friday 17 January 2025 04:14:03 -0500 (0:00:00.058) 0:03:55.827 ******** skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify the current mount state by device] ******************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:28 Friday 17 January 2025 04:14:03 -0500 (0:00:00.057) 0:03:55.885 ******** ok: [managed-node1] => { "changed": false } MSG: All assertions passed TASK [Verify mount directory user] ********************************************* task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:36 Friday 17 January 2025 04:14:03 -0500 (0:00:00.071) 0:03:55.956 ******** skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify mount directory group] ******************************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:42 Friday 17 January 2025 04:14:03 -0500 (0:00:00.060) 0:03:56.017 ******** skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify mount directory permissions] ************************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:48 Friday 17 January 2025 
04:14:03 -0500 (0:00:00.050) 0:03:56.067 ******** skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Get path of test volume device] ****************************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:57 Friday 17 January 2025 04:14:03 -0500 (0:00:00.046) 0:03:56.114 ******** skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Gather swap info] ******************************************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:63 Friday 17 January 2025 04:14:03 -0500 (0:00:00.036) 0:03:56.151 ******** skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify swap status] ****************************************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:69 Friday 17 January 2025 04:14:03 -0500 (0:00:00.036) 0:03:56.187 ******** skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Unset facts] ************************************************************* task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:79 Friday 17 January 2025 04:14:03 -0500 (0:00:00.050) 0:03:56.238 ******** ok: [managed-node1] => { "ansible_facts": { "storage_test_found_mount_stat": null, "storage_test_mount_expected_mount_point": null, "storage_test_swap_expected_matches": null, "storage_test_swaps": null, "storage_test_sys_node": null }, "changed": false } TASK [Set some variables for fstab checking] *********************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:2 Friday 17 January 2025 04:14:03 -0500 (0:00:00.058) 0:03:56.296 ******** ok: [managed-node1] => { "ansible_facts": { "storage_test_fstab_expected_id_matches": "1", "storage_test_fstab_expected_mount_options_matches": "1", "storage_test_fstab_expected_mount_point_matches": "1", "storage_test_fstab_id_matches": [ "/dev/mapper/luks-35e15af6-60a8-41b2-94e6-05460ad82662 " ], "storage_test_fstab_mount_options_matches": [ " /opt/test1 xfs defaults " ], "storage_test_fstab_mount_point_matches": [ " /opt/test1 " ] }, "changed": false } TASK [Verify that the device identifier appears in /etc/fstab] ***************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:17 Friday 17 January 2025 04:14:03 -0500 (0:00:00.100) 0:03:56.397 ******** ok: [managed-node1] => { "changed": false } MSG: All assertions passed TASK [Verify the fstab mount point] ******************************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:24 Friday 17 January 2025 04:14:03 -0500 (0:00:00.078) 0:03:56.475 ******** ok: [managed-node1] => { "changed": false } MSG: All assertions passed TASK [Verify mount_options] **************************************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:33 Friday 17 January 2025 04:14:03 -0500 (0:00:00.106) 0:03:56.581 ******** skipping: [managed-node1] => { 
"changed": false, "skip_reason": "Conditional result was False" } TASK [Verify fingerprint] ****************************************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:45 Friday 17 January 2025 04:14:03 -0500 (0:00:00.058) 0:03:56.640 ******** ok: [managed-node1] => { "changed": false } MSG: All assertions passed TASK [Clean up variables] ****************************************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:52 Friday 17 January 2025 04:14:04 -0500 (0:00:00.076) 0:03:56.717 ******** ok: [managed-node1] => { "ansible_facts": { "storage_test_fstab_expected_id_matches": null, "storage_test_fstab_expected_mount_options_matches": null, "storage_test_fstab_expected_mount_point_matches": null, "storage_test_fstab_id_matches": null, "storage_test_fstab_mount_options_matches": null, "storage_test_fstab_mount_point_matches": null }, "changed": false } TASK [Verify fs type] ********************************************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fs.yml:6 Friday 17 January 2025 04:14:04 -0500 (0:00:00.060) 0:03:56.777 ******** ok: [managed-node1] => { "changed": false } MSG: All assertions passed TASK [Verify fs label] ********************************************************* task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fs.yml:14 Friday 17 January 2025 04:14:04 -0500 (0:00:00.075) 0:03:56.852 ******** ok: [managed-node1] => { "changed": false } MSG: All assertions passed TASK [See whether the device node is present] ********************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:3 Friday 17 January 2025 04:14:04 -0500 (0:00:00.121) 0:03:56.974 ******** ok: [managed-node1] => { "changed": false, "stat": { "atime": 1737105230.2709384, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 0, "charset": "binary", "ctime": 1737105230.2709384, "dev": 5, "device_type": 2049, "executable": false, "exists": true, "gid": 6, "gr_name": "disk", "inode": 69537, "isblk": true, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": false, "issock": false, "isuid": false, "mimetype": "inode/blockdevice", "mode": "0660", "mtime": 1737105230.2709384, "nlink": 1, "path": "/dev/sda1", "pw_name": "root", "readable": true, "rgrp": true, "roth": false, "rusr": true, "size": 0, "uid": 0, "version": null, "wgrp": true, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [Verify the presence/absence of the device node] ************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:9 Friday 17 January 2025 04:14:04 -0500 (0:00:00.510) 0:03:57.484 ******** ok: [managed-node1] => { "changed": false } MSG: All assertions passed TASK [Verify the presence/absence of the device node] ************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:16 Friday 17 January 2025 04:14:04 -0500 (0:00:00.075) 0:03:57.559 ******** skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" } 
TASK [Make sure we got info about this volume] ********************************* task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:23 Friday 17 January 2025 04:14:04 -0500 (0:00:00.061) 0:03:57.621 ******** ok: [managed-node1] => { "changed": false } MSG: All assertions passed
TASK [Process volume type (set initial value) (1/2)] *************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:29 Friday 17 January 2025 04:14:05 -0500 (0:00:00.077) 0:03:57.699 ******** ok: [managed-node1] => { "ansible_facts": { "st_volume_type": "partition" }, "changed": false }
TASK [Process volume type (get RAID value) (2/2)] ****************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:33 Friday 17 January 2025 04:14:05 -0500 (0:00:00.070) 0:03:57.770 ******** skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" }
TASK [Verify the volume's device type] ***************************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:38 Friday 17 January 2025 04:14:05 -0500 (0:00:00.072) 0:03:57.842 ******** ok: [managed-node1] => { "changed": false } MSG: All assertions passed
TASK [Stat the LUKS device, if encrypted] ************************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:3 Friday 17 January 2025 04:14:05 -0500 (0:00:00.079) 0:03:57.921 ******** ok: [managed-node1] => { "changed": false, "stat": { "atime": 1737105230.3859384, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 0, "charset": "binary", "ctime": 1737105230.3859384, "dev": 5, "device_type": 64768, "executable": false, "exists": true, "gid": 6, "gr_name": "disk", "inode": 69596, "isblk": true, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": false, "issock": false, "isuid": false, "mimetype": "inode/symlink", "mode": "0660", "mtime": 1737105230.3859384, "nlink": 1, "path": "/dev/mapper/luks-35e15af6-60a8-41b2-94e6-05460ad82662", "pw_name": "root", "readable": true, "rgrp": true, "roth": false, "rusr": true, "size": 0, "uid": 0, "version": null, "wgrp": true, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } }
TASK [Ensure cryptsetup is present] ******************************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:10 Friday 17 January 2025 04:14:05 -0500 (0:00:00.384) 0:03:58.306 ******** ok: [managed-node1] => { "changed": false, "rc": 0, "results": [ "cryptsetup-2.0.3-6.el7.x86_64 providing cryptsetup is already installed" ] }
TASK [Collect LUKS info for this volume] *************************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:16 Friday 17 January 2025 04:14:06 -0500 (0:00:00.784) 0:03:59.091 ******** ok: [managed-node1] => { "changed": false, "cmd": [ "cryptsetup", "luksDump", "/dev/sda1" ], "delta": "0:00:00.027478", "end": "2025-01-17 04:14:06.703064", "rc": 0, "start": "2025-01-17 04:14:06.675586" }
STDOUT:
LUKS header information
Version:        2
Epoch:          3
Metadata area:  12288 bytes
UUID:           35e15af6-60a8-41b2-94e6-05460ad82662
Label:          (no label)
Subsystem:      (no subsystem)
Flags:          (no flags)
Data segments:
  0: crypt
        offset: 4194304 [bytes]
        length: (whole device)
        cipher: aes-xts-plain64
        sector: 512 [bytes]
Keyslots:
  0: luks2
        Key:        512 bits
        Priority:   normal
        Cipher:     aes-xts-plain64
        PBKDF:      argon2i
        Time cost:  4
        Memory:     667563
        Threads:    2
        Salt:       58 50 a7 48 1e 71 e7 9f cb 4f 7a e3 63 fd 41 61 09 f3 78 68 d8 83 0d 6b be ba 05 83 10 d2 59 a9
        AF stripes: 4000
        Area offset:32768 [bytes]
        Area length:258048 [bytes]
        Digest ID:  0
Tokens:
Digests:
  0: pbkdf2
        Hash:       sha256
        Iterations: 23011
        Salt:       54 94 c6 d9 2b fd ef d0 44 e8 0c 7a d7 a7 9e 7b 80 c7 bd 17 33 50 c1 e7 ac dd e9 d5 22 b0 2d 82
        Digest:     9f b5 52 09 80 8d 6e 42 e3 2c ff 2b a8 67 33 ee e4 e1 4d f5 73 1b 3a 77 cd f7 c1 2e 2e 29 e3 b1
TASK [Verify the presence/absence of the LUKS device node] ********************* task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:22 Friday 17 January 2025 04:14:06 -0500 (0:00:00.409) 0:03:59.500 ******** ok: [managed-node1] => { "changed": false } MSG: All assertions passed
TASK [Verify that the raw device is the same as the device if not encrypted] *** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:29 Friday 17 January 2025 04:14:06 -0500 (0:00:00.077) 0:03:59.577 ******** ok: [managed-node1] => { "changed": false } MSG: All assertions passed
TASK [Make sure we got info about the LUKS volume if encrypted] **************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:40 Friday 17 January 2025 04:14:06 -0500 (0:00:00.070) 0:03:59.648 ******** ok: [managed-node1] => { "changed": false } MSG: All assertions passed
TASK [Verify the LUKS volume's device type if encrypted] *********************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:46 Friday 17 January 2025 04:14:07 -0500 (0:00:00.077) 0:03:59.726 ******** ok: [managed-node1] => { "changed": false } MSG: All assertions passed
TASK [Check LUKS version] ****************************************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:51 Friday 17 January 2025 04:14:07 -0500 (0:00:00.072) 0:03:59.798 ******** ok: [managed-node1] => { "changed": false } MSG: All assertions passed
TASK [Check LUKS key size] ***************************************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:63 Friday 17 January 2025 04:14:07 -0500 (0:00:00.077) 0:03:59.876 ******** skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" }
TASK [Check LUKS cipher] ******************************************************* task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:75 Friday 17 January 2025 04:14:07 -0500 (0:00:00.064) 0:03:59.940 ******** skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" }
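The luksDump output above is the raw material for the Check LUKS version/key size/cipher tasks. A rough sketch of that kind of check, with a placeholder register name rather than the verbatim tasks of test-verify-volume-encryption.yml:

- name: Collect LUKS info for this volume
  command: cryptsetup luksDump /dev/sda1   # the same command echoed under "cmd" above
  register: luks_dump                      # placeholder register name
  changed_when: false                      # read-only query, never report a change

- name: Check LUKS version
  assert:
    that:
      - luks_dump.stdout is search('Version:\s+2')
    msg: Expected a LUKS2 header on /dev/sda1

The key size (512 bits) and cipher (aes-xts-plain64) checks were skipped in this run because the test did not pin those values; when enabled, they would match against the same dump output.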
TASK [Set test variables] ****************************************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:87 Friday 17 January 2025 04:14:07 -0500 (0:00:00.063) 0:04:00.004 ******** ok: [managed-node1] => { "ansible_facts": { "_storage_test_crypttab_entries": [ "luks-35e15af6-60a8-41b2-94e6-05460ad82662 /dev/sda1 -" ], "_storage_test_expected_crypttab_entries": "1", "_storage_test_expected_crypttab_key_file": "-" }, "changed": false }
TASK [Check for /etc/crypttab entry] ******************************************* task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:93 Friday 17 January 2025 04:14:07 -0500 (0:00:00.159) 0:04:00.164 ******** ok: [managed-node1] => { "changed": false } MSG: All assertions passed
TASK [Validate the format of the crypttab entry] ******************************* task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:100 Friday 17 January 2025 04:14:07 -0500 (0:00:00.079) 0:04:00.244 ******** ok: [managed-node1] => { "changed": false } MSG: All assertions passed
TASK [Check backing device of crypttab entry] ********************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:108 Friday 17 January 2025 04:14:07 -0500 (0:00:00.088) 0:04:00.333 ******** ok: [managed-node1] => { "changed": false } MSG: All assertions passed
TASK [Check key file of crypttab entry] **************************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:116 Friday 17 January 2025 04:14:07 -0500 (0:00:00.171) 0:04:00.504 ******** ok: [managed-node1] => { "changed": false } MSG: All assertions passed
TASK [Clear test variables] **************************************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:124 Friday 17 January 2025 04:14:07 -0500 (0:00:00.123) 0:04:00.628 ******** ok: [managed-node1] => { "ansible_facts": { "_storage_test_crypttab_entries": null, "_storage_test_expected_crypttab_entries": null, "_storage_test_expected_crypttab_key_file": null }, "changed": false }
TASK [Get information about RAID] ********************************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:8 Friday 17 January 2025 04:14:07 -0500 (0:00:00.065) 0:04:00.693 ******** skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" }
TASK [Set active devices regex] ************************************************ task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:14 Friday 17 January 2025 04:14:08 -0500 (0:00:00.079) 0:04:00.773 ******** skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" }
TASK [Set spare devices regex] ************************************************* task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:19 Friday 17 January 2025 04:14:08 -0500 (0:00:00.047) 0:04:00.820 ******** skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" }
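The crypttab tasks a few entries up reduce to a single invariant: /etc/crypttab must contain exactly one line mapping luks-35e15af6-60a8-41b2-94e6-05460ad82662 to /dev/sda1 with no key file ("-"), which is the entry echoed in _storage_test_crypttab_entries. A compact sketch of such a check (illustrative only; the crypttab register name is a placeholder):

- name: Read /etc/crypttab
  command: cat /etc/crypttab
  register: crypttab        # placeholder register name
  changed_when: false       # read-only query

- name: Check for /etc/crypttab entry
  assert:
    that:
      - crypttab.stdout_lines | select('match', 'luks-35e15af6-60a8-41b2-94e6-05460ad82662 /dev/sda1 -$') | list | length == 1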
TASK [Set md version regex] **************************************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:24 Friday 17 January 2025 04:14:08 -0500 (0:00:00.037) 0:04:00.858 ******** skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" }
TASK [Set chunk size regex] **************************************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:29 Friday 17 January 2025 04:14:08 -0500 (0:00:00.037) 0:04:00.895 ******** skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" }
TASK [Parse the chunk size] **************************************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:37 Friday 17 January 2025 04:14:08 -0500 (0:00:00.046) 0:04:00.942 ******** skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" }
TASK [Check RAID active devices count] ***************************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:46 Friday 17 January 2025 04:14:08 -0500 (0:00:00.134) 0:04:01.077 ******** skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" }
TASK [Check RAID spare devices count] ****************************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:54 Friday 17 January 2025 04:14:08 -0500 (0:00:00.064) 0:04:01.142 ******** skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" }
TASK [Check RAID metadata version] ********************************************* task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:62 Friday 17 January 2025 04:14:08 -0500 (0:00:00.091) 0:04:01.233 ******** skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" }
TASK [Check RAID chunk size] *************************************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:70 Friday 17 January 2025 04:14:08 -0500 (0:00:00.066) 0:04:01.300 ******** skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" }
TASK [Parse the actual size of the volume] ************************************* task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:3 Friday 17 January 2025 04:14:08 -0500 (0:00:00.074) 0:04:01.375 ******** skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" }
TASK [Parse the requested size of the volume] ********************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:11 Friday 17 January 2025 04:14:08 -0500 (0:00:00.070) 0:04:01.446 ******** skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" }
TASK [Establish base value for expected size] ********************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:20 Friday 17 January 2025 04:14:08 -0500 (0:00:00.069) 0:04:01.515 ******** skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Show expected size] 
****************************************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:28 Friday 17 January 2025 04:14:08 -0500 (0:00:00.070) 0:04:01.586 ******** ok: [managed-node1] => { "storage_test_expected_size": "VARIABLE IS NOT DEFINED!: 'storage_test_expected_size' is undefined" } TASK [Get the size of parent/pool device] ************************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:32 Friday 17 January 2025 04:14:08 -0500 (0:00:00.082) 0:04:01.668 ******** skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Show test pool] ********************************************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:46 Friday 17 January 2025 04:14:09 -0500 (0:00:00.067) 0:04:01.735 ******** skipping: [managed-node1] => {} TASK [Show test blockinfo] ***************************************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:50 Friday 17 January 2025 04:14:09 -0500 (0:00:00.072) 0:04:01.807 ******** skipping: [managed-node1] => {} TASK [Show test pool size] ***************************************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:54 Friday 17 January 2025 04:14:09 -0500 (0:00:00.069) 0:04:01.877 ******** skipping: [managed-node1] => {} TASK [Calculate the expected size based on pool size and percentage value] ***** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:58 Friday 17 January 2025 04:14:09 -0500 (0:00:00.072) 0:04:01.949 ******** skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Default thin pool reserved space values] ********************************* task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:67 Friday 17 January 2025 04:14:09 -0500 (0:00:00.060) 0:04:02.010 ******** skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Default minimal thin pool reserved space size] *************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:71 Friday 17 January 2025 04:14:09 -0500 (0:00:00.054) 0:04:02.064 ******** skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Default maximal thin pool reserved space size] *************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:76 Friday 17 January 2025 04:14:09 -0500 (0:00:00.064) 0:04:02.129 ******** skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Calculate maximum usable space in thin pool] ***************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:82 Friday 17 January 2025 04:14:09 -0500 (0:00:00.056) 0:04:02.186 ******** skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" } TASK 
[Apply upper size limit to max usable thin pool space] ******************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:86 Friday 17 January 2025 04:14:09 -0500 (0:00:00.056) 0:04:02.242 ******** skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Apply lower size limit to max usable thin pool space] ******************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:91 Friday 17 January 2025 04:14:09 -0500 (0:00:00.060) 0:04:02.303 ******** skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Convert maximum usable thin pool space from int to Size] ***************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:96 Friday 17 January 2025 04:14:09 -0500 (0:00:00.055) 0:04:02.358 ******** skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Show max thin pool size] ************************************************* task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:101 Friday 17 January 2025 04:14:09 -0500 (0:00:00.056) 0:04:02.415 ******** skipping: [managed-node1] => {} TASK [Show volume thin pool size] ********************************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:105 Friday 17 January 2025 04:14:09 -0500 (0:00:00.057) 0:04:02.472 ******** skipping: [managed-node1] => {} TASK [Show test volume size] *************************************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:109 Friday 17 January 2025 04:14:09 -0500 (0:00:00.057) 0:04:02.529 ******** skipping: [managed-node1] => {} TASK [Establish base value for expected thin pool size] ************************ task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:113 Friday 17 January 2025 04:14:09 -0500 (0:00:00.061) 0:04:02.591 ******** skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Calculate the expected size based on pool size and percentage value] ***** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:120 Friday 17 January 2025 04:14:09 -0500 (0:00:00.066) 0:04:02.657 ******** skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Establish base value for expected thin pool volume size] ***************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:127 Friday 17 January 2025 04:14:10 -0500 (0:00:00.072) 0:04:02.730 ******** skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Calculate the expected thin pool volume size based on percentage value] *** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:131 Friday 17 January 2025 04:14:10 -0500 (0:00:00.057) 0:04:02.787 ******** skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" } TASK 
[Replace expected volume size with calculated value] ********************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:137 Friday 17 January 2025 04:14:10 -0500 (0:00:00.057) 0:04:02.845 ******** skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Show actual size] ******************************************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:143 Friday 17 January 2025 04:14:10 -0500 (0:00:00.056) 0:04:02.902 ******** ok: [managed-node1] => { "storage_test_actual_size": { "changed": false, "skip_reason": "Conditional result was False", "skipped": true } } TASK [Show expected size] ****************************************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:147 Friday 17 January 2025 04:14:10 -0500 (0:00:00.066) 0:04:02.968 ******** ok: [managed-node1] => { "storage_test_expected_size": "VARIABLE IS NOT DEFINED!: 'storage_test_expected_size' is undefined" } TASK [Assert expected size is actual size] ************************************* task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:151 Friday 17 January 2025 04:14:10 -0500 (0:00:00.066) 0:04:03.035 ******** skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Get information about the LV] ******************************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:5 Friday 17 January 2025 04:14:10 -0500 (0:00:00.064) 0:04:03.100 ******** skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set LV segment type] ***************************************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:13 Friday 17 January 2025 04:14:10 -0500 (0:00:00.063) 0:04:03.163 ******** skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check segment type] ****************************************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:17 Friday 17 January 2025 04:14:10 -0500 (0:00:00.056) 0:04:03.220 ******** skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set LV cache size] ******************************************************* task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:24 Friday 17 January 2025 04:14:10 -0500 (0:00:00.056) 0:04:03.277 ******** skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Parse the requested cache size] ****************************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:31 Friday 17 January 2025 04:14:10 -0500 (0:00:00.057) 0:04:03.335 ******** skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set expected cache size] ************************************************* task path: 
/tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:37 Friday 17 January 2025 04:14:10 -0500 (0:00:00.062) 0:04:03.397 ******** skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" }
TASK [Check cache size] ******************************************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:42 Friday 17 January 2025 04:14:10 -0500 (0:00:00.057) 0:04:03.455 ******** skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" }
TASK [Clean up facts] ********************************************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume.yml:25 Friday 17 January 2025 04:14:10 -0500 (0:00:00.056) 0:04:03.511 ******** ok: [managed-node1] => { "ansible_facts": { "_storage_test_volume_present": null }, "changed": false }
TASK [Verify the volumes with no pool were correctly managed] ****************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:44 Friday 17 January 2025 04:14:10 -0500 (0:00:00.057) 0:04:03.569 ********
TASK [Clean up variable namespace] ********************************************* task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:54 Friday 17 January 2025 04:14:10 -0500 (0:00:00.053) 0:04:03.622 ******** ok: [managed-node1] => { "ansible_facts": { "storage_test_blkinfo": null, "storage_test_crypttab": null, "storage_test_fstab": null }, "changed": false }
TASK [Create a file] *********************************************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/create-test-file.yml:12 Friday 17 January 2025 04:14:10 -0500 (0:00:00.045) 0:04:03.667 ******** changed: [managed-node1] => { "changed": true, "dest": "/opt/test1/quux", "gid": 0, "group": "root", "mode": "0644", "owner": "root", "secontext": "unconfined_u:object_r:unlabeled_t:s0", "size": 0, "state": "file", "uid": 0 }
TASK [Test for correct handling of safe_mode] ********************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/tests_luks2.yml:247 Friday 17 January 2025 04:14:11 -0500 (0:00:00.360) 0:04:04.028 ******** included: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-failed.yml for managed-node1
TASK [Store global variable value copy] **************************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-failed.yml:4 Friday 17 January 2025 04:14:11 -0500 (0:00:00.099) 0:04:04.127 ******** ok: [managed-node1] => { "ansible_facts": { "storage_pools_global": [], "storage_safe_mode_global": true, "storage_volumes_global": [] }, "changed": false }
TASK [Verify role raises correct error] **************************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-failed.yml:10 Friday 17 January 2025 04:14:11 -0500 (0:00:00.045) 0:04:04.172 ********
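This is the heart of the safe_mode test: create-test-file.yml has just written /opt/test1/quux onto the mounted volume, and verify-role-failed.yml now re-runs the role with storage_safe_mode left at its default of true, expecting the role to fail rather than destroy the existing LUKS formatting (the attempted pool spec is echoed at "Show storage_pools" further down). A rough sketch of that expect-failure pattern using block/rescue, with hypothetical task names rather than the verbatim contents of verify-role-failed.yml:

- name: Verify role raises correct error (sketch)
  block:
    - name: Re-run the role with a change that would strip the LUKS layer
      include_role:
        name: fedora.linux_system_roles.storage
      vars:
        storage_safe_mode: true          # the default; destructive changes are refused
        storage_pools:
          - name: foo
            disks: ["sda"]
            type: partition
            volumes:
              - name: test1
                type: partition
                mount_point: /opt/test1
                size: 4g
                encryption: false        # removing encryption would reformat the device

    - name: Unreachable when the role fails as expected
      fail:
        msg: Role was expected to fail in safe_mode but succeeded

  rescue:
    - name: Confirm the test file survived the refused operation
      stat:
        path: /opt/test1/quux
      register: quux_stat    # placeholder register name

    - name: Assert data was preserved
      assert:
        that:
          - quux_stat.stat.exists

The rescue branch both swallows the expected failure and double-checks that data written before the run is still intact, which is what the later verify-data-preservation steps assert.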
TASK [fedora.linux_system_roles.storage : Set platform/version specific variables] *** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:2 Friday 17 January 2025 04:14:11 -0500 (0:00:00.060) 0:04:04.233 ******** included: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml for managed-node1
TASK [fedora.linux_system_roles.storage : Ensure ansible_facts used by role] *** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:2 Friday 17 January 2025 04:14:11 -0500 (0:00:00.061) 0:04:04.294 ******** skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" }
TASK [fedora.linux_system_roles.storage : Set platform/version specific variables] *** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:7 Friday 17 January 2025 04:14:11 -0500 (0:00:00.046) 0:04:04.341 ******** skipping: [managed-node1] => (item=RedHat.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat.yml", "skip_reason": "Conditional result was False" } skipping: [managed-node1] => (item=CentOS.yml) => { "ansible_loop_var": "item", "changed": false, "item": "CentOS.yml", "skip_reason": "Conditional result was False" } ok: [managed-node1] => (item=CentOS_7.yml) => { "ansible_facts": { "__storage_blivet_diskvolume_mkfs_option_map": { "ext2": "-F", "ext3": "-F", "ext4": "-F" }, "blivet_package_list": [ "python-enum34", "python-blivet3", "libblockdev-crypto", "libblockdev-dm", "libblockdev-lvm", "libblockdev-mdraid", "libblockdev-swap", "{{ 'libblockdev-s390' if ansible_architecture == 's390x' else 'libblockdev' }}" ] }, "ansible_included_var_files": [ "/tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/vars/CentOS_7.yml" ], "ansible_loop_var": "item", "changed": false, "item": "CentOS_7.yml" } skipping: [managed-node1] => (item=CentOS_7.9.yml) => { "ansible_loop_var": "item", "changed": false, "item": "CentOS_7.9.yml", "skip_reason": "Conditional result was False" }
TASK [fedora.linux_system_roles.storage : Check if system is ostree] *********** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:25 Friday 17 January 2025 04:14:11 -0500 (0:00:00.094) 0:04:04.435 ******** skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" }
TASK [fedora.linux_system_roles.storage : Set flag to indicate system is ostree] *** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:30 Friday 17 January 2025 04:14:11 -0500 (0:00:00.040) 0:04:04.475 ******** skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" }
TASK [fedora.linux_system_roles.storage : Define an empty list of pools to be used in testing] *** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:5 Friday 17 January 2025 04:14:11 -0500 (0:00:00.037) 0:04:04.513 ******** ok: [managed-node1] => { "ansible_facts": { "_storage_pools_list": [] }, "changed": false }
TASK [fedora.linux_system_roles.storage : Define an empty list of volumes to be used in testing] *** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:9 Friday 17 January 2025 04:14:11 -0500 (0:00:00.039) 0:04:04.552 ******** ok: [managed-node1] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false }
TASK [fedora.linux_system_roles.storage : Include the appropriate provider tasks] *** task path: 
/tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:13 Friday 17 January 2025 04:14:11 -0500 (0:00:00.037) 0:04:04.590 ******** included: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml for managed-node1 TASK [fedora.linux_system_roles.storage : Make sure blivet is available] ******* task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:2 Friday 17 January 2025 04:14:12 -0500 (0:00:00.158) 0:04:04.749 ******** ok: [managed-node1] => { "changed": false, "rc": 0, "results": [ "python-enum34-1.0.4-1.el7.noarch providing python-enum34 is already installed", "1:python2-blivet3-3.1.3-3.el7.noarch providing python-blivet3 is already installed", "libblockdev-crypto-2.18-5.el7.x86_64 providing libblockdev-crypto is already installed", "libblockdev-dm-2.18-5.el7.x86_64 providing libblockdev-dm is already installed", "libblockdev-lvm-2.18-5.el7.x86_64 providing libblockdev-lvm is already installed", "libblockdev-mdraid-2.18-5.el7.x86_64 providing libblockdev-mdraid is already installed", "libblockdev-swap-2.18-5.el7.x86_64 providing libblockdev-swap is already installed", "libblockdev-2.18-5.el7.x86_64 providing libblockdev is already installed" ] } TASK [fedora.linux_system_roles.storage : Show storage_pools] ****************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:9 Friday 17 January 2025 04:14:15 -0500 (0:00:03.880) 0:04:08.630 ******** ok: [managed-node1] => { "storage_pools": [ { "disks": [ "sda" ], "name": "foo", "type": "partition", "volumes": [ { "encryption": false, "encryption_luks_version": "luks2", "encryption_password": "yabbadabbadoo", "mount_point": "/opt/test1", "name": "test1", "size": "4g", "type": "partition" } ] } ] } TASK [fedora.linux_system_roles.storage : Show storage_volumes] **************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:14 Friday 17 January 2025 04:14:16 -0500 (0:00:00.079) 0:04:08.709 ******** ok: [managed-node1] => { "storage_volumes": [] } TASK [fedora.linux_system_roles.storage : Get required packages] *************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:19 Friday 17 January 2025 04:14:16 -0500 (0:00:00.076) 0:04:08.786 ******** ok: [managed-node1] => { "actions": [], "changed": false, "crypts": [], "leaves": [], "mounts": [], "packages": [], "pools": [], "volumes": [] } TASK [fedora.linux_system_roles.storage : Enable copr repositories if needed] *** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:31 Friday 17 January 2025 04:14:20 -0500 (0:00:04.170) 0:04:12.956 ******** included: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml for managed-node1 TASK [fedora.linux_system_roles.storage : Check if the COPR support packages should be installed] *** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:2 Friday 17 January 2025 04:14:20 -0500 (0:00:00.127) 0:04:13.084 ******** TASK [fedora.linux_system_roles.storage : Make sure COPR support packages are present] *** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:13 Friday 17 January 2025 
04:14:20 -0500 (0:00:00.054) 0:04:13.138 ******** skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Enable COPRs] ************************ task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:19 Friday 17 January 2025 04:14:20 -0500 (0:00:00.058) 0:04:13.197 ******** TASK [fedora.linux_system_roles.storage : Make sure required packages are installed] *** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:37 Friday 17 January 2025 04:14:20 -0500 (0:00:00.046) 0:04:13.244 ******** ok: [managed-node1] => { "changed": false, "rc": 0, "results": [ "kpartx-0.4.9-136.el7_9.x86_64 providing kpartx is already installed" ] } TASK [fedora.linux_system_roles.storage : Get service facts] ******************* task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:51 Friday 17 January 2025 04:14:21 -0500 (0:00:00.697) 0:04:13.941 ******** ok: [managed-node1] => { "ansible_facts": { "services": { "NetworkManager-dispatcher.service": { "name": "NetworkManager-dispatcher.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "NetworkManager-wait-online.service": { "name": "NetworkManager-wait-online.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "NetworkManager.service": { "name": "NetworkManager.service", "source": "systemd", "state": "running", "status": "enabled" }, "arp-ethers.service": { "name": "arp-ethers.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "auditd.service": { "name": "auditd.service", "source": "systemd", "state": "running", "status": "enabled" }, "auth-rpcgss-module.service": { "name": "auth-rpcgss-module.service", "source": "systemd", "state": "stopped", "status": "static" }, "autovt@.service": { "name": "autovt@.service", "source": "systemd", "state": "unknown", "status": "enabled" }, "blivet.service": { "name": "blivet.service", "source": "systemd", "state": "inactive", "status": "static" }, "blk-availability.service": { "name": "blk-availability.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "brandbot.service": { "name": "brandbot.service", "source": "systemd", "state": "inactive", "status": "static" }, "chrony-dnssrv@.service": { "name": "chrony-dnssrv@.service", "source": "systemd", "state": "unknown", "status": "static" }, "chrony-wait.service": { "name": "chrony-wait.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "chronyd.service": { "name": "chronyd.service", "source": "systemd", "state": "running", "status": "enabled" }, "cloud-config.service": { "name": "cloud-config.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-final.service": { "name": "cloud-final.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-init-local.service": { "name": "cloud-init-local.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-init.service": { "name": "cloud-init.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "console-getty.service": { "name": "console-getty.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "console-shell.service": { "name": "console-shell.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "container-getty@.service": { 
"name": "container-getty@.service", "source": "systemd", "state": "unknown", "status": "static" }, "cpupower.service": { "name": "cpupower.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "crond.service": { "name": "crond.service", "source": "systemd", "state": "running", "status": "enabled" }, "dbus-org.freedesktop.hostname1.service": { "name": "dbus-org.freedesktop.hostname1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus-org.freedesktop.import1.service": { "name": "dbus-org.freedesktop.import1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus-org.freedesktop.locale1.service": { "name": "dbus-org.freedesktop.locale1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus-org.freedesktop.login1.service": { "name": "dbus-org.freedesktop.login1.service", "source": "systemd", "state": "active", "status": "static" }, "dbus-org.freedesktop.machine1.service": { "name": "dbus-org.freedesktop.machine1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus-org.freedesktop.nm-dispatcher.service": { "name": "dbus-org.freedesktop.nm-dispatcher.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "dbus-org.freedesktop.timedate1.service": { "name": "dbus-org.freedesktop.timedate1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus.service": { "name": "dbus.service", "source": "systemd", "state": "running", "status": "static" }, "debug-shell.service": { "name": "debug-shell.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "dm-event.service": { "name": "dm-event.service", "source": "systemd", "state": "stopped", "status": "static" }, "dmraid-activation.service": { "name": "dmraid-activation.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "dracut-cmdline.service": { "name": "dracut-cmdline.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-initqueue.service": { "name": "dracut-initqueue.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-mount.service": { "name": "dracut-mount.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-mount.service": { "name": "dracut-pre-mount.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-pivot.service": { "name": "dracut-pre-pivot.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-trigger.service": { "name": "dracut-pre-trigger.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-udev.service": { "name": "dracut-pre-udev.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-shutdown.service": { "name": "dracut-shutdown.service", "source": "systemd", "state": "stopped", "status": "static" }, "ebtables.service": { "name": "ebtables.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "emergency.service": { "name": "emergency.service", "source": "systemd", "state": "stopped", "status": "static" }, "firewalld.service": { "name": "firewalld.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "fstrim.service": { "name": "fstrim.service", "source": "systemd", "state": "inactive", "status": "static" }, "getty@.service": { "name": "getty@.service", "source": "systemd", "state": "unknown", "status": "enabled" }, "getty@tty1.service": { "name": "getty@tty1.service", "source": "systemd", 
"state": "running", "status": "unknown" }, "gssproxy.service": { "name": "gssproxy.service", "source": "systemd", "state": "running", "status": "disabled" }, "halt-local.service": { "name": "halt-local.service", "source": "systemd", "state": "inactive", "status": "static" }, "initrd-cleanup.service": { "name": "initrd-cleanup.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-parse-etc.service": { "name": "initrd-parse-etc.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-switch-root.service": { "name": "initrd-switch-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-udevadm-cleanup-db.service": { "name": "initrd-udevadm-cleanup-db.service", "source": "systemd", "state": "stopped", "status": "static" }, "iprdump.service": { "name": "iprdump.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "iprinit.service": { "name": "iprinit.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "iprupdate.service": { "name": "iprupdate.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "irqbalance.service": { "name": "irqbalance.service", "source": "systemd", "state": "running", "status": "enabled" }, "kdump.service": { "name": "kdump.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "kmod-static-nodes.service": { "name": "kmod-static-nodes.service", "source": "systemd", "state": "stopped", "status": "static" }, "lvm2-lvmetad.service": { "name": "lvm2-lvmetad.service", "source": "systemd", "state": "running", "status": "static" }, "lvm2-lvmpolld.service": { "name": "lvm2-lvmpolld.service", "source": "systemd", "state": "stopped", "status": "static" }, "lvm2-monitor.service": { "name": "lvm2-monitor.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "lvm2-pvscan@.service": { "name": "lvm2-pvscan@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdadm-grow-continue@.service": { "name": "mdadm-grow-continue@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdadm-last-resort@.service": { "name": "mdadm-last-resort@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdcheck_continue.service": { "name": "mdcheck_continue.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdcheck_start.service": { "name": "mdcheck_start.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdmon@.service": { "name": "mdmon@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdmonitor-oneshot.service": { "name": "mdmonitor-oneshot.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdmonitor.service": { "name": "mdmonitor.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "messagebus.service": { "name": "messagebus.service", "source": "systemd", "state": "active", "status": "static" }, "microcode.service": { "name": "microcode.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "netconsole": { "name": "netconsole", "source": "sysv", "state": "stopped", "status": "disabled" }, "network": { "name": "network", "source": "sysv", "state": "running", "status": "enabled" }, "network.service": { "name": "network.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "nfs-blkmap.service": { "name": "nfs-blkmap.service", "source": "systemd", "state": "inactive", "status": "disabled" }, 
"nfs-config.service": { "name": "nfs-config.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-idmap.service": { "name": "nfs-idmap.service", "source": "systemd", "state": "inactive", "status": "static" }, "nfs-idmapd.service": { "name": "nfs-idmapd.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-lock.service": { "name": "nfs-lock.service", "source": "systemd", "state": "inactive", "status": "static" }, "nfs-mountd.service": { "name": "nfs-mountd.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-rquotad.service": { "name": "nfs-rquotad.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfs-secure.service": { "name": "nfs-secure.service", "source": "systemd", "state": "inactive", "status": "static" }, "nfs-server.service": { "name": "nfs-server.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "nfs-utils.service": { "name": "nfs-utils.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs.service": { "name": "nfs.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfslock.service": { "name": "nfslock.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-halt.service": { "name": "plymouth-halt.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "plymouth-kexec.service": { "name": "plymouth-kexec.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "plymouth-poweroff.service": { "name": "plymouth-poweroff.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "plymouth-quit-wait.service": { "name": "plymouth-quit-wait.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "plymouth-quit.service": { "name": "plymouth-quit.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "plymouth-read-write.service": { "name": "plymouth-read-write.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "plymouth-reboot.service": { "name": "plymouth-reboot.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "plymouth-start.service": { "name": "plymouth-start.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "plymouth-switch-root.service": { "name": "plymouth-switch-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "polkit.service": { "name": "polkit.service", "source": "systemd", "state": "running", "status": "static" }, "postfix.service": { "name": "postfix.service", "source": "systemd", "state": "running", "status": "enabled" }, "qemu-guest-agent.service": { "name": "qemu-guest-agent.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "quotaon.service": { "name": "quotaon.service", "source": "systemd", "state": "inactive", "status": "static" }, "rc-local.service": { "name": "rc-local.service", "source": "systemd", "state": "stopped", "status": "static" }, "rdisc.service": { "name": "rdisc.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rescue.service": { "name": "rescue.service", "source": "systemd", "state": "stopped", "status": "static" }, "restraintd.service": { "name": "restraintd.service", "source": "systemd", "state": "running", "status": "enabled" }, "rhel-autorelabel-mark.service": { "name": "rhel-autorelabel-mark.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "rhel-autorelabel.service": { "name": 
"rhel-autorelabel.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "rhel-configure.service": { "name": "rhel-configure.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "rhel-dmesg.service": { "name": "rhel-dmesg.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "rhel-domainname.service": { "name": "rhel-domainname.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "rhel-import-state.service": { "name": "rhel-import-state.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "rhel-loadmodules.service": { "name": "rhel-loadmodules.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "rhel-readonly.service": { "name": "rhel-readonly.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "rngd.service": { "name": "rngd.service", "source": "systemd", "state": "running", "status": "enabled" }, "rpc-gssd.service": { "name": "rpc-gssd.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-rquotad.service": { "name": "rpc-rquotad.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rpc-statd-notify.service": { "name": "rpc-statd-notify.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-statd.service": { "name": "rpc-statd.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpcbind.service": { "name": "rpcbind.service", "source": "systemd", "state": "running", "status": "enabled" }, "rpcgssd.service": { "name": "rpcgssd.service", "source": "systemd", "state": "inactive", "status": "static" }, "rpcidmapd.service": { "name": "rpcidmapd.service", "source": "systemd", "state": "inactive", "status": "static" }, "rsyncd.service": { "name": "rsyncd.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rsyncd@.service": { "name": "rsyncd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "rsyslog.service": { "name": "rsyslog.service", "source": "systemd", "state": "running", "status": "enabled" }, "selinux-policy-migrate-local-changes@.service": { "name": "selinux-policy-migrate-local-changes@.service", "source": "systemd", "state": "unknown", "status": "static" }, "selinux-policy-migrate-local-changes@targeted.service": { "name": "selinux-policy-migrate-local-changes@targeted.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "serial-getty@.service": { "name": "serial-getty@.service", "source": "systemd", "state": "unknown", "status": "disabled" }, "serial-getty@ttyS0.service": { "name": "serial-getty@ttyS0.service", "source": "systemd", "state": "running", "status": "unknown" }, "sshd-keygen.service": { "name": "sshd-keygen.service", "source": "systemd", "state": "stopped", "status": "static" }, "sshd.service": { "name": "sshd.service", "source": "systemd", "state": "running", "status": "enabled" }, "sshd@.service": { "name": "sshd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-ask-password-console.service": { "name": "systemd-ask-password-console.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-ask-password-plymouth.service": { "name": "systemd-ask-password-plymouth.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-ask-password-wall.service": { "name": "systemd-ask-password-wall.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-backlight@.service": { "name": 
"systemd-backlight@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-binfmt.service": { "name": "systemd-binfmt.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-bootchart.service": { "name": "systemd-bootchart.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "systemd-cryptsetup@luks\\x2d8c8d336a\\x2d42af\\x2d4ee5\\x2da371\\x2d3bf9cd16f56b.service": { "name": "systemd-cryptsetup@luks\\x2d8c8d336a\\x2d42af\\x2d4ee5\\x2da371\\x2d3bf9cd16f56b.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "systemd-firstboot.service": { "name": "systemd-firstboot.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-fsck-root.service": { "name": "systemd-fsck-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-fsck@.service": { "name": "systemd-fsck@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-halt.service": { "name": "systemd-halt.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hibernate-resume@.service": { "name": "systemd-hibernate-resume@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-hibernate.service": { "name": "systemd-hibernate.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hostnamed.service": { "name": "systemd-hostnamed.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hwdb-update.service": { "name": "systemd-hwdb-update.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-hybrid-sleep.service": { "name": "systemd-hybrid-sleep.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-importd.service": { "name": "systemd-importd.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-initctl.service": { "name": "systemd-initctl.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journal-catalog-update.service": { "name": "systemd-journal-catalog-update.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journal-flush.service": { "name": "systemd-journal-flush.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journald.service": { "name": "systemd-journald.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-kexec.service": { "name": "systemd-kexec.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-localed.service": { "name": "systemd-localed.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-logind.service": { "name": "systemd-logind.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-machine-id-commit.service": { "name": "systemd-machine-id-commit.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-machined.service": { "name": "systemd-machined.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-modules-load.service": { "name": "systemd-modules-load.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-nspawn@.service": { "name": "systemd-nspawn@.service", "source": "systemd", "state": "unknown", "status": "disabled" }, "systemd-poweroff.service": { "name": "systemd-poweroff.service", "source": "systemd", "state": "inactive", "status": "static" }, 
"systemd-quotacheck.service": { "name": "systemd-quotacheck.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-random-seed.service": { "name": "systemd-random-seed.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-readahead-collect.service": { "name": "systemd-readahead-collect.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "systemd-readahead-done.service": { "name": "systemd-readahead-done.service", "source": "systemd", "state": "stopped", "status": "indirect" }, "systemd-readahead-drop.service": { "name": "systemd-readahead-drop.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "systemd-readahead-replay.service": { "name": "systemd-readahead-replay.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "systemd-reboot.service": { "name": "systemd-reboot.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-remount-fs.service": { "name": "systemd-remount-fs.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-rfkill@.service": { "name": "systemd-rfkill@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-shutdownd.service": { "name": "systemd-shutdownd.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-suspend.service": { "name": "systemd-suspend.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-sysctl.service": { "name": "systemd-sysctl.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-timedated.service": { "name": "systemd-timedated.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-tmpfiles-clean.service": { "name": "systemd-tmpfiles-clean.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles-setup-dev.service": { "name": "systemd-tmpfiles-setup-dev.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles-setup.service": { "name": "systemd-tmpfiles-setup.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udev-settle.service": { "name": "systemd-udev-settle.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udev-trigger.service": { "name": "systemd-udev-trigger.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udevd.service": { "name": "systemd-udevd.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-update-done.service": { "name": "systemd-update-done.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-update-utmp-runlevel.service": { "name": "systemd-update-utmp-runlevel.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-update-utmp.service": { "name": "systemd-update-utmp.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-user-sessions.service": { "name": "systemd-user-sessions.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-vconsole-setup.service": { "name": "systemd-vconsole-setup.service", "source": "systemd", "state": "stopped", "status": "static" }, "target.service": { "name": "target.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "targetclid.service": { "name": "targetclid.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "teamd@.service": { "name": 
"teamd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "tuned.service": { "name": "tuned.service", "source": "systemd", "state": "running", "status": "enabled" }, "wpa_supplicant.service": { "name": "wpa_supplicant.service", "source": "systemd", "state": "inactive", "status": "disabled" } } }, "changed": false } TASK [fedora.linux_system_roles.storage : Set storage_cryptsetup_services] ***** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:57 Friday 17 January 2025 04:14:22 -0500 (0:00:01.012) 0:04:14.954 ******** ok: [managed-node1] => { "ansible_facts": { "storage_cryptsetup_services": [ "systemd-cryptsetup@luks\\x2d8c8d336a\\x2d42af\\x2d4ee5\\x2da371\\x2d3bf9cd16f56b.service" ] }, "changed": false } TASK [fedora.linux_system_roles.storage : Mask the systemd cryptsetup services] *** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:63 Friday 17 January 2025 04:14:22 -0500 (0:00:00.062) 0:04:15.017 ******** changed: [managed-node1] => (item=systemd-cryptsetup@luks\x2d8c8d336a\x2d42af\x2d4ee5\x2da371\x2d3bf9cd16f56b.service) => { "ansible_loop_var": "item", "changed": true, "item": "systemd-cryptsetup@luks\\x2d8c8d336a\\x2d42af\\x2d4ee5\\x2da371\\x2d3bf9cd16f56b.service", "name": "systemd-cryptsetup@luks\\x2d8c8d336a\\x2d42af\\x2d4ee5\\x2da371\\x2d3bf9cd16f56b.service", "status": { "ActiveEnterTimestampMonotonic": "0", "ActiveExitTimestampMonotonic": "0", "ActiveState": "inactive", "After": "systemd-journald.socket cryptsetup-pre.target system-systemd\\x2dcryptsetup.slice dev-sda.device systemd-readahead-collect.service systemd-readahead-replay.service", "AllowIsolate": "no", "AmbientCapabilities": "0", "AssertResult": "no", "AssertTimestampMonotonic": "0", "Before": "cryptsetup.target umount.target", "BindsTo": "dev-sda.device", "BlockIOAccounting": "no", "BlockIOWeight": "18446744073709551615", "CPUAccounting": "no", "CPUQuotaPerSecUSec": "infinity", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUSchedulingResetOnFork": "no", "CPUShares": "18446744073709551615", "CanIsolate": "no", "CanReload": "no", "CanStart": "yes", "CanStop": "yes", "CapabilityBoundingSet": "18446744073709551615", "CollectMode": "inactive", "ConditionResult": "no", "ConditionTimestampMonotonic": "0", "Conflicts": "umount.target", "ControlPID": "0", "DefaultDependencies": "no", "Delegate": "no", "Description": "Cryptography Setup for luks-8c8d336a-42af-4ee5-a371-3bf9cd16f56b", "DevicePolicy": "auto", "Documentation": "man:crypttab(5) man:systemd-cryptsetup-generator(8) man:systemd-cryptsetup@.service(8)", "ExecMainCode": "0", "ExecMainExitTimestampMonotonic": "0", "ExecMainPID": "0", "ExecMainStartTimestampMonotonic": "0", "ExecMainStatus": "0", "ExecStart": "{ path=/usr/lib/systemd/systemd-cryptsetup ; argv[]=/usr/lib/systemd/systemd-cryptsetup attach luks-8c8d336a-42af-4ee5-a371-3bf9cd16f56b /dev/sda - ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecStop": "{ path=/usr/lib/systemd/systemd-cryptsetup ; argv[]=/usr/lib/systemd/systemd-cryptsetup detach luks-8c8d336a-42af-4ee5-a371-3bf9cd16f56b ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "FailureAction": "none", "FileDescriptorStoreMax": "0", "FragmentPath": "/run/systemd/generator/systemd-cryptsetup@luks\\x2d8c8d336a\\x2d42af\\x2d4ee5\\x2da371\\x2d3bf9cd16f56b.service", "GuessMainPID": "yes", 
"IOScheduling": "0", "Id": "systemd-cryptsetup@luks\\x2d8c8d336a\\x2d42af\\x2d4ee5\\x2da371\\x2d3bf9cd16f56b.service", "IgnoreOnIsolate": "yes", "IgnoreOnSnapshot": "no", "IgnoreSIGPIPE": "yes", "InactiveEnterTimestampMonotonic": "0", "InactiveExitTimestampMonotonic": "0", "JobTimeoutAction": "none", "JobTimeoutUSec": "0", "KillMode": "control-group", "KillSignal": "15", "LimitAS": "18446744073709551615", "LimitCORE": "18446744073709551615", "LimitCPU": "18446744073709551615", "LimitDATA": "18446744073709551615", "LimitFSIZE": "18446744073709551615", "LimitLOCKS": "18446744073709551615", "LimitMEMLOCK": "65536", "LimitMSGQUEUE": "819200", "LimitNICE": "0", "LimitNOFILE": "4096", "LimitNPROC": "14311", "LimitRSS": "18446744073709551615", "LimitRTPRIO": "0", "LimitRTTIME": "18446744073709551615", "LimitSIGPENDING": "14311", "LimitSTACK": "18446744073709551615", "LoadState": "loaded", "MainPID": "0", "MemoryAccounting": "no", "MemoryCurrent": "18446744073709551615", "MemoryLimit": "18446744073709551615", "MountFlags": "0", "Names": "systemd-cryptsetup@luks\\x2d8c8d336a\\x2d42af\\x2d4ee5\\x2da371\\x2d3bf9cd16f56b.service", "NeedDaemonReload": "yes", "Nice": "0", "NoNewPrivileges": "no", "NonBlocking": "no", "NotifyAccess": "none", "OOMScoreAdjust": "0", "OnFailureJobMode": "replace", "PermissionsStartOnly": "no", "PrivateDevices": "no", "PrivateNetwork": "no", "PrivateTmp": "no", "ProtectHome": "no", "ProtectSystem": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "RemainAfterExit": "yes", "RequiredBy": "cryptsetup.target", "Requires": "system-systemd\\x2dcryptsetup.slice", "Restart": "no", "RestartUSec": "100ms", "Result": "success", "RootDirectoryStartOnly": "no", "RuntimeDirectoryMode": "0755", "SameProcessGroup": "no", "SecureBits": "0", "SendSIGHUP": "no", "SendSIGKILL": "yes", "Slice": "system-systemd\\x2dcryptsetup.slice", "SourcePath": "/etc/crypttab", "StandardError": "inherit", "StandardInput": "null", "StandardOutput": "journal", "StartLimitAction": "none", "StartLimitBurst": "5", "StartLimitInterval": "10000000", "StartupBlockIOWeight": "18446744073709551615", "StartupCPUShares": "18446744073709551615", "StatusErrno": "0", "StopWhenUnneeded": "no", "SubState": "dead", "SyslogLevelPrefix": "yes", "SyslogPriority": "30", "SystemCallErrorNumber": "0", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "TasksAccounting": "no", "TasksCurrent": "18446744073709551615", "TasksMax": "18446744073709551615", "TimeoutStartUSec": "0", "TimeoutStopUSec": "0", "TimerSlackNSec": "50000", "Transient": "no", "Type": "oneshot", "UMask": "0022", "UnitFilePreset": "disabled", "UnitFileState": "bad", "WantedBy": "dev-sda.device", "WatchdogTimestampMonotonic": "0", "WatchdogUSec": "0" } } TASK [fedora.linux_system_roles.storage : Manage the pools and volumes to match the specified state] *** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:69 Friday 17 January 2025 04:14:22 -0500 (0:00:00.545) 0:04:15.562 ******** fatal: [managed-node1]: FAILED! 
=> { "actions": [], "changed": false, "crypts": [], "leaves": [], "mounts": [], "packages": [], "pools": [], "volumes": [] } MSG: cannot remove existing formatting on device 'luks-35e15af6-60a8-41b2-94e6-05460ad82662' in safe mode due to encryption removal TASK [fedora.linux_system_roles.storage : Failed message] ********************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:109 Friday 17 January 2025 04:14:26 -0500 (0:00:03.999) 0:04:19.562 ******** fatal: [managed-node1]: FAILED! => { "changed": false } MSG: {u'_ansible_no_log': False, u'crypts': [], u'pools': [], u'leaves': [], u'changed': False, u'actions': [], u'failed': True, u'volumes': [], u'invocation': {u'module_args': {u'packages_only': False, u'disklabel_type': None, u'diskvolume_mkfs_option_map': {u'ext4': u'-F', u'ext3': u'-F', u'ext2': u'-F'}, u'safe_mode': True, u'pools': [{u'raid_metadata_version': None, u'encryption_key_size': None, u'encryption_key': None, u'encryption_luks_version': None, u'encryption_tang_url': None, u'raid_spare_count': None, u'grow_to_fill': False, u'encryption_tang_thumbprint': None, u'name': u'foo', u'encryption_password': None, u'encryption': False, u'disks': [u'sda'], u'raid_level': None, u'raid_device_count': None, u'state': u'present', u'volumes': [{u'raid_metadata_version': None, u'mount_device_identifier': u'uuid', u'fs_type': u'xfs', u'mount_options': u'defaults', u'size': u'4g', u'mount_point': u'/opt/test1', u'compression': None, u'encryption_password': u'VALUE_SPECIFIED_IN_NO_LOG_PARAMETER', u'encryption': False, u'raid_level': None, u'raid_device_count': None, u'state': u'present', u'vdo_pool_size': None, u'mount_mode': None, u'thin_pool_name': None, u'type': u'partition', u'encryption_key_size': 0, u'deduplication': None, u'encryption_cipher': None, u'encryption_key': None, u'fs_label': u'', u'encryption_luks_version': u'luks2', u'raid_stripe_size': None, u'mount_passno': 0, u'mount_user': None, u'raid_spare_count': None, u'name': u'test1', u'cache_mode': None, u'raid_disks': [], u'mount_group': None, u'fs_overwrite_existing': True, u'disks': [u'sda'], u'cached': False, u'thin_pool_size': None, u'thin': False, u'mount_check': 0, u'cache_size': 0, u'raid_chunk_size': None, u'cache_devices': [], u'fs_create_options': u''}], u'shared': False, u'encryption_clevis_pin': None, u'type': u'partition', u'encryption_cipher': None, u'raid_chunk_size': None}], u'volumes': [], u'pool_defaults': {u'raid_metadata_version': None, u'encryption_cipher': None, u'encryption_key': None, u'encryption_luks_version': None, u'raid_spare_count': None, u'grow_to_fill': False, u'encryption_password': None, u'encryption': False, u'disks': [], u'raid_level': None, u'raid_device_count': None, u'state': u'present', u'volumes': [], u'shared': False, u'type': u'lvm', u'encryption_key_size': None, u'raid_chunk_size': None}, u'volume_defaults': {u'raid_metadata_version': None, u'raid_level': None, u'fs_type': u'xfs', u'mount_options': u'defaults', u'size': 0, u'mount_point': u'', u'compression': None, u'encryption_password': None, u'encryption': False, u'mount_device_identifier': u'uuid', u'raid_device_count': None, u'state': u'present', u'vdo_pool_size': None, u'thin_pool_name': None, u'type': u'lvm', u'encryption_key_size': None, u'encryption_cipher': None, u'encryption_key': None, u'fs_label': u'', u'encryption_luks_version': None, u'raid_stripe_size': None, u'cache_size': 0, u'raid_spare_count': None, u'cache_mode': None, u'deduplication': None, 
u'cached': False, u'fs_overwrite_existing': True, u'disks': [], u'thin_pool_size': None, u'thin': None, u'mount_check': 0, u'mount_passno': 0, u'raid_chunk_size': None, u'cache_devices': [], u'fs_create_options': u''}, u'use_partitions': None}}, u'mounts': [], u'packages': [], u'msg': u"cannot remove existing formatting on device 'luks-35e15af6-60a8-41b2-94e6-05460ad82662' in safe mode due to encryption removal"} TASK [fedora.linux_system_roles.storage : Unmask the systemd cryptsetup services] *** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:113 Friday 17 January 2025 04:14:26 -0500 (0:00:00.053) 0:04:19.615 ******** changed: [managed-node1] => (item=systemd-cryptsetup@luks\x2d8c8d336a\x2d42af\x2d4ee5\x2da371\x2d3bf9cd16f56b.service) => { "ansible_loop_var": "item", "changed": true, "item": "systemd-cryptsetup@luks\\x2d8c8d336a\\x2d42af\\x2d4ee5\\x2da371\\x2d3bf9cd16f56b.service", "name": "systemd-cryptsetup@luks\\x2d8c8d336a\\x2d42af\\x2d4ee5\\x2da371\\x2d3bf9cd16f56b.service", "status": { "ActiveEnterTimestampMonotonic": "0", "ActiveExitTimestampMonotonic": "0", "ActiveState": "inactive", "AllowIsolate": "no", "AmbientCapabilities": "0", "AssertResult": "no", "AssertTimestampMonotonic": "0", "BlockIOAccounting": "no", "BlockIOWeight": "18446744073709551615", "CPUAccounting": "no", "CPUQuotaPerSecUSec": "infinity", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUSchedulingResetOnFork": "no", "CPUShares": "18446744073709551615", "CanIsolate": "no", "CanReload": "no", "CanStart": "yes", "CanStop": "yes", "CapabilityBoundingSet": "18446744073709551615", "CollectMode": "inactive", "ConditionResult": "no", "ConditionTimestampMonotonic": "0", "ControlPID": "0", "DefaultDependencies": "yes", "Delegate": "no", "Description": "systemd-cryptsetup@luks\\x2d8c8d336a\\x2d42af\\x2d4ee5\\x2da371\\x2d3bf9cd16f56b.service", "DevicePolicy": "auto", "ExecMainCode": "0", "ExecMainExitTimestampMonotonic": "0", "ExecMainPID": "0", "ExecMainStartTimestampMonotonic": "0", "ExecMainStatus": "0", "FailureAction": "none", "FileDescriptorStoreMax": "0", "FragmentPath": "/dev/null", "GuessMainPID": "yes", "IOScheduling": "0", "Id": "systemd-cryptsetup@luks\\x2d8c8d336a\\x2d42af\\x2d4ee5\\x2da371\\x2d3bf9cd16f56b.service", "IgnoreOnIsolate": "no", "IgnoreOnSnapshot": "no", "IgnoreSIGPIPE": "yes", "InactiveEnterTimestampMonotonic": "0", "InactiveExitTimestampMonotonic": "0", "JobTimeoutAction": "none", "JobTimeoutUSec": "0", "KillMode": "control-group", "KillSignal": "15", "LimitAS": "18446744073709551615", "LimitCORE": "18446744073709551615", "LimitCPU": "18446744073709551615", "LimitDATA": "18446744073709551615", "LimitFSIZE": "18446744073709551615", "LimitLOCKS": "18446744073709551615", "LimitMEMLOCK": "65536", "LimitMSGQUEUE": "819200", "LimitNICE": "0", "LimitNOFILE": "1048576", "LimitNPROC": "14311", "LimitRSS": "18446744073709551615", "LimitRTPRIO": "0", "LimitRTTIME": "18446744073709551615", "LimitSIGPENDING": "14311", "LimitSTACK": "18446744073709551615", "LoadState": "masked", "MainPID": "0", "MemoryAccounting": "no", "MemoryCurrent": "18446744073709551615", "MemoryLimit": "18446744073709551615", "MountFlags": "0", "Names": "systemd-cryptsetup@luks\\x2d8c8d336a\\x2d42af\\x2d4ee5\\x2da371\\x2d3bf9cd16f56b.service", "NeedDaemonReload": "no", "Nice": "0", "NoNewPrivileges": "no", "NonBlocking": "no", "NotifyAccess": "none", "OOMScoreAdjust": "0", "OnFailureJobMode": "replace", "PermissionsStartOnly": "no", "PrivateDevices": "no", 
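Note that the unmask task runs even though the previous task failed, so an aborted run does not leave the cryptsetup units permanently masked; the status dump below still reports the masked state (FragmentPath /dev/null, LoadState masked) that this task is clearing. The counterpart of the masking sketch above, again illustrative:

    - name: Unmask the systemd cryptsetup services
      systemd:
        name: "{{ item }}"
        masked: false
      loop: "{{ storage_cryptsetup_services }}"
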
"PrivateNetwork": "no", "PrivateTmp": "no", "ProtectHome": "no", "ProtectSystem": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "RemainAfterExit": "no", "Restart": "no", "RestartUSec": "100ms", "Result": "success", "RootDirectoryStartOnly": "no", "RuntimeDirectoryMode": "0755", "SameProcessGroup": "no", "SecureBits": "0", "SendSIGHUP": "no", "SendSIGKILL": "yes", "StandardError": "inherit", "StandardInput": "null", "StandardOutput": "inherit", "StartLimitAction": "none", "StartLimitBurst": "5", "StartLimitInterval": "10000000", "StartupBlockIOWeight": "18446744073709551615", "StartupCPUShares": "18446744073709551615", "StatusErrno": "0", "StopWhenUnneeded": "no", "SubState": "dead", "SyslogLevelPrefix": "yes", "SyslogPriority": "30", "SystemCallErrorNumber": "0", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "TasksAccounting": "no", "TasksCurrent": "18446744073709551615", "TasksMax": "18446744073709551615", "TimeoutStartUSec": "1min 30s", "TimeoutStopUSec": "1min 30s", "TimerSlackNSec": "50000", "Transient": "no", "UMask": "0022", "UnitFileState": "bad", "WatchdogTimestampMonotonic": "0", "WatchdogUSec": "0" } } TASK [Check that we failed in the role] **************************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-failed.yml:23 Friday 17 January 2025 04:14:27 -0500 (0:00:00.491) 0:04:20.106 ******** ok: [managed-node1] => { "changed": false } MSG: All assertions passed TASK [Verify the blivet output and error message are correct] ****************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-failed.yml:28 Friday 17 January 2025 04:14:27 -0500 (0:00:00.045) 0:04:20.152 ******** ok: [managed-node1] => { "changed": false } MSG: All assertions passed TASK [Verify correct exception or error message] ******************************* task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-failed.yml:39 Friday 17 January 2025 04:14:27 -0500 (0:00:00.054) 0:04:20.206 ******** skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Stat the file] *********************************************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-data-preservation.yml:11 Friday 17 January 2025 04:14:27 -0500 (0:00:00.037) 0:04:20.244 ******** ok: [managed-node1] => { "changed": false, "stat": { "atime": 1737105251.2669551, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 0, "charset": "binary", "checksum": "da39a3ee5e6b4b0d3255bfef95601890afd80709", "ctime": 1737105251.2669551, "dev": 64768, "device_type": 0, "executable": false, "exists": true, "gid": 0, "gr_name": "root", "inode": 67, "isblk": false, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": true, "issock": false, "isuid": false, "mimetype": "inode/x-empty", "mode": "0644", "mtime": 1737105251.2669551, "nlink": 1, "path": "/opt/test1/quux", "pw_name": "root", "readable": true, "rgrp": true, "roth": true, "rusr": true, "size": 0, "uid": 0, "version": "18446744071878616042", "wgrp": false, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [Assert file presence] **************************************************** task path: 
/tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-data-preservation.yml:16 Friday 17 January 2025 04:14:27 -0500 (0:00:00.344) 0:04:20.588 ******** ok: [managed-node1] => { "changed": false } MSG: All assertions passed TASK [Remove the encryption layer] ********************************************* task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/tests_luks2.yml:272 Friday 17 January 2025 04:14:27 -0500 (0:00:00.044) 0:04:20.633 ******** TASK [fedora.linux_system_roles.storage : Set platform/version specific variables] *** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:2 Friday 17 January 2025 04:14:28 -0500 (0:00:00.208) 0:04:20.842 ******** included: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml for managed-node1 TASK [fedora.linux_system_roles.storage : Ensure ansible_facts used by role] *** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:2 Friday 17 January 2025 04:14:28 -0500 (0:00:00.058) 0:04:20.901 ******** skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Set platform/version specific variables] *** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:7 Friday 17 January 2025 04:14:28 -0500 (0:00:00.047) 0:04:20.948 ******** skipping: [managed-node1] => (item=RedHat.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat.yml", "skip_reason": "Conditional result was False" } skipping: [managed-node1] => (item=CentOS.yml) => { "ansible_loop_var": "item", "changed": false, "item": "CentOS.yml", "skip_reason": "Conditional result was False" } ok: [managed-node1] => (item=CentOS_7.yml) => { "ansible_facts": { "__storage_blivet_diskvolume_mkfs_option_map": { "ext2": "-F", "ext3": "-F", "ext4": "-F" }, "blivet_package_list": [ "python-enum34", "python-blivet3", "libblockdev-crypto", "libblockdev-dm", "libblockdev-lvm", "libblockdev-mdraid", "libblockdev-swap", "{{ 'libblockdev-s390' if ansible_architecture == 's390x' else 'libblockdev' }}" ] }, "ansible_included_var_files": [ "/tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/vars/CentOS_7.yml" ], "ansible_loop_var": "item", "changed": false, "item": "CentOS_7.yml" } skipping: [managed-node1] => (item=CentOS_7.9.yml) => { "ansible_loop_var": "item", "changed": false, "item": "CentOS_7.9.yml", "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Check if system is ostree] *********** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:25 Friday 17 January 2025 04:14:28 -0500 (0:00:00.096) 0:04:21.044 ******** skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Set flag to indicate system is ostree] *** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:30 Friday 17 January 2025 04:14:28 -0500 (0:00:00.038) 0:04:21.083 ******** skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Define an empty list of pools to be used in testing] *** task path: 
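The CentOS_7.yml vars file selected above is what supplies blivet_package_list; the inline Jinja conditional swaps in libblockdev-s390 on s390x and renders only when the list is consumed, which on this x86_64 node resolves to plain libblockdev (borne out by the install results just below). Consuming the list amounts to one package transaction; a sketch, assuming the variable is passed straight to the package module:

    - name: Make sure blivet is available
      package:
        name: "{{ blivet_package_list }}"
        state: present
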
/tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:5 Friday 17 January 2025 04:14:28 -0500 (0:00:00.036) 0:04:21.120 ******** ok: [managed-node1] => { "ansible_facts": { "_storage_pools_list": [] }, "changed": false } TASK [fedora.linux_system_roles.storage : Define an empty list of volumes to be used in testing] *** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:9 Friday 17 January 2025 04:14:28 -0500 (0:00:00.038) 0:04:21.158 ******** ok: [managed-node1] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false } TASK [fedora.linux_system_roles.storage : Include the appropriate provider tasks] *** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:13 Friday 17 January 2025 04:14:28 -0500 (0:00:00.038) 0:04:21.196 ******** included: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml for managed-node1 TASK [fedora.linux_system_roles.storage : Make sure blivet is available] ******* task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:2 Friday 17 January 2025 04:14:28 -0500 (0:00:00.088) 0:04:21.285 ******** ok: [managed-node1] => { "changed": false, "rc": 0, "results": [ "python-enum34-1.0.4-1.el7.noarch providing python-enum34 is already installed", "1:python2-blivet3-3.1.3-3.el7.noarch providing python-blivet3 is already installed", "libblockdev-crypto-2.18-5.el7.x86_64 providing libblockdev-crypto is already installed", "libblockdev-dm-2.18-5.el7.x86_64 providing libblockdev-dm is already installed", "libblockdev-lvm-2.18-5.el7.x86_64 providing libblockdev-lvm is already installed", "libblockdev-mdraid-2.18-5.el7.x86_64 providing libblockdev-mdraid is already installed", "libblockdev-swap-2.18-5.el7.x86_64 providing libblockdev-swap is already installed", "libblockdev-2.18-5.el7.x86_64 providing libblockdev is already installed" ] } TASK [fedora.linux_system_roles.storage : Show storage_pools] ****************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:9 Friday 17 January 2025 04:14:32 -0500 (0:00:03.900) 0:04:25.185 ******** ok: [managed-node1] => { "storage_pools": [ { "disks": [ "sda" ], "name": "foo", "type": "partition", "volumes": [ { "encryption": false, "encryption_luks_version": "luks2", "encryption_password": "yabbadabbadoo", "mount_point": "/opt/test1", "name": "test1", "size": "4g", "type": "partition" } ] } ] } TASK [fedora.linux_system_roles.storage : Show storage_volumes] **************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:14 Friday 17 January 2025 04:14:32 -0500 (0:00:00.046) 0:04:25.232 ******** ok: [managed-node1] => { "storage_volumes": "VARIABLE IS NOT DEFINED!: 'storage_volumes' is undefined" } TASK [fedora.linux_system_roles.storage : Get required packages] *************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:19 Friday 17 January 2025 04:14:32 -0500 (0:00:00.041) 0:04:25.274 ******** ok: [managed-node1] => { "actions": [], "changed": false, "crypts": [], "leaves": [], "mounts": [], "packages": [], "pools": [], "volumes": [] } TASK [fedora.linux_system_roles.storage : Enable copr repositories if needed] *** task path: 
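The VARIABLE IS NOT DEFINED! line above is expected, not an error: this test defines only storage_pools, and the role's diagnostic task prints storage_volumes unconditionally, so debug reports the variable as undefined instead of failing the play. A sketch of the pattern (the role's actual task may differ):

    - name: Show storage_volumes
      debug:
        var: storage_volumes

With debug's var option, an undefined name renders as the VARIABLE IS NOT DEFINED! marker rather than raising an error.
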
/tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:31 Friday 17 January 2025 04:14:36 -0500 (0:00:04.035) 0:04:29.309 ******** included: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml for managed-node1 TASK [fedora.linux_system_roles.storage : Check if the COPR support packages should be installed] *** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:2 Friday 17 January 2025 04:14:36 -0500 (0:00:00.069) 0:04:29.378 ******** TASK [fedora.linux_system_roles.storage : Make sure COPR support packages are present] *** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:13 Friday 17 January 2025 04:14:36 -0500 (0:00:00.036) 0:04:29.415 ******** skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Enable COPRs] ************************ task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:19 Friday 17 January 2025 04:14:36 -0500 (0:00:00.040) 0:04:29.455 ******** TASK [fedora.linux_system_roles.storage : Make sure required packages are installed] *** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:37 Friday 17 January 2025 04:14:36 -0500 (0:00:00.036) 0:04:29.491 ******** ok: [managed-node1] => { "changed": false, "rc": 0, "results": [ "kpartx-0.4.9-136.el7_9.x86_64 providing kpartx is already installed" ] } TASK [fedora.linux_system_roles.storage : Get service facts] ******************* task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:51 Friday 17 January 2025 04:14:37 -0500 (0:00:00.638) 0:04:30.130 ******** ok: [managed-node1] => { "ansible_facts": { "services": { "NetworkManager-dispatcher.service": { "name": "NetworkManager-dispatcher.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "NetworkManager-wait-online.service": { "name": "NetworkManager-wait-online.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "NetworkManager.service": { "name": "NetworkManager.service", "source": "systemd", "state": "running", "status": "enabled" }, "arp-ethers.service": { "name": "arp-ethers.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "auditd.service": { "name": "auditd.service", "source": "systemd", "state": "running", "status": "enabled" }, "auth-rpcgss-module.service": { "name": "auth-rpcgss-module.service", "source": "systemd", "state": "stopped", "status": "static" }, "autovt@.service": { "name": "autovt@.service", "source": "systemd", "state": "unknown", "status": "enabled" }, "blivet.service": { "name": "blivet.service", "source": "systemd", "state": "inactive", "status": "static" }, "blk-availability.service": { "name": "blk-availability.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "brandbot.service": { "name": "brandbot.service", "source": "systemd", "state": "inactive", "status": "static" }, "chrony-dnssrv@.service": { "name": "chrony-dnssrv@.service", "source": "systemd", "state": "unknown", "status": "static" }, "chrony-wait.service": { "name": "chrony-wait.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "chronyd.service": { "name": "chronyd.service", 
"source": "systemd", "state": "running", "status": "enabled" }, "cloud-config.service": { "name": "cloud-config.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-final.service": { "name": "cloud-final.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-init-local.service": { "name": "cloud-init-local.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-init.service": { "name": "cloud-init.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "console-getty.service": { "name": "console-getty.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "console-shell.service": { "name": "console-shell.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "container-getty@.service": { "name": "container-getty@.service", "source": "systemd", "state": "unknown", "status": "static" }, "cpupower.service": { "name": "cpupower.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "crond.service": { "name": "crond.service", "source": "systemd", "state": "running", "status": "enabled" }, "dbus-org.freedesktop.hostname1.service": { "name": "dbus-org.freedesktop.hostname1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus-org.freedesktop.import1.service": { "name": "dbus-org.freedesktop.import1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus-org.freedesktop.locale1.service": { "name": "dbus-org.freedesktop.locale1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus-org.freedesktop.login1.service": { "name": "dbus-org.freedesktop.login1.service", "source": "systemd", "state": "active", "status": "static" }, "dbus-org.freedesktop.machine1.service": { "name": "dbus-org.freedesktop.machine1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus-org.freedesktop.nm-dispatcher.service": { "name": "dbus-org.freedesktop.nm-dispatcher.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "dbus-org.freedesktop.timedate1.service": { "name": "dbus-org.freedesktop.timedate1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus.service": { "name": "dbus.service", "source": "systemd", "state": "running", "status": "static" }, "debug-shell.service": { "name": "debug-shell.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "dm-event.service": { "name": "dm-event.service", "source": "systemd", "state": "stopped", "status": "static" }, "dmraid-activation.service": { "name": "dmraid-activation.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "dracut-cmdline.service": { "name": "dracut-cmdline.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-initqueue.service": { "name": "dracut-initqueue.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-mount.service": { "name": "dracut-mount.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-mount.service": { "name": "dracut-pre-mount.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-pivot.service": { "name": "dracut-pre-pivot.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-trigger.service": { "name": "dracut-pre-trigger.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-udev.service": { "name": 
"dracut-pre-udev.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-shutdown.service": { "name": "dracut-shutdown.service", "source": "systemd", "state": "stopped", "status": "static" }, "ebtables.service": { "name": "ebtables.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "emergency.service": { "name": "emergency.service", "source": "systemd", "state": "stopped", "status": "static" }, "firewalld.service": { "name": "firewalld.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "fstrim.service": { "name": "fstrim.service", "source": "systemd", "state": "inactive", "status": "static" }, "getty@.service": { "name": "getty@.service", "source": "systemd", "state": "unknown", "status": "enabled" }, "getty@tty1.service": { "name": "getty@tty1.service", "source": "systemd", "state": "running", "status": "unknown" }, "gssproxy.service": { "name": "gssproxy.service", "source": "systemd", "state": "running", "status": "disabled" }, "halt-local.service": { "name": "halt-local.service", "source": "systemd", "state": "inactive", "status": "static" }, "initrd-cleanup.service": { "name": "initrd-cleanup.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-parse-etc.service": { "name": "initrd-parse-etc.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-switch-root.service": { "name": "initrd-switch-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-udevadm-cleanup-db.service": { "name": "initrd-udevadm-cleanup-db.service", "source": "systemd", "state": "stopped", "status": "static" }, "iprdump.service": { "name": "iprdump.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "iprinit.service": { "name": "iprinit.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "iprupdate.service": { "name": "iprupdate.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "irqbalance.service": { "name": "irqbalance.service", "source": "systemd", "state": "running", "status": "enabled" }, "kdump.service": { "name": "kdump.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "kmod-static-nodes.service": { "name": "kmod-static-nodes.service", "source": "systemd", "state": "stopped", "status": "static" }, "lvm2-lvmetad.service": { "name": "lvm2-lvmetad.service", "source": "systemd", "state": "running", "status": "static" }, "lvm2-lvmpolld.service": { "name": "lvm2-lvmpolld.service", "source": "systemd", "state": "stopped", "status": "static" }, "lvm2-monitor.service": { "name": "lvm2-monitor.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "lvm2-pvscan@.service": { "name": "lvm2-pvscan@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdadm-grow-continue@.service": { "name": "mdadm-grow-continue@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdadm-last-resort@.service": { "name": "mdadm-last-resort@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdcheck_continue.service": { "name": "mdcheck_continue.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdcheck_start.service": { "name": "mdcheck_start.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdmon@.service": { "name": "mdmon@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdmonitor-oneshot.service": { "name": 
"mdmonitor-oneshot.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdmonitor.service": { "name": "mdmonitor.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "messagebus.service": { "name": "messagebus.service", "source": "systemd", "state": "active", "status": "static" }, "microcode.service": { "name": "microcode.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "netconsole": { "name": "netconsole", "source": "sysv", "state": "stopped", "status": "disabled" }, "network": { "name": "network", "source": "sysv", "state": "running", "status": "enabled" }, "network.service": { "name": "network.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "nfs-blkmap.service": { "name": "nfs-blkmap.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfs-config.service": { "name": "nfs-config.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-idmap.service": { "name": "nfs-idmap.service", "source": "systemd", "state": "inactive", "status": "static" }, "nfs-idmapd.service": { "name": "nfs-idmapd.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-lock.service": { "name": "nfs-lock.service", "source": "systemd", "state": "inactive", "status": "static" }, "nfs-mountd.service": { "name": "nfs-mountd.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-rquotad.service": { "name": "nfs-rquotad.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfs-secure.service": { "name": "nfs-secure.service", "source": "systemd", "state": "inactive", "status": "static" }, "nfs-server.service": { "name": "nfs-server.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "nfs-utils.service": { "name": "nfs-utils.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs.service": { "name": "nfs.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfslock.service": { "name": "nfslock.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-halt.service": { "name": "plymouth-halt.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "plymouth-kexec.service": { "name": "plymouth-kexec.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "plymouth-poweroff.service": { "name": "plymouth-poweroff.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "plymouth-quit-wait.service": { "name": "plymouth-quit-wait.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "plymouth-quit.service": { "name": "plymouth-quit.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "plymouth-read-write.service": { "name": "plymouth-read-write.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "plymouth-reboot.service": { "name": "plymouth-reboot.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "plymouth-start.service": { "name": "plymouth-start.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "plymouth-switch-root.service": { "name": "plymouth-switch-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "polkit.service": { "name": "polkit.service", "source": "systemd", "state": "running", "status": "static" }, "postfix.service": { "name": "postfix.service", "source": "systemd", "state": "running", "status": "enabled" }, 
"qemu-guest-agent.service": { "name": "qemu-guest-agent.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "quotaon.service": { "name": "quotaon.service", "source": "systemd", "state": "inactive", "status": "static" }, "rc-local.service": { "name": "rc-local.service", "source": "systemd", "state": "stopped", "status": "static" }, "rdisc.service": { "name": "rdisc.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rescue.service": { "name": "rescue.service", "source": "systemd", "state": "stopped", "status": "static" }, "restraintd.service": { "name": "restraintd.service", "source": "systemd", "state": "running", "status": "enabled" }, "rhel-autorelabel-mark.service": { "name": "rhel-autorelabel-mark.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "rhel-autorelabel.service": { "name": "rhel-autorelabel.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "rhel-configure.service": { "name": "rhel-configure.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "rhel-dmesg.service": { "name": "rhel-dmesg.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "rhel-domainname.service": { "name": "rhel-domainname.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "rhel-import-state.service": { "name": "rhel-import-state.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "rhel-loadmodules.service": { "name": "rhel-loadmodules.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "rhel-readonly.service": { "name": "rhel-readonly.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "rngd.service": { "name": "rngd.service", "source": "systemd", "state": "running", "status": "enabled" }, "rpc-gssd.service": { "name": "rpc-gssd.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-rquotad.service": { "name": "rpc-rquotad.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rpc-statd-notify.service": { "name": "rpc-statd-notify.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-statd.service": { "name": "rpc-statd.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpcbind.service": { "name": "rpcbind.service", "source": "systemd", "state": "running", "status": "enabled" }, "rpcgssd.service": { "name": "rpcgssd.service", "source": "systemd", "state": "inactive", "status": "static" }, "rpcidmapd.service": { "name": "rpcidmapd.service", "source": "systemd", "state": "inactive", "status": "static" }, "rsyncd.service": { "name": "rsyncd.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rsyncd@.service": { "name": "rsyncd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "rsyslog.service": { "name": "rsyslog.service", "source": "systemd", "state": "running", "status": "enabled" }, "selinux-policy-migrate-local-changes@.service": { "name": "selinux-policy-migrate-local-changes@.service", "source": "systemd", "state": "unknown", "status": "static" }, "selinux-policy-migrate-local-changes@targeted.service": { "name": "selinux-policy-migrate-local-changes@targeted.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "serial-getty@.service": { "name": "serial-getty@.service", "source": "systemd", "state": "unknown", "status": "disabled" }, "serial-getty@ttyS0.service": { "name": "serial-getty@ttyS0.service", "source": 
"systemd", "state": "running", "status": "unknown" }, "sshd-keygen.service": { "name": "sshd-keygen.service", "source": "systemd", "state": "stopped", "status": "static" }, "sshd.service": { "name": "sshd.service", "source": "systemd", "state": "running", "status": "enabled" }, "sshd@.service": { "name": "sshd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-ask-password-console.service": { "name": "systemd-ask-password-console.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-ask-password-plymouth.service": { "name": "systemd-ask-password-plymouth.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-ask-password-wall.service": { "name": "systemd-ask-password-wall.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-backlight@.service": { "name": "systemd-backlight@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-binfmt.service": { "name": "systemd-binfmt.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-bootchart.service": { "name": "systemd-bootchart.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "systemd-cryptsetup@luks\\x2d35e15af6\\x2d60a8\\x2d41b2\\x2d94e6\\x2d05460ad82662.service": { "name": "systemd-cryptsetup@luks\\x2d35e15af6\\x2d60a8\\x2d41b2\\x2d94e6\\x2d05460ad82662.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "systemd-firstboot.service": { "name": "systemd-firstboot.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-fsck-root.service": { "name": "systemd-fsck-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-fsck@.service": { "name": "systemd-fsck@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-halt.service": { "name": "systemd-halt.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hibernate-resume@.service": { "name": "systemd-hibernate-resume@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-hibernate.service": { "name": "systemd-hibernate.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hostnamed.service": { "name": "systemd-hostnamed.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hwdb-update.service": { "name": "systemd-hwdb-update.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-hybrid-sleep.service": { "name": "systemd-hybrid-sleep.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-importd.service": { "name": "systemd-importd.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-initctl.service": { "name": "systemd-initctl.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journal-catalog-update.service": { "name": "systemd-journal-catalog-update.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journal-flush.service": { "name": "systemd-journal-flush.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journald.service": { "name": "systemd-journald.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-kexec.service": { "name": "systemd-kexec.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-localed.service": { "name": "systemd-localed.service", 
"source": "systemd", "state": "inactive", "status": "static" }, "systemd-logind.service": { "name": "systemd-logind.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-machine-id-commit.service": { "name": "systemd-machine-id-commit.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-machined.service": { "name": "systemd-machined.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-modules-load.service": { "name": "systemd-modules-load.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-nspawn@.service": { "name": "systemd-nspawn@.service", "source": "systemd", "state": "unknown", "status": "disabled" }, "systemd-poweroff.service": { "name": "systemd-poweroff.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-quotacheck.service": { "name": "systemd-quotacheck.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-random-seed.service": { "name": "systemd-random-seed.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-readahead-collect.service": { "name": "systemd-readahead-collect.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "systemd-readahead-done.service": { "name": "systemd-readahead-done.service", "source": "systemd", "state": "stopped", "status": "indirect" }, "systemd-readahead-drop.service": { "name": "systemd-readahead-drop.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "systemd-readahead-replay.service": { "name": "systemd-readahead-replay.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "systemd-reboot.service": { "name": "systemd-reboot.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-remount-fs.service": { "name": "systemd-remount-fs.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-rfkill@.service": { "name": "systemd-rfkill@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-shutdownd.service": { "name": "systemd-shutdownd.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-suspend.service": { "name": "systemd-suspend.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-sysctl.service": { "name": "systemd-sysctl.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-timedated.service": { "name": "systemd-timedated.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-tmpfiles-clean.service": { "name": "systemd-tmpfiles-clean.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles-setup-dev.service": { "name": "systemd-tmpfiles-setup-dev.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles-setup.service": { "name": "systemd-tmpfiles-setup.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udev-settle.service": { "name": "systemd-udev-settle.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udev-trigger.service": { "name": "systemd-udev-trigger.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udevd.service": { "name": "systemd-udevd.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-update-done.service": { "name": "systemd-update-done.service", "source": "systemd", "state": "stopped", 
"status": "static" }, "systemd-update-utmp-runlevel.service": { "name": "systemd-update-utmp-runlevel.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-update-utmp.service": { "name": "systemd-update-utmp.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-user-sessions.service": { "name": "systemd-user-sessions.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-vconsole-setup.service": { "name": "systemd-vconsole-setup.service", "source": "systemd", "state": "stopped", "status": "static" }, "target.service": { "name": "target.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "targetclid.service": { "name": "targetclid.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "teamd@.service": { "name": "teamd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "tuned.service": { "name": "tuned.service", "source": "systemd", "state": "running", "status": "enabled" }, "wpa_supplicant.service": { "name": "wpa_supplicant.service", "source": "systemd", "state": "inactive", "status": "disabled" } } }, "changed": false } TASK [fedora.linux_system_roles.storage : Set storage_cryptsetup_services] ***** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:57 Friday 17 January 2025 04:14:38 -0500 (0:00:00.944) 0:04:31.075 ******** ok: [managed-node1] => { "ansible_facts": { "storage_cryptsetup_services": [ "systemd-cryptsetup@luks\\x2d35e15af6\\x2d60a8\\x2d41b2\\x2d94e6\\x2d05460ad82662.service" ] }, "changed": false } TASK [fedora.linux_system_roles.storage : Mask the systemd cryptsetup services] *** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:63 Friday 17 January 2025 04:14:38 -0500 (0:00:00.056) 0:04:31.131 ******** changed: [managed-node1] => (item=systemd-cryptsetup@luks\x2d35e15af6\x2d60a8\x2d41b2\x2d94e6\x2d05460ad82662.service) => { "ansible_loop_var": "item", "changed": true, "item": "systemd-cryptsetup@luks\\x2d35e15af6\\x2d60a8\\x2d41b2\\x2d94e6\\x2d05460ad82662.service", "name": "systemd-cryptsetup@luks\\x2d35e15af6\\x2d60a8\\x2d41b2\\x2d94e6\\x2d05460ad82662.service", "status": { "ActiveEnterTimestampMonotonic": "0", "ActiveExitTimestampMonotonic": "0", "ActiveState": "inactive", "After": "dev-sda1.device systemd-journald.socket cryptsetup-pre.target system-systemd\\x2dcryptsetup.slice systemd-readahead-replay.service systemd-readahead-collect.service", "AllowIsolate": "no", "AmbientCapabilities": "0", "AssertResult": "no", "AssertTimestampMonotonic": "0", "Before": "cryptsetup.target umount.target", "BindsTo": "dev-sda1.device", "BlockIOAccounting": "no", "BlockIOWeight": "18446744073709551615", "CPUAccounting": "no", "CPUQuotaPerSecUSec": "infinity", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUSchedulingResetOnFork": "no", "CPUShares": "18446744073709551615", "CanIsolate": "no", "CanReload": "no", "CanStart": "yes", "CanStop": "yes", "CapabilityBoundingSet": "18446744073709551615", "CollectMode": "inactive", "ConditionResult": "no", "ConditionTimestampMonotonic": "0", "Conflicts": "umount.target", "ControlPID": "0", "DefaultDependencies": "no", "Delegate": "no", "Description": "Cryptography Setup for luks-35e15af6-60a8-41b2-94e6-05460ad82662", "DevicePolicy": "auto", "Documentation": "man:crypttab(5) man:systemd-cryptsetup-generator(8) man:systemd-cryptsetup@.service(8)", 
"ExecMainCode": "0", "ExecMainExitTimestampMonotonic": "0", "ExecMainPID": "0", "ExecMainStartTimestampMonotonic": "0", "ExecMainStatus": "0", "ExecStart": "{ path=/usr/lib/systemd/systemd-cryptsetup ; argv[]=/usr/lib/systemd/systemd-cryptsetup attach luks-35e15af6-60a8-41b2-94e6-05460ad82662 /dev/sda1 - ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecStop": "{ path=/usr/lib/systemd/systemd-cryptsetup ; argv[]=/usr/lib/systemd/systemd-cryptsetup detach luks-35e15af6-60a8-41b2-94e6-05460ad82662 ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "FailureAction": "none", "FileDescriptorStoreMax": "0", "FragmentPath": "/run/systemd/generator/systemd-cryptsetup@luks\\x2d35e15af6\\x2d60a8\\x2d41b2\\x2d94e6\\x2d05460ad82662.service", "GuessMainPID": "yes", "IOScheduling": "0", "Id": "systemd-cryptsetup@luks\\x2d35e15af6\\x2d60a8\\x2d41b2\\x2d94e6\\x2d05460ad82662.service", "IgnoreOnIsolate": "yes", "IgnoreOnSnapshot": "no", "IgnoreSIGPIPE": "yes", "InactiveEnterTimestampMonotonic": "0", "InactiveExitTimestampMonotonic": "0", "JobTimeoutAction": "none", "JobTimeoutUSec": "0", "KillMode": "control-group", "KillSignal": "15", "LimitAS": "18446744073709551615", "LimitCORE": "18446744073709551615", "LimitCPU": "18446744073709551615", "LimitDATA": "18446744073709551615", "LimitFSIZE": "18446744073709551615", "LimitLOCKS": "18446744073709551615", "LimitMEMLOCK": "65536", "LimitMSGQUEUE": "819200", "LimitNICE": "0", "LimitNOFILE": "4096", "LimitNPROC": "14311", "LimitRSS": "18446744073709551615", "LimitRTPRIO": "0", "LimitRTTIME": "18446744073709551615", "LimitSIGPENDING": "14311", "LimitSTACK": "18446744073709551615", "LoadState": "loaded", "MainPID": "0", "MemoryAccounting": "no", "MemoryCurrent": "18446744073709551615", "MemoryLimit": "18446744073709551615", "MountFlags": "0", "Names": "systemd-cryptsetup@luks\\x2d35e15af6\\x2d60a8\\x2d41b2\\x2d94e6\\x2d05460ad82662.service", "NeedDaemonReload": "no", "Nice": "0", "NoNewPrivileges": "no", "NonBlocking": "no", "NotifyAccess": "none", "OOMScoreAdjust": "0", "OnFailureJobMode": "replace", "PermissionsStartOnly": "no", "PrivateDevices": "no", "PrivateNetwork": "no", "PrivateTmp": "no", "ProtectHome": "no", "ProtectSystem": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "RemainAfterExit": "yes", "RequiredBy": "cryptsetup.target dev-mapper-luks\\x2d35e15af6\\x2d60a8\\x2d41b2\\x2d94e6\\x2d05460ad82662.device", "Requires": "system-systemd\\x2dcryptsetup.slice", "Restart": "no", "RestartUSec": "100ms", "Result": "success", "RootDirectoryStartOnly": "no", "RuntimeDirectoryMode": "0755", "SameProcessGroup": "no", "SecureBits": "0", "SendSIGHUP": "no", "SendSIGKILL": "yes", "Slice": "system-systemd\\x2dcryptsetup.slice", "SourcePath": "/etc/crypttab", "StandardError": "inherit", "StandardInput": "null", "StandardOutput": "journal", "StartLimitAction": "none", "StartLimitBurst": "5", "StartLimitInterval": "10000000", "StartupBlockIOWeight": "18446744073709551615", "StartupCPUShares": "18446744073709551615", "StatusErrno": "0", "StopWhenUnneeded": "no", "SubState": "dead", "SyslogLevelPrefix": "yes", "SyslogPriority": "30", "SystemCallErrorNumber": "0", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "TasksAccounting": "no", "TasksCurrent": "18446744073709551615", "TasksMax": "18446744073709551615", "TimeoutStartUSec": "0", "TimeoutStopUSec": "0", "TimerSlackNSec": "50000", "Transient": "no", "Type": "oneshot", "UMask": "0022", 
"UnitFilePreset": "disabled", "UnitFileState": "bad", "WantedBy": "dev-sda1.device", "WatchdogTimestampMonotonic": "0", "WatchdogUSec": "0" } } TASK [fedora.linux_system_roles.storage : Manage the pools and volumes to match the specified state] *** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:69 Friday 17 January 2025 04:14:38 -0500 (0:00:00.490) 0:04:31.622 ******** changed: [managed-node1] => { "actions": [ { "action": "destroy format", "device": "/dev/mapper/luks-35e15af6-60a8-41b2-94e6-05460ad82662", "fs_type": "xfs" }, { "action": "destroy device", "device": "/dev/mapper/luks-35e15af6-60a8-41b2-94e6-05460ad82662", "fs_type": null }, { "action": "destroy format", "device": "/dev/sda1", "fs_type": "luks" }, { "action": "create format", "device": "/dev/sda1", "fs_type": "xfs" } ], "changed": true, "crypts": [ { "backing_device": "/dev/sda1", "name": "luks-35e15af6-60a8-41b2-94e6-05460ad82662", "password": "-", "state": "absent" } ], "leaves": [ "/dev/sda1", "/dev/sdb", "/dev/sdc", "/dev/sdd", "/dev/sde", "/dev/sdf", "/dev/sdg", "/dev/sdh", "/dev/sdi", "/dev/xvda1" ], "mounts": [ { "fstype": "xfs", "path": "/opt/test1", "src": "/dev/mapper/luks-35e15af6-60a8-41b2-94e6-05460ad82662", "state": "absent" }, { "dump": 0, "fstype": "xfs", "group": null, "mode": null, "opts": "defaults", "owner": null, "passno": 0, "path": "/opt/test1", "src": "UUID=173e2a19-a65c-4bfd-a311-0de3d72abddf", "state": "mounted" } ], "packages": [ "xfsprogs", "e2fsprogs" ], "pools": [ { "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_clevis_pin": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "encryption_tang_thumbprint": null, "encryption_tang_url": null, "grow_to_fill": false, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "shared": false, "state": "present", "type": "partition", "volumes": [ { "_device": "/dev/sda1", "_kernel_device": "/dev/sda1", "_mount_id": "UUID=173e2a19-a65c-4bfd-a311-0de3d72abddf", "_raw_device": "/dev/sda1", "_raw_kernel_device": "/dev/sda1", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": 0, "encryption_luks_version": "luks2", "encryption_password": "VALUE_SPECIFIED_IN_NO_LOG_PARAMETER", "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "mount_user": null, "name": "test1", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": "4g", "state": "present", "thin": false, "thin_pool_name": null, "thin_pool_size": null, "type": "partition", "vdo_pool_size": null } ] } ], "volumes": [] } TASK [fedora.linux_system_roles.storage : Workaround for udev issue on some platforms] *** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:83 Friday 17 January 2025 04:14:43 -0500 (0:00:04.528) 0:04:36.150 ******** skipping: [managed-node1] => { "changed": false, 
"skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Check if /etc/fstab is present] ****** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:90 Friday 17 January 2025 04:14:43 -0500 (0:00:00.036) 0:04:36.187 ******** ok: [managed-node1] => { "changed": false, "stat": { "atime": 1737105233.573941, "attr_flags": "e", "attributes": [ "extents" ], "block_size": 4096, "blocks": 8, "charset": "us-ascii", "checksum": "3d0a7870d46b0d5b054c9829ba613b86802c4d91", "ctime": 1737105233.570941, "dev": 51713, "device_type": 0, "executable": false, "exists": true, "gid": 0, "gr_name": "root", "inode": 263644, "isblk": false, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": true, "issock": false, "isuid": false, "mimetype": "text/plain", "mode": "0644", "mtime": 1737105233.570941, "nlink": 1, "path": "/etc/fstab", "pw_name": "root", "readable": true, "rgrp": true, "roth": true, "rusr": true, "size": 1311, "uid": 0, "version": "18446744072031193646", "wgrp": false, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [fedora.linux_system_roles.storage : Add fingerprint to /etc/fstab if present] *** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:95 Friday 17 January 2025 04:14:43 -0500 (0:00:00.333) 0:04:36.520 ******** ok: [managed-node1] => { "backup": "", "changed": false } TASK [fedora.linux_system_roles.storage : Unmask the systemd cryptsetup services] *** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:113 Friday 17 January 2025 04:14:44 -0500 (0:00:00.338) 0:04:36.859 ******** changed: [managed-node1] => (item=systemd-cryptsetup@luks\x2d35e15af6\x2d60a8\x2d41b2\x2d94e6\x2d05460ad82662.service) => { "ansible_loop_var": "item", "changed": true, "item": "systemd-cryptsetup@luks\\x2d35e15af6\\x2d60a8\\x2d41b2\\x2d94e6\\x2d05460ad82662.service", "name": "systemd-cryptsetup@luks\\x2d35e15af6\\x2d60a8\\x2d41b2\\x2d94e6\\x2d05460ad82662.service", "status": { "ActiveEnterTimestampMonotonic": "0", "ActiveExitTimestampMonotonic": "0", "ActiveState": "inactive", "AllowIsolate": "no", "AmbientCapabilities": "0", "AssertResult": "no", "AssertTimestampMonotonic": "0", "BlockIOAccounting": "no", "BlockIOWeight": "18446744073709551615", "CPUAccounting": "no", "CPUQuotaPerSecUSec": "infinity", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUSchedulingResetOnFork": "no", "CPUShares": "18446744073709551615", "CanIsolate": "no", "CanReload": "no", "CanStart": "yes", "CanStop": "yes", "CapabilityBoundingSet": "18446744073709551615", "CollectMode": "inactive", "ConditionResult": "no", "ConditionTimestampMonotonic": "0", "ControlPID": "0", "DefaultDependencies": "yes", "Delegate": "no", "Description": "systemd-cryptsetup@luks\\x2d35e15af6\\x2d60a8\\x2d41b2\\x2d94e6\\x2d05460ad82662.service", "DevicePolicy": "auto", "ExecMainCode": "0", "ExecMainExitTimestampMonotonic": "0", "ExecMainPID": "0", "ExecMainStartTimestampMonotonic": "0", "ExecMainStatus": "0", "FailureAction": "none", "FileDescriptorStoreMax": "0", "FragmentPath": "/dev/null", "GuessMainPID": "yes", "IOScheduling": "0", "Id": "systemd-cryptsetup@luks\\x2d35e15af6\\x2d60a8\\x2d41b2\\x2d94e6\\x2d05460ad82662.service", "IgnoreOnIsolate": "no", "IgnoreOnSnapshot": "no", "IgnoreSIGPIPE": "yes", "InactiveEnterTimestampMonotonic": "0", 
"InactiveExitTimestampMonotonic": "0", "JobTimeoutAction": "none", "JobTimeoutUSec": "0", "KillMode": "control-group", "KillSignal": "15", "LimitAS": "18446744073709551615", "LimitCORE": "18446744073709551615", "LimitCPU": "18446744073709551615", "LimitDATA": "18446744073709551615", "LimitFSIZE": "18446744073709551615", "LimitLOCKS": "18446744073709551615", "LimitMEMLOCK": "65536", "LimitMSGQUEUE": "819200", "LimitNICE": "0", "LimitNOFILE": "1048576", "LimitNPROC": "14311", "LimitRSS": "18446744073709551615", "LimitRTPRIO": "0", "LimitRTTIME": "18446744073709551615", "LimitSIGPENDING": "14311", "LimitSTACK": "18446744073709551615", "LoadState": "masked", "MainPID": "0", "MemoryAccounting": "no", "MemoryCurrent": "18446744073709551615", "MemoryLimit": "18446744073709551615", "MountFlags": "0", "Names": "systemd-cryptsetup@luks\\x2d35e15af6\\x2d60a8\\x2d41b2\\x2d94e6\\x2d05460ad82662.service", "NeedDaemonReload": "no", "Nice": "0", "NoNewPrivileges": "no", "NonBlocking": "no", "NotifyAccess": "none", "OOMScoreAdjust": "0", "OnFailureJobMode": "replace", "PermissionsStartOnly": "no", "PrivateDevices": "no", "PrivateNetwork": "no", "PrivateTmp": "no", "ProtectHome": "no", "ProtectSystem": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "RemainAfterExit": "no", "RequiredBy": "dev-mapper-luks\\x2d35e15af6\\x2d60a8\\x2d41b2\\x2d94e6\\x2d05460ad82662.device cryptsetup.target", "Restart": "no", "RestartUSec": "100ms", "Result": "success", "RootDirectoryStartOnly": "no", "RuntimeDirectoryMode": "0755", "SameProcessGroup": "no", "SecureBits": "0", "SendSIGHUP": "no", "SendSIGKILL": "yes", "StandardError": "inherit", "StandardInput": "null", "StandardOutput": "inherit", "StartLimitAction": "none", "StartLimitBurst": "5", "StartLimitInterval": "10000000", "StartupBlockIOWeight": "18446744073709551615", "StartupCPUShares": "18446744073709551615", "StatusErrno": "0", "StopWhenUnneeded": "no", "SubState": "dead", "SyslogLevelPrefix": "yes", "SyslogPriority": "30", "SystemCallErrorNumber": "0", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "TasksAccounting": "no", "TasksCurrent": "18446744073709551615", "TasksMax": "18446744073709551615", "TimeoutStartUSec": "1min 30s", "TimeoutStopUSec": "1min 30s", "TimerSlackNSec": "50000", "Transient": "no", "UMask": "0022", "UnitFileState": "bad", "WantedBy": "dev-sda1.device", "WatchdogTimestampMonotonic": "0", "WatchdogUSec": "0" } } TASK [fedora.linux_system_roles.storage : Show blivet_output] ****************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:119 Friday 17 January 2025 04:14:44 -0500 (0:00:00.492) 0:04:37.351 ******** ok: [managed-node1] => { "blivet_output": { "actions": [ { "action": "destroy format", "device": "/dev/mapper/luks-35e15af6-60a8-41b2-94e6-05460ad82662", "fs_type": "xfs" }, { "action": "destroy device", "device": "/dev/mapper/luks-35e15af6-60a8-41b2-94e6-05460ad82662", "fs_type": null }, { "action": "destroy format", "device": "/dev/sda1", "fs_type": "luks" }, { "action": "create format", "device": "/dev/sda1", "fs_type": "xfs" } ], "changed": true, "crypts": [ { "backing_device": "/dev/sda1", "name": "luks-35e15af6-60a8-41b2-94e6-05460ad82662", "password": "-", "state": "absent" } ], "failed": false, "leaves": [ "/dev/sda1", "/dev/sdb", "/dev/sdc", "/dev/sdd", "/dev/sde", "/dev/sdf", "/dev/sdg", "/dev/sdh", "/dev/sdi", "/dev/xvda1" ], "mounts": [ { "fstype": "xfs", "path": "/opt/test1", "src": 
"/dev/mapper/luks-35e15af6-60a8-41b2-94e6-05460ad82662", "state": "absent" }, { "dump": 0, "fstype": "xfs", "group": null, "mode": null, "opts": "defaults", "owner": null, "passno": 0, "path": "/opt/test1", "src": "UUID=173e2a19-a65c-4bfd-a311-0de3d72abddf", "state": "mounted" } ], "packages": [ "xfsprogs", "e2fsprogs" ], "pools": [ { "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_clevis_pin": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "encryption_tang_thumbprint": null, "encryption_tang_url": null, "grow_to_fill": false, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "shared": false, "state": "present", "type": "partition", "volumes": [ { "_device": "/dev/sda1", "_kernel_device": "/dev/sda1", "_mount_id": "UUID=173e2a19-a65c-4bfd-a311-0de3d72abddf", "_raw_device": "/dev/sda1", "_raw_kernel_device": "/dev/sda1", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": 0, "encryption_luks_version": "luks2", "encryption_password": "VALUE_SPECIFIED_IN_NO_LOG_PARAMETER", "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "mount_user": null, "name": "test1", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": "4g", "state": "present", "thin": false, "thin_pool_name": null, "thin_pool_size": null, "type": "partition", "vdo_pool_size": null } ] } ], "volumes": [] } } TASK [fedora.linux_system_roles.storage : Set the list of pools for test verification] *** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:128 Friday 17 January 2025 04:14:44 -0500 (0:00:00.052) 0:04:37.404 ******** ok: [managed-node1] => { "ansible_facts": { "_storage_pools_list": [ { "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_clevis_pin": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "encryption_tang_thumbprint": null, "encryption_tang_url": null, "grow_to_fill": false, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "shared": false, "state": "present", "type": "partition", "volumes": [ { "_device": "/dev/sda1", "_kernel_device": "/dev/sda1", "_mount_id": "UUID=173e2a19-a65c-4bfd-a311-0de3d72abddf", "_raw_device": "/dev/sda1", "_raw_kernel_device": "/dev/sda1", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": 0, "encryption_luks_version": "luks2", "encryption_password": "VALUE_SPECIFIED_IN_NO_LOG_PARAMETER", "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, 
"mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "mount_user": null, "name": "test1", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": "4g", "state": "present", "thin": false, "thin_pool_name": null, "thin_pool_size": null, "type": "partition", "vdo_pool_size": null } ] } ] }, "changed": false } TASK [fedora.linux_system_roles.storage : Set the list of volumes for test verification] *** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:132 Friday 17 January 2025 04:14:44 -0500 (0:00:00.048) 0:04:37.452 ******** ok: [managed-node1] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false } TASK [fedora.linux_system_roles.storage : Remove obsolete mounts] ************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:148 Friday 17 January 2025 04:14:44 -0500 (0:00:00.043) 0:04:37.496 ******** changed: [managed-node1] => (item={u'src': u'/dev/mapper/luks-35e15af6-60a8-41b2-94e6-05460ad82662', u'state': u'absent', u'fstype': u'xfs', u'path': u'/opt/test1'}) => { "ansible_loop_var": "mount_info", "changed": true, "dump": "0", "fstab": "/etc/fstab", "fstype": "xfs", "mount_info": { "fstype": "xfs", "path": "/opt/test1", "src": "/dev/mapper/luks-35e15af6-60a8-41b2-94e6-05460ad82662", "state": "absent" }, "name": "/opt/test1", "opts": "defaults", "passno": "0", "src": "/dev/mapper/luks-35e15af6-60a8-41b2-94e6-05460ad82662" } TASK [fedora.linux_system_roles.storage : Tell systemd to refresh its view of /etc/fstab] *** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:159 Friday 17 January 2025 04:14:45 -0500 (0:00:00.345) 0:04:37.842 ******** ok: [managed-node1] => { "changed": false, "name": null, "status": {} } TASK [fedora.linux_system_roles.storage : Set up new/current mounts] *********** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:164 Friday 17 January 2025 04:14:45 -0500 (0:00:00.462) 0:04:38.304 ******** changed: [managed-node1] => (item={u'src': u'UUID=173e2a19-a65c-4bfd-a311-0de3d72abddf', u'group': None, u'dump': 0, u'passno': 0, u'fstype': u'xfs', u'state': u'mounted', u'mode': None, u'owner': None, u'path': u'/opt/test1', u'opts': u'defaults'}) => { "ansible_loop_var": "mount_info", "changed": true, "dump": "0", "fstab": "/etc/fstab", "fstype": "xfs", "mount_info": { "dump": 0, "fstype": "xfs", "group": null, "mode": null, "opts": "defaults", "owner": null, "passno": 0, "path": "/opt/test1", "src": "UUID=173e2a19-a65c-4bfd-a311-0de3d72abddf", "state": "mounted" }, "name": "/opt/test1", "opts": "defaults", "passno": "0", "src": "UUID=173e2a19-a65c-4bfd-a311-0de3d72abddf" } TASK [fedora.linux_system_roles.storage : Manage mount ownership/permissions] *** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:175 Friday 17 January 2025 04:14:45 -0500 (0:00:00.355) 0:04:38.659 ******** skipping: [managed-node1] => (item={u'src': u'UUID=173e2a19-a65c-4bfd-a311-0de3d72abddf', u'group': None, u'dump': 0, u'passno': 0, u'fstype': u'xfs', u'state': u'mounted', u'mode': None, u'owner': None, u'path': u'/opt/test1', u'opts': u'defaults'}) => { "ansible_loop_var": "mount_info", "changed": 
false, "mount_info": { "dump": 0, "fstype": "xfs", "group": null, "mode": null, "opts": "defaults", "owner": null, "passno": 0, "path": "/opt/test1", "src": "UUID=173e2a19-a65c-4bfd-a311-0de3d72abddf", "state": "mounted" }, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Tell systemd to refresh its view of /etc/fstab] *** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:187 Friday 17 January 2025 04:14:46 -0500 (0:00:00.048) 0:04:38.708 ******** ok: [managed-node1] => { "changed": false, "name": null, "status": {} } TASK [fedora.linux_system_roles.storage : Retrieve facts for the /etc/crypttab file] *** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:195 Friday 17 January 2025 04:14:46 -0500 (0:00:00.453) 0:04:39.161 ******** ok: [managed-node1] => { "changed": false, "stat": { "atime": 1737105238.0369449, "attr_flags": "e", "attributes": [ "extents" ], "block_size": 4096, "blocks": 8, "charset": "us-ascii", "checksum": "19a783fb1c2b020cce649254e13851bd056d3631", "ctime": 1737105235.708943, "dev": 51713, "device_type": 0, "executable": false, "exists": true, "gid": 0, "gr_name": "root", "inode": 263646, "isblk": false, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": true, "issock": false, "isuid": false, "mimetype": "text/plain", "mode": "0600", "mtime": 1737105235.7079427, "nlink": 1, "path": "/etc/crypttab", "pw_name": "root", "readable": true, "rgrp": false, "roth": false, "rusr": true, "size": 54, "uid": 0, "version": "18446744072031194148", "wgrp": false, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [fedora.linux_system_roles.storage : Manage /etc/crypttab to account for changes we just made] *** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:200 Friday 17 January 2025 04:14:46 -0500 (0:00:00.325) 0:04:39.487 ******** changed: [managed-node1] => (item={u'state': u'absent', u'password': u'-', u'name': u'luks-35e15af6-60a8-41b2-94e6-05460ad82662', u'backing_device': u'/dev/sda1'}) => { "ansible_loop_var": "entry", "backup": "", "changed": true, "entry": { "backing_device": "/dev/sda1", "name": "luks-35e15af6-60a8-41b2-94e6-05460ad82662", "password": "-", "state": "absent" }, "found": 1 } MSG: 1 line(s) removed TASK [fedora.linux_system_roles.storage : Update facts] ************************ task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:222 Friday 17 January 2025 04:14:47 -0500 (0:00:00.348) 0:04:39.835 ******** ok: [managed-node1] TASK [Verify role results] ***************************************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/tests_luks2.yml:290 Friday 17 January 2025 04:14:47 -0500 (0:00:00.673) 0:04:40.509 ******** included: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml for managed-node1 TASK [Print out pool information] ********************************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:2 Friday 17 January 2025 04:14:47 -0500 (0:00:00.102) 0:04:40.612 ******** ok: [managed-node1] => { "_storage_pools_list": [ { "disks": [ "sda" ], "encryption": false, 
"encryption_cipher": null, "encryption_clevis_pin": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "encryption_tang_thumbprint": null, "encryption_tang_url": null, "grow_to_fill": false, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "shared": false, "state": "present", "type": "partition", "volumes": [ { "_device": "/dev/sda1", "_kernel_device": "/dev/sda1", "_mount_id": "UUID=173e2a19-a65c-4bfd-a311-0de3d72abddf", "_raw_device": "/dev/sda1", "_raw_kernel_device": "/dev/sda1", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": 0, "encryption_luks_version": "luks2", "encryption_password": "VALUE_SPECIFIED_IN_NO_LOG_PARAMETER", "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "mount_user": null, "name": "test1", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": "4g", "state": "present", "thin": false, "thin_pool_name": null, "thin_pool_size": null, "type": "partition", "vdo_pool_size": null } ] } ] } TASK [Print out volume information] ******************************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:7 Friday 17 January 2025 04:14:47 -0500 (0:00:00.050) 0:04:40.663 ******** skipping: [managed-node1] => {} TASK [Collect info about the volumes.] 
TASK [Collect info about the volumes.] ***************************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:15 Friday 17 January 2025 04:14:48 -0500 (0:00:00.037) 0:04:40.701 ******** ok: [managed-node1] => { "changed": false, "info": { "/dev/sda": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sda", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sda1": { "fstype": "xfs", "label": "", "mountpoint": "/opt/test1", "name": "/dev/sda1", "size": "10G", "type": "partition", "uuid": "173e2a19-a65c-4bfd-a311-0de3d72abddf" }, "/dev/sdb": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdb", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdc": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdc", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdd": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdd", "size": "1T", "type": "disk", "uuid": "" }, "/dev/sde": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sde", "size": "1T", "type": "disk", "uuid": "" }, "/dev/sdf": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdf", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdg": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdg", "size": "1T", "type": "disk", "uuid": "" }, "/dev/sdh": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdh", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdi": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdi", "size": "10G", "type": "disk", "uuid": "" }, "/dev/xvda": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/xvda", "size": "250G", "type": "disk", "uuid": "" }, "/dev/xvda1": { "fstype": "ext4", "label": "", "mountpoint": "/", "name": "/dev/xvda1", "size": "250G", "type": "partition", "uuid": "c7b7d6a5-fd01-4b9b-bcca-153eaff9d312" } } } TASK [Read the /etc/fstab file for volume existence] *************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:20 Friday 17 January 2025 04:14:48 -0500 (0:00:00.337) 0:04:41.039 ******** ok: [managed-node1] => { "changed": false, "cmd": [ "cat", "/etc/fstab" ], "delta": "0:00:00.002991", "end": "2025-01-17 04:14:48.598634", "rc": 0, "start": "2025-01-17 04:14:48.595643" } STDOUT: # system_role:storage # # /etc/fstab # Created by anaconda on Thu Jun 20 10:23:46 2024 # # Accessible filesystems, by reference, are maintained under '/dev/disk' # See man pages fstab(5), findfs(8), mount(8) and/or blkid(8) for more info # UUID=c7b7d6a5-fd01-4b9b-bcca-153eaff9d312 / ext4 defaults 1 1 ntap-bos-c01-eng01-nfs01b.storage.bos.redhat.com:/devops_engineering_nfs/devarchive/redhat /mnt/redhat nfs ro,rsize=32768,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0 nest.test.redhat.com:/mnt/qa /mnt/qa nfs defaults,rsize=8192,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0 vtap-eng01.storage.rdu2.redhat.com:/vol/engarchive /mnt/engarchive nfs ro,rsize=32768,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0 nest.test.redhat.com:/mnt/tpsdist /mnt/tpsdist nfs defaults,rsize=8192,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0 ntap-bos-c01-eng01-nfs01b.storage.bos.redhat.com:/devops_engineering_nfs/devarchive/redhat/brewroot /mnt/brew nfs ro,rsize=32768,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0 ntap-bos-c01-eng01-nfs01b.storage.bos.redhat.com:/devops_brew_scratch_nfs/scratch /mnt/brew_scratch nfs ro,rsize=32768,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0
UUID=173e2a19-a65c-4bfd-a311-0de3d72abddf /opt/test1 xfs defaults 0 0 TASK [Read the /etc/crypttab file] ********************************************* task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:25 Friday 17 January 2025 04:14:48 -0500 (0:00:00.318) 0:04:41.357 ******** ok: [managed-node1] => { "changed": false, "cmd": [ "cat", "/etc/crypttab" ], "delta": "0:00:00.002762", "end": "2025-01-17 04:14:48.917400", "failed_when_result": false, "rc": 0, "start": "2025-01-17 04:14:48.914638" } TASK [Verify the volumes listed in storage_pools were correctly managed] ******* task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:34 Friday 17 January 2025 04:14:48 -0500 (0:00:00.320) 0:04:41.678 ******** included: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool.yml for managed-node1 TASK [Set _storage_pool_tests] ************************************************* task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool.yml:5 Friday 17 January 2025 04:14:49 -0500 (0:00:00.081) 0:04:41.759 ******** ok: [managed-node1] => { "ansible_facts": { "_storage_pool_tests": [ "members", "volumes" ] }, "changed": false } TASK [Get VG shared value status] ********************************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool.yml:18 Friday 17 January 2025 04:14:49 -0500 (0:00:00.039) 0:04:41.798 ******** skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify that VG shared value checks out] ********************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool.yml:24 Friday 17 January 2025 04:14:49 -0500 (0:00:00.038) 0:04:41.837 ******** skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify pool subset] ****************************************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool.yml:34 Friday 17 January 2025 04:14:49 -0500 (0:00:00.047) 0:04:41.885 ******** included: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml for managed-node1 included: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-volumes.yml for managed-node1 TASK [Set test variables] ****************************************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:2 Friday 17 January 2025 04:14:49 -0500 (0:00:00.082) 0:04:41.968 ******** skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Get the canonical device path for each member device] ******************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:8 Friday 17 January 2025 04:14:49 -0500 (0:00:00.040) 0:04:42.008 ******** TASK [Set pvs lvm length] ****************************************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:17 Friday 17 January 2025 04:14:49 -0500 (0:00:00.035) 0:04:42.043 
******** skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set pool pvs] ************************************************************ task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:22 Friday 17 January 2025 04:14:49 -0500 (0:00:00.037) 0:04:42.080 ******** skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify PV count] ********************************************************* task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:27 Friday 17 January 2025 04:14:49 -0500 (0:00:00.037) 0:04:42.118 ******** skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set expected pv type] **************************************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:36 Friday 17 January 2025 04:14:49 -0500 (0:00:00.037) 0:04:42.156 ******** skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set expected pv type] **************************************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:41 Friday 17 January 2025 04:14:49 -0500 (0:00:00.037) 0:04:42.194 ******** skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set expected pv type] **************************************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:46 Friday 17 January 2025 04:14:49 -0500 (0:00:00.041) 0:04:42.235 ******** skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check the type of each PV] *********************************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:51 Friday 17 January 2025 04:14:49 -0500 (0:00:00.093) 0:04:42.329 ******** TASK [Check that blivet supports PV grow to fill] ****************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:64 Friday 17 January 2025 04:14:49 -0500 (0:00:00.035) 0:04:42.365 ******** ok: [managed-node1] => { "changed": false, "rc": 0 } STDOUT: False STDERR: Shared connection to 10.31.46.65 closed. 
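The "False" printed above is the answer from a remote probe of whether the installed blivet library supports growing a PV to fill its device. A capability probe of this sort can be sketched with the command module; the blivet module path and attribute below are assumptions for illustration, not taken from the test's actual script:

    - name: Probe blivet for grow-to-fill support (hypothetical sketch)
      # Prints True/False depending on whether the installed blivet exposes
      # a grow_to_fill attribute; the attribute's location is an assumption.
      command: python -c "import blivet.formats.lvmpv as lvmpv; print(hasattr(lvmpv.LVMPhysicalVolume, 'grow_to_fill'))"
      register: storage_test_grow_supported
      changed_when: false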
TASK [Verify that PVs fill the whole devices when they should] ***************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:73 Friday 17 January 2025 04:14:49 -0500 (0:00:00.240) 0:04:42.605 ******** TASK [Check MD RAID] *********************************************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:83 Friday 17 January 2025 04:14:49 -0500 (0:00:00.036) 0:04:42.642 ******** included: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml for managed-node1 TASK [Get information about RAID] ********************************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:8 Friday 17 January 2025 04:14:50 -0500 (0:00:00.076) 0:04:42.719 ******** skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set active devices regex] ************************************************ task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:14 Friday 17 January 2025 04:14:50 -0500 (0:00:00.038) 0:04:42.757 ******** skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set spare devices regex] ************************************************* task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:19 Friday 17 January 2025 04:14:50 -0500 (0:00:00.038) 0:04:42.795 ******** skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set md version regex] **************************************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:24 Friday 17 January 2025 04:14:50 -0500 (0:00:00.041) 0:04:42.837 ******** skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set md chunk size regex] ************************************************* task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:29 Friday 17 January 2025 04:14:50 -0500 (0:00:00.038) 0:04:42.876 ******** skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Parse the chunk size] **************************************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:37 Friday 17 January 2025 04:14:50 -0500 (0:00:00.055) 0:04:42.931 ******** skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check RAID active devices count] ***************************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:46 Friday 17 January 2025 04:14:50 -0500 (0:00:00.037) 0:04:42.968 ******** skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check RAID spare devices count] ****************************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:55 Friday 17 January 2025 04:14:50 -0500 (0:00:00.041) 0:04:43.010 ******** skipping: [managed-node1] => { "changed": false, 
"skip_reason": "Conditional result was False" } TASK [Check RAID metadata version] ********************************************* task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:64 Friday 17 January 2025 04:14:50 -0500 (0:00:00.038) 0:04:43.049 ******** skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check RAID chunk size] *************************************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:74 Friday 17 January 2025 04:14:50 -0500 (0:00:00.062) 0:04:43.112 ******** skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Reset variables used by tests] ******************************************* task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:83 Friday 17 January 2025 04:14:50 -0500 (0:00:00.050) 0:04:43.162 ******** ok: [managed-node1] => { "ansible_facts": { "storage_test_md_active_devices_re": null, "storage_test_md_chunk_size_re": null, "storage_test_md_metadata_version_re": null, "storage_test_md_spare_devices_re": null }, "changed": false } TASK [Check LVM RAID] ********************************************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:86 Friday 17 January 2025 04:14:50 -0500 (0:00:00.047) 0:04:43.210 ******** included: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-lvmraid.yml for managed-node1 TASK [Validate pool member LVM RAID settings] ********************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-lvmraid.yml:2 Friday 17 January 2025 04:14:50 -0500 (0:00:00.090) 0:04:43.300 ******** skipping: [managed-node1] => (item={u'_raw_device': u'/dev/sda1', u'raid_metadata_version': None, u'mount_device_identifier': u'uuid', u'fs_type': u'xfs', u'mount_options': u'defaults', u'_device': u'/dev/sda1', u'size': u'4g', u'mount_point': u'/opt/test1', u'compression': None, u'encryption_password': u'VALUE_SPECIFIED_IN_NO_LOG_PARAMETER', u'_kernel_device': u'/dev/sda1', u'encryption': False, u'raid_level': None, u'name': u'test1', u'raid_device_count': None, u'state': u'present', u'vdo_pool_size': None, u'mount_mode': None, u'thin_pool_name': None, u'type': u'partition', u'encryption_key_size': 0, u'encryption_cipher': None, u'encryption_key': None, u'fs_label': u'', u'encryption_luks_version': u'luks2', u'raid_stripe_size': None, u'mount_passno': 0, u'_mount_id': u'UUID=173e2a19-a65c-4bfd-a311-0de3d72abddf', u'mount_user': None, u'raid_spare_count': None, u'raid_disks': [], u'_raw_kernel_device': u'/dev/sda1', u'cache_mode': None, u'cache_devices': [], u'deduplication': None, u'mount_group': None, u'thin_pool_size': None, u'disks': [u'sda'], u'cached': False, u'thin': False, u'mount_check': 0, u'cache_size': 0, u'raid_chunk_size': None, u'fs_overwrite_existing': True, u'fs_create_options': u''}) => { "ansible_loop_var": "storage_test_lvmraid_volume", "changed": false, "skip_reason": "Conditional result was False", "storage_test_lvmraid_volume": { "_device": "/dev/sda1", "_kernel_device": "/dev/sda1", "_mount_id": "UUID=173e2a19-a65c-4bfd-a311-0de3d72abddf", "_raw_device": "/dev/sda1", "_raw_kernel_device": "/dev/sda1", "cache_devices": [], 
"cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": 0, "encryption_luks_version": "luks2", "encryption_password": "VALUE_SPECIFIED_IN_NO_LOG_PARAMETER", "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "mount_user": null, "name": "test1", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": "4g", "state": "present", "thin": false, "thin_pool_name": null, "thin_pool_size": null, "type": "partition", "vdo_pool_size": null } } TASK [Check Thin Pools] ******************************************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:89 Friday 17 January 2025 04:14:50 -0500 (0:00:00.057) 0:04:43.357 ******** included: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-thin.yml for managed-node1 TASK [Validate pool member thinpool settings] ********************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-thin.yml:2 Friday 17 January 2025 04:14:50 -0500 (0:00:00.101) 0:04:43.459 ******** skipping: [managed-node1] => (item={u'_raw_device': u'/dev/sda1', u'raid_metadata_version': None, u'mount_device_identifier': u'uuid', u'fs_type': u'xfs', u'mount_options': u'defaults', u'_device': u'/dev/sda1', u'size': u'4g', u'mount_point': u'/opt/test1', u'compression': None, u'encryption_password': u'VALUE_SPECIFIED_IN_NO_LOG_PARAMETER', u'_kernel_device': u'/dev/sda1', u'encryption': False, u'raid_level': None, u'name': u'test1', u'raid_device_count': None, u'state': u'present', u'vdo_pool_size': None, u'mount_mode': None, u'thin_pool_name': None, u'type': u'partition', u'encryption_key_size': 0, u'encryption_cipher': None, u'encryption_key': None, u'fs_label': u'', u'encryption_luks_version': u'luks2', u'raid_stripe_size': None, u'mount_passno': 0, u'_mount_id': u'UUID=173e2a19-a65c-4bfd-a311-0de3d72abddf', u'mount_user': None, u'raid_spare_count': None, u'raid_disks': [], u'_raw_kernel_device': u'/dev/sda1', u'cache_mode': None, u'cache_devices': [], u'deduplication': None, u'mount_group': None, u'thin_pool_size': None, u'disks': [u'sda'], u'cached': False, u'thin': False, u'mount_check': 0, u'cache_size': 0, u'raid_chunk_size': None, u'fs_overwrite_existing': True, u'fs_create_options': u''}) => { "ansible_loop_var": "storage_test_thin_volume", "changed": false, "skip_reason": "Conditional result was False", "storage_test_thin_volume": { "_device": "/dev/sda1", "_kernel_device": "/dev/sda1", "_mount_id": "UUID=173e2a19-a65c-4bfd-a311-0de3d72abddf", "_raw_device": "/dev/sda1", "_raw_kernel_device": "/dev/sda1", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": 0, "encryption_luks_version": "luks2", "encryption_password": "VALUE_SPECIFIED_IN_NO_LOG_PARAMETER", "fs_create_options": "", "fs_label": "", 
"fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "mount_user": null, "name": "test1", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": "4g", "state": "present", "thin": false, "thin_pool_name": null, "thin_pool_size": null, "type": "partition", "vdo_pool_size": null } } TASK [Check member encryption] ************************************************* task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:92 Friday 17 January 2025 04:14:50 -0500 (0:00:00.063) 0:04:43.522 ******** included: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-encryption.yml for managed-node1 TASK [Set test variables] ****************************************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-encryption.yml:5 Friday 17 January 2025 04:14:50 -0500 (0:00:00.113) 0:04:43.636 ******** ok: [managed-node1] => { "ansible_facts": { "_storage_test_expected_crypttab_entries": "0", "_storage_test_expected_crypttab_key_file": "-" }, "changed": false } TASK [Validate pool member LUKS settings] ************************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-encryption.yml:10 Friday 17 January 2025 04:14:50 -0500 (0:00:00.043) 0:04:43.680 ******** TASK [Validate pool member crypttab entries] *********************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-encryption.yml:17 Friday 17 January 2025 04:14:51 -0500 (0:00:00.034) 0:04:43.714 ******** TASK [Clear test variables] **************************************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-encryption.yml:24 Friday 17 January 2025 04:14:51 -0500 (0:00:00.034) 0:04:43.749 ******** ok: [managed-node1] => { "ansible_facts": { "_storage_test_crypttab_entries": null, "_storage_test_crypttab_key_file": null }, "changed": false } TASK [Check VDO] *************************************************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:95 Friday 17 January 2025 04:14:51 -0500 (0:00:00.043) 0:04:43.792 ******** included: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-vdo.yml for managed-node1 TASK [Validate pool member VDO settings] *************************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-vdo.yml:2 Friday 17 January 2025 04:14:51 -0500 (0:00:00.130) 0:04:43.923 ******** skipping: [managed-node1] => (item={u'_raw_device': u'/dev/sda1', u'raid_metadata_version': None, u'mount_device_identifier': u'uuid', u'fs_type': u'xfs', u'mount_options': u'defaults', u'_device': u'/dev/sda1', u'size': u'4g', u'mount_point': u'/opt/test1', u'compression': None, u'encryption_password': u'VALUE_SPECIFIED_IN_NO_LOG_PARAMETER', u'_kernel_device': u'/dev/sda1', u'encryption': False, u'raid_level': None, 
u'name': u'test1', u'raid_device_count': None, u'state': u'present', u'vdo_pool_size': None, u'mount_mode': None, u'thin_pool_name': None, u'type': u'partition', u'encryption_key_size': 0, u'encryption_cipher': None, u'encryption_key': None, u'fs_label': u'', u'encryption_luks_version': u'luks2', u'raid_stripe_size': None, u'mount_passno': 0, u'_mount_id': u'UUID=173e2a19-a65c-4bfd-a311-0de3d72abddf', u'mount_user': None, u'raid_spare_count': None, u'raid_disks': [], u'_raw_kernel_device': u'/dev/sda1', u'cache_mode': None, u'cache_devices': [], u'deduplication': None, u'mount_group': None, u'thin_pool_size': None, u'disks': [u'sda'], u'cached': False, u'thin': False, u'mount_check': 0, u'cache_size': 0, u'raid_chunk_size': None, u'fs_overwrite_existing': True, u'fs_create_options': u''}) => { "ansible_loop_var": "storage_test_vdo_volume", "changed": false, "skip_reason": "Conditional result was False", "storage_test_vdo_volume": { "_device": "/dev/sda1", "_kernel_device": "/dev/sda1", "_mount_id": "UUID=173e2a19-a65c-4bfd-a311-0de3d72abddf", "_raw_device": "/dev/sda1", "_raw_kernel_device": "/dev/sda1", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": 0, "encryption_luks_version": "luks2", "encryption_password": "VALUE_SPECIFIED_IN_NO_LOG_PARAMETER", "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "mount_user": null, "name": "test1", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": "4g", "state": "present", "thin": false, "thin_pool_name": null, "thin_pool_size": null, "type": "partition", "vdo_pool_size": null } } TASK [Check Stratis] *********************************************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:98 Friday 17 January 2025 04:14:51 -0500 (0:00:00.085) 0:04:44.008 ******** included: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-stratis.yml for managed-node1 TASK [Run 'stratis report'] **************************************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-stratis.yml:6 Friday 17 January 2025 04:14:51 -0500 (0:00:00.121) 0:04:44.130 ******** skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Get information about Stratis] ******************************************* task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-stratis.yml:11 Friday 17 January 2025 04:14:51 -0500 (0:00:00.050) 0:04:44.181 ******** skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify that the pools were created] ************************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-stratis.yml:15 Friday 17 January 2025 04:14:51 -0500 (0:00:00.037) 0:04:44.218 ******** skipping: [managed-node1] => { "changed": false,
"skip_reason": "Conditional result was False" } TASK [Verify that encryption is correctly set] ********************************* task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-stratis.yml:25 Friday 17 January 2025 04:14:51 -0500 (0:00:00.037) 0:04:44.256 ******** skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify that Clevis/Tang encryption is correctly set] ********************* task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-stratis.yml:34 Friday 17 January 2025 04:14:51 -0500 (0:00:00.038) 0:04:44.294 ******** skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Reset variable used by test] ********************************************* task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-stratis.yml:44 Friday 17 January 2025 04:14:51 -0500 (0:00:00.037) 0:04:44.332 ******** ok: [managed-node1] => { "ansible_facts": { "storage_test_stratis_report": null }, "changed": false } TASK [Clean up test variables] ************************************************* task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:101 Friday 17 January 2025 04:14:51 -0500 (0:00:00.036) 0:04:44.369 ******** ok: [managed-node1] => { "ansible_facts": { "__pvs_lvm_len": null, "_storage_test_expected_pv_count": null, "_storage_test_expected_pv_type": null, "_storage_test_pool_pvs": [], "_storage_test_pool_pvs_lvm": [] }, "changed": false } TASK [Verify the volumes] ****************************************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-volumes.yml:3 Friday 17 January 2025 04:14:51 -0500 (0:00:00.039) 0:04:44.409 ******** included: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume.yml for managed-node1 TASK [Set storage volume test variables] *************************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume.yml:2 Friday 17 January 2025 04:14:51 -0500 (0:00:00.110) 0:04:44.519 ******** ok: [managed-node1] => { "ansible_facts": { "_storage_test_volume_present": true, "_storage_volume_tests": [ "mount", "fstab", "fs", "device", "encryption", "md", "size", "cache" ] }, "changed": false } TASK [Run test verify for {{ storage_test_volume_subset }}] ******************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume.yml:19 Friday 17 January 2025 04:14:51 -0500 (0:00:00.082) 0:04:44.602 ******** included: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml for managed-node1 included: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml for managed-node1 included: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fs.yml for managed-node1 included: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml for managed-node1 included: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml for managed-node1 included: 
/tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml for managed-node1 included: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml for managed-node1 included: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml for managed-node1 TASK [Get expected mount device based on device type] ************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:7 Friday 17 January 2025 04:14:52 -0500 (0:00:00.276) 0:04:44.879 ******** ok: [managed-node1] => { "ansible_facts": { "storage_test_device_path": "/dev/sda1" }, "changed": false } TASK [Set some facts] ********************************************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:11 Friday 17 January 2025 04:14:52 -0500 (0:00:00.079) 0:04:44.958 ******** ok: [managed-node1] => { "ansible_facts": { "storage_test_mount_expected_mount_point": "/opt/test1", "storage_test_swap_expected_matches": "0" }, "changed": false } TASK [Get information about the mountpoint directory] ************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:19 Friday 17 January 2025 04:14:52 -0500 (0:00:00.059) 0:04:45.017 ******** skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify the current mount state by device] ******************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:28 Friday 17 January 2025 04:14:52 -0500 (0:00:00.045) 0:04:45.063 ******** ok: [managed-node1] => { "changed": false } MSG: All assertions passed TASK [Verify mount directory user] ********************************************* task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:36 Friday 17 January 2025 04:14:52 -0500 (0:00:00.049) 0:04:45.113 ******** skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify mount directory group] ******************************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:42 Friday 17 January 2025 04:14:52 -0500 (0:00:00.036) 0:04:45.149 ******** skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify mount directory permissions] ************************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:48 Friday 17 January 2025 04:14:52 -0500 (0:00:00.038) 0:04:45.188 ******** skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Get path of test volume device] ****************************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:57 Friday 17 January 2025 04:14:52 -0500 (0:00:00.039) 0:04:45.228 ******** skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Gather swap info] ******************************************************** task path: 
/tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:63 Friday 17 January 2025 04:14:52 -0500 (0:00:00.101) 0:04:45.330 ******** skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify swap status] ****************************************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:69 Friday 17 January 2025 04:14:52 -0500 (0:00:00.037) 0:04:45.368 ******** skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Unset facts] ************************************************************* task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:79 Friday 17 January 2025 04:14:52 -0500 (0:00:00.046) 0:04:45.414 ******** ok: [managed-node1] => { "ansible_facts": { "storage_test_found_mount_stat": null, "storage_test_mount_expected_mount_point": null, "storage_test_swap_expected_matches": null, "storage_test_swaps": null, "storage_test_sys_node": null }, "changed": false } TASK [Set some variables for fstab checking] *********************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:2 Friday 17 January 2025 04:14:52 -0500 (0:00:00.057) 0:04:45.472 ******** ok: [managed-node1] => { "ansible_facts": { "storage_test_fstab_expected_id_matches": "1", "storage_test_fstab_expected_mount_options_matches": "1", "storage_test_fstab_expected_mount_point_matches": "1", "storage_test_fstab_id_matches": [ "UUID=173e2a19-a65c-4bfd-a311-0de3d72abddf " ], "storage_test_fstab_mount_options_matches": [ " /opt/test1 xfs defaults " ], "storage_test_fstab_mount_point_matches": [ " /opt/test1 " ] }, "changed": false } TASK [Verify that the device identifier appears in /etc/fstab] ***************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:17 Friday 17 January 2025 04:14:52 -0500 (0:00:00.090) 0:04:45.562 ******** ok: [managed-node1] => { "changed": false } MSG: All assertions passed TASK [Verify the fstab mount point] ******************************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:24 Friday 17 January 2025 04:14:52 -0500 (0:00:00.051) 0:04:45.613 ******** ok: [managed-node1] => { "changed": false } MSG: All assertions passed TASK [Verify mount_options] **************************************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:33 Friday 17 January 2025 04:14:52 -0500 (0:00:00.057) 0:04:45.670 ******** skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify fingerprint] ****************************************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:45 Friday 17 January 2025 04:14:53 -0500 (0:00:00.046) 0:04:45.717 ******** ok: [managed-node1] => { "changed": false } MSG: All assertions passed TASK [Clean up variables] ****************************************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:52 Friday 17 
January 2025 04:14:53 -0500 (0:00:00.056) 0:04:45.774 ******** ok: [managed-node1] => { "ansible_facts": { "storage_test_fstab_expected_id_matches": null, "storage_test_fstab_expected_mount_options_matches": null, "storage_test_fstab_expected_mount_point_matches": null, "storage_test_fstab_id_matches": null, "storage_test_fstab_mount_options_matches": null, "storage_test_fstab_mount_point_matches": null }, "changed": false } TASK [Verify fs type] ********************************************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fs.yml:6 Friday 17 January 2025 04:14:53 -0500 (0:00:00.040) 0:04:45.814 ******** ok: [managed-node1] => { "changed": false } MSG: All assertions passed TASK [Verify fs label] ********************************************************* task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fs.yml:14 Friday 17 January 2025 04:14:53 -0500 (0:00:00.051) 0:04:45.866 ******** ok: [managed-node1] => { "changed": false } MSG: All assertions passed TASK [See whether the device node is present] ********************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:3 Friday 17 January 2025 04:14:53 -0500 (0:00:00.056) 0:04:45.923 ******** ok: [managed-node1] => { "changed": false, "stat": { "atime": 1737105283.344978, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 0, "charset": "binary", "ctime": 1737105283.344978, "dev": 5, "device_type": 2049, "executable": false, "exists": true, "gid": 6, "gr_name": "disk", "inode": 80701, "isblk": true, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": false, "issock": false, "isuid": false, "mimetype": "inode/blockdevice", "mode": "0660", "mtime": 1737105283.344978, "nlink": 1, "path": "/dev/sda1", "pw_name": "root", "readable": true, "rgrp": true, "roth": false, "rusr": true, "size": 0, "uid": 0, "version": null, "wgrp": true, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [Verify the presence/absence of the device node] ************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:9 Friday 17 January 2025 04:14:53 -0500 (0:00:00.332) 0:04:46.255 ******** ok: [managed-node1] => { "changed": false } MSG: All assertions passed TASK [Verify the presence/absence of the device node] ************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:16 Friday 17 January 2025 04:14:53 -0500 (0:00:00.049) 0:04:46.304 ******** skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Make sure we got info about this volume] ********************************* task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:23 Friday 17 January 2025 04:14:53 -0500 (0:00:00.040) 0:04:46.344 ******** ok: [managed-node1] => { "changed": false } MSG: All assertions passed TASK [Process volume type (set initial value) (1/2)] *************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:29 Friday 17 January 2025 04:14:53 -0500 (0:00:00.046) 0:04:46.391 ******** ok: 
[managed-node1] => { "ansible_facts": { "st_volume_type": "partition" }, "changed": false } TASK [Process volume type (get RAID value) (2/2)] ****************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:33 Friday 17 January 2025 04:14:53 -0500 (0:00:00.042) 0:04:46.433 ******** skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify the volume's device type] ***************************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:38 Friday 17 January 2025 04:14:53 -0500 (0:00:00.039) 0:04:46.472 ******** ok: [managed-node1] => { "changed": false } MSG: All assertions passed TASK [Stat the LUKS device, if encrypted] ************************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:3 Friday 17 January 2025 04:14:53 -0500 (0:00:00.045) 0:04:46.518 ******** skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Ensure cryptsetup is present] ******************************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:10 Friday 17 January 2025 04:14:53 -0500 (0:00:00.037) 0:04:46.556 ******** ok: [managed-node1] => { "changed": false, "rc": 0, "results": [ "cryptsetup-2.0.3-6.el7.x86_64 providing cryptsetup is already installed" ] } TASK [Collect LUKS info for this volume] *************************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:16 Friday 17 January 2025 04:14:54 -0500 (0:00:00.622) 0:04:47.178 ******** skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify the presence/absence of the LUKS device node] ********************* task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:22 Friday 17 January 2025 04:14:54 -0500 (0:00:00.037) 0:04:47.216 ******** skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify that the raw device is the same as the device if not encrypted] *** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:29 Friday 17 January 2025 04:14:54 -0500 (0:00:00.037) 0:04:47.254 ******** ok: [managed-node1] => { "changed": false } MSG: All assertions passed TASK [Make sure we got info about the LUKS volume if encrypted] **************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:40 Friday 17 January 2025 04:14:54 -0500 (0:00:00.050) 0:04:47.304 ******** skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify the LUKS volume's device type if encrypted] *********************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:46 Friday 17 January 2025 04:14:54 -0500 (0:00:00.038) 0:04:47.343 ******** skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check LUKS version] 
****************************************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:51 Friday 17 January 2025 04:14:54 -0500 (0:00:00.041) 0:04:47.384 ******** skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check LUKS key size] ***************************************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:63 Friday 17 January 2025 04:14:54 -0500 (0:00:00.038) 0:04:47.422 ******** skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check LUKS cipher] ******************************************************* task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:75 Friday 17 January 2025 04:14:54 -0500 (0:00:00.038) 0:04:47.461 ******** skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set test variables] ****************************************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:87 Friday 17 January 2025 04:14:54 -0500 (0:00:00.039) 0:04:47.500 ******** ok: [managed-node1] => { "ansible_facts": { "_storage_test_crypttab_entries": [], "_storage_test_expected_crypttab_entries": "0", "_storage_test_expected_crypttab_key_file": "-" }, "changed": false } TASK [Check for /etc/crypttab entry] ******************************************* task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:93 Friday 17 January 2025 04:14:54 -0500 (0:00:00.047) 0:04:47.547 ******** ok: [managed-node1] => { "changed": false } MSG: All assertions passed TASK [Validate the format of the crypttab entry] ******************************* task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:100 Friday 17 January 2025 04:14:54 -0500 (0:00:00.043) 0:04:47.591 ******** skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check backing device of crypttab entry] ********************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:108 Friday 17 January 2025 04:14:54 -0500 (0:00:00.040) 0:04:47.632 ******** skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check key file of crypttab entry] **************************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:116 Friday 17 January 2025 04:14:54 -0500 (0:00:00.037) 0:04:47.670 ******** skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Clear test variables] **************************************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:124 Friday 17 January 2025 04:14:55 -0500 (0:00:00.037) 0:04:47.708 ******** ok: [managed-node1] => { "ansible_facts": { "_storage_test_crypttab_entries": null, "_storage_test_expected_crypttab_entries": null, "_storage_test_expected_crypttab_key_file": 
null }, "changed": false } TASK [Get information about RAID] ********************************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:8 Friday 17 January 2025 04:14:55 -0500 (0:00:00.037) 0:04:47.745 ******** skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set active devices regex] ************************************************ task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:14 Friday 17 January 2025 04:14:55 -0500 (0:00:00.039) 0:04:47.784 ******** skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set spare devices regex] ************************************************* task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:19 Friday 17 January 2025 04:14:55 -0500 (0:00:00.038) 0:04:47.823 ******** skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set md version regex] **************************************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:24 Friday 17 January 2025 04:14:55 -0500 (0:00:00.041) 0:04:47.864 ******** skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set chunk size regex] **************************************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:29 Friday 17 January 2025 04:14:55 -0500 (0:00:00.039) 0:04:47.904 ******** skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Parse the chunk size] **************************************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:37 Friday 17 January 2025 04:14:55 -0500 (0:00:00.038) 0:04:47.943 ******** skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check RAID active devices count] ***************************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:46 Friday 17 January 2025 04:14:55 -0500 (0:00:00.039) 0:04:47.982 ******** skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check RAID spare devices count] ****************************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:54 Friday 17 January 2025 04:14:55 -0500 (0:00:00.037) 0:04:48.020 ******** skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check RAID metadata version] ********************************************* task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:62 Friday 17 January 2025 04:14:55 -0500 (0:00:00.038) 0:04:48.058 ******** skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check RAID chunk size] *************************************************** task path: 
/tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:70 Friday 17 January 2025 04:14:55 -0500 (0:00:00.044) 0:04:48.103 ******** skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Parse the actual size of the volume] ************************************* task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:3 Friday 17 January 2025 04:14:55 -0500 (0:00:00.044) 0:04:48.148 ******** skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Parse the requested size of the volume] ********************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:11 Friday 17 January 2025 04:14:55 -0500 (0:00:00.041) 0:04:48.190 ******** skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Establish base value for expected size] ********************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:20 Friday 17 January 2025 04:14:55 -0500 (0:00:00.040) 0:04:48.231 ******** skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Show expected size] ****************************************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:28 Friday 17 January 2025 04:14:55 -0500 (0:00:00.041) 0:04:48.272 ******** ok: [managed-node1] => { "storage_test_expected_size": "VARIABLE IS NOT DEFINED!: 'storage_test_expected_size' is undefined" } TASK [Get the size of parent/pool device] ************************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:32 Friday 17 January 2025 04:14:55 -0500 (0:00:00.040) 0:04:48.313 ******** skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Show test pool] ********************************************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:46 Friday 17 January 2025 04:14:55 -0500 (0:00:00.043) 0:04:48.356 ******** skipping: [managed-node1] => {} TASK [Show test blockinfo] ***************************************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:50 Friday 17 January 2025 04:14:55 -0500 (0:00:00.040) 0:04:48.397 ******** skipping: [managed-node1] => {} TASK [Show test pool size] ***************************************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:54 Friday 17 January 2025 04:14:55 -0500 (0:00:00.041) 0:04:48.438 ******** skipping: [managed-node1] => {} TASK [Calculate the expected size based on pool size and percentage value] ***** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:58 Friday 17 January 2025 04:14:55 -0500 (0:00:00.040) 0:04:48.479 ******** skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Default thin pool reserved space values] 
********************************* task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:67 Friday 17 January 2025 04:14:55 -0500 (0:00:00.040) 0:04:48.520 ******** skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Default minimal thin pool reserved space size] *************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:71 Friday 17 January 2025 04:14:55 -0500 (0:00:00.037) 0:04:48.557 ******** skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Default maximal thin pool reserved space size] *************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:76 Friday 17 January 2025 04:14:55 -0500 (0:00:00.042) 0:04:48.600 ******** skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Calculate maximum usable space in thin pool] ***************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:82 Friday 17 January 2025 04:14:55 -0500 (0:00:00.039) 0:04:48.639 ******** skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Apply upper size limit to max usable thin pool space] ******************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:86 Friday 17 January 2025 04:14:55 -0500 (0:00:00.048) 0:04:48.687 ******** skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Apply lower size limit to max usable thin pool space] ******************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:91 Friday 17 January 2025 04:14:56 -0500 (0:00:00.056) 0:04:48.744 ******** skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Convert maximum usable thin pool space from int to Size] ***************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:96 Friday 17 January 2025 04:14:56 -0500 (0:00:00.054) 0:04:48.799 ******** skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Show max thin pool size] ************************************************* task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:101 Friday 17 January 2025 04:14:56 -0500 (0:00:00.059) 0:04:48.859 ******** skipping: [managed-node1] => {} TASK [Show volume thin pool size] ********************************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:105 Friday 17 January 2025 04:14:56 -0500 (0:00:00.056) 0:04:48.915 ******** skipping: [managed-node1] => {} TASK [Show test volume size] *************************************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:109 Friday 17 January 2025 04:14:56 -0500 (0:00:00.056) 0:04:48.972 ******** skipping: [managed-node1] => {} TASK [Establish base value for expected thin pool 
size] ************************ task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:113 Friday 17 January 2025 04:14:56 -0500 (0:00:00.056) 0:04:49.028 ******** skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Calculate the expected size based on pool size and percentage value] ***** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:120 Friday 17 January 2025 04:14:56 -0500 (0:00:00.058) 0:04:49.087 ******** skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Establish base value for expected thin pool volume size] ***************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:127 Friday 17 January 2025 04:14:56 -0500 (0:00:00.081) 0:04:49.168 ******** skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Calculate the expected thin pool volume size based on percentage value] *** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:131 Friday 17 January 2025 04:14:56 -0500 (0:00:00.050) 0:04:49.218 ******** skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Replace expected volume size with calculated value] ********************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:137 Friday 17 January 2025 04:14:56 -0500 (0:00:00.047) 0:04:49.266 ******** skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Show actual size] ******************************************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:143 Friday 17 January 2025 04:14:56 -0500 (0:00:00.135) 0:04:49.402 ******** ok: [managed-node1] => { "storage_test_actual_size": { "changed": false, "skip_reason": "Conditional result was False", "skipped": true } } TASK [Show expected size] ****************************************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:147 Friday 17 January 2025 04:14:56 -0500 (0:00:00.053) 0:04:49.455 ******** ok: [managed-node1] => { "storage_test_expected_size": "VARIABLE IS NOT DEFINED!: 'storage_test_expected_size' is undefined" } TASK [Assert expected size is actual size] ************************************* task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:151 Friday 17 January 2025 04:14:56 -0500 (0:00:00.047) 0:04:49.502 ******** skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Get information about the LV] ******************************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:5 Friday 17 January 2025 04:14:56 -0500 (0:00:00.043) 0:04:49.546 ******** skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set LV segment type] ***************************************************** task path: 
/tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:13 Friday 17 January 2025 04:14:56 -0500 (0:00:00.037) 0:04:49.584 ******** skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check segment type] ****************************************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:17 Friday 17 January 2025 04:14:56 -0500 (0:00:00.038) 0:04:49.623 ******** skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set LV cache size] ******************************************************* task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:24 Friday 17 January 2025 04:14:56 -0500 (0:00:00.047) 0:04:49.671 ******** skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Parse the requested cache size] ****************************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:31 Friday 17 January 2025 04:14:57 -0500 (0:00:00.054) 0:04:49.725 ******** skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set expected cache size] ************************************************* task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:37 Friday 17 January 2025 04:14:57 -0500 (0:00:00.051) 0:04:49.776 ******** skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check cache size] ******************************************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:42 Friday 17 January 2025 04:14:57 -0500 (0:00:00.041) 0:04:49.818 ******** skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Clean up facts] ********************************************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume.yml:25 Friday 17 January 2025 04:14:57 -0500 (0:00:00.056) 0:04:49.875 ******** ok: [managed-node1] => { "ansible_facts": { "_storage_test_volume_present": null }, "changed": false } TASK [Verify the volumes with no pool were correctly managed] ****************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:44 Friday 17 January 2025 04:14:57 -0500 (0:00:00.057) 0:04:49.932 ******** TASK [Clean up variable namespace] ********************************************* task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:54 Friday 17 January 2025 04:14:57 -0500 (0:00:00.054) 0:04:49.987 ******** ok: [managed-node1] => { "ansible_facts": { "storage_test_blkinfo": null, "storage_test_crypttab": null, "storage_test_fstab": null }, "changed": false } TASK [Create a file] *********************************************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/create-test-file.yml:12 Friday 17 January 2025 04:14:57 -0500 (0:00:00.046) 0:04:50.033 ******** changed: [managed-node1] => { "changed": true, 
"dest": "/opt/test1/quux", "gid": 0, "group": "root", "mode": "0644", "owner": "root", "secontext": "unconfined_u:object_r:unlabeled_t:s0", "size": 0, "state": "file", "uid": 0 } TASK [Test for correct handling of safe_mode] ********************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/tests_luks2.yml:296 Friday 17 January 2025 04:14:57 -0500 (0:00:00.342) 0:04:50.376 ******** included: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-failed.yml for managed-node1 TASK [Store global variable value copy] **************************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-failed.yml:4 Friday 17 January 2025 04:14:57 -0500 (0:00:00.138) 0:04:50.514 ******** ok: [managed-node1] => { "ansible_facts": { "storage_pools_global": [], "storage_safe_mode_global": true, "storage_volumes_global": [] }, "changed": false } TASK [Verify role raises correct error] **************************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-failed.yml:10 Friday 17 January 2025 04:14:57 -0500 (0:00:00.060) 0:04:50.574 ******** TASK [fedora.linux_system_roles.storage : Set platform/version specific variables] *** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:2 Friday 17 January 2025 04:14:57 -0500 (0:00:00.071) 0:04:50.646 ******** included: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml for managed-node1 TASK [fedora.linux_system_roles.storage : Ensure ansible_facts used by role] *** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:2 Friday 17 January 2025 04:14:58 -0500 (0:00:00.060) 0:04:50.707 ******** skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Set platform/version specific variables] *** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:7 Friday 17 January 2025 04:14:58 -0500 (0:00:00.045) 0:04:50.752 ******** skipping: [managed-node1] => (item=RedHat.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat.yml", "skip_reason": "Conditional result was False" } skipping: [managed-node1] => (item=CentOS.yml) => { "ansible_loop_var": "item", "changed": false, "item": "CentOS.yml", "skip_reason": "Conditional result was False" } ok: [managed-node1] => (item=CentOS_7.yml) => { "ansible_facts": { "__storage_blivet_diskvolume_mkfs_option_map": { "ext2": "-F", "ext3": "-F", "ext4": "-F" }, "blivet_package_list": [ "python-enum34", "python-blivet3", "libblockdev-crypto", "libblockdev-dm", "libblockdev-lvm", "libblockdev-mdraid", "libblockdev-swap", "{{ 'libblockdev-s390' if ansible_architecture == 's390x' else 'libblockdev' }}" ] }, "ansible_included_var_files": [ "/tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/vars/CentOS_7.yml" ], "ansible_loop_var": "item", "changed": false, "item": "CentOS_7.yml" } skipping: [managed-node1] => (item=CentOS_7.9.yml) => { "ansible_loop_var": "item", "changed": false, "item": "CentOS_7.9.yml", "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Check if system is ostree] *********** task path: 
/tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:25 Friday 17 January 2025 04:14:58 -0500 (0:00:00.129) 0:04:50.882 ******** skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Set flag to indicate system is ostree] *** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:30 Friday 17 January 2025 04:14:58 -0500 (0:00:00.057) 0:04:50.940 ******** skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Define an empty list of pools to be used in testing] *** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:5 Friday 17 January 2025 04:14:58 -0500 (0:00:00.054) 0:04:50.994 ******** ok: [managed-node1] => { "ansible_facts": { "_storage_pools_list": [] }, "changed": false } TASK [fedora.linux_system_roles.storage : Define an empty list of volumes to be used in testing] *** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:9 Friday 17 January 2025 04:14:58 -0500 (0:00:00.058) 0:04:51.053 ******** ok: [managed-node1] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false } TASK [fedora.linux_system_roles.storage : Include the appropriate provider tasks] *** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:13 Friday 17 January 2025 04:14:58 -0500 (0:00:00.052) 0:04:51.106 ******** included: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml for managed-node1 TASK [fedora.linux_system_roles.storage : Make sure blivet is available] ******* task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:2 Friday 17 January 2025 04:14:58 -0500 (0:00:00.137) 0:04:51.244 ******** ok: [managed-node1] => { "changed": false, "rc": 0, "results": [ "python-enum34-1.0.4-1.el7.noarch providing python-enum34 is already installed", "1:python2-blivet3-3.1.3-3.el7.noarch providing python-blivet3 is already installed", "libblockdev-crypto-2.18-5.el7.x86_64 providing libblockdev-crypto is already installed", "libblockdev-dm-2.18-5.el7.x86_64 providing libblockdev-dm is already installed", "libblockdev-lvm-2.18-5.el7.x86_64 providing libblockdev-lvm is already installed", "libblockdev-mdraid-2.18-5.el7.x86_64 providing libblockdev-mdraid is already installed", "libblockdev-swap-2.18-5.el7.x86_64 providing libblockdev-swap is already installed", "libblockdev-2.18-5.el7.x86_64 providing libblockdev is already installed" ] } TASK [fedora.linux_system_roles.storage : Show storage_pools] ****************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:9 Friday 17 January 2025 04:15:02 -0500 (0:00:03.924) 0:04:55.168 ******** ok: [managed-node1] => { "storage_pools": [ { "disks": [ "sda" ], "name": "foo", "type": "partition", "volumes": [ { "encryption": true, "encryption_luks_version": "luks2", "encryption_password": "yabbadabbadoo", "mount_point": "/opt/test1", "name": "test1", "size": "4g", "type": "partition" } ] } ] } TASK [fedora.linux_system_roles.storage : Show storage_volumes] **************** task path: 
/tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:14 Friday 17 January 2025 04:15:02 -0500 (0:00:00.069) 0:04:55.238 ******** ok: [managed-node1] => { "storage_volumes": [] } TASK [fedora.linux_system_roles.storage : Get required packages] *************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:19 Friday 17 January 2025 04:15:02 -0500 (0:00:00.110) 0:04:55.348 ******** ok: [managed-node1] => { "actions": [], "changed": false, "crypts": [], "leaves": [], "mounts": [], "packages": [ "cryptsetup" ], "pools": [], "volumes": [] } TASK [fedora.linux_system_roles.storage : Enable copr repositories if needed] *** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:31 Friday 17 January 2025 04:15:06 -0500 (0:00:03.782) 0:04:59.131 ******** included: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml for managed-node1 TASK [fedora.linux_system_roles.storage : Check if the COPR support packages should be installed] *** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:2 Friday 17 January 2025 04:15:06 -0500 (0:00:00.105) 0:04:59.237 ******** TASK [fedora.linux_system_roles.storage : Make sure COPR support packages are present] *** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:13 Friday 17 January 2025 04:15:06 -0500 (0:00:00.051) 0:04:59.288 ******** skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Enable COPRs] ************************ task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:19 Friday 17 January 2025 04:15:06 -0500 (0:00:00.055) 0:04:59.344 ******** TASK [fedora.linux_system_roles.storage : Make sure required packages are installed] *** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:37 Friday 17 January 2025 04:15:06 -0500 (0:00:00.055) 0:04:59.399 ******** ok: [managed-node1] => { "changed": false, "rc": 0, "results": [ "cryptsetup-2.0.3-6.el7.x86_64 providing cryptsetup is already installed", "kpartx-0.4.9-136.el7_9.x86_64 providing kpartx is already installed" ] } TASK [fedora.linux_system_roles.storage : Get service facts] ******************* task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:51 Friday 17 January 2025 04:15:07 -0500 (0:00:00.726) 0:05:00.125 ******** ok: [managed-node1] => { "ansible_facts": { "services": { "NetworkManager-dispatcher.service": { "name": "NetworkManager-dispatcher.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "NetworkManager-wait-online.service": { "name": "NetworkManager-wait-online.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "NetworkManager.service": { "name": "NetworkManager.service", "source": "systemd", "state": "running", "status": "enabled" }, "arp-ethers.service": { "name": "arp-ethers.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "auditd.service": { "name": "auditd.service", "source": "systemd", "state": "running", "status": "enabled" }, "auth-rpcgss-module.service": { "name": "auth-rpcgss-module.service", 
"source": "systemd", "state": "stopped", "status": "static" }, "autovt@.service": { "name": "autovt@.service", "source": "systemd", "state": "unknown", "status": "enabled" }, "blivet.service": { "name": "blivet.service", "source": "systemd", "state": "inactive", "status": "static" }, "blk-availability.service": { "name": "blk-availability.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "brandbot.service": { "name": "brandbot.service", "source": "systemd", "state": "inactive", "status": "static" }, "chrony-dnssrv@.service": { "name": "chrony-dnssrv@.service", "source": "systemd", "state": "unknown", "status": "static" }, "chrony-wait.service": { "name": "chrony-wait.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "chronyd.service": { "name": "chronyd.service", "source": "systemd", "state": "running", "status": "enabled" }, "cloud-config.service": { "name": "cloud-config.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-final.service": { "name": "cloud-final.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-init-local.service": { "name": "cloud-init-local.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-init.service": { "name": "cloud-init.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "console-getty.service": { "name": "console-getty.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "console-shell.service": { "name": "console-shell.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "container-getty@.service": { "name": "container-getty@.service", "source": "systemd", "state": "unknown", "status": "static" }, "cpupower.service": { "name": "cpupower.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "crond.service": { "name": "crond.service", "source": "systemd", "state": "running", "status": "enabled" }, "dbus-org.freedesktop.hostname1.service": { "name": "dbus-org.freedesktop.hostname1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus-org.freedesktop.import1.service": { "name": "dbus-org.freedesktop.import1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus-org.freedesktop.locale1.service": { "name": "dbus-org.freedesktop.locale1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus-org.freedesktop.login1.service": { "name": "dbus-org.freedesktop.login1.service", "source": "systemd", "state": "active", "status": "static" }, "dbus-org.freedesktop.machine1.service": { "name": "dbus-org.freedesktop.machine1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus-org.freedesktop.nm-dispatcher.service": { "name": "dbus-org.freedesktop.nm-dispatcher.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "dbus-org.freedesktop.timedate1.service": { "name": "dbus-org.freedesktop.timedate1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus.service": { "name": "dbus.service", "source": "systemd", "state": "running", "status": "static" }, "debug-shell.service": { "name": "debug-shell.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "dm-event.service": { "name": "dm-event.service", "source": "systemd", "state": "stopped", "status": "static" }, "dmraid-activation.service": { "name": "dmraid-activation.service", "source": "systemd", "state": "stopped", "status": "enabled" }, 
"dracut-cmdline.service": { "name": "dracut-cmdline.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-initqueue.service": { "name": "dracut-initqueue.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-mount.service": { "name": "dracut-mount.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-mount.service": { "name": "dracut-pre-mount.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-pivot.service": { "name": "dracut-pre-pivot.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-trigger.service": { "name": "dracut-pre-trigger.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-udev.service": { "name": "dracut-pre-udev.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-shutdown.service": { "name": "dracut-shutdown.service", "source": "systemd", "state": "stopped", "status": "static" }, "ebtables.service": { "name": "ebtables.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "emergency.service": { "name": "emergency.service", "source": "systemd", "state": "stopped", "status": "static" }, "firewalld.service": { "name": "firewalld.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "fstrim.service": { "name": "fstrim.service", "source": "systemd", "state": "inactive", "status": "static" }, "getty@.service": { "name": "getty@.service", "source": "systemd", "state": "unknown", "status": "enabled" }, "getty@tty1.service": { "name": "getty@tty1.service", "source": "systemd", "state": "running", "status": "unknown" }, "gssproxy.service": { "name": "gssproxy.service", "source": "systemd", "state": "running", "status": "disabled" }, "halt-local.service": { "name": "halt-local.service", "source": "systemd", "state": "inactive", "status": "static" }, "initrd-cleanup.service": { "name": "initrd-cleanup.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-parse-etc.service": { "name": "initrd-parse-etc.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-switch-root.service": { "name": "initrd-switch-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-udevadm-cleanup-db.service": { "name": "initrd-udevadm-cleanup-db.service", "source": "systemd", "state": "stopped", "status": "static" }, "iprdump.service": { "name": "iprdump.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "iprinit.service": { "name": "iprinit.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "iprupdate.service": { "name": "iprupdate.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "irqbalance.service": { "name": "irqbalance.service", "source": "systemd", "state": "running", "status": "enabled" }, "kdump.service": { "name": "kdump.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "kmod-static-nodes.service": { "name": "kmod-static-nodes.service", "source": "systemd", "state": "stopped", "status": "static" }, "lvm2-lvmetad.service": { "name": "lvm2-lvmetad.service", "source": "systemd", "state": "running", "status": "static" }, "lvm2-lvmpolld.service": { "name": "lvm2-lvmpolld.service", "source": "systemd", "state": "stopped", "status": "static" }, "lvm2-monitor.service": { "name": "lvm2-monitor.service", "source": "systemd", "state": "stopped", "status": "enabled" }, 
"lvm2-pvscan@.service": { "name": "lvm2-pvscan@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdadm-grow-continue@.service": { "name": "mdadm-grow-continue@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdadm-last-resort@.service": { "name": "mdadm-last-resort@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdcheck_continue.service": { "name": "mdcheck_continue.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdcheck_start.service": { "name": "mdcheck_start.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdmon@.service": { "name": "mdmon@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdmonitor-oneshot.service": { "name": "mdmonitor-oneshot.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdmonitor.service": { "name": "mdmonitor.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "messagebus.service": { "name": "messagebus.service", "source": "systemd", "state": "active", "status": "static" }, "microcode.service": { "name": "microcode.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "netconsole": { "name": "netconsole", "source": "sysv", "state": "stopped", "status": "disabled" }, "network": { "name": "network", "source": "sysv", "state": "running", "status": "enabled" }, "network.service": { "name": "network.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "nfs-blkmap.service": { "name": "nfs-blkmap.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfs-config.service": { "name": "nfs-config.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-idmap.service": { "name": "nfs-idmap.service", "source": "systemd", "state": "inactive", "status": "static" }, "nfs-idmapd.service": { "name": "nfs-idmapd.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-lock.service": { "name": "nfs-lock.service", "source": "systemd", "state": "inactive", "status": "static" }, "nfs-mountd.service": { "name": "nfs-mountd.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-rquotad.service": { "name": "nfs-rquotad.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfs-secure.service": { "name": "nfs-secure.service", "source": "systemd", "state": "inactive", "status": "static" }, "nfs-server.service": { "name": "nfs-server.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "nfs-utils.service": { "name": "nfs-utils.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs.service": { "name": "nfs.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfslock.service": { "name": "nfslock.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-halt.service": { "name": "plymouth-halt.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "plymouth-kexec.service": { "name": "plymouth-kexec.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "plymouth-poweroff.service": { "name": "plymouth-poweroff.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "plymouth-quit-wait.service": { "name": "plymouth-quit-wait.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "plymouth-quit.service": { "name": "plymouth-quit.service", "source": "systemd", "state": 
"stopped", "status": "disabled" }, "plymouth-read-write.service": { "name": "plymouth-read-write.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "plymouth-reboot.service": { "name": "plymouth-reboot.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "plymouth-start.service": { "name": "plymouth-start.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "plymouth-switch-root.service": { "name": "plymouth-switch-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "polkit.service": { "name": "polkit.service", "source": "systemd", "state": "running", "status": "static" }, "postfix.service": { "name": "postfix.service", "source": "systemd", "state": "running", "status": "enabled" }, "qemu-guest-agent.service": { "name": "qemu-guest-agent.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "quotaon.service": { "name": "quotaon.service", "source": "systemd", "state": "inactive", "status": "static" }, "rc-local.service": { "name": "rc-local.service", "source": "systemd", "state": "stopped", "status": "static" }, "rdisc.service": { "name": "rdisc.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rescue.service": { "name": "rescue.service", "source": "systemd", "state": "stopped", "status": "static" }, "restraintd.service": { "name": "restraintd.service", "source": "systemd", "state": "running", "status": "enabled" }, "rhel-autorelabel-mark.service": { "name": "rhel-autorelabel-mark.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "rhel-autorelabel.service": { "name": "rhel-autorelabel.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "rhel-configure.service": { "name": "rhel-configure.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "rhel-dmesg.service": { "name": "rhel-dmesg.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "rhel-domainname.service": { "name": "rhel-domainname.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "rhel-import-state.service": { "name": "rhel-import-state.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "rhel-loadmodules.service": { "name": "rhel-loadmodules.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "rhel-readonly.service": { "name": "rhel-readonly.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "rngd.service": { "name": "rngd.service", "source": "systemd", "state": "running", "status": "enabled" }, "rpc-gssd.service": { "name": "rpc-gssd.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-rquotad.service": { "name": "rpc-rquotad.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rpc-statd-notify.service": { "name": "rpc-statd-notify.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-statd.service": { "name": "rpc-statd.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpcbind.service": { "name": "rpcbind.service", "source": "systemd", "state": "running", "status": "enabled" }, "rpcgssd.service": { "name": "rpcgssd.service", "source": "systemd", "state": "inactive", "status": "static" }, "rpcidmapd.service": { "name": "rpcidmapd.service", "source": "systemd", "state": "inactive", "status": "static" }, "rsyncd.service": { "name": "rsyncd.service", "source": "systemd", "state": "inactive", "status": "disabled" }, 
"rsyncd@.service": { "name": "rsyncd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "rsyslog.service": { "name": "rsyslog.service", "source": "systemd", "state": "running", "status": "enabled" }, "selinux-policy-migrate-local-changes@.service": { "name": "selinux-policy-migrate-local-changes@.service", "source": "systemd", "state": "unknown", "status": "static" }, "selinux-policy-migrate-local-changes@targeted.service": { "name": "selinux-policy-migrate-local-changes@targeted.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "serial-getty@.service": { "name": "serial-getty@.service", "source": "systemd", "state": "unknown", "status": "disabled" }, "serial-getty@ttyS0.service": { "name": "serial-getty@ttyS0.service", "source": "systemd", "state": "running", "status": "unknown" }, "sshd-keygen.service": { "name": "sshd-keygen.service", "source": "systemd", "state": "stopped", "status": "static" }, "sshd.service": { "name": "sshd.service", "source": "systemd", "state": "running", "status": "enabled" }, "sshd@.service": { "name": "sshd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-ask-password-console.service": { "name": "systemd-ask-password-console.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-ask-password-plymouth.service": { "name": "systemd-ask-password-plymouth.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-ask-password-wall.service": { "name": "systemd-ask-password-wall.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-backlight@.service": { "name": "systemd-backlight@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-binfmt.service": { "name": "systemd-binfmt.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-bootchart.service": { "name": "systemd-bootchart.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "systemd-cryptsetup@luks\\x2d35e15af6\\x2d60a8\\x2d41b2\\x2d94e6\\x2d05460ad82662.service": { "name": "systemd-cryptsetup@luks\\x2d35e15af6\\x2d60a8\\x2d41b2\\x2d94e6\\x2d05460ad82662.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "systemd-firstboot.service": { "name": "systemd-firstboot.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-fsck-root.service": { "name": "systemd-fsck-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-fsck@.service": { "name": "systemd-fsck@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-halt.service": { "name": "systemd-halt.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hibernate-resume@.service": { "name": "systemd-hibernate-resume@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-hibernate.service": { "name": "systemd-hibernate.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hostnamed.service": { "name": "systemd-hostnamed.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hwdb-update.service": { "name": "systemd-hwdb-update.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-hybrid-sleep.service": { "name": "systemd-hybrid-sleep.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-importd.service": { "name": "systemd-importd.service", "source": "systemd", 
"state": "inactive", "status": "static" }, "systemd-initctl.service": { "name": "systemd-initctl.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journal-catalog-update.service": { "name": "systemd-journal-catalog-update.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journal-flush.service": { "name": "systemd-journal-flush.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journald.service": { "name": "systemd-journald.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-kexec.service": { "name": "systemd-kexec.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-localed.service": { "name": "systemd-localed.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-logind.service": { "name": "systemd-logind.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-machine-id-commit.service": { "name": "systemd-machine-id-commit.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-machined.service": { "name": "systemd-machined.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-modules-load.service": { "name": "systemd-modules-load.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-nspawn@.service": { "name": "systemd-nspawn@.service", "source": "systemd", "state": "unknown", "status": "disabled" }, "systemd-poweroff.service": { "name": "systemd-poweroff.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-quotacheck.service": { "name": "systemd-quotacheck.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-random-seed.service": { "name": "systemd-random-seed.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-readahead-collect.service": { "name": "systemd-readahead-collect.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "systemd-readahead-done.service": { "name": "systemd-readahead-done.service", "source": "systemd", "state": "stopped", "status": "indirect" }, "systemd-readahead-drop.service": { "name": "systemd-readahead-drop.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "systemd-readahead-replay.service": { "name": "systemd-readahead-replay.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "systemd-reboot.service": { "name": "systemd-reboot.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-remount-fs.service": { "name": "systemd-remount-fs.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-rfkill@.service": { "name": "systemd-rfkill@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-shutdownd.service": { "name": "systemd-shutdownd.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-suspend.service": { "name": "systemd-suspend.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-sysctl.service": { "name": "systemd-sysctl.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-timedated.service": { "name": "systemd-timedated.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-tmpfiles-clean.service": { "name": "systemd-tmpfiles-clean.service", "source": "systemd", "state": "stopped", "status": "static" }, 
"systemd-tmpfiles-setup-dev.service": { "name": "systemd-tmpfiles-setup-dev.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles-setup.service": { "name": "systemd-tmpfiles-setup.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udev-settle.service": { "name": "systemd-udev-settle.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udev-trigger.service": { "name": "systemd-udev-trigger.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udevd.service": { "name": "systemd-udevd.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-update-done.service": { "name": "systemd-update-done.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-update-utmp-runlevel.service": { "name": "systemd-update-utmp-runlevel.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-update-utmp.service": { "name": "systemd-update-utmp.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-user-sessions.service": { "name": "systemd-user-sessions.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-vconsole-setup.service": { "name": "systemd-vconsole-setup.service", "source": "systemd", "state": "stopped", "status": "static" }, "target.service": { "name": "target.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "targetclid.service": { "name": "targetclid.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "teamd@.service": { "name": "teamd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "tuned.service": { "name": "tuned.service", "source": "systemd", "state": "running", "status": "enabled" }, "wpa_supplicant.service": { "name": "wpa_supplicant.service", "source": "systemd", "state": "inactive", "status": "disabled" } } }, "changed": false } TASK [fedora.linux_system_roles.storage : Set storage_cryptsetup_services] ***** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:57 Friday 17 January 2025 04:15:08 -0500 (0:00:01.061) 0:05:01.187 ******** ok: [managed-node1] => { "ansible_facts": { "storage_cryptsetup_services": [ "systemd-cryptsetup@luks\\x2d35e15af6\\x2d60a8\\x2d41b2\\x2d94e6\\x2d05460ad82662.service" ] }, "changed": false } TASK [fedora.linux_system_roles.storage : Mask the systemd cryptsetup services] *** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:63 Friday 17 January 2025 04:15:08 -0500 (0:00:00.056) 0:05:01.244 ******** changed: [managed-node1] => (item=systemd-cryptsetup@luks\x2d35e15af6\x2d60a8\x2d41b2\x2d94e6\x2d05460ad82662.service) => { "ansible_loop_var": "item", "changed": true, "item": "systemd-cryptsetup@luks\\x2d35e15af6\\x2d60a8\\x2d41b2\\x2d94e6\\x2d05460ad82662.service", "name": "systemd-cryptsetup@luks\\x2d35e15af6\\x2d60a8\\x2d41b2\\x2d94e6\\x2d05460ad82662.service", "status": { "ActiveEnterTimestampMonotonic": "0", "ActiveExitTimestampMonotonic": "0", "ActiveState": "inactive", "After": "systemd-journald.socket cryptsetup-pre.target system-systemd\\x2dcryptsetup.slice systemd-readahead-collect.service dev-sda1.device systemd-readahead-replay.service", "AllowIsolate": "no", "AmbientCapabilities": "0", "AssertResult": "no", "AssertTimestampMonotonic": "0", "Before": "umount.target cryptsetup.target", "BindsTo": 
"dev-sda1.device", "BlockIOAccounting": "no", "BlockIOWeight": "18446744073709551615", "CPUAccounting": "no", "CPUQuotaPerSecUSec": "infinity", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUSchedulingResetOnFork": "no", "CPUShares": "18446744073709551615", "CanIsolate": "no", "CanReload": "no", "CanStart": "yes", "CanStop": "yes", "CapabilityBoundingSet": "18446744073709551615", "CollectMode": "inactive", "ConditionResult": "no", "ConditionTimestampMonotonic": "0", "Conflicts": "umount.target", "ControlPID": "0", "DefaultDependencies": "no", "Delegate": "no", "Description": "Cryptography Setup for luks-35e15af6-60a8-41b2-94e6-05460ad82662", "DevicePolicy": "auto", "Documentation": "man:crypttab(5) man:systemd-cryptsetup-generator(8) man:systemd-cryptsetup@.service(8)", "ExecMainCode": "0", "ExecMainExitTimestampMonotonic": "0", "ExecMainPID": "0", "ExecMainStartTimestampMonotonic": "0", "ExecMainStatus": "0", "ExecStart": "{ path=/usr/lib/systemd/systemd-cryptsetup ; argv[]=/usr/lib/systemd/systemd-cryptsetup attach luks-35e15af6-60a8-41b2-94e6-05460ad82662 /dev/sda1 - ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecStop": "{ path=/usr/lib/systemd/systemd-cryptsetup ; argv[]=/usr/lib/systemd/systemd-cryptsetup detach luks-35e15af6-60a8-41b2-94e6-05460ad82662 ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "FailureAction": "none", "FileDescriptorStoreMax": "0", "FragmentPath": "/run/systemd/generator/systemd-cryptsetup@luks\\x2d35e15af6\\x2d60a8\\x2d41b2\\x2d94e6\\x2d05460ad82662.service", "GuessMainPID": "yes", "IOScheduling": "0", "Id": "systemd-cryptsetup@luks\\x2d35e15af6\\x2d60a8\\x2d41b2\\x2d94e6\\x2d05460ad82662.service", "IgnoreOnIsolate": "yes", "IgnoreOnSnapshot": "no", "IgnoreSIGPIPE": "yes", "InactiveEnterTimestampMonotonic": "0", "InactiveExitTimestampMonotonic": "0", "JobTimeoutAction": "none", "JobTimeoutUSec": "0", "KillMode": "control-group", "KillSignal": "15", "LimitAS": "18446744073709551615", "LimitCORE": "18446744073709551615", "LimitCPU": "18446744073709551615", "LimitDATA": "18446744073709551615", "LimitFSIZE": "18446744073709551615", "LimitLOCKS": "18446744073709551615", "LimitMEMLOCK": "65536", "LimitMSGQUEUE": "819200", "LimitNICE": "0", "LimitNOFILE": "4096", "LimitNPROC": "14311", "LimitRSS": "18446744073709551615", "LimitRTPRIO": "0", "LimitRTTIME": "18446744073709551615", "LimitSIGPENDING": "14311", "LimitSTACK": "18446744073709551615", "LoadState": "loaded", "MainPID": "0", "MemoryAccounting": "no", "MemoryCurrent": "18446744073709551615", "MemoryLimit": "18446744073709551615", "MountFlags": "0", "Names": "systemd-cryptsetup@luks\\x2d35e15af6\\x2d60a8\\x2d41b2\\x2d94e6\\x2d05460ad82662.service", "NeedDaemonReload": "yes", "Nice": "0", "NoNewPrivileges": "no", "NonBlocking": "no", "NotifyAccess": "none", "OOMScoreAdjust": "0", "OnFailureJobMode": "replace", "PermissionsStartOnly": "no", "PrivateDevices": "no", "PrivateNetwork": "no", "PrivateTmp": "no", "ProtectHome": "no", "ProtectSystem": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "RemainAfterExit": "yes", "RequiredBy": "cryptsetup.target", "Requires": "system-systemd\\x2dcryptsetup.slice", "Restart": "no", "RestartUSec": "100ms", "Result": "success", "RootDirectoryStartOnly": "no", "RuntimeDirectoryMode": "0755", "SameProcessGroup": "no", "SecureBits": "0", "SendSIGHUP": "no", "SendSIGKILL": "yes", "Slice": "system-systemd\\x2dcryptsetup.slice", "SourcePath": "/etc/crypttab", 
"StandardError": "inherit", "StandardInput": "null", "StandardOutput": "journal", "StartLimitAction": "none", "StartLimitBurst": "5", "StartLimitInterval": "10000000", "StartupBlockIOWeight": "18446744073709551615", "StartupCPUShares": "18446744073709551615", "StatusErrno": "0", "StopWhenUnneeded": "no", "SubState": "dead", "SyslogLevelPrefix": "yes", "SyslogPriority": "30", "SystemCallErrorNumber": "0", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "TasksAccounting": "no", "TasksCurrent": "18446744073709551615", "TasksMax": "18446744073709551615", "TimeoutStartUSec": "0", "TimeoutStopUSec": "0", "TimerSlackNSec": "50000", "Transient": "no", "Type": "oneshot", "UMask": "0022", "UnitFilePreset": "disabled", "UnitFileState": "bad", "WantedBy": "dev-sda1.device", "WatchdogTimestampMonotonic": "0", "WatchdogUSec": "0" } } TASK [fedora.linux_system_roles.storage : Manage the pools and volumes to match the specified state] *** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:69 Friday 17 January 2025 04:15:09 -0500 (0:00:00.494) 0:05:01.738 ******** fatal: [managed-node1]: FAILED! => { "actions": [], "changed": false, "crypts": [], "leaves": [], "mounts": [], "packages": [], "pools": [], "volumes": [] } MSG: cannot remove existing formatting on device 'sda1' in safe mode due to adding encryption TASK [fedora.linux_system_roles.storage : Failed message] ********************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:109 Friday 17 January 2025 04:15:13 -0500 (0:00:03.961) 0:05:05.700 ******** fatal: [managed-node1]: FAILED! => { "changed": false } MSG: {u'_ansible_no_log': False, u'crypts': [], u'pools': [], u'leaves': [], u'changed': False, u'actions': [], u'failed': True, u'volumes': [], u'invocation': {u'module_args': {u'packages_only': False, u'disklabel_type': None, u'diskvolume_mkfs_option_map': {u'ext4': u'-F', u'ext3': u'-F', u'ext2': u'-F'}, u'safe_mode': True, u'pools': [{u'raid_metadata_version': None, u'encryption_key_size': None, u'encryption_key': None, u'encryption_luks_version': None, u'encryption_tang_url': None, u'raid_spare_count': None, u'grow_to_fill': False, u'encryption_tang_thumbprint': None, u'name': u'foo', u'encryption_password': None, u'encryption': False, u'disks': [u'sda'], u'raid_level': None, u'raid_device_count': None, u'state': u'present', u'volumes': [{u'raid_metadata_version': None, u'mount_device_identifier': u'uuid', u'fs_type': u'xfs', u'mount_options': u'defaults', u'size': u'4g', u'mount_point': u'/opt/test1', u'compression': None, u'encryption_password': u'VALUE_SPECIFIED_IN_NO_LOG_PARAMETER', u'encryption': True, u'raid_level': None, u'raid_device_count': None, u'state': u'present', u'vdo_pool_size': None, u'mount_mode': None, u'thin_pool_name': None, u'type': u'partition', u'encryption_key_size': None, u'deduplication': None, u'encryption_cipher': None, u'encryption_key': None, u'fs_label': u'', u'encryption_luks_version': u'luks2', u'raid_stripe_size': None, u'mount_passno': 0, u'mount_user': None, u'raid_spare_count': None, u'name': u'test1', u'cache_mode': None, u'raid_disks': [], u'mount_group': None, u'fs_overwrite_existing': True, u'disks': [u'sda'], u'cached': False, u'thin_pool_size': None, u'thin': False, u'mount_check': 0, u'cache_size': 0, u'raid_chunk_size': None, u'cache_devices': [], u'fs_create_options': u''}], u'shared': False, u'encryption_clevis_pin': None, u'type': u'partition', 
u'encryption_cipher': None, u'raid_chunk_size': None}], u'volumes': [], u'pool_defaults': {u'raid_metadata_version': None, u'encryption_cipher': None, u'encryption_key': None, u'encryption_luks_version': None, u'raid_spare_count': None, u'grow_to_fill': False, u'encryption_password': None, u'encryption': False, u'disks': [], u'raid_level': None, u'raid_device_count': None, u'state': u'present', u'volumes': [], u'shared': False, u'type': u'lvm', u'encryption_key_size': None, u'raid_chunk_size': None}, u'volume_defaults': {u'raid_metadata_version': None, u'raid_level': None, u'fs_type': u'xfs', u'mount_options': u'defaults', u'size': 0, u'mount_point': u'', u'compression': None, u'encryption_password': None, u'encryption': False, u'mount_device_identifier': u'uuid', u'raid_device_count': None, u'state': u'present', u'vdo_pool_size': None, u'thin_pool_name': None, u'type': u'lvm', u'encryption_key_size': None, u'encryption_cipher': None, u'encryption_key': None, u'fs_label': u'', u'encryption_luks_version': None, u'raid_stripe_size': None, u'cache_size': 0, u'raid_spare_count': None, u'cache_mode': None, u'deduplication': None, u'cached': False, u'fs_overwrite_existing': True, u'disks': [], u'thin_pool_size': None, u'thin': None, u'mount_check': 0, u'mount_passno': 0, u'raid_chunk_size': None, u'cache_devices': [], u'fs_create_options': u''}, u'use_partitions': None}}, u'mounts': [], u'packages': [], u'msg': u"cannot remove existing formatting on device 'sda1' in safe mode due to adding encryption"} TASK [fedora.linux_system_roles.storage : Unmask the systemd cryptsetup services] *** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:113 Friday 17 January 2025 04:15:13 -0500 (0:00:00.061) 0:05:05.762 ******** changed: [managed-node1] => (item=systemd-cryptsetup@luks\x2d35e15af6\x2d60a8\x2d41b2\x2d94e6\x2d05460ad82662.service) => { "ansible_loop_var": "item", "changed": true, "item": "systemd-cryptsetup@luks\\x2d35e15af6\\x2d60a8\\x2d41b2\\x2d94e6\\x2d05460ad82662.service", "name": "systemd-cryptsetup@luks\\x2d35e15af6\\x2d60a8\\x2d41b2\\x2d94e6\\x2d05460ad82662.service", "status": { "ActiveEnterTimestampMonotonic": "0", "ActiveExitTimestampMonotonic": "0", "ActiveState": "inactive", "AllowIsolate": "no", "AmbientCapabilities": "0", "AssertResult": "no", "AssertTimestampMonotonic": "0", "BlockIOAccounting": "no", "BlockIOWeight": "18446744073709551615", "CPUAccounting": "no", "CPUQuotaPerSecUSec": "infinity", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUSchedulingResetOnFork": "no", "CPUShares": "18446744073709551615", "CanIsolate": "no", "CanReload": "no", "CanStart": "yes", "CanStop": "yes", "CapabilityBoundingSet": "18446744073709551615", "CollectMode": "inactive", "ConditionResult": "no", "ConditionTimestampMonotonic": "0", "ControlPID": "0", "DefaultDependencies": "yes", "Delegate": "no", "Description": "systemd-cryptsetup@luks\\x2d35e15af6\\x2d60a8\\x2d41b2\\x2d94e6\\x2d05460ad82662.service", "DevicePolicy": "auto", "ExecMainCode": "0", "ExecMainExitTimestampMonotonic": "0", "ExecMainPID": "0", "ExecMainStartTimestampMonotonic": "0", "ExecMainStatus": "0", "FailureAction": "none", "FileDescriptorStoreMax": "0", "FragmentPath": "/dev/null", "GuessMainPID": "yes", "IOScheduling": "0", "Id": "systemd-cryptsetup@luks\\x2d35e15af6\\x2d60a8\\x2d41b2\\x2d94e6\\x2d05460ad82662.service", "IgnoreOnIsolate": "no", "IgnoreOnSnapshot": "no", "IgnoreSIGPIPE": "yes", "InactiveEnterTimestampMonotonic": "0", 
"InactiveExitTimestampMonotonic": "0", "JobTimeoutAction": "none", "JobTimeoutUSec": "0", "KillMode": "control-group", "KillSignal": "15", "LimitAS": "18446744073709551615", "LimitCORE": "18446744073709551615", "LimitCPU": "18446744073709551615", "LimitDATA": "18446744073709551615", "LimitFSIZE": "18446744073709551615", "LimitLOCKS": "18446744073709551615", "LimitMEMLOCK": "65536", "LimitMSGQUEUE": "819200", "LimitNICE": "0", "LimitNOFILE": "1048576", "LimitNPROC": "14311", "LimitRSS": "18446744073709551615", "LimitRTPRIO": "0", "LimitRTTIME": "18446744073709551615", "LimitSIGPENDING": "14311", "LimitSTACK": "18446744073709551615", "LoadState": "masked", "MainPID": "0", "MemoryAccounting": "no", "MemoryCurrent": "18446744073709551615", "MemoryLimit": "18446744073709551615", "MountFlags": "0", "Names": "systemd-cryptsetup@luks\\x2d35e15af6\\x2d60a8\\x2d41b2\\x2d94e6\\x2d05460ad82662.service", "NeedDaemonReload": "no", "Nice": "0", "NoNewPrivileges": "no", "NonBlocking": "no", "NotifyAccess": "none", "OOMScoreAdjust": "0", "OnFailureJobMode": "replace", "PermissionsStartOnly": "no", "PrivateDevices": "no", "PrivateNetwork": "no", "PrivateTmp": "no", "ProtectHome": "no", "ProtectSystem": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "RemainAfterExit": "no", "Restart": "no", "RestartUSec": "100ms", "Result": "success", "RootDirectoryStartOnly": "no", "RuntimeDirectoryMode": "0755", "SameProcessGroup": "no", "SecureBits": "0", "SendSIGHUP": "no", "SendSIGKILL": "yes", "StandardError": "inherit", "StandardInput": "null", "StandardOutput": "inherit", "StartLimitAction": "none", "StartLimitBurst": "5", "StartLimitInterval": "10000000", "StartupBlockIOWeight": "18446744073709551615", "StartupCPUShares": "18446744073709551615", "StatusErrno": "0", "StopWhenUnneeded": "no", "SubState": "dead", "SyslogLevelPrefix": "yes", "SyslogPriority": "30", "SystemCallErrorNumber": "0", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "TasksAccounting": "no", "TasksCurrent": "18446744073709551615", "TasksMax": "18446744073709551615", "TimeoutStartUSec": "1min 30s", "TimeoutStopUSec": "1min 30s", "TimerSlackNSec": "50000", "Transient": "no", "UMask": "0022", "UnitFileState": "bad", "WatchdogTimestampMonotonic": "0", "WatchdogUSec": "0" } } TASK [Check that we failed in the role] **************************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-failed.yml:23 Friday 17 January 2025 04:15:13 -0500 (0:00:00.543) 0:05:06.306 ******** ok: [managed-node1] => { "changed": false } MSG: All assertions passed TASK [Verify the blivet output and error message are correct] ****************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-failed.yml:28 Friday 17 January 2025 04:15:13 -0500 (0:00:00.043) 0:05:06.350 ******** ok: [managed-node1] => { "changed": false } MSG: All assertions passed TASK [Verify correct exception or error message] ******************************* task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-failed.yml:39 Friday 17 January 2025 04:15:13 -0500 (0:00:00.053) 0:05:06.404 ******** skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Stat the file] *********************************************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-data-preservation.yml:11 Friday 17 
TASK [Stat the file] *********************************************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-data-preservation.yml:11 Friday 17 January 2025 04:15:13 -0500 (0:00:00.037) 0:05:06.442 ******** ok: [managed-node1] => { "changed": false, "stat": { "atime": 1737105297.6099882, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 0, "charset": "binary", "checksum": "da39a3ee5e6b4b0d3255bfef95601890afd80709", "ctime": 1737105297.6099882, "dev": 2049, "device_type": 0, "executable": false, "exists": true, "gid": 0, "gr_name": "root", "inode": 67, "isblk": false, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": true, "issock": false, "isuid": false, "mimetype": "inode/x-empty", "mode": "0644", "mtime": 1737105297.6099882, "nlink": 1, "path": "/opt/test1/quux", "pw_name": "root", "readable": true, "rgrp": true, "roth": true, "rusr": true, "size": 0, "uid": 0, "version": "18446744072005817143", "wgrp": false, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } }
TASK [Assert file presence] **************************************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-data-preservation.yml:16 Friday 17 January 2025 04:15:14 -0500 (0:00:00.414) 0:05:06.856 ******** ok: [managed-node1] => { "changed": false } MSG: All assertions passed
TASK [Create a key file] ******************************************************* task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/tests_luks2.yml:323 Friday 17 January 2025 04:15:14 -0500 (0:00:00.063) 0:05:06.919 ******** ok: [managed-node1] => { "changed": false, "gid": 0, "group": "root", "mode": "0600", "owner": "root", "path": "/tmp/storage_testyWoW78lukskey", "secontext": "unconfined_u:object_r:user_tmp_t:s0", "size": 0, "state": "file", "uid": 0 }
TASK [Write the key into the key file] ***************************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/tests_luks2.yml:330 Friday 17 January 2025 04:15:15 -0500 (0:00:00.834) 0:05:07.754 ******** ok: [managed-node1] => { "changed": false, "checksum": "7a4dff3752e2baf5617c57eaac048e2b95e8af91", "dest": "/tmp/storage_testyWoW78lukskey", "gid": 0, "group": "root", "md5sum": "4ac07b967150835c00d0865161e48744", "mode": "0600", "owner": "root", "secontext": "unconfined_u:object_r:user_tmp_t:s0", "size": 32, "src": "/root/.ansible/tmp/ansible-tmp-1737105315.13-25540-63369759493750/source", "state": "file", "uid": 0 }
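[NOTE]: the two tasks above stage the LUKS passphrase in a root-only file instead of passing encryption_password inline, which is what the next run's encryption_key option expects. A sketch of that preparation, assuming a hypothetical luks_key variable holding the 32-byte key material:

    - name: Create a key file
      tempfile:
        state: file
        prefix: storage_test
        suffix: lukskey
      register: luks_key_file

    - name: Write the key into the key file
      copy:
        dest: "{{ luks_key_file.path }}"
        content: "{{ luks_key }}"   # hypothetical variable with the key material
        mode: "0600"                # matches the root-only mode reported above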
TASK [Add encryption to the volume] ******************************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/tests_luks2.yml:337 Friday 17 January 2025 04:15:16 -0500 (0:00:01.136) 0:05:08.890 ********
TASK [fedora.linux_system_roles.storage : Set platform/version specific variables] *** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:2 Friday 17 January 2025 04:15:16 -0500 (0:00:00.082) 0:05:08.972 ******** included: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml for managed-node1
TASK [fedora.linux_system_roles.storage : Ensure ansible_facts used by role] *** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:2 Friday 17 January 2025 04:15:16 -0500 (0:00:00.126) 0:05:09.099 ******** skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" }
TASK [fedora.linux_system_roles.storage : Set platform/version specific variables] *** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:7 Friday 17 January 2025 04:15:16 -0500 (0:00:00.066) 0:05:09.165 ******** skipping: [managed-node1] => (item=RedHat.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat.yml", "skip_reason": "Conditional result was False" } skipping: [managed-node1] => (item=CentOS.yml) => { "ansible_loop_var": "item", "changed": false, "item": "CentOS.yml", "skip_reason": "Conditional result was False" } ok: [managed-node1] => (item=CentOS_7.yml) => { "ansible_facts": { "__storage_blivet_diskvolume_mkfs_option_map": { "ext2": "-F", "ext3": "-F", "ext4": "-F" }, "blivet_package_list": [ "python-enum34", "python-blivet3", "libblockdev-crypto", "libblockdev-dm", "libblockdev-lvm", "libblockdev-mdraid", "libblockdev-swap", "{{ 'libblockdev-s390' if ansible_architecture == 's390x' else 'libblockdev' }}" ] }, "ansible_included_var_files": [ "/tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/vars/CentOS_7.yml" ], "ansible_loop_var": "item", "changed": false, "item": "CentOS_7.yml" } skipping: [managed-node1] => (item=CentOS_7.9.yml) => { "ansible_loop_var": "item", "changed": false, "item": "CentOS_7.9.yml", "skip_reason": "Conditional result was False" }
TASK [fedora.linux_system_roles.storage : Check if system is ostree] *********** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:25 Friday 17 January 2025 04:15:16 -0500 (0:00:00.136) 0:05:09.302 ******** skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" }
TASK [fedora.linux_system_roles.storage : Set flag to indicate system is ostree] *** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:30 Friday 17 January 2025 04:15:16 -0500 (0:00:00.037) 0:05:09.339 ******** skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" }
TASK [fedora.linux_system_roles.storage : Define an empty list of pools to be used in testing] *** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:5 Friday 17 January 2025 04:15:16 -0500 (0:00:00.038) 0:05:09.378 ******** ok: [managed-node1] => { "ansible_facts": { "_storage_pools_list": [] }, "changed": false }
TASK [fedora.linux_system_roles.storage : Define an empty list of volumes to be used in testing] *** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:9 Friday 17 January 2025 04:15:16 -0500 (0:00:00.040) 0:05:09.418 ******** ok: [managed-node1] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false }
TASK [fedora.linux_system_roles.storage : Include the appropriate provider tasks] *** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:13 Friday 17 January 2025 04:15:16 -0500 (0:00:00.066) 0:05:09.485 ******** included: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml for managed-node1
TASK [fedora.linux_system_roles.storage : Make sure blivet is available] ******* task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:2 Friday 17 January 2025 04:15:16 -0500 (0:00:00.163) 0:05:09.648 ******** ok: [managed-node1] => { "changed": false, 
"rc": 0, "results": [ "python-enum34-1.0.4-1.el7.noarch providing python-enum34 is already installed", "1:python2-blivet3-3.1.3-3.el7.noarch providing python-blivet3 is already installed", "libblockdev-crypto-2.18-5.el7.x86_64 providing libblockdev-crypto is already installed", "libblockdev-dm-2.18-5.el7.x86_64 providing libblockdev-dm is already installed", "libblockdev-lvm-2.18-5.el7.x86_64 providing libblockdev-lvm is already installed", "libblockdev-mdraid-2.18-5.el7.x86_64 providing libblockdev-mdraid is already installed", "libblockdev-swap-2.18-5.el7.x86_64 providing libblockdev-swap is already installed", "libblockdev-2.18-5.el7.x86_64 providing libblockdev is already installed" ] } TASK [fedora.linux_system_roles.storage : Show storage_pools] ****************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:9 Friday 17 January 2025 04:15:19 -0500 (0:00:02.231) 0:05:11.880 ******** ok: [managed-node1] => { "storage_pools": [ { "disks": [ "sda" ], "name": "foo", "type": "partition", "volumes": [ { "encryption": true, "encryption_key": "/tmp/storage_testyWoW78lukskey", "encryption_luks_version": "luks2", "mount_point": "/opt/test1", "name": "test1", "size": "4g", "type": "partition" } ] } ] } TASK [fedora.linux_system_roles.storage : Show storage_volumes] **************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:14 Friday 17 January 2025 04:15:19 -0500 (0:00:00.071) 0:05:11.952 ******** ok: [managed-node1] => { "storage_volumes": "VARIABLE IS NOT DEFINED!: 'storage_volumes' is undefined" } TASK [fedora.linux_system_roles.storage : Get required packages] *************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:19 Friday 17 January 2025 04:15:19 -0500 (0:00:00.061) 0:05:12.014 ******** ok: [managed-node1] => { "actions": [], "changed": false, "crypts": [], "leaves": [], "mounts": [], "packages": [ "cryptsetup" ], "pools": [], "volumes": [] } TASK [fedora.linux_system_roles.storage : Enable copr repositories if needed] *** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:31 Friday 17 January 2025 04:15:23 -0500 (0:00:04.074) 0:05:16.088 ******** included: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml for managed-node1 TASK [fedora.linux_system_roles.storage : Check if the COPR support packages should be installed] *** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:2 Friday 17 January 2025 04:15:23 -0500 (0:00:00.160) 0:05:16.249 ******** TASK [fedora.linux_system_roles.storage : Make sure COPR support packages are present] *** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:13 Friday 17 January 2025 04:15:23 -0500 (0:00:00.184) 0:05:16.434 ******** skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Enable COPRs] ************************ task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:19 Friday 17 January 2025 04:15:23 -0500 (0:00:00.057) 0:05:16.491 ******** TASK [fedora.linux_system_roles.storage : Make sure required packages are installed] *** task path: 
TASK [fedora.linux_system_roles.storage : Show storage_volumes] **************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:14 Friday 17 January 2025 04:15:19 -0500 (0:00:00.071) 0:05:11.952 ******** ok: [managed-node1] => { "storage_volumes": "VARIABLE IS NOT DEFINED!: 'storage_volumes' is undefined" }
TASK [fedora.linux_system_roles.storage : Get required packages] *************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:19 Friday 17 January 2025 04:15:19 -0500 (0:00:00.061) 0:05:12.014 ******** ok: [managed-node1] => { "actions": [], "changed": false, "crypts": [], "leaves": [], "mounts": [], "packages": [ "cryptsetup" ], "pools": [], "volumes": [] }
TASK [fedora.linux_system_roles.storage : Enable copr repositories if needed] *** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:31 Friday 17 January 2025 04:15:23 -0500 (0:00:04.074) 0:05:16.088 ******** included: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml for managed-node1
TASK [fedora.linux_system_roles.storage : Check if the COPR support packages should be installed] *** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:2 Friday 17 January 2025 04:15:23 -0500 (0:00:00.160) 0:05:16.249 ********
TASK [fedora.linux_system_roles.storage : Make sure COPR support packages are present] *** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:13 Friday 17 January 2025 04:15:23 -0500 (0:00:00.184) 0:05:16.434 ******** skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" }
TASK [fedora.linux_system_roles.storage : Enable COPRs] ************************ task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:19 Friday 17 January 2025 04:15:23 -0500 (0:00:00.057) 0:05:16.491 ********
TASK [fedora.linux_system_roles.storage : Make sure required packages are installed] *** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:37 Friday 17 January 2025 04:15:23 -0500 (0:00:00.052) 0:05:16.544 ******** ok: [managed-node1] => { "changed": false, "rc": 0, "results": [ "cryptsetup-2.0.3-6.el7.x86_64 providing cryptsetup is already installed", "kpartx-0.4.9-136.el7_9.x86_64 providing kpartx is already installed" ] }
TASK [fedora.linux_system_roles.storage : Get service facts] ******************* task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:51 Friday 17 January 2025 04:15:24 -0500 (0:00:00.765) 0:05:17.310 ******** ok: [managed-node1] => { "ansible_facts": { "services": { ... } }, "changed": false }
[NOTE]: the full service inventory returned by this task is a verbatim repeat of the listing shown for the earlier Get service facts task and has been elided here; the only visible difference is that the generated systemd-cryptsetup@luks\x2d35e15af6\x2d60a8\x2d41b2\x2d94e6\x2d05460ad82662.service unit no longer appears, which is why the next task finds nothing to mask.
TASK [fedora.linux_system_roles.storage : Set storage_cryptsetup_services] ***** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:57 Friday 17 January 2025 04:15:25 -0500 (0:00:01.245) 0:05:18.555 ******** ok: [managed-node1] => { "ansible_facts": { "storage_cryptsetup_services": [] }, "changed": false }
Mask the systemd cryptsetup services] *** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:63 Friday 17 January 2025 04:15:25 -0500 (0:00:00.102) 0:05:18.657 ******** TASK [fedora.linux_system_roles.storage : Manage the pools and volumes to match the specified state] *** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:69 Friday 17 January 2025 04:15:26 -0500 (0:00:00.093) 0:05:18.751 ******** changed: [managed-node1] => { "actions": [ { "action": "destroy format", "device": "/dev/sda1", "fs_type": "xfs" }, { "action": "create format", "device": "/dev/sda1", "fs_type": "luks" }, { "action": "create device", "device": "/dev/mapper/luks-0989597c-795f-4f25-b2f2-121f3eb103c2", "fs_type": null }, { "action": "create format", "device": "/dev/mapper/luks-0989597c-795f-4f25-b2f2-121f3eb103c2", "fs_type": "xfs" } ], "changed": true, "crypts": [ { "backing_device": "/dev/sda1", "name": "luks-0989597c-795f-4f25-b2f2-121f3eb103c2", "password": "VALUE_SPECIFIED_IN_NO_LOG_PARAMETER", "state": "present" } ], "leaves": [ "/dev/sdb", "/dev/sdc", "/dev/sdd", "/dev/sde", "/dev/sdf", "/dev/sdg", "/dev/sdh", "/dev/sdi", "/dev/xvda1", "/dev/mapper/luks-0989597c-795f-4f25-b2f2-121f3eb103c2" ], "mounts": [ { "fstype": "xfs", "path": "/opt/test1", "src": "UUID=173e2a19-a65c-4bfd-a311-0de3d72abddf", "state": "absent" }, { "dump": 0, "fstype": "xfs", "group": null, "mode": null, "opts": "defaults", "owner": null, "passno": 0, "path": "/opt/test1", "src": "/dev/mapper/luks-0989597c-795f-4f25-b2f2-121f3eb103c2", "state": "mounted" } ], "packages": [ "cryptsetup", "xfsprogs", "e2fsprogs" ], "pools": [ { "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_clevis_pin": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "encryption_tang_thumbprint": null, "encryption_tang_url": null, "grow_to_fill": false, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "shared": false, "state": "present", "type": "partition", "volumes": [ { "_device": "/dev/mapper/luks-0989597c-795f-4f25-b2f2-121f3eb103c2", "_kernel_device": "/dev/dm-0", "_mount_id": "/dev/mapper/luks-0989597c-795f-4f25-b2f2-121f3eb103c2", "_raw_device": "/dev/sda1", "_raw_kernel_device": "/dev/sda1", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": true, "encryption_cipher": null, "encryption_key": "VALUE_SPECIFIED_IN_NO_LOG_PARAMETER", "encryption_key_size": null, "encryption_luks_version": "luks2", "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "mount_user": null, "name": "test1", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": "4g", "state": "present", "thin": false, "thin_pool_name": null, "thin_pool_size": null, "type": "partition", "vdo_pool_size": null } ] } ], "volumes": [] } TASK [fedora.linux_system_roles.storage : Workaround for udev issue on some platforms] *** task 
path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:83 Friday 17 January 2025 04:15:38 -0500 (0:00:12.931) 0:05:31.682 ******** skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Check if /etc/fstab is present] ****** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:90 Friday 17 January 2025 04:15:39 -0500 (0:00:00.066) 0:05:31.748 ******** ok: [managed-node1] => { "changed": false, "stat": { "atime": 1737105285.88198, "attr_flags": "e", "attributes": [ "extents" ], "block_size": 4096, "blocks": 8, "charset": "us-ascii", "checksum": "249b751cae0251b6e0c69fad9f818d9a6eb71baf", "ctime": 1737105285.8799798, "dev": 51713, "device_type": 0, "executable": false, "exists": true, "gid": 0, "gr_name": "root", "inode": 263644, "isblk": false, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": true, "issock": false, "isuid": false, "mimetype": "text/plain", "mode": "0644", "mtime": 1737105285.8799798, "nlink": 1, "path": "/etc/fstab", "pw_name": "root", "readable": true, "rgrp": true, "roth": true, "rusr": true, "size": 1299, "uid": 0, "version": "18446744072031193646", "wgrp": false, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [fedora.linux_system_roles.storage : Add fingerprint to /etc/fstab if present] *** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:95 Friday 17 January 2025 04:15:39 -0500 (0:00:00.508) 0:05:32.257 ******** ok: [managed-node1] => { "backup": "", "changed": false } TASK [fedora.linux_system_roles.storage : Unmask the systemd cryptsetup services] *** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:113 Friday 17 January 2025 04:15:40 -0500 (0:00:00.655) 0:05:32.912 ******** TASK [fedora.linux_system_roles.storage : Show blivet_output] ****************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:119 Friday 17 January 2025 04:15:40 -0500 (0:00:00.105) 0:05:33.017 ******** ok: [managed-node1] => { "blivet_output": { "actions": [ { "action": "destroy format", "device": "/dev/sda1", "fs_type": "xfs" }, { "action": "create format", "device": "/dev/sda1", "fs_type": "luks" }, { "action": "create device", "device": "/dev/mapper/luks-0989597c-795f-4f25-b2f2-121f3eb103c2", "fs_type": null }, { "action": "create format", "device": "/dev/mapper/luks-0989597c-795f-4f25-b2f2-121f3eb103c2", "fs_type": "xfs" } ], "changed": true, "crypts": [ { "backing_device": "/dev/sda1", "name": "luks-0989597c-795f-4f25-b2f2-121f3eb103c2", "password": "VALUE_SPECIFIED_IN_NO_LOG_PARAMETER", "state": "present" } ], "failed": false, "leaves": [ "/dev/sdb", "/dev/sdc", "/dev/sdd", "/dev/sde", "/dev/sdf", "/dev/sdg", "/dev/sdh", "/dev/sdi", "/dev/xvda1", "/dev/mapper/luks-0989597c-795f-4f25-b2f2-121f3eb103c2" ], "mounts": [ { "fstype": "xfs", "path": "/opt/test1", "src": "UUID=173e2a19-a65c-4bfd-a311-0de3d72abddf", "state": "absent" }, { "dump": 0, "fstype": "xfs", "group": null, "mode": null, "opts": "defaults", "owner": null, "passno": 0, "path": "/opt/test1", "src": "/dev/mapper/luks-0989597c-795f-4f25-b2f2-121f3eb103c2", "state": "mounted" } ], "packages": [ "cryptsetup", "xfsprogs", "e2fsprogs" ], "pools": [ 
{ "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_clevis_pin": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "encryption_tang_thumbprint": null, "encryption_tang_url": null, "grow_to_fill": false, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "shared": false, "state": "present", "type": "partition", "volumes": [ { "_device": "/dev/mapper/luks-0989597c-795f-4f25-b2f2-121f3eb103c2", "_kernel_device": "/dev/dm-0", "_mount_id": "/dev/mapper/luks-0989597c-795f-4f25-b2f2-121f3eb103c2", "_raw_device": "/dev/sda1", "_raw_kernel_device": "/dev/sda1", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": true, "encryption_cipher": null, "encryption_key": "VALUE_SPECIFIED_IN_NO_LOG_PARAMETER", "encryption_key_size": null, "encryption_luks_version": "luks2", "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "mount_user": null, "name": "test1", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": "4g", "state": "present", "thin": false, "thin_pool_name": null, "thin_pool_size": null, "type": "partition", "vdo_pool_size": null } ] } ], "volumes": [] } } TASK [fedora.linux_system_roles.storage : Set the list of pools for test verification] *** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:128 Friday 17 January 2025 04:15:40 -0500 (0:00:00.130) 0:05:33.148 ******** ok: [managed-node1] => { "ansible_facts": { "_storage_pools_list": [ { "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_clevis_pin": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "encryption_tang_thumbprint": null, "encryption_tang_url": null, "grow_to_fill": false, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "shared": false, "state": "present", "type": "partition", "volumes": [ { "_device": "/dev/mapper/luks-0989597c-795f-4f25-b2f2-121f3eb103c2", "_kernel_device": "/dev/dm-0", "_mount_id": "/dev/mapper/luks-0989597c-795f-4f25-b2f2-121f3eb103c2", "_raw_device": "/dev/sda1", "_raw_kernel_device": "/dev/sda1", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": true, "encryption_cipher": null, "encryption_key": "VALUE_SPECIFIED_IN_NO_LOG_PARAMETER", "encryption_key_size": null, "encryption_luks_version": "luks2", "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "mount_user": null, "name": "test1", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": 
null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": "4g", "state": "present", "thin": false, "thin_pool_name": null, "thin_pool_size": null, "type": "partition", "vdo_pool_size": null } ] } ] }, "changed": false } TASK [fedora.linux_system_roles.storage : Set the list of volumes for test verification] *** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:132 Friday 17 January 2025 04:15:40 -0500 (0:00:00.096) 0:05:33.244 ******** ok: [managed-node1] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false } TASK [fedora.linux_system_roles.storage : Remove obsolete mounts] ************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:148 Friday 17 January 2025 04:15:40 -0500 (0:00:00.078) 0:05:33.322 ******** changed: [managed-node1] => (item={u'src': u'UUID=173e2a19-a65c-4bfd-a311-0de3d72abddf', u'state': u'absent', u'fstype': u'xfs', u'path': u'/opt/test1'}) => { "ansible_loop_var": "mount_info", "changed": true, "dump": "0", "fstab": "/etc/fstab", "fstype": "xfs", "mount_info": { "fstype": "xfs", "path": "/opt/test1", "src": "UUID=173e2a19-a65c-4bfd-a311-0de3d72abddf", "state": "absent" }, "name": "/opt/test1", "opts": "defaults", "passno": "0", "src": "UUID=173e2a19-a65c-4bfd-a311-0de3d72abddf" } TASK [fedora.linux_system_roles.storage : Tell systemd to refresh its view of /etc/fstab] *** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:159 Friday 17 January 2025 04:15:41 -0500 (0:00:00.434) 0:05:33.757 ******** ok: [managed-node1] => { "changed": false, "name": null, "status": {} } TASK [fedora.linux_system_roles.storage : Set up new/current mounts] *********** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:164 Friday 17 January 2025 04:15:41 -0500 (0:00:00.575) 0:05:34.333 ******** changed: [managed-node1] => (item={u'src': u'/dev/mapper/luks-0989597c-795f-4f25-b2f2-121f3eb103c2', u'group': None, u'dump': 0, u'passno': 0, u'fstype': u'xfs', u'state': u'mounted', u'mode': None, u'owner': None, u'path': u'/opt/test1', u'opts': u'defaults'}) => { "ansible_loop_var": "mount_info", "changed": true, "dump": "0", "fstab": "/etc/fstab", "fstype": "xfs", "mount_info": { "dump": 0, "fstype": "xfs", "group": null, "mode": null, "opts": "defaults", "owner": null, "passno": 0, "path": "/opt/test1", "src": "/dev/mapper/luks-0989597c-795f-4f25-b2f2-121f3eb103c2", "state": "mounted" }, "name": "/opt/test1", "opts": "defaults", "passno": "0", "src": "/dev/mapper/luks-0989597c-795f-4f25-b2f2-121f3eb103c2" } TASK [fedora.linux_system_roles.storage : Manage mount ownership/permissions] *** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:175 Friday 17 January 2025 04:15:42 -0500 (0:00:00.739) 0:05:35.072 ******** skipping: [managed-node1] => (item={u'src': u'/dev/mapper/luks-0989597c-795f-4f25-b2f2-121f3eb103c2', u'group': None, u'dump': 0, u'passno': 0, u'fstype': u'xfs', u'state': u'mounted', u'mode': None, u'owner': None, u'path': u'/opt/test1', u'opts': u'defaults'}) => { "ansible_loop_var": "mount_info", "changed": false, "mount_info": { "dump": 0, "fstype": "xfs", "group": null, "mode": null, "opts": "defaults", "owner": null, "passno": 0, "path": "/opt/test1", "src": 
"/dev/mapper/luks-0989597c-795f-4f25-b2f2-121f3eb103c2", "state": "mounted" }, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Tell systemd to refresh its view of /etc/fstab] *** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:187 Friday 17 January 2025 04:15:42 -0500 (0:00:00.138) 0:05:35.210 ******** ok: [managed-node1] => { "changed": false, "name": null, "status": {} } TASK [fedora.linux_system_roles.storage : Retrieve facts for the /etc/crypttab file] *** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:195 Friday 17 January 2025 04:15:43 -0500 (0:00:00.742) 0:05:35.953 ******** ok: [managed-node1] => { "changed": false, "stat": { "atime": 1737105288.915982, "attr_flags": "e", "attributes": [ "extents" ], "block_size": 4096, "blocks": 0, "charset": "binary", "checksum": "da39a3ee5e6b4b0d3255bfef95601890afd80709", "ctime": 1737105287.0559807, "dev": 51713, "device_type": 0, "executable": false, "exists": true, "gid": 0, "gr_name": "root", "inode": 263647, "isblk": false, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": true, "issock": false, "isuid": false, "mimetype": "inode/x-empty", "mode": "0600", "mtime": 1737105287.0549808, "nlink": 1, "path": "/etc/crypttab", "pw_name": "root", "readable": true, "rgrp": false, "roth": false, "rusr": true, "size": 0, "uid": 0, "version": "18446744072031194333", "wgrp": false, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [fedora.linux_system_roles.storage : Manage /etc/crypttab to account for changes we just made] *** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:200 Friday 17 January 2025 04:15:43 -0500 (0:00:00.472) 0:05:36.425 ******** changed: [managed-node1] => (item={u'state': u'present', u'password': u'VALUE_SPECIFIED_IN_NO_LOG_PARAMETER', u'name': u'luks-0989597c-795f-4f25-b2f2-121f3eb103c2', u'backing_device': u'/dev/sda1'}) => { "ansible_loop_var": "entry", "backup": "", "changed": true, "entry": { "backing_device": "/dev/sda1", "name": "luks-0989597c-795f-4f25-b2f2-121f3eb103c2", "password": "VALUE_SPECIFIED_IN_NO_LOG_PARAMETER", "state": "present" } } MSG: line added TASK [fedora.linux_system_roles.storage : Update facts] ************************ task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:222 Friday 17 January 2025 04:15:44 -0500 (0:00:00.396) 0:05:36.822 ******** ok: [managed-node1] TASK [Verify role results] ***************************************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/tests_luks2.yml:355 Friday 17 January 2025 04:15:44 -0500 (0:00:00.732) 0:05:37.554 ******** included: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml for managed-node1 TASK [Print out pool information] ********************************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:2 Friday 17 January 2025 04:15:44 -0500 (0:00:00.081) 0:05:37.636 ******** ok: [managed-node1] => { "_storage_pools_list": [ { "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_clevis_pin": null, "encryption_key": null, 
"encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "encryption_tang_thumbprint": null, "encryption_tang_url": null, "grow_to_fill": false, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "shared": false, "state": "present", "type": "partition", "volumes": [ { "_device": "/dev/mapper/luks-0989597c-795f-4f25-b2f2-121f3eb103c2", "_kernel_device": "/dev/dm-0", "_mount_id": "/dev/mapper/luks-0989597c-795f-4f25-b2f2-121f3eb103c2", "_raw_device": "/dev/sda1", "_raw_kernel_device": "/dev/sda1", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": true, "encryption_cipher": null, "encryption_key": "VALUE_SPECIFIED_IN_NO_LOG_PARAMETER", "encryption_key_size": null, "encryption_luks_version": "luks2", "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "mount_user": null, "name": "test1", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": "4g", "state": "present", "thin": false, "thin_pool_name": null, "thin_pool_size": null, "type": "partition", "vdo_pool_size": null } ] } ] } TASK [Print out volume information] ******************************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:7 Friday 17 January 2025 04:15:45 -0500 (0:00:00.062) 0:05:37.699 ******** skipping: [managed-node1] => {} TASK [Collect info about the volumes.] 
***************************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:15 Friday 17 January 2025 04:15:45 -0500 (0:00:00.042) 0:05:37.741 ******** ok: [managed-node1] => { "changed": false, "info": { "/dev/mapper/luks-0989597c-795f-4f25-b2f2-121f3eb103c2": { "fstype": "xfs", "label": "", "mountpoint": "/opt/test1", "name": "/dev/mapper/luks-0989597c-795f-4f25-b2f2-121f3eb103c2", "size": "10G", "type": "crypt", "uuid": "ac1c240a-4b60-40aa-ab6a-711d7c5a5551" }, "/dev/sda": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sda", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sda1": { "fstype": "crypto_LUKS", "label": "", "mountpoint": "", "name": "/dev/sda1", "size": "10G", "type": "partition", "uuid": "0989597c-795f-4f25-b2f2-121f3eb103c2" }, "/dev/sdb": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdb", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdc": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdc", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdd": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdd", "size": "1T", "type": "disk", "uuid": "" }, "/dev/sde": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sde", "size": "1T", "type": "disk", "uuid": "" }, "/dev/sdf": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdf", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdg": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdg", "size": "1T", "type": "disk", "uuid": "" }, "/dev/sdh": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdh", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdi": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdi", "size": "10G", "type": "disk", "uuid": "" }, "/dev/xvda": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/xvda", "size": "250G", "type": "disk", "uuid": "" }, "/dev/xvda1": { "fstype": "ext4", "label": "", "mountpoint": "/", "name": "/dev/xvda1", "size": "250G", "type": "partition", "uuid": "c7b7d6a5-fd01-4b9b-bcca-153eaff9d312" } } } TASK [Read the /etc/fstab file for volume existence] *************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:20 Friday 17 January 2025 04:15:45 -0500 (0:00:00.331) 0:05:38.072 ******** ok: [managed-node1] => { "changed": false, "cmd": [ "cat", "/etc/fstab" ], "delta": "0:00:00.002866", "end": "2025-01-17 04:15:45.649348", "rc": 0, "start": "2025-01-17 04:15:45.646482" } STDOUT: # system_role:storage # # /etc/fstab # Created by anaconda on Thu Jun 20 10:23:46 2024 # # Accessible filesystems, by reference, are maintained under '/dev/disk' # See man pages fstab(5), findfs(8), mount(8) and/or blkid(8) for more info # UUID=c7b7d6a5-fd01-4b9b-bcca-153eaff9d312 / ext4 defaults 1 1 ntap-bos-c01-eng01-nfs01b.storage.bos.redhat.com:/devops_engineering_nfs/devarchive/redhat /mnt/redhat nfs ro,rsize=32768,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0 nest.test.redhat.com:/mnt/qa /mnt/qa nfs defaults,rsize=8192,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0 vtap-eng01.storage.rdu2.redhat.com:/vol/engarchive /mnt/engarchive nfs ro,rsize=32768,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0 nest.test.redhat.com:/mnt/tpsdist /mnt/tpsdist nfs defaults,rsize=8192,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0 
ntap-bos-c01-eng01-nfs01b.storage.bos.redhat.com:/devops_engineering_nfs/devarchive/redhat/brewroot /mnt/brew nfs ro,rsize=32768,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0 ntap-bos-c01-eng01-nfs01b.storage.bos.redhat.com:/devops_brew_scratch_nfs/scratch /mnt/brew_scratch nfs ro,rsize=32768,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0 /dev/mapper/luks-0989597c-795f-4f25-b2f2-121f3eb103c2 /opt/test1 xfs defaults 0 0 TASK [Read the /etc/crypttab file] ********************************************* task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:25 Friday 17 January 2025 04:15:45 -0500 (0:00:00.379) 0:05:38.452 ******** ok: [managed-node1] => { "changed": false, "cmd": [ "cat", "/etc/crypttab" ], "delta": "0:00:00.002738", "end": "2025-01-17 04:15:46.100341", "failed_when_result": false, "rc": 0, "start": "2025-01-17 04:15:46.097603" } STDOUT: luks-0989597c-795f-4f25-b2f2-121f3eb103c2 /dev/sda1 VALUE_SPECIFIED_IN_NO_LOG_PARAMETER TASK [Verify the volumes listed in storage_pools were correctly managed] ******* task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:34 Friday 17 January 2025 04:15:46 -0500 (0:00:00.432) 0:05:38.884 ******** included: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool.yml for managed-node1 TASK [Set _storage_pool_tests] ************************************************* task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool.yml:5 Friday 17 January 2025 04:15:46 -0500 (0:00:00.116) 0:05:39.001 ******** ok: [managed-node1] => { "ansible_facts": { "_storage_pool_tests": [ "members", "volumes" ] }, "changed": false } TASK [Get VG shared value status] ********************************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool.yml:18 Friday 17 January 2025 04:15:46 -0500 (0:00:00.128) 0:05:39.130 ******** skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify that VG shared value checks out] ********************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool.yml:24 Friday 17 January 2025 04:15:46 -0500 (0:00:00.062) 0:05:39.192 ******** skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify pool subset] ****************************************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool.yml:34 Friday 17 January 2025 04:15:46 -0500 (0:00:00.057) 0:05:39.249 ******** included: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml for managed-node1 included: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-volumes.yml for managed-node1 TASK [Set test variables] ****************************************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:2 Friday 17 January 2025 04:15:46 -0500 (0:00:00.136) 0:05:39.385 ******** skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Get the canonical device path for each member device] ******************** task path: 
/tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:8 Friday 17 January 2025 04:15:46 -0500 (0:00:00.060) 0:05:39.445 ******** TASK [Set pvs lvm length] ****************************************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:17 Friday 17 January 2025 04:15:46 -0500 (0:00:00.071) 0:05:39.517 ******** skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set pool pvs] ************************************************************ task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:22 Friday 17 January 2025 04:15:46 -0500 (0:00:00.055) 0:05:39.572 ******** skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify PV count] ********************************************************* task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:27 Friday 17 January 2025 04:15:46 -0500 (0:00:00.054) 0:05:39.627 ******** skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set expected pv type] **************************************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:36 Friday 17 January 2025 04:15:46 -0500 (0:00:00.056) 0:05:39.683 ******** skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set expected pv type] **************************************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:41 Friday 17 January 2025 04:15:47 -0500 (0:00:00.048) 0:05:39.732 ******** skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set expected pv type] **************************************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:46 Friday 17 January 2025 04:15:47 -0500 (0:00:00.045) 0:05:39.777 ******** skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check the type of each PV] *********************************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:51 Friday 17 January 2025 04:15:47 -0500 (0:00:00.043) 0:05:39.821 ******** TASK [Check that blivet supports PV grow to fill] ****************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:64 Friday 17 January 2025 04:15:47 -0500 (0:00:00.043) 0:05:39.864 ******** ok: [managed-node1] => { "changed": false, "rc": 0 } STDOUT: False STDERR: Shared connection to 10.31.46.65 closed. 
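For context, the pool and volume facts that the surrounding verification tasks check correspond to a storage_pools declaration along the following lines. This is a sketch reconstructed from the pool facts logged above, not the literal test source; the variable name luks_password is an assumption, since the real key value appears in the log only as VALUE_SPECIFIED_IN_NO_LOG_PARAMETER.

    - name: Create an XFS volume on a LUKS2-encrypted partition of sda (reconstructed sketch)
      include_role:
        name: fedora.linux_system_roles.storage
      vars:
        storage_pools:
          - name: foo
            type: partition
            disks:
              - sda
            volumes:
              - name: test1
                type: partition
                size: 4g
                fs_type: xfs
                mount_point: /opt/test1
                mount_options: defaults
                encryption: true
                encryption_luks_version: luks2
                encryption_key: "{{ luks_password }}"  # assumed variable name; masked in the log

Applying a spec like this is what produced the four-step action sequence logged by the "Manage the pools and volumes" task: destroy the existing xfs signature on /dev/sda1, create a LUKS format on the partition, create (open) the /dev/mapper/luks-<UUID> device, and create xfs on top of it, followed by the /etc/fstab and /etc/crypttab updates shown above.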
TASK [Verify that PVs fill the whole devices when they should] ***************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:73 Friday 17 January 2025 04:15:47 -0500 (0:00:00.245) 0:05:40.110 ******** TASK [Check MD RAID] *********************************************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:83 Friday 17 January 2025 04:15:47 -0500 (0:00:00.052) 0:05:40.162 ******** included: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml for managed-node1 TASK [Get information about RAID] ********************************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:8 Friday 17 January 2025 04:15:47 -0500 (0:00:00.101) 0:05:40.264 ******** skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set active devices regex] ************************************************ task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:14 Friday 17 January 2025 04:15:47 -0500 (0:00:00.047) 0:05:40.311 ******** skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set spare devices regex] ************************************************* task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:19 Friday 17 January 2025 04:15:47 -0500 (0:00:00.047) 0:05:40.358 ******** skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set md version regex] **************************************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:24 Friday 17 January 2025 04:15:47 -0500 (0:00:00.043) 0:05:40.402 ******** skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set md chunk size regex] ************************************************* task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:29 Friday 17 January 2025 04:15:47 -0500 (0:00:00.037) 0:05:40.440 ******** skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Parse the chunk size] **************************************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:37 Friday 17 January 2025 04:15:47 -0500 (0:00:00.038) 0:05:40.478 ******** skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check RAID active devices count] ***************************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:46 Friday 17 January 2025 04:15:47 -0500 (0:00:00.044) 0:05:40.522 ******** skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check RAID spare devices count] ****************************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:55 Friday 17 January 2025 04:15:47 -0500 (0:00:00.042) 0:05:40.565 ******** skipping: [managed-node1] => { "changed": false, 
"skip_reason": "Conditional result was False" } TASK [Check RAID metadata version] ********************************************* task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:64 Friday 17 January 2025 04:15:47 -0500 (0:00:00.046) 0:05:40.611 ******** skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check RAID chunk size] *************************************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:74 Friday 17 January 2025 04:15:47 -0500 (0:00:00.053) 0:05:40.665 ******** skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Reset variables used by tests] ******************************************* task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:83 Friday 17 January 2025 04:15:48 -0500 (0:00:00.057) 0:05:40.722 ******** ok: [managed-node1] => { "ansible_facts": { "storage_test_md_active_devices_re": null, "storage_test_md_chunk_size_re": null, "storage_test_md_metadata_version_re": null, "storage_test_md_spare_devices_re": null }, "changed": false } TASK [Check LVM RAID] ********************************************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:86 Friday 17 January 2025 04:15:48 -0500 (0:00:00.057) 0:05:40.780 ******** included: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-lvmraid.yml for managed-node1 TASK [Validate pool member LVM RAID settings] ********************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-lvmraid.yml:2 Friday 17 January 2025 04:15:48 -0500 (0:00:00.112) 0:05:40.893 ******** skipping: [managed-node1] => (item={u'_raw_device': u'/dev/sda1', u'raid_metadata_version': None, u'mount_device_identifier': u'uuid', u'fs_type': u'xfs', u'mount_options': u'defaults', u'_device': u'/dev/mapper/luks-0989597c-795f-4f25-b2f2-121f3eb103c2', u'size': u'4g', u'mount_point': u'/opt/test1', u'compression': None, u'encryption_password': None, u'_kernel_device': u'/dev/dm-0', u'encryption': True, u'raid_level': None, u'name': u'test1', u'raid_device_count': None, u'state': u'present', u'vdo_pool_size': None, u'mount_mode': None, u'thin_pool_name': None, u'type': u'partition', u'encryption_key_size': None, u'encryption_cipher': None, u'encryption_key': u'VALUE_SPECIFIED_IN_NO_LOG_PARAMETER', u'fs_label': u'', u'encryption_luks_version': u'luks2', u'raid_stripe_size': None, u'mount_passno': 0, u'_mount_id': u'/dev/mapper/luks-0989597c-795f-4f25-b2f2-121f3eb103c2', u'mount_user': None, u'raid_spare_count': None, u'raid_disks': [], u'_raw_kernel_device': u'/dev/sda1', u'cache_mode': None, u'cache_devices': [], u'deduplication': None, u'mount_group': None, u'thin_pool_size': None, u'disks': [u'sda'], u'cached': False, u'thin': False, u'mount_check': 0, u'cache_size': 0, u'raid_chunk_size': None, u'fs_overwrite_existing': True, u'fs_create_options': u''}) => { "ansible_loop_var": "storage_test_lvmraid_volume", "changed": false, "skip_reason": "Conditional result was False", "storage_test_lvmraid_volume": { "_device": "/dev/mapper/luks-0989597c-795f-4f25-b2f2-121f3eb103c2", "_kernel_device": "/dev/dm-0", "_mount_id": 
"/dev/mapper/luks-0989597c-795f-4f25-b2f2-121f3eb103c2", "_raw_device": "/dev/sda1", "_raw_kernel_device": "/dev/sda1", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": true, "encryption_cipher": null, "encryption_key": "VALUE_SPECIFIED_IN_NO_LOG_PARAMETER", "encryption_key_size": null, "encryption_luks_version": "luks2", "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "mount_user": null, "name": "test1", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": "4g", "state": "present", "thin": false, "thin_pool_name": null, "thin_pool_size": null, "type": "partition", "vdo_pool_size": null } } TASK [Check Thin Pools] ******************************************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:89 Friday 17 January 2025 04:15:48 -0500 (0:00:00.066) 0:05:40.959 ******** included: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-thin.yml for managed-node1 TASK [Validate pool member thinpool settings] ********************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-thin.yml:2 Friday 17 January 2025 04:15:48 -0500 (0:00:00.092) 0:05:41.051 ******** skipping: [managed-node1] => (item={u'_raw_device': u'/dev/sda1', u'raid_metadata_version': None, u'mount_device_identifier': u'uuid', u'fs_type': u'xfs', u'mount_options': u'defaults', u'_device': u'/dev/mapper/luks-0989597c-795f-4f25-b2f2-121f3eb103c2', u'size': u'4g', u'mount_point': u'/opt/test1', u'compression': None, u'encryption_password': None, u'_kernel_device': u'/dev/dm-0', u'encryption': True, u'raid_level': None, u'name': u'test1', u'raid_device_count': None, u'state': u'present', u'vdo_pool_size': None, u'mount_mode': None, u'thin_pool_name': None, u'type': u'partition', u'encryption_key_size': None, u'encryption_cipher': None, u'encryption_key': u'VALUE_SPECIFIED_IN_NO_LOG_PARAMETER', u'fs_label': u'', u'encryption_luks_version': u'luks2', u'raid_stripe_size': None, u'mount_passno': 0, u'_mount_id': u'/dev/mapper/luks-0989597c-795f-4f25-b2f2-121f3eb103c2', u'mount_user': None, u'raid_spare_count': None, u'raid_disks': [], u'_raw_kernel_device': u'/dev/sda1', u'cache_mode': None, u'cache_devices': [], u'deduplication': None, u'mount_group': None, u'thin_pool_size': None, u'disks': [u'sda'], u'cached': False, u'thin': False, u'mount_check': 0, u'cache_size': 0, u'raid_chunk_size': None, u'fs_overwrite_existing': True, u'fs_create_options': u''}) => { "ansible_loop_var": "storage_test_thin_volume", "changed": false, "skip_reason": "Conditional result was False", "storage_test_thin_volume": { "_device": "/dev/mapper/luks-0989597c-795f-4f25-b2f2-121f3eb103c2", "_kernel_device": "/dev/dm-0", "_mount_id": "/dev/mapper/luks-0989597c-795f-4f25-b2f2-121f3eb103c2", "_raw_device": "/dev/sda1", "_raw_kernel_device": "/dev/sda1", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ 
"sda" ], "encryption": true, "encryption_cipher": null, "encryption_key": "VALUE_SPECIFIED_IN_NO_LOG_PARAMETER", "encryption_key_size": null, "encryption_luks_version": "luks2", "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "mount_user": null, "name": "test1", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": "4g", "state": "present", "thin": false, "thin_pool_name": null, "thin_pool_size": null, "type": "partition", "vdo_pool_size": null } } TASK [Check member encryption] ************************************************* task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:92 Friday 17 January 2025 04:15:48 -0500 (0:00:00.051) 0:05:41.103 ******** included: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-encryption.yml for managed-node1 TASK [Set test variables] ****************************************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-encryption.yml:5 Friday 17 January 2025 04:15:48 -0500 (0:00:00.112) 0:05:41.215 ******** ok: [managed-node1] => { "ansible_facts": { "_storage_test_expected_crypttab_entries": "0", "_storage_test_expected_crypttab_key_file": "-" }, "changed": false } TASK [Validate pool member LUKS settings] ************************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-encryption.yml:10 Friday 17 January 2025 04:15:48 -0500 (0:00:00.067) 0:05:41.282 ******** TASK [Validate pool member crypttab entries] *********************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-encryption.yml:17 Friday 17 January 2025 04:15:48 -0500 (0:00:00.054) 0:05:41.336 ******** TASK [Clear test variables] **************************************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-encryption.yml:24 Friday 17 January 2025 04:15:48 -0500 (0:00:00.055) 0:05:41.392 ******** ok: [managed-node1] => { "ansible_facts": { "_storage_test_crypttab_entries": null, "_storage_test_crypttab_key_file": null }, "changed": false } TASK [Check VDO] *************************************************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:95 Friday 17 January 2025 04:15:48 -0500 (0:00:00.064) 0:05:41.456 ******** included: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-vdo.yml for managed-node1 TASK [Validate pool member VDO settings] *************************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-vdo.yml:2 Friday 17 January 2025 04:15:48 -0500 (0:00:00.149) 0:05:41.606 ******** skipping: [managed-node1] => (item={u'_raw_device': u'/dev/sda1', u'raid_metadata_version': None, u'mount_device_identifier': u'uuid', u'fs_type': u'xfs', u'mount_options': 
u'defaults', u'_device': u'/dev/mapper/luks-0989597c-795f-4f25-b2f2-121f3eb103c2', u'size': u'4g', u'mount_point': u'/opt/test1', u'compression': None, u'encryption_password': None, u'_kernel_device': u'/dev/dm-0', u'encryption': True, u'raid_level': None, u'name': u'test1', u'raid_device_count': None, u'state': u'present', u'vdo_pool_size': None, u'mount_mode': None, u'thin_pool_name': None, u'type': u'partition', u'encryption_key_size': None, u'encryption_cipher': None, u'encryption_key': u'VALUE_SPECIFIED_IN_NO_LOG_PARAMETER', u'fs_label': u'', u'encryption_luks_version': u'luks2', u'raid_stripe_size': None, u'mount_passno': 0, u'_mount_id': u'/dev/mapper/luks-0989597c-795f-4f25-b2f2-121f3eb103c2', u'mount_user': None, u'raid_spare_count': None, u'raid_disks': [], u'_raw_kernel_device': u'/dev/sda1', u'cache_mode': None, u'cache_devices': [], u'deduplication': None, u'mount_group': None, u'thin_pool_size': None, u'disks': [u'sda'], u'cached': False, u'thin': False, u'mount_check': 0, u'cache_size': 0, u'raid_chunk_size': None, u'fs_overwrite_existing': True, u'fs_create_options': u''}) => { "ansible_loop_var": "storage_test_vdo_volume", "changed": false, "skip_reason": "Conditional result was False", "storage_test_vdo_volume": { "_device": "/dev/mapper/luks-0989597c-795f-4f25-b2f2-121f3eb103c2", "_kernel_device": "/dev/dm-0", "_mount_id": "/dev/mapper/luks-0989597c-795f-4f25-b2f2-121f3eb103c2", "_raw_device": "/dev/sda1", "_raw_kernel_device": "/dev/sda1", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": true, "encryption_cipher": null, "encryption_key": "VALUE_SPECIFIED_IN_NO_LOG_PARAMETER", "encryption_key_size": null, "encryption_luks_version": "luks2", "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "mount_user": null, "name": "test1", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": "4g", "state": "present", "thin": false, "thin_pool_name": null, "thin_pool_size": null, "type": "partition", "vdo_pool_size": null } } TASK [Check Stratis] *********************************************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:98 Friday 17 January 2025 04:15:48 -0500 (0:00:00.082) 0:05:41.688 ******** included: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-stratis.yml for managed-node1 TASK [Run 'stratis report'] **************************************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-stratis.yml:6 Friday 17 January 2025 04:15:49 -0500 (0:00:00.146) 0:05:41.835 ******** skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Get information about Stratis] ******************************************* task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-stratis.yml:11 Friday 17 January 2025 04:15:49 -0500 (0:00:00.056) 0:05:41.892 ******** skipping: [managed-node1] => { "changed": false, "skip_reason": 
"Conditional result was False" } TASK [Verify that the pools was created] *************************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-stratis.yml:15 Friday 17 January 2025 04:15:49 -0500 (0:00:00.057) 0:05:41.950 ******** skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify that encryption is correctly set] ********************************* task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-stratis.yml:25 Friday 17 January 2025 04:15:49 -0500 (0:00:00.055) 0:05:42.005 ******** skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify that Clevis/Tang encryption is correctly set] ********************* task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-stratis.yml:34 Friday 17 January 2025 04:15:49 -0500 (0:00:00.057) 0:05:42.063 ******** skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Reset variable used by test] ********************************************* task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-stratis.yml:44 Friday 17 January 2025 04:15:49 -0500 (0:00:00.056) 0:05:42.120 ******** ok: [managed-node1] => { "ansible_facts": { "storage_test_stratis_report": null }, "changed": false } TASK [Clean up test variables] ************************************************* task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:101 Friday 17 January 2025 04:15:49 -0500 (0:00:00.058) 0:05:42.178 ******** ok: [managed-node1] => { "ansible_facts": { "__pvs_lvm_len": null, "_storage_test_expected_pv_count": null, "_storage_test_expected_pv_type": null, "_storage_test_pool_pvs": [], "_storage_test_pool_pvs_lvm": [] }, "changed": false } TASK [Verify the volumes] ****************************************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-volumes.yml:3 Friday 17 January 2025 04:15:49 -0500 (0:00:00.058) 0:05:42.237 ******** included: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume.yml for managed-node1 TASK [Set storage volume test variables] *************************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume.yml:2 Friday 17 January 2025 04:15:49 -0500 (0:00:00.110) 0:05:42.347 ******** ok: [managed-node1] => { "ansible_facts": { "_storage_test_volume_present": true, "_storage_volume_tests": [ "mount", "fstab", "fs", "device", "encryption", "md", "size", "cache" ] }, "changed": false } TASK [Run test verify for {{ storage_test_volume_subset }}] ******************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume.yml:19 Friday 17 January 2025 04:15:49 -0500 (0:00:00.146) 0:05:42.494 ******** included: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml for managed-node1 included: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml for managed-node1 included: 
/tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fs.yml for managed-node1 included: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml for managed-node1 included: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml for managed-node1 included: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml for managed-node1 included: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml for managed-node1 included: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml for managed-node1 TASK [Get expected mount device based on device type] ************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:7 Friday 17 January 2025 04:15:50 -0500 (0:00:00.279) 0:05:42.774 ******** ok: [managed-node1] => { "ansible_facts": { "storage_test_device_path": "/dev/mapper/luks-0989597c-795f-4f25-b2f2-121f3eb103c2" }, "changed": false } TASK [Set some facts] ********************************************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:11 Friday 17 January 2025 04:15:50 -0500 (0:00:00.053) 0:05:42.827 ******** ok: [managed-node1] => { "ansible_facts": { "storage_test_mount_expected_mount_point": "/opt/test1", "storage_test_swap_expected_matches": "0" }, "changed": false } TASK [Get information about the mountpoint directory] ************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:19 Friday 17 January 2025 04:15:50 -0500 (0:00:00.047) 0:05:42.874 ******** skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify the current mount state by device] ******************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:28 Friday 17 January 2025 04:15:50 -0500 (0:00:00.037) 0:05:42.912 ******** ok: [managed-node1] => { "changed": false } MSG: All assertions passed TASK [Verify mount directory user] ********************************************* task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:36 Friday 17 January 2025 04:15:50 -0500 (0:00:00.052) 0:05:42.964 ******** skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify mount directory group] ******************************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:42 Friday 17 January 2025 04:15:50 -0500 (0:00:00.049) 0:05:43.013 ******** skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify mount directory permissions] ************************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:48 Friday 17 January 2025 04:15:50 -0500 (0:00:00.059) 0:05:43.073 ******** skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Get path of 
test volume device] ****************************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:57 Friday 17 January 2025 04:15:50 -0500 (0:00:00.056) 0:05:43.129 ******** skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Gather swap info] ******************************************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:63 Friday 17 January 2025 04:15:50 -0500 (0:00:00.059) 0:05:43.188 ******** skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify swap status] ****************************************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:69 Friday 17 January 2025 04:15:50 -0500 (0:00:00.058) 0:05:43.247 ******** skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Unset facts] ************************************************************* task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:79 Friday 17 January 2025 04:15:50 -0500 (0:00:00.063) 0:05:43.311 ******** ok: [managed-node1] => { "ansible_facts": { "storage_test_found_mount_stat": null, "storage_test_mount_expected_mount_point": null, "storage_test_swap_expected_matches": null, "storage_test_swaps": null, "storage_test_sys_node": null }, "changed": false } TASK [Set some variables for fstab checking] *********************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:2 Friday 17 January 2025 04:15:50 -0500 (0:00:00.060) 0:05:43.372 ******** ok: [managed-node1] => { "ansible_facts": { "storage_test_fstab_expected_id_matches": "1", "storage_test_fstab_expected_mount_options_matches": "1", "storage_test_fstab_expected_mount_point_matches": "1", "storage_test_fstab_id_matches": [ "/dev/mapper/luks-0989597c-795f-4f25-b2f2-121f3eb103c2 " ], "storage_test_fstab_mount_options_matches": [ " /opt/test1 xfs defaults " ], "storage_test_fstab_mount_point_matches": [ " /opt/test1 " ] }, "changed": false } TASK [Verify that the device identifier appears in /etc/fstab] ***************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:17 Friday 17 January 2025 04:15:50 -0500 (0:00:00.093) 0:05:43.466 ******** ok: [managed-node1] => { "changed": false } MSG: All assertions passed TASK [Verify the fstab mount point] ******************************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:24 Friday 17 January 2025 04:15:50 -0500 (0:00:00.066) 0:05:43.532 ******** ok: [managed-node1] => { "changed": false } MSG: All assertions passed TASK [Verify mount_options] **************************************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:33 Friday 17 January 2025 04:15:50 -0500 (0:00:00.068) 0:05:43.600 ******** skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify fingerprint] ****************************************************** task path: 
/tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:45 Friday 17 January 2025 04:15:50 -0500 (0:00:00.059) 0:05:43.660 ******** ok: [managed-node1] => { "changed": false } MSG: All assertions passed TASK [Clean up variables] ****************************************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:52 Friday 17 January 2025 04:15:51 -0500 (0:00:00.084) 0:05:43.745 ******** ok: [managed-node1] => { "ansible_facts": { "storage_test_fstab_expected_id_matches": null, "storage_test_fstab_expected_mount_options_matches": null, "storage_test_fstab_expected_mount_point_matches": null, "storage_test_fstab_id_matches": null, "storage_test_fstab_mount_options_matches": null, "storage_test_fstab_mount_point_matches": null }, "changed": false } TASK [Verify fs type] ********************************************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fs.yml:6 Friday 17 January 2025 04:15:51 -0500 (0:00:00.072) 0:05:43.817 ******** ok: [managed-node1] => { "changed": false } MSG: All assertions passed TASK [Verify fs label] ********************************************************* task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fs.yml:14 Friday 17 January 2025 04:15:51 -0500 (0:00:00.075) 0:05:43.893 ******** ok: [managed-node1] => { "changed": false } MSG: All assertions passed TASK [See whether the device node is present] ********************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:3 Friday 17 January 2025 04:15:51 -0500 (0:00:00.079) 0:05:43.972 ******** ok: [managed-node1] => { "changed": false, "stat": { "atime": 1737105338.7050211, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 0, "charset": "binary", "ctime": 1737105338.7050211, "dev": 5, "device_type": 2049, "executable": false, "exists": true, "gid": 6, "gr_name": "disk", "inode": 91332, "isblk": true, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": false, "issock": false, "isuid": false, "mimetype": "inode/blockdevice", "mode": "0660", "mtime": 1737105338.7050211, "nlink": 1, "path": "/dev/sda1", "pw_name": "root", "readable": true, "rgrp": true, "roth": false, "rusr": true, "size": 0, "uid": 0, "version": null, "wgrp": true, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [Verify the presence/absence of the device node] ************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:9 Friday 17 January 2025 04:15:51 -0500 (0:00:00.397) 0:05:44.370 ******** ok: [managed-node1] => { "changed": false } MSG: All assertions passed TASK [Verify the presence/absence of the device node] ************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:16 Friday 17 January 2025 04:15:51 -0500 (0:00:00.071) 0:05:44.441 ******** skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Make sure we got info about this volume] ********************************* task path: 
/tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:23
Friday 17 January 2025 04:15:51 -0500 (0:00:00.057) 0:05:44.499 ********
ok: [managed-node1] => { "changed": false }

MSG: All assertions passed

TASK [Process volume type (set initial value) (1/2)] ***************************
task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:29
Friday 17 January 2025 04:15:51 -0500 (0:00:00.066) 0:05:44.565 ********
ok: [managed-node1] => { "ansible_facts": { "st_volume_type": "partition" }, "changed": false }

TASK [Process volume type (get RAID value) (2/2)] ******************************
task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:33
Friday 17 January 2025 04:15:51 -0500 (0:00:00.065) 0:05:44.630 ********
skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Verify the volume's device type] *****************************************
task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:38
Friday 17 January 2025 04:15:51 -0500 (0:00:00.059) 0:05:44.689 ********
ok: [managed-node1] => { "changed": false }

MSG: All assertions passed

TASK [Stat the LUKS device, if encrypted] **************************************
task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:3
Friday 17 January 2025 04:15:52 -0500 (0:00:00.068) 0:05:44.758 ********
ok: [managed-node1] => { "changed": false, "stat": { "atime": 1737105338.8270211, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 0, "charset": "binary", "ctime": 1737105338.8270211, "dev": 5, "device_type": 64768, "executable": false, "exists": true, "gid": 6, "gr_name": "disk", "inode": 91386, "isblk": true, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": false, "issock": false, "isuid": false, "mimetype": "inode/symlink", "mode": "0660", "mtime": 1737105338.8270211, "nlink": 1, "path": "/dev/mapper/luks-0989597c-795f-4f25-b2f2-121f3eb103c2", "pw_name": "root", "readable": true, "rgrp": true, "roth": false, "rusr": true, "size": 0, "uid": 0, "version": null, "wgrp": true, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } }

TASK [Ensure cryptsetup is present] ********************************************
task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:10
Friday 17 January 2025 04:15:52 -0500 (0:00:00.402) 0:05:45.161 ********
ok: [managed-node1] => { "changed": false, "rc": 0, "results": [ "cryptsetup-2.0.3-6.el7.x86_64 providing cryptsetup is already installed" ] }

TASK [Collect LUKS info for this volume] ***************************************
task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:16
Friday 17 January 2025 04:15:53 -0500 (0:00:00.668) 0:05:45.829 ********
ok: [managed-node1] => { "changed": false, "cmd": [ "cryptsetup", "luksDump", "/dev/sda1" ], "delta": "0:00:00.026861", "end": "2025-01-17 04:15:53.489014", "rc": 0, "start": "2025-01-17 04:15:53.462153" }

STDOUT:

LUKS header information
Version:        2
Epoch:          3
Metadata area:  12288 bytes
UUID:           0989597c-795f-4f25-b2f2-121f3eb103c2
Label:          (no label)
Subsystem:      (no subsystem)
Flags:          (no flags)

Data segments:
  0: crypt
        offset: 4194304 [bytes]
        length: (whole device)
        cipher: aes-xts-plain64
        sector: 512 [bytes]

Keyslots:
  0: luks2
        Key:        512 bits
        Priority:   normal
        Cipher:     aes-xts-plain64
        PBKDF:      argon2i
        Time cost:  4
        Memory:     667697
        Threads:    2
        Salt:       8c b2 f3 76 fd fb de 1a 12 c8 cb de da c9 ac b8
                    a5 4c 8a 10 4a b4 5b 23 81 15 31 f2 19 cb 02 50
        AF stripes: 4000
        Area offset:32768 [bytes]
        Area length:258048 [bytes]
        Digest ID:  0
Tokens:
Digests:
  0: pbkdf2
        Hash:       sha256
        Iterations: 23206
        Salt:       c6 08 7d d3 e5 6d f6 02 99 23 b7 11 2d 23 33 d4
                    e5 49 83 e0 b8 a7 b9 f7 80 b6 9a 6c 2f 1d 1b 99
        Digest:     16 69 5f 9d 30 99 a0 41 da a1 51 0f b3 58 1d 51
                    f8 9d 04 0e 60 a4 68 05 f6 4f 98 19 04 ac 9e 51
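The luksDump output above is what the following verification tasks assert against (LUKS version, key size, cipher). For reference, a minimal standalone sketch of that kind of check, using the device path from this run; the regex test here is an illustration, not the role's or the test's actual parsing logic:

    - name: Collect LUKS info for a volume (sketch)
      command: cryptsetup luksDump /dev/sda1  # device path taken from this run
      register: luks_dump
      changed_when: false

    - name: Assert the header is LUKS2 (sketch; the pattern is an assumption)
      assert:
        that:
          - luks_dump.stdout is search('Version:\s+2')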
TASK [Verify the presence/absence of the LUKS device node] *********************
task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:22
Friday 17 January 2025 04:15:53 -0500 (0:00:00.421) 0:05:46.250 ********
ok: [managed-node1] => { "changed": false }

MSG: All assertions passed

TASK [Verify that the raw device is the same as the device if not encrypted] ***
task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:29
Friday 17 January 2025 04:15:53 -0500 (0:00:00.050) 0:05:46.301 ********
ok: [managed-node1] => { "changed": false }

MSG: All assertions passed

TASK [Make sure we got info about the LUKS volume if encrypted] ****************
task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:40
Friday 17 January 2025 04:15:53 -0500 (0:00:00.051) 0:05:46.352 ********
ok: [managed-node1] => { "changed": false }

MSG: All assertions passed

TASK [Verify the LUKS volume's device type if encrypted] ***********************
task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:46
Friday 17 January 2025 04:15:53 -0500 (0:00:00.049) 0:05:46.402 ********
ok: [managed-node1] => { "changed": false }

MSG: All assertions passed

TASK [Check LUKS version] ******************************************************
task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:51
Friday 17 January 2025 04:15:53 -0500 (0:00:00.050) 0:05:46.453 ********
ok: [managed-node1] => { "changed": false }

MSG: All assertions passed

TASK [Check LUKS key size] *****************************************************
task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:63
Friday 17 January 2025 04:15:53 -0500 (0:00:00.068) 0:05:46.521 ********
skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Check LUKS cipher] *******************************************************
task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:75
Friday 17 January 2025 04:15:53 -0500 (0:00:00.059) 0:05:46.581 ********
skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Set test variables] ******************************************************
task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:87
Friday 17 January 2025 04:15:53 -0500 (0:00:00.060) 0:05:46.641 ********
ok: [managed-node1] => { "ansible_facts": { "_storage_test_crypttab_entries": [ "luks-0989597c-795f-4f25-b2f2-121f3eb103c2 /dev/sda1 VALUE_SPECIFIED_IN_NO_LOG_PARAMETER" ], "_storage_test_expected_crypttab_entries": "1", "_storage_test_expected_crypttab_key_file": "VALUE_SPECIFIED_IN_NO_LOG_PARAMETER" }, "changed": false }
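The _storage_test_crypttab_entries fact above carries the standard three-field /etc/crypttab line: mapping name, backing device, key file (the key file is masked here by no_log). The next tasks assert on each field. A minimal sketch of such a check, assuming the UUID from this run; this is not the test's own implementation:

    - name: Read /etc/crypttab (sketch)
      command: cat /etc/crypttab
      register: crypttab
      changed_when: false

    - name: Expect exactly one entry for this LUKS mapping (sketch)
      assert:
        that:
          - crypttab.stdout_lines | select('match', '^luks-0989597c-795f-4f25-b2f2-121f3eb103c2 ') | list | length == 1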
TASK [Check for /etc/crypttab entry] *******************************************
task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:93
Friday 17 January 2025 04:15:54 -0500 (0:00:00.075) 0:05:46.717 ********
ok: [managed-node1] => { "changed": false }

MSG: All assertions passed

TASK [Validate the format of the crypttab entry] *******************************
task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:100
Friday 17 January 2025 04:15:54 -0500 (0:00:00.074) 0:05:46.791 ********
ok: [managed-node1] => { "changed": false }

MSG: All assertions passed

TASK [Check backing device of crypttab entry] **********************************
task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:108
Friday 17 January 2025 04:15:54 -0500 (0:00:00.075) 0:05:46.867 ********
ok: [managed-node1] => { "changed": false }

MSG: All assertions passed

TASK [Check key file of crypttab entry] ****************************************
task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:116
Friday 17 January 2025 04:15:54 -0500 (0:00:00.085) 0:05:46.953 ********
ok: [managed-node1] => { "changed": false }

MSG: All assertions passed

TASK [Clear test variables] ****************************************************
task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:124
Friday 17 January 2025 04:15:54 -0500 (0:00:00.069) 0:05:47.022 ********
ok: [managed-node1] => { "ansible_facts": { "_storage_test_crypttab_entries": null, "_storage_test_expected_crypttab_entries": null, "_storage_test_expected_crypttab_key_file": null }, "changed": false }

TASK [Get information about RAID] **********************************************
task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:8
Friday 17 January 2025 04:15:54 -0500 (0:00:00.061) 0:05:47.084 ********
skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Set active devices regex] ************************************************
task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:14
Friday 17 January 2025 04:15:54 -0500 (0:00:00.056) 0:05:47.140 ********
skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Set spare devices regex] *************************************************
task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:19
Friday 17 January 2025 04:15:54 -0500 (0:00:00.142) 0:05:47.283 ********
skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Set md version regex] ****************************************************
task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:24
Friday 17 January 2025
04:15:54 -0500 (0:00:00.058) 0:05:47.341 ******** skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set chunk size regex] **************************************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:29 Friday 17 January 2025 04:15:54 -0500 (0:00:00.056) 0:05:47.398 ******** skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Parse the chunk size] **************************************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:37 Friday 17 January 2025 04:15:54 -0500 (0:00:00.057) 0:05:47.455 ******** skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check RAID active devices count] ***************************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:46 Friday 17 January 2025 04:15:54 -0500 (0:00:00.056) 0:05:47.512 ******** skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check RAID spare devices count] ****************************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:54 Friday 17 January 2025 04:15:54 -0500 (0:00:00.057) 0:05:47.570 ******** skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check RAID metadata version] ********************************************* task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:62 Friday 17 January 2025 04:15:54 -0500 (0:00:00.063) 0:05:47.633 ******** skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check RAID chunk size] *************************************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:70 Friday 17 January 2025 04:15:54 -0500 (0:00:00.054) 0:05:47.688 ******** skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Parse the actual size of the volume] ************************************* task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:3 Friday 17 January 2025 04:15:55 -0500 (0:00:00.057) 0:05:47.746 ******** skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Parse the requested size of the volume] ********************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:11 Friday 17 January 2025 04:15:55 -0500 (0:00:00.063) 0:05:47.810 ******** skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Establish base value for expected size] ********************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:20 Friday 17 January 2025 04:15:55 -0500 (0:00:00.060) 0:05:47.870 ******** skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Show expected size] 
****************************************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:28 Friday 17 January 2025 04:15:55 -0500 (0:00:00.060) 0:05:47.930 ******** ok: [managed-node1] => { "storage_test_expected_size": "VARIABLE IS NOT DEFINED!: 'storage_test_expected_size' is undefined" } TASK [Get the size of parent/pool device] ************************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:32 Friday 17 January 2025 04:15:55 -0500 (0:00:00.091) 0:05:48.022 ******** skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Show test pool] ********************************************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:46 Friday 17 January 2025 04:15:55 -0500 (0:00:00.059) 0:05:48.081 ******** skipping: [managed-node1] => {} TASK [Show test blockinfo] ***************************************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:50 Friday 17 January 2025 04:15:55 -0500 (0:00:00.059) 0:05:48.141 ******** skipping: [managed-node1] => {} TASK [Show test pool size] ***************************************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:54 Friday 17 January 2025 04:15:55 -0500 (0:00:00.052) 0:05:48.194 ******** skipping: [managed-node1] => {} TASK [Calculate the expected size based on pool size and percentage value] ***** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:58 Friday 17 January 2025 04:15:55 -0500 (0:00:00.057) 0:05:48.251 ******** skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Default thin pool reserved space values] ********************************* task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:67 Friday 17 January 2025 04:15:55 -0500 (0:00:00.051) 0:05:48.303 ******** skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Default minimal thin pool reserved space size] *************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:71 Friday 17 January 2025 04:15:55 -0500 (0:00:00.044) 0:05:48.347 ******** skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Default maximal thin pool reserved space size] *************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:76 Friday 17 January 2025 04:15:55 -0500 (0:00:00.037) 0:05:48.385 ******** skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Calculate maximum usable space in thin pool] ***************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:82 Friday 17 January 2025 04:15:55 -0500 (0:00:00.038) 0:05:48.424 ******** skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" } TASK 
[Apply upper size limit to max usable thin pool space] ******************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:86 Friday 17 January 2025 04:15:55 -0500 (0:00:00.045) 0:05:48.469 ******** skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Apply lower size limit to max usable thin pool space] ******************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:91 Friday 17 January 2025 04:15:55 -0500 (0:00:00.052) 0:05:48.522 ******** skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Convert maximum usable thin pool space from int to Size] ***************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:96 Friday 17 January 2025 04:15:55 -0500 (0:00:00.052) 0:05:48.574 ******** skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Show max thin pool size] ************************************************* task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:101 Friday 17 January 2025 04:15:55 -0500 (0:00:00.051) 0:05:48.625 ******** skipping: [managed-node1] => {} TASK [Show volume thin pool size] ********************************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:105 Friday 17 January 2025 04:15:55 -0500 (0:00:00.048) 0:05:48.674 ******** skipping: [managed-node1] => {} TASK [Show test volume size] *************************************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:109 Friday 17 January 2025 04:15:56 -0500 (0:00:00.051) 0:05:48.725 ******** skipping: [managed-node1] => {} TASK [Establish base value for expected thin pool size] ************************ task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:113 Friday 17 January 2025 04:15:56 -0500 (0:00:00.040) 0:05:48.765 ******** skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Calculate the expected size based on pool size and percentage value] ***** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:120 Friday 17 January 2025 04:15:56 -0500 (0:00:00.038) 0:05:48.804 ******** skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Establish base value for expected thin pool volume size] ***************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:127 Friday 17 January 2025 04:15:56 -0500 (0:00:00.038) 0:05:48.842 ******** skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Calculate the expected thin pool volume size based on percentage value] *** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:131 Friday 17 January 2025 04:15:56 -0500 (0:00:00.054) 0:05:48.897 ******** skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" } TASK 
[Replace expected volume size with calculated value] ********************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:137 Friday 17 January 2025 04:15:56 -0500 (0:00:00.055) 0:05:48.953 ******** skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Show actual size] ******************************************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:143 Friday 17 January 2025 04:15:56 -0500 (0:00:00.055) 0:05:49.008 ******** ok: [managed-node1] => { "storage_test_actual_size": { "changed": false, "skip_reason": "Conditional result was False", "skipped": true } } TASK [Show expected size] ****************************************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:147 Friday 17 January 2025 04:15:56 -0500 (0:00:00.079) 0:05:49.088 ******** ok: [managed-node1] => { "storage_test_expected_size": "VARIABLE IS NOT DEFINED!: 'storage_test_expected_size' is undefined" } TASK [Assert expected size is actual size] ************************************* task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:151 Friday 17 January 2025 04:15:56 -0500 (0:00:00.059) 0:05:49.147 ******** skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Get information about the LV] ******************************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:5 Friday 17 January 2025 04:15:56 -0500 (0:00:00.048) 0:05:49.196 ******** skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set LV segment type] ***************************************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:13 Friday 17 January 2025 04:15:56 -0500 (0:00:00.053) 0:05:49.249 ******** skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check segment type] ****************************************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:17 Friday 17 January 2025 04:15:56 -0500 (0:00:00.049) 0:05:49.299 ******** skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set LV cache size] ******************************************************* task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:24 Friday 17 January 2025 04:15:56 -0500 (0:00:00.042) 0:05:49.342 ******** skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Parse the requested cache size] ****************************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:31 Friday 17 January 2025 04:15:56 -0500 (0:00:00.039) 0:05:49.382 ******** skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set expected cache size] ************************************************* task path: 
/tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:37
Friday 17 January 2025 04:15:56 -0500 (0:00:00.042) 0:05:49.424 ********
skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Check cache size] ********************************************************
task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:42
Friday 17 January 2025 04:15:56 -0500 (0:00:00.054) 0:05:49.479 ********
skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Clean up facts] **********************************************************
task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume.yml:25
Friday 17 January 2025 04:15:56 -0500 (0:00:00.060) 0:05:49.540 ********
ok: [managed-node1] => { "ansible_facts": { "_storage_test_volume_present": null }, "changed": false }

TASK [Verify the volumes with no pool were correctly managed] ******************
task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:44
Friday 17 January 2025 04:15:56 -0500 (0:00:00.057) 0:05:49.597 ********

TASK [Clean up variable namespace] *********************************************
task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:54
Friday 17 January 2025 04:15:56 -0500 (0:00:00.052) 0:05:49.650 ********
ok: [managed-node1] => { "ansible_facts": { "storage_test_blkinfo": null, "storage_test_crypttab": null, "storage_test_fstab": null }, "changed": false }

TASK [Remove the key file] *****************************************************
task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/tests_luks2.yml:358
Friday 17 January 2025 04:15:57 -0500 (0:00:00.059) 0:05:49.709 ********
ok: [managed-node1] => { "changed": false, "path": "/tmp/storage_testyWoW78lukskey", "state": "absent" }
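With the key file removed, the next section ("Test for correct handling of new encrypted volume w/ no key") deliberately re-runs the role against the same encrypted volume spec without supplying a key, and verify-role-failed.yml checks that the role fails with the expected message (seen further below as "encrypted volume 'test1' missing key/password"). A sketch of the general expect-failure pattern using block/rescue; this illustrates the idea and is not the contents of verify-role-failed.yml:

    - name: Expect the storage role to fail (sketch)
      block:
        - name: Run the role without an encryption key
          include_role:
            name: fedora.linux_system_roles.storage
          # with storage_pools as echoed by the "Show storage_pools" task below
        - name: Flag an unexpected success
          fail:
            msg: role was expected to fail but did not
      rescue:
        - name: Check the error message
          assert:
            that:
              - "'missing key/password' in ansible_failed_result.msg"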
TASK [Test for correct handling of new encrypted volume w/ no key] *************
task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/tests_luks2.yml:368
Friday 17 January 2025 04:15:57 -0500 (0:00:00.418) 0:05:50.128 ********
included: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-failed.yml for managed-node1

TASK [Store global variable value copy] ****************************************
task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-failed.yml:4
Friday 17 January 2025 04:15:57 -0500 (0:00:00.107) 0:05:50.236 ********
ok: [managed-node1] => { "ansible_facts": { "storage_pools_global": [], "storage_safe_mode_global": true, "storage_volumes_global": [] }, "changed": false }

TASK [Verify role raises correct error] ****************************************
task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-failed.yml:10
Friday 17 January 2025 04:15:57 -0500 (0:00:00.064) 0:05:50.301 ********

TASK [fedora.linux_system_roles.storage : Set platform/version specific variables] ***
task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:2
Friday 17 January 2025 04:15:57 -0500 (0:00:00.088) 0:05:50.389 ********
included: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml for managed-node1

TASK [fedora.linux_system_roles.storage : Ensure ansible_facts used by role] ***
task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:2
Friday 17 January 2025 04:15:57 -0500 (0:00:00.093) 0:05:50.483 ********
skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [fedora.linux_system_roles.storage : Set platform/version specific variables] ***
task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:7
Friday 17 January 2025 04:15:57 -0500 (0:00:00.068) 0:05:50.552 ********
skipping: [managed-node1] => (item=RedHat.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat.yml", "skip_reason": "Conditional result was False" }
skipping: [managed-node1] => (item=CentOS.yml) => { "ansible_loop_var": "item", "changed": false, "item": "CentOS.yml", "skip_reason": "Conditional result was False" }
ok: [managed-node1] => (item=CentOS_7.yml) => { "ansible_facts": { "__storage_blivet_diskvolume_mkfs_option_map": { "ext2": "-F", "ext3": "-F", "ext4": "-F" }, "blivet_package_list": [ "python-enum34", "python-blivet3", "libblockdev-crypto", "libblockdev-dm", "libblockdev-lvm", "libblockdev-mdraid", "libblockdev-swap", "{{ 'libblockdev-s390' if ansible_architecture == 's390x' else 'libblockdev' }}" ] }, "ansible_included_var_files": [ "/tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/vars/CentOS_7.yml" ], "ansible_loop_var": "item", "changed": false, "item": "CentOS_7.yml" }
skipping: [managed-node1] => (item=CentOS_7.9.yml) => { "ansible_loop_var": "item", "changed": false, "item": "CentOS_7.9.yml", "skip_reason": "Conditional result was False" }

TASK [fedora.linux_system_roles.storage : Check if system is ostree] ***********
task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:25
Friday 17 January 2025 04:15:57 -0500 (0:00:00.130) 0:05:50.682 ********
skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [fedora.linux_system_roles.storage : Set flag to indicate system is ostree] ***
task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:30
Friday 17 January 2025 04:15:58 -0500 (0:00:00.057) 0:05:50.740 ********
skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [fedora.linux_system_roles.storage : Define an empty list of pools to be used in testing] ***
task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:5
Friday 17 January 2025 04:15:58 -0500 (0:00:00.057) 0:05:50.797 ********
ok: [managed-node1] => { "ansible_facts": { "_storage_pools_list": [] }, "changed": false }

TASK [fedora.linux_system_roles.storage : Define an empty list of volumes to be used in testing] ***
task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:9
Friday 17 January 2025 04:15:58 -0500 (0:00:00.056) 0:05:50.854 ********
ok: [managed-node1] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false }

TASK [fedora.linux_system_roles.storage : Include the appropriate provider tasks] ***
task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:13
Friday 17 January 2025 04:15:58 -0500 (0:00:00.045) 0:05:50.899 ********
included: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml for managed-node1

TASK [fedora.linux_system_roles.storage : Make sure blivet is available] *******
task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:2
Friday 17 January 2025 04:15:58 -0500 (0:00:00.182) 0:05:51.081 ********
ok: [managed-node1] => { "changed": false, "rc": 0, "results": [ "python-enum34-1.0.4-1.el7.noarch providing python-enum34 is already installed", "1:python2-blivet3-3.1.3-3.el7.noarch providing python-blivet3 is already installed", "libblockdev-crypto-2.18-5.el7.x86_64 providing libblockdev-crypto is already installed", "libblockdev-dm-2.18-5.el7.x86_64 providing libblockdev-dm is already installed", "libblockdev-lvm-2.18-5.el7.x86_64 providing libblockdev-lvm is already installed", "libblockdev-mdraid-2.18-5.el7.x86_64 providing libblockdev-mdraid is already installed", "libblockdev-swap-2.18-5.el7.x86_64 providing libblockdev-swap is already installed", "libblockdev-2.18-5.el7.x86_64 providing libblockdev is already installed" ] }

TASK [fedora.linux_system_roles.storage : Show storage_pools] ******************
task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:9
Friday 17 January 2025 04:16:02 -0500 (0:00:03.852) 0:05:54.933 ********
ok: [managed-node1] => { "storage_pools": [ { "disks": [ "sda" ], "name": "foo", "type": "lvm", "volumes": [ { "encryption": true, "encryption_luks_version": "luks2", "mount_point": "/opt/test1", "name": "test1", "size": "4g" } ] } ] }
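The pool specification echoed above is the role input for this phase. In playbook form it corresponds to vars along these lines, reconstructed directly from the logged JSON (the surrounding vars placement is an assumption):

    vars:
      storage_pools:
        - name: foo
          type: lvm
          disks:
            - sda
          volumes:
            - name: test1
              size: 4g
              mount_point: /opt/test1
              encryption: true
              encryption_luks_version: luks2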
TASK [fedora.linux_system_roles.storage : Show storage_volumes] ****************
task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:14
Friday 17 January 2025 04:16:02 -0500 (0:00:00.048) 0:05:54.982 ********
ok: [managed-node1] => { "storage_volumes": [] }

TASK [fedora.linux_system_roles.storage : Get required packages] ***************
task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:19
Friday 17 January 2025 04:16:02 -0500 (0:00:00.046) 0:05:55.028 ********
ok: [managed-node1] => { "actions": [], "changed": false, "crypts": [], "leaves": [], "mounts": [], "packages": [ "cryptsetup", "lvm2" ], "pools": [], "volumes": [] }

TASK [fedora.linux_system_roles.storage : Enable copr repositories if needed] ***
task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:31
Friday 17 January 2025 04:16:06 -0500 (0:00:04.058) 0:05:59.087 ********
included: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml for managed-node1

TASK [fedora.linux_system_roles.storage : Check if the COPR support packages should be installed] ***
task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:2
Friday 17 January 2025 04:16:06 -0500 (0:00:00.095) 0:05:59.182 ********

TASK [fedora.linux_system_roles.storage : Make sure COPR support packages are present] ***
task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:13
Friday 17 January 2025 04:16:06 -0500 (0:00:00.042) 0:05:59.225
******** skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Enable COPRs] ************************ task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:19 Friday 17 January 2025 04:16:06 -0500 (0:00:00.045) 0:05:59.271 ******** TASK [fedora.linux_system_roles.storage : Make sure required packages are installed] *** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:37 Friday 17 January 2025 04:16:06 -0500 (0:00:00.035) 0:05:59.306 ******** ok: [managed-node1] => { "changed": false, "rc": 0, "results": [ "cryptsetup-2.0.3-6.el7.x86_64 providing cryptsetup is already installed", "7:lvm2-2.02.187-6.el7_9.5.x86_64 providing lvm2 is already installed", "kpartx-0.4.9-136.el7_9.x86_64 providing kpartx is already installed" ] } TASK [fedora.linux_system_roles.storage : Get service facts] ******************* task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:51 Friday 17 January 2025 04:16:07 -0500 (0:00:00.678) 0:05:59.984 ******** ok: [managed-node1] => { "ansible_facts": { "services": { "NetworkManager-dispatcher.service": { "name": "NetworkManager-dispatcher.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "NetworkManager-wait-online.service": { "name": "NetworkManager-wait-online.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "NetworkManager.service": { "name": "NetworkManager.service", "source": "systemd", "state": "running", "status": "enabled" }, "arp-ethers.service": { "name": "arp-ethers.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "auditd.service": { "name": "auditd.service", "source": "systemd", "state": "running", "status": "enabled" }, "auth-rpcgss-module.service": { "name": "auth-rpcgss-module.service", "source": "systemd", "state": "stopped", "status": "static" }, "autovt@.service": { "name": "autovt@.service", "source": "systemd", "state": "unknown", "status": "enabled" }, "blivet.service": { "name": "blivet.service", "source": "systemd", "state": "inactive", "status": "static" }, "blk-availability.service": { "name": "blk-availability.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "brandbot.service": { "name": "brandbot.service", "source": "systemd", "state": "inactive", "status": "static" }, "chrony-dnssrv@.service": { "name": "chrony-dnssrv@.service", "source": "systemd", "state": "unknown", "status": "static" }, "chrony-wait.service": { "name": "chrony-wait.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "chronyd.service": { "name": "chronyd.service", "source": "systemd", "state": "running", "status": "enabled" }, "cloud-config.service": { "name": "cloud-config.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-final.service": { "name": "cloud-final.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-init-local.service": { "name": "cloud-init-local.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-init.service": { "name": "cloud-init.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "console-getty.service": { "name": "console-getty.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "console-shell.service": { "name": 
"console-shell.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "container-getty@.service": { "name": "container-getty@.service", "source": "systemd", "state": "unknown", "status": "static" }, "cpupower.service": { "name": "cpupower.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "crond.service": { "name": "crond.service", "source": "systemd", "state": "running", "status": "enabled" }, "dbus-org.freedesktop.hostname1.service": { "name": "dbus-org.freedesktop.hostname1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus-org.freedesktop.import1.service": { "name": "dbus-org.freedesktop.import1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus-org.freedesktop.locale1.service": { "name": "dbus-org.freedesktop.locale1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus-org.freedesktop.login1.service": { "name": "dbus-org.freedesktop.login1.service", "source": "systemd", "state": "active", "status": "static" }, "dbus-org.freedesktop.machine1.service": { "name": "dbus-org.freedesktop.machine1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus-org.freedesktop.nm-dispatcher.service": { "name": "dbus-org.freedesktop.nm-dispatcher.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "dbus-org.freedesktop.timedate1.service": { "name": "dbus-org.freedesktop.timedate1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus.service": { "name": "dbus.service", "source": "systemd", "state": "running", "status": "static" }, "debug-shell.service": { "name": "debug-shell.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "dm-event.service": { "name": "dm-event.service", "source": "systemd", "state": "stopped", "status": "static" }, "dmraid-activation.service": { "name": "dmraid-activation.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "dracut-cmdline.service": { "name": "dracut-cmdline.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-initqueue.service": { "name": "dracut-initqueue.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-mount.service": { "name": "dracut-mount.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-mount.service": { "name": "dracut-pre-mount.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-pivot.service": { "name": "dracut-pre-pivot.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-trigger.service": { "name": "dracut-pre-trigger.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-udev.service": { "name": "dracut-pre-udev.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-shutdown.service": { "name": "dracut-shutdown.service", "source": "systemd", "state": "stopped", "status": "static" }, "ebtables.service": { "name": "ebtables.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "emergency.service": { "name": "emergency.service", "source": "systemd", "state": "stopped", "status": "static" }, "firewalld.service": { "name": "firewalld.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "fstrim.service": { "name": "fstrim.service", "source": "systemd", "state": "inactive", "status": "static" }, "getty@.service": { "name": "getty@.service", "source": 
"systemd", "state": "unknown", "status": "enabled" }, "getty@tty1.service": { "name": "getty@tty1.service", "source": "systemd", "state": "running", "status": "unknown" }, "gssproxy.service": { "name": "gssproxy.service", "source": "systemd", "state": "running", "status": "disabled" }, "halt-local.service": { "name": "halt-local.service", "source": "systemd", "state": "inactive", "status": "static" }, "initrd-cleanup.service": { "name": "initrd-cleanup.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-parse-etc.service": { "name": "initrd-parse-etc.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-switch-root.service": { "name": "initrd-switch-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-udevadm-cleanup-db.service": { "name": "initrd-udevadm-cleanup-db.service", "source": "systemd", "state": "stopped", "status": "static" }, "iprdump.service": { "name": "iprdump.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "iprinit.service": { "name": "iprinit.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "iprupdate.service": { "name": "iprupdate.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "irqbalance.service": { "name": "irqbalance.service", "source": "systemd", "state": "running", "status": "enabled" }, "kdump.service": { "name": "kdump.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "kmod-static-nodes.service": { "name": "kmod-static-nodes.service", "source": "systemd", "state": "stopped", "status": "static" }, "lvm2-lvmetad.service": { "name": "lvm2-lvmetad.service", "source": "systemd", "state": "running", "status": "static" }, "lvm2-lvmpolld.service": { "name": "lvm2-lvmpolld.service", "source": "systemd", "state": "stopped", "status": "static" }, "lvm2-monitor.service": { "name": "lvm2-monitor.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "lvm2-pvscan@.service": { "name": "lvm2-pvscan@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdadm-grow-continue@.service": { "name": "mdadm-grow-continue@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdadm-last-resort@.service": { "name": "mdadm-last-resort@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdcheck_continue.service": { "name": "mdcheck_continue.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdcheck_start.service": { "name": "mdcheck_start.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdmon@.service": { "name": "mdmon@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdmonitor-oneshot.service": { "name": "mdmonitor-oneshot.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdmonitor.service": { "name": "mdmonitor.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "messagebus.service": { "name": "messagebus.service", "source": "systemd", "state": "active", "status": "static" }, "microcode.service": { "name": "microcode.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "netconsole": { "name": "netconsole", "source": "sysv", "state": "stopped", "status": "disabled" }, "network": { "name": "network", "source": "sysv", "state": "running", "status": "enabled" }, "network.service": { "name": "network.service", "source": "systemd", "state": "stopped", "status": "unknown" }, 
"nfs-blkmap.service": { "name": "nfs-blkmap.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfs-config.service": { "name": "nfs-config.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-idmap.service": { "name": "nfs-idmap.service", "source": "systemd", "state": "inactive", "status": "static" }, "nfs-idmapd.service": { "name": "nfs-idmapd.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-lock.service": { "name": "nfs-lock.service", "source": "systemd", "state": "inactive", "status": "static" }, "nfs-mountd.service": { "name": "nfs-mountd.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-rquotad.service": { "name": "nfs-rquotad.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfs-secure.service": { "name": "nfs-secure.service", "source": "systemd", "state": "inactive", "status": "static" }, "nfs-server.service": { "name": "nfs-server.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "nfs-utils.service": { "name": "nfs-utils.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs.service": { "name": "nfs.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfslock.service": { "name": "nfslock.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-halt.service": { "name": "plymouth-halt.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "plymouth-kexec.service": { "name": "plymouth-kexec.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "plymouth-poweroff.service": { "name": "plymouth-poweroff.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "plymouth-quit-wait.service": { "name": "plymouth-quit-wait.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "plymouth-quit.service": { "name": "plymouth-quit.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "plymouth-read-write.service": { "name": "plymouth-read-write.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "plymouth-reboot.service": { "name": "plymouth-reboot.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "plymouth-start.service": { "name": "plymouth-start.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "plymouth-switch-root.service": { "name": "plymouth-switch-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "polkit.service": { "name": "polkit.service", "source": "systemd", "state": "running", "status": "static" }, "postfix.service": { "name": "postfix.service", "source": "systemd", "state": "running", "status": "enabled" }, "qemu-guest-agent.service": { "name": "qemu-guest-agent.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "quotaon.service": { "name": "quotaon.service", "source": "systemd", "state": "inactive", "status": "static" }, "rc-local.service": { "name": "rc-local.service", "source": "systemd", "state": "stopped", "status": "static" }, "rdisc.service": { "name": "rdisc.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rescue.service": { "name": "rescue.service", "source": "systemd", "state": "stopped", "status": "static" }, "restraintd.service": { "name": "restraintd.service", "source": "systemd", "state": "running", "status": "enabled" }, "rhel-autorelabel-mark.service": { "name": 
"rhel-autorelabel-mark.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "rhel-autorelabel.service": { "name": "rhel-autorelabel.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "rhel-configure.service": { "name": "rhel-configure.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "rhel-dmesg.service": { "name": "rhel-dmesg.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "rhel-domainname.service": { "name": "rhel-domainname.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "rhel-import-state.service": { "name": "rhel-import-state.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "rhel-loadmodules.service": { "name": "rhel-loadmodules.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "rhel-readonly.service": { "name": "rhel-readonly.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "rngd.service": { "name": "rngd.service", "source": "systemd", "state": "running", "status": "enabled" }, "rpc-gssd.service": { "name": "rpc-gssd.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-rquotad.service": { "name": "rpc-rquotad.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rpc-statd-notify.service": { "name": "rpc-statd-notify.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-statd.service": { "name": "rpc-statd.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpcbind.service": { "name": "rpcbind.service", "source": "systemd", "state": "running", "status": "enabled" }, "rpcgssd.service": { "name": "rpcgssd.service", "source": "systemd", "state": "inactive", "status": "static" }, "rpcidmapd.service": { "name": "rpcidmapd.service", "source": "systemd", "state": "inactive", "status": "static" }, "rsyncd.service": { "name": "rsyncd.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rsyncd@.service": { "name": "rsyncd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "rsyslog.service": { "name": "rsyslog.service", "source": "systemd", "state": "running", "status": "enabled" }, "selinux-policy-migrate-local-changes@.service": { "name": "selinux-policy-migrate-local-changes@.service", "source": "systemd", "state": "unknown", "status": "static" }, "selinux-policy-migrate-local-changes@targeted.service": { "name": "selinux-policy-migrate-local-changes@targeted.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "serial-getty@.service": { "name": "serial-getty@.service", "source": "systemd", "state": "unknown", "status": "disabled" }, "serial-getty@ttyS0.service": { "name": "serial-getty@ttyS0.service", "source": "systemd", "state": "running", "status": "unknown" }, "sshd-keygen.service": { "name": "sshd-keygen.service", "source": "systemd", "state": "stopped", "status": "static" }, "sshd.service": { "name": "sshd.service", "source": "systemd", "state": "running", "status": "enabled" }, "sshd@.service": { "name": "sshd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-ask-password-console.service": { "name": "systemd-ask-password-console.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-ask-password-plymouth.service": { "name": "systemd-ask-password-plymouth.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-ask-password-wall.service": { "name": 
"systemd-ask-password-wall.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-backlight@.service": { "name": "systemd-backlight@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-binfmt.service": { "name": "systemd-binfmt.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-bootchart.service": { "name": "systemd-bootchart.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "systemd-firstboot.service": { "name": "systemd-firstboot.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-fsck-root.service": { "name": "systemd-fsck-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-fsck@.service": { "name": "systemd-fsck@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-halt.service": { "name": "systemd-halt.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hibernate-resume@.service": { "name": "systemd-hibernate-resume@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-hibernate.service": { "name": "systemd-hibernate.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hostnamed.service": { "name": "systemd-hostnamed.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hwdb-update.service": { "name": "systemd-hwdb-update.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-hybrid-sleep.service": { "name": "systemd-hybrid-sleep.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-importd.service": { "name": "systemd-importd.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-initctl.service": { "name": "systemd-initctl.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journal-catalog-update.service": { "name": "systemd-journal-catalog-update.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journal-flush.service": { "name": "systemd-journal-flush.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journald.service": { "name": "systemd-journald.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-kexec.service": { "name": "systemd-kexec.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-localed.service": { "name": "systemd-localed.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-logind.service": { "name": "systemd-logind.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-machine-id-commit.service": { "name": "systemd-machine-id-commit.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-machined.service": { "name": "systemd-machined.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-modules-load.service": { "name": "systemd-modules-load.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-nspawn@.service": { "name": "systemd-nspawn@.service", "source": "systemd", "state": "unknown", "status": "disabled" }, "systemd-poweroff.service": { "name": "systemd-poweroff.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-quotacheck.service": { "name": "systemd-quotacheck.service", "source": "systemd", "state": "inactive", "status": "static" }, 
"systemd-random-seed.service": { "name": "systemd-random-seed.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-readahead-collect.service": { "name": "systemd-readahead-collect.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "systemd-readahead-done.service": { "name": "systemd-readahead-done.service", "source": "systemd", "state": "stopped", "status": "indirect" }, "systemd-readahead-drop.service": { "name": "systemd-readahead-drop.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "systemd-readahead-replay.service": { "name": "systemd-readahead-replay.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "systemd-reboot.service": { "name": "systemd-reboot.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-remount-fs.service": { "name": "systemd-remount-fs.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-rfkill@.service": { "name": "systemd-rfkill@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-shutdownd.service": { "name": "systemd-shutdownd.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-suspend.service": { "name": "systemd-suspend.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-sysctl.service": { "name": "systemd-sysctl.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-timedated.service": { "name": "systemd-timedated.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-tmpfiles-clean.service": { "name": "systemd-tmpfiles-clean.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles-setup-dev.service": { "name": "systemd-tmpfiles-setup-dev.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles-setup.service": { "name": "systemd-tmpfiles-setup.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udev-settle.service": { "name": "systemd-udev-settle.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udev-trigger.service": { "name": "systemd-udev-trigger.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udevd.service": { "name": "systemd-udevd.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-update-done.service": { "name": "systemd-update-done.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-update-utmp-runlevel.service": { "name": "systemd-update-utmp-runlevel.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-update-utmp.service": { "name": "systemd-update-utmp.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-user-sessions.service": { "name": "systemd-user-sessions.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-vconsole-setup.service": { "name": "systemd-vconsole-setup.service", "source": "systemd", "state": "stopped", "status": "static" }, "target.service": { "name": "target.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "targetclid.service": { "name": "targetclid.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "teamd@.service": { "name": "teamd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "tuned.service": { "name": "tuned.service", "source": "systemd", 
"state": "running", "status": "enabled" }, "wpa_supplicant.service": { "name": "wpa_supplicant.service", "source": "systemd", "state": "inactive", "status": "disabled" } } }, "changed": false } TASK [fedora.linux_system_roles.storage : Set storage_cryptsetup_services] ***** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:57 Friday 17 January 2025 04:16:08 -0500 (0:00:00.993) 0:06:00.978 ******** ok: [managed-node1] => { "ansible_facts": { "storage_cryptsetup_services": [] }, "changed": false } TASK [fedora.linux_system_roles.storage : Mask the systemd cryptsetup services] *** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:63 Friday 17 January 2025 04:16:08 -0500 (0:00:00.055) 0:06:01.033 ******** TASK [fedora.linux_system_roles.storage : Manage the pools and volumes to match the specified state] *** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:69 Friday 17 January 2025 04:16:08 -0500 (0:00:00.034) 0:06:01.068 ******** fatal: [managed-node1]: FAILED! => { "actions": [], "changed": false, "crypts": [], "leaves": [], "mounts": [], "packages": [], "pools": [], "volumes": [] } MSG: encrypted volume 'test1' missing key/password TASK [fedora.linux_system_roles.storage : Failed message] ********************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:109 Friday 17 January 2025 04:16:12 -0500 (0:00:04.026) 0:06:05.094 ******** fatal: [managed-node1]: FAILED! => { "changed": false } MSG: {u'_ansible_no_log': False, u'crypts': [], u'pools': [], u'leaves': [], u'changed': False, u'actions': [], u'failed': True, u'volumes': [], u'invocation': {u'module_args': {u'packages_only': False, u'disklabel_type': None, u'diskvolume_mkfs_option_map': {u'ext4': u'-F', u'ext3': u'-F', u'ext2': u'-F'}, u'safe_mode': False, u'pools': [{u'raid_metadata_version': None, u'encryption_key_size': None, u'encryption_key': None, u'encryption_luks_version': None, u'encryption_tang_url': None, u'raid_spare_count': None, u'grow_to_fill': False, u'encryption_tang_thumbprint': None, u'name': u'foo', u'encryption_password': None, u'encryption': False, u'disks': [u'sda'], u'raid_level': None, u'raid_device_count': None, u'state': u'present', u'volumes': [{u'raid_metadata_version': None, u'mount_device_identifier': u'uuid', u'fs_type': u'xfs', u'mount_options': u'defaults', u'size': u'4g', u'mount_point': u'/opt/test1', u'compression': None, u'encryption_password': None, u'encryption': True, u'raid_level': None, u'raid_device_count': None, u'state': u'present', u'vdo_pool_size': None, u'mount_mode': None, u'thin_pool_name': None, u'type': u'lvm', u'encryption_key_size': None, u'deduplication': None, u'encryption_cipher': None, u'encryption_key': None, u'fs_label': u'', u'encryption_luks_version': u'luks2', u'raid_stripe_size': None, u'mount_passno': 0, u'mount_user': None, u'raid_spare_count': None, u'name': u'test1', u'cache_mode': None, u'raid_disks': [], u'mount_group': None, u'fs_overwrite_existing': True, u'disks': [], u'cached': False, u'thin_pool_size': None, u'thin': False, u'mount_check': 0, u'cache_size': 0, u'raid_chunk_size': None, u'cache_devices': [], u'fs_create_options': u''}], u'shared': False, u'encryption_clevis_pin': None, u'type': u'lvm', u'encryption_cipher': None, u'raid_chunk_size': None}], u'volumes': [], u'pool_defaults': 
{u'raid_metadata_version': None, u'encryption_cipher': None, u'encryption_key': None, u'encryption_luks_version': None, u'raid_spare_count': None, u'grow_to_fill': False, u'encryption_password': None, u'encryption': False, u'disks': [], u'raid_level': None, u'raid_device_count': None, u'state': u'present', u'volumes': [], u'shared': False, u'type': u'lvm', u'encryption_key_size': None, u'raid_chunk_size': None}, u'volume_defaults': {u'raid_metadata_version': None, u'raid_level': None, u'fs_type': u'xfs', u'mount_options': u'defaults', u'size': 0, u'mount_point': u'', u'compression': None, u'encryption_password': None, u'encryption': False, u'mount_device_identifier': u'uuid', u'raid_device_count': None, u'state': u'present', u'vdo_pool_size': None, u'thin_pool_name': None, u'type': u'lvm', u'encryption_key_size': None, u'encryption_cipher': None, u'encryption_key': None, u'fs_label': u'', u'encryption_luks_version': None, u'raid_stripe_size': None, u'cache_size': 0, u'raid_spare_count': None, u'cache_mode': None, u'deduplication': None, u'cached': False, u'fs_overwrite_existing': True, u'disks': [], u'thin_pool_size': None, u'thin': None, u'mount_check': 0, u'mount_passno': 0, u'raid_chunk_size': None, u'cache_devices': [], u'fs_create_options': u''}, u'use_partitions': None}}, u'mounts': [], u'packages': [], u'msg': u"encrypted volume 'test1' missing key/password"} TASK [fedora.linux_system_roles.storage : Unmask the systemd cryptsetup services] *** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:113 Friday 17 January 2025 04:16:12 -0500 (0:00:00.076) 0:06:05.171 ******** TASK [Check that we failed in the role] **************************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-failed.yml:23 Friday 17 January 2025 04:16:12 -0500 (0:00:00.040) 0:06:05.211 ******** ok: [managed-node1] => { "changed": false } MSG: All assertions passed TASK [Verify the blivet output and error message are correct] ****************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-failed.yml:28 Friday 17 January 2025 04:16:12 -0500 (0:00:00.052) 0:06:05.264 ******** ok: [managed-node1] => { "changed": false } MSG: All assertions passed TASK [Verify correct exception or error message] ******************************* task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-failed.yml:39 Friday 17 January 2025 04:16:12 -0500 (0:00:00.066) 0:06:05.330 ******** skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Create an encrypted lvm volume w/ default fs] **************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/tests_luks2.yml:387 Friday 17 January 2025 04:16:12 -0500 (0:00:00.039) 0:06:05.369 ******** TASK [fedora.linux_system_roles.storage : Set platform/version specific variables] *** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:2 Friday 17 January 2025 04:16:12 -0500 (0:00:00.078) 0:06:05.448 ******** included: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml for managed-node1 TASK [fedora.linux_system_roles.storage : Ensure ansible_facts used by role] *** task path: 
/tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:2 Friday 17 January 2025 04:16:12 -0500 (0:00:00.059) 0:06:05.508 ******** skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Set platform/version specific variables] *** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:7 Friday 17 January 2025 04:16:12 -0500 (0:00:00.060) 0:06:05.569 ******** skipping: [managed-node1] => (item=RedHat.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat.yml", "skip_reason": "Conditional result was False" } skipping: [managed-node1] => (item=CentOS.yml) => { "ansible_loop_var": "item", "changed": false, "item": "CentOS.yml", "skip_reason": "Conditional result was False" } ok: [managed-node1] => (item=CentOS_7.yml) => { "ansible_facts": { "__storage_blivet_diskvolume_mkfs_option_map": { "ext2": "-F", "ext3": "-F", "ext4": "-F" }, "blivet_package_list": [ "python-enum34", "python-blivet3", "libblockdev-crypto", "libblockdev-dm", "libblockdev-lvm", "libblockdev-mdraid", "libblockdev-swap", "{{ 'libblockdev-s390' if ansible_architecture == 's390x' else 'libblockdev' }}" ] }, "ansible_included_var_files": [ "/tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/vars/CentOS_7.yml" ], "ansible_loop_var": "item", "changed": false, "item": "CentOS_7.yml" } skipping: [managed-node1] => (item=CentOS_7.9.yml) => { "ansible_loop_var": "item", "changed": false, "item": "CentOS_7.9.yml", "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Check if system is ostree] *********** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:25 Friday 17 January 2025 04:16:12 -0500 (0:00:00.111) 0:06:05.681 ******** skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Set flag to indicate system is ostree] *** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:30 Friday 17 January 2025 04:16:13 -0500 (0:00:00.046) 0:06:05.727 ******** skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Define an empty list of pools to be used in testing] *** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:5 Friday 17 January 2025 04:16:13 -0500 (0:00:00.043) 0:06:05.771 ******** ok: [managed-node1] => { "ansible_facts": { "_storage_pools_list": [] }, "changed": false } TASK [fedora.linux_system_roles.storage : Define an empty list of volumes to be used in testing] *** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:9 Friday 17 January 2025 04:16:13 -0500 (0:00:00.044) 0:06:05.816 ******** ok: [managed-node1] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false } TASK [fedora.linux_system_roles.storage : Include the appropriate provider tasks] *** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:13 Friday 17 January 2025 04:16:13 -0500 (0:00:00.047) 0:06:05.863 ******** included: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml for managed-node1 
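For reference, the pool definition printed by the "Show storage_pools" task that follows corresponds to a role invocation along these lines (a minimal sketch reconstructed from the logged parameter values, not the literal test source; the play header and vars: placement are assumptions):

    - hosts: managed-node1            # hosts pattern assumed for illustration
      vars:
        storage_pools:
          - name: foo
            type: lvm
            disks: [sda]
            volumes:
              - name: test1
                size: 4g
                mount_point: /opt/test1
                encryption: true
                encryption_luks_version: luks2
                encryption_cipher: aes-xts-plain64
                encryption_key_size: 512
                encryption_password: yabbadabbadoo   # value as printed in the log below
      roles:
        - fedora.linux_system_roles.storage

The earlier failure ("encrypted volume 'test1' missing key/password") came from the same definition with encryption: true but neither encryption_password nor encryption_key set, which blivet rejects before taking any actions (note the empty "actions" list in that failure output).
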
TASK [fedora.linux_system_roles.storage : Make sure blivet is available] ******* task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:2 Friday 17 January 2025 04:16:13 -0500 (0:00:00.094) 0:06:05.957 ******** ok: [managed-node1] => { "changed": false, "rc": 0, "results": [ "python-enum34-1.0.4-1.el7.noarch providing python-enum34 is already installed", "1:python2-blivet3-3.1.3-3.el7.noarch providing python-blivet3 is already installed", "libblockdev-crypto-2.18-5.el7.x86_64 providing libblockdev-crypto is already installed", "libblockdev-dm-2.18-5.el7.x86_64 providing libblockdev-dm is already installed", "libblockdev-lvm-2.18-5.el7.x86_64 providing libblockdev-lvm is already installed", "libblockdev-mdraid-2.18-5.el7.x86_64 providing libblockdev-mdraid is already installed", "libblockdev-swap-2.18-5.el7.x86_64 providing libblockdev-swap is already installed", "libblockdev-2.18-5.el7.x86_64 providing libblockdev is already installed" ] } TASK [fedora.linux_system_roles.storage : Show storage_pools] ****************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:9 Friday 17 January 2025 04:16:17 -0500 (0:00:03.901) 0:06:09.858 ******** ok: [managed-node1] => { "storage_pools": [ { "disks": [ "sda" ], "name": "foo", "type": "lvm", "volumes": [ { "encryption": true, "encryption_cipher": "aes-xts-plain64", "encryption_key_size": 512, "encryption_luks_version": "luks2", "encryption_password": "yabbadabbadoo", "mount_point": "/opt/test1", "name": "test1", "size": "4g" } ] } ] } TASK [fedora.linux_system_roles.storage : Show storage_volumes] **************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:14 Friday 17 January 2025 04:16:17 -0500 (0:00:00.064) 0:06:09.923 ******** ok: [managed-node1] => { "storage_volumes": "VARIABLE IS NOT DEFINED!: 'storage_volumes' is undefined" } TASK [fedora.linux_system_roles.storage : Get required packages] *************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:19 Friday 17 January 2025 04:16:17 -0500 (0:00:00.061) 0:06:09.984 ******** ok: [managed-node1] => { "actions": [], "changed": false, "crypts": [], "leaves": [], "mounts": [], "packages": [ "cryptsetup", "lvm2" ], "pools": [], "volumes": [] } TASK [fedora.linux_system_roles.storage : Enable copr repositories if needed] *** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:31 Friday 17 January 2025 04:16:21 -0500 (0:00:03.859) 0:06:13.843 ******** included: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml for managed-node1 TASK [fedora.linux_system_roles.storage : Check if the COPR support packages should be installed] *** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:2 Friday 17 January 2025 04:16:21 -0500 (0:00:00.101) 0:06:13.945 ******** TASK [fedora.linux_system_roles.storage : Make sure COPR support packages are present] *** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:13 Friday 17 January 2025 04:16:21 -0500 (0:00:00.052) 0:06:13.998 ******** skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" } TASK 
[fedora.linux_system_roles.storage : Enable COPRs] ************************ task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:19 Friday 17 January 2025 04:16:21 -0500 (0:00:00.057) 0:06:14.056 ******** TASK [fedora.linux_system_roles.storage : Make sure required packages are installed] *** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:37 Friday 17 January 2025 04:16:21 -0500 (0:00:00.055) 0:06:14.111 ******** ok: [managed-node1] => { "changed": false, "rc": 0, "results": [ "cryptsetup-2.0.3-6.el7.x86_64 providing cryptsetup is already installed", "7:lvm2-2.02.187-6.el7_9.5.x86_64 providing lvm2 is already installed", "kpartx-0.4.9-136.el7_9.x86_64 providing kpartx is already installed" ] } TASK [fedora.linux_system_roles.storage : Get service facts] ******************* task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:51 Friday 17 January 2025 04:16:22 -0500 (0:00:00.769) 0:06:14.880 ******** ok: [managed-node1] => { "ansible_facts": { "services": { "NetworkManager-dispatcher.service": { "name": "NetworkManager-dispatcher.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "NetworkManager-wait-online.service": { "name": "NetworkManager-wait-online.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "NetworkManager.service": { "name": "NetworkManager.service", "source": "systemd", "state": "running", "status": "enabled" }, "arp-ethers.service": { "name": "arp-ethers.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "auditd.service": { "name": "auditd.service", "source": "systemd", "state": "running", "status": "enabled" }, "auth-rpcgss-module.service": { "name": "auth-rpcgss-module.service", "source": "systemd", "state": "stopped", "status": "static" }, "autovt@.service": { "name": "autovt@.service", "source": "systemd", "state": "unknown", "status": "enabled" }, "blivet.service": { "name": "blivet.service", "source": "systemd", "state": "inactive", "status": "static" }, "blk-availability.service": { "name": "blk-availability.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "brandbot.service": { "name": "brandbot.service", "source": "systemd", "state": "inactive", "status": "static" }, "chrony-dnssrv@.service": { "name": "chrony-dnssrv@.service", "source": "systemd", "state": "unknown", "status": "static" }, "chrony-wait.service": { "name": "chrony-wait.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "chronyd.service": { "name": "chronyd.service", "source": "systemd", "state": "running", "status": "enabled" }, "cloud-config.service": { "name": "cloud-config.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-final.service": { "name": "cloud-final.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-init-local.service": { "name": "cloud-init-local.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-init.service": { "name": "cloud-init.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "console-getty.service": { "name": "console-getty.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "console-shell.service": { "name": "console-shell.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "container-getty@.service": { 
"name": "container-getty@.service", "source": "systemd", "state": "unknown", "status": "static" }, "cpupower.service": { "name": "cpupower.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "crond.service": { "name": "crond.service", "source": "systemd", "state": "running", "status": "enabled" }, "dbus-org.freedesktop.hostname1.service": { "name": "dbus-org.freedesktop.hostname1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus-org.freedesktop.import1.service": { "name": "dbus-org.freedesktop.import1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus-org.freedesktop.locale1.service": { "name": "dbus-org.freedesktop.locale1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus-org.freedesktop.login1.service": { "name": "dbus-org.freedesktop.login1.service", "source": "systemd", "state": "active", "status": "static" }, "dbus-org.freedesktop.machine1.service": { "name": "dbus-org.freedesktop.machine1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus-org.freedesktop.nm-dispatcher.service": { "name": "dbus-org.freedesktop.nm-dispatcher.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "dbus-org.freedesktop.timedate1.service": { "name": "dbus-org.freedesktop.timedate1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus.service": { "name": "dbus.service", "source": "systemd", "state": "running", "status": "static" }, "debug-shell.service": { "name": "debug-shell.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "dm-event.service": { "name": "dm-event.service", "source": "systemd", "state": "stopped", "status": "static" }, "dmraid-activation.service": { "name": "dmraid-activation.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "dracut-cmdline.service": { "name": "dracut-cmdline.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-initqueue.service": { "name": "dracut-initqueue.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-mount.service": { "name": "dracut-mount.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-mount.service": { "name": "dracut-pre-mount.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-pivot.service": { "name": "dracut-pre-pivot.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-trigger.service": { "name": "dracut-pre-trigger.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-udev.service": { "name": "dracut-pre-udev.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-shutdown.service": { "name": "dracut-shutdown.service", "source": "systemd", "state": "stopped", "status": "static" }, "ebtables.service": { "name": "ebtables.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "emergency.service": { "name": "emergency.service", "source": "systemd", "state": "stopped", "status": "static" }, "firewalld.service": { "name": "firewalld.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "fstrim.service": { "name": "fstrim.service", "source": "systemd", "state": "inactive", "status": "static" }, "getty@.service": { "name": "getty@.service", "source": "systemd", "state": "unknown", "status": "enabled" }, "getty@tty1.service": { "name": "getty@tty1.service", "source": "systemd", 
"state": "running", "status": "unknown" }, "gssproxy.service": { "name": "gssproxy.service", "source": "systemd", "state": "running", "status": "disabled" }, "halt-local.service": { "name": "halt-local.service", "source": "systemd", "state": "inactive", "status": "static" }, "initrd-cleanup.service": { "name": "initrd-cleanup.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-parse-etc.service": { "name": "initrd-parse-etc.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-switch-root.service": { "name": "initrd-switch-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-udevadm-cleanup-db.service": { "name": "initrd-udevadm-cleanup-db.service", "source": "systemd", "state": "stopped", "status": "static" }, "iprdump.service": { "name": "iprdump.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "iprinit.service": { "name": "iprinit.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "iprupdate.service": { "name": "iprupdate.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "irqbalance.service": { "name": "irqbalance.service", "source": "systemd", "state": "running", "status": "enabled" }, "kdump.service": { "name": "kdump.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "kmod-static-nodes.service": { "name": "kmod-static-nodes.service", "source": "systemd", "state": "stopped", "status": "static" }, "lvm2-lvmetad.service": { "name": "lvm2-lvmetad.service", "source": "systemd", "state": "running", "status": "static" }, "lvm2-lvmpolld.service": { "name": "lvm2-lvmpolld.service", "source": "systemd", "state": "stopped", "status": "static" }, "lvm2-monitor.service": { "name": "lvm2-monitor.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "lvm2-pvscan@.service": { "name": "lvm2-pvscan@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdadm-grow-continue@.service": { "name": "mdadm-grow-continue@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdadm-last-resort@.service": { "name": "mdadm-last-resort@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdcheck_continue.service": { "name": "mdcheck_continue.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdcheck_start.service": { "name": "mdcheck_start.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdmon@.service": { "name": "mdmon@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdmonitor-oneshot.service": { "name": "mdmonitor-oneshot.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdmonitor.service": { "name": "mdmonitor.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "messagebus.service": { "name": "messagebus.service", "source": "systemd", "state": "active", "status": "static" }, "microcode.service": { "name": "microcode.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "netconsole": { "name": "netconsole", "source": "sysv", "state": "stopped", "status": "disabled" }, "network": { "name": "network", "source": "sysv", "state": "running", "status": "enabled" }, "network.service": { "name": "network.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "nfs-blkmap.service": { "name": "nfs-blkmap.service", "source": "systemd", "state": "inactive", "status": "disabled" }, 
"nfs-config.service": { "name": "nfs-config.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-idmap.service": { "name": "nfs-idmap.service", "source": "systemd", "state": "inactive", "status": "static" }, "nfs-idmapd.service": { "name": "nfs-idmapd.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-lock.service": { "name": "nfs-lock.service", "source": "systemd", "state": "inactive", "status": "static" }, "nfs-mountd.service": { "name": "nfs-mountd.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-rquotad.service": { "name": "nfs-rquotad.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfs-secure.service": { "name": "nfs-secure.service", "source": "systemd", "state": "inactive", "status": "static" }, "nfs-server.service": { "name": "nfs-server.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "nfs-utils.service": { "name": "nfs-utils.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs.service": { "name": "nfs.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfslock.service": { "name": "nfslock.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-halt.service": { "name": "plymouth-halt.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "plymouth-kexec.service": { "name": "plymouth-kexec.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "plymouth-poweroff.service": { "name": "plymouth-poweroff.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "plymouth-quit-wait.service": { "name": "plymouth-quit-wait.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "plymouth-quit.service": { "name": "plymouth-quit.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "plymouth-read-write.service": { "name": "plymouth-read-write.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "plymouth-reboot.service": { "name": "plymouth-reboot.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "plymouth-start.service": { "name": "plymouth-start.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "plymouth-switch-root.service": { "name": "plymouth-switch-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "polkit.service": { "name": "polkit.service", "source": "systemd", "state": "running", "status": "static" }, "postfix.service": { "name": "postfix.service", "source": "systemd", "state": "running", "status": "enabled" }, "qemu-guest-agent.service": { "name": "qemu-guest-agent.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "quotaon.service": { "name": "quotaon.service", "source": "systemd", "state": "inactive", "status": "static" }, "rc-local.service": { "name": "rc-local.service", "source": "systemd", "state": "stopped", "status": "static" }, "rdisc.service": { "name": "rdisc.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rescue.service": { "name": "rescue.service", "source": "systemd", "state": "stopped", "status": "static" }, "restraintd.service": { "name": "restraintd.service", "source": "systemd", "state": "running", "status": "enabled" }, "rhel-autorelabel-mark.service": { "name": "rhel-autorelabel-mark.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "rhel-autorelabel.service": { "name": 
"rhel-autorelabel.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "rhel-configure.service": { "name": "rhel-configure.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "rhel-dmesg.service": { "name": "rhel-dmesg.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "rhel-domainname.service": { "name": "rhel-domainname.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "rhel-import-state.service": { "name": "rhel-import-state.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "rhel-loadmodules.service": { "name": "rhel-loadmodules.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "rhel-readonly.service": { "name": "rhel-readonly.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "rngd.service": { "name": "rngd.service", "source": "systemd", "state": "running", "status": "enabled" }, "rpc-gssd.service": { "name": "rpc-gssd.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-rquotad.service": { "name": "rpc-rquotad.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rpc-statd-notify.service": { "name": "rpc-statd-notify.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-statd.service": { "name": "rpc-statd.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpcbind.service": { "name": "rpcbind.service", "source": "systemd", "state": "running", "status": "enabled" }, "rpcgssd.service": { "name": "rpcgssd.service", "source": "systemd", "state": "inactive", "status": "static" }, "rpcidmapd.service": { "name": "rpcidmapd.service", "source": "systemd", "state": "inactive", "status": "static" }, "rsyncd.service": { "name": "rsyncd.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rsyncd@.service": { "name": "rsyncd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "rsyslog.service": { "name": "rsyslog.service", "source": "systemd", "state": "running", "status": "enabled" }, "selinux-policy-migrate-local-changes@.service": { "name": "selinux-policy-migrate-local-changes@.service", "source": "systemd", "state": "unknown", "status": "static" }, "selinux-policy-migrate-local-changes@targeted.service": { "name": "selinux-policy-migrate-local-changes@targeted.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "serial-getty@.service": { "name": "serial-getty@.service", "source": "systemd", "state": "unknown", "status": "disabled" }, "serial-getty@ttyS0.service": { "name": "serial-getty@ttyS0.service", "source": "systemd", "state": "running", "status": "unknown" }, "sshd-keygen.service": { "name": "sshd-keygen.service", "source": "systemd", "state": "stopped", "status": "static" }, "sshd.service": { "name": "sshd.service", "source": "systemd", "state": "running", "status": "enabled" }, "sshd@.service": { "name": "sshd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-ask-password-console.service": { "name": "systemd-ask-password-console.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-ask-password-plymouth.service": { "name": "systemd-ask-password-plymouth.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-ask-password-wall.service": { "name": "systemd-ask-password-wall.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-backlight@.service": { "name": 
"systemd-backlight@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-binfmt.service": { "name": "systemd-binfmt.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-bootchart.service": { "name": "systemd-bootchart.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "systemd-firstboot.service": { "name": "systemd-firstboot.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-fsck-root.service": { "name": "systemd-fsck-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-fsck@.service": { "name": "systemd-fsck@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-halt.service": { "name": "systemd-halt.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hibernate-resume@.service": { "name": "systemd-hibernate-resume@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-hibernate.service": { "name": "systemd-hibernate.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hostnamed.service": { "name": "systemd-hostnamed.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hwdb-update.service": { "name": "systemd-hwdb-update.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-hybrid-sleep.service": { "name": "systemd-hybrid-sleep.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-importd.service": { "name": "systemd-importd.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-initctl.service": { "name": "systemd-initctl.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journal-catalog-update.service": { "name": "systemd-journal-catalog-update.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journal-flush.service": { "name": "systemd-journal-flush.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journald.service": { "name": "systemd-journald.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-kexec.service": { "name": "systemd-kexec.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-localed.service": { "name": "systemd-localed.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-logind.service": { "name": "systemd-logind.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-machine-id-commit.service": { "name": "systemd-machine-id-commit.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-machined.service": { "name": "systemd-machined.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-modules-load.service": { "name": "systemd-modules-load.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-nspawn@.service": { "name": "systemd-nspawn@.service", "source": "systemd", "state": "unknown", "status": "disabled" }, "systemd-poweroff.service": { "name": "systemd-poweroff.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-quotacheck.service": { "name": "systemd-quotacheck.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-random-seed.service": { "name": "systemd-random-seed.service", "source": "systemd", "state": "stopped", "status": "static" }, 
"systemd-readahead-collect.service": { "name": "systemd-readahead-collect.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "systemd-readahead-done.service": { "name": "systemd-readahead-done.service", "source": "systemd", "state": "stopped", "status": "indirect" }, "systemd-readahead-drop.service": { "name": "systemd-readahead-drop.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "systemd-readahead-replay.service": { "name": "systemd-readahead-replay.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "systemd-reboot.service": { "name": "systemd-reboot.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-remount-fs.service": { "name": "systemd-remount-fs.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-rfkill@.service": { "name": "systemd-rfkill@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-shutdownd.service": { "name": "systemd-shutdownd.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-suspend.service": { "name": "systemd-suspend.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-sysctl.service": { "name": "systemd-sysctl.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-timedated.service": { "name": "systemd-timedated.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-tmpfiles-clean.service": { "name": "systemd-tmpfiles-clean.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles-setup-dev.service": { "name": "systemd-tmpfiles-setup-dev.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles-setup.service": { "name": "systemd-tmpfiles-setup.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udev-settle.service": { "name": "systemd-udev-settle.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udev-trigger.service": { "name": "systemd-udev-trigger.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udevd.service": { "name": "systemd-udevd.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-update-done.service": { "name": "systemd-update-done.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-update-utmp-runlevel.service": { "name": "systemd-update-utmp-runlevel.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-update-utmp.service": { "name": "systemd-update-utmp.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-user-sessions.service": { "name": "systemd-user-sessions.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-vconsole-setup.service": { "name": "systemd-vconsole-setup.service", "source": "systemd", "state": "stopped", "status": "static" }, "target.service": { "name": "target.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "targetclid.service": { "name": "targetclid.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "teamd@.service": { "name": "teamd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "tuned.service": { "name": "tuned.service", "source": "systemd", "state": "running", "status": "enabled" }, "wpa_supplicant.service": { "name": "wpa_supplicant.service", "source": "systemd", "state": 
"inactive", "status": "disabled" } } }, "changed": false } TASK [fedora.linux_system_roles.storage : Set storage_cryptsetup_services] ***** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:57 Friday 17 January 2025 04:16:23 -0500 (0:00:01.059) 0:06:15.940 ******** ok: [managed-node1] => { "ansible_facts": { "storage_cryptsetup_services": [] }, "changed": false } TASK [fedora.linux_system_roles.storage : Mask the systemd cryptsetup services] *** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:63 Friday 17 January 2025 04:16:23 -0500 (0:00:00.075) 0:06:16.016 ******** TASK [fedora.linux_system_roles.storage : Manage the pools and volumes to match the specified state] *** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:69 Friday 17 January 2025 04:16:23 -0500 (0:00:00.051) 0:06:16.068 ******** changed: [managed-node1] => { "actions": [ { "action": "destroy format", "device": "/dev/mapper/luks-0989597c-795f-4f25-b2f2-121f3eb103c2", "fs_type": "xfs" }, { "action": "destroy device", "device": "/dev/mapper/luks-0989597c-795f-4f25-b2f2-121f3eb103c2", "fs_type": null }, { "action": "destroy format", "device": "/dev/sda1", "fs_type": "luks" }, { "action": "destroy device", "device": "/dev/sda1", "fs_type": null }, { "action": "destroy format", "device": "/dev/sda", "fs_type": "disklabel" }, { "action": "create format", "device": "/dev/sda", "fs_type": "lvmpv" }, { "action": "create device", "device": "/dev/foo", "fs_type": null }, { "action": "create device", "device": "/dev/mapper/foo-test1", "fs_type": null }, { "action": "create format", "device": "/dev/mapper/foo-test1", "fs_type": "luks" }, { "action": "create device", "device": "/dev/mapper/luks-eb5a5901-32b6-427f-885e-6ee3011725ec", "fs_type": null }, { "action": "create format", "device": "/dev/mapper/luks-eb5a5901-32b6-427f-885e-6ee3011725ec", "fs_type": "xfs" } ], "changed": true, "crypts": [ { "backing_device": "/dev/sda1", "name": "luks-0989597c-795f-4f25-b2f2-121f3eb103c2", "password": "-", "state": "absent" }, { "backing_device": "/dev/mapper/foo-test1", "name": "luks-eb5a5901-32b6-427f-885e-6ee3011725ec", "password": "-", "state": "present" } ], "leaves": [ "/dev/sdb", "/dev/sdc", "/dev/sdd", "/dev/sde", "/dev/sdf", "/dev/sdg", "/dev/sdh", "/dev/sdi", "/dev/xvda1", "/dev/mapper/luks-eb5a5901-32b6-427f-885e-6ee3011725ec" ], "mounts": [ { "fstype": "xfs", "path": "/opt/test1", "src": "/dev/mapper/luks-0989597c-795f-4f25-b2f2-121f3eb103c2", "state": "absent" }, { "dump": 0, "fstype": "xfs", "group": null, "mode": null, "opts": "defaults", "owner": null, "passno": 0, "path": "/opt/test1", "src": "/dev/mapper/luks-eb5a5901-32b6-427f-885e-6ee3011725ec", "state": "mounted" } ], "packages": [ "cryptsetup", "xfsprogs", "e2fsprogs", "lvm2" ], "pools": [ { "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_clevis_pin": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "encryption_tang_thumbprint": null, "encryption_tang_url": null, "grow_to_fill": false, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "shared": false, "state": "present", "type": "lvm", "volumes": [ { "_device": "/dev/mapper/luks-eb5a5901-32b6-427f-885e-6ee3011725ec", "_kernel_device": 
"/dev/dm-1", "_mount_id": "/dev/mapper/luks-eb5a5901-32b6-427f-885e-6ee3011725ec", "_raw_device": "/dev/mapper/foo-test1", "_raw_kernel_device": "/dev/dm-0", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [], "encryption": true, "encryption_cipher": "aes-xts-plain64", "encryption_key": null, "encryption_key_size": 512, "encryption_luks_version": "luks2", "encryption_password": "VALUE_SPECIFIED_IN_NO_LOG_PARAMETER", "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "mount_user": null, "name": "test1", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": "4g", "state": "present", "thin": false, "thin_pool_name": null, "thin_pool_size": null, "type": "lvm", "vdo_pool_size": null } ] } ], "volumes": [] } TASK [fedora.linux_system_roles.storage : Workaround for udev issue on some platforms] *** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:83 Friday 17 January 2025 04:16:36 -0500 (0:00:13.210) 0:06:29.278 ******** skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Check if /etc/fstab is present] ****** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:90 Friday 17 January 2025 04:16:36 -0500 (0:00:00.076) 0:06:29.355 ******** ok: [managed-node1] => { "changed": false, "stat": { "atime": 1737105342.189024, "attr_flags": "e", "attributes": [ "extents" ], "block_size": 4096, "blocks": 8, "charset": "us-ascii", "checksum": "ef352bdeada504723cb2abfbe13930cb2265c3ee", "ctime": 1737105342.1870239, "dev": 51713, "device_type": 0, "executable": false, "exists": true, "gid": 0, "gr_name": "root", "inode": 263644, "isblk": false, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": true, "issock": false, "isuid": false, "mimetype": "text/plain", "mode": "0644", "mtime": 1737105342.1870239, "nlink": 1, "path": "/etc/fstab", "pw_name": "root", "readable": true, "rgrp": true, "roth": true, "rusr": true, "size": 1311, "uid": 0, "version": "18446744072031193646", "wgrp": false, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [fedora.linux_system_roles.storage : Add fingerprint to /etc/fstab if present] *** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:95 Friday 17 January 2025 04:16:37 -0500 (0:00:00.571) 0:06:29.927 ******** ok: [managed-node1] => { "backup": "", "changed": false } TASK [fedora.linux_system_roles.storage : Unmask the systemd cryptsetup services] *** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:113 Friday 17 January 2025 04:16:37 -0500 (0:00:00.446) 0:06:30.374 ******** TASK [fedora.linux_system_roles.storage : Show blivet_output] ****************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:119 Friday 17 January 2025 04:16:37 -0500 (0:00:00.113) 0:06:30.487 ******** ok: 
[managed-node1] => { "blivet_output": { "actions": [ { "action": "destroy format", "device": "/dev/mapper/luks-0989597c-795f-4f25-b2f2-121f3eb103c2", "fs_type": "xfs" }, { "action": "destroy device", "device": "/dev/mapper/luks-0989597c-795f-4f25-b2f2-121f3eb103c2", "fs_type": null }, { "action": "destroy format", "device": "/dev/sda1", "fs_type": "luks" }, { "action": "destroy device", "device": "/dev/sda1", "fs_type": null }, { "action": "destroy format", "device": "/dev/sda", "fs_type": "disklabel" }, { "action": "create format", "device": "/dev/sda", "fs_type": "lvmpv" }, { "action": "create device", "device": "/dev/foo", "fs_type": null }, { "action": "create device", "device": "/dev/mapper/foo-test1", "fs_type": null }, { "action": "create format", "device": "/dev/mapper/foo-test1", "fs_type": "luks" }, { "action": "create device", "device": "/dev/mapper/luks-eb5a5901-32b6-427f-885e-6ee3011725ec", "fs_type": null }, { "action": "create format", "device": "/dev/mapper/luks-eb5a5901-32b6-427f-885e-6ee3011725ec", "fs_type": "xfs" } ], "changed": true, "crypts": [ { "backing_device": "/dev/sda1", "name": "luks-0989597c-795f-4f25-b2f2-121f3eb103c2", "password": "-", "state": "absent" }, { "backing_device": "/dev/mapper/foo-test1", "name": "luks-eb5a5901-32b6-427f-885e-6ee3011725ec", "password": "-", "state": "present" } ], "failed": false, "leaves": [ "/dev/sdb", "/dev/sdc", "/dev/sdd", "/dev/sde", "/dev/sdf", "/dev/sdg", "/dev/sdh", "/dev/sdi", "/dev/xvda1", "/dev/mapper/luks-eb5a5901-32b6-427f-885e-6ee3011725ec" ], "mounts": [ { "fstype": "xfs", "path": "/opt/test1", "src": "/dev/mapper/luks-0989597c-795f-4f25-b2f2-121f3eb103c2", "state": "absent" }, { "dump": 0, "fstype": "xfs", "group": null, "mode": null, "opts": "defaults", "owner": null, "passno": 0, "path": "/opt/test1", "src": "/dev/mapper/luks-eb5a5901-32b6-427f-885e-6ee3011725ec", "state": "mounted" } ], "packages": [ "cryptsetup", "xfsprogs", "e2fsprogs", "lvm2" ], "pools": [ { "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_clevis_pin": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "encryption_tang_thumbprint": null, "encryption_tang_url": null, "grow_to_fill": false, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "shared": false, "state": "present", "type": "lvm", "volumes": [ { "_device": "/dev/mapper/luks-eb5a5901-32b6-427f-885e-6ee3011725ec", "_kernel_device": "/dev/dm-1", "_mount_id": "/dev/mapper/luks-eb5a5901-32b6-427f-885e-6ee3011725ec", "_raw_device": "/dev/mapper/foo-test1", "_raw_kernel_device": "/dev/dm-0", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [], "encryption": true, "encryption_cipher": "aes-xts-plain64", "encryption_key": null, "encryption_key_size": 512, "encryption_luks_version": "luks2", "encryption_password": "VALUE_SPECIFIED_IN_NO_LOG_PARAMETER", "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "mount_user": null, "name": "test1", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": 
"4g", "state": "present", "thin": false, "thin_pool_name": null, "thin_pool_size": null, "type": "lvm", "vdo_pool_size": null } ] } ], "volumes": [] } } TASK [fedora.linux_system_roles.storage : Set the list of pools for test verification] *** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:128 Friday 17 January 2025 04:16:37 -0500 (0:00:00.129) 0:06:30.617 ******** ok: [managed-node1] => { "ansible_facts": { "_storage_pools_list": [ { "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_clevis_pin": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "encryption_tang_thumbprint": null, "encryption_tang_url": null, "grow_to_fill": false, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "shared": false, "state": "present", "type": "lvm", "volumes": [ { "_device": "/dev/mapper/luks-eb5a5901-32b6-427f-885e-6ee3011725ec", "_kernel_device": "/dev/dm-1", "_mount_id": "/dev/mapper/luks-eb5a5901-32b6-427f-885e-6ee3011725ec", "_raw_device": "/dev/mapper/foo-test1", "_raw_kernel_device": "/dev/dm-0", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [], "encryption": true, "encryption_cipher": "aes-xts-plain64", "encryption_key": null, "encryption_key_size": 512, "encryption_luks_version": "luks2", "encryption_password": "VALUE_SPECIFIED_IN_NO_LOG_PARAMETER", "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "mount_user": null, "name": "test1", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": "4g", "state": "present", "thin": false, "thin_pool_name": null, "thin_pool_size": null, "type": "lvm", "vdo_pool_size": null } ] } ] }, "changed": false } TASK [fedora.linux_system_roles.storage : Set the list of volumes for test verification] *** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:132 Friday 17 January 2025 04:16:38 -0500 (0:00:00.090) 0:06:30.707 ******** ok: [managed-node1] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false } TASK [fedora.linux_system_roles.storage : Remove obsolete mounts] ************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:148 Friday 17 January 2025 04:16:38 -0500 (0:00:00.094) 0:06:30.802 ******** changed: [managed-node1] => (item={u'src': u'/dev/mapper/luks-0989597c-795f-4f25-b2f2-121f3eb103c2', u'state': u'absent', u'fstype': u'xfs', u'path': u'/opt/test1'}) => { "ansible_loop_var": "mount_info", "changed": true, "dump": "0", "fstab": "/etc/fstab", "fstype": "xfs", "mount_info": { "fstype": "xfs", "path": "/opt/test1", "src": "/dev/mapper/luks-0989597c-795f-4f25-b2f2-121f3eb103c2", "state": "absent" }, "name": "/opt/test1", "opts": "defaults", "passno": "0", "src": "/dev/mapper/luks-0989597c-795f-4f25-b2f2-121f3eb103c2" } TASK [fedora.linux_system_roles.storage : Tell systemd to refresh its view of /etc/fstab] *** task path: 
/tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:159 Friday 17 January 2025 04:16:38 -0500 (0:00:00.468) 0:06:31.270 ******** ok: [managed-node1] => { "changed": false, "name": null, "status": {} } TASK [fedora.linux_system_roles.storage : Set up new/current mounts] *********** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:164 Friday 17 January 2025 04:16:39 -0500 (0:00:00.606) 0:06:31.877 ******** changed: [managed-node1] => (item={u'src': u'/dev/mapper/luks-eb5a5901-32b6-427f-885e-6ee3011725ec', u'group': None, u'dump': 0, u'passno': 0, u'fstype': u'xfs', u'state': u'mounted', u'mode': None, u'owner': None, u'path': u'/opt/test1', u'opts': u'defaults'}) => { "ansible_loop_var": "mount_info", "changed": true, "dump": "0", "fstab": "/etc/fstab", "fstype": "xfs", "mount_info": { "dump": 0, "fstype": "xfs", "group": null, "mode": null, "opts": "defaults", "owner": null, "passno": 0, "path": "/opt/test1", "src": "/dev/mapper/luks-eb5a5901-32b6-427f-885e-6ee3011725ec", "state": "mounted" }, "name": "/opt/test1", "opts": "defaults", "passno": "0", "src": "/dev/mapper/luks-eb5a5901-32b6-427f-885e-6ee3011725ec" } TASK [fedora.linux_system_roles.storage : Manage mount ownership/permissions] *** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:175 Friday 17 January 2025 04:16:39 -0500 (0:00:00.641) 0:06:32.518 ******** skipping: [managed-node1] => (item={u'src': u'/dev/mapper/luks-eb5a5901-32b6-427f-885e-6ee3011725ec', u'group': None, u'dump': 0, u'passno': 0, u'fstype': u'xfs', u'state': u'mounted', u'mode': None, u'owner': None, u'path': u'/opt/test1', u'opts': u'defaults'}) => { "ansible_loop_var": "mount_info", "changed": false, "mount_info": { "dump": 0, "fstype": "xfs", "group": null, "mode": null, "opts": "defaults", "owner": null, "passno": 0, "path": "/opt/test1", "src": "/dev/mapper/luks-eb5a5901-32b6-427f-885e-6ee3011725ec", "state": "mounted" }, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Tell systemd to refresh its view of /etc/fstab] *** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:187 Friday 17 January 2025 04:16:39 -0500 (0:00:00.085) 0:06:32.604 ******** ok: [managed-node1] => { "changed": false, "name": null, "status": {} } TASK [fedora.linux_system_roles.storage : Retrieve facts for the /etc/crypttab file] *** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:195 Friday 17 January 2025 04:16:40 -0500 (0:00:00.584) 0:06:33.188 ******** ok: [managed-node1] => { "changed": false, "stat": { "atime": 1737105346.0990272, "attr_flags": "e", "attributes": [ "extents" ], "block_size": 4096, "blocks": 8, "charset": "us-ascii", "checksum": "d921110cf87ede81f1ca8eb386d20ce2f697c66a", "ctime": 1737105344.0390255, "dev": 51713, "device_type": 0, "executable": false, "exists": true, "gid": 0, "gr_name": "root", "inode": 263648, "isblk": false, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": true, "issock": false, "isuid": false, "mimetype": "text/plain", "mode": "0600", "mtime": 1737105344.0380256, "nlink": 1, "path": "/etc/crypttab", "pw_name": "root", "readable": true, "rgrp": false, "roth": false, "rusr": true, "size": 88, "uid": 0, "version": "18446744072031194513", "wgrp": false, "woth": 
false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [fedora.linux_system_roles.storage : Manage /etc/crypttab to account for changes we just made] *** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:200 Friday 17 January 2025 04:16:41 -0500 (0:00:00.619) 0:06:33.808 ******** changed: [managed-node1] => (item={u'state': u'absent', u'password': u'-', u'name': u'luks-0989597c-795f-4f25-b2f2-121f3eb103c2', u'backing_device': u'/dev/sda1'}) => { "ansible_loop_var": "entry", "backup": "", "changed": true, "entry": { "backing_device": "/dev/sda1", "name": "luks-0989597c-795f-4f25-b2f2-121f3eb103c2", "password": "-", "state": "absent" }, "found": 1 } MSG: 1 line(s) removed changed: [managed-node1] => (item={u'state': u'present', u'password': u'-', u'name': u'luks-eb5a5901-32b6-427f-885e-6ee3011725ec', u'backing_device': u'/dev/mapper/foo-test1'}) => { "ansible_loop_var": "entry", "backup": "", "changed": true, "entry": { "backing_device": "/dev/mapper/foo-test1", "name": "luks-eb5a5901-32b6-427f-885e-6ee3011725ec", "password": "-", "state": "present" } } MSG: line added TASK [fedora.linux_system_roles.storage : Update facts] ************************ task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:222 Friday 17 January 2025 04:16:42 -0500 (0:00:01.249) 0:06:35.058 ******** ok: [managed-node1] TASK [Verify role results] ***************************************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/tests_luks2.yml:406 Friday 17 January 2025 04:16:43 -0500 (0:00:00.856) 0:06:35.915 ******** included: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml for managed-node1 TASK [Print out pool information] ********************************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:2 Friday 17 January 2025 04:16:43 -0500 (0:00:00.170) 0:06:36.086 ******** ok: [managed-node1] => { "_storage_pools_list": [ { "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_clevis_pin": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "encryption_tang_thumbprint": null, "encryption_tang_url": null, "grow_to_fill": false, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "shared": false, "state": "present", "type": "lvm", "volumes": [ { "_device": "/dev/mapper/luks-eb5a5901-32b6-427f-885e-6ee3011725ec", "_kernel_device": "/dev/dm-1", "_mount_id": "/dev/mapper/luks-eb5a5901-32b6-427f-885e-6ee3011725ec", "_raw_device": "/dev/mapper/foo-test1", "_raw_kernel_device": "/dev/dm-0", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [], "encryption": true, "encryption_cipher": "aes-xts-plain64", "encryption_key": null, "encryption_key_size": 512, "encryption_luks_version": "luks2", "encryption_password": "VALUE_SPECIFIED_IN_NO_LOG_PARAMETER", "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": 
"/opt/test1", "mount_user": null, "name": "test1", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": "4g", "state": "present", "thin": false, "thin_pool_name": null, "thin_pool_size": null, "type": "lvm", "vdo_pool_size": null } ] } ] } TASK [Print out volume information] ******************************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:7 Friday 17 January 2025 04:16:43 -0500 (0:00:00.100) 0:06:36.186 ******** skipping: [managed-node1] => {} TASK [Collect info about the volumes.] ***************************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:15 Friday 17 January 2025 04:16:43 -0500 (0:00:00.086) 0:06:36.272 ******** ok: [managed-node1] => { "changed": false, "info": { "/dev/mapper/foo-test1": { "fstype": "crypto_LUKS", "label": "", "mountpoint": "", "name": "/dev/mapper/foo-test1", "size": "4G", "type": "lvm", "uuid": "eb5a5901-32b6-427f-885e-6ee3011725ec" }, "/dev/mapper/luks-eb5a5901-32b6-427f-885e-6ee3011725ec": { "fstype": "xfs", "label": "", "mountpoint": "/opt/test1", "name": "/dev/mapper/luks-eb5a5901-32b6-427f-885e-6ee3011725ec", "size": "4G", "type": "crypt", "uuid": "3787e9fb-e091-45d9-9a5a-737d63dc8d5c" }, "/dev/sda": { "fstype": "LVM2_member", "label": "", "mountpoint": "", "name": "/dev/sda", "size": "10G", "type": "disk", "uuid": "30b7yk-4Y3c-DHs7-HP7U-2G7i-eOL6-HqzhXx" }, "/dev/sdb": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdb", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdc": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdc", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdd": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdd", "size": "1T", "type": "disk", "uuid": "" }, "/dev/sde": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sde", "size": "1T", "type": "disk", "uuid": "" }, "/dev/sdf": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdf", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdg": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdg", "size": "1T", "type": "disk", "uuid": "" }, "/dev/sdh": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdh", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdi": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdi", "size": "10G", "type": "disk", "uuid": "" }, "/dev/xvda": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/xvda", "size": "250G", "type": "disk", "uuid": "" }, "/dev/xvda1": { "fstype": "ext4", "label": "", "mountpoint": "/", "name": "/dev/xvda1", "size": "250G", "type": "partition", "uuid": "c7b7d6a5-fd01-4b9b-bcca-153eaff9d312" } } } TASK [Read the /etc/fstab file for volume existence] *************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:20 Friday 17 January 2025 04:16:44 -0500 (0:00:00.492) 0:06:36.765 ******** ok: [managed-node1] => { "changed": false, "cmd": [ "cat", "/etc/fstab" ], "delta": "0:00:00.003276", "end": "2025-01-17 04:16:44.549139", "rc": 0, "start": "2025-01-17 04:16:44.545863" } STDOUT: # system_role:storage # # /etc/fstab # Created by anaconda on Thu Jun 20 10:23:46 2024 # # Accessible filesystems, by reference, are maintained under '/dev/disk' 
# See man pages fstab(5), findfs(8), mount(8) and/or blkid(8) for more info # UUID=c7b7d6a5-fd01-4b9b-bcca-153eaff9d312 / ext4 defaults 1 1 ntap-bos-c01-eng01-nfs01b.storage.bos.redhat.com:/devops_engineering_nfs/devarchive/redhat /mnt/redhat nfs ro,rsize=32768,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0 nest.test.redhat.com:/mnt/qa /mnt/qa nfs defaults,rsize=8192,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0 vtap-eng01.storage.rdu2.redhat.com:/vol/engarchive /mnt/engarchive nfs ro,rsize=32768,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0 nest.test.redhat.com:/mnt/tpsdist /mnt/tpsdist nfs defaults,rsize=8192,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0 ntap-bos-c01-eng01-nfs01b.storage.bos.redhat.com:/devops_engineering_nfs/devarchive/redhat/brewroot /mnt/brew nfs ro,rsize=32768,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0 ntap-bos-c01-eng01-nfs01b.storage.bos.redhat.com:/devops_brew_scratch_nfs/scratch /mnt/brew_scratch nfs ro,rsize=32768,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0 /dev/mapper/luks-eb5a5901-32b6-427f-885e-6ee3011725ec /opt/test1 xfs defaults 0 0 TASK [Read the /etc/crypttab file] ********************************************* task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:25 Friday 17 January 2025 04:16:44 -0500 (0:00:00.646) 0:06:37.412 ******** ok: [managed-node1] => { "changed": false, "cmd": [ "cat", "/etc/crypttab" ], "delta": "0:00:00.002790", "end": "2025-01-17 04:16:45.197769", "failed_when_result": false, "rc": 0, "start": "2025-01-17 04:16:45.194979" } STDOUT: luks-eb5a5901-32b6-427f-885e-6ee3011725ec /dev/mapper/foo-test1 - TASK [Verify the volumes listed in storage_pools were correctly managed] ******* task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:34 Friday 17 January 2025 04:16:45 -0500 (0:00:00.601) 0:06:38.013 ******** included: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool.yml for managed-node1 TASK [Set _storage_pool_tests] ************************************************* task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool.yml:5 Friday 17 January 2025 04:16:45 -0500 (0:00:00.158) 0:06:38.172 ******** ok: [managed-node1] => { "ansible_facts": { "_storage_pool_tests": [ "members", "volumes" ] }, "changed": false } TASK [Get VG shared value status] ********************************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool.yml:18 Friday 17 January 2025 04:16:45 -0500 (0:00:00.071) 0:06:38.244 ******** ok: [managed-node1] => { "changed": false, "cmd": [ "vgs", "--noheadings", "--binary", "-o", "shared", "foo" ], "delta": "0:00:00.018949", "end": "2025-01-17 04:16:45.963758", "rc": 0, "start": "2025-01-17 04:16:45.944809" } STDOUT: 0 TASK [Verify that VG shared value checks out] ********************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool.yml:24 Friday 17 January 2025 04:16:46 -0500 (0:00:00.585) 0:06:38.830 ******** ok: [managed-node1] => { "changed": false } MSG: All assertions passed TASK [Verify pool subset] ****************************************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool.yml:34 Friday 17 January 2025 04:16:46 -0500 
(0:00:00.075) 0:06:38.905 ********
included: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml for managed-node1
included: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-volumes.yml for managed-node1

TASK [Set test variables] ******************************************************
task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:2
Friday 17 January 2025 04:16:46 -0500 (0:00:00.156) 0:06:39.062 ********
ok: [managed-node1] => { "ansible_facts": { "_storage_test_expected_pv_count": "1", "_storage_test_pool_pvs_lvm": [ "/dev/sda" ] }, "changed": false }

TASK [Get the canonical device path for each member device] ********************
task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:8
Friday 17 January 2025 04:16:46 -0500 (0:00:00.112) 0:06:39.175 ********
ok: [managed-node1] => (item=/dev/sda) => { "ansible_loop_var": "pv", "changed": false, "device": "/dev/sda", "pv": "/dev/sda" }

TASK [Set pvs lvm length] ******************************************************
task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:17
Friday 17 January 2025 04:16:47 -0500 (0:00:00.864) 0:06:40.039 ********
ok: [managed-node1] => { "ansible_facts": { "__pvs_lvm_len": "1" }, "changed": false }

TASK [Set pool pvs] ************************************************************
task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:22
Friday 17 January 2025 04:16:47 -0500 (0:00:00.157) 0:06:40.197 ********
ok: [managed-node1] => { "ansible_facts": { "_storage_test_pool_pvs": [ "/dev/sda" ] }, "changed": false }

TASK [Verify PV count] *********************************************************
task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:27
Friday 17 January 2025 04:16:47 -0500 (0:00:00.067) 0:06:40.264 ********
ok: [managed-node1] => { "changed": false }
MSG: All assertions passed

TASK [Set expected pv type] ****************************************************
task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:36
Friday 17 January 2025 04:16:47 -0500 (0:00:00.078) 0:06:40.343 ********
ok: [managed-node1] => { "ansible_facts": { "_storage_test_expected_pv_type": "disk" }, "changed": false }

TASK [Set expected pv type] ****************************************************
task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:41
Friday 17 January 2025 04:16:47 -0500 (0:00:00.086) 0:06:40.430 ********
ok: [managed-node1] => { "ansible_facts": { "_storage_test_expected_pv_type": "disk" }, "changed": false }

TASK [Set expected pv type] ****************************************************
task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:46
Friday 17 January 2025 04:16:47 -0500 (0:00:00.090) 0:06:40.520 ********
skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" }
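The PV bookkeeping above (expected count "1", member /dev/sda) can be cross-checked with a single LVM query. A sketch, assuming vgs is available as in the "Get VG shared value status" task earlier; task names and register variables here are illustrative:

    - name: Count the physical volumes backing volume group "foo"
      ansible.builtin.command: vgs --noheadings -o pv_count foo
      register: foo_pv_count
      changed_when: false

    - name: Assert the VG sits on exactly one PV (/dev/sda in this run)
      ansible.builtin.assert:
        that:
          - foo_pv_count.stdout | trim == '1'

TASK [Check the type of each PV] ***********************************************
task path: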
/tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:51 Friday 17 January 2025 04:16:47 -0500 (0:00:00.107) 0:06:40.628 ******** ok: [managed-node1] => (item=/dev/sda) => { "ansible_loop_var": "pv", "changed": false, "pv": "/dev/sda" } MSG: All assertions passed TASK [Check that blivet supports PV grow to fill] ****************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:64 Friday 17 January 2025 04:16:48 -0500 (0:00:00.166) 0:06:40.794 ******** ok: [managed-node1] => { "changed": false, "rc": 0 } STDOUT: False STDERR: Shared connection to 10.31.46.65 closed. TASK [Verify that PVs fill the whole devices when they should] ***************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:73 Friday 17 January 2025 04:16:48 -0500 (0:00:00.557) 0:06:41.352 ******** skipping: [managed-node1] => (item=/dev/sda) => { "ansible_loop_var": "st_pool_pv", "changed": false, "skip_reason": "Conditional result was False", "st_pool_pv": "/dev/sda" } TASK [Check MD RAID] *********************************************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:83 Friday 17 January 2025 04:16:48 -0500 (0:00:00.067) 0:06:41.420 ******** included: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml for managed-node1 TASK [Get information about RAID] ********************************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:8 Friday 17 January 2025 04:16:48 -0500 (0:00:00.150) 0:06:41.571 ******** skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set active devices regex] ************************************************ task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:14 Friday 17 January 2025 04:16:48 -0500 (0:00:00.076) 0:06:41.648 ******** skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set spare devices regex] ************************************************* task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:19 Friday 17 January 2025 04:16:49 -0500 (0:00:00.056) 0:06:41.704 ******** skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set md version regex] **************************************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:24 Friday 17 January 2025 04:16:49 -0500 (0:00:00.084) 0:06:41.788 ******** skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set md chunk size regex] ************************************************* task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:29 Friday 17 January 2025 04:16:49 -0500 (0:00:00.067) 0:06:41.856 ******** skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Parse the chunk size] **************************************************** task path: 
/tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:37 Friday 17 January 2025 04:16:49 -0500 (0:00:00.114) 0:06:41.970 ******** skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check RAID active devices count] ***************************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:46 Friday 17 January 2025 04:16:49 -0500 (0:00:00.104) 0:06:42.075 ******** skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check RAID spare devices count] ****************************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:55 Friday 17 January 2025 04:16:49 -0500 (0:00:00.063) 0:06:42.138 ******** skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check RAID metadata version] ********************************************* task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:64 Friday 17 January 2025 04:16:49 -0500 (0:00:00.076) 0:06:42.214 ******** skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check RAID chunk size] *************************************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:74 Friday 17 January 2025 04:16:49 -0500 (0:00:00.113) 0:06:42.328 ******** skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Reset variables used by tests] ******************************************* task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:83 Friday 17 January 2025 04:16:49 -0500 (0:00:00.076) 0:06:42.404 ******** ok: [managed-node1] => { "ansible_facts": { "storage_test_md_active_devices_re": null, "storage_test_md_chunk_size_re": null, "storage_test_md_metadata_version_re": null, "storage_test_md_spare_devices_re": null }, "changed": false } TASK [Check LVM RAID] ********************************************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:86 Friday 17 January 2025 04:16:49 -0500 (0:00:00.075) 0:06:42.480 ******** included: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-lvmraid.yml for managed-node1 TASK [Validate pool member LVM RAID settings] ********************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-lvmraid.yml:2 Friday 17 January 2025 04:16:49 -0500 (0:00:00.127) 0:06:42.608 ******** included: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-lvmraid.yml for managed-node1 TASK [Get information about the LV] ******************************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-lvmraid.yml:8 Friday 17 January 2025 04:16:50 -0500 (0:00:00.127) 0:06:42.736 ******** skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set LV segment type] ***************************************************** task path: 
/tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-lvmraid.yml:16 Friday 17 January 2025 04:16:50 -0500 (0:00:00.046) 0:06:42.782 ******** skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check segment type] ****************************************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-lvmraid.yml:20 Friday 17 January 2025 04:16:50 -0500 (0:00:00.066) 0:06:42.849 ******** skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set LV stripe size] ****************************************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-lvmraid.yml:27 Friday 17 January 2025 04:16:50 -0500 (0:00:00.065) 0:06:42.915 ******** skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Parse the requested stripe size] ***************************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-lvmraid.yml:31 Friday 17 January 2025 04:16:50 -0500 (0:00:00.064) 0:06:42.979 ******** skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set expected stripe size] ************************************************ task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-lvmraid.yml:37 Friday 17 January 2025 04:16:50 -0500 (0:00:00.059) 0:06:43.039 ******** skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check stripe size] ******************************************************* task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-lvmraid.yml:42 Friday 17 January 2025 04:16:50 -0500 (0:00:00.058) 0:06:43.098 ******** skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check Thin Pools] ******************************************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:89 Friday 17 January 2025 04:16:50 -0500 (0:00:00.060) 0:06:43.158 ******** included: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-thin.yml for managed-node1 TASK [Validate pool member thinpool settings] ********************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-thin.yml:2 Friday 17 January 2025 04:16:50 -0500 (0:00:00.130) 0:06:43.289 ******** included: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-thin.yml for managed-node1 TASK [Get information about thinpool] ****************************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-thin.yml:8 Friday 17 January 2025 04:16:50 -0500 (0:00:00.156) 0:06:43.445 ******** skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check that volume is in correct thinpool (when thinp name is provided)] *** task path: 
/tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-thin.yml:16 Friday 17 January 2025 04:16:50 -0500 (0:00:00.093) 0:06:43.539 ******** skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check that volume is in thinpool (when thinp name is not provided)] ****** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-thin.yml:22 Friday 17 January 2025 04:16:50 -0500 (0:00:00.062) 0:06:43.601 ******** skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Reset variable used by test] ********************************************* task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-thin.yml:26 Friday 17 January 2025 04:16:50 -0500 (0:00:00.062) 0:06:43.663 ******** ok: [managed-node1] => { "ansible_facts": { "storage_test_thin_status": null }, "changed": false } TASK [Check member encryption] ************************************************* task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:92 Friday 17 January 2025 04:16:51 -0500 (0:00:00.096) 0:06:43.759 ******** included: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-encryption.yml for managed-node1 TASK [Set test variables] ****************************************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-encryption.yml:5 Friday 17 January 2025 04:16:51 -0500 (0:00:00.160) 0:06:43.920 ******** ok: [managed-node1] => { "ansible_facts": { "_storage_test_expected_crypttab_entries": "0", "_storage_test_expected_crypttab_key_file": "-" }, "changed": false } TASK [Validate pool member LUKS settings] ************************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-encryption.yml:10 Friday 17 January 2025 04:16:51 -0500 (0:00:00.073) 0:06:43.993 ******** skipping: [managed-node1] => (item=/dev/sda) => { "_storage_test_pool_member_path": "/dev/sda", "ansible_loop_var": "_storage_test_pool_member_path", "changed": false, "skip_reason": "Conditional result was False" } TASK [Validate pool member crypttab entries] *********************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-encryption.yml:17 Friday 17 January 2025 04:16:51 -0500 (0:00:00.062) 0:06:44.056 ******** included: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-crypttab.yml for managed-node1 TASK [Set variables used by tests] ********************************************* task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-crypttab.yml:2 Friday 17 January 2025 04:16:51 -0500 (0:00:00.107) 0:06:44.163 ******** ok: [managed-node1] => { "ansible_facts": { "_storage_test_crypttab_entries": [] }, "changed": false } TASK [Check for /etc/crypttab entry] ******************************************* task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-crypttab.yml:6 Friday 17 January 2025 04:16:51 -0500 (0:00:00.065) 0:06:44.228 ******** ok: [managed-node1] => { "changed": false } MSG: All assertions 
passed TASK [Validate the format of the crypttab entry] ******************************* task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-crypttab.yml:14 Friday 17 January 2025 04:16:51 -0500 (0:00:00.067) 0:06:44.296 ******** skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check backing device of crypttab entry] ********************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-crypttab.yml:23 Friday 17 January 2025 04:16:51 -0500 (0:00:00.077) 0:06:44.374 ******** skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check key file of crypttab entry] **************************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-crypttab.yml:32 Friday 17 January 2025 04:16:51 -0500 (0:00:00.083) 0:06:44.457 ******** skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Clear test variables] **************************************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-crypttab.yml:41 Friday 17 January 2025 04:16:51 -0500 (0:00:00.157) 0:06:44.615 ******** ok: [managed-node1] => { "ansible_facts": { "_storage_test_crypttab_entries": null }, "changed": false } TASK [Clear test variables] **************************************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-encryption.yml:24 Friday 17 January 2025 04:16:51 -0500 (0:00:00.062) 0:06:44.677 ******** ok: [managed-node1] => { "ansible_facts": { "_storage_test_crypttab_entries": null, "_storage_test_crypttab_key_file": null }, "changed": false } TASK [Check VDO] *************************************************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:95 Friday 17 January 2025 04:16:52 -0500 (0:00:00.068) 0:06:44.746 ******** included: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-vdo.yml for managed-node1 TASK [Validate pool member VDO settings] *************************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-vdo.yml:2 Friday 17 January 2025 04:16:52 -0500 (0:00:00.137) 0:06:44.883 ******** included: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-vdo.yml for managed-node1 TASK [Get information about VDO deduplication] ********************************* task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-vdo.yml:8 Friday 17 January 2025 04:16:52 -0500 (0:00:00.137) 0:06:45.021 ******** skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check if VDO deduplication is off] *************************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-vdo.yml:15 Friday 17 January 2025 04:16:52 -0500 (0:00:00.063) 0:06:45.085 ******** skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check if VDO 
deduplication is on] **************************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-vdo.yml:21 Friday 17 January 2025 04:16:52 -0500 (0:00:00.063) 0:06:45.148 ******** skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Get information about VDO compression] *********************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-vdo.yml:27 Friday 17 January 2025 04:16:52 -0500 (0:00:00.062) 0:06:45.211 ******** skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check if VDO deduplication is off] *************************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-vdo.yml:34 Friday 17 January 2025 04:16:52 -0500 (0:00:00.062) 0:06:45.273 ******** skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check if VDO deduplication is on] **************************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-vdo.yml:40 Friday 17 January 2025 04:16:52 -0500 (0:00:00.062) 0:06:45.335 ******** skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Clear test variables] **************************************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-vdo.yml:46 Friday 17 January 2025 04:16:52 -0500 (0:00:00.061) 0:06:45.397 ******** ok: [managed-node1] => { "ansible_facts": { "storage_test_vdo_status": null }, "changed": false } TASK [Check Stratis] *********************************************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:98 Friday 17 January 2025 04:16:52 -0500 (0:00:00.057) 0:06:45.455 ******** included: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-stratis.yml for managed-node1 TASK [Run 'stratis report'] **************************************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-stratis.yml:6 Friday 17 January 2025 04:16:52 -0500 (0:00:00.152) 0:06:45.608 ******** skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Get information about Stratis] ******************************************* task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-stratis.yml:11 Friday 17 January 2025 04:16:52 -0500 (0:00:00.059) 0:06:45.668 ******** skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify that the pools was created] *************************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-stratis.yml:15 Friday 17 January 2025 04:16:53 -0500 (0:00:00.068) 0:06:45.736 ******** skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify that encryption is correctly set] ********************************* task path: 
/tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-stratis.yml:25 Friday 17 January 2025 04:16:53 -0500 (0:00:00.081) 0:06:45.818 ******** skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify that Clevis/Tang encryption is correctly set] ********************* task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-stratis.yml:34 Friday 17 January 2025 04:16:53 -0500 (0:00:00.084) 0:06:45.902 ******** skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Reset variable used by test] ********************************************* task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-stratis.yml:44 Friday 17 January 2025 04:16:53 -0500 (0:00:00.065) 0:06:45.968 ******** ok: [managed-node1] => { "ansible_facts": { "storage_test_stratis_report": null }, "changed": false } TASK [Clean up test variables] ************************************************* task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:101 Friday 17 January 2025 04:16:53 -0500 (0:00:00.073) 0:06:46.042 ******** ok: [managed-node1] => { "ansible_facts": { "__pvs_lvm_len": null, "_storage_test_expected_pv_count": null, "_storage_test_expected_pv_type": null, "_storage_test_pool_pvs": [], "_storage_test_pool_pvs_lvm": [] }, "changed": false } TASK [Verify the volumes] ****************************************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-volumes.yml:3 Friday 17 January 2025 04:16:53 -0500 (0:00:00.063) 0:06:46.105 ******** included: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume.yml for managed-node1 TASK [Set storage volume test variables] *************************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume.yml:2 Friday 17 January 2025 04:16:53 -0500 (0:00:00.175) 0:06:46.281 ******** ok: [managed-node1] => { "ansible_facts": { "_storage_test_volume_present": true, "_storage_volume_tests": [ "mount", "fstab", "fs", "device", "encryption", "md", "size", "cache" ] }, "changed": false } TASK [Run test verify for {{ storage_test_volume_subset }}] ******************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume.yml:19 Friday 17 January 2025 04:16:53 -0500 (0:00:00.074) 0:06:46.355 ******** included: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml for managed-node1 included: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml for managed-node1 included: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fs.yml for managed-node1 included: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml for managed-node1 included: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml for managed-node1 included: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml for managed-node1 included: 
/tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml for managed-node1
included: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml for managed-node1

TASK [Get expected mount device based on device type] **************************
task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:7
Friday 17 January 2025 04:16:53 -0500 (0:00:00.340) 0:06:46.696 ********
ok: [managed-node1] => { "ansible_facts": { "storage_test_device_path": "/dev/mapper/luks-eb5a5901-32b6-427f-885e-6ee3011725ec" }, "changed": false }

TASK [Set some facts] **********************************************************
task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:11
Friday 17 January 2025 04:16:54 -0500 (0:00:00.079) 0:06:46.775 ********
ok: [managed-node1] => { "ansible_facts": { "storage_test_mount_expected_mount_point": "/opt/test1", "storage_test_swap_expected_matches": "0" }, "changed": false }

TASK [Get information about the mountpoint directory] **************************
task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:19
Friday 17 January 2025 04:16:54 -0500 (0:00:00.074) 0:06:46.849 ********
skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Verify the current mount state by device] ********************************
task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:28
Friday 17 January 2025 04:16:54 -0500 (0:00:00.069) 0:06:46.919 ********
ok: [managed-node1] => { "changed": false }
MSG: All assertions passed

TASK [Verify mount directory user] *********************************************
task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:36
Friday 17 January 2025 04:16:54 -0500 (0:00:00.112) 0:06:47.031 ********
skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Verify mount directory group] ********************************************
task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:42
Friday 17 January 2025 04:16:54 -0500 (0:00:00.076) 0:06:47.108 ********
skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Verify mount directory permissions] **************************************
task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:48
Friday 17 January 2025 04:16:54 -0500 (0:00:00.060) 0:06:47.169 ********
skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Get path of test volume device] ******************************************
task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:57
Friday 17 January 2025 04:16:54 -0500 (0:00:00.069) 0:06:47.238 ********
skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" }
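"Verify the current mount state by device" reduces to checking which source device backs /opt/test1. A hedged equivalent using findmnt from util-linux (assumed present on the managed node; task and register names are illustrative):

    - name: Look up the source device mounted at /opt/test1
      ansible.builtin.command: findmnt -n -o SOURCE /opt/test1
      register: test1_mount
      changed_when: false

    - name: Assert the LUKS mapper device from this run is what is mounted
      ansible.builtin.assert:
        that:
          - test1_mount.stdout == '/dev/mapper/luks-eb5a5901-32b6-427f-885e-6ee3011725ec'

TASK [Gather swap info] ********************************************************
task path: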
/tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:63
Friday 17 January 2025 04:16:54 -0500 (0:00:00.056) 0:06:47.295 ********
skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Verify swap status] ******************************************************
task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:69
Friday 17 January 2025 04:16:54 -0500 (0:00:00.057) 0:06:47.353 ********
skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Unset facts] *************************************************************
task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:79
Friday 17 January 2025 04:16:54 -0500 (0:00:00.074) 0:06:47.428 ********
ok: [managed-node1] => { "ansible_facts": { "storage_test_found_mount_stat": null, "storage_test_mount_expected_mount_point": null, "storage_test_swap_expected_matches": null, "storage_test_swaps": null, "storage_test_sys_node": null }, "changed": false }

TASK [Set some variables for fstab checking] ***********************************
task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:2
Friday 17 January 2025 04:16:54 -0500 (0:00:00.113) 0:06:47.541 ********
ok: [managed-node1] => { "ansible_facts": { "storage_test_fstab_expected_id_matches": "1", "storage_test_fstab_expected_mount_options_matches": "1", "storage_test_fstab_expected_mount_point_matches": "1", "storage_test_fstab_id_matches": [ "/dev/mapper/luks-eb5a5901-32b6-427f-885e-6ee3011725ec " ], "storage_test_fstab_mount_options_matches": [ " /opt/test1 xfs defaults " ], "storage_test_fstab_mount_point_matches": [ " /opt/test1 " ] }, "changed": false }

TASK [Verify that the device identifier appears in /etc/fstab] *****************
task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:17
Friday 17 January 2025 04:16:55 -0500 (0:00:00.158) 0:06:47.700 ********
ok: [managed-node1] => { "changed": false }
MSG: All assertions passed

TASK [Verify the fstab mount point] ********************************************
task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:24
Friday 17 January 2025 04:16:55 -0500 (0:00:00.116) 0:06:47.817 ********
ok: [managed-node1] => { "changed": false }
MSG: All assertions passed

TASK [Verify mount_options] ****************************************************
task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:33
Friday 17 January 2025 04:16:55 -0500 (0:00:00.076) 0:06:47.893 ********
skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Verify fingerprint] ******************************************************
task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:45
Friday 17 January 2025 04:16:55 -0500 (0:00:00.071) 0:06:47.964 ********
ok: [managed-node1] => { "changed": false }
MSG: All assertions passed
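The fstab matching facts above count occurrences of the device identifier, mount point, and options in /etc/fstab. A rough single-pattern equivalent; the grep regex is an assumption based on the fstab line printed earlier in this log:

    - name: Count fstab entries that mount the LUKS device on /opt/test1
      ansible.builtin.command: grep -c '^/dev/mapper/luks-eb5a5901-32b6-427f-885e-6ee3011725ec /opt/test1 ' /etc/fstab
      register: fstab_matches
      changed_when: false

    - name: Assert exactly one matching line, as the expected_*_matches facts require
      ansible.builtin.assert:
        that:
          - fstab_matches.stdout | trim == '1'

TASK [Clean up variables] ******************************************************
task path: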
/tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:52 Friday 17 January 2025 04:16:55 -0500 (0:00:00.081) 0:06:48.046 ******** ok: [managed-node1] => { "ansible_facts": { "storage_test_fstab_expected_id_matches": null, "storage_test_fstab_expected_mount_options_matches": null, "storage_test_fstab_expected_mount_point_matches": null, "storage_test_fstab_id_matches": null, "storage_test_fstab_mount_options_matches": null, "storage_test_fstab_mount_point_matches": null }, "changed": false } TASK [Verify fs type] ********************************************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fs.yml:6 Friday 17 January 2025 04:16:55 -0500 (0:00:00.048) 0:06:48.095 ******** ok: [managed-node1] => { "changed": false } MSG: All assertions passed TASK [Verify fs label] ********************************************************* task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fs.yml:14 Friday 17 January 2025 04:16:55 -0500 (0:00:00.055) 0:06:48.150 ******** ok: [managed-node1] => { "changed": false } MSG: All assertions passed TASK [See whether the device node is present] ********************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:3 Friday 17 January 2025 04:16:55 -0500 (0:00:00.056) 0:06:48.207 ******** ok: [managed-node1] => { "changed": false, "stat": { "atime": 1737105396.3190718, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 0, "charset": "binary", "ctime": 1737105396.3190718, "dev": 5, "device_type": 64768, "executable": false, "exists": true, "gid": 6, "gr_name": "disk", "inode": 100994, "isblk": true, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": false, "issock": false, "isuid": false, "mimetype": "inode/symlink", "mode": "0660", "mtime": 1737105396.3190718, "nlink": 1, "path": "/dev/mapper/foo-test1", "pw_name": "root", "readable": true, "rgrp": true, "roth": false, "rusr": true, "size": 0, "uid": 0, "version": null, "wgrp": true, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [Verify the presence/absence of the device node] ************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:9 Friday 17 January 2025 04:16:56 -0500 (0:00:00.491) 0:06:48.698 ******** ok: [managed-node1] => { "changed": false } MSG: All assertions passed TASK [Verify the presence/absence of the device node] ************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:16 Friday 17 January 2025 04:16:56 -0500 (0:00:00.108) 0:06:48.807 ******** skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Make sure we got info about this volume] ********************************* task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:23 Friday 17 January 2025 04:16:56 -0500 (0:00:00.100) 0:06:48.907 ******** ok: [managed-node1] => { "changed": false } MSG: All assertions passed TASK [Process volume type (set initial value) (1/2)] *************************** task path: 
/tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:29 Friday 17 January 2025 04:16:56 -0500 (0:00:00.094) 0:06:49.002 ******** ok: [managed-node1] => { "ansible_facts": { "st_volume_type": "lvm" }, "changed": false } TASK [Process volume type (get RAID value) (2/2)] ****************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:33 Friday 17 January 2025 04:16:56 -0500 (0:00:00.112) 0:06:49.115 ******** skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify the volume's device type] ***************************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:38 Friday 17 January 2025 04:16:56 -0500 (0:00:00.066) 0:06:49.181 ******** ok: [managed-node1] => { "changed": false } MSG: All assertions passed TASK [Stat the LUKS device, if encrypted] ************************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:3 Friday 17 January 2025 04:16:56 -0500 (0:00:00.085) 0:06:49.267 ******** ok: [managed-node1] => { "changed": false, "stat": { "atime": 1737105396.429072, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 0, "charset": "binary", "ctime": 1737105396.429072, "dev": 5, "device_type": 64769, "executable": false, "exists": true, "gid": 6, "gr_name": "disk", "inode": 101064, "isblk": true, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": false, "issock": false, "isuid": false, "mimetype": "inode/symlink", "mode": "0660", "mtime": 1737105396.429072, "nlink": 1, "path": "/dev/mapper/luks-eb5a5901-32b6-427f-885e-6ee3011725ec", "pw_name": "root", "readable": true, "rgrp": true, "roth": false, "rusr": true, "size": 0, "uid": 0, "version": null, "wgrp": true, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [Ensure cryptsetup is present] ******************************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:10 Friday 17 January 2025 04:16:57 -0500 (0:00:00.578) 0:06:49.845 ******** ok: [managed-node1] => { "changed": false, "rc": 0, "results": [ "cryptsetup-2.0.3-6.el7.x86_64 providing cryptsetup is already installed" ] } TASK [Collect LUKS info for this volume] *************************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:16 Friday 17 January 2025 04:16:58 -0500 (0:00:00.892) 0:06:50.738 ******** ok: [managed-node1] => { "changed": false, "cmd": [ "cryptsetup", "luksDump", "/dev/mapper/foo-test1" ], "delta": "0:00:00.026294", "end": "2025-01-17 04:16:58.549407", "rc": 0, "start": "2025-01-17 04:16:58.523113" } STDOUT: LUKS header information Version: 2 Epoch: 3 Metadata area: 12288 bytes UUID: eb5a5901-32b6-427f-885e-6ee3011725ec Label: (no label) Subsystem: (no subsystem) Flags: (no flags) Data segments: 0: crypt offset: 4194304 [bytes] length: (whole device) cipher: aes-xts-plain64 sector: 512 [bytes] Keyslots: 0: luks2 Key: 512 bits Priority: normal Cipher: aes-xts-plain64 PBKDF: argon2i Time cost: 4 Memory: 667697 Threads: 2 Salt: 6f 9a 3e 9e e2 0e 46 09 0b ee 55 de 2a c2 30 3d a3 72 ce ce ba f4 8f f2 
ae a5 4b 29 89 7d 31 8c AF stripes: 4000 Area offset:32768 [bytes] Area length:258048 [bytes] Digest ID: 0 Tokens: Digests: 0: pbkdf2 Hash: sha256 Iterations: 23141 Salt: 19 79 c4 14 3b ce 5b bf 2d ae cd bf 0a 2c 95 77 51 24 2d fd 1a e4 0f 39 25 31 bf 94 11 34 3e 49 Digest: e1 b4 3c 15 00 ed 6e 18 67 dc 86 c7 a5 e7 10 8d 14 b0 93 e4 06 27 e1 9e 9b 78 64 e2 5f 21 0d 5d

TASK [Verify the presence/absence of the LUKS device node] *********************
task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:22
Friday 17 January 2025 04:16:58 -0500 (0:00:00.612) 0:06:51.351 ********
ok: [managed-node1] => { "changed": false }
MSG: All assertions passed

TASK [Verify that the raw device is the same as the device if not encrypted] ***
task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:29
Friday 17 January 2025 04:16:58 -0500 (0:00:00.086) 0:06:51.437 ********
ok: [managed-node1] => { "changed": false }
MSG: All assertions passed

TASK [Make sure we got info about the LUKS volume if encrypted] ****************
task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:40
Friday 17 January 2025 04:16:58 -0500 (0:00:00.078) 0:06:51.516 ********
ok: [managed-node1] => { "changed": false }
MSG: All assertions passed

TASK [Verify the LUKS volume's device type if encrypted] ***********************
task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:46
Friday 17 January 2025 04:16:58 -0500 (0:00:00.097) 0:06:51.613 ********
ok: [managed-node1] => { "changed": false }
MSG: All assertions passed

TASK [Check LUKS version] ******************************************************
task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:51
Friday 17 January 2025 04:16:59 -0500 (0:00:00.094) 0:06:51.707 ********
ok: [managed-node1] => { "changed": false }
MSG: All assertions passed

TASK [Check LUKS key size] *****************************************************
task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:63
Friday 17 January 2025 04:16:59 -0500 (0:00:00.079) 0:06:51.787 ********
ok: [managed-node1] => { "changed": false }
MSG: All assertions passed

TASK [Check LUKS cipher] *******************************************************
task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:75
Friday 17 January 2025 04:16:59 -0500 (0:00:00.084) 0:06:51.872 ********
ok: [managed-node1] => { "changed": false }
MSG: All assertions passed

TASK [Set test variables] ******************************************************
task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:87
Friday 17 January 2025 04:16:59 -0500 (0:00:00.079) 0:06:51.951 ********
ok: [managed-node1] => { "ansible_facts": { "_storage_test_crypttab_entries": [ "luks-eb5a5901-32b6-427f-885e-6ee3011725ec /dev/mapper/foo-test1 -" ], "_storage_test_expected_crypttab_entries": "1", "_storage_test_expected_crypttab_key_file": "-" }, "changed": false }
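The LUKS assertions above and the crypttab checks that follow all read from two commands already shown in this log. A condensed sketch of the same verification; the regexes are assumptions about the cryptsetup luksDump output layout printed above, and the task names are illustrative:

    - name: Dump the LUKS header of the raw LV (same command the test ran)
      ansible.builtin.command: cryptsetup luksDump /dev/mapper/foo-test1
      register: luks_dump
      changed_when: false

    - name: Read /etc/crypttab on the managed node
      ansible.builtin.command: cat /etc/crypttab
      register: crypttab
      changed_when: false

    - name: Assert LUKS2, a 512-bit aes-xts-plain64 key, and a single crypttab entry
      ansible.builtin.assert:
        that:
          - luks_dump.stdout is search('Version:\s+2')
          - luks_dump.stdout is search('Key:\s+512 bits')
          - luks_dump.stdout is search('cipher:\s+aes-xts-plain64')
          - crypttab.stdout_lines | length == 1
          - crypttab.stdout.split()[1] == '/dev/mapper/foo-test1'

TASK [Check for /etc/crypttab entry] *******************************************
task path: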
/tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:93 Friday 17 January 2025 04:16:59 -0500 (0:00:00.085) 0:06:52.037 ******** ok: [managed-node1] => { "changed": false } MSG: All assertions passed TASK [Validate the format of the crypttab entry] ******************************* task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:100 Friday 17 January 2025 04:16:59 -0500 (0:00:00.094) 0:06:52.132 ******** ok: [managed-node1] => { "changed": false } MSG: All assertions passed TASK [Check backing device of crypttab entry] ********************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:108 Friday 17 January 2025 04:16:59 -0500 (0:00:00.078) 0:06:52.210 ******** ok: [managed-node1] => { "changed": false } MSG: All assertions passed TASK [Check key file of crypttab entry] **************************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:116 Friday 17 January 2025 04:16:59 -0500 (0:00:00.073) 0:06:52.284 ******** ok: [managed-node1] => { "changed": false } MSG: All assertions passed TASK [Clear test variables] **************************************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:124 Friday 17 January 2025 04:16:59 -0500 (0:00:00.077) 0:06:52.361 ******** ok: [managed-node1] => { "ansible_facts": { "_storage_test_crypttab_entries": null, "_storage_test_expected_crypttab_entries": null, "_storage_test_expected_crypttab_key_file": null }, "changed": false } TASK [Get information about RAID] ********************************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:8 Friday 17 January 2025 04:16:59 -0500 (0:00:00.073) 0:06:52.435 ******** skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set active devices regex] ************************************************ task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:14 Friday 17 January 2025 04:16:59 -0500 (0:00:00.100) 0:06:52.536 ******** skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set spare devices regex] ************************************************* task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:19 Friday 17 January 2025 04:16:59 -0500 (0:00:00.116) 0:06:52.652 ******** skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set md version regex] **************************************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:24 Friday 17 January 2025 04:17:00 -0500 (0:00:00.068) 0:06:52.721 ******** skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set chunk size regex] **************************************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:29 Friday 17 January 2025 04:17:00 -0500 
Friday 17 January 2025 04:17:00 -0500 (0:00:00.060) 0:06:52.781 ********
skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Parse the chunk size] ****************************************************
task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:37
Friday 17 January 2025 04:17:00 -0500 (0:00:00.073) 0:06:52.854 ********
skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Check RAID active devices count] *****************************************
task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:46
Friday 17 January 2025 04:17:00 -0500 (0:00:00.056) 0:06:52.910 ********
skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Check RAID spare devices count] ******************************************
task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:54
Friday 17 January 2025 04:17:00 -0500 (0:00:00.059) 0:06:52.970 ********
skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Check RAID metadata version] *********************************************
task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:62
Friday 17 January 2025 04:17:00 -0500 (0:00:00.058) 0:06:53.028 ********
skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Check RAID chunk size] ***************************************************
task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:70
Friday 17 January 2025 04:17:00 -0500 (0:00:00.098) 0:06:53.127 ********
skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Parse the actual size of the volume] *************************************
task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:3
Friday 17 January 2025 04:17:00 -0500 (0:00:00.102) 0:06:53.229 ********
ok: [managed-node1] => { "bytes": 4294967296, "changed": false, "lvm": "4g", "parted": "4GiB", "size": "4 GiB" }

TASK [Parse the requested size of the volume] **********************************
task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:11
Friday 17 January 2025 04:17:01 -0500 (0:00:00.673) 0:06:53.902 ********
ok: [managed-node1] => { "bytes": 4294967296, "changed": false, "lvm": "4g", "parted": "4GiB", "size": "4 GiB" }

TASK [Establish base value for expected size] **********************************
task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:20
Friday 17 January 2025 04:17:01 -0500 (0:00:00.398) 0:06:54.301 ********
ok: [managed-node1] => { "ansible_facts": { "storage_test_expected_size": "4294967296" }, "changed": false }

TASK [Show expected size] ******************************************************
task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:28
Friday 17 January 2025 04:17:01 -0500 (0:00:00.074) 0:06:54.376 ********
ok: [managed-node1] => { "storage_test_expected_size": "4294967296" }
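The two parse tasks above agree because the requested "4g" is 4 x 1024^3 = 4294967296 bytes, exactly the byte count reported for the LV. The same conversion can be reproduced with Ansible's human_to_bytes filter; a minimal sketch of the comparison (not necessarily the role's exact logic, and reusing the fact names from this test):

    - name: Convert the requested size to bytes (illustrative)
      set_fact:
        # 4 * 1024 * 1024 * 1024 = 4294967296
        storage_test_expected_size: "{{ '4g' | human_to_bytes }}"

    - name: Compare it against the parsed actual size
      assert:
        that:
          - storage_test_expected_size | int == storage_test_actual_size.bytes | int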
TASK [Get the size of parent/pool device] **************************************
task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:32
Friday 17 January 2025 04:17:01 -0500 (0:00:00.063) 0:06:54.439 ********
ok: [managed-node1] => { "bytes": 10737418240, "changed": false, "lvm": "10g", "parted": "10GiB", "size": "10 GiB" }

TASK [Show test pool] **********************************************************
task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:46
Friday 17 January 2025 04:17:02 -0500 (0:00:00.398) 0:06:54.838 ********
skipping: [managed-node1] => {}

TASK [Show test blockinfo] *****************************************************
task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:50
Friday 17 January 2025 04:17:02 -0500 (0:00:00.070) 0:06:54.908 ********
skipping: [managed-node1] => {}

TASK [Show test pool size] *****************************************************
task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:54
Friday 17 January 2025 04:17:02 -0500 (0:00:00.054) 0:06:54.963 ********
skipping: [managed-node1] => {}

TASK [Calculate the expected size based on pool size and percentage value] *****
task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:58
Friday 17 January 2025 04:17:02 -0500 (0:00:00.053) 0:06:55.016 ********
skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Default thin pool reserved space values] *********************************
task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:67
Friday 17 January 2025 04:17:02 -0500 (0:00:00.059) 0:06:55.076 ********
skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Default minimal thin pool reserved space size] ***************************
task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:71
Friday 17 January 2025 04:17:02 -0500 (0:00:00.042) 0:06:55.118 ********
skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Default maximal thin pool reserved space size] ***************************
task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:76
Friday 17 January 2025 04:17:02 -0500 (0:00:00.040) 0:06:55.158 ********
skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Calculate maximum usable space in thin pool] *****************************
task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:82
Friday 17 January 2025 04:17:02 -0500 (0:00:00.039) 0:06:55.198 ********
skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Apply upper size limit to max usable thin pool space] ********************
task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:86
Friday 17 January 2025 04:17:02 -0500 (0:00:00.053) 0:06:55.251 ********
skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" }
TASK [Apply lower size limit to max usable thin pool space] ********************
task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:91
Friday 17 January 2025 04:17:02 -0500 (0:00:00.039) 0:06:55.290 ********
skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Convert maximum usable thin pool space from int to Size] *****************
task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:96
Friday 17 January 2025 04:17:02 -0500 (0:00:00.042) 0:06:55.332 ********
skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Show max thin pool size] *************************************************
task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:101
Friday 17 January 2025 04:17:02 -0500 (0:00:00.039) 0:06:55.372 ********
skipping: [managed-node1] => {}

TASK [Show volume thin pool size] **********************************************
task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:105
Friday 17 January 2025 04:17:02 -0500 (0:00:00.040) 0:06:55.412 ********
skipping: [managed-node1] => {}

TASK [Show test volume size] ***************************************************
task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:109
Friday 17 January 2025 04:17:02 -0500 (0:00:00.039) 0:06:55.452 ********
skipping: [managed-node1] => {}

TASK [Establish base value for expected thin pool size] ************************
task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:113
Friday 17 January 2025 04:17:02 -0500 (0:00:00.039) 0:06:55.491 ********
skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Calculate the expected size based on pool size and percentage value] *****
task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:120
Friday 17 January 2025 04:17:02 -0500 (0:00:00.038) 0:06:55.530 ********
skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Establish base value for expected thin pool volume size] *****************
task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:127
Friday 17 January 2025 04:17:02 -0500 (0:00:00.041) 0:06:55.572 ********
skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Calculate the expected thin pool volume size based on percentage value] ***
task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:131
Friday 17 January 2025 04:17:02 -0500 (0:00:00.040) 0:06:55.612 ********
skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Replace expected volume size with calculated value] **********************
task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:137
Friday 17 January 2025 04:17:02 -0500 (0:00:00.039) 0:06:55.652 ********
skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" }
TASK [Show actual size] ********************************************************
task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:143
Friday 17 January 2025 04:17:02 -0500 (0:00:00.039) 0:06:55.691 ********
ok: [managed-node1] => { "storage_test_actual_size": { "bytes": 4294967296, "changed": false, "failed": false, "lvm": "4g", "parted": "4GiB", "size": "4 GiB" } }

TASK [Show expected size] ******************************************************
task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:147
Friday 17 January 2025 04:17:03 -0500 (0:00:00.043) 0:06:55.734 ********
ok: [managed-node1] => { "storage_test_expected_size": "4294967296" }

TASK [Assert expected size is actual size] *************************************
task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:151
Friday 17 January 2025 04:17:03 -0500 (0:00:00.053) 0:06:55.788 ********
ok: [managed-node1] => { "changed": false } MSG: All assertions passed

TASK [Get information about the LV] ********************************************
task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:5
Friday 17 January 2025 04:17:03 -0500 (0:00:00.085) 0:06:55.873 ********
ok: [managed-node1] => { "changed": false, "cmd": [ "lvs", "--noheadings", "--nameprefixes", "--units=b", "--nosuffix", "--unquoted", "-o", "name,attr,cache_total_blocks,chunk_size,segtype", "foo/test1" ], "delta": "0:00:00.018946", "end": "2025-01-17 04:17:03.495833", "rc": 0, "start": "2025-01-17 04:17:03.476887" }
STDOUT: LVM2_LV_NAME=test1 LVM2_LV_ATTR=-wi-ao---- LVM2_CACHE_TOTAL_BLOCKS= LVM2_CHUNK_SIZE=0 LVM2_SEGTYPE=linear

TASK [Set LV segment type] *****************************************************
task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:13
Friday 17 January 2025 04:17:03 -0500 (0:00:00.415) 0:06:56.289 ********
ok: [managed-node1] => { "ansible_facts": { "storage_test_lv_segtype": [ "linear" ] }, "changed": false }

TASK [Check segment type] ******************************************************
task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:17
Friday 17 January 2025 04:17:03 -0500 (0:00:00.088) 0:06:56.377 ********
ok: [managed-node1] => { "changed": false } MSG: All assertions passed

TASK [Set LV cache size] *******************************************************
task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:24
Friday 17 January 2025 04:17:03 -0500 (0:00:00.065) 0:06:56.443 ********
skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Parse the requested cache size] ******************************************
task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:31
Friday 17 January 2025 04:17:03 -0500 (0:00:00.052) 0:06:56.496 ********
skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" }
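The lvs probe above is built for machine parsing: --nameprefixes renders every requested column as LVM2_<FIELD>=<value>, while --noheadings, --nosuffix, and --unquoted strip the decoration, so the whole report comes back as the single line shown in STDOUT. A sketch of extracting one field from that output (illustrative; the test's own parsing may differ):

    - name: Probe the LV the same way the test does
      command: >-
        lvs --noheadings --nameprefixes --units=b --nosuffix --unquoted
        -o name,attr,cache_total_blocks,chunk_size,segtype foo/test1
      register: lvs_out
      changed_when: false

    - name: Pull the segment type out of the LVM2_SEGTYPE=... field
      set_fact:
        # regex_search with a group argument returns the captured groups,
        # i.e. ['linear'] for the output shown above
        storage_test_lv_segtype: "{{ lvs_out.stdout | regex_search('LVM2_SEGTYPE=(\\S+)', '\\1') }}"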
TASK [Set expected cache size] *************************************************
task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:37
Friday 17 January 2025 04:17:03 -0500 (0:00:00.053) 0:06:56.549 ********
skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Check cache size] ********************************************************
task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:42
Friday 17 January 2025 04:17:03 -0500 (0:00:00.051) 0:06:56.601 ********
skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Clean up facts] **********************************************************
task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume.yml:25
Friday 17 January 2025 04:17:03 -0500 (0:00:00.044) 0:06:56.646 ********
ok: [managed-node1] => { "ansible_facts": { "_storage_test_volume_present": null }, "changed": false }

TASK [Verify the volumes with no pool were correctly managed] ******************
task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:44
Friday 17 January 2025 04:17:03 -0500 (0:00:00.039) 0:06:56.685 ********

TASK [Clean up variable namespace] *********************************************
task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:54
Friday 17 January 2025 04:17:04 -0500 (0:00:00.036) 0:06:56.721 ********
ok: [managed-node1] => { "ansible_facts": { "storage_test_blkinfo": null, "storage_test_crypttab": null, "storage_test_fstab": null }, "changed": false }

TASK [Verify preservation of encryption settings on existing LVM volume] *******
task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/tests_luks2.yml:409
Friday 17 January 2025 04:17:04 -0500 (0:00:00.041) 0:06:56.762 ********

TASK [fedora.linux_system_roles.storage : Set platform/version specific variables] ***
task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:2
Friday 17 January 2025 04:17:04 -0500 (0:00:00.085) 0:06:56.848 ********
included: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml for managed-node1

TASK [fedora.linux_system_roles.storage : Ensure ansible_facts used by role] ***
task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:2
Friday 17 January 2025 04:17:04 -0500 (0:00:00.129) 0:06:56.977 ********
skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" }
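The next task is the role's platform/version vars cascade: it walks a list of increasingly specific vars files (RedHat.yml, CentOS.yml, CentOS_7.yml, CentOS_7.9.yml) and loads each one that exists in the role, which is why CentOS_7.yml is loaded below while the other three items are skipped. Conceptually the loop looks roughly like this (a sketch only; the role's real set_vars.yml differs in detail):

    - name: Set platform/version specific variables (conceptual sketch)
      include_vars: "{{ role_path }}/vars/{{ item }}"
      loop:
        - "{{ ansible_facts['os_family'] }}.yml"                 # RedHat.yml
        - "{{ ansible_facts['distribution'] }}.yml"              # CentOS.yml
        - "{{ ansible_facts['distribution'] }}_{{ ansible_facts['distribution_major_version'] }}.yml"  # CentOS_7.yml
        - "{{ ansible_facts['distribution'] }}_{{ ansible_facts['distribution_version'] }}.yml"        # CentOS_7.9.yml
      # only load vars files that actually ship with the role
      when: (role_path ~ '/vars/' ~ item) is file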
TASK [fedora.linux_system_roles.storage : Set platform/version specific variables] ***
task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:7
Friday 17 January 2025 04:17:04 -0500 (0:00:00.048) 0:06:57.026 ********
skipping: [managed-node1] => (item=RedHat.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat.yml", "skip_reason": "Conditional result was False" }
skipping: [managed-node1] => (item=CentOS.yml) => { "ansible_loop_var": "item", "changed": false, "item": "CentOS.yml", "skip_reason": "Conditional result was False" }
ok: [managed-node1] => (item=CentOS_7.yml) => { "ansible_facts": { "__storage_blivet_diskvolume_mkfs_option_map": { "ext2": "-F", "ext3": "-F", "ext4": "-F" }, "blivet_package_list": [ "python-enum34", "python-blivet3", "libblockdev-crypto", "libblockdev-dm", "libblockdev-lvm", "libblockdev-mdraid", "libblockdev-swap", "{{ 'libblockdev-s390' if ansible_architecture == 's390x' else 'libblockdev' }}" ] }, "ansible_included_var_files": [ "/tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/vars/CentOS_7.yml" ], "ansible_loop_var": "item", "changed": false, "item": "CentOS_7.yml" }
skipping: [managed-node1] => (item=CentOS_7.9.yml) => { "ansible_loop_var": "item", "changed": false, "item": "CentOS_7.9.yml", "skip_reason": "Conditional result was False" }

TASK [fedora.linux_system_roles.storage : Check if system is ostree] ***********
task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:25
Friday 17 January 2025 04:17:04 -0500 (0:00:00.094) 0:06:57.121 ********
skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [fedora.linux_system_roles.storage : Set flag to indicate system is ostree] ***
task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:30
Friday 17 January 2025 04:17:04 -0500 (0:00:00.044) 0:06:57.165 ********
skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [fedora.linux_system_roles.storage : Define an empty list of pools to be used in testing] ***
task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:5
Friday 17 January 2025 04:17:04 -0500 (0:00:00.065) 0:06:57.231 ********
ok: [managed-node1] => { "ansible_facts": { "_storage_pools_list": [] }, "changed": false }

TASK [fedora.linux_system_roles.storage : Define an empty list of volumes to be used in testing] ***
task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:9
Friday 17 January 2025 04:17:04 -0500 (0:00:00.059) 0:06:57.291 ********
ok: [managed-node1] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false }

TASK [fedora.linux_system_roles.storage : Include the appropriate provider tasks] ***
task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:13
Friday 17 January 2025 04:17:04 -0500 (0:00:00.069) 0:06:57.360 ********
included: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml for managed-node1

TASK [fedora.linux_system_roles.storage : Make sure blivet is available] *******
task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:2
Friday 17 January 2025 04:17:04 -0500 (0:00:00.149) 0:06:57.510 ********
ok: [managed-node1] => { "changed": false, "rc": 0, "results": [ "python-enum34-1.0.4-1.el7.noarch providing python-enum34 is already installed", "1:python2-blivet3-3.1.3-3.el7.noarch providing python-blivet3 is already installed", "libblockdev-crypto-2.18-5.el7.x86_64 providing libblockdev-crypto is already installed", "libblockdev-dm-2.18-5.el7.x86_64 providing libblockdev-dm is already installed", "libblockdev-lvm-2.18-5.el7.x86_64 providing libblockdev-lvm is already installed", "libblockdev-mdraid-2.18-5.el7.x86_64 providing libblockdev-mdraid is already installed", "libblockdev-swap-2.18-5.el7.x86_64 providing libblockdev-swap is already installed", "libblockdev-2.18-5.el7.x86_64 providing libblockdev is already installed" ] }
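The next task echoes the storage_pools input driving this part of the test. As playbook YAML it corresponds roughly to the following invocation (paraphrased from the output below; encryption parameters are deliberately left out so the role preserves the volume's existing LUKS2 settings):

    - name: Re-run the role against the existing encrypted volume
      include_role:
        name: fedora.linux_system_roles.storage
      vars:
        storage_pools:
          - name: foo
            type: lvm
            disks:
              - sda
            volumes:
              - name: test1
                size: 4g
                mount_point: /opt/test1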
TASK [fedora.linux_system_roles.storage : Show storage_pools] ******************
task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:9
Friday 17 January 2025 04:17:08 -0500 (0:00:03.956) 0:07:01.467 ********
ok: [managed-node1] => { "storage_pools": [ { "disks": [ "sda" ], "name": "foo", "type": "lvm", "volumes": [ { "mount_point": "/opt/test1", "name": "test1", "size": "4g" } ] } ] }

TASK [fedora.linux_system_roles.storage : Show storage_volumes] ****************
task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:14
Friday 17 January 2025 04:17:08 -0500 (0:00:00.071) 0:07:01.538 ********
ok: [managed-node1] => { "storage_volumes": "VARIABLE IS NOT DEFINED!: 'storage_volumes' is undefined" }

TASK [fedora.linux_system_roles.storage : Get required packages] ***************
task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:19
Friday 17 January 2025 04:17:08 -0500 (0:00:00.083) 0:07:01.622 ********
ok: [managed-node1] => { "actions": [], "changed": false, "crypts": [], "leaves": [], "mounts": [], "packages": [ "lvm2" ], "pools": [], "volumes": [] }

TASK [fedora.linux_system_roles.storage : Enable copr repositories if needed] ***
task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:31
Friday 17 January 2025 04:17:13 -0500 (0:00:04.234) 0:07:05.856 ********
included: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml for managed-node1

TASK [fedora.linux_system_roles.storage : Check if the COPR support packages should be installed] ***
task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:2
Friday 17 January 2025 04:17:13 -0500 (0:00:00.090) 0:07:05.947 ********

TASK [fedora.linux_system_roles.storage : Make sure COPR support packages are present] ***
task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:13
Friday 17 January 2025 04:17:13 -0500 (0:00:00.055) 0:07:06.003 ********
skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [fedora.linux_system_roles.storage : Enable COPRs] ************************
task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:19
Friday 17 January 2025 04:17:13 -0500 (0:00:00.063) 0:07:06.066 ********

TASK [fedora.linux_system_roles.storage : Make sure required packages are installed] ***
task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:37
Friday 17 January 2025 04:17:13 -0500 (0:00:00.055) 0:07:06.122 ********
ok: [managed-node1] => { "changed": false, "rc": 0, "results": [ "7:lvm2-2.02.187-6.el7_9.5.x86_64 providing lvm2 is already installed", "kpartx-0.4.9-136.el7_9.x86_64 providing kpartx is already installed" ] }

TASK [fedora.linux_system_roles.storage : Get service facts] *******************
task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:51
Friday 17 January 2025 04:17:14 -0500 (0:00:00.767) 0:07:06.889 ********
ok: [managed-node1] => { "ansible_facts": { "services": { "NetworkManager-dispatcher.service": { "name": "NetworkManager-dispatcher.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "NetworkManager-wait-online.service": { "name":
"NetworkManager-wait-online.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "NetworkManager.service": { "name": "NetworkManager.service", "source": "systemd", "state": "running", "status": "enabled" }, "arp-ethers.service": { "name": "arp-ethers.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "auditd.service": { "name": "auditd.service", "source": "systemd", "state": "running", "status": "enabled" }, "auth-rpcgss-module.service": { "name": "auth-rpcgss-module.service", "source": "systemd", "state": "stopped", "status": "static" }, "autovt@.service": { "name": "autovt@.service", "source": "systemd", "state": "unknown", "status": "enabled" }, "blivet.service": { "name": "blivet.service", "source": "systemd", "state": "inactive", "status": "static" }, "blk-availability.service": { "name": "blk-availability.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "brandbot.service": { "name": "brandbot.service", "source": "systemd", "state": "inactive", "status": "static" }, "chrony-dnssrv@.service": { "name": "chrony-dnssrv@.service", "source": "systemd", "state": "unknown", "status": "static" }, "chrony-wait.service": { "name": "chrony-wait.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "chronyd.service": { "name": "chronyd.service", "source": "systemd", "state": "running", "status": "enabled" }, "cloud-config.service": { "name": "cloud-config.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-final.service": { "name": "cloud-final.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-init-local.service": { "name": "cloud-init-local.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-init.service": { "name": "cloud-init.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "console-getty.service": { "name": "console-getty.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "console-shell.service": { "name": "console-shell.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "container-getty@.service": { "name": "container-getty@.service", "source": "systemd", "state": "unknown", "status": "static" }, "cpupower.service": { "name": "cpupower.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "crond.service": { "name": "crond.service", "source": "systemd", "state": "running", "status": "enabled" }, "dbus-org.freedesktop.hostname1.service": { "name": "dbus-org.freedesktop.hostname1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus-org.freedesktop.import1.service": { "name": "dbus-org.freedesktop.import1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus-org.freedesktop.locale1.service": { "name": "dbus-org.freedesktop.locale1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus-org.freedesktop.login1.service": { "name": "dbus-org.freedesktop.login1.service", "source": "systemd", "state": "active", "status": "static" }, "dbus-org.freedesktop.machine1.service": { "name": "dbus-org.freedesktop.machine1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus-org.freedesktop.nm-dispatcher.service": { "name": "dbus-org.freedesktop.nm-dispatcher.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "dbus-org.freedesktop.timedate1.service": { "name": "dbus-org.freedesktop.timedate1.service", "source": 
"systemd", "state": "inactive", "status": "static" }, "dbus.service": { "name": "dbus.service", "source": "systemd", "state": "running", "status": "static" }, "debug-shell.service": { "name": "debug-shell.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "dm-event.service": { "name": "dm-event.service", "source": "systemd", "state": "stopped", "status": "static" }, "dmraid-activation.service": { "name": "dmraid-activation.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "dracut-cmdline.service": { "name": "dracut-cmdline.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-initqueue.service": { "name": "dracut-initqueue.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-mount.service": { "name": "dracut-mount.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-mount.service": { "name": "dracut-pre-mount.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-pivot.service": { "name": "dracut-pre-pivot.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-trigger.service": { "name": "dracut-pre-trigger.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-udev.service": { "name": "dracut-pre-udev.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-shutdown.service": { "name": "dracut-shutdown.service", "source": "systemd", "state": "stopped", "status": "static" }, "ebtables.service": { "name": "ebtables.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "emergency.service": { "name": "emergency.service", "source": "systemd", "state": "stopped", "status": "static" }, "firewalld.service": { "name": "firewalld.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "fstrim.service": { "name": "fstrim.service", "source": "systemd", "state": "inactive", "status": "static" }, "getty@.service": { "name": "getty@.service", "source": "systemd", "state": "unknown", "status": "enabled" }, "getty@tty1.service": { "name": "getty@tty1.service", "source": "systemd", "state": "running", "status": "unknown" }, "gssproxy.service": { "name": "gssproxy.service", "source": "systemd", "state": "running", "status": "disabled" }, "halt-local.service": { "name": "halt-local.service", "source": "systemd", "state": "inactive", "status": "static" }, "initrd-cleanup.service": { "name": "initrd-cleanup.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-parse-etc.service": { "name": "initrd-parse-etc.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-switch-root.service": { "name": "initrd-switch-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-udevadm-cleanup-db.service": { "name": "initrd-udevadm-cleanup-db.service", "source": "systemd", "state": "stopped", "status": "static" }, "iprdump.service": { "name": "iprdump.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "iprinit.service": { "name": "iprinit.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "iprupdate.service": { "name": "iprupdate.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "irqbalance.service": { "name": "irqbalance.service", "source": "systemd", "state": "running", "status": "enabled" }, "kdump.service": { "name": "kdump.service", "source": "systemd", "state": "stopped", 
"status": "enabled" }, "kmod-static-nodes.service": { "name": "kmod-static-nodes.service", "source": "systemd", "state": "stopped", "status": "static" }, "lvm2-lvmetad.service": { "name": "lvm2-lvmetad.service", "source": "systemd", "state": "running", "status": "static" }, "lvm2-lvmpolld.service": { "name": "lvm2-lvmpolld.service", "source": "systemd", "state": "stopped", "status": "static" }, "lvm2-monitor.service": { "name": "lvm2-monitor.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "lvm2-pvscan@.service": { "name": "lvm2-pvscan@.service", "source": "systemd", "state": "unknown", "status": "static" }, "lvm2-pvscan@8:0.service": { "name": "lvm2-pvscan@8:0.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "mdadm-grow-continue@.service": { "name": "mdadm-grow-continue@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdadm-last-resort@.service": { "name": "mdadm-last-resort@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdcheck_continue.service": { "name": "mdcheck_continue.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdcheck_start.service": { "name": "mdcheck_start.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdmon@.service": { "name": "mdmon@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdmonitor-oneshot.service": { "name": "mdmonitor-oneshot.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdmonitor.service": { "name": "mdmonitor.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "messagebus.service": { "name": "messagebus.service", "source": "systemd", "state": "active", "status": "static" }, "microcode.service": { "name": "microcode.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "netconsole": { "name": "netconsole", "source": "sysv", "state": "stopped", "status": "disabled" }, "network": { "name": "network", "source": "sysv", "state": "running", "status": "enabled" }, "network.service": { "name": "network.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "nfs-blkmap.service": { "name": "nfs-blkmap.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfs-config.service": { "name": "nfs-config.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-idmap.service": { "name": "nfs-idmap.service", "source": "systemd", "state": "inactive", "status": "static" }, "nfs-idmapd.service": { "name": "nfs-idmapd.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-lock.service": { "name": "nfs-lock.service", "source": "systemd", "state": "inactive", "status": "static" }, "nfs-mountd.service": { "name": "nfs-mountd.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-rquotad.service": { "name": "nfs-rquotad.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfs-secure.service": { "name": "nfs-secure.service", "source": "systemd", "state": "inactive", "status": "static" }, "nfs-server.service": { "name": "nfs-server.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "nfs-utils.service": { "name": "nfs-utils.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs.service": { "name": "nfs.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfslock.service": { "name": "nfslock.service", "source": "systemd", "state": 
"inactive", "status": "static" }, "plymouth-halt.service": { "name": "plymouth-halt.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "plymouth-kexec.service": { "name": "plymouth-kexec.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "plymouth-poweroff.service": { "name": "plymouth-poweroff.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "plymouth-quit-wait.service": { "name": "plymouth-quit-wait.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "plymouth-quit.service": { "name": "plymouth-quit.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "plymouth-read-write.service": { "name": "plymouth-read-write.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "plymouth-reboot.service": { "name": "plymouth-reboot.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "plymouth-start.service": { "name": "plymouth-start.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "plymouth-switch-root.service": { "name": "plymouth-switch-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "polkit.service": { "name": "polkit.service", "source": "systemd", "state": "running", "status": "static" }, "postfix.service": { "name": "postfix.service", "source": "systemd", "state": "running", "status": "enabled" }, "qemu-guest-agent.service": { "name": "qemu-guest-agent.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "quotaon.service": { "name": "quotaon.service", "source": "systemd", "state": "inactive", "status": "static" }, "rc-local.service": { "name": "rc-local.service", "source": "systemd", "state": "stopped", "status": "static" }, "rdisc.service": { "name": "rdisc.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rescue.service": { "name": "rescue.service", "source": "systemd", "state": "stopped", "status": "static" }, "restraintd.service": { "name": "restraintd.service", "source": "systemd", "state": "running", "status": "enabled" }, "rhel-autorelabel-mark.service": { "name": "rhel-autorelabel-mark.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "rhel-autorelabel.service": { "name": "rhel-autorelabel.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "rhel-configure.service": { "name": "rhel-configure.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "rhel-dmesg.service": { "name": "rhel-dmesg.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "rhel-domainname.service": { "name": "rhel-domainname.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "rhel-import-state.service": { "name": "rhel-import-state.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "rhel-loadmodules.service": { "name": "rhel-loadmodules.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "rhel-readonly.service": { "name": "rhel-readonly.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "rngd.service": { "name": "rngd.service", "source": "systemd", "state": "running", "status": "enabled" }, "rpc-gssd.service": { "name": "rpc-gssd.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-rquotad.service": { "name": "rpc-rquotad.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rpc-statd-notify.service": { "name": 
"rpc-statd-notify.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-statd.service": { "name": "rpc-statd.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpcbind.service": { "name": "rpcbind.service", "source": "systemd", "state": "running", "status": "enabled" }, "rpcgssd.service": { "name": "rpcgssd.service", "source": "systemd", "state": "inactive", "status": "static" }, "rpcidmapd.service": { "name": "rpcidmapd.service", "source": "systemd", "state": "inactive", "status": "static" }, "rsyncd.service": { "name": "rsyncd.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rsyncd@.service": { "name": "rsyncd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "rsyslog.service": { "name": "rsyslog.service", "source": "systemd", "state": "running", "status": "enabled" }, "selinux-policy-migrate-local-changes@.service": { "name": "selinux-policy-migrate-local-changes@.service", "source": "systemd", "state": "unknown", "status": "static" }, "selinux-policy-migrate-local-changes@targeted.service": { "name": "selinux-policy-migrate-local-changes@targeted.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "serial-getty@.service": { "name": "serial-getty@.service", "source": "systemd", "state": "unknown", "status": "disabled" }, "serial-getty@ttyS0.service": { "name": "serial-getty@ttyS0.service", "source": "systemd", "state": "running", "status": "unknown" }, "sshd-keygen.service": { "name": "sshd-keygen.service", "source": "systemd", "state": "stopped", "status": "static" }, "sshd.service": { "name": "sshd.service", "source": "systemd", "state": "running", "status": "enabled" }, "sshd@.service": { "name": "sshd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-ask-password-console.service": { "name": "systemd-ask-password-console.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-ask-password-plymouth.service": { "name": "systemd-ask-password-plymouth.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-ask-password-wall.service": { "name": "systemd-ask-password-wall.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-backlight@.service": { "name": "systemd-backlight@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-binfmt.service": { "name": "systemd-binfmt.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-bootchart.service": { "name": "systemd-bootchart.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "systemd-cryptsetup@luks\\x2d0989597c\\x2d795f\\x2d4f25\\x2db2f2\\x2d121f3eb103c2.service": { "name": "systemd-cryptsetup@luks\\x2d0989597c\\x2d795f\\x2d4f25\\x2db2f2\\x2d121f3eb103c2.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "systemd-firstboot.service": { "name": "systemd-firstboot.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-fsck-root.service": { "name": "systemd-fsck-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-fsck@.service": { "name": "systemd-fsck@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-halt.service": { "name": "systemd-halt.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hibernate-resume@.service": { "name": "systemd-hibernate-resume@.service", "source": "systemd", 
"state": "unknown", "status": "static" }, "systemd-hibernate.service": { "name": "systemd-hibernate.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hostnamed.service": { "name": "systemd-hostnamed.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hwdb-update.service": { "name": "systemd-hwdb-update.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-hybrid-sleep.service": { "name": "systemd-hybrid-sleep.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-importd.service": { "name": "systemd-importd.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-initctl.service": { "name": "systemd-initctl.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journal-catalog-update.service": { "name": "systemd-journal-catalog-update.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journal-flush.service": { "name": "systemd-journal-flush.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journald.service": { "name": "systemd-journald.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-kexec.service": { "name": "systemd-kexec.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-localed.service": { "name": "systemd-localed.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-logind.service": { "name": "systemd-logind.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-machine-id-commit.service": { "name": "systemd-machine-id-commit.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-machined.service": { "name": "systemd-machined.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-modules-load.service": { "name": "systemd-modules-load.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-nspawn@.service": { "name": "systemd-nspawn@.service", "source": "systemd", "state": "unknown", "status": "disabled" }, "systemd-poweroff.service": { "name": "systemd-poweroff.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-quotacheck.service": { "name": "systemd-quotacheck.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-random-seed.service": { "name": "systemd-random-seed.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-readahead-collect.service": { "name": "systemd-readahead-collect.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "systemd-readahead-done.service": { "name": "systemd-readahead-done.service", "source": "systemd", "state": "stopped", "status": "indirect" }, "systemd-readahead-drop.service": { "name": "systemd-readahead-drop.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "systemd-readahead-replay.service": { "name": "systemd-readahead-replay.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "systemd-reboot.service": { "name": "systemd-reboot.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-remount-fs.service": { "name": "systemd-remount-fs.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-rfkill@.service": { "name": "systemd-rfkill@.service", "source": "systemd", "state": "unknown", "status": "static" }, 
"systemd-shutdownd.service": { "name": "systemd-shutdownd.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-suspend.service": { "name": "systemd-suspend.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-sysctl.service": { "name": "systemd-sysctl.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-timedated.service": { "name": "systemd-timedated.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-tmpfiles-clean.service": { "name": "systemd-tmpfiles-clean.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles-setup-dev.service": { "name": "systemd-tmpfiles-setup-dev.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles-setup.service": { "name": "systemd-tmpfiles-setup.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udev-settle.service": { "name": "systemd-udev-settle.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udev-trigger.service": { "name": "systemd-udev-trigger.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udevd.service": { "name": "systemd-udevd.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-update-done.service": { "name": "systemd-update-done.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-update-utmp-runlevel.service": { "name": "systemd-update-utmp-runlevel.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-update-utmp.service": { "name": "systemd-update-utmp.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-user-sessions.service": { "name": "systemd-user-sessions.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-vconsole-setup.service": { "name": "systemd-vconsole-setup.service", "source": "systemd", "state": "stopped", "status": "static" }, "target.service": { "name": "target.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "targetclid.service": { "name": "targetclid.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "teamd@.service": { "name": "teamd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "tuned.service": { "name": "tuned.service", "source": "systemd", "state": "running", "status": "enabled" }, "wpa_supplicant.service": { "name": "wpa_supplicant.service", "source": "systemd", "state": "inactive", "status": "disabled" } } }, "changed": false } TASK [fedora.linux_system_roles.storage : Set storage_cryptsetup_services] ***** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:57 Friday 17 January 2025 04:17:15 -0500 (0:00:01.044) 0:07:07.934 ******** ok: [managed-node1] => { "ansible_facts": { "storage_cryptsetup_services": [ "systemd-cryptsetup@luks\\x2d0989597c\\x2d795f\\x2d4f25\\x2db2f2\\x2d121f3eb103c2.service" ] }, "changed": false } TASK [fedora.linux_system_roles.storage : Mask the systemd cryptsetup services] *** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:63 Friday 17 January 2025 04:17:15 -0500 (0:00:00.085) 0:07:08.019 ******** changed: [managed-node1] => (item=systemd-cryptsetup@luks\x2d0989597c\x2d795f\x2d4f25\x2db2f2\x2d121f3eb103c2.service) => { "ansible_loop_var": "item", "changed": true, 
"item": "systemd-cryptsetup@luks\\x2d0989597c\\x2d795f\\x2d4f25\\x2db2f2\\x2d121f3eb103c2.service", "name": "systemd-cryptsetup@luks\\x2d0989597c\\x2d795f\\x2d4f25\\x2db2f2\\x2d121f3eb103c2.service", "status": { "ActiveEnterTimestampMonotonic": "0", "ActiveExitTimestampMonotonic": "0", "ActiveState": "inactive", "After": "cryptsetup-pre.target systemd-readahead-collect.service dev-sda1.device systemd-readahead-replay.service systemd-journald.socket system-systemd\\x2dcryptsetup.slice", "AllowIsolate": "no", "AmbientCapabilities": "0", "AssertResult": "no", "AssertTimestampMonotonic": "0", "Before": "cryptsetup.target umount.target", "BindsTo": "dev-sda1.device", "BlockIOAccounting": "no", "BlockIOWeight": "18446744073709551615", "CPUAccounting": "no", "CPUQuotaPerSecUSec": "infinity", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUSchedulingResetOnFork": "no", "CPUShares": "18446744073709551615", "CanIsolate": "no", "CanReload": "no", "CanStart": "yes", "CanStop": "yes", "CapabilityBoundingSet": "18446744073709551615", "CollectMode": "inactive", "ConditionResult": "no", "ConditionTimestampMonotonic": "0", "Conflicts": "umount.target", "ControlPID": "0", "DefaultDependencies": "no", "Delegate": "no", "Description": "Cryptography Setup for luks-0989597c-795f-4f25-b2f2-121f3eb103c2", "DevicePolicy": "auto", "Documentation": "man:crypttab(5) man:systemd-cryptsetup-generator(8) man:systemd-cryptsetup@.service(8)", "ExecMainCode": "0", "ExecMainExitTimestampMonotonic": "0", "ExecMainPID": "0", "ExecMainStartTimestampMonotonic": "0", "ExecMainStatus": "0", "ExecStart": "{ path=/usr/lib/systemd/systemd-cryptsetup ; argv[]=/usr/lib/systemd/systemd-cryptsetup attach luks-0989597c-795f-4f25-b2f2-121f3eb103c2 /dev/sda1 VALUE_SPECIFIED_IN_NO_LOG_PARAMETER ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecStop": "{ path=/usr/lib/systemd/systemd-cryptsetup ; argv[]=/usr/lib/systemd/systemd-cryptsetup detach luks-0989597c-795f-4f25-b2f2-121f3eb103c2 ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "FailureAction": "none", "FileDescriptorStoreMax": "0", "FragmentPath": "/run/systemd/generator/systemd-cryptsetup@luks\\x2d0989597c\\x2d795f\\x2d4f25\\x2db2f2\\x2d121f3eb103c2.service", "GuessMainPID": "yes", "IOScheduling": "0", "Id": "systemd-cryptsetup@luks\\x2d0989597c\\x2d795f\\x2d4f25\\x2db2f2\\x2d121f3eb103c2.service", "IgnoreOnIsolate": "yes", "IgnoreOnSnapshot": "no", "IgnoreSIGPIPE": "yes", "InactiveEnterTimestampMonotonic": "0", "InactiveExitTimestampMonotonic": "0", "JobTimeoutAction": "none", "JobTimeoutUSec": "0", "KillMode": "control-group", "KillSignal": "15", "LimitAS": "18446744073709551615", "LimitCORE": "18446744073709551615", "LimitCPU": "18446744073709551615", "LimitDATA": "18446744073709551615", "LimitFSIZE": "18446744073709551615", "LimitLOCKS": "18446744073709551615", "LimitMEMLOCK": "65536", "LimitMSGQUEUE": "819200", "LimitNICE": "0", "LimitNOFILE": "4096", "LimitNPROC": "14311", "LimitRSS": "18446744073709551615", "LimitRTPRIO": "0", "LimitRTTIME": "18446744073709551615", "LimitSIGPENDING": "14311", "LimitSTACK": "18446744073709551615", "LoadState": "loaded", "MainPID": "0", "MemoryAccounting": "no", "MemoryCurrent": "18446744073709551615", "MemoryLimit": "18446744073709551615", "MountFlags": "0", "Names": "systemd-cryptsetup@luks\\x2d0989597c\\x2d795f\\x2d4f25\\x2db2f2\\x2d121f3eb103c2.service", "NeedDaemonReload": "yes", "Nice": "0", "NoNewPrivileges": "no", "NonBlocking": 
"no", "NotifyAccess": "none", "OOMScoreAdjust": "0", "OnFailureJobMode": "replace", "PermissionsStartOnly": "no", "PrivateDevices": "no", "PrivateNetwork": "no", "PrivateTmp": "no", "ProtectHome": "no", "ProtectSystem": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "RemainAfterExit": "yes", "RequiredBy": "cryptsetup.target", "Requires": "system-systemd\\x2dcryptsetup.slice", "Restart": "no", "RestartUSec": "100ms", "Result": "success", "RootDirectoryStartOnly": "no", "RuntimeDirectoryMode": "0755", "SameProcessGroup": "no", "SecureBits": "0", "SendSIGHUP": "no", "SendSIGKILL": "yes", "Slice": "system-systemd\\x2dcryptsetup.slice", "SourcePath": "/etc/crypttab", "StandardError": "inherit", "StandardInput": "null", "StandardOutput": "journal", "StartLimitAction": "none", "StartLimitBurst": "5", "StartLimitInterval": "10000000", "StartupBlockIOWeight": "18446744073709551615", "StartupCPUShares": "18446744073709551615", "StatusErrno": "0", "StopWhenUnneeded": "no", "SubState": "dead", "SyslogLevelPrefix": "yes", "SyslogPriority": "30", "SystemCallErrorNumber": "0", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "TasksAccounting": "no", "TasksCurrent": "18446744073709551615", "TasksMax": "18446744073709551615", "TimeoutStartUSec": "0", "TimeoutStopUSec": "0", "TimerSlackNSec": "50000", "Transient": "no", "Type": "oneshot", "UMask": "0022", "UnitFilePreset": "disabled", "UnitFileState": "bad", "WantedBy": "dev-sda1.device", "WatchdogTimestampMonotonic": "0", "WatchdogUSec": "0" } } TASK [fedora.linux_system_roles.storage : Manage the pools and volumes to match the specified state] *** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:69 Friday 17 January 2025 04:17:15 -0500 (0:00:00.585) 0:07:08.605 ******** ok: [managed-node1] => { "actions": [], "changed": false, "crypts": [], "leaves": [ "/dev/sdb", "/dev/sdc", "/dev/sdd", "/dev/sde", "/dev/sdf", "/dev/sdg", "/dev/sdh", "/dev/sdi", "/dev/xvda1", "/dev/mapper/luks-eb5a5901-32b6-427f-885e-6ee3011725ec" ], "mounts": [ { "dump": 0, "fstype": "xfs", "group": null, "mode": null, "opts": "defaults", "owner": null, "passno": 0, "path": "/opt/test1", "src": "/dev/mapper/luks-eb5a5901-32b6-427f-885e-6ee3011725ec", "state": "mounted" } ], "packages": [ "cryptsetup", "xfsprogs", "e2fsprogs", "lvm2" ], "pools": [ { "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_clevis_pin": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "encryption_tang_thumbprint": null, "encryption_tang_url": null, "grow_to_fill": false, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "shared": false, "state": "present", "type": "lvm", "volumes": [ { "_device": "/dev/mapper/luks-eb5a5901-32b6-427f-885e-6ee3011725ec", "_kernel_device": "/dev/dm-1", "_mount_id": "/dev/mapper/luks-eb5a5901-32b6-427f-885e-6ee3011725ec", "_raw_device": "/dev/mapper/foo-test1", "_raw_kernel_device": "/dev/dm-0", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": true, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": 0, "encryption_luks_version": "luks2", "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, 
"mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "mount_user": null, "name": "test1", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": "4g", "state": "present", "thin": false, "thin_pool_name": null, "thin_pool_size": null, "type": "lvm", "vdo_pool_size": null } ] } ], "volumes": [] } TASK [fedora.linux_system_roles.storage : Workaround for udev issue on some platforms] *** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:83 Friday 17 January 2025 04:17:20 -0500 (0:00:04.305) 0:07:12.911 ******** skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Check if /etc/fstab is present] ****** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:90 Friday 17 January 2025 04:17:20 -0500 (0:00:00.054) 0:07:12.966 ******** ok: [managed-node1] => { "changed": false, "stat": { "atime": 1737105399.703075, "attr_flags": "e", "attributes": [ "extents" ], "block_size": 4096, "blocks": 8, "charset": "us-ascii", "checksum": "f6ffd111ea1768ce77f1edc39e4c6ca3c9b99946", "ctime": 1737105399.7010748, "dev": 51713, "device_type": 0, "executable": false, "exists": true, "gid": 0, "gr_name": "root", "inode": 263644, "isblk": false, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": true, "issock": false, "isuid": false, "mimetype": "text/plain", "mode": "0644", "mtime": 1737105399.7010748, "nlink": 1, "path": "/etc/fstab", "pw_name": "root", "readable": true, "rgrp": true, "roth": true, "rusr": true, "size": 1311, "uid": 0, "version": "18446744072031193646", "wgrp": false, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [fedora.linux_system_roles.storage : Add fingerprint to /etc/fstab if present] *** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:95 Friday 17 January 2025 04:17:20 -0500 (0:00:00.330) 0:07:13.297 ******** skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Unmask the systemd cryptsetup services] *** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:113 Friday 17 January 2025 04:17:20 -0500 (0:00:00.042) 0:07:13.340 ******** changed: [managed-node1] => (item=systemd-cryptsetup@luks\x2d0989597c\x2d795f\x2d4f25\x2db2f2\x2d121f3eb103c2.service) => { "ansible_loop_var": "item", "changed": true, "item": "systemd-cryptsetup@luks\\x2d0989597c\\x2d795f\\x2d4f25\\x2db2f2\\x2d121f3eb103c2.service", "name": "systemd-cryptsetup@luks\\x2d0989597c\\x2d795f\\x2d4f25\\x2db2f2\\x2d121f3eb103c2.service", "status": { "ActiveEnterTimestampMonotonic": "0", "ActiveExitTimestampMonotonic": "0", "ActiveState": "inactive", "AllowIsolate": "no", "AmbientCapabilities": "0", "AssertResult": "no", "AssertTimestampMonotonic": "0", "BlockIOAccounting": "no", "BlockIOWeight": "18446744073709551615", "CPUAccounting": "no", "CPUQuotaPerSecUSec": "infinity", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUSchedulingResetOnFork": "no", "CPUShares": "18446744073709551615", 
"CanIsolate": "no", "CanReload": "no", "CanStart": "yes", "CanStop": "yes", "CapabilityBoundingSet": "18446744073709551615", "CollectMode": "inactive", "ConditionResult": "no", "ConditionTimestampMonotonic": "0", "ControlPID": "0", "DefaultDependencies": "yes", "Delegate": "no", "Description": "systemd-cryptsetup@luks\\x2d0989597c\\x2d795f\\x2d4f25\\x2db2f2\\x2d121f3eb103c2.service", "DevicePolicy": "auto", "ExecMainCode": "0", "ExecMainExitTimestampMonotonic": "0", "ExecMainPID": "0", "ExecMainStartTimestampMonotonic": "0", "ExecMainStatus": "0", "FailureAction": "none", "FileDescriptorStoreMax": "0", "FragmentPath": "/dev/null", "GuessMainPID": "yes", "IOScheduling": "0", "Id": "systemd-cryptsetup@luks\\x2d0989597c\\x2d795f\\x2d4f25\\x2db2f2\\x2d121f3eb103c2.service", "IgnoreOnIsolate": "no", "IgnoreOnSnapshot": "no", "IgnoreSIGPIPE": "yes", "InactiveEnterTimestampMonotonic": "0", "InactiveExitTimestampMonotonic": "0", "JobTimeoutAction": "none", "JobTimeoutUSec": "0", "KillMode": "control-group", "KillSignal": "15", "LimitAS": "18446744073709551615", "LimitCORE": "18446744073709551615", "LimitCPU": "18446744073709551615", "LimitDATA": "18446744073709551615", "LimitFSIZE": "18446744073709551615", "LimitLOCKS": "18446744073709551615", "LimitMEMLOCK": "65536", "LimitMSGQUEUE": "819200", "LimitNICE": "0", "LimitNOFILE": "1048576", "LimitNPROC": "14311", "LimitRSS": "18446744073709551615", "LimitRTPRIO": "0", "LimitRTTIME": "18446744073709551615", "LimitSIGPENDING": "14311", "LimitSTACK": "18446744073709551615", "LoadState": "masked", "MainPID": "0", "MemoryAccounting": "no", "MemoryCurrent": "18446744073709551615", "MemoryLimit": "18446744073709551615", "MountFlags": "0", "Names": "systemd-cryptsetup@luks\\x2d0989597c\\x2d795f\\x2d4f25\\x2db2f2\\x2d121f3eb103c2.service", "NeedDaemonReload": "no", "Nice": "0", "NoNewPrivileges": "no", "NonBlocking": "no", "NotifyAccess": "none", "OOMScoreAdjust": "0", "OnFailureJobMode": "replace", "PermissionsStartOnly": "no", "PrivateDevices": "no", "PrivateNetwork": "no", "PrivateTmp": "no", "ProtectHome": "no", "ProtectSystem": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "RemainAfterExit": "no", "Restart": "no", "RestartUSec": "100ms", "Result": "success", "RootDirectoryStartOnly": "no", "RuntimeDirectoryMode": "0755", "SameProcessGroup": "no", "SecureBits": "0", "SendSIGHUP": "no", "SendSIGKILL": "yes", "StandardError": "inherit", "StandardInput": "null", "StandardOutput": "inherit", "StartLimitAction": "none", "StartLimitBurst": "5", "StartLimitInterval": "10000000", "StartupBlockIOWeight": "18446744073709551615", "StartupCPUShares": "18446744073709551615", "StatusErrno": "0", "StopWhenUnneeded": "no", "SubState": "dead", "SyslogLevelPrefix": "yes", "SyslogPriority": "30", "SystemCallErrorNumber": "0", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "TasksAccounting": "no", "TasksCurrent": "18446744073709551615", "TasksMax": "18446744073709551615", "TimeoutStartUSec": "1min 30s", "TimeoutStopUSec": "1min 30s", "TimerSlackNSec": "50000", "Transient": "no", "UMask": "0022", "UnitFileState": "bad", "WatchdogTimestampMonotonic": "0", "WatchdogUSec": "0" } } TASK [fedora.linux_system_roles.storage : Show blivet_output] ****************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:119 Friday 17 January 2025 04:17:21 -0500 (0:00:00.488) 0:07:13.829 ******** ok: [managed-node1] => { "blivet_output": { "actions": [], "changed": false, "crypts": [], "failed": 
false, "leaves": [ "/dev/sdb", "/dev/sdc", "/dev/sdd", "/dev/sde", "/dev/sdf", "/dev/sdg", "/dev/sdh", "/dev/sdi", "/dev/xvda1", "/dev/mapper/luks-eb5a5901-32b6-427f-885e-6ee3011725ec" ], "mounts": [ { "dump": 0, "fstype": "xfs", "group": null, "mode": null, "opts": "defaults", "owner": null, "passno": 0, "path": "/opt/test1", "src": "/dev/mapper/luks-eb5a5901-32b6-427f-885e-6ee3011725ec", "state": "mounted" } ], "packages": [ "cryptsetup", "xfsprogs", "e2fsprogs", "lvm2" ], "pools": [ { "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_clevis_pin": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "encryption_tang_thumbprint": null, "encryption_tang_url": null, "grow_to_fill": false, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "shared": false, "state": "present", "type": "lvm", "volumes": [ { "_device": "/dev/mapper/luks-eb5a5901-32b6-427f-885e-6ee3011725ec", "_kernel_device": "/dev/dm-1", "_mount_id": "/dev/mapper/luks-eb5a5901-32b6-427f-885e-6ee3011725ec", "_raw_device": "/dev/mapper/foo-test1", "_raw_kernel_device": "/dev/dm-0", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": true, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": 0, "encryption_luks_version": "luks2", "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "mount_user": null, "name": "test1", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": "4g", "state": "present", "thin": false, "thin_pool_name": null, "thin_pool_size": null, "type": "lvm", "vdo_pool_size": null } ] } ], "volumes": [] } } TASK [fedora.linux_system_roles.storage : Set the list of pools for test verification] *** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:128 Friday 17 January 2025 04:17:21 -0500 (0:00:00.051) 0:07:13.880 ******** ok: [managed-node1] => { "ansible_facts": { "_storage_pools_list": [ { "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_clevis_pin": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "encryption_tang_thumbprint": null, "encryption_tang_url": null, "grow_to_fill": false, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "shared": false, "state": "present", "type": "lvm", "volumes": [ { "_device": "/dev/mapper/luks-eb5a5901-32b6-427f-885e-6ee3011725ec", "_kernel_device": "/dev/dm-1", "_mount_id": "/dev/mapper/luks-eb5a5901-32b6-427f-885e-6ee3011725ec", "_raw_device": "/dev/mapper/foo-test1", "_raw_kernel_device": "/dev/dm-0", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": true, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": 0, 
"encryption_luks_version": "luks2", "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "mount_user": null, "name": "test1", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": "4g", "state": "present", "thin": false, "thin_pool_name": null, "thin_pool_size": null, "type": "lvm", "vdo_pool_size": null } ] } ] }, "changed": false } TASK [fedora.linux_system_roles.storage : Set the list of volumes for test verification] *** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:132 Friday 17 January 2025 04:17:21 -0500 (0:00:00.048) 0:07:13.929 ******** ok: [managed-node1] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false } TASK [fedora.linux_system_roles.storage : Remove obsolete mounts] ************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:148 Friday 17 January 2025 04:17:21 -0500 (0:00:00.055) 0:07:13.985 ******** TASK [fedora.linux_system_roles.storage : Tell systemd to refresh its view of /etc/fstab] *** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:159 Friday 17 January 2025 04:17:21 -0500 (0:00:00.038) 0:07:14.023 ******** ok: [managed-node1] => { "changed": false, "name": null, "status": {} } TASK [fedora.linux_system_roles.storage : Set up new/current mounts] *********** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:164 Friday 17 January 2025 04:17:21 -0500 (0:00:00.458) 0:07:14.482 ******** ok: [managed-node1] => (item={u'src': u'/dev/mapper/luks-eb5a5901-32b6-427f-885e-6ee3011725ec', u'group': None, u'dump': 0, u'passno': 0, u'fstype': u'xfs', u'state': u'mounted', u'mode': None, u'owner': None, u'path': u'/opt/test1', u'opts': u'defaults'}) => { "ansible_loop_var": "mount_info", "changed": false, "dump": "0", "fstab": "/etc/fstab", "fstype": "xfs", "mount_info": { "dump": 0, "fstype": "xfs", "group": null, "mode": null, "opts": "defaults", "owner": null, "passno": 0, "path": "/opt/test1", "src": "/dev/mapper/luks-eb5a5901-32b6-427f-885e-6ee3011725ec", "state": "mounted" }, "name": "/opt/test1", "opts": "defaults", "passno": "0", "src": "/dev/mapper/luks-eb5a5901-32b6-427f-885e-6ee3011725ec" } TASK [fedora.linux_system_roles.storage : Manage mount ownership/permissions] *** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:175 Friday 17 January 2025 04:17:22 -0500 (0:00:00.438) 0:07:14.920 ******** skipping: [managed-node1] => (item={u'src': u'/dev/mapper/luks-eb5a5901-32b6-427f-885e-6ee3011725ec', u'group': None, u'dump': 0, u'passno': 0, u'fstype': u'xfs', u'state': u'mounted', u'mode': None, u'owner': None, u'path': u'/opt/test1', u'opts': u'defaults'}) => { "ansible_loop_var": "mount_info", "changed": false, "mount_info": { "dump": 0, "fstype": "xfs", "group": null, "mode": null, "opts": "defaults", "owner": null, "passno": 0, "path": "/opt/test1", "src": "/dev/mapper/luks-eb5a5901-32b6-427f-885e-6ee3011725ec", "state": "mounted" }, "skip_reason": "Conditional result 
was False" } TASK [fedora.linux_system_roles.storage : Tell systemd to refresh its view of /etc/fstab] *** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:187 Friday 17 January 2025 04:17:22 -0500 (0:00:00.048) 0:07:14.969 ******** ok: [managed-node1] => { "changed": false, "name": null, "status": {} } TASK [fedora.linux_system_roles.storage : Retrieve facts for the /etc/crypttab file] *** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:195 Friday 17 January 2025 04:17:22 -0500 (0:00:00.473) 0:07:15.443 ******** ok: [managed-node1] => { "changed": false, "stat": { "atime": 1737105405.1960802, "attr_flags": "e", "attributes": [ "extents" ], "block_size": 4096, "blocks": 8, "charset": "us-ascii", "checksum": "552cf075acd7d67244861700a71df5d7c42dce5c", "ctime": 1737105402.2070773, "dev": 51713, "device_type": 0, "executable": false, "exists": true, "gid": 0, "gr_name": "root", "inode": 263648, "isblk": false, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": true, "issock": false, "isuid": false, "mimetype": "text/plain", "mode": "0600", "mtime": 1737105402.2060773, "nlink": 1, "path": "/etc/crypttab", "pw_name": "root", "readable": true, "rgrp": false, "roth": false, "rusr": true, "size": 66, "uid": 0, "version": "18446744072031194679", "wgrp": false, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [fedora.linux_system_roles.storage : Manage /etc/crypttab to account for changes we just made] *** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:200 Friday 17 January 2025 04:17:23 -0500 (0:00:00.369) 0:07:15.812 ******** TASK [fedora.linux_system_roles.storage : Update facts] ************************ task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:222 Friday 17 January 2025 04:17:23 -0500 (0:00:00.054) 0:07:15.868 ******** ok: [managed-node1] TASK [Assert preservation of encryption settings on existing LVM volume] ******* task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/tests_luks2.yml:423 Friday 17 January 2025 04:17:23 -0500 (0:00:00.719) 0:07:16.587 ******** ok: [managed-node1] => { "changed": false } MSG: All assertions passed TASK [Verify role results] ***************************************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/tests_luks2.yml:430 Friday 17 January 2025 04:17:23 -0500 (0:00:00.054) 0:07:16.642 ******** included: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml for managed-node1 TASK [Print out pool information] ********************************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:2 Friday 17 January 2025 04:17:24 -0500 (0:00:00.074) 0:07:16.716 ******** ok: [managed-node1] => { "_storage_pools_list": [ { "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_clevis_pin": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "encryption_tang_thumbprint": null, "encryption_tang_url": null, "grow_to_fill": false, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, 
"raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "shared": false, "state": "present", "type": "lvm", "volumes": [ { "_device": "/dev/mapper/luks-eb5a5901-32b6-427f-885e-6ee3011725ec", "_kernel_device": "/dev/dm-1", "_mount_id": "/dev/mapper/luks-eb5a5901-32b6-427f-885e-6ee3011725ec", "_raw_device": "/dev/mapper/foo-test1", "_raw_kernel_device": "/dev/dm-0", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": true, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": 0, "encryption_luks_version": "luks2", "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "mount_user": null, "name": "test1", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": "4g", "state": "present", "thin": false, "thin_pool_name": null, "thin_pool_size": null, "type": "lvm", "vdo_pool_size": null } ] } ] } TASK [Print out volume information] ******************************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:7 Friday 17 January 2025 04:17:24 -0500 (0:00:00.063) 0:07:16.780 ******** skipping: [managed-node1] => {} TASK [Collect info about the volumes.] ***************************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:15 Friday 17 January 2025 04:17:24 -0500 (0:00:00.050) 0:07:16.830 ******** ok: [managed-node1] => { "changed": false, "info": { "/dev/mapper/foo-test1": { "fstype": "crypto_LUKS", "label": "", "mountpoint": "", "name": "/dev/mapper/foo-test1", "size": "4G", "type": "lvm", "uuid": "eb5a5901-32b6-427f-885e-6ee3011725ec" }, "/dev/mapper/luks-eb5a5901-32b6-427f-885e-6ee3011725ec": { "fstype": "xfs", "label": "", "mountpoint": "/opt/test1", "name": "/dev/mapper/luks-eb5a5901-32b6-427f-885e-6ee3011725ec", "size": "4G", "type": "crypt", "uuid": "3787e9fb-e091-45d9-9a5a-737d63dc8d5c" }, "/dev/sda": { "fstype": "LVM2_member", "label": "", "mountpoint": "", "name": "/dev/sda", "size": "10G", "type": "disk", "uuid": "30b7yk-4Y3c-DHs7-HP7U-2G7i-eOL6-HqzhXx" }, "/dev/sdb": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdb", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdc": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdc", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdd": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdd", "size": "1T", "type": "disk", "uuid": "" }, "/dev/sde": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sde", "size": "1T", "type": "disk", "uuid": "" }, "/dev/sdf": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdf", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdg": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdg", "size": "1T", "type": "disk", "uuid": "" }, "/dev/sdh": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdh", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdi": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdi", "size": "10G", "type": "disk", "uuid": "" }, 
"/dev/xvda": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/xvda", "size": "250G", "type": "disk", "uuid": "" }, "/dev/xvda1": { "fstype": "ext4", "label": "", "mountpoint": "/", "name": "/dev/xvda1", "size": "250G", "type": "partition", "uuid": "c7b7d6a5-fd01-4b9b-bcca-153eaff9d312" } } } TASK [Read the /etc/fstab file for volume existence] *************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:20 Friday 17 January 2025 04:17:24 -0500 (0:00:00.341) 0:07:17.172 ******** ok: [managed-node1] => { "changed": false, "cmd": [ "cat", "/etc/fstab" ], "delta": "0:00:00.003443", "end": "2025-01-17 04:17:24.729699", "rc": 0, "start": "2025-01-17 04:17:24.726256" } STDOUT: # system_role:storage # # /etc/fstab # Created by anaconda on Thu Jun 20 10:23:46 2024 # # Accessible filesystems, by reference, are maintained under '/dev/disk' # See man pages fstab(5), findfs(8), mount(8) and/or blkid(8) for more info # UUID=c7b7d6a5-fd01-4b9b-bcca-153eaff9d312 / ext4 defaults 1 1 ntap-bos-c01-eng01-nfs01b.storage.bos.redhat.com:/devops_engineering_nfs/devarchive/redhat /mnt/redhat nfs ro,rsize=32768,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0 nest.test.redhat.com:/mnt/qa /mnt/qa nfs defaults,rsize=8192,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0 vtap-eng01.storage.rdu2.redhat.com:/vol/engarchive /mnt/engarchive nfs ro,rsize=32768,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0 nest.test.redhat.com:/mnt/tpsdist /mnt/tpsdist nfs defaults,rsize=8192,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0 ntap-bos-c01-eng01-nfs01b.storage.bos.redhat.com:/devops_engineering_nfs/devarchive/redhat/brewroot /mnt/brew nfs ro,rsize=32768,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0 ntap-bos-c01-eng01-nfs01b.storage.bos.redhat.com:/devops_brew_scratch_nfs/scratch /mnt/brew_scratch nfs ro,rsize=32768,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0 /dev/mapper/luks-eb5a5901-32b6-427f-885e-6ee3011725ec /opt/test1 xfs defaults 0 0 TASK [Read the /etc/crypttab file] ********************************************* task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:25 Friday 17 January 2025 04:17:24 -0500 (0:00:00.320) 0:07:17.492 ******** ok: [managed-node1] => { "changed": false, "cmd": [ "cat", "/etc/crypttab" ], "delta": "0:00:00.002752", "end": "2025-01-17 04:17:25.055518", "failed_when_result": false, "rc": 0, "start": "2025-01-17 04:17:25.052766" } STDOUT: luks-eb5a5901-32b6-427f-885e-6ee3011725ec /dev/mapper/foo-test1 - TASK [Verify the volumes listed in storage_pools were correctly managed] ******* task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:34 Friday 17 January 2025 04:17:25 -0500 (0:00:00.358) 0:07:17.850 ******** included: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool.yml for managed-node1 TASK [Set _storage_pool_tests] ************************************************* task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool.yml:5 Friday 17 January 2025 04:17:25 -0500 (0:00:00.104) 0:07:17.955 ******** ok: [managed-node1] => { "ansible_facts": { "_storage_pool_tests": [ "members", "volumes" ] }, "changed": false } TASK [Get VG shared value status] ********************************************** task path: 
/tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool.yml:18 Friday 17 January 2025 04:17:25 -0500 (0:00:00.051) 0:07:18.007 ******** ok: [managed-node1] => { "changed": false, "cmd": [ "vgs", "--noheadings", "--binary", "-o", "shared", "foo" ], "delta": "0:00:00.018213", "end": "2025-01-17 04:17:25.593612", "rc": 0, "start": "2025-01-17 04:17:25.575399" } STDOUT: 0 TASK [Verify that VG shared value checks out] ********************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool.yml:24 Friday 17 January 2025 04:17:25 -0500 (0:00:00.362) 0:07:18.370 ******** ok: [managed-node1] => { "changed": false } MSG: All assertions passed TASK [Verify pool subset] ****************************************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool.yml:34 Friday 17 January 2025 04:17:25 -0500 (0:00:00.082) 0:07:18.453 ******** included: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml for managed-node1 included: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-volumes.yml for managed-node1 TASK [Set test variables] ****************************************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:2 Friday 17 January 2025 04:17:25 -0500 (0:00:00.122) 0:07:18.576 ******** ok: [managed-node1] => { "ansible_facts": { "_storage_test_expected_pv_count": "1", "_storage_test_pool_pvs_lvm": [ "/dev/sda" ] }, "changed": false } TASK [Get the canonical device path for each member device] ******************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:8 Friday 17 January 2025 04:17:25 -0500 (0:00:00.066) 0:07:18.642 ******** ok: [managed-node1] => (item=/dev/sda) => { "ansible_loop_var": "pv", "changed": false, "device": "/dev/sda", "pv": "/dev/sda" } TASK [Set pvs lvm length] ****************************************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:17 Friday 17 January 2025 04:17:26 -0500 (0:00:00.348) 0:07:18.991 ******** ok: [managed-node1] => { "ansible_facts": { "__pvs_lvm_len": "1" }, "changed": false } TASK [Set pool pvs] ************************************************************ task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:22 Friday 17 January 2025 04:17:26 -0500 (0:00:00.059) 0:07:19.051 ******** ok: [managed-node1] => { "ansible_facts": { "_storage_test_pool_pvs": [ "/dev/sda" ] }, "changed": false } TASK [Verify PV count] ********************************************************* task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:27 Friday 17 January 2025 04:17:26 -0500 (0:00:00.080) 0:07:19.132 ******** ok: [managed-node1] => { "changed": false } MSG: All assertions passed TASK [Set expected pv type] **************************************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:36 Friday 17 January 2025 04:17:26 -0500 (0:00:00.057) 0:07:19.189 ******** ok: [managed-node1] => { "ansible_facts": { 
"_storage_test_expected_pv_type": "disk" }, "changed": false } TASK [Set expected pv type] **************************************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:41 Friday 17 January 2025 04:17:26 -0500 (0:00:00.060) 0:07:19.250 ******** ok: [managed-node1] => { "ansible_facts": { "_storage_test_expected_pv_type": "disk" }, "changed": false } TASK [Set expected pv type] **************************************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:46 Friday 17 January 2025 04:17:26 -0500 (0:00:00.046) 0:07:19.296 ******** skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check the type of each PV] *********************************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:51 Friday 17 January 2025 04:17:26 -0500 (0:00:00.038) 0:07:19.335 ******** ok: [managed-node1] => (item=/dev/sda) => { "ansible_loop_var": "pv", "changed": false, "pv": "/dev/sda" } MSG: All assertions passed TASK [Check that blivet supports PV grow to fill] ****************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:64 Friday 17 January 2025 04:17:26 -0500 (0:00:00.064) 0:07:19.399 ******** ok: [managed-node1] => { "changed": false, "rc": 0 } STDOUT: False STDERR: Shared connection to 10.31.46.65 closed. TASK [Verify that PVs fill the whole devices when they should] ***************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:73 Friday 17 January 2025 04:17:26 -0500 (0:00:00.280) 0:07:19.680 ******** skipping: [managed-node1] => (item=/dev/sda) => { "ansible_loop_var": "st_pool_pv", "changed": false, "skip_reason": "Conditional result was False", "st_pool_pv": "/dev/sda" } TASK [Check MD RAID] *********************************************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:83 Friday 17 January 2025 04:17:27 -0500 (0:00:00.051) 0:07:19.731 ******** included: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml for managed-node1 TASK [Get information about RAID] ********************************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:8 Friday 17 January 2025 04:17:27 -0500 (0:00:00.139) 0:07:19.871 ******** skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set active devices regex] ************************************************ task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:14 Friday 17 January 2025 04:17:27 -0500 (0:00:00.040) 0:07:19.911 ******** skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set spare devices regex] ************************************************* task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:19 Friday 17 January 2025 04:17:27 -0500 (0:00:00.041) 0:07:19.952 ******** skipping: [managed-node1] => { "changed": false, "skip_reason": 
"Conditional result was False" } TASK [Set md version regex] **************************************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:24 Friday 17 January 2025 04:17:27 -0500 (0:00:00.038) 0:07:19.991 ******** skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set md chunk size regex] ************************************************* task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:29 Friday 17 January 2025 04:17:27 -0500 (0:00:00.038) 0:07:20.029 ******** skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Parse the chunk size] **************************************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:37 Friday 17 January 2025 04:17:27 -0500 (0:00:00.038) 0:07:20.068 ******** skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check RAID active devices count] ***************************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:46 Friday 17 January 2025 04:17:27 -0500 (0:00:00.041) 0:07:20.110 ******** skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check RAID spare devices count] ****************************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:55 Friday 17 January 2025 04:17:27 -0500 (0:00:00.049) 0:07:20.159 ******** skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check RAID metadata version] ********************************************* task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:64 Friday 17 January 2025 04:17:27 -0500 (0:00:00.055) 0:07:20.215 ******** skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check RAID chunk size] *************************************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:74 Friday 17 January 2025 04:17:27 -0500 (0:00:00.063) 0:07:20.279 ******** skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Reset variables used by tests] ******************************************* task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:83 Friday 17 January 2025 04:17:27 -0500 (0:00:00.059) 0:07:20.339 ******** ok: [managed-node1] => { "ansible_facts": { "storage_test_md_active_devices_re": null, "storage_test_md_chunk_size_re": null, "storage_test_md_metadata_version_re": null, "storage_test_md_spare_devices_re": null }, "changed": false } TASK [Check LVM RAID] ********************************************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:86 Friday 17 January 2025 04:17:27 -0500 (0:00:00.063) 0:07:20.403 ******** included: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-lvmraid.yml for managed-node1 TASK [Validate pool member LVM RAID settings] 
********************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-lvmraid.yml:2 Friday 17 January 2025 04:17:27 -0500 (0:00:00.125) 0:07:20.528 ******** included: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-lvmraid.yml for managed-node1 TASK [Get information about the LV] ******************************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-lvmraid.yml:8 Friday 17 January 2025 04:17:27 -0500 (0:00:00.104) 0:07:20.633 ******** skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set LV segment type] ***************************************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-lvmraid.yml:16 Friday 17 January 2025 04:17:27 -0500 (0:00:00.051) 0:07:20.684 ******** skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check segment type] ****************************************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-lvmraid.yml:20 Friday 17 January 2025 04:17:28 -0500 (0:00:00.047) 0:07:20.731 ******** skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set LV stripe size] ****************************************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-lvmraid.yml:27 Friday 17 January 2025 04:17:28 -0500 (0:00:00.042) 0:07:20.773 ******** skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Parse the requested stripe size] ***************************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-lvmraid.yml:31 Friday 17 January 2025 04:17:28 -0500 (0:00:00.039) 0:07:20.813 ******** skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set expected stripe size] ************************************************ task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-lvmraid.yml:37 Friday 17 January 2025 04:17:28 -0500 (0:00:00.039) 0:07:20.852 ******** skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check stripe size] ******************************************************* task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-lvmraid.yml:42 Friday 17 January 2025 04:17:28 -0500 (0:00:00.039) 0:07:20.892 ******** skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check Thin Pools] ******************************************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:89 Friday 17 January 2025 04:17:28 -0500 (0:00:00.050) 0:07:20.942 ******** included: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-thin.yml for managed-node1 TASK [Validate pool member thinpool settings] ********************************** task path: 
/tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-thin.yml:2 Friday 17 January 2025 04:17:28 -0500 (0:00:00.097) 0:07:21.040 ******** included: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-thin.yml for managed-node1 TASK [Get information about thinpool] ****************************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-thin.yml:8 Friday 17 January 2025 04:17:28 -0500 (0:00:00.081) 0:07:21.121 ******** skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check that volume is in correct thinpool (when thinp name is provided)] *** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-thin.yml:16 Friday 17 January 2025 04:17:28 -0500 (0:00:00.038) 0:07:21.159 ******** skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check that volume is in thinpool (when thinp name is not provided)] ****** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-thin.yml:22 Friday 17 January 2025 04:17:28 -0500 (0:00:00.044) 0:07:21.204 ******** skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Reset variable used by test] ********************************************* task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-thin.yml:26 Friday 17 January 2025 04:17:28 -0500 (0:00:00.043) 0:07:21.247 ******** ok: [managed-node1] => { "ansible_facts": { "storage_test_thin_status": null }, "changed": false } TASK [Check member encryption] ************************************************* task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:92 Friday 17 January 2025 04:17:28 -0500 (0:00:00.037) 0:07:21.285 ******** included: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-encryption.yml for managed-node1 TASK [Set test variables] ****************************************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-encryption.yml:5 Friday 17 January 2025 04:17:28 -0500 (0:00:00.088) 0:07:21.373 ******** ok: [managed-node1] => { "ansible_facts": { "_storage_test_expected_crypttab_entries": "0", "_storage_test_expected_crypttab_key_file": "-" }, "changed": false } TASK [Validate pool member LUKS settings] ************************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-encryption.yml:10 Friday 17 January 2025 04:17:28 -0500 (0:00:00.045) 0:07:21.419 ******** skipping: [managed-node1] => (item=/dev/sda) => { "_storage_test_pool_member_path": "/dev/sda", "ansible_loop_var": "_storage_test_pool_member_path", "changed": false, "skip_reason": "Conditional result was False" } TASK [Validate pool member crypttab entries] *********************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-encryption.yml:17 Friday 17 January 2025 04:17:28 -0500 (0:00:00.044) 0:07:21.463 ******** included: 
/tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-crypttab.yml for managed-node1 TASK [Set variables used by tests] ********************************************* task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-crypttab.yml:2 Friday 17 January 2025 04:17:28 -0500 (0:00:00.076) 0:07:21.540 ******** ok: [managed-node1] => { "ansible_facts": { "_storage_test_crypttab_entries": [] }, "changed": false } TASK [Check for /etc/crypttab entry] ******************************************* task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-crypttab.yml:6 Friday 17 January 2025 04:17:28 -0500 (0:00:00.044) 0:07:21.584 ******** ok: [managed-node1] => { "changed": false } MSG: All assertions passed TASK [Validate the format of the crypttab entry] ******************************* task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-crypttab.yml:14 Friday 17 January 2025 04:17:28 -0500 (0:00:00.047) 0:07:21.632 ******** skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check backing device of crypttab entry] ********************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-crypttab.yml:23 Friday 17 January 2025 04:17:28 -0500 (0:00:00.038) 0:07:21.670 ******** skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check key file of crypttab entry] **************************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-crypttab.yml:32 Friday 17 January 2025 04:17:29 -0500 (0:00:00.037) 0:07:21.708 ******** skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Clear test variables] **************************************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-crypttab.yml:41 Friday 17 January 2025 04:17:29 -0500 (0:00:00.038) 0:07:21.746 ******** ok: [managed-node1] => { "ansible_facts": { "_storage_test_crypttab_entries": null }, "changed": false } TASK [Clear test variables] **************************************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-encryption.yml:24 Friday 17 January 2025 04:17:29 -0500 (0:00:00.042) 0:07:21.788 ******** ok: [managed-node1] => { "ansible_facts": { "_storage_test_crypttab_entries": null, "_storage_test_crypttab_key_file": null }, "changed": false } TASK [Check VDO] *************************************************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:95 Friday 17 January 2025 04:17:29 -0500 (0:00:00.039) 0:07:21.828 ******** included: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-vdo.yml for managed-node1 TASK [Validate pool member VDO settings] *************************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-vdo.yml:2 Friday 17 January 2025 04:17:29 -0500 (0:00:00.104) 0:07:21.933 ******** included: 
/tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-vdo.yml for managed-node1 TASK [Get information about VDO deduplication] ********************************* task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-vdo.yml:8 Friday 17 January 2025 04:17:29 -0500 (0:00:00.111) 0:07:22.044 ******** skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check if VDO deduplication is off] *************************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-vdo.yml:15 Friday 17 January 2025 04:17:29 -0500 (0:00:00.048) 0:07:22.092 ******** skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check if VDO deduplication is on] **************************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-vdo.yml:21 Friday 17 January 2025 04:17:29 -0500 (0:00:00.050) 0:07:22.142 ******** skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Get information about VDO compression] *********************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-vdo.yml:27 Friday 17 January 2025 04:17:29 -0500 (0:00:00.038) 0:07:22.181 ******** skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check if VDO deduplication is off] *************************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-vdo.yml:34 Friday 17 January 2025 04:17:29 -0500 (0:00:00.037) 0:07:22.219 ******** skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check if VDO deduplication is on] **************************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-vdo.yml:40 Friday 17 January 2025 04:17:29 -0500 (0:00:00.058) 0:07:22.277 ******** skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Clear test variables] **************************************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-vdo.yml:46 Friday 17 January 2025 04:17:29 -0500 (0:00:00.052) 0:07:22.330 ******** ok: [managed-node1] => { "ansible_facts": { "storage_test_vdo_status": null }, "changed": false } TASK [Check Stratis] *********************************************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:98 Friday 17 January 2025 04:17:29 -0500 (0:00:00.054) 0:07:22.385 ******** included: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-stratis.yml for managed-node1 TASK [Run 'stratis report'] **************************************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-stratis.yml:6 Friday 17 January 2025 04:17:29 -0500 (0:00:00.120) 0:07:22.506 ******** skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Get information about Stratis] 
******************************************* task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-stratis.yml:11 Friday 17 January 2025 04:17:29 -0500 (0:00:00.111) 0:07:22.617 ******** skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify that the pools was created] *************************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-stratis.yml:15 Friday 17 January 2025 04:17:29 -0500 (0:00:00.039) 0:07:22.657 ******** skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify that encryption is correctly set] ********************************* task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-stratis.yml:25 Friday 17 January 2025 04:17:30 -0500 (0:00:00.048) 0:07:22.706 ******** skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify that Clevis/Tang encryption is correctly set] ********************* task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-stratis.yml:34 Friday 17 January 2025 04:17:30 -0500 (0:00:00.058) 0:07:22.764 ******** skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Reset variable used by test] ********************************************* task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-stratis.yml:44 Friday 17 January 2025 04:17:30 -0500 (0:00:00.057) 0:07:22.822 ******** ok: [managed-node1] => { "ansible_facts": { "storage_test_stratis_report": null }, "changed": false } TASK [Clean up test variables] ************************************************* task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:101 Friday 17 January 2025 04:17:30 -0500 (0:00:00.057) 0:07:22.879 ******** ok: [managed-node1] => { "ansible_facts": { "__pvs_lvm_len": null, "_storage_test_expected_pv_count": null, "_storage_test_expected_pv_type": null, "_storage_test_pool_pvs": [], "_storage_test_pool_pvs_lvm": [] }, "changed": false } TASK [Verify the volumes] ****************************************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-volumes.yml:3 Friday 17 January 2025 04:17:30 -0500 (0:00:00.062) 0:07:22.941 ******** included: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume.yml for managed-node1 TASK [Set storage volume test variables] *************************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume.yml:2 Friday 17 January 2025 04:17:30 -0500 (0:00:00.115) 0:07:23.057 ******** ok: [managed-node1] => { "ansible_facts": { "_storage_test_volume_present": true, "_storage_volume_tests": [ "mount", "fstab", "fs", "device", "encryption", "md", "size", "cache" ] }, "changed": false } TASK [Run test verify for {{ storage_test_volume_subset }}] ******************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume.yml:19 Friday 17 January 2025 04:17:30 -0500 (0:00:00.069) 0:07:23.127 ******** included: 
/tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml for managed-node1 included: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml for managed-node1 included: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fs.yml for managed-node1 included: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml for managed-node1 included: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml for managed-node1 included: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml for managed-node1 included: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml for managed-node1 included: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml for managed-node1 TASK [Get expected mount device based on device type] ************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:7 Friday 17 January 2025 04:17:30 -0500 (0:00:00.310) 0:07:23.438 ******** ok: [managed-node1] => { "ansible_facts": { "storage_test_device_path": "/dev/mapper/luks-eb5a5901-32b6-427f-885e-6ee3011725ec" }, "changed": false } TASK [Set some facts] ********************************************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:11 Friday 17 January 2025 04:17:30 -0500 (0:00:00.072) 0:07:23.510 ******** ok: [managed-node1] => { "ansible_facts": { "storage_test_mount_expected_mount_point": "/opt/test1", "storage_test_swap_expected_matches": "0" }, "changed": false } TASK [Get information about the mountpoint directory] ************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:19 Friday 17 January 2025 04:17:30 -0500 (0:00:00.074) 0:07:23.585 ******** skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify the current mount state by device] ******************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:28 Friday 17 January 2025 04:17:30 -0500 (0:00:00.060) 0:07:23.645 ******** ok: [managed-node1] => { "changed": false } MSG: All assertions passed TASK [Verify mount directory user] ********************************************* task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:36 Friday 17 January 2025 04:17:31 -0500 (0:00:00.069) 0:07:23.714 ******** skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify mount directory group] ******************************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:42 Friday 17 January 2025 04:17:31 -0500 (0:00:00.057) 0:07:23.772 ******** skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify mount directory permissions] ************************************** task path: 
/tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:48 Friday 17 January 2025 04:17:31 -0500 (0:00:00.054) 0:07:23.827 ******** skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Get path of test volume device] ****************************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:57 Friday 17 January 2025 04:17:31 -0500 (0:00:00.054) 0:07:23.881 ******** skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Gather swap info] ******************************************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:63 Friday 17 January 2025 04:17:31 -0500 (0:00:00.044) 0:07:23.926 ******** skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify swap status] ****************************************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:69 Friday 17 January 2025 04:17:31 -0500 (0:00:00.047) 0:07:23.973 ******** skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Unset facts] ************************************************************* task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:79 Friday 17 January 2025 04:17:31 -0500 (0:00:00.039) 0:07:24.013 ******** ok: [managed-node1] => { "ansible_facts": { "storage_test_found_mount_stat": null, "storage_test_mount_expected_mount_point": null, "storage_test_swap_expected_matches": null, "storage_test_swaps": null, "storage_test_sys_node": null }, "changed": false } TASK [Set some variables for fstab checking] *********************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:2 Friday 17 January 2025 04:17:31 -0500 (0:00:00.045) 0:07:24.059 ******** ok: [managed-node1] => { "ansible_facts": { "storage_test_fstab_expected_id_matches": "1", "storage_test_fstab_expected_mount_options_matches": "1", "storage_test_fstab_expected_mount_point_matches": "1", "storage_test_fstab_id_matches": [ "/dev/mapper/luks-eb5a5901-32b6-427f-885e-6ee3011725ec " ], "storage_test_fstab_mount_options_matches": [ " /opt/test1 xfs defaults " ], "storage_test_fstab_mount_point_matches": [ " /opt/test1 " ] }, "changed": false } TASK [Verify that the device identifier appears in /etc/fstab] ***************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:17 Friday 17 January 2025 04:17:31 -0500 (0:00:00.096) 0:07:24.155 ******** ok: [managed-node1] => { "changed": false } MSG: All assertions passed TASK [Verify the fstab mount point] ******************************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:24 Friday 17 January 2025 04:17:31 -0500 (0:00:00.075) 0:07:24.230 ******** ok: [managed-node1] => { "changed": false } MSG: All assertions passed TASK [Verify mount_options] **************************************************** task path: 
TASK [Verify mount_options] ****************************************************
task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:33
Friday 17 January 2025 04:17:31 -0500 (0:00:00.069) 0:07:24.300 ********
skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Verify fingerprint] ******************************************************
task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:45
Friday 17 January 2025 04:17:31 -0500 (0:00:00.134) 0:07:24.434 ********
ok: [managed-node1] => { "changed": false }
MSG: All assertions passed

TASK [Clean up variables] ******************************************************
task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:52
Friday 17 January 2025 04:17:31 -0500 (0:00:00.117) 0:07:24.552 ********
ok: [managed-node1] => { "ansible_facts": { "storage_test_fstab_expected_id_matches": null, "storage_test_fstab_expected_mount_options_matches": null, "storage_test_fstab_expected_mount_point_matches": null, "storage_test_fstab_id_matches": null, "storage_test_fstab_mount_options_matches": null, "storage_test_fstab_mount_point_matches": null }, "changed": false }

TASK [Verify fs type] **********************************************************
task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fs.yml:6
Friday 17 January 2025 04:17:31 -0500 (0:00:00.064) 0:07:24.616 ********
ok: [managed-node1] => { "changed": false }
MSG: All assertions passed

TASK [Verify fs label] *********************************************************
task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fs.yml:14
Friday 17 January 2025 04:17:32 -0500 (0:00:00.096) 0:07:24.712 ********
ok: [managed-node1] => { "changed": false }
MSG: All assertions passed

TASK [See whether the device node is present] **********************************
task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:3
Friday 17 January 2025 04:17:32 -0500 (0:00:00.084) 0:07:24.797 ********
ok: [managed-node1] => { "changed": false, "stat": { "atime": 1737105418.5390928, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 0, "charset": "binary", "ctime": 1737105396.3190718, "dev": 5, "device_type": 64768, "executable": false, "exists": true, "gid": 6, "gr_name": "disk", "inode": 100994, "isblk": true, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": false, "issock": false, "isuid": false, "mimetype": "inode/symlink", "mode": "0660", "mtime": 1737105396.3190718, "nlink": 1, "path": "/dev/mapper/foo-test1", "pw_name": "root", "readable": true, "rgrp": true, "roth": false, "rusr": true, "size": 0, "uid": 0, "version": null, "wgrp": true, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } }

TASK [Verify the presence/absence of the device node] **************************
task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:9
Friday 17 January 2025 04:17:32 -0500 (0:00:00.477) 0:07:25.274 ********
ok: [managed-node1] => { "changed": false }
MSG: All assertions passed

TASK [Verify the presence/absence of the device node] **************************
task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:16
Friday 17 January 2025 04:17:32 -0500 (0:00:00.103) 0:07:25.377 ********
skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Make sure we got info about this volume] *********************************
task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:23
Friday 17 January 2025 04:17:32 -0500 (0:00:00.066) 0:07:25.443 ********
ok: [managed-node1] => { "changed": false }
MSG: All assertions passed

TASK [Process volume type (set initial value) (1/2)] ***************************
task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:29
Friday 17 January 2025 04:17:32 -0500 (0:00:00.098) 0:07:25.542 ********
ok: [managed-node1] => { "ansible_facts": { "st_volume_type": "lvm" }, "changed": false }

TASK [Process volume type (get RAID value) (2/2)] ******************************
task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:33
Friday 17 January 2025 04:17:32 -0500 (0:00:00.081) 0:07:25.623 ********
skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Verify the volume's device type] *****************************************
task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:38
Friday 17 January 2025 04:17:33 -0500 (0:00:00.097) 0:07:25.721 ********
ok: [managed-node1] => { "changed": false }
MSG: All assertions passed

TASK [Stat the LUKS device, if encrypted] **************************************
task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:3
Friday 17 January 2025 04:17:33 -0500 (0:00:00.089) 0:07:25.810 ********
ok: [managed-node1] => { "changed": false, "stat": { "atime": 1737105396.429072, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 0, "charset": "binary", "ctime": 1737105396.429072, "dev": 5, "device_type": 64769, "executable": false, "exists": true, "gid": 6, "gr_name": "disk", "inode": 101064, "isblk": true, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": false, "issock": false, "isuid": false, "mimetype": "inode/symlink", "mode": "0660", "mtime": 1737105396.429072, "nlink": 1, "path": "/dev/mapper/luks-eb5a5901-32b6-427f-885e-6ee3011725ec", "pw_name": "root", "readable": true, "rgrp": true, "roth": false, "rusr": true, "size": 0, "uid": 0, "version": null, "wgrp": true, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } }

TASK [Ensure cryptsetup is present] ********************************************
task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:10
Friday 17 January 2025 04:17:33 -0500 (0:00:00.423) 0:07:26.233 ********
ok: [managed-node1] => { "changed": false, "rc": 0, "results": [ "cryptsetup-2.0.3-6.el7.x86_64 providing cryptsetup is already installed" ] }

TASK [Collect LUKS info for this volume] ***************************************
task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:16
Friday 17 January 2025 04:17:34 -0500 (0:00:00.949) 0:07:27.182 ********
ok: [managed-node1] => { "changed": false, "cmd": [ "cryptsetup", "luksDump", "/dev/mapper/foo-test1" ], "delta": "0:00:00.026997", "end": "2025-01-17 04:17:34.856260", "rc": 0, "start": "2025-01-17 04:17:34.829263" }

STDOUT:

LUKS header information
Version: 2
Epoch: 3
Metadata area: 12288 bytes
UUID: eb5a5901-32b6-427f-885e-6ee3011725ec
Label: (no label)
Subsystem: (no subsystem)
Flags: (no flags)

Data segments:
  0: crypt
     offset: 4194304 [bytes]
     length: (whole device)
     cipher: aes-xts-plain64
     sector: 512 [bytes]

Keyslots:
  0: luks2
     Key: 512 bits
     Priority: normal
     Cipher: aes-xts-plain64
     PBKDF: argon2i
     Time cost: 4
     Memory: 667697
     Threads: 2
     Salt: 6f 9a 3e 9e e2 0e 46 09 0b ee 55 de 2a c2 30 3d a3 72 ce ce ba f4 8f f2 ae a5 4b 29 89 7d 31 8c
     AF stripes: 4000
     Area offset:32768 [bytes]
     Area length:258048 [bytes]
     Digest ID: 0
Tokens:
Digests:
  0: pbkdf2
     Hash: sha256
     Iterations: 23141
     Salt: 19 79 c4 14 3b ce 5b bf 2d ae cd bf 0a 2c 95 77 51 24 2d fd 1a e4 0f 39 25 31 bf 94 11 34 3e 49
     Digest: e1 b4 3c 15 00 ed 6e 18 67 dc 86 c7 a5 e7 10 8d 14 b0 93 e4 06 27 e1 9e 9b 78 64 e2 5f 21 0d 5d

TASK [Verify the presence/absence of the LUKS device node] *********************
task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:22
Friday 17 January 2025 04:17:34 -0500 (0:00:00.477) 0:07:27.660 ********
ok: [managed-node1] => { "changed": false }
MSG: All assertions passed

TASK [Verify that the raw device is the same as the device if not encrypted] ***
task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:29
Friday 17 January 2025 04:17:35 -0500 (0:00:00.124) 0:07:27.785 ********
ok: [managed-node1] => { "changed": false }
MSG: All assertions passed

TASK [Make sure we got info about the LUKS volume if encrypted] ****************
task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:40
Friday 17 January 2025 04:17:35 -0500 (0:00:00.092) 0:07:27.878 ********
ok: [managed-node1] => { "changed": false }
MSG: All assertions passed

TASK [Verify the LUKS volume's device type if encrypted] ***********************
task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:46
Friday 17 January 2025 04:17:35 -0500 (0:00:00.076) 0:07:27.954 ********
ok: [managed-node1] => { "changed": false }
MSG: All assertions passed

TASK [Check LUKS version] ******************************************************
task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:51
Friday 17 January 2025 04:17:35 -0500 (0:00:00.070) 0:07:28.025 ********
ok: [managed-node1] => { "changed": false }
MSG: All assertions passed

TASK [Check LUKS key size] *****************************************************
task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:63
Friday 17 January 2025 04:17:35 -0500 (0:00:00.148) 0:07:28.174 ********
skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" }
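The dump above is what the header checks that follow parse out of cryptsetup: "Version: 2" confirms the volume really carries a LUKS2 header, and the keyslot shows the aes-xts-plain64 cipher with a 512-bit key and the argon2i PBKDF. A rough standalone sketch of such a check (hypothetical task names, not the actual tasks from test-verify-volume-encryption.yml):

    - name: Collect LUKS info for the volume
      command: cryptsetup luksDump /dev/mapper/foo-test1
      register: luks_dump
      changed_when: false

    - name: Assert the header is LUKS2 with the expected cipher
      assert:
        that:
          # "Version: 2" only appears in LUKS2 headers; LUKS1 prints "Version: 1"
          - luks_dump.stdout is search('Version:\s+2')
          - luks_dump.stdout is search('aes-xts-plain64')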
TASK [Check LUKS cipher] *******************************************************
task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:75
Friday 17 January 2025 04:17:35 -0500 (0:00:00.064) 0:07:28.239 ********
skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Set test variables] ******************************************************
task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:87
Friday 17 January 2025 04:17:35 -0500 (0:00:00.099) 0:07:28.338 ********
ok: [managed-node1] => { "ansible_facts": { "_storage_test_crypttab_entries": [ "luks-eb5a5901-32b6-427f-885e-6ee3011725ec /dev/mapper/foo-test1 -" ], "_storage_test_expected_crypttab_entries": "1", "_storage_test_expected_crypttab_key_file": "-" }, "changed": false }

TASK [Check for /etc/crypttab entry] *******************************************
task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:93
Friday 17 January 2025 04:17:35 -0500 (0:00:00.085) 0:07:28.424 ********
ok: [managed-node1] => { "changed": false }
MSG: All assertions passed

TASK [Validate the format of the crypttab entry] *******************************
task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:100
Friday 17 January 2025 04:17:35 -0500 (0:00:00.069) 0:07:28.493 ********
ok: [managed-node1] => { "changed": false }
MSG: All assertions passed

TASK [Check backing device of crypttab entry] **********************************
task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:108
Friday 17 January 2025 04:17:35 -0500 (0:00:00.075) 0:07:28.569 ********
ok: [managed-node1] => { "changed": false }
MSG: All assertions passed

TASK [Check key file of crypttab entry] ****************************************
task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:116
Friday 17 January 2025 04:17:35 -0500 (0:00:00.078) 0:07:28.647 ********
ok: [managed-node1] => { "changed": false }
MSG: All assertions passed

TASK [Clear test variables] ****************************************************
task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:124
Friday 17 January 2025 04:17:36 -0500 (0:00:00.073) 0:07:28.721 ********
ok: [managed-node1] => { "ansible_facts": { "_storage_test_crypttab_entries": null, "_storage_test_expected_crypttab_entries": null, "_storage_test_expected_crypttab_key_file": null }, "changed": false }

TASK [Get information about RAID] **********************************************
task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:8
Friday 17 January 2025 04:17:36 -0500 (0:00:00.066) 0:07:28.788 ********
skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Set active devices regex] ************************************************
task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:14
Friday 17 January 2025 04:17:36 -0500 (0:00:00.068) 0:07:28.857 ********
skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" }
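The five crypttab checks above all operate on a single expected entry of the form "<name> <backing device> <key file>", where the "-" key file means the passphrase is prompted for rather than read from a file. A comparable standalone check might look like this (a sketch built from the values shown in the log, not the role's test code):

    - name: Read /etc/crypttab
      command: cat /etc/crypttab
      register: crypttab
      changed_when: false

    - name: Assert exactly one entry maps the LUKS name to the backing LV
      assert:
        that:
          # name, backing device, and key file are whitespace-separated fields
          - crypttab.stdout_lines | select('match', 'luks-eb5a5901-32b6-427f-885e-6ee3011725ec /dev/mapper/foo-test1 -') | list | length == 1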
TASK [Set spare devices regex] *************************************************
task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:19
Friday 17 January 2025 04:17:36 -0500 (0:00:00.081) 0:07:28.938 ********
skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Set md version regex] ****************************************************
task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:24
Friday 17 January 2025 04:17:36 -0500 (0:00:00.075) 0:07:29.014 ********
skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Set chunk size regex] ****************************************************
task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:29
Friday 17 January 2025 04:17:36 -0500 (0:00:00.065) 0:07:29.079 ********
skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Parse the chunk size] ****************************************************
task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:37
Friday 17 January 2025 04:17:36 -0500 (0:00:00.063) 0:07:29.142 ********
skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Check RAID active devices count] *****************************************
task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:46
Friday 17 January 2025 04:17:36 -0500 (0:00:00.057) 0:07:29.200 ********
skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Check RAID spare devices count] ******************************************
task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:54
Friday 17 January 2025 04:17:36 -0500 (0:00:00.061) 0:07:29.262 ********
skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Check RAID metadata version] *********************************************
task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:62
Friday 17 January 2025 04:17:36 -0500 (0:00:00.069) 0:07:29.331 ********
skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Check RAID chunk size] ***************************************************
task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:70
Friday 17 January 2025 04:17:36 -0500 (0:00:00.070) 0:07:29.402 ********
skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Parse the actual size of the volume] *************************************
task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:3
Friday 17 January 2025 04:17:36 -0500 (0:00:00.175) 0:07:29.578 ********
ok: [managed-node1] => { "bytes": 4294967296, "changed": false, "lvm": "4g", "parted": "4GiB", "size": "4 GiB" }

TASK [Parse the requested size of the volume] **********************************
task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:11
Friday 17 January 2025 04:17:37 -0500 (0:00:00.483) 0:07:30.061 ********
ok: [managed-node1] => { "bytes": 4294967296, "changed": false, "lvm": "4g", "parted": "4GiB", "size": "4 GiB" }
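Both "Parse ... size" tasks normalize a size (the requested "4g" and the actual LV size) to a plain byte count so the later assert can compare integers instead of strings. The tests use their own helper that also emits the lvm/parted spellings; with stock Ansible, the documented human_to_bytes filter gives the byte-count half of that (a sketch, assuming the filter's binary-unit default):

    - name: Compare requested and actual size in bytes
      assert:
        that:
          # human_to_bytes treats '4G' as GiB: 4 * 1024^3 = 4294967296
          - storage_test_actual_size.bytes == ('4G' | human_to_bytes)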
TASK [Establish base value for expected size] **********************************
task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:20
Friday 17 January 2025 04:17:37 -0500 (0:00:00.487) 0:07:30.549 ********
ok: [managed-node1] => { "ansible_facts": { "storage_test_expected_size": "4294967296" }, "changed": false }

TASK [Show expected size] ******************************************************
task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:28
Friday 17 January 2025 04:17:37 -0500 (0:00:00.121) 0:07:30.670 ********
ok: [managed-node1] => { "storage_test_expected_size": "4294967296" }

TASK [Get the size of parent/pool device] **************************************
task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:32
Friday 17 January 2025 04:17:38 -0500 (0:00:00.102) 0:07:30.773 ********
ok: [managed-node1] => { "bytes": 10737418240, "changed": false, "lvm": "10g", "parted": "10GiB", "size": "10 GiB" }

TASK [Show test pool] **********************************************************
task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:46
Friday 17 January 2025 04:17:38 -0500 (0:00:00.431) 0:07:31.204 ********
skipping: [managed-node1] => {}

TASK [Show test blockinfo] *****************************************************
task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:50
Friday 17 January 2025 04:17:38 -0500 (0:00:00.069) 0:07:31.273 ********
skipping: [managed-node1] => {}

TASK [Show test pool size] *****************************************************
task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:54
Friday 17 January 2025 04:17:38 -0500 (0:00:00.065) 0:07:31.339 ********
skipping: [managed-node1] => {}

TASK [Calculate the expected size based on pool size and percentage value] *****
task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:58
Friday 17 January 2025 04:17:38 -0500 (0:00:00.100) 0:07:31.440 ********
skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Default thin pool reserved space values] *********************************
task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:67
Friday 17 January 2025 04:17:38 -0500 (0:00:00.114) 0:07:31.555 ********
skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Default minimal thin pool reserved space size] ***************************
task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:71
Friday 17 January 2025 04:17:39 -0500 (0:00:00.161) 0:07:31.716 ********
skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Default maximal thin pool reserved space size] ***************************
task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:76
Friday 17 January 2025 04:17:39 -0500 (0:00:00.119) 0:07:31.835 ********
skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Calculate maximum usable space in thin pool] *****************************
task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:82
Friday 17 January 2025 04:17:39 -0500 (0:00:00.093) 0:07:31.929 ********
skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Apply upper size limit to max usable thin pool space] ********************
task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:86
Friday 17 January 2025 04:17:39 -0500 (0:00:00.074) 0:07:32.004 ********
skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Apply lower size limit to max usable thin pool space] ********************
task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:91
Friday 17 January 2025 04:17:39 -0500 (0:00:00.068) 0:07:32.072 ********
skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Convert maximum usable thin pool space from int to Size] *****************
task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:96
Friday 17 January 2025 04:17:39 -0500 (0:00:00.079) 0:07:32.152 ********
skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Show max thin pool size] *************************************************
task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:101
Friday 17 January 2025 04:17:39 -0500 (0:00:00.094) 0:07:32.246 ********
skipping: [managed-node1] => {}

TASK [Show volume thin pool size] **********************************************
task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:105
Friday 17 January 2025 04:17:39 -0500 (0:00:00.060) 0:07:32.306 ********
skipping: [managed-node1] => {}

TASK [Show test volume size] ***************************************************
task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:109
Friday 17 January 2025 04:17:39 -0500 (0:00:00.062) 0:07:32.369 ********
skipping: [managed-node1] => {}

TASK [Establish base value for expected thin pool size] ************************
task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:113
Friday 17 January 2025 04:17:39 -0500 (0:00:00.070) 0:07:32.440 ********
skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Calculate the expected size based on pool size and percentage value] *****
task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:120
Friday 17 January 2025 04:17:39 -0500 (0:00:00.082) 0:07:32.523 ********
skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Establish base value for expected thin pool volume size] *****************
task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:127
Friday 17 January 2025 04:17:39 -0500 (0:00:00.070) 0:07:32.593 ********
skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Calculate the expected thin pool volume size based on percentage value] ***
task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:131
Friday 17 January 2025 04:17:39 -0500 (0:00:00.056) 0:07:32.649 ********
skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Replace expected volume size with calculated value] **********************
task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:137
Friday 17 January 2025 04:17:40 -0500 (0:00:00.048) 0:07:32.698 ********
skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Show actual size] ********************************************************
task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:143
Friday 17 January 2025 04:17:40 -0500 (0:00:00.045) 0:07:32.743 ********
ok: [managed-node1] => { "storage_test_actual_size": { "bytes": 4294967296, "changed": false, "failed": false, "lvm": "4g", "parted": "4GiB", "size": "4 GiB" } }

TASK [Show expected size] ******************************************************
task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:147
Friday 17 January 2025 04:17:40 -0500 (0:00:00.056) 0:07:32.800 ********
ok: [managed-node1] => { "storage_test_expected_size": "4294967296" }

TASK [Assert expected size is actual size] *************************************
task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:151
Friday 17 January 2025 04:17:40 -0500 (0:00:00.077) 0:07:32.877 ********
ok: [managed-node1] => { "changed": false }
MSG: All assertions passed

TASK [Get information about the LV] ********************************************
task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:5
Friday 17 January 2025 04:17:40 -0500 (0:00:00.090) 0:07:32.968 ********
ok: [managed-node1] => { "changed": false, "cmd": [ "lvs", "--noheadings", "--nameprefixes", "--units=b", "--nosuffix", "--unquoted", "-o", "name,attr,cache_total_blocks,chunk_size,segtype", "foo/test1" ], "delta": "0:00:00.018486", "end": "2025-01-17 04:17:40.679236", "rc": 0, "start": "2025-01-17 04:17:40.660750" }

STDOUT:

LVM2_LV_NAME=test1 LVM2_LV_ATTR=-wi-ao---- LVM2_CACHE_TOTAL_BLOCKS= LVM2_CHUNK_SIZE=0 LVM2_SEGTYPE=linear

TASK [Set LV segment type] *****************************************************
task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:13
Friday 17 January 2025 04:17:40 -0500 (0:00:00.513) 0:07:33.481 ********
ok: [managed-node1] => { "ansible_facts": { "storage_test_lv_segtype": [ "linear" ] }, "changed": false }

TASK [Check segment type] ******************************************************
task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:17
Friday 17 January 2025 04:17:40 -0500 (0:00:00.094) 0:07:33.576 ********
ok: [managed-node1] => { "changed": false }
MSG: All assertions passed
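Because lvs is called with --noheadings --nameprefixes --unquoted, its whole report arrives as one line of LVM2_* KEY=value pairs, which is what "Set LV segment type" then plucks apart. A standalone sketch of that parse (hypothetical fact and register names):

    - name: Query the segment type of foo/test1
      command: lvs --noheadings --nameprefixes --unquoted -o segtype foo/test1
      register: lv_info
      changed_when: false

    - name: Pull LVM2_SEGTYPE=... out of the report
      set_fact:
        # regex_search with a capture-group argument returns the captured values as a list
        lv_segtype: "{{ lv_info.stdout | regex_search('LVM2_SEGTYPE=(\\S+)', '\\1') | first }}"

    - name: Check segment type
      assert:
        that:
          - lv_segtype == 'linear'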
TASK [Set LV cache size] *******************************************************
task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:24
Friday 17 January 2025 04:17:40 -0500 (0:00:00.073) 0:07:33.650 ********
skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Parse the requested cache size] ******************************************
task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:31
Friday 17 January 2025 04:17:41 -0500 (0:00:00.083) 0:07:33.733 ********
skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Set expected cache size] *************************************************
task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:37
Friday 17 January 2025 04:17:41 -0500 (0:00:00.061) 0:07:33.795 ********
skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Check cache size] ********************************************************
task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:42
Friday 17 January 2025 04:17:41 -0500 (0:00:00.063) 0:07:33.858 ********
skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Clean up facts] **********************************************************
task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume.yml:25
Friday 17 January 2025 04:17:41 -0500 (0:00:00.067) 0:07:33.925 ********
ok: [managed-node1] => { "ansible_facts": { "_storage_test_volume_present": null }, "changed": false }

TASK [Verify the volumes with no pool were correctly managed] ******************
task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:44
Friday 17 January 2025 04:17:41 -0500 (0:00:00.059) 0:07:33.985 ********

TASK [Clean up variable namespace] *********************************************
task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:54
Friday 17 January 2025 04:17:41 -0500 (0:00:00.053) 0:07:34.038 ********
ok: [managed-node1] => { "ansible_facts": { "storage_test_blkinfo": null, "storage_test_crypttab": null, "storage_test_fstab": null }, "changed": false }

TASK [Create a file] ***********************************************************
task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/create-test-file.yml:12
Friday 17 January 2025 04:17:41 -0500 (0:00:00.082) 0:07:34.120 ********
changed: [managed-node1] => { "changed": true, "dest": "/opt/test1/quux", "gid": 0, "group": "root", "mode": "0644", "owner": "root", "secontext": "unconfined_u:object_r:unlabeled_t:s0", "size": 0, "state": "file", "uid": 0 }

TASK [Test for correct handling of safe_mode] **********************************
task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/tests_luks2.yml:436
Friday 17 January 2025 04:17:42 -0500 (0:00:00.624) 0:07:34.745 ********
included: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-failed.yml for managed-node1

TASK [Store global variable value copy] ****************************************
task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-failed.yml:4
Friday 17 January 2025 04:17:42 -0500 (0:00:00.143) 0:07:34.888 ********
ok: [managed-node1] => { "ansible_facts": { "storage_pools_global": [], "storage_safe_mode_global": true, "storage_volumes_global": [] }, "changed": false }

TASK [Verify role raises correct error] ****************************************
task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-failed.yml:10
Friday 17 January 2025 04:17:42 -0500 (0:00:00.104) 0:07:35.066 ********

TASK [fedora.linux_system_roles.storage : Set platform/version specific variables] ***
task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:2
Friday 17 January 2025 04:17:42 -0500 (0:00:00.111) 0:07:35.177 ********
included: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml for managed-node1

TASK [fedora.linux_system_roles.storage : Ensure ansible_facts used by role] ***
task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:2
Friday 17 January 2025 04:17:42 -0500 (0:00:00.135) 0:07:35.313 ********
skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [fedora.linux_system_roles.storage : Set platform/version specific variables] ***
task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:7
Friday 17 January 2025 04:17:42 -0500 (0:00:00.203) 0:07:35.517 ********
skipping: [managed-node1] => (item=RedHat.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat.yml", "skip_reason": "Conditional result was False" }
skipping: [managed-node1] => (item=CentOS.yml) => { "ansible_loop_var": "item", "changed": false, "item": "CentOS.yml", "skip_reason": "Conditional result was False" }
ok: [managed-node1] => (item=CentOS_7.yml) => { "ansible_facts": { "__storage_blivet_diskvolume_mkfs_option_map": { "ext2": "-F", "ext3": "-F", "ext4": "-F" }, "blivet_package_list": [ "python-enum34", "python-blivet3", "libblockdev-crypto", "libblockdev-dm", "libblockdev-lvm", "libblockdev-mdraid", "libblockdev-swap", "{{ 'libblockdev-s390' if ansible_architecture == 's390x' else 'libblockdev' }}" ] }, "ansible_included_var_files": [ "/tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/vars/CentOS_7.yml" ], "ansible_loop_var": "item", "changed": false, "item": "CentOS_7.yml" }
skipping: [managed-node1] => (item=CentOS_7.9.yml) => { "ansible_loop_var": "item", "changed": false, "item": "CentOS_7.9.yml", "skip_reason": "Conditional result was False" }

TASK [fedora.linux_system_roles.storage : Check if system is ostree] ***********
task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:25
Friday 17 January 2025 04:17:42 -0500 (0:00:00.116) 0:07:35.633 ********
skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [fedora.linux_system_roles.storage : Set flag to indicate system is ostree] ***
task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:30
Friday 17 January 2025 04:17:43 -0500 (0:00:00.076) 0:07:35.710 ********
skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [fedora.linux_system_roles.storage : Define an empty list of pools to be used in testing] ***
task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:5
Friday 17 January 2025 04:17:43 -0500 (0:00:00.059) 0:07:35.770 ********
ok: [managed-node1] => { "ansible_facts": { "_storage_pools_list": [] }, "changed": false }

TASK [fedora.linux_system_roles.storage : Define an empty list of volumes to be used in testing] ***
task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:9
Friday 17 January 2025 04:17:43 -0500 (0:00:00.111) 0:07:35.881 ********
ok: [managed-node1] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false }

TASK [fedora.linux_system_roles.storage : Include the appropriate provider tasks] ***
task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:13
Friday 17 January 2025 04:17:43 -0500 (0:00:00.190) 0:07:36.072 ********
included: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml for managed-node1

TASK [fedora.linux_system_roles.storage : Make sure blivet is available] *******
task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:2
Friday 17 January 2025 04:17:47 -0500 (0:00:04.276) 0:07:40.349 ********
ok: [managed-node1] => { "changed": false, "rc": 0, "results": [ "python-enum34-1.0.4-1.el7.noarch providing python-enum34 is already installed", "1:python2-blivet3-3.1.3-3.el7.noarch providing python-blivet3 is already installed", "libblockdev-crypto-2.18-5.el7.x86_64 providing libblockdev-crypto is already installed", "libblockdev-dm-2.18-5.el7.x86_64 providing libblockdev-dm is already installed", "libblockdev-lvm-2.18-5.el7.x86_64 providing libblockdev-lvm is already installed", "libblockdev-mdraid-2.18-5.el7.x86_64 providing libblockdev-mdraid is already installed", "libblockdev-swap-2.18-5.el7.x86_64 providing libblockdev-swap is already installed", "libblockdev-2.18-5.el7.x86_64 providing libblockdev is already installed" ] }

TASK [fedora.linux_system_roles.storage : Show storage_pools] ******************
task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:9
Friday 17 January 2025 04:17:47 -0500 (0:00:00.103) 0:07:40.453 ********
ok: [managed-node1] => { "storage_pools": [ { "disks": [ "sda" ], "name": "foo", "type": "lvm", "volumes": [ { "encryption": false, "encryption_luks_version": "luks2", "encryption_password": "yabbadabbadoo", "mount_point": "/opt/test1", "name": "test1", "size": "4g" } ] } ] }

TASK [fedora.linux_system_roles.storage : Show storage_volumes] ****************
task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:14
Friday 17 January 2025 04:17:47 -0500 (0:00:00.123) 0:07:40.576 ********
ok: [managed-node1] => { "storage_volumes": [] }

TASK [fedora.linux_system_roles.storage : Get required packages] ***************
task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:19
Friday 17 January 2025 04:17:52 -0500 (0:00:04.661) 0:07:45.238 ********
ok: [managed-node1] => { "actions": [], "changed": false, "crypts": [], "leaves": [], "mounts": [], "packages": [ "lvm2" ], "pools": [], "volumes": [] }

TASK [fedora.linux_system_roles.storage : Enable copr repositories if needed] ***
task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:31
Friday 17 January 2025 04:17:52 -0500 (0:00:00.200) 0:07:45.439 ********
included: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml for managed-node1
TASK [fedora.linux_system_roles.storage : Check if the COPR support packages should be installed] ***
task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:2
Friday 17 January 2025 04:17:52 -0500 (0:00:00.070) 0:07:45.510 ********

TASK [fedora.linux_system_roles.storage : Make sure COPR support packages are present] ***
task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:13
Friday 17 January 2025 04:17:52 -0500 (0:00:00.061) 0:07:45.572 ********
skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [fedora.linux_system_roles.storage : Enable COPRs] ************************
task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:19
Friday 17 January 2025 04:17:52 -0500 (0:00:00.053) 0:07:45.626 ********

TASK [fedora.linux_system_roles.storage : Make sure required packages are installed] ***
task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:37
Friday 17 January 2025 04:17:53 -0500 (0:00:00.954) 0:07:46.580 ********
ok: [managed-node1] => { "changed": false, "rc": 0, "results": [ "7:lvm2-2.02.187-6.el7_9.5.x86_64 providing lvm2 is already installed", "kpartx-0.4.9-136.el7_9.x86_64 providing kpartx is already installed" ] }

TASK [fedora.linux_system_roles.storage : Get service facts] *******************
task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:51
Friday 17 January 2025 04:17:55 -0500 (0:00:01.307) 0:07:47.888 ********
ok: [managed-node1] => { "ansible_facts": { "services": { "NetworkManager-dispatcher.service": { "name": "NetworkManager-dispatcher.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "NetworkManager-wait-online.service": { "name": "NetworkManager-wait-online.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "NetworkManager.service": { "name": "NetworkManager.service", "source": "systemd", "state": "running", "status": "enabled" }, "arp-ethers.service": { "name": "arp-ethers.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "auditd.service": { "name": "auditd.service", "source": "systemd", "state": "running", "status": "enabled" }, "auth-rpcgss-module.service": { "name": "auth-rpcgss-module.service", "source": "systemd", "state": "stopped", "status": "static" }, "autovt@.service": { "name": "autovt@.service", "source": "systemd", "state": "unknown", "status": "enabled" }, "blivet.service": { "name": "blivet.service", "source": "systemd", "state": "inactive", "status": "static" }, "blk-availability.service": { "name": "blk-availability.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "brandbot.service": { "name": "brandbot.service", "source": "systemd", "state": "inactive", "status": "static" }, "chrony-dnssrv@.service": { "name": "chrony-dnssrv@.service", "source": "systemd", "state": "unknown", "status": "static" }, "chrony-wait.service": { "name": "chrony-wait.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "chronyd.service": { "name": "chronyd.service", "source": "systemd", "state": "running", "status": "enabled" }, "cloud-config.service": { "name": "cloud-config.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-final.service": { "name": "cloud-final.service",
"source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-init-local.service": { "name": "cloud-init-local.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-init.service": { "name": "cloud-init.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "console-getty.service": { "name": "console-getty.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "console-shell.service": { "name": "console-shell.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "container-getty@.service": { "name": "container-getty@.service", "source": "systemd", "state": "unknown", "status": "static" }, "cpupower.service": { "name": "cpupower.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "crond.service": { "name": "crond.service", "source": "systemd", "state": "running", "status": "enabled" }, "dbus-org.freedesktop.hostname1.service": { "name": "dbus-org.freedesktop.hostname1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus-org.freedesktop.import1.service": { "name": "dbus-org.freedesktop.import1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus-org.freedesktop.locale1.service": { "name": "dbus-org.freedesktop.locale1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus-org.freedesktop.login1.service": { "name": "dbus-org.freedesktop.login1.service", "source": "systemd", "state": "active", "status": "static" }, "dbus-org.freedesktop.machine1.service": { "name": "dbus-org.freedesktop.machine1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus-org.freedesktop.nm-dispatcher.service": { "name": "dbus-org.freedesktop.nm-dispatcher.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "dbus-org.freedesktop.timedate1.service": { "name": "dbus-org.freedesktop.timedate1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus.service": { "name": "dbus.service", "source": "systemd", "state": "running", "status": "static" }, "debug-shell.service": { "name": "debug-shell.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "dm-event.service": { "name": "dm-event.service", "source": "systemd", "state": "stopped", "status": "static" }, "dmraid-activation.service": { "name": "dmraid-activation.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "dracut-cmdline.service": { "name": "dracut-cmdline.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-initqueue.service": { "name": "dracut-initqueue.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-mount.service": { "name": "dracut-mount.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-mount.service": { "name": "dracut-pre-mount.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-pivot.service": { "name": "dracut-pre-pivot.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-trigger.service": { "name": "dracut-pre-trigger.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-udev.service": { "name": "dracut-pre-udev.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-shutdown.service": { "name": "dracut-shutdown.service", "source": "systemd", "state": "stopped", "status": "static" }, "ebtables.service": { "name": "ebtables.service", 
"source": "systemd", "state": "inactive", "status": "disabled" }, "emergency.service": { "name": "emergency.service", "source": "systemd", "state": "stopped", "status": "static" }, "firewalld.service": { "name": "firewalld.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "fstrim.service": { "name": "fstrim.service", "source": "systemd", "state": "inactive", "status": "static" }, "getty@.service": { "name": "getty@.service", "source": "systemd", "state": "unknown", "status": "enabled" }, "getty@tty1.service": { "name": "getty@tty1.service", "source": "systemd", "state": "running", "status": "unknown" }, "gssproxy.service": { "name": "gssproxy.service", "source": "systemd", "state": "running", "status": "disabled" }, "halt-local.service": { "name": "halt-local.service", "source": "systemd", "state": "inactive", "status": "static" }, "initrd-cleanup.service": { "name": "initrd-cleanup.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-parse-etc.service": { "name": "initrd-parse-etc.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-switch-root.service": { "name": "initrd-switch-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-udevadm-cleanup-db.service": { "name": "initrd-udevadm-cleanup-db.service", "source": "systemd", "state": "stopped", "status": "static" }, "iprdump.service": { "name": "iprdump.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "iprinit.service": { "name": "iprinit.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "iprupdate.service": { "name": "iprupdate.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "irqbalance.service": { "name": "irqbalance.service", "source": "systemd", "state": "running", "status": "enabled" }, "kdump.service": { "name": "kdump.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "kmod-static-nodes.service": { "name": "kmod-static-nodes.service", "source": "systemd", "state": "stopped", "status": "static" }, "lvm2-lvmetad.service": { "name": "lvm2-lvmetad.service", "source": "systemd", "state": "running", "status": "static" }, "lvm2-lvmpolld.service": { "name": "lvm2-lvmpolld.service", "source": "systemd", "state": "stopped", "status": "static" }, "lvm2-monitor.service": { "name": "lvm2-monitor.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "lvm2-pvscan@.service": { "name": "lvm2-pvscan@.service", "source": "systemd", "state": "unknown", "status": "static" }, "lvm2-pvscan@8:0.service": { "name": "lvm2-pvscan@8:0.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "mdadm-grow-continue@.service": { "name": "mdadm-grow-continue@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdadm-last-resort@.service": { "name": "mdadm-last-resort@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdcheck_continue.service": { "name": "mdcheck_continue.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdcheck_start.service": { "name": "mdcheck_start.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdmon@.service": { "name": "mdmon@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdmonitor-oneshot.service": { "name": "mdmonitor-oneshot.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdmonitor.service": { "name": "mdmonitor.service", "source": 
"systemd", "state": "stopped", "status": "enabled" }, "messagebus.service": { "name": "messagebus.service", "source": "systemd", "state": "active", "status": "static" }, "microcode.service": { "name": "microcode.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "netconsole": { "name": "netconsole", "source": "sysv", "state": "stopped", "status": "disabled" }, "network": { "name": "network", "source": "sysv", "state": "running", "status": "enabled" }, "network.service": { "name": "network.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "nfs-blkmap.service": { "name": "nfs-blkmap.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfs-config.service": { "name": "nfs-config.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-idmap.service": { "name": "nfs-idmap.service", "source": "systemd", "state": "inactive", "status": "static" }, "nfs-idmapd.service": { "name": "nfs-idmapd.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-lock.service": { "name": "nfs-lock.service", "source": "systemd", "state": "inactive", "status": "static" }, "nfs-mountd.service": { "name": "nfs-mountd.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-rquotad.service": { "name": "nfs-rquotad.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfs-secure.service": { "name": "nfs-secure.service", "source": "systemd", "state": "inactive", "status": "static" }, "nfs-server.service": { "name": "nfs-server.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "nfs-utils.service": { "name": "nfs-utils.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs.service": { "name": "nfs.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfslock.service": { "name": "nfslock.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-halt.service": { "name": "plymouth-halt.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "plymouth-kexec.service": { "name": "plymouth-kexec.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "plymouth-poweroff.service": { "name": "plymouth-poweroff.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "plymouth-quit-wait.service": { "name": "plymouth-quit-wait.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "plymouth-quit.service": { "name": "plymouth-quit.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "plymouth-read-write.service": { "name": "plymouth-read-write.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "plymouth-reboot.service": { "name": "plymouth-reboot.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "plymouth-start.service": { "name": "plymouth-start.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "plymouth-switch-root.service": { "name": "plymouth-switch-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "polkit.service": { "name": "polkit.service", "source": "systemd", "state": "running", "status": "static" }, "postfix.service": { "name": "postfix.service", "source": "systemd", "state": "running", "status": "enabled" }, "qemu-guest-agent.service": { "name": "qemu-guest-agent.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "quotaon.service": { "name": 
"quotaon.service", "source": "systemd", "state": "inactive", "status": "static" }, "rc-local.service": { "name": "rc-local.service", "source": "systemd", "state": "stopped", "status": "static" }, "rdisc.service": { "name": "rdisc.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rescue.service": { "name": "rescue.service", "source": "systemd", "state": "stopped", "status": "static" }, "restraintd.service": { "name": "restraintd.service", "source": "systemd", "state": "running", "status": "enabled" }, "rhel-autorelabel-mark.service": { "name": "rhel-autorelabel-mark.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "rhel-autorelabel.service": { "name": "rhel-autorelabel.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "rhel-configure.service": { "name": "rhel-configure.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "rhel-dmesg.service": { "name": "rhel-dmesg.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "rhel-domainname.service": { "name": "rhel-domainname.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "rhel-import-state.service": { "name": "rhel-import-state.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "rhel-loadmodules.service": { "name": "rhel-loadmodules.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "rhel-readonly.service": { "name": "rhel-readonly.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "rngd.service": { "name": "rngd.service", "source": "systemd", "state": "running", "status": "enabled" }, "rpc-gssd.service": { "name": "rpc-gssd.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-rquotad.service": { "name": "rpc-rquotad.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rpc-statd-notify.service": { "name": "rpc-statd-notify.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-statd.service": { "name": "rpc-statd.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpcbind.service": { "name": "rpcbind.service", "source": "systemd", "state": "running", "status": "enabled" }, "rpcgssd.service": { "name": "rpcgssd.service", "source": "systemd", "state": "inactive", "status": "static" }, "rpcidmapd.service": { "name": "rpcidmapd.service", "source": "systemd", "state": "inactive", "status": "static" }, "rsyncd.service": { "name": "rsyncd.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rsyncd@.service": { "name": "rsyncd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "rsyslog.service": { "name": "rsyslog.service", "source": "systemd", "state": "running", "status": "enabled" }, "selinux-policy-migrate-local-changes@.service": { "name": "selinux-policy-migrate-local-changes@.service", "source": "systemd", "state": "unknown", "status": "static" }, "selinux-policy-migrate-local-changes@targeted.service": { "name": "selinux-policy-migrate-local-changes@targeted.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "serial-getty@.service": { "name": "serial-getty@.service", "source": "systemd", "state": "unknown", "status": "disabled" }, "serial-getty@ttyS0.service": { "name": "serial-getty@ttyS0.service", "source": "systemd", "state": "running", "status": "unknown" }, "sshd-keygen.service": { "name": "sshd-keygen.service", "source": "systemd", "state": "stopped", "status": "static" 
}, "sshd.service": { "name": "sshd.service", "source": "systemd", "state": "running", "status": "enabled" }, "sshd@.service": { "name": "sshd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-ask-password-console.service": { "name": "systemd-ask-password-console.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-ask-password-plymouth.service": { "name": "systemd-ask-password-plymouth.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-ask-password-wall.service": { "name": "systemd-ask-password-wall.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-backlight@.service": { "name": "systemd-backlight@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-binfmt.service": { "name": "systemd-binfmt.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-bootchart.service": { "name": "systemd-bootchart.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "systemd-cryptsetup@luks\\x2deb5a5901\\x2d32b6\\x2d427f\\x2d885e\\x2d6ee3011725ec.service": { "name": "systemd-cryptsetup@luks\\x2deb5a5901\\x2d32b6\\x2d427f\\x2d885e\\x2d6ee3011725ec.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "systemd-firstboot.service": { "name": "systemd-firstboot.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-fsck-root.service": { "name": "systemd-fsck-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-fsck@.service": { "name": "systemd-fsck@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-halt.service": { "name": "systemd-halt.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hibernate-resume@.service": { "name": "systemd-hibernate-resume@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-hibernate.service": { "name": "systemd-hibernate.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hostnamed.service": { "name": "systemd-hostnamed.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hwdb-update.service": { "name": "systemd-hwdb-update.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-hybrid-sleep.service": { "name": "systemd-hybrid-sleep.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-importd.service": { "name": "systemd-importd.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-initctl.service": { "name": "systemd-initctl.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journal-catalog-update.service": { "name": "systemd-journal-catalog-update.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journal-flush.service": { "name": "systemd-journal-flush.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journald.service": { "name": "systemd-journald.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-kexec.service": { "name": "systemd-kexec.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-localed.service": { "name": "systemd-localed.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-logind.service": { "name": "systemd-logind.service", "source": "systemd", "state": "running", 
"status": "static" }, "systemd-machine-id-commit.service": { "name": "systemd-machine-id-commit.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-machined.service": { "name": "systemd-machined.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-modules-load.service": { "name": "systemd-modules-load.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-nspawn@.service": { "name": "systemd-nspawn@.service", "source": "systemd", "state": "unknown", "status": "disabled" }, "systemd-poweroff.service": { "name": "systemd-poweroff.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-quotacheck.service": { "name": "systemd-quotacheck.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-random-seed.service": { "name": "systemd-random-seed.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-readahead-collect.service": { "name": "systemd-readahead-collect.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "systemd-readahead-done.service": { "name": "systemd-readahead-done.service", "source": "systemd", "state": "stopped", "status": "indirect" }, "systemd-readahead-drop.service": { "name": "systemd-readahead-drop.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "systemd-readahead-replay.service": { "name": "systemd-readahead-replay.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "systemd-reboot.service": { "name": "systemd-reboot.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-remount-fs.service": { "name": "systemd-remount-fs.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-rfkill@.service": { "name": "systemd-rfkill@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-shutdownd.service": { "name": "systemd-shutdownd.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-suspend.service": { "name": "systemd-suspend.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-sysctl.service": { "name": "systemd-sysctl.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-timedated.service": { "name": "systemd-timedated.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-tmpfiles-clean.service": { "name": "systemd-tmpfiles-clean.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles-setup-dev.service": { "name": "systemd-tmpfiles-setup-dev.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles-setup.service": { "name": "systemd-tmpfiles-setup.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udev-settle.service": { "name": "systemd-udev-settle.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udev-trigger.service": { "name": "systemd-udev-trigger.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udevd.service": { "name": "systemd-udevd.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-update-done.service": { "name": "systemd-update-done.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-update-utmp-runlevel.service": { "name": "systemd-update-utmp-runlevel.service", "source": "systemd", "state": "stopped", "status": 
"static" }, "systemd-update-utmp.service": { "name": "systemd-update-utmp.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-user-sessions.service": { "name": "systemd-user-sessions.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-vconsole-setup.service": { "name": "systemd-vconsole-setup.service", "source": "systemd", "state": "stopped", "status": "static" }, "target.service": { "name": "target.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "targetclid.service": { "name": "targetclid.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "teamd@.service": { "name": "teamd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "tuned.service": { "name": "tuned.service", "source": "systemd", "state": "running", "status": "enabled" }, "wpa_supplicant.service": { "name": "wpa_supplicant.service", "source": "systemd", "state": "inactive", "status": "disabled" } } }, "changed": false } TASK [fedora.linux_system_roles.storage : Set storage_cryptsetup_services] ***** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:57 Friday 17 January 2025 04:17:55 -0500 (0:00:01.307) 0:07:47.888 ******** ok: [managed-node1] => { "ansible_facts": { "storage_cryptsetup_services": [ "systemd-cryptsetup@luks\\x2deb5a5901\\x2d32b6\\x2d427f\\x2d885e\\x2d6ee3011725ec.service" ] }, "changed": false } TASK [fedora.linux_system_roles.storage : Mask the systemd cryptsetup services] *** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:63 Friday 17 January 2025 04:17:55 -0500 (0:00:00.097) 0:07:47.986 ******** changed: [managed-node1] => (item=systemd-cryptsetup@luks\x2deb5a5901\x2d32b6\x2d427f\x2d885e\x2d6ee3011725ec.service) => { "ansible_loop_var": "item", "changed": true, "item": "systemd-cryptsetup@luks\\x2deb5a5901\\x2d32b6\\x2d427f\\x2d885e\\x2d6ee3011725ec.service", "name": "systemd-cryptsetup@luks\\x2deb5a5901\\x2d32b6\\x2d427f\\x2d885e\\x2d6ee3011725ec.service", "status": { "ActiveEnterTimestampMonotonic": "0", "ActiveExitTimestampMonotonic": "0", "ActiveState": "inactive", "After": "systemd-readahead-replay.service dev-mapper-foo\\x2dtest1.device systemd-journald.socket cryptsetup-pre.target systemd-readahead-collect.service system-systemd\\x2dcryptsetup.slice", "AllowIsolate": "no", "AmbientCapabilities": "0", "AssertResult": "no", "AssertTimestampMonotonic": "0", "Before": "cryptsetup.target umount.target", "BindsTo": "dev-mapper-foo\\x2dtest1.device", "BlockIOAccounting": "no", "BlockIOWeight": "18446744073709551615", "CPUAccounting": "no", "CPUQuotaPerSecUSec": "infinity", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUSchedulingResetOnFork": "no", "CPUShares": "18446744073709551615", "CanIsolate": "no", "CanReload": "no", "CanStart": "yes", "CanStop": "yes", "CapabilityBoundingSet": "18446744073709551615", "CollectMode": "inactive", "ConditionResult": "no", "ConditionTimestampMonotonic": "0", "Conflicts": "umount.target", "ControlPID": "0", "DefaultDependencies": "no", "Delegate": "no", "Description": "Cryptography Setup for luks-eb5a5901-32b6-427f-885e-6ee3011725ec", "DevicePolicy": "auto", "Documentation": "man:crypttab(5) man:systemd-cryptsetup-generator(8) man:systemd-cryptsetup@.service(8)", "ExecMainCode": "0", "ExecMainExitTimestampMonotonic": "0", "ExecMainPID": "0", "ExecMainStartTimestampMonotonic": "0", "ExecMainStatus": 
"0", "ExecStart": "{ path=/usr/lib/systemd/systemd-cryptsetup ; argv[]=/usr/lib/systemd/systemd-cryptsetup attach luks-eb5a5901-32b6-427f-885e-6ee3011725ec /dev/mapper/foo-test1 - ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecStop": "{ path=/usr/lib/systemd/systemd-cryptsetup ; argv[]=/usr/lib/systemd/systemd-cryptsetup detach luks-eb5a5901-32b6-427f-885e-6ee3011725ec ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "FailureAction": "none", "FileDescriptorStoreMax": "0", "FragmentPath": "/run/systemd/generator/systemd-cryptsetup@luks\\x2deb5a5901\\x2d32b6\\x2d427f\\x2d885e\\x2d6ee3011725ec.service", "GuessMainPID": "yes", "IOScheduling": "0", "Id": "systemd-cryptsetup@luks\\x2deb5a5901\\x2d32b6\\x2d427f\\x2d885e\\x2d6ee3011725ec.service", "IgnoreOnIsolate": "yes", "IgnoreOnSnapshot": "no", "IgnoreSIGPIPE": "yes", "InactiveEnterTimestampMonotonic": "0", "InactiveExitTimestampMonotonic": "0", "JobTimeoutAction": "none", "JobTimeoutUSec": "0", "KillMode": "control-group", "KillSignal": "15", "LimitAS": "18446744073709551615", "LimitCORE": "18446744073709551615", "LimitCPU": "18446744073709551615", "LimitDATA": "18446744073709551615", "LimitFSIZE": "18446744073709551615", "LimitLOCKS": "18446744073709551615", "LimitMEMLOCK": "65536", "LimitMSGQUEUE": "819200", "LimitNICE": "0", "LimitNOFILE": "4096", "LimitNPROC": "14311", "LimitRSS": "18446744073709551615", "LimitRTPRIO": "0", "LimitRTTIME": "18446744073709551615", "LimitSIGPENDING": "14311", "LimitSTACK": "18446744073709551615", "LoadState": "loaded", "MainPID": "0", "MemoryAccounting": "no", "MemoryCurrent": "18446744073709551615", "MemoryLimit": "18446744073709551615", "MountFlags": "0", "Names": "systemd-cryptsetup@luks\\x2deb5a5901\\x2d32b6\\x2d427f\\x2d885e\\x2d6ee3011725ec.service", "NeedDaemonReload": "no", "Nice": "0", "NoNewPrivileges": "no", "NonBlocking": "no", "NotifyAccess": "none", "OOMScoreAdjust": "0", "OnFailureJobMode": "replace", "PermissionsStartOnly": "no", "PrivateDevices": "no", "PrivateNetwork": "no", "PrivateTmp": "no", "ProtectHome": "no", "ProtectSystem": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "RemainAfterExit": "yes", "RequiredBy": "cryptsetup.target dev-mapper-luks\\x2deb5a5901\\x2d32b6\\x2d427f\\x2d885e\\x2d6ee3011725ec.device", "Requires": "system-systemd\\x2dcryptsetup.slice", "Restart": "no", "RestartUSec": "100ms", "Result": "success", "RootDirectoryStartOnly": "no", "RuntimeDirectoryMode": "0755", "SameProcessGroup": "no", "SecureBits": "0", "SendSIGHUP": "no", "SendSIGKILL": "yes", "Slice": "system-systemd\\x2dcryptsetup.slice", "SourcePath": "/etc/crypttab", "StandardError": "inherit", "StandardInput": "null", "StandardOutput": "journal", "StartLimitAction": "none", "StartLimitBurst": "5", "StartLimitInterval": "10000000", "StartupBlockIOWeight": "18446744073709551615", "StartupCPUShares": "18446744073709551615", "StatusErrno": "0", "StopWhenUnneeded": "no", "SubState": "dead", "SyslogLevelPrefix": "yes", "SyslogPriority": "30", "SystemCallErrorNumber": "0", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "TasksAccounting": "no", "TasksCurrent": "18446744073709551615", "TasksMax": "18446744073709551615", "TimeoutStartUSec": "0", "TimeoutStopUSec": "0", "TimerSlackNSec": "50000", "Transient": "no", "Type": "oneshot", "UMask": "0022", "UnitFilePreset": "disabled", "UnitFileState": "bad", "WantedBy": "dev-mapper-foo\\x2dtest1.device", "WatchdogTimestampMonotonic": "0", 
"WatchdogUSec": "0" } } TASK [fedora.linux_system_roles.storage : Manage the pools and volumes to match the specified state] *** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:69 Friday 17 January 2025 04:17:55 -0500 (0:00:00.629) 0:07:48.616 ******** fatal: [managed-node1]: FAILED! => { "actions": [], "changed": false, "crypts": [], "leaves": [], "mounts": [], "packages": [], "pools": [], "volumes": [] } MSG: cannot remove existing formatting on device 'luks-eb5a5901-32b6-427f-885e-6ee3011725ec' in safe mode due to encryption removal TASK [fedora.linux_system_roles.storage : Failed message] ********************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:109 Friday 17 January 2025 04:18:00 -0500 (0:00:04.453) 0:07:53.069 ******** fatal: [managed-node1]: FAILED! => { "changed": false } MSG: {u'_ansible_no_log': False, u'crypts': [], u'pools': [], u'leaves': [], u'changed': False, u'actions': [], u'failed': True, u'volumes': [], u'invocation': {u'module_args': {u'packages_only': False, u'disklabel_type': None, u'diskvolume_mkfs_option_map': {u'ext4': u'-F', u'ext3': u'-F', u'ext2': u'-F'}, u'safe_mode': True, u'pools': [{u'raid_metadata_version': None, u'encryption_key_size': None, u'encryption_key': None, u'encryption_luks_version': None, u'encryption_tang_url': None, u'raid_spare_count': None, u'grow_to_fill': False, u'encryption_tang_thumbprint': None, u'name': u'foo', u'encryption_password': None, u'encryption': False, u'disks': [u'sda'], u'raid_level': None, u'raid_device_count': None, u'state': u'present', u'volumes': [{u'raid_metadata_version': None, u'mount_device_identifier': u'uuid', u'fs_type': u'xfs', u'mount_options': u'defaults', u'size': u'4g', u'mount_point': u'/opt/test1', u'compression': None, u'encryption_password': u'VALUE_SPECIFIED_IN_NO_LOG_PARAMETER', u'encryption': False, u'raid_level': None, u'raid_device_count': None, u'state': u'present', u'vdo_pool_size': None, u'mount_mode': None, u'thin_pool_name': None, u'type': u'lvm', u'encryption_key_size': 0, u'deduplication': None, u'encryption_cipher': None, u'encryption_key': None, u'fs_label': u'', u'encryption_luks_version': u'luks2', u'raid_stripe_size': None, u'mount_passno': 0, u'mount_user': None, u'raid_spare_count': None, u'name': u'test1', u'cache_mode': None, u'raid_disks': [], u'mount_group': None, u'fs_overwrite_existing': True, u'disks': [u'sda'], u'cached': False, u'thin_pool_size': None, u'thin': False, u'mount_check': 0, u'cache_size': 0, u'raid_chunk_size': None, u'cache_devices': [], u'fs_create_options': u''}], u'shared': False, u'encryption_clevis_pin': None, u'type': u'lvm', u'encryption_cipher': None, u'raid_chunk_size': None}], u'volumes': [], u'pool_defaults': {u'raid_metadata_version': None, u'encryption_cipher': None, u'encryption_key': None, u'encryption_luks_version': None, u'raid_spare_count': None, u'grow_to_fill': False, u'encryption_password': None, u'encryption': False, u'disks': [], u'raid_level': None, u'raid_device_count': None, u'state': u'present', u'volumes': [], u'shared': False, u'type': u'lvm', u'encryption_key_size': None, u'raid_chunk_size': None}, u'volume_defaults': {u'raid_metadata_version': None, u'raid_level': None, u'fs_type': u'xfs', u'mount_options': u'defaults', u'size': 0, u'mount_point': u'', u'compression': None, u'encryption_password': None, u'encryption': False, u'mount_device_identifier': u'uuid', u'raid_device_count': None, 
u'state': u'present', u'vdo_pool_size': None, u'thin_pool_name': None, u'type': u'lvm', u'encryption_key_size': None, u'encryption_cipher': None, u'encryption_key': None, u'fs_label': u'', u'encryption_luks_version': None, u'raid_stripe_size': None, u'cache_size': 0, u'raid_spare_count': None, u'cache_mode': None, u'deduplication': None, u'cached': False, u'fs_overwrite_existing': True, u'disks': [], u'thin_pool_size': None, u'thin': None, u'mount_check': 0, u'mount_passno': 0, u'raid_chunk_size': None, u'cache_devices': [], u'fs_create_options': u''}, u'use_partitions': None}}, u'mounts': [], u'packages': [], u'msg': u"cannot remove existing formatting on device 'luks-eb5a5901-32b6-427f-885e-6ee3011725ec' in safe mode due to encryption removal"} TASK [fedora.linux_system_roles.storage : Unmask the systemd cryptsetup services] *** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:113 Friday 17 January 2025 04:18:00 -0500 (0:00:00.096) 0:07:53.165 ******** changed: [managed-node1] => (item=systemd-cryptsetup@luks\x2deb5a5901\x2d32b6\x2d427f\x2d885e\x2d6ee3011725ec.service) => { "ansible_loop_var": "item", "changed": true, "item": "systemd-cryptsetup@luks\\x2deb5a5901\\x2d32b6\\x2d427f\\x2d885e\\x2d6ee3011725ec.service", "name": "systemd-cryptsetup@luks\\x2deb5a5901\\x2d32b6\\x2d427f\\x2d885e\\x2d6ee3011725ec.service", "status": { "ActiveEnterTimestampMonotonic": "0", "ActiveExitTimestampMonotonic": "0", "ActiveState": "inactive", "AllowIsolate": "no", "AmbientCapabilities": "0", "AssertResult": "no", "AssertTimestampMonotonic": "0", "BlockIOAccounting": "no", "BlockIOWeight": "18446744073709551615", "CPUAccounting": "no", "CPUQuotaPerSecUSec": "infinity", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUSchedulingResetOnFork": "no", "CPUShares": "18446744073709551615", "CanIsolate": "no", "CanReload": "no", "CanStart": "yes", "CanStop": "yes", "CapabilityBoundingSet": "18446744073709551615", "CollectMode": "inactive", "ConditionResult": "no", "ConditionTimestampMonotonic": "0", "ControlPID": "0", "DefaultDependencies": "yes", "Delegate": "no", "Description": "systemd-cryptsetup@luks\\x2deb5a5901\\x2d32b6\\x2d427f\\x2d885e\\x2d6ee3011725ec.service", "DevicePolicy": "auto", "ExecMainCode": "0", "ExecMainExitTimestampMonotonic": "0", "ExecMainPID": "0", "ExecMainStartTimestampMonotonic": "0", "ExecMainStatus": "0", "FailureAction": "none", "FileDescriptorStoreMax": "0", "FragmentPath": "/dev/null", "GuessMainPID": "yes", "IOScheduling": "0", "Id": "systemd-cryptsetup@luks\\x2deb5a5901\\x2d32b6\\x2d427f\\x2d885e\\x2d6ee3011725ec.service", "IgnoreOnIsolate": "no", "IgnoreOnSnapshot": "no", "IgnoreSIGPIPE": "yes", "InactiveEnterTimestampMonotonic": "0", "InactiveExitTimestampMonotonic": "0", "JobTimeoutAction": "none", "JobTimeoutUSec": "0", "KillMode": "control-group", "KillSignal": "15", "LimitAS": "18446744073709551615", "LimitCORE": "18446744073709551615", "LimitCPU": "18446744073709551615", "LimitDATA": "18446744073709551615", "LimitFSIZE": "18446744073709551615", "LimitLOCKS": "18446744073709551615", "LimitMEMLOCK": "65536", "LimitMSGQUEUE": "819200", "LimitNICE": "0", "LimitNOFILE": "1048576", "LimitNPROC": "14311", "LimitRSS": "18446744073709551615", "LimitRTPRIO": "0", "LimitRTTIME": "18446744073709551615", "LimitSIGPENDING": "14311", "LimitSTACK": "18446744073709551615", "LoadState": "masked", "MainPID": "0", "MemoryAccounting": "no", "MemoryCurrent": "18446744073709551615", "MemoryLimit": 
"18446744073709551615", "MountFlags": "0", "Names": "systemd-cryptsetup@luks\\x2deb5a5901\\x2d32b6\\x2d427f\\x2d885e\\x2d6ee3011725ec.service", "NeedDaemonReload": "no", "Nice": "0", "NoNewPrivileges": "no", "NonBlocking": "no", "NotifyAccess": "none", "OOMScoreAdjust": "0", "OnFailureJobMode": "replace", "PermissionsStartOnly": "no", "PrivateDevices": "no", "PrivateNetwork": "no", "PrivateTmp": "no", "ProtectHome": "no", "ProtectSystem": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "RemainAfterExit": "no", "RequiredBy": "dev-mapper-luks\\x2deb5a5901\\x2d32b6\\x2d427f\\x2d885e\\x2d6ee3011725ec.device cryptsetup.target", "Restart": "no", "RestartUSec": "100ms", "Result": "success", "RootDirectoryStartOnly": "no", "RuntimeDirectoryMode": "0755", "SameProcessGroup": "no", "SecureBits": "0", "SendSIGHUP": "no", "SendSIGKILL": "yes", "StandardError": "inherit", "StandardInput": "null", "StandardOutput": "inherit", "StartLimitAction": "none", "StartLimitBurst": "5", "StartLimitInterval": "10000000", "StartupBlockIOWeight": "18446744073709551615", "StartupCPUShares": "18446744073709551615", "StatusErrno": "0", "StopWhenUnneeded": "no", "SubState": "dead", "SyslogLevelPrefix": "yes", "SyslogPriority": "30", "SystemCallErrorNumber": "0", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "TasksAccounting": "no", "TasksCurrent": "18446744073709551615", "TasksMax": "18446744073709551615", "TimeoutStartUSec": "1min 30s", "TimeoutStopUSec": "1min 30s", "TimerSlackNSec": "50000", "Transient": "no", "UMask": "0022", "UnitFileState": "bad", "WantedBy": "dev-mapper-foo\\x2dtest1.device", "WatchdogTimestampMonotonic": "0", "WatchdogUSec": "0" } } TASK [Check that we failed in the role] **************************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-failed.yml:23 Friday 17 January 2025 04:18:01 -0500 (0:00:00.711) 0:07:53.877 ******** ok: [managed-node1] => { "changed": false } MSG: All assertions passed TASK [Verify the blivet output and error message are correct] ****************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-failed.yml:28 Friday 17 January 2025 04:18:01 -0500 (0:00:00.066) 0:07:53.944 ******** ok: [managed-node1] => { "changed": false } MSG: All assertions passed TASK [Verify correct exception or error message] ******************************* task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-failed.yml:39 Friday 17 January 2025 04:18:01 -0500 (0:00:00.078) 0:07:54.023 ******** skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Stat the file] *********************************************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-data-preservation.yml:11 Friday 17 January 2025 04:18:01 -0500 (0:00:00.057) 0:07:54.080 ******** ok: [managed-node1] => { "changed": false, "stat": { "atime": 1737105461.8931317, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 0, "charset": "binary", "checksum": "da39a3ee5e6b4b0d3255bfef95601890afd80709", "ctime": 1737105461.8931317, "dev": 64769, "device_type": 0, "executable": false, "exists": true, "gid": 0, "gr_name": "root", "inode": 67, "isblk": false, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": true, "issock": false, "isuid": false, "mimetype": 
"inode/x-empty", "mode": "0644", "mtime": 1737105461.8931317, "nlink": 1, "path": "/opt/test1/quux", "pw_name": "root", "readable": true, "rgrp": true, "roth": true, "rusr": true, "size": 0, "uid": 0, "version": "18446744073572680800", "wgrp": false, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [Assert file presence] **************************************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-data-preservation.yml:16 Friday 17 January 2025 04:18:01 -0500 (0:00:00.441) 0:07:54.521 ******** ok: [managed-node1] => { "changed": false } MSG: All assertions passed TASK [Remove the encryption layer] ********************************************* task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/tests_luks2.yml:460 Friday 17 January 2025 04:18:01 -0500 (0:00:00.088) 0:07:54.610 ******** TASK [fedora.linux_system_roles.storage : Set platform/version specific variables] *** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:2 Friday 17 January 2025 04:18:02 -0500 (0:00:00.152) 0:07:54.763 ******** included: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml for managed-node1 TASK [fedora.linux_system_roles.storage : Ensure ansible_facts used by role] *** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:2 Friday 17 January 2025 04:18:02 -0500 (0:00:00.087) 0:07:54.850 ******** skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Set platform/version specific variables] *** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:7 Friday 17 January 2025 04:18:02 -0500 (0:00:00.071) 0:07:54.921 ******** skipping: [managed-node1] => (item=RedHat.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat.yml", "skip_reason": "Conditional result was False" } skipping: [managed-node1] => (item=CentOS.yml) => { "ansible_loop_var": "item", "changed": false, "item": "CentOS.yml", "skip_reason": "Conditional result was False" } ok: [managed-node1] => (item=CentOS_7.yml) => { "ansible_facts": { "__storage_blivet_diskvolume_mkfs_option_map": { "ext2": "-F", "ext3": "-F", "ext4": "-F" }, "blivet_package_list": [ "python-enum34", "python-blivet3", "libblockdev-crypto", "libblockdev-dm", "libblockdev-lvm", "libblockdev-mdraid", "libblockdev-swap", "{{ 'libblockdev-s390' if ansible_architecture == 's390x' else 'libblockdev' }}" ] }, "ansible_included_var_files": [ "/tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/vars/CentOS_7.yml" ], "ansible_loop_var": "item", "changed": false, "item": "CentOS_7.yml" } skipping: [managed-node1] => (item=CentOS_7.9.yml) => { "ansible_loop_var": "item", "changed": false, "item": "CentOS_7.9.yml", "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Check if system is ostree] *********** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:25 Friday 17 January 2025 04:18:02 -0500 (0:00:00.138) 0:07:55.060 ******** skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Set flag to indicate 
system is ostree] *** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:30 Friday 17 January 2025 04:18:02 -0500 (0:00:00.057) 0:07:55.118 ******** skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Define an empty list of pools to be used in testing] *** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:5 Friday 17 January 2025 04:18:02 -0500 (0:00:00.059) 0:07:55.177 ******** ok: [managed-node1] => { "ansible_facts": { "_storage_pools_list": [] }, "changed": false } TASK [fedora.linux_system_roles.storage : Define an empty list of volumes to be used in testing] *** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:9 Friday 17 January 2025 04:18:02 -0500 (0:00:00.057) 0:07:55.235 ******** ok: [managed-node1] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false } TASK [fedora.linux_system_roles.storage : Include the appropriate provider tasks] *** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:13 Friday 17 January 2025 04:18:02 -0500 (0:00:00.058) 0:07:55.294 ******** included: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml for managed-node1 TASK [fedora.linux_system_roles.storage : Make sure blivet is available] ******* task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:2 Friday 17 January 2025 04:18:02 -0500 (0:00:00.137) 0:07:55.431 ******** ok: [managed-node1] => { "changed": false, "rc": 0, "results": [ "python-enum34-1.0.4-1.el7.noarch providing python-enum34 is already installed", "1:python2-blivet3-3.1.3-3.el7.noarch providing python-blivet3 is already installed", "libblockdev-crypto-2.18-5.el7.x86_64 providing libblockdev-crypto is already installed", "libblockdev-dm-2.18-5.el7.x86_64 providing libblockdev-dm is already installed", "libblockdev-lvm-2.18-5.el7.x86_64 providing libblockdev-lvm is already installed", "libblockdev-mdraid-2.18-5.el7.x86_64 providing libblockdev-mdraid is already installed", "libblockdev-swap-2.18-5.el7.x86_64 providing libblockdev-swap is already installed", "libblockdev-2.18-5.el7.x86_64 providing libblockdev is already installed" ] } TASK [fedora.linux_system_roles.storage : Show storage_pools] ****************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:9 Friday 17 January 2025 04:18:06 -0500 (0:00:04.002) 0:07:59.434 ******** ok: [managed-node1] => { "storage_pools": [ { "disks": [ "sda" ], "name": "foo", "type": "lvm", "volumes": [ { "encryption": false, "encryption_luks_version": "luks2", "encryption_password": "yabbadabbadoo", "mount_point": "/opt/test1", "name": "test1", "size": "4g" } ] } ] } TASK [fedora.linux_system_roles.storage : Show storage_volumes] **************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:14 Friday 17 January 2025 04:18:06 -0500 (0:00:00.075) 0:07:59.509 ******** ok: [managed-node1] => { "storage_volumes": "VARIABLE IS NOT DEFINED!: 'storage_volumes' is undefined" } TASK [fedora.linux_system_roles.storage : Get required packages] *************** task path: 
/tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:19 Friday 17 January 2025 04:18:06 -0500 (0:00:00.068) 0:07:59.578 ******** ok: [managed-node1] => { "actions": [], "changed": false, "crypts": [], "leaves": [], "mounts": [], "packages": [ "lvm2" ], "pools": [], "volumes": [] } TASK [fedora.linux_system_roles.storage : Enable copr repositories if needed] *** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:31 Friday 17 January 2025 04:18:11 -0500 (0:00:04.456) 0:08:04.034 ******** included: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml for managed-node1 TASK [fedora.linux_system_roles.storage : Check if the COPR support packages should be installed] *** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:2 Friday 17 January 2025 04:18:11 -0500 (0:00:00.070) 0:08:04.104 ******** TASK [fedora.linux_system_roles.storage : Make sure COPR support packages are present] *** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:13 Friday 17 January 2025 04:18:11 -0500 (0:00:00.035) 0:08:04.140 ******** skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Enable COPRs] ************************ task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:19 Friday 17 January 2025 04:18:11 -0500 (0:00:00.039) 0:08:04.180 ******** TASK [fedora.linux_system_roles.storage : Make sure required packages are installed] *** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:37 Friday 17 January 2025 04:18:11 -0500 (0:00:00.035) 0:08:04.216 ******** ok: [managed-node1] => { "changed": false, "rc": 0, "results": [ "7:lvm2-2.02.187-6.el7_9.5.x86_64 providing lvm2 is already installed", "kpartx-0.4.9-136.el7_9.x86_64 providing kpartx is already installed" ] } TASK [fedora.linux_system_roles.storage : Get service facts] ******************* task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:51 Friday 17 January 2025 04:18:12 -0500 (0:00:00.688) 0:08:04.904 ******** ok: [managed-node1] => { "ansible_facts": { "services": { "NetworkManager-dispatcher.service": { "name": "NetworkManager-dispatcher.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "NetworkManager-wait-online.service": { "name": "NetworkManager-wait-online.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "NetworkManager.service": { "name": "NetworkManager.service", "source": "systemd", "state": "running", "status": "enabled" }, "arp-ethers.service": { "name": "arp-ethers.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "auditd.service": { "name": "auditd.service", "source": "systemd", "state": "running", "status": "enabled" }, "auth-rpcgss-module.service": { "name": "auth-rpcgss-module.service", "source": "systemd", "state": "stopped", "status": "static" }, "autovt@.service": { "name": "autovt@.service", "source": "systemd", "state": "unknown", "status": "enabled" }, "blivet.service": { "name": "blivet.service", "source": "systemd", "state": "inactive", "status": "static" }, "blk-availability.service": { "name": 
"blk-availability.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "brandbot.service": { "name": "brandbot.service", "source": "systemd", "state": "inactive", "status": "static" }, "chrony-dnssrv@.service": { "name": "chrony-dnssrv@.service", "source": "systemd", "state": "unknown", "status": "static" }, "chrony-wait.service": { "name": "chrony-wait.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "chronyd.service": { "name": "chronyd.service", "source": "systemd", "state": "running", "status": "enabled" }, "cloud-config.service": { "name": "cloud-config.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-final.service": { "name": "cloud-final.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-init-local.service": { "name": "cloud-init-local.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-init.service": { "name": "cloud-init.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "console-getty.service": { "name": "console-getty.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "console-shell.service": { "name": "console-shell.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "container-getty@.service": { "name": "container-getty@.service", "source": "systemd", "state": "unknown", "status": "static" }, "cpupower.service": { "name": "cpupower.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "crond.service": { "name": "crond.service", "source": "systemd", "state": "running", "status": "enabled" }, "dbus-org.freedesktop.hostname1.service": { "name": "dbus-org.freedesktop.hostname1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus-org.freedesktop.import1.service": { "name": "dbus-org.freedesktop.import1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus-org.freedesktop.locale1.service": { "name": "dbus-org.freedesktop.locale1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus-org.freedesktop.login1.service": { "name": "dbus-org.freedesktop.login1.service", "source": "systemd", "state": "active", "status": "static" }, "dbus-org.freedesktop.machine1.service": { "name": "dbus-org.freedesktop.machine1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus-org.freedesktop.nm-dispatcher.service": { "name": "dbus-org.freedesktop.nm-dispatcher.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "dbus-org.freedesktop.timedate1.service": { "name": "dbus-org.freedesktop.timedate1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus.service": { "name": "dbus.service", "source": "systemd", "state": "running", "status": "static" }, "debug-shell.service": { "name": "debug-shell.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "dm-event.service": { "name": "dm-event.service", "source": "systemd", "state": "stopped", "status": "static" }, "dmraid-activation.service": { "name": "dmraid-activation.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "dracut-cmdline.service": { "name": "dracut-cmdline.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-initqueue.service": { "name": "dracut-initqueue.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-mount.service": { "name": "dracut-mount.service", "source": 
"systemd", "state": "stopped", "status": "static" }, "dracut-pre-mount.service": { "name": "dracut-pre-mount.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-pivot.service": { "name": "dracut-pre-pivot.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-trigger.service": { "name": "dracut-pre-trigger.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-udev.service": { "name": "dracut-pre-udev.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-shutdown.service": { "name": "dracut-shutdown.service", "source": "systemd", "state": "stopped", "status": "static" }, "ebtables.service": { "name": "ebtables.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "emergency.service": { "name": "emergency.service", "source": "systemd", "state": "stopped", "status": "static" }, "firewalld.service": { "name": "firewalld.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "fstrim.service": { "name": "fstrim.service", "source": "systemd", "state": "inactive", "status": "static" }, "getty@.service": { "name": "getty@.service", "source": "systemd", "state": "unknown", "status": "enabled" }, "getty@tty1.service": { "name": "getty@tty1.service", "source": "systemd", "state": "running", "status": "unknown" }, "gssproxy.service": { "name": "gssproxy.service", "source": "systemd", "state": "running", "status": "disabled" }, "halt-local.service": { "name": "halt-local.service", "source": "systemd", "state": "inactive", "status": "static" }, "initrd-cleanup.service": { "name": "initrd-cleanup.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-parse-etc.service": { "name": "initrd-parse-etc.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-switch-root.service": { "name": "initrd-switch-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-udevadm-cleanup-db.service": { "name": "initrd-udevadm-cleanup-db.service", "source": "systemd", "state": "stopped", "status": "static" }, "iprdump.service": { "name": "iprdump.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "iprinit.service": { "name": "iprinit.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "iprupdate.service": { "name": "iprupdate.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "irqbalance.service": { "name": "irqbalance.service", "source": "systemd", "state": "running", "status": "enabled" }, "kdump.service": { "name": "kdump.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "kmod-static-nodes.service": { "name": "kmod-static-nodes.service", "source": "systemd", "state": "stopped", "status": "static" }, "lvm2-lvmetad.service": { "name": "lvm2-lvmetad.service", "source": "systemd", "state": "running", "status": "static" }, "lvm2-lvmpolld.service": { "name": "lvm2-lvmpolld.service", "source": "systemd", "state": "stopped", "status": "static" }, "lvm2-monitor.service": { "name": "lvm2-monitor.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "lvm2-pvscan@.service": { "name": "lvm2-pvscan@.service", "source": "systemd", "state": "unknown", "status": "static" }, "lvm2-pvscan@8:0.service": { "name": "lvm2-pvscan@8:0.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "mdadm-grow-continue@.service": { "name": "mdadm-grow-continue@.service", "source": 
"systemd", "state": "unknown", "status": "static" }, "mdadm-last-resort@.service": { "name": "mdadm-last-resort@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdcheck_continue.service": { "name": "mdcheck_continue.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdcheck_start.service": { "name": "mdcheck_start.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdmon@.service": { "name": "mdmon@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdmonitor-oneshot.service": { "name": "mdmonitor-oneshot.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdmonitor.service": { "name": "mdmonitor.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "messagebus.service": { "name": "messagebus.service", "source": "systemd", "state": "active", "status": "static" }, "microcode.service": { "name": "microcode.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "netconsole": { "name": "netconsole", "source": "sysv", "state": "stopped", "status": "disabled" }, "network": { "name": "network", "source": "sysv", "state": "running", "status": "enabled" }, "network.service": { "name": "network.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "nfs-blkmap.service": { "name": "nfs-blkmap.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfs-config.service": { "name": "nfs-config.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-idmap.service": { "name": "nfs-idmap.service", "source": "systemd", "state": "inactive", "status": "static" }, "nfs-idmapd.service": { "name": "nfs-idmapd.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-lock.service": { "name": "nfs-lock.service", "source": "systemd", "state": "inactive", "status": "static" }, "nfs-mountd.service": { "name": "nfs-mountd.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-rquotad.service": { "name": "nfs-rquotad.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfs-secure.service": { "name": "nfs-secure.service", "source": "systemd", "state": "inactive", "status": "static" }, "nfs-server.service": { "name": "nfs-server.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "nfs-utils.service": { "name": "nfs-utils.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs.service": { "name": "nfs.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfslock.service": { "name": "nfslock.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-halt.service": { "name": "plymouth-halt.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "plymouth-kexec.service": { "name": "plymouth-kexec.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "plymouth-poweroff.service": { "name": "plymouth-poweroff.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "plymouth-quit-wait.service": { "name": "plymouth-quit-wait.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "plymouth-quit.service": { "name": "plymouth-quit.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "plymouth-read-write.service": { "name": "plymouth-read-write.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "plymouth-reboot.service": { "name": 
"plymouth-reboot.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "plymouth-start.service": { "name": "plymouth-start.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "plymouth-switch-root.service": { "name": "plymouth-switch-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "polkit.service": { "name": "polkit.service", "source": "systemd", "state": "running", "status": "static" }, "postfix.service": { "name": "postfix.service", "source": "systemd", "state": "running", "status": "enabled" }, "qemu-guest-agent.service": { "name": "qemu-guest-agent.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "quotaon.service": { "name": "quotaon.service", "source": "systemd", "state": "inactive", "status": "static" }, "rc-local.service": { "name": "rc-local.service", "source": "systemd", "state": "stopped", "status": "static" }, "rdisc.service": { "name": "rdisc.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rescue.service": { "name": "rescue.service", "source": "systemd", "state": "stopped", "status": "static" }, "restraintd.service": { "name": "restraintd.service", "source": "systemd", "state": "running", "status": "enabled" }, "rhel-autorelabel-mark.service": { "name": "rhel-autorelabel-mark.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "rhel-autorelabel.service": { "name": "rhel-autorelabel.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "rhel-configure.service": { "name": "rhel-configure.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "rhel-dmesg.service": { "name": "rhel-dmesg.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "rhel-domainname.service": { "name": "rhel-domainname.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "rhel-import-state.service": { "name": "rhel-import-state.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "rhel-loadmodules.service": { "name": "rhel-loadmodules.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "rhel-readonly.service": { "name": "rhel-readonly.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "rngd.service": { "name": "rngd.service", "source": "systemd", "state": "running", "status": "enabled" }, "rpc-gssd.service": { "name": "rpc-gssd.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-rquotad.service": { "name": "rpc-rquotad.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rpc-statd-notify.service": { "name": "rpc-statd-notify.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-statd.service": { "name": "rpc-statd.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpcbind.service": { "name": "rpcbind.service", "source": "systemd", "state": "running", "status": "enabled" }, "rpcgssd.service": { "name": "rpcgssd.service", "source": "systemd", "state": "inactive", "status": "static" }, "rpcidmapd.service": { "name": "rpcidmapd.service", "source": "systemd", "state": "inactive", "status": "static" }, "rsyncd.service": { "name": "rsyncd.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rsyncd@.service": { "name": "rsyncd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "rsyslog.service": { "name": "rsyslog.service", "source": "systemd", "state": "running", "status": 
"enabled" }, "selinux-policy-migrate-local-changes@.service": { "name": "selinux-policy-migrate-local-changes@.service", "source": "systemd", "state": "unknown", "status": "static" }, "selinux-policy-migrate-local-changes@targeted.service": { "name": "selinux-policy-migrate-local-changes@targeted.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "serial-getty@.service": { "name": "serial-getty@.service", "source": "systemd", "state": "unknown", "status": "disabled" }, "serial-getty@ttyS0.service": { "name": "serial-getty@ttyS0.service", "source": "systemd", "state": "running", "status": "unknown" }, "sshd-keygen.service": { "name": "sshd-keygen.service", "source": "systemd", "state": "stopped", "status": "static" }, "sshd.service": { "name": "sshd.service", "source": "systemd", "state": "running", "status": "enabled" }, "sshd@.service": { "name": "sshd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-ask-password-console.service": { "name": "systemd-ask-password-console.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-ask-password-plymouth.service": { "name": "systemd-ask-password-plymouth.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-ask-password-wall.service": { "name": "systemd-ask-password-wall.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-backlight@.service": { "name": "systemd-backlight@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-binfmt.service": { "name": "systemd-binfmt.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-bootchart.service": { "name": "systemd-bootchart.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "systemd-cryptsetup@luks\\x2deb5a5901\\x2d32b6\\x2d427f\\x2d885e\\x2d6ee3011725ec.service": { "name": "systemd-cryptsetup@luks\\x2deb5a5901\\x2d32b6\\x2d427f\\x2d885e\\x2d6ee3011725ec.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "systemd-firstboot.service": { "name": "systemd-firstboot.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-fsck-root.service": { "name": "systemd-fsck-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-fsck@.service": { "name": "systemd-fsck@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-halt.service": { "name": "systemd-halt.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hibernate-resume@.service": { "name": "systemd-hibernate-resume@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-hibernate.service": { "name": "systemd-hibernate.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hostnamed.service": { "name": "systemd-hostnamed.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hwdb-update.service": { "name": "systemd-hwdb-update.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-hybrid-sleep.service": { "name": "systemd-hybrid-sleep.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-importd.service": { "name": "systemd-importd.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-initctl.service": { "name": "systemd-initctl.service", "source": "systemd", "state": "stopped", "status": "static" }, 
"systemd-journal-catalog-update.service": { "name": "systemd-journal-catalog-update.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journal-flush.service": { "name": "systemd-journal-flush.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journald.service": { "name": "systemd-journald.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-kexec.service": { "name": "systemd-kexec.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-localed.service": { "name": "systemd-localed.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-logind.service": { "name": "systemd-logind.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-machine-id-commit.service": { "name": "systemd-machine-id-commit.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-machined.service": { "name": "systemd-machined.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-modules-load.service": { "name": "systemd-modules-load.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-nspawn@.service": { "name": "systemd-nspawn@.service", "source": "systemd", "state": "unknown", "status": "disabled" }, "systemd-poweroff.service": { "name": "systemd-poweroff.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-quotacheck.service": { "name": "systemd-quotacheck.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-random-seed.service": { "name": "systemd-random-seed.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-readahead-collect.service": { "name": "systemd-readahead-collect.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "systemd-readahead-done.service": { "name": "systemd-readahead-done.service", "source": "systemd", "state": "stopped", "status": "indirect" }, "systemd-readahead-drop.service": { "name": "systemd-readahead-drop.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "systemd-readahead-replay.service": { "name": "systemd-readahead-replay.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "systemd-reboot.service": { "name": "systemd-reboot.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-remount-fs.service": { "name": "systemd-remount-fs.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-rfkill@.service": { "name": "systemd-rfkill@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-shutdownd.service": { "name": "systemd-shutdownd.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-suspend.service": { "name": "systemd-suspend.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-sysctl.service": { "name": "systemd-sysctl.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-timedated.service": { "name": "systemd-timedated.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-tmpfiles-clean.service": { "name": "systemd-tmpfiles-clean.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles-setup-dev.service": { "name": "systemd-tmpfiles-setup-dev.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles-setup.service": { 
"name": "systemd-tmpfiles-setup.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udev-settle.service": { "name": "systemd-udev-settle.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udev-trigger.service": { "name": "systemd-udev-trigger.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udevd.service": { "name": "systemd-udevd.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-update-done.service": { "name": "systemd-update-done.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-update-utmp-runlevel.service": { "name": "systemd-update-utmp-runlevel.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-update-utmp.service": { "name": "systemd-update-utmp.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-user-sessions.service": { "name": "systemd-user-sessions.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-vconsole-setup.service": { "name": "systemd-vconsole-setup.service", "source": "systemd", "state": "stopped", "status": "static" }, "target.service": { "name": "target.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "targetclid.service": { "name": "targetclid.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "teamd@.service": { "name": "teamd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "tuned.service": { "name": "tuned.service", "source": "systemd", "state": "running", "status": "enabled" }, "wpa_supplicant.service": { "name": "wpa_supplicant.service", "source": "systemd", "state": "inactive", "status": "disabled" } } }, "changed": false } TASK [fedora.linux_system_roles.storage : Set storage_cryptsetup_services] ***** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:57 Friday 17 January 2025 04:18:13 -0500 (0:00:01.010) 0:08:05.915 ******** ok: [managed-node1] => { "ansible_facts": { "storage_cryptsetup_services": [ "systemd-cryptsetup@luks\\x2deb5a5901\\x2d32b6\\x2d427f\\x2d885e\\x2d6ee3011725ec.service" ] }, "changed": false } TASK [fedora.linux_system_roles.storage : Mask the systemd cryptsetup services] *** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:63 Friday 17 January 2025 04:18:13 -0500 (0:00:00.091) 0:08:06.006 ******** changed: [managed-node1] => (item=systemd-cryptsetup@luks\x2deb5a5901\x2d32b6\x2d427f\x2d885e\x2d6ee3011725ec.service) => { "ansible_loop_var": "item", "changed": true, "item": "systemd-cryptsetup@luks\\x2deb5a5901\\x2d32b6\\x2d427f\\x2d885e\\x2d6ee3011725ec.service", "name": "systemd-cryptsetup@luks\\x2deb5a5901\\x2d32b6\\x2d427f\\x2d885e\\x2d6ee3011725ec.service", "status": { "ActiveEnterTimestampMonotonic": "0", "ActiveExitTimestampMonotonic": "0", "ActiveState": "inactive", "After": "dev-mapper-foo\\x2dtest1.device systemd-readahead-replay.service cryptsetup-pre.target system-systemd\\x2dcryptsetup.slice systemd-readahead-collect.service systemd-journald.socket", "AllowIsolate": "no", "AmbientCapabilities": "0", "AssertResult": "no", "AssertTimestampMonotonic": "0", "Before": "umount.target cryptsetup.target", "BindsTo": "dev-mapper-foo\\x2dtest1.device", "BlockIOAccounting": "no", "BlockIOWeight": "18446744073709551615", "CPUAccounting": "no", "CPUQuotaPerSecUSec": "infinity", 
"CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUSchedulingResetOnFork": "no", "CPUShares": "18446744073709551615", "CanIsolate": "no", "CanReload": "no", "CanStart": "yes", "CanStop": "yes", "CapabilityBoundingSet": "18446744073709551615", "CollectMode": "inactive", "ConditionResult": "no", "ConditionTimestampMonotonic": "0", "Conflicts": "umount.target", "ControlPID": "0", "DefaultDependencies": "no", "Delegate": "no", "Description": "Cryptography Setup for luks-eb5a5901-32b6-427f-885e-6ee3011725ec", "DevicePolicy": "auto", "Documentation": "man:crypttab(5) man:systemd-cryptsetup-generator(8) man:systemd-cryptsetup@.service(8)", "ExecMainCode": "0", "ExecMainExitTimestampMonotonic": "0", "ExecMainPID": "0", "ExecMainStartTimestampMonotonic": "0", "ExecMainStatus": "0", "ExecStart": "{ path=/usr/lib/systemd/systemd-cryptsetup ; argv[]=/usr/lib/systemd/systemd-cryptsetup attach luks-eb5a5901-32b6-427f-885e-6ee3011725ec /dev/mapper/foo-test1 - ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecStop": "{ path=/usr/lib/systemd/systemd-cryptsetup ; argv[]=/usr/lib/systemd/systemd-cryptsetup detach luks-eb5a5901-32b6-427f-885e-6ee3011725ec ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "FailureAction": "none", "FileDescriptorStoreMax": "0", "FragmentPath": "/run/systemd/generator/systemd-cryptsetup@luks\\x2deb5a5901\\x2d32b6\\x2d427f\\x2d885e\\x2d6ee3011725ec.service", "GuessMainPID": "yes", "IOScheduling": "0", "Id": "systemd-cryptsetup@luks\\x2deb5a5901\\x2d32b6\\x2d427f\\x2d885e\\x2d6ee3011725ec.service", "IgnoreOnIsolate": "yes", "IgnoreOnSnapshot": "no", "IgnoreSIGPIPE": "yes", "InactiveEnterTimestampMonotonic": "0", "InactiveExitTimestampMonotonic": "0", "JobTimeoutAction": "none", "JobTimeoutUSec": "0", "KillMode": "control-group", "KillSignal": "15", "LimitAS": "18446744073709551615", "LimitCORE": "18446744073709551615", "LimitCPU": "18446744073709551615", "LimitDATA": "18446744073709551615", "LimitFSIZE": "18446744073709551615", "LimitLOCKS": "18446744073709551615", "LimitMEMLOCK": "65536", "LimitMSGQUEUE": "819200", "LimitNICE": "0", "LimitNOFILE": "4096", "LimitNPROC": "14311", "LimitRSS": "18446744073709551615", "LimitRTPRIO": "0", "LimitRTTIME": "18446744073709551615", "LimitSIGPENDING": "14311", "LimitSTACK": "18446744073709551615", "LoadState": "loaded", "MainPID": "0", "MemoryAccounting": "no", "MemoryCurrent": "18446744073709551615", "MemoryLimit": "18446744073709551615", "MountFlags": "0", "Names": "systemd-cryptsetup@luks\\x2deb5a5901\\x2d32b6\\x2d427f\\x2d885e\\x2d6ee3011725ec.service", "NeedDaemonReload": "no", "Nice": "0", "NoNewPrivileges": "no", "NonBlocking": "no", "NotifyAccess": "none", "OOMScoreAdjust": "0", "OnFailureJobMode": "replace", "PermissionsStartOnly": "no", "PrivateDevices": "no", "PrivateNetwork": "no", "PrivateTmp": "no", "ProtectHome": "no", "ProtectSystem": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "RemainAfterExit": "yes", "RequiredBy": "cryptsetup.target dev-mapper-luks\\x2deb5a5901\\x2d32b6\\x2d427f\\x2d885e\\x2d6ee3011725ec.device", "Requires": "system-systemd\\x2dcryptsetup.slice", "Restart": "no", "RestartUSec": "100ms", "Result": "success", "RootDirectoryStartOnly": "no", "RuntimeDirectoryMode": "0755", "SameProcessGroup": "no", "SecureBits": "0", "SendSIGHUP": "no", "SendSIGKILL": "yes", "Slice": "system-systemd\\x2dcryptsetup.slice", "SourcePath": "/etc/crypttab", "StandardError": "inherit", "StandardInput": "null", 
"StandardOutput": "journal", "StartLimitAction": "none", "StartLimitBurst": "5", "StartLimitInterval": "10000000", "StartupBlockIOWeight": "18446744073709551615", "StartupCPUShares": "18446744073709551615", "StatusErrno": "0", "StopWhenUnneeded": "no", "SubState": "dead", "SyslogLevelPrefix": "yes", "SyslogPriority": "30", "SystemCallErrorNumber": "0", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "TasksAccounting": "no", "TasksCurrent": "18446744073709551615", "TasksMax": "18446744073709551615", "TimeoutStartUSec": "0", "TimeoutStopUSec": "0", "TimerSlackNSec": "50000", "Transient": "no", "Type": "oneshot", "UMask": "0022", "UnitFilePreset": "disabled", "UnitFileState": "bad", "WantedBy": "dev-mapper-foo\\x2dtest1.device", "WatchdogTimestampMonotonic": "0", "WatchdogUSec": "0" } } TASK [fedora.linux_system_roles.storage : Manage the pools and volumes to match the specified state] *** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:69 Friday 17 January 2025 04:18:13 -0500 (0:00:00.508) 0:08:06.515 ******** changed: [managed-node1] => { "actions": [ { "action": "destroy format", "device": "/dev/mapper/luks-eb5a5901-32b6-427f-885e-6ee3011725ec", "fs_type": "xfs" }, { "action": "destroy device", "device": "/dev/mapper/luks-eb5a5901-32b6-427f-885e-6ee3011725ec", "fs_type": null }, { "action": "destroy format", "device": "/dev/mapper/foo-test1", "fs_type": "luks" }, { "action": "create format", "device": "/dev/mapper/foo-test1", "fs_type": "xfs" } ], "changed": true, "crypts": [ { "backing_device": "/dev/mapper/foo-test1", "name": "luks-eb5a5901-32b6-427f-885e-6ee3011725ec", "password": "-", "state": "absent" } ], "leaves": [ "/dev/mapper/foo-test1", "/dev/sdb", "/dev/sdc", "/dev/sdd", "/dev/sde", "/dev/sdf", "/dev/sdg", "/dev/sdh", "/dev/sdi", "/dev/xvda1" ], "mounts": [ { "fstype": "xfs", "path": "/opt/test1", "src": "/dev/mapper/luks-eb5a5901-32b6-427f-885e-6ee3011725ec", "state": "absent" }, { "dump": 0, "fstype": "xfs", "group": null, "mode": null, "opts": "defaults", "owner": null, "passno": 0, "path": "/opt/test1", "src": "/dev/mapper/foo-test1", "state": "mounted" } ], "packages": [ "xfsprogs", "e2fsprogs", "lvm2" ], "pools": [ { "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_clevis_pin": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "encryption_tang_thumbprint": null, "encryption_tang_url": null, "grow_to_fill": false, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "shared": false, "state": "present", "type": "lvm", "volumes": [ { "_device": "/dev/mapper/foo-test1", "_kernel_device": "/dev/dm-0", "_mount_id": "/dev/mapper/foo-test1", "_raw_device": "/dev/mapper/foo-test1", "_raw_kernel_device": "/dev/dm-0", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": 0, "encryption_luks_version": "luks2", "encryption_password": "VALUE_SPECIFIED_IN_NO_LOG_PARAMETER", "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", 
"mount_user": null, "name": "test1", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": "4g", "state": "present", "thin": false, "thin_pool_name": null, "thin_pool_size": null, "type": "lvm", "vdo_pool_size": null } ] } ], "volumes": [] } TASK [fedora.linux_system_roles.storage : Workaround for udev issue on some platforms] *** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:83 Friday 17 January 2025 04:19:18 -0500 (0:01:04.993) 0:09:11.509 ******** skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Check if /etc/fstab is present] ****** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:90 Friday 17 January 2025 04:19:18 -0500 (0:00:00.059) 0:09:11.569 ******** ok: [managed-node1] => { "changed": false, "stat": { "atime": 1737105399.703075, "attr_flags": "e", "attributes": [ "extents" ], "block_size": 4096, "blocks": 8, "charset": "us-ascii", "checksum": "f6ffd111ea1768ce77f1edc39e4c6ca3c9b99946", "ctime": 1737105399.7010748, "dev": 51713, "device_type": 0, "executable": false, "exists": true, "gid": 0, "gr_name": "root", "inode": 263644, "isblk": false, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": true, "issock": false, "isuid": false, "mimetype": "text/plain", "mode": "0644", "mtime": 1737105399.7010748, "nlink": 1, "path": "/etc/fstab", "pw_name": "root", "readable": true, "rgrp": true, "roth": true, "rusr": true, "size": 1311, "uid": 0, "version": "18446744072031193646", "wgrp": false, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [fedora.linux_system_roles.storage : Add fingerprint to /etc/fstab if present] *** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:95 Friday 17 January 2025 04:19:19 -0500 (0:00:00.548) 0:09:12.117 ******** ok: [managed-node1] => { "backup": "", "changed": false } TASK [fedora.linux_system_roles.storage : Unmask the systemd cryptsetup services] *** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:113 Friday 17 January 2025 04:19:19 -0500 (0:00:00.512) 0:09:12.629 ******** changed: [managed-node1] => (item=systemd-cryptsetup@luks\x2deb5a5901\x2d32b6\x2d427f\x2d885e\x2d6ee3011725ec.service) => { "ansible_loop_var": "item", "changed": true, "item": "systemd-cryptsetup@luks\\x2deb5a5901\\x2d32b6\\x2d427f\\x2d885e\\x2d6ee3011725ec.service", "name": "systemd-cryptsetup@luks\\x2deb5a5901\\x2d32b6\\x2d427f\\x2d885e\\x2d6ee3011725ec.service", "status": { "ActiveEnterTimestampMonotonic": "0", "ActiveExitTimestampMonotonic": "0", "ActiveState": "inactive", "AllowIsolate": "no", "AmbientCapabilities": "0", "AssertResult": "no", "AssertTimestampMonotonic": "0", "BlockIOAccounting": "no", "BlockIOWeight": "18446744073709551615", "CPUAccounting": "no", "CPUQuotaPerSecUSec": "infinity", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUSchedulingResetOnFork": "no", "CPUShares": "18446744073709551615", "CanIsolate": "no", "CanReload": "no", "CanStart": "yes", "CanStop": "yes", "CapabilityBoundingSet": "18446744073709551615", "CollectMode": "inactive", "ConditionResult": "no", 
"ConditionTimestampMonotonic": "0", "ControlPID": "0", "DefaultDependencies": "yes", "Delegate": "no", "Description": "systemd-cryptsetup@luks\\x2deb5a5901\\x2d32b6\\x2d427f\\x2d885e\\x2d6ee3011725ec.service", "DevicePolicy": "auto", "ExecMainCode": "0", "ExecMainExitTimestampMonotonic": "0", "ExecMainPID": "0", "ExecMainStartTimestampMonotonic": "0", "ExecMainStatus": "0", "FailureAction": "none", "FileDescriptorStoreMax": "0", "FragmentPath": "/dev/null", "GuessMainPID": "yes", "IOScheduling": "0", "Id": "systemd-cryptsetup@luks\\x2deb5a5901\\x2d32b6\\x2d427f\\x2d885e\\x2d6ee3011725ec.service", "IgnoreOnIsolate": "no", "IgnoreOnSnapshot": "no", "IgnoreSIGPIPE": "yes", "InactiveEnterTimestampMonotonic": "0", "InactiveExitTimestampMonotonic": "0", "JobTimeoutAction": "none", "JobTimeoutUSec": "0", "KillMode": "control-group", "KillSignal": "15", "LimitAS": "18446744073709551615", "LimitCORE": "18446744073709551615", "LimitCPU": "18446744073709551615", "LimitDATA": "18446744073709551615", "LimitFSIZE": "18446744073709551615", "LimitLOCKS": "18446744073709551615", "LimitMEMLOCK": "65536", "LimitMSGQUEUE": "819200", "LimitNICE": "0", "LimitNOFILE": "1048576", "LimitNPROC": "14311", "LimitRSS": "18446744073709551615", "LimitRTPRIO": "0", "LimitRTTIME": "18446744073709551615", "LimitSIGPENDING": "14311", "LimitSTACK": "18446744073709551615", "LoadState": "masked", "MainPID": "0", "MemoryAccounting": "no", "MemoryCurrent": "18446744073709551615", "MemoryLimit": "18446744073709551615", "MountFlags": "0", "Names": "systemd-cryptsetup@luks\\x2deb5a5901\\x2d32b6\\x2d427f\\x2d885e\\x2d6ee3011725ec.service", "NeedDaemonReload": "no", "Nice": "0", "NoNewPrivileges": "no", "NonBlocking": "no", "NotifyAccess": "none", "OOMScoreAdjust": "0", "OnFailureJobMode": "replace", "PermissionsStartOnly": "no", "PrivateDevices": "no", "PrivateNetwork": "no", "PrivateTmp": "no", "ProtectHome": "no", "ProtectSystem": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "RemainAfterExit": "no", "RequiredBy": "dev-mapper-luks\\x2deb5a5901\\x2d32b6\\x2d427f\\x2d885e\\x2d6ee3011725ec.device cryptsetup.target", "Restart": "no", "RestartUSec": "100ms", "Result": "success", "RootDirectoryStartOnly": "no", "RuntimeDirectoryMode": "0755", "SameProcessGroup": "no", "SecureBits": "0", "SendSIGHUP": "no", "SendSIGKILL": "yes", "StandardError": "inherit", "StandardInput": "null", "StandardOutput": "inherit", "StartLimitAction": "none", "StartLimitBurst": "5", "StartLimitInterval": "10000000", "StartupBlockIOWeight": "18446744073709551615", "StartupCPUShares": "18446744073709551615", "StatusErrno": "0", "StopWhenUnneeded": "no", "SubState": "dead", "SyslogLevelPrefix": "yes", "SyslogPriority": "30", "SystemCallErrorNumber": "0", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "TasksAccounting": "no", "TasksCurrent": "18446744073709551615", "TasksMax": "18446744073709551615", "TimeoutStartUSec": "1min 30s", "TimeoutStopUSec": "1min 30s", "TimerSlackNSec": "50000", "Transient": "no", "UMask": "0022", "UnitFileState": "bad", "WantedBy": "dev-mapper-foo\\x2dtest1.device", "WatchdogTimestampMonotonic": "0", "WatchdogUSec": "0" } } TASK [fedora.linux_system_roles.storage : Show blivet_output] ****************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:119 Friday 17 January 2025 04:19:20 -0500 (0:00:00.937) 0:09:13.567 ******** ok: [managed-node1] => { "blivet_output": { "actions": [ { "action": "destroy format", "device": 
"/dev/mapper/luks-eb5a5901-32b6-427f-885e-6ee3011725ec", "fs_type": "xfs" }, { "action": "destroy device", "device": "/dev/mapper/luks-eb5a5901-32b6-427f-885e-6ee3011725ec", "fs_type": null }, { "action": "destroy format", "device": "/dev/mapper/foo-test1", "fs_type": "luks" }, { "action": "create format", "device": "/dev/mapper/foo-test1", "fs_type": "xfs" } ], "changed": true, "crypts": [ { "backing_device": "/dev/mapper/foo-test1", "name": "luks-eb5a5901-32b6-427f-885e-6ee3011725ec", "password": "-", "state": "absent" } ], "failed": false, "leaves": [ "/dev/mapper/foo-test1", "/dev/sdb", "/dev/sdc", "/dev/sdd", "/dev/sde", "/dev/sdf", "/dev/sdg", "/dev/sdh", "/dev/sdi", "/dev/xvda1" ], "mounts": [ { "fstype": "xfs", "path": "/opt/test1", "src": "/dev/mapper/luks-eb5a5901-32b6-427f-885e-6ee3011725ec", "state": "absent" }, { "dump": 0, "fstype": "xfs", "group": null, "mode": null, "opts": "defaults", "owner": null, "passno": 0, "path": "/opt/test1", "src": "/dev/mapper/foo-test1", "state": "mounted" } ], "packages": [ "xfsprogs", "e2fsprogs", "lvm2" ], "pools": [ { "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_clevis_pin": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "encryption_tang_thumbprint": null, "encryption_tang_url": null, "grow_to_fill": false, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "shared": false, "state": "present", "type": "lvm", "volumes": [ { "_device": "/dev/mapper/foo-test1", "_kernel_device": "/dev/dm-0", "_mount_id": "/dev/mapper/foo-test1", "_raw_device": "/dev/mapper/foo-test1", "_raw_kernel_device": "/dev/dm-0", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": 0, "encryption_luks_version": "luks2", "encryption_password": "VALUE_SPECIFIED_IN_NO_LOG_PARAMETER", "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "mount_user": null, "name": "test1", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": "4g", "state": "present", "thin": false, "thin_pool_name": null, "thin_pool_size": null, "type": "lvm", "vdo_pool_size": null } ] } ], "volumes": [] } } TASK [fedora.linux_system_roles.storage : Set the list of pools for test verification] *** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:128 Friday 17 January 2025 04:19:20 -0500 (0:00:00.096) 0:09:13.664 ******** ok: [managed-node1] => { "ansible_facts": { "_storage_pools_list": [ { "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_clevis_pin": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "encryption_tang_thumbprint": null, "encryption_tang_url": null, "grow_to_fill": false, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "shared": 
false, "state": "present", "type": "lvm", "volumes": [ { "_device": "/dev/mapper/foo-test1", "_kernel_device": "/dev/dm-0", "_mount_id": "/dev/mapper/foo-test1", "_raw_device": "/dev/mapper/foo-test1", "_raw_kernel_device": "/dev/dm-0", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": 0, "encryption_luks_version": "luks2", "encryption_password": "VALUE_SPECIFIED_IN_NO_LOG_PARAMETER", "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "mount_user": null, "name": "test1", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": "4g", "state": "present", "thin": false, "thin_pool_name": null, "thin_pool_size": null, "type": "lvm", "vdo_pool_size": null } ] } ] }, "changed": false } TASK [fedora.linux_system_roles.storage : Set the list of volumes for test verification] *** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:132 Friday 17 January 2025 04:19:21 -0500 (0:00:00.100) 0:09:13.764 ******** ok: [managed-node1] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false } TASK [fedora.linux_system_roles.storage : Remove obsolete mounts] ************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:148 Friday 17 January 2025 04:19:21 -0500 (0:00:00.135) 0:09:13.900 ******** changed: [managed-node1] => (item={u'src': u'/dev/mapper/luks-eb5a5901-32b6-427f-885e-6ee3011725ec', u'state': u'absent', u'fstype': u'xfs', u'path': u'/opt/test1'}) => { "ansible_loop_var": "mount_info", "changed": true, "dump": "0", "fstab": "/etc/fstab", "fstype": "xfs", "mount_info": { "fstype": "xfs", "path": "/opt/test1", "src": "/dev/mapper/luks-eb5a5901-32b6-427f-885e-6ee3011725ec", "state": "absent" }, "name": "/opt/test1", "opts": "defaults", "passno": "0", "src": "/dev/mapper/luks-eb5a5901-32b6-427f-885e-6ee3011725ec" } TASK [fedora.linux_system_roles.storage : Tell systemd to refresh its view of /etc/fstab] *** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:159 Friday 17 January 2025 04:19:21 -0500 (0:00:00.436) 0:09:14.336 ******** ok: [managed-node1] => { "changed": false, "name": null, "status": {} } TASK [fedora.linux_system_roles.storage : Set up new/current mounts] *********** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:164 Friday 17 January 2025 04:19:22 -0500 (0:00:00.597) 0:09:14.934 ******** changed: [managed-node1] => (item={u'src': u'/dev/mapper/foo-test1', u'group': None, u'dump': 0, u'passno': 0, u'fstype': u'xfs', u'state': u'mounted', u'mode': None, u'owner': None, u'path': u'/opt/test1', u'opts': u'defaults'}) => { "ansible_loop_var": "mount_info", "changed": true, "dump": "0", "fstab": "/etc/fstab", "fstype": "xfs", "mount_info": { "dump": 0, "fstype": "xfs", "group": null, "mode": null, "opts": "defaults", "owner": null, "passno": 0, "path": "/opt/test1", "src": "/dev/mapper/foo-test1", "state": 
"mounted" }, "name": "/opt/test1", "opts": "defaults", "passno": "0", "src": "/dev/mapper/foo-test1" } TASK [fedora.linux_system_roles.storage : Manage mount ownership/permissions] *** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:175 Friday 17 January 2025 04:19:22 -0500 (0:00:00.623) 0:09:15.557 ******** skipping: [managed-node1] => (item={u'src': u'/dev/mapper/foo-test1', u'group': None, u'dump': 0, u'passno': 0, u'fstype': u'xfs', u'state': u'mounted', u'mode': None, u'owner': None, u'path': u'/opt/test1', u'opts': u'defaults'}) => { "ansible_loop_var": "mount_info", "changed": false, "mount_info": { "dump": 0, "fstype": "xfs", "group": null, "mode": null, "opts": "defaults", "owner": null, "passno": 0, "path": "/opt/test1", "src": "/dev/mapper/foo-test1", "state": "mounted" }, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Tell systemd to refresh its view of /etc/fstab] *** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:187 Friday 17 January 2025 04:19:22 -0500 (0:00:00.139) 0:09:15.696 ******** ok: [managed-node1] => { "changed": false, "name": null, "status": {} } TASK [fedora.linux_system_roles.storage : Retrieve facts for the /etc/crypttab file] *** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:195 Friday 17 January 2025 04:19:23 -0500 (0:00:00.638) 0:09:16.335 ******** ok: [managed-node1] => { "changed": false, "stat": { "atime": 1737105405.1960802, "attr_flags": "e", "attributes": [ "extents" ], "block_size": 4096, "blocks": 8, "charset": "us-ascii", "checksum": "552cf075acd7d67244861700a71df5d7c42dce5c", "ctime": 1737105402.2070773, "dev": 51713, "device_type": 0, "executable": false, "exists": true, "gid": 0, "gr_name": "root", "inode": 263648, "isblk": false, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": true, "issock": false, "isuid": false, "mimetype": "text/plain", "mode": "0600", "mtime": 1737105402.2060773, "nlink": 1, "path": "/etc/crypttab", "pw_name": "root", "readable": true, "rgrp": false, "roth": false, "rusr": true, "size": 66, "uid": 0, "version": "18446744072031194679", "wgrp": false, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [fedora.linux_system_roles.storage : Manage /etc/crypttab to account for changes we just made] *** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:200 Friday 17 January 2025 04:19:24 -0500 (0:00:00.536) 0:09:16.872 ******** changed: [managed-node1] => (item={u'state': u'absent', u'password': u'-', u'name': u'luks-eb5a5901-32b6-427f-885e-6ee3011725ec', u'backing_device': u'/dev/mapper/foo-test1'}) => { "ansible_loop_var": "entry", "backup": "", "changed": true, "entry": { "backing_device": "/dev/mapper/foo-test1", "name": "luks-eb5a5901-32b6-427f-885e-6ee3011725ec", "password": "-", "state": "absent" }, "found": 1 } MSG: 1 line(s) removed TASK [fedora.linux_system_roles.storage : Update facts] ************************ task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:222 Friday 17 January 2025 04:19:24 -0500 (0:00:00.686) 0:09:17.558 ******** ok: [managed-node1] TASK [Verify role results] ***************************************************** task path: 
TASK [Verify role results] *****************************************************
task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/tests_luks2.yml:477
Friday 17 January 2025 04:19:25 -0500 (0:00:01.085) 0:09:18.644 ********
included: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml for managed-node1

TASK [Print out pool information] **********************************************
task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:2
Friday 17 January 2025 04:19:26 -0500 (0:00:00.220) 0:09:18.865 ********
ok: [managed-node1] => { "_storage_pools_list": [ { "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_clevis_pin": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "encryption_tang_thumbprint": null, "encryption_tang_url": null, "grow_to_fill": false, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "shared": false, "state": "present", "type": "lvm", "volumes": [ { "_device": "/dev/mapper/foo-test1", "_kernel_device": "/dev/dm-0", "_mount_id": "/dev/mapper/foo-test1", "_raw_device": "/dev/mapper/foo-test1", "_raw_kernel_device": "/dev/dm-0", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": 0, "encryption_luks_version": "luks2", "encryption_password": "VALUE_SPECIFIED_IN_NO_LOG_PARAMETER", "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "mount_user": null, "name": "test1", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": "4g", "state": "present", "thin": false, "thin_pool_name": null, "thin_pool_size": null, "type": "lvm", "vdo_pool_size": null } ] } ] }

TASK [Print out volume information] ********************************************
task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:7
Friday 17 January 2025 04:19:26 -0500 (0:00:00.116) 0:09:18.982 ********
skipping: [managed-node1] => {}
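Before comparing requested and actual state, the test gathers per-device facts (fstype, label, mountpoint, size, type, uuid) for every block device on the node, as the next task shows. A manual spot-check of the same fields for the one volume under test, assuming lsblk is available on the managed node (a sketch, not the test's own info-gathering helper):

# Manual spot-check only; the test derives this via its own helper.
- name: Read the fields the verification compares (assumes lsblk on the node)
  ansible.builtin.command:
    cmd: lsblk --noheadings -o NAME,FSTYPE,LABEL,MOUNTPOINT,SIZE,TYPE,UUID /dev/mapper/foo-test1
  register: lsblk_out
  changed_when: false

- name: Show the collected fields
  ansible.builtin.debug:
    var: lsblk_out.stdout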
TASK [Collect info about the volumes.] *****************************************
task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:15
Friday 17 January 2025 04:19:26 -0500 (0:00:00.091) 0:09:19.073 ********
ok: [managed-node1] => { "changed": false, "info": { "/dev/mapper/foo-test1": { "fstype": "xfs", "label": "", "mountpoint": "/opt/test1", "name": "/dev/mapper/foo-test1", "size": "4G", "type": "lvm", "uuid": "7c1366e7-5b44-46b6-ad0d-9e5b26028fe3" }, "/dev/sda": { "fstype": "LVM2_member", "label": "", "mountpoint": "", "name": "/dev/sda", "size": "10G", "type": "disk", "uuid": "30b7yk-4Y3c-DHs7-HP7U-2G7i-eOL6-HqzhXx" }, "/dev/sdb": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdb", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdc": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdc", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdd": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdd", "size": "1T", "type": "disk", "uuid": "" }, "/dev/sde": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sde", "size": "1T", "type": "disk", "uuid": "" }, "/dev/sdf": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdf", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdg": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdg", "size": "1T", "type": "disk", "uuid": "" }, "/dev/sdh": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdh", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdi": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdi", "size": "10G", "type": "disk", "uuid": "" }, "/dev/xvda": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/xvda", "size": "250G", "type": "disk", "uuid": "" }, "/dev/xvda1": { "fstype": "ext4", "label": "", "mountpoint": "/", "name": "/dev/xvda1", "size": "250G", "type": "partition", "uuid": "c7b7d6a5-fd01-4b9b-bcca-153eaff9d312" } } }

TASK [Read the /etc/fstab file for volume existence] ***************************
task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:20
Friday 17 January 2025 04:19:26 -0500 (0:00:00.529) 0:09:19.603 ********
ok: [managed-node1] => { "changed": false, "cmd": [ "cat", "/etc/fstab" ], "delta": "0:00:00.002923", "end": "2025-01-17 04:19:27.262688", "rc": 0, "start": "2025-01-17 04:19:27.259765" }
STDOUT:
# system_role:storage
#
# /etc/fstab
# Created by anaconda on Thu Jun 20 10:23:46 2024
#
# Accessible filesystems, by reference, are maintained under '/dev/disk'
# See man pages fstab(5), findfs(8), mount(8) and/or blkid(8) for more info
#
UUID=c7b7d6a5-fd01-4b9b-bcca-153eaff9d312 / ext4 defaults 1 1
ntap-bos-c01-eng01-nfs01b.storage.bos.redhat.com:/devops_engineering_nfs/devarchive/redhat /mnt/redhat nfs ro,rsize=32768,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0
nest.test.redhat.com:/mnt/qa /mnt/qa nfs defaults,rsize=8192,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0
vtap-eng01.storage.rdu2.redhat.com:/vol/engarchive /mnt/engarchive nfs ro,rsize=32768,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0
nest.test.redhat.com:/mnt/tpsdist /mnt/tpsdist nfs defaults,rsize=8192,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0
ntap-bos-c01-eng01-nfs01b.storage.bos.redhat.com:/devops_engineering_nfs/devarchive/redhat/brewroot /mnt/brew nfs ro,rsize=32768,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0
ntap-bos-c01-eng01-nfs01b.storage.bos.redhat.com:/devops_brew_scratch_nfs/scratch /mnt/brew_scratch nfs
ro,rsize=32768,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0 /dev/mapper/foo-test1 /opt/test1 xfs defaults 0 0 TASK [Read the /etc/crypttab file] ********************************************* task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:25 Friday 17 January 2025 04:19:27 -0500 (0:00:00.524) 0:09:20.128 ******** ok: [managed-node1] => { "changed": false, "cmd": [ "cat", "/etc/crypttab" ], "delta": "0:00:00.002899", "end": "2025-01-17 04:19:27.805322", "failed_when_result": false, "rc": 0, "start": "2025-01-17 04:19:27.802423" } TASK [Verify the volumes listed in storage_pools were correctly managed] ******* task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:34 Friday 17 January 2025 04:19:27 -0500 (0:00:00.506) 0:09:20.634 ******** included: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool.yml for managed-node1 TASK [Set _storage_pool_tests] ************************************************* task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool.yml:5 Friday 17 January 2025 04:19:28 -0500 (0:00:00.256) 0:09:20.891 ******** ok: [managed-node1] => { "ansible_facts": { "_storage_pool_tests": [ "members", "volumes" ] }, "changed": false } TASK [Get VG shared value status] ********************************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool.yml:18 Friday 17 January 2025 04:19:28 -0500 (0:00:00.086) 0:09:20.977 ******** ok: [managed-node1] => { "changed": false, "cmd": [ "vgs", "--noheadings", "--binary", "-o", "shared", "foo" ], "delta": "0:00:00.017678", "end": "2025-01-17 04:19:28.712787", "rc": 0, "start": "2025-01-17 04:19:28.695109" } STDOUT: 0 TASK [Verify that VG shared value checks out] ********************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool.yml:24 Friday 17 January 2025 04:19:28 -0500 (0:00:00.535) 0:09:21.512 ******** ok: [managed-node1] => { "changed": false } MSG: All assertions passed TASK [Verify pool subset] ****************************************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool.yml:34 Friday 17 January 2025 04:19:28 -0500 (0:00:00.078) 0:09:21.592 ******** included: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml for managed-node1 included: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-volumes.yml for managed-node1 TASK [Set test variables] ****************************************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:2 Friday 17 January 2025 04:19:29 -0500 (0:00:00.151) 0:09:21.743 ******** ok: [managed-node1] => { "ansible_facts": { "_storage_test_expected_pv_count": "1", "_storage_test_pool_pvs_lvm": [ "/dev/sda" ] }, "changed": false } TASK [Get the canonical device path for each member device] ******************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:8 Friday 17 January 2025 04:19:29 -0500 (0:00:00.077) 0:09:21.820 ******** ok: [managed-node1] => (item=/dev/sda) => { 
"ansible_loop_var": "pv", "changed": false, "device": "/dev/sda", "pv": "/dev/sda" } TASK [Set pvs lvm length] ****************************************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:17 Friday 17 January 2025 04:19:29 -0500 (0:00:00.542) 0:09:22.363 ******** ok: [managed-node1] => { "ansible_facts": { "__pvs_lvm_len": "1" }, "changed": false } TASK [Set pool pvs] ************************************************************ task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:22 Friday 17 January 2025 04:19:29 -0500 (0:00:00.115) 0:09:22.479 ******** ok: [managed-node1] => { "ansible_facts": { "_storage_test_pool_pvs": [ "/dev/sda" ] }, "changed": false } TASK [Verify PV count] ********************************************************* task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:27 Friday 17 January 2025 04:19:29 -0500 (0:00:00.114) 0:09:22.594 ******** ok: [managed-node1] => { "changed": false } MSG: All assertions passed TASK [Set expected pv type] **************************************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:36 Friday 17 January 2025 04:19:30 -0500 (0:00:00.128) 0:09:22.722 ******** ok: [managed-node1] => { "ansible_facts": { "_storage_test_expected_pv_type": "disk" }, "changed": false } TASK [Set expected pv type] **************************************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:41 Friday 17 January 2025 04:19:30 -0500 (0:00:00.082) 0:09:22.804 ******** ok: [managed-node1] => { "ansible_facts": { "_storage_test_expected_pv_type": "disk" }, "changed": false } TASK [Set expected pv type] **************************************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:46 Friday 17 January 2025 04:19:30 -0500 (0:00:00.090) 0:09:22.894 ******** skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check the type of each PV] *********************************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:51 Friday 17 January 2025 04:19:30 -0500 (0:00:00.064) 0:09:22.959 ******** ok: [managed-node1] => (item=/dev/sda) => { "ansible_loop_var": "pv", "changed": false, "pv": "/dev/sda" } MSG: All assertions passed TASK [Check that blivet supports PV grow to fill] ****************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:64 Friday 17 January 2025 04:19:30 -0500 (0:00:00.087) 0:09:23.046 ******** ok: [managed-node1] => { "changed": false, "rc": 0 } STDOUT: False STDERR: Shared connection to 10.31.46.65 closed. 
TASK [Verify that PVs fill the whole devices when they should] ***************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:73 Friday 17 January 2025 04:19:30 -0500 (0:00:00.537) 0:09:23.583 ******** skipping: [managed-node1] => (item=/dev/sda) => { "ansible_loop_var": "st_pool_pv", "changed": false, "skip_reason": "Conditional result was False", "st_pool_pv": "/dev/sda" } TASK [Check MD RAID] *********************************************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:83 Friday 17 January 2025 04:19:31 -0500 (0:00:00.125) 0:09:23.708 ******** included: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml for managed-node1 TASK [Get information about RAID] ********************************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:8 Friday 17 January 2025 04:19:31 -0500 (0:00:00.216) 0:09:23.925 ******** skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set active devices regex] ************************************************ task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:14 Friday 17 January 2025 04:19:31 -0500 (0:00:00.088) 0:09:24.014 ******** skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set spare devices regex] ************************************************* task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:19 Friday 17 January 2025 04:19:31 -0500 (0:00:00.141) 0:09:24.156 ******** skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set md version regex] **************************************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:24 Friday 17 January 2025 04:19:31 -0500 (0:00:00.106) 0:09:24.262 ******** skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set md chunk size regex] ************************************************* task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:29 Friday 17 January 2025 04:19:31 -0500 (0:00:00.096) 0:09:24.359 ******** skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Parse the chunk size] **************************************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:37 Friday 17 January 2025 04:19:31 -0500 (0:00:00.090) 0:09:24.450 ******** skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check RAID active devices count] ***************************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:46 Friday 17 January 2025 04:19:31 -0500 (0:00:00.083) 0:09:24.534 ******** skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check RAID spare devices count] ****************************************** task path: 
/tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:55 Friday 17 January 2025 04:19:31 -0500 (0:00:00.073) 0:09:24.608 ******** skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check RAID metadata version] ********************************************* task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:64 Friday 17 January 2025 04:19:31 -0500 (0:00:00.077) 0:09:24.686 ******** skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check RAID chunk size] *************************************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:74 Friday 17 January 2025 04:19:32 -0500 (0:00:00.068) 0:09:24.754 ******** skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Reset variables used by tests] ******************************************* task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:83 Friday 17 January 2025 04:19:32 -0500 (0:00:00.062) 0:09:24.817 ******** ok: [managed-node1] => { "ansible_facts": { "storage_test_md_active_devices_re": null, "storage_test_md_chunk_size_re": null, "storage_test_md_metadata_version_re": null, "storage_test_md_spare_devices_re": null }, "changed": false } TASK [Check LVM RAID] ********************************************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:86 Friday 17 January 2025 04:19:32 -0500 (0:00:00.063) 0:09:24.880 ******** included: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-lvmraid.yml for managed-node1 TASK [Validate pool member LVM RAID settings] ********************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-lvmraid.yml:2 Friday 17 January 2025 04:19:32 -0500 (0:00:00.128) 0:09:25.008 ******** included: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-lvmraid.yml for managed-node1 TASK [Get information about the LV] ******************************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-lvmraid.yml:8 Friday 17 January 2025 04:19:32 -0500 (0:00:00.233) 0:09:25.242 ******** skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set LV segment type] ***************************************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-lvmraid.yml:16 Friday 17 January 2025 04:19:32 -0500 (0:00:00.138) 0:09:25.380 ******** skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check segment type] ****************************************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-lvmraid.yml:20 Friday 17 January 2025 04:19:32 -0500 (0:00:00.310) 0:09:25.690 ******** skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set LV stripe size] ****************************************************** 
task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-lvmraid.yml:27 Friday 17 January 2025 04:19:33 -0500 (0:00:00.094) 0:09:25.785 ******** skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Parse the requested stripe size] ***************************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-lvmraid.yml:31 Friday 17 January 2025 04:19:33 -0500 (0:00:00.107) 0:09:25.893 ******** skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set expected stripe size] ************************************************ task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-lvmraid.yml:37 Friday 17 January 2025 04:19:33 -0500 (0:00:00.096) 0:09:25.989 ******** skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check stripe size] ******************************************************* task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-lvmraid.yml:42 Friday 17 January 2025 04:19:33 -0500 (0:00:00.115) 0:09:26.105 ******** skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check Thin Pools] ******************************************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:89 Friday 17 January 2025 04:19:33 -0500 (0:00:00.060) 0:09:26.165 ******** included: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-thin.yml for managed-node1 TASK [Validate pool member thinpool settings] ********************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-thin.yml:2 Friday 17 January 2025 04:19:33 -0500 (0:00:00.176) 0:09:26.342 ******** included: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-thin.yml for managed-node1 TASK [Get information about thinpool] ****************************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-thin.yml:8 Friday 17 January 2025 04:19:33 -0500 (0:00:00.185) 0:09:26.528 ******** skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check that volume is in correct thinpool (when thinp name is provided)] *** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-thin.yml:16 Friday 17 January 2025 04:19:33 -0500 (0:00:00.089) 0:09:26.618 ******** skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check that volume is in thinpool (when thinp name is not provided)] ****** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-thin.yml:22 Friday 17 January 2025 04:19:34 -0500 (0:00:00.128) 0:09:26.747 ******** skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Reset variable used by test] ********************************************* task path: 
/tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-thin.yml:26 Friday 17 January 2025 04:19:34 -0500 (0:00:00.127) 0:09:26.874 ******** ok: [managed-node1] => { "ansible_facts": { "storage_test_thin_status": null }, "changed": false } TASK [Check member encryption] ************************************************* task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:92 Friday 17 January 2025 04:19:34 -0500 (0:00:00.111) 0:09:26.986 ******** included: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-encryption.yml for managed-node1 TASK [Set test variables] ****************************************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-encryption.yml:5 Friday 17 January 2025 04:19:34 -0500 (0:00:00.261) 0:09:27.247 ******** ok: [managed-node1] => { "ansible_facts": { "_storage_test_expected_crypttab_entries": "0", "_storage_test_expected_crypttab_key_file": "-" }, "changed": false } TASK [Validate pool member LUKS settings] ************************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-encryption.yml:10 Friday 17 January 2025 04:19:34 -0500 (0:00:00.066) 0:09:27.314 ******** skipping: [managed-node1] => (item=/dev/sda) => { "_storage_test_pool_member_path": "/dev/sda", "ansible_loop_var": "_storage_test_pool_member_path", "changed": false, "skip_reason": "Conditional result was False" } TASK [Validate pool member crypttab entries] *********************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-encryption.yml:17 Friday 17 January 2025 04:19:34 -0500 (0:00:00.075) 0:09:27.389 ******** included: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-crypttab.yml for managed-node1 TASK [Set variables used by tests] ********************************************* task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-crypttab.yml:2 Friday 17 January 2025 04:19:34 -0500 (0:00:00.152) 0:09:27.542 ******** ok: [managed-node1] => { "ansible_facts": { "_storage_test_crypttab_entries": [] }, "changed": false } TASK [Check for /etc/crypttab entry] ******************************************* task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-crypttab.yml:6 Friday 17 January 2025 04:19:34 -0500 (0:00:00.080) 0:09:27.623 ******** ok: [managed-node1] => { "changed": false } MSG: All assertions passed TASK [Validate the format of the crypttab entry] ******************************* task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-crypttab.yml:14 Friday 17 January 2025 04:19:34 -0500 (0:00:00.072) 0:09:27.696 ******** skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check backing device of crypttab entry] ********************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-crypttab.yml:23 Friday 17 January 2025 04:19:35 -0500 (0:00:00.059) 0:09:27.755 ******** skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result 
was False" } TASK [Check key file of crypttab entry] **************************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-crypttab.yml:32 Friday 17 January 2025 04:19:35 -0500 (0:00:00.060) 0:09:27.815 ******** skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Clear test variables] **************************************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-crypttab.yml:41 Friday 17 January 2025 04:19:35 -0500 (0:00:00.058) 0:09:27.874 ******** ok: [managed-node1] => { "ansible_facts": { "_storage_test_crypttab_entries": null }, "changed": false } TASK [Clear test variables] **************************************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-encryption.yml:24 Friday 17 January 2025 04:19:35 -0500 (0:00:00.059) 0:09:27.934 ******** ok: [managed-node1] => { "ansible_facts": { "_storage_test_crypttab_entries": null, "_storage_test_crypttab_key_file": null }, "changed": false } TASK [Check VDO] *************************************************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:95 Friday 17 January 2025 04:19:35 -0500 (0:00:00.059) 0:09:27.993 ******** included: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-vdo.yml for managed-node1 TASK [Validate pool member VDO settings] *************************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-vdo.yml:2 Friday 17 January 2025 04:19:35 -0500 (0:00:00.136) 0:09:28.129 ******** included: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-vdo.yml for managed-node1 TASK [Get information about VDO deduplication] ********************************* task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-vdo.yml:8 Friday 17 January 2025 04:19:35 -0500 (0:00:00.120) 0:09:28.249 ******** skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check if VDO deduplication is off] *************************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-vdo.yml:15 Friday 17 January 2025 04:19:35 -0500 (0:00:00.059) 0:09:28.309 ******** skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check if VDO deduplication is on] **************************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-vdo.yml:21 Friday 17 January 2025 04:19:35 -0500 (0:00:00.061) 0:09:28.371 ******** skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Get information about VDO compression] *********************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-vdo.yml:27 Friday 17 January 2025 04:19:35 -0500 (0:00:00.061) 0:09:28.432 ******** skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check if VDO 
deduplication is off] *************************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-vdo.yml:34 Friday 17 January 2025 04:19:35 -0500 (0:00:00.058) 0:09:28.490 ******** skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check if VDO deduplication is on] **************************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-vdo.yml:40 Friday 17 January 2025 04:19:35 -0500 (0:00:00.056) 0:09:28.547 ******** skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Clear test variables] **************************************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-vdo.yml:46 Friday 17 January 2025 04:19:35 -0500 (0:00:00.066) 0:09:28.614 ******** ok: [managed-node1] => { "ansible_facts": { "storage_test_vdo_status": null }, "changed": false } TASK [Check Stratis] *********************************************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:98 Friday 17 January 2025 04:19:35 -0500 (0:00:00.074) 0:09:28.689 ******** included: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-stratis.yml for managed-node1 TASK [Run 'stratis report'] **************************************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-stratis.yml:6 Friday 17 January 2025 04:19:36 -0500 (0:00:00.139) 0:09:28.828 ******** skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Get information about Stratis] ******************************************* task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-stratis.yml:11 Friday 17 January 2025 04:19:36 -0500 (0:00:00.055) 0:09:28.884 ******** skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify that the pools was created] *************************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-stratis.yml:15 Friday 17 January 2025 04:19:36 -0500 (0:00:00.056) 0:09:28.941 ******** skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify that encryption is correctly set] ********************************* task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-stratis.yml:25 Friday 17 January 2025 04:19:36 -0500 (0:00:00.059) 0:09:29.000 ******** skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify that Clevis/Tang encryption is correctly set] ********************* task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-stratis.yml:34 Friday 17 January 2025 04:19:36 -0500 (0:00:00.056) 0:09:29.057 ******** skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Reset variable used by test] ********************************************* task path: 
/tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-stratis.yml:44 Friday 17 January 2025 04:19:36 -0500 (0:00:00.056) 0:09:29.113 ******** ok: [managed-node1] => { "ansible_facts": { "storage_test_stratis_report": null }, "changed": false } TASK [Clean up test variables] ************************************************* task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:101 Friday 17 January 2025 04:19:36 -0500 (0:00:00.062) 0:09:29.176 ******** ok: [managed-node1] => { "ansible_facts": { "__pvs_lvm_len": null, "_storage_test_expected_pv_count": null, "_storage_test_expected_pv_type": null, "_storage_test_pool_pvs": [], "_storage_test_pool_pvs_lvm": [] }, "changed": false } TASK [Verify the volumes] ****************************************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-volumes.yml:3 Friday 17 January 2025 04:19:36 -0500 (0:00:00.057) 0:09:29.233 ******** included: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume.yml for managed-node1 TASK [Set storage volume test variables] *************************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume.yml:2 Friday 17 January 2025 04:19:36 -0500 (0:00:00.108) 0:09:29.341 ******** ok: [managed-node1] => { "ansible_facts": { "_storage_test_volume_present": true, "_storage_volume_tests": [ "mount", "fstab", "fs", "device", "encryption", "md", "size", "cache" ] }, "changed": false } TASK [Run test verify for {{ storage_test_volume_subset }}] ******************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume.yml:19 Friday 17 January 2025 04:19:36 -0500 (0:00:00.067) 0:09:29.409 ******** included: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml for managed-node1 included: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml for managed-node1 included: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fs.yml for managed-node1 included: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml for managed-node1 included: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml for managed-node1 included: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml for managed-node1 included: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml for managed-node1 included: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml for managed-node1 TASK [Get expected mount device based on device type] ************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:7 Friday 17 January 2025 04:19:37 -0500 (0:00:00.315) 0:09:29.725 ******** ok: [managed-node1] => { "ansible_facts": { "storage_test_device_path": "/dev/mapper/foo-test1" }, "changed": false } TASK [Set some facts] ********************************************************** task path: 
/tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:11 Friday 17 January 2025 04:19:37 -0500 (0:00:00.065) 0:09:29.790 ******** ok: [managed-node1] => { "ansible_facts": { "storage_test_mount_expected_mount_point": "/opt/test1", "storage_test_swap_expected_matches": "0" }, "changed": false } TASK [Get information about the mountpoint directory] ************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:19 Friday 17 January 2025 04:19:37 -0500 (0:00:00.070) 0:09:29.861 ******** skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify the current mount state by device] ******************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:28 Friday 17 January 2025 04:19:37 -0500 (0:00:00.151) 0:09:30.013 ******** ok: [managed-node1] => { "changed": false } MSG: All assertions passed TASK [Verify mount directory user] ********************************************* task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:36 Friday 17 January 2025 04:19:37 -0500 (0:00:00.068) 0:09:30.081 ******** skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify mount directory group] ******************************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:42 Friday 17 January 2025 04:19:37 -0500 (0:00:00.061) 0:09:30.142 ******** skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify mount directory permissions] ************************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:48 Friday 17 January 2025 04:19:37 -0500 (0:00:00.057) 0:09:30.199 ******** skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Get path of test volume device] ****************************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:57 Friday 17 January 2025 04:19:37 -0500 (0:00:00.056) 0:09:30.256 ******** skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Gather swap info] ******************************************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:63 Friday 17 January 2025 04:19:37 -0500 (0:00:00.056) 0:09:30.312 ******** skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify swap status] ****************************************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:69 Friday 17 January 2025 04:19:37 -0500 (0:00:00.061) 0:09:30.373 ******** skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Unset facts] ************************************************************* task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:79 Friday 17 January 2025 
TASK [Unset facts] ************************************************************* task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:79 Friday 17 January 2025 04:19:37 -0500 (0:00:00.057) 0:09:30.431 ******** ok: [managed-node1] => { "ansible_facts": { "storage_test_found_mount_stat": null, "storage_test_mount_expected_mount_point": null, "storage_test_swap_expected_matches": null, "storage_test_swaps": null, "storage_test_sys_node": null }, "changed": false } TASK [Set some variables for fstab checking] *********************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:2 Friday 17 January 2025 04:19:37 -0500 (0:00:00.059) 0:09:30.490 ******** ok: [managed-node1] => { "ansible_facts": { "storage_test_fstab_expected_id_matches": "1", "storage_test_fstab_expected_mount_options_matches": "1", "storage_test_fstab_expected_mount_point_matches": "1", "storage_test_fstab_id_matches": [ "/dev/mapper/foo-test1 " ], "storage_test_fstab_mount_options_matches": [ " /opt/test1 xfs defaults " ], "storage_test_fstab_mount_point_matches": [ " /opt/test1 " ] }, "changed": false } TASK [Verify that the device identifier appears in /etc/fstab] ***************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:17 Friday 17 January 2025 04:19:37 -0500 (0:00:00.101) 0:09:30.591 ******** ok: [managed-node1] => { "changed": false } MSG: All assertions passed TASK [Verify the fstab mount point] ******************************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:24 Friday 17 January 2025 04:19:37 -0500 (0:00:00.064) 0:09:30.656 ******** ok: [managed-node1] => { "changed": false } MSG: All assertions passed TASK [Verify mount_options] **************************************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:33 Friday 17 January 2025 04:19:38 -0500 (0:00:00.053) 0:09:30.710 ******** skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify fingerprint] ****************************************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:45 Friday 17 January 2025 04:19:38 -0500 (0:00:00.057) 0:09:30.767 ******** ok: [managed-node1] => { "changed": false } MSG: All assertions passed TASK [Clean up variables] ****************************************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:52 Friday 17 January 2025 04:19:38 -0500 (0:00:00.067) 0:09:30.835 ******** ok: [managed-node1] => { "ansible_facts": { "storage_test_fstab_expected_id_matches": null, "storage_test_fstab_expected_mount_options_matches": null, "storage_test_fstab_expected_mount_point_matches": null, "storage_test_fstab_id_matches": null, "storage_test_fstab_mount_options_matches": null, "storage_test_fstab_mount_point_matches": null }, "changed": false } TASK [Verify fs type] ********************************************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fs.yml:6 Friday 17 January 2025 04:19:38 -0500 (0:00:00.039) 0:09:30.875 ******** ok: [managed-node1] => { "changed": false } MSG: All assertions passed
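The fstab facts above are simple pattern matches pulled out of /etc/fstab, with every expected count set to "1" for this single mounted volume. A sketch of how the device-identifier match and its assertion can be built; storage_test_fstab is a hypothetical register holding the file content, while the fact names mirror the log:

- name: Set some variables for fstab checking
  set_fact:
    storage_test_fstab_id_matches: "{{ storage_test_fstab.stdout_lines | select('search', '^' ~ storage_test_device_path ~ ' ') | list }}"
    storage_test_fstab_expected_id_matches: "1"

- name: Verify that the device identifier appears in /etc/fstab
  assert:
    that:
      - storage_test_fstab_id_matches | length == storage_test_fstab_expected_id_matches | int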
TASK [Verify fs label] ********************************************************* task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fs.yml:14 Friday 17 January 2025 04:19:38 -0500 (0:00:00.051) 0:09:30.926 ******** ok: [managed-node1] => { "changed": false } MSG: All assertions passed TASK [See whether the device node is present] ********************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:3 Friday 17 January 2025 04:19:38 -0500 (0:00:00.066) 0:09:30.992 ******** ok: [managed-node1] => { "changed": false, "stat": { "atime": 1737105558.6732123, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 0, "charset": "binary", "ctime": 1737105558.6732123, "dev": 5, "device_type": 64768, "executable": false, "exists": true, "gid": 6, "gr_name": "disk", "inode": 123121, "isblk": true, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": false, "issock": false, "isuid": false, "mimetype": "inode/symlink", "mode": "0660", "mtime": 1737105558.6732123, "nlink": 1, "path": "/dev/mapper/foo-test1", "pw_name": "root", "readable": true, "rgrp": true, "roth": false, "rusr": true, "size": 0, "uid": 0, "version": null, "wgrp": true, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [Verify the presence/absence of the device node] ************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:9 Friday 17 January 2025 04:19:38 -0500 (0:00:00.504) 0:09:31.497 ******** ok: [managed-node1] => { "changed": false } MSG: All assertions passed TASK [Verify the presence/absence of the device node] ************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:16 Friday 17 January 2025 04:19:38 -0500 (0:00:00.130) 0:09:31.627 ******** skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Make sure we got info about this volume] ********************************* task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:23 Friday 17 January 2025 04:19:39 -0500 (0:00:00.070) 0:09:31.698 ******** ok: [managed-node1] => { "changed": false } MSG: All assertions passed TASK [Process volume type (set initial value) (1/2)] *************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:29 Friday 17 January 2025 04:19:39 -0500 (0:00:00.075) 0:09:31.774 ******** ok: [managed-node1] => { "ansible_facts": { "st_volume_type": "lvm" }, "changed": false } TASK [Process volume type (get RAID value) (2/2)] ****************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:33 Friday 17 January 2025 04:19:39 -0500 (0:00:00.092) 0:09:31.866 ******** skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify the volume's device type] ***************************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:38 Friday 17 January 2025 04:19:39 -0500 (0:00:00.059) 0:09:31.925 ******** ok: [managed-node1] => { "changed": false } MSG: All assertions passed
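The stat payload above (exists: true, isblk: true for /dev/mapper/foo-test1) is what the presence assertion consumes. A compact sketch of that pair of tasks; the register name is chosen here for illustration:

- name: See whether the device node is present
  stat:
    path: /dev/mapper/foo-test1
    follow: true          # resolve the device-mapper symlink before statting
  register: storage_test_dev

- name: Verify the presence/absence of the device node
  assert:
    that:
      - storage_test_dev.stat.exists
      - storage_test_dev.stat.isblk
    msg: "expected a block device node at /dev/mapper/foo-test1"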
TASK [Stat the LUKS device, if encrypted] ************************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:3 Friday 17 January 2025 04:19:39 -0500 (0:00:00.075) 0:09:32.000 ******** skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Ensure cryptsetup is present] ******************************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:10 Friday 17 January 2025 04:19:39 -0500 (0:00:00.058) 0:09:32.059 ******** ok: [managed-node1] => { "changed": false, "rc": 0, "results": [ "cryptsetup-2.0.3-6.el7.x86_64 providing cryptsetup is already installed" ] } TASK [Collect LUKS info for this volume] *************************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:16 Friday 17 January 2025 04:19:40 -0500 (0:00:00.702) 0:09:32.762 ******** skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify the presence/absence of the LUKS device node] ********************* task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:22 Friday 17 January 2025 04:19:40 -0500 (0:00:00.061) 0:09:32.824 ******** skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify that the raw device is the same as the device if not encrypted] *** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:29 Friday 17 January 2025 04:19:40 -0500 (0:00:00.059) 0:09:32.884 ******** ok: [managed-node1] => { "changed": false } MSG: All assertions passed TASK [Make sure we got info about the LUKS volume if encrypted] **************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:40 Friday 17 January 2025 04:19:40 -0500 (0:00:00.079) 0:09:32.963 ******** skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify the LUKS volume's device type if encrypted] *********************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:46 Friday 17 January 2025 04:19:40 -0500 (0:00:00.060) 0:09:33.024 ******** skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check LUKS version] ****************************************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:51 Friday 17 January 2025 04:19:40 -0500 (0:00:00.056) 0:09:33.080 ******** skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check LUKS key size] ***************************************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:63 Friday 17 January 2025 04:19:40 -0500 (0:00:00.056) 0:09:33.137 ******** skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check LUKS cipher] ******************************************************* task path:
/tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:75 Friday 17 January 2025 04:19:40 -0500 (0:00:00.056) 0:09:33.193 ******** skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set test variables] ****************************************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:87 Friday 17 January 2025 04:19:40 -0500 (0:00:00.056) 0:09:33.250 ******** ok: [managed-node1] => { "ansible_facts": { "_storage_test_crypttab_entries": [], "_storage_test_expected_crypttab_entries": "0", "_storage_test_expected_crypttab_key_file": "-" }, "changed": false } TASK [Check for /etc/crypttab entry] ******************************************* task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:93 Friday 17 January 2025 04:19:40 -0500 (0:00:00.068) 0:09:33.319 ******** ok: [managed-node1] => { "changed": false } MSG: All assertions passed TASK [Validate the format of the crypttab entry] ******************************* task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:100 Friday 17 January 2025 04:19:40 -0500 (0:00:00.071) 0:09:33.390 ******** skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check backing device of crypttab entry] ********************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:108 Friday 17 January 2025 04:19:40 -0500 (0:00:00.060) 0:09:33.451 ******** skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check key file of crypttab entry] **************************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:116 Friday 17 January 2025 04:19:40 -0500 (0:00:00.063) 0:09:33.514 ******** skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Clear test variables] **************************************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:124 Friday 17 January 2025 04:19:40 -0500 (0:00:00.059) 0:09:33.574 ******** ok: [managed-node1] => { "ansible_facts": { "_storage_test_crypttab_entries": null, "_storage_test_expected_crypttab_entries": null, "_storage_test_expected_crypttab_key_file": null }, "changed": false } TASK [Get information about RAID] ********************************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:8 Friday 17 January 2025 04:19:40 -0500 (0:00:00.063) 0:09:33.637 ******** skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set active devices regex] ************************************************ task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:14 Friday 17 January 2025 04:19:40 -0500 (0:00:00.057) 0:09:33.695 ******** skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" }
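The volume carries no LUKS layer at this stage (its raw device and mounted device coincide above), so the crypttab checks expect zero entries. A sketch of the comparison, assuming _storage_test_crypttab_entries was built earlier by filtering /etc/crypttab lines for the device:

- name: Check for /etc/crypttab entry
  assert:
    that:
      - _storage_test_crypttab_entries | length == _storage_test_expected_crypttab_entries | int
    msg: "unexpected /etc/crypttab entry count for test1"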
TASK [Set spare devices regex] ************************************************* task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:19 Friday 17 January 2025 04:19:41 -0500 (0:00:00.060) 0:09:33.756 ******** skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set md version regex] **************************************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:24 Friday 17 January 2025 04:19:41 -0500 (0:00:00.057) 0:09:33.814 ******** skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set chunk size regex] **************************************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:29 Friday 17 January 2025 04:19:41 -0500 (0:00:00.058) 0:09:33.872 ******** skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Parse the chunk size] **************************************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:37 Friday 17 January 2025 04:19:41 -0500 (0:00:00.056) 0:09:33.928 ******** skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check RAID active devices count] ***************************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:46 Friday 17 January 2025 04:19:41 -0500 (0:00:00.058) 0:09:33.986 ******** skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check RAID spare devices count] ****************************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:54 Friday 17 January 2025 04:19:41 -0500 (0:00:00.064) 0:09:34.051 ******** skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check RAID metadata version] ********************************************* task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:62 Friday 17 January 2025 04:19:41 -0500 (0:00:00.061) 0:09:34.113 ******** skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check RAID chunk size] *************************************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:70 Friday 17 January 2025 04:19:41 -0500 (0:00:00.061) 0:09:34.175 ******** skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Parse the actual size of the volume] ************************************* task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:3 Friday 17 January 2025 04:19:41 -0500 (0:00:00.058) 0:09:34.233 ******** ok: [managed-node1] => { "bytes": 4294967296, "changed": false, "lvm": "4g", "parted": "4GiB", "size": "4 GiB" } TASK [Parse the requested size of the volume] ********************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:11 Friday 17
January 2025 04:19:41 -0500 (0:00:00.383) 0:09:34.617 ******** ok: [managed-node1] => { "bytes": 4294967296, "changed": false, "lvm": "4g", "parted": "4GiB", "size": "4 GiB" } TASK [Establish base value for expected size] ********************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:20 Friday 17 January 2025 04:19:42 -0500 (0:00:00.379) 0:09:34.996 ******** ok: [managed-node1] => { "ansible_facts": { "storage_test_expected_size": "4294967296" }, "changed": false } TASK [Show expected size] ****************************************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:28 Friday 17 January 2025 04:19:42 -0500 (0:00:00.097) 0:09:35.093 ******** ok: [managed-node1] => { "storage_test_expected_size": "4294967296" } TASK [Get the size of parent/pool device] ************************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:32 Friday 17 January 2025 04:19:42 -0500 (0:00:00.062) 0:09:35.155 ******** ok: [managed-node1] => { "bytes": 10737418240, "changed": false, "lvm": "10g", "parted": "10GiB", "size": "10 GiB" } TASK [Show test pool] ********************************************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:46 Friday 17 January 2025 04:19:42 -0500 (0:00:00.394) 0:09:35.550 ******** skipping: [managed-node1] => {} TASK [Show test blockinfo] ***************************************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:50 Friday 17 January 2025 04:19:42 -0500 (0:00:00.066) 0:09:35.617 ******** skipping: [managed-node1] => {} TASK [Show test pool size] ***************************************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:54 Friday 17 January 2025 04:19:42 -0500 (0:00:00.062) 0:09:35.680 ******** skipping: [managed-node1] => {} TASK [Calculate the expected size based on pool size and percentage value] ***** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:58 Friday 17 January 2025 04:19:43 -0500 (0:00:00.051) 0:09:35.732 ******** skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Default thin pool reserved space values] ********************************* task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:67 Friday 17 January 2025 04:19:43 -0500 (0:00:00.066) 0:09:35.798 ******** skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Default minimal thin pool reserved space size] *************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:71 Friday 17 January 2025 04:19:43 -0500 (0:00:00.054) 0:09:35.852 ******** skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Default maximal thin pool reserved space size] *************************** task path: 
/tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:76 Friday 17 January 2025 04:19:43 -0500 (0:00:00.058) 0:09:35.910 ******** skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Calculate maximum usable space in thin pool] ***************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:82 Friday 17 January 2025 04:19:43 -0500 (0:00:00.056) 0:09:35.967 ******** skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Apply upper size limit to max usable thin pool space] ******************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:86 Friday 17 January 2025 04:19:43 -0500 (0:00:00.050) 0:09:36.018 ******** skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Apply lower size limit to max usable thin pool space] ******************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:91 Friday 17 January 2025 04:19:43 -0500 (0:00:00.042) 0:09:36.061 ******** skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Convert maximum usable thin pool space from int to Size] ***************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:96 Friday 17 January 2025 04:19:43 -0500 (0:00:00.054) 0:09:36.115 ******** skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Show max thin pool size] ************************************************* task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:101 Friday 17 January 2025 04:19:43 -0500 (0:00:00.038) 0:09:36.154 ******** skipping: [managed-node1] => {} TASK [Show volume thin pool size] ********************************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:105 Friday 17 January 2025 04:19:43 -0500 (0:00:00.038) 0:09:36.193 ******** skipping: [managed-node1] => {} TASK [Show test volume size] *************************************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:109 Friday 17 January 2025 04:19:43 -0500 (0:00:00.047) 0:09:36.241 ******** skipping: [managed-node1] => {} TASK [Establish base value for expected thin pool size] ************************ task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:113 Friday 17 January 2025 04:19:43 -0500 (0:00:00.069) 0:09:36.311 ******** skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Calculate the expected size based on pool size and percentage value] ***** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:120 Friday 17 January 2025 04:19:43 -0500 (0:00:00.100) 0:09:36.411 ******** skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Establish base value for expected thin pool volume size] ***************** task path: 
/tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:127 Friday 17 January 2025 04:19:43 -0500 (0:00:00.199) 0:09:36.610 ******** skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Calculate the expected thin pool volume size based on percentage value] *** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:131 Friday 17 January 2025 04:19:44 -0500 (0:00:00.098) 0:09:36.709 ******** skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Replace expected volume size with calculated value] ********************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:137 Friday 17 January 2025 04:19:44 -0500 (0:00:00.067) 0:09:36.776 ******** skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Show actual size] ******************************************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:143 Friday 17 January 2025 04:19:44 -0500 (0:00:00.057) 0:09:36.834 ******** ok: [managed-node1] => { "storage_test_actual_size": { "bytes": 4294967296, "changed": false, "failed": false, "lvm": "4g", "parted": "4GiB", "size": "4 GiB" } } TASK [Show expected size] ****************************************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:147 Friday 17 January 2025 04:19:44 -0500 (0:00:00.070) 0:09:36.905 ******** ok: [managed-node1] => { "storage_test_expected_size": "4294967296" } TASK [Assert expected size is actual size] ************************************* task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:151 Friday 17 January 2025 04:19:44 -0500 (0:00:00.074) 0:09:36.979 ******** ok: [managed-node1] => { "changed": false } MSG: All assertions passed TASK [Get information about the LV] ******************************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:5 Friday 17 January 2025 04:19:44 -0500 (0:00:00.108) 0:09:37.088 ******** ok: [managed-node1] => { "changed": false, "cmd": [ "lvs", "--noheadings", "--nameprefixes", "--units=b", "--nosuffix", "--unquoted", "-o", "name,attr,cache_total_blocks,chunk_size,segtype", "foo/test1" ], "delta": "0:00:00.017256", "end": "2025-01-17 04:19:44.734982", "rc": 0, "start": "2025-01-17 04:19:44.717726" } STDOUT: LVM2_LV_NAME=test1 LVM2_LV_ATTR=-wi-ao---- LVM2_CACHE_TOTAL_BLOCKS= LVM2_CHUNK_SIZE=0 LVM2_SEGTYPE=linear TASK [Set LV segment type] ***************************************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:13 Friday 17 January 2025 04:19:44 -0500 (0:00:00.422) 0:09:37.511 ******** ok: [managed-node1] => { "ansible_facts": { "storage_test_lv_segtype": [ "linear" ] }, "changed": false } TASK [Check segment type] ****************************************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:17 Friday 17 January 2025 04:19:44 -0500 (0:00:00.048) 0:09:37.559 ******** ok: [managed-node1] => { "changed": false } MSG: All assertions passed
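Two details above are worth unpacking. The size assertion compares plain integers: the requested "4g" resolves to 4 x 1024^3 = 4294967296 bytes, matching the actual LV size. And the lvs invocation uses --nameprefixes/--noheadings/--unquoted to emit shell-friendly KEY=value pairs, from which the segment-type fact is lifted. A sketch of that extraction, with hypothetical register naming:

- name: Get information about the LV
  command: >-
    lvs --noheadings --nameprefixes --units=b --nosuffix --unquoted
    -o name,attr,cache_total_blocks,chunk_size,segtype foo/test1
  register: lvs_out        # hypothetical register name
  changed_when: false      # read-only query

- name: Set LV segment type
  set_fact:
    # "LVM2_SEGTYPE=linear" -> ["linear"]
    storage_test_lv_segtype: "{{ lvs_out.stdout | regex_findall('LVM2_SEGTYPE=(\\S+)') }}"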
TASK [Set LV cache size] ******************************************************* task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:24 Friday 17 January 2025 04:19:44 -0500 (0:00:00.065) 0:09:37.624 ******** skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Parse the requested cache size] ****************************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:31 Friday 17 January 2025 04:19:45 -0500 (0:00:00.092) 0:09:37.717 ******** skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set expected cache size] ************************************************* task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:37 Friday 17 January 2025 04:19:45 -0500 (0:00:00.072) 0:09:37.789 ******** skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check cache size] ******************************************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:42 Friday 17 January 2025 04:19:45 -0500 (0:00:00.076) 0:09:37.866 ******** skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Clean up facts] ********************************************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume.yml:25 Friday 17 January 2025 04:19:45 -0500 (0:00:00.080) 0:09:37.946 ******** ok: [managed-node1] => { "ansible_facts": { "_storage_test_volume_present": null }, "changed": false } TASK [Verify the volumes with no pool were correctly managed] ****************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:44 Friday 17 January 2025 04:19:45 -0500 (0:00:00.068) 0:09:38.015 ******** TASK [Clean up variable namespace] ********************************************* task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:54 Friday 17 January 2025 04:19:45 -0500 (0:00:00.055) 0:09:38.070 ******** ok: [managed-node1] => { "ansible_facts": { "storage_test_blkinfo": null, "storage_test_crypttab": null, "storage_test_fstab": null }, "changed": false } TASK [Create a file] *********************************************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/create-test-file.yml:12 Friday 17 January 2025 04:19:45 -0500 (0:00:00.063) 0:09:38.134 ******** changed: [managed-node1] => { "changed": true, "dest": "/opt/test1/quux", "gid": 0, "group": "root", "mode": "0644", "owner": "root", "secontext": "unconfined_u:object_r:unlabeled_t:s0", "size": 0, "state": "file", "uid": 0 } TASK [Test for correct handling of safe_mode] ********************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/tests_luks2.yml:483 Friday 17 January 2025 04:19:45 -0500 (0:00:00.416) 0:09:38.550 ******** included: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-failed.yml for managed-node1
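The zero-length file /opt/test1/quux created above is the canary for the safe_mode test that follows: if the role were to reformat or re-create the volume, the file would be lost. A minimal sketch of what create-test-file.yml does; the exact module usage is an assumption, the path and mode match the log:

- name: Create a file
  file:
    path: /opt/test1/quux
    state: touch
    mode: "0644"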
TASK [Store global variable value copy] **************************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-failed.yml:4 Friday 17 January 2025 04:19:45 -0500 (0:00:00.130) 0:09:38.681 ******** ok: [managed-node1] => { "ansible_facts": { "storage_pools_global": [], "storage_safe_mode_global": true, "storage_volumes_global": [] }, "changed": false } TASK [Verify role raises correct error] **************************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-failed.yml:10 Friday 17 January 2025 04:19:46 -0500 (0:00:00.068) 0:09:38.749 ******** TASK [fedora.linux_system_roles.storage : Set platform/version specific variables] *** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:2 Friday 17 January 2025 04:19:46 -0500 (0:00:00.115) 0:09:38.864 ******** included: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml for managed-node1 TASK [fedora.linux_system_roles.storage : Ensure ansible_facts used by role] *** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:2 Friday 17 January 2025 04:19:46 -0500 (0:00:00.115) 0:09:38.980 ******** skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Set platform/version specific variables] *** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:7 Friday 17 January 2025 04:19:46 -0500 (0:00:00.107) 0:09:39.087 ******** skipping: [managed-node1] => (item=RedHat.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat.yml", "skip_reason": "Conditional result was False" } skipping: [managed-node1] => (item=CentOS.yml) => { "ansible_loop_var": "item", "changed": false, "item": "CentOS.yml", "skip_reason": "Conditional result was False" } ok: [managed-node1] => (item=CentOS_7.yml) => { "ansible_facts": { "__storage_blivet_diskvolume_mkfs_option_map": { "ext2": "-F", "ext3": "-F", "ext4": "-F" }, "blivet_package_list": [ "python-enum34", "python-blivet3", "libblockdev-crypto", "libblockdev-dm", "libblockdev-lvm", "libblockdev-mdraid", "libblockdev-swap", "{{ 'libblockdev-s390' if ansible_architecture == 's390x' else 'libblockdev' }}" ] }, "ansible_included_var_files": [ "/tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/vars/CentOS_7.yml" ], "ansible_loop_var": "item", "changed": false, "item": "CentOS_7.yml" } skipping: [managed-node1] => (item=CentOS_7.9.yml) => { "ansible_loop_var": "item", "changed": false, "item": "CentOS_7.9.yml", "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Check if system is ostree] *********** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:25 Friday 17 January 2025 04:19:46 -0500 (0:00:00.158) 0:09:39.245 ******** skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Set flag to indicate system is ostree] *** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:30 Friday 17 January 2025 04:19:46 -0500 (0:00:00.067) 0:09:39.313 ******** skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" }
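With storage_safe_mode_global saved as true, the negative test wrapped by verify-role-failed.yml is a re-invocation of the role that asks for LUKS encryption on a pool that now holds data; in safe mode that must fail rather than destroy /opt/test1/quux. A sketch of such an invocation, mirroring the storage_pools values printed below (include_role usage here is an assumption):

- name: Re-run the role with safe mode on
  include_role:
    name: fedora.linux_system_roles.storage
  vars:
    storage_safe_mode: true  # refuse any action that would destroy existing data
    storage_pools:
      - name: foo
        type: lvm
        disks:
          - sda
        volumes:
          - name: test1
            size: 4g
            mount_point: /opt/test1
            encryption: true
            encryption_luks_version: luks2
            encryption_password: yabbadabbadoo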
TASK [fedora.linux_system_roles.storage : Define an empty list of pools to be used in testing] *** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:5 Friday 17 January 2025 04:19:46 -0500 (0:00:00.089) 0:09:39.403 ******** ok: [managed-node1] => { "ansible_facts": { "_storage_pools_list": [] }, "changed": false } TASK [fedora.linux_system_roles.storage : Define an empty list of volumes to be used in testing] *** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:9 Friday 17 January 2025 04:19:46 -0500 (0:00:00.071) 0:09:39.475 ******** ok: [managed-node1] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false } TASK [fedora.linux_system_roles.storage : Include the appropriate provider tasks] *** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:13 Friday 17 January 2025 04:19:46 -0500 (0:00:00.062) 0:09:39.537 ******** included: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml for managed-node1 TASK [fedora.linux_system_roles.storage : Make sure blivet is available] ******* task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:2 Friday 17 January 2025 04:19:46 -0500 (0:00:00.140) 0:09:39.678 ******** ok: [managed-node1] => { "changed": false, "rc": 0, "results": [ "python-enum34-1.0.4-1.el7.noarch providing python-enum34 is already installed", "1:python2-blivet3-3.1.3-3.el7.noarch providing python-blivet3 is already installed", "libblockdev-crypto-2.18-5.el7.x86_64 providing libblockdev-crypto is already installed", "libblockdev-dm-2.18-5.el7.x86_64 providing libblockdev-dm is already installed", "libblockdev-lvm-2.18-5.el7.x86_64 providing libblockdev-lvm is already installed", "libblockdev-mdraid-2.18-5.el7.x86_64 providing libblockdev-mdraid is already installed", "libblockdev-swap-2.18-5.el7.x86_64 providing libblockdev-swap is already installed", "libblockdev-2.18-5.el7.x86_64 providing libblockdev is already installed" ] } TASK [fedora.linux_system_roles.storage : Show storage_pools] ****************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:9 Friday 17 January 2025 04:19:50 -0500 (0:00:03.982) 0:09:43.661 ******** ok: [managed-node1] => { "storage_pools": [ { "disks": [ "sda" ], "name": "foo", "type": "lvm", "volumes": [ { "encryption": true, "encryption_luks_version": "luks2", "encryption_password": "yabbadabbadoo", "mount_point": "/opt/test1", "name": "test1", "size": "4g" } ] } ] } TASK [fedora.linux_system_roles.storage : Show storage_volumes] **************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:14 Friday 17 January 2025 04:19:51 -0500 (0:00:00.074) 0:09:43.735 ******** ok: [managed-node1] => { "storage_volumes": [] } TASK [fedora.linux_system_roles.storage : Get required packages] *************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:19 Friday 17 January 2025 04:19:51 -0500 (0:00:00.072) 0:09:43.807 ******** ok: [managed-node1] => { "actions": [], "changed": false, "crypts": [], "leaves": [], "mounts": [], "packages": [ "cryptsetup", "lvm2" ], "pools": [], "volumes": [] } TASK [fedora.linux_system_roles.storage : Enable copr repositories
if needed] *** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:31 Friday 17 January 2025 04:19:55 -0500 (0:00:04.281) 0:09:48.089 ******** included: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml for managed-node1 TASK [fedora.linux_system_roles.storage : Check if the COPR support packages should be installed] *** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:2 Friday 17 January 2025 04:19:55 -0500 (0:00:00.092) 0:09:48.182 ******** TASK [fedora.linux_system_roles.storage : Make sure COPR support packages are present] *** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:13 Friday 17 January 2025 04:19:55 -0500 (0:00:00.043) 0:09:48.225 ******** skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Enable COPRs] ************************ task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:19 Friday 17 January 2025 04:19:55 -0500 (0:00:00.036) 0:09:48.261 ******** TASK [fedora.linux_system_roles.storage : Make sure required packages are installed] *** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:37 Friday 17 January 2025 04:19:55 -0500 (0:00:00.033) 0:09:48.295 ******** ok: [managed-node1] => { "changed": false, "rc": 0, "results": [ "cryptsetup-2.0.3-6.el7.x86_64 providing cryptsetup is already installed", "7:lvm2-2.02.187-6.el7_9.5.x86_64 providing lvm2 is already installed", "kpartx-0.4.9-136.el7_9.x86_64 providing kpartx is already installed" ] } TASK [fedora.linux_system_roles.storage : Get service facts] ******************* task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:51 Friday 17 January 2025 04:19:56 -0500 (0:00:00.678) 0:09:48.974 ******** ok: [managed-node1] => { "ansible_facts": { "services": { "NetworkManager-dispatcher.service": { "name": "NetworkManager-dispatcher.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "NetworkManager-wait-online.service": { "name": "NetworkManager-wait-online.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "NetworkManager.service": { "name": "NetworkManager.service", "source": "systemd", "state": "running", "status": "enabled" }, "arp-ethers.service": { "name": "arp-ethers.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "auditd.service": { "name": "auditd.service", "source": "systemd", "state": "running", "status": "enabled" }, "auth-rpcgss-module.service": { "name": "auth-rpcgss-module.service", "source": "systemd", "state": "stopped", "status": "static" }, "autovt@.service": { "name": "autovt@.service", "source": "systemd", "state": "unknown", "status": "enabled" }, "blivet.service": { "name": "blivet.service", "source": "systemd", "state": "inactive", "status": "static" }, "blk-availability.service": { "name": "blk-availability.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "brandbot.service": { "name": "brandbot.service", "source": "systemd", "state": "inactive", "status": "static" }, "chrony-dnssrv@.service": { "name": "chrony-dnssrv@.service", "source": "systemd", "state": "unknown", "status": "static" }, 
"chrony-wait.service": { "name": "chrony-wait.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "chronyd.service": { "name": "chronyd.service", "source": "systemd", "state": "running", "status": "enabled" }, "cloud-config.service": { "name": "cloud-config.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-final.service": { "name": "cloud-final.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-init-local.service": { "name": "cloud-init-local.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-init.service": { "name": "cloud-init.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "console-getty.service": { "name": "console-getty.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "console-shell.service": { "name": "console-shell.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "container-getty@.service": { "name": "container-getty@.service", "source": "systemd", "state": "unknown", "status": "static" }, "cpupower.service": { "name": "cpupower.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "crond.service": { "name": "crond.service", "source": "systemd", "state": "running", "status": "enabled" }, "dbus-org.freedesktop.hostname1.service": { "name": "dbus-org.freedesktop.hostname1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus-org.freedesktop.import1.service": { "name": "dbus-org.freedesktop.import1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus-org.freedesktop.locale1.service": { "name": "dbus-org.freedesktop.locale1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus-org.freedesktop.login1.service": { "name": "dbus-org.freedesktop.login1.service", "source": "systemd", "state": "active", "status": "static" }, "dbus-org.freedesktop.machine1.service": { "name": "dbus-org.freedesktop.machine1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus-org.freedesktop.nm-dispatcher.service": { "name": "dbus-org.freedesktop.nm-dispatcher.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "dbus-org.freedesktop.timedate1.service": { "name": "dbus-org.freedesktop.timedate1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus.service": { "name": "dbus.service", "source": "systemd", "state": "running", "status": "static" }, "debug-shell.service": { "name": "debug-shell.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "dm-event.service": { "name": "dm-event.service", "source": "systemd", "state": "stopped", "status": "static" }, "dmraid-activation.service": { "name": "dmraid-activation.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "dracut-cmdline.service": { "name": "dracut-cmdline.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-initqueue.service": { "name": "dracut-initqueue.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-mount.service": { "name": "dracut-mount.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-mount.service": { "name": "dracut-pre-mount.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-pivot.service": { "name": "dracut-pre-pivot.service", "source": "systemd", "state": "stopped", "status": "static" }, 
"dracut-pre-trigger.service": { "name": "dracut-pre-trigger.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-udev.service": { "name": "dracut-pre-udev.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-shutdown.service": { "name": "dracut-shutdown.service", "source": "systemd", "state": "stopped", "status": "static" }, "ebtables.service": { "name": "ebtables.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "emergency.service": { "name": "emergency.service", "source": "systemd", "state": "stopped", "status": "static" }, "firewalld.service": { "name": "firewalld.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "fstrim.service": { "name": "fstrim.service", "source": "systemd", "state": "inactive", "status": "static" }, "getty@.service": { "name": "getty@.service", "source": "systemd", "state": "unknown", "status": "enabled" }, "getty@tty1.service": { "name": "getty@tty1.service", "source": "systemd", "state": "running", "status": "unknown" }, "gssproxy.service": { "name": "gssproxy.service", "source": "systemd", "state": "running", "status": "disabled" }, "halt-local.service": { "name": "halt-local.service", "source": "systemd", "state": "inactive", "status": "static" }, "initrd-cleanup.service": { "name": "initrd-cleanup.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-parse-etc.service": { "name": "initrd-parse-etc.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-switch-root.service": { "name": "initrd-switch-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-udevadm-cleanup-db.service": { "name": "initrd-udevadm-cleanup-db.service", "source": "systemd", "state": "stopped", "status": "static" }, "iprdump.service": { "name": "iprdump.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "iprinit.service": { "name": "iprinit.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "iprupdate.service": { "name": "iprupdate.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "irqbalance.service": { "name": "irqbalance.service", "source": "systemd", "state": "running", "status": "enabled" }, "kdump.service": { "name": "kdump.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "kmod-static-nodes.service": { "name": "kmod-static-nodes.service", "source": "systemd", "state": "stopped", "status": "static" }, "lvm2-lvmetad.service": { "name": "lvm2-lvmetad.service", "source": "systemd", "state": "running", "status": "static" }, "lvm2-lvmpolld.service": { "name": "lvm2-lvmpolld.service", "source": "systemd", "state": "stopped", "status": "static" }, "lvm2-monitor.service": { "name": "lvm2-monitor.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "lvm2-pvscan@.service": { "name": "lvm2-pvscan@.service", "source": "systemd", "state": "unknown", "status": "static" }, "lvm2-pvscan@8:0.service": { "name": "lvm2-pvscan@8:0.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "mdadm-grow-continue@.service": { "name": "mdadm-grow-continue@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdadm-last-resort@.service": { "name": "mdadm-last-resort@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdcheck_continue.service": { "name": "mdcheck_continue.service", "source": "systemd", "state": "inactive", "status": 
"static" }, "mdcheck_start.service": { "name": "mdcheck_start.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdmon@.service": { "name": "mdmon@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdmonitor-oneshot.service": { "name": "mdmonitor-oneshot.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdmonitor.service": { "name": "mdmonitor.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "messagebus.service": { "name": "messagebus.service", "source": "systemd", "state": "active", "status": "static" }, "microcode.service": { "name": "microcode.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "netconsole": { "name": "netconsole", "source": "sysv", "state": "stopped", "status": "disabled" }, "network": { "name": "network", "source": "sysv", "state": "running", "status": "enabled" }, "network.service": { "name": "network.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "nfs-blkmap.service": { "name": "nfs-blkmap.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfs-config.service": { "name": "nfs-config.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-idmap.service": { "name": "nfs-idmap.service", "source": "systemd", "state": "inactive", "status": "static" }, "nfs-idmapd.service": { "name": "nfs-idmapd.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-lock.service": { "name": "nfs-lock.service", "source": "systemd", "state": "inactive", "status": "static" }, "nfs-mountd.service": { "name": "nfs-mountd.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-rquotad.service": { "name": "nfs-rquotad.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfs-secure.service": { "name": "nfs-secure.service", "source": "systemd", "state": "inactive", "status": "static" }, "nfs-server.service": { "name": "nfs-server.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "nfs-utils.service": { "name": "nfs-utils.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs.service": { "name": "nfs.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfslock.service": { "name": "nfslock.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-halt.service": { "name": "plymouth-halt.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "plymouth-kexec.service": { "name": "plymouth-kexec.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "plymouth-poweroff.service": { "name": "plymouth-poweroff.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "plymouth-quit-wait.service": { "name": "plymouth-quit-wait.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "plymouth-quit.service": { "name": "plymouth-quit.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "plymouth-read-write.service": { "name": "plymouth-read-write.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "plymouth-reboot.service": { "name": "plymouth-reboot.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "plymouth-start.service": { "name": "plymouth-start.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "plymouth-switch-root.service": { "name": "plymouth-switch-root.service", "source": 
"systemd", "state": "stopped", "status": "static" }, "polkit.service": { "name": "polkit.service", "source": "systemd", "state": "running", "status": "static" }, "postfix.service": { "name": "postfix.service", "source": "systemd", "state": "running", "status": "enabled" }, "qemu-guest-agent.service": { "name": "qemu-guest-agent.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "quotaon.service": { "name": "quotaon.service", "source": "systemd", "state": "inactive", "status": "static" }, "rc-local.service": { "name": "rc-local.service", "source": "systemd", "state": "stopped", "status": "static" }, "rdisc.service": { "name": "rdisc.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rescue.service": { "name": "rescue.service", "source": "systemd", "state": "stopped", "status": "static" }, "restraintd.service": { "name": "restraintd.service", "source": "systemd", "state": "running", "status": "enabled" }, "rhel-autorelabel-mark.service": { "name": "rhel-autorelabel-mark.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "rhel-autorelabel.service": { "name": "rhel-autorelabel.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "rhel-configure.service": { "name": "rhel-configure.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "rhel-dmesg.service": { "name": "rhel-dmesg.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "rhel-domainname.service": { "name": "rhel-domainname.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "rhel-import-state.service": { "name": "rhel-import-state.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "rhel-loadmodules.service": { "name": "rhel-loadmodules.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "rhel-readonly.service": { "name": "rhel-readonly.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "rngd.service": { "name": "rngd.service", "source": "systemd", "state": "running", "status": "enabled" }, "rpc-gssd.service": { "name": "rpc-gssd.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-rquotad.service": { "name": "rpc-rquotad.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rpc-statd-notify.service": { "name": "rpc-statd-notify.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-statd.service": { "name": "rpc-statd.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpcbind.service": { "name": "rpcbind.service", "source": "systemd", "state": "running", "status": "enabled" }, "rpcgssd.service": { "name": "rpcgssd.service", "source": "systemd", "state": "inactive", "status": "static" }, "rpcidmapd.service": { "name": "rpcidmapd.service", "source": "systemd", "state": "inactive", "status": "static" }, "rsyncd.service": { "name": "rsyncd.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rsyncd@.service": { "name": "rsyncd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "rsyslog.service": { "name": "rsyslog.service", "source": "systemd", "state": "running", "status": "enabled" }, "selinux-policy-migrate-local-changes@.service": { "name": "selinux-policy-migrate-local-changes@.service", "source": "systemd", "state": "unknown", "status": "static" }, "selinux-policy-migrate-local-changes@targeted.service": { "name": "selinux-policy-migrate-local-changes@targeted.service", 
"source": "systemd", "state": "stopped", "status": "unknown" }, "serial-getty@.service": { "name": "serial-getty@.service", "source": "systemd", "state": "unknown", "status": "disabled" }, "serial-getty@ttyS0.service": { "name": "serial-getty@ttyS0.service", "source": "systemd", "state": "running", "status": "unknown" }, "sshd-keygen.service": { "name": "sshd-keygen.service", "source": "systemd", "state": "stopped", "status": "static" }, "sshd.service": { "name": "sshd.service", "source": "systemd", "state": "running", "status": "enabled" }, "sshd@.service": { "name": "sshd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-ask-password-console.service": { "name": "systemd-ask-password-console.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-ask-password-plymouth.service": { "name": "systemd-ask-password-plymouth.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-ask-password-wall.service": { "name": "systemd-ask-password-wall.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-backlight@.service": { "name": "systemd-backlight@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-binfmt.service": { "name": "systemd-binfmt.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-bootchart.service": { "name": "systemd-bootchart.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "systemd-cryptsetup@luks\\x2deb5a5901\\x2d32b6\\x2d427f\\x2d885e\\x2d6ee3011725ec.service": { "name": "systemd-cryptsetup@luks\\x2deb5a5901\\x2d32b6\\x2d427f\\x2d885e\\x2d6ee3011725ec.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "systemd-firstboot.service": { "name": "systemd-firstboot.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-fsck-root.service": { "name": "systemd-fsck-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-fsck@.service": { "name": "systemd-fsck@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-halt.service": { "name": "systemd-halt.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hibernate-resume@.service": { "name": "systemd-hibernate-resume@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-hibernate.service": { "name": "systemd-hibernate.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hostnamed.service": { "name": "systemd-hostnamed.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hwdb-update.service": { "name": "systemd-hwdb-update.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-hybrid-sleep.service": { "name": "systemd-hybrid-sleep.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-importd.service": { "name": "systemd-importd.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-initctl.service": { "name": "systemd-initctl.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journal-catalog-update.service": { "name": "systemd-journal-catalog-update.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journal-flush.service": { "name": "systemd-journal-flush.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journald.service": { "name": 
"systemd-journald.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-kexec.service": { "name": "systemd-kexec.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-localed.service": { "name": "systemd-localed.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-logind.service": { "name": "systemd-logind.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-machine-id-commit.service": { "name": "systemd-machine-id-commit.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-machined.service": { "name": "systemd-machined.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-modules-load.service": { "name": "systemd-modules-load.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-nspawn@.service": { "name": "systemd-nspawn@.service", "source": "systemd", "state": "unknown", "status": "disabled" }, "systemd-poweroff.service": { "name": "systemd-poweroff.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-quotacheck.service": { "name": "systemd-quotacheck.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-random-seed.service": { "name": "systemd-random-seed.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-readahead-collect.service": { "name": "systemd-readahead-collect.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "systemd-readahead-done.service": { "name": "systemd-readahead-done.service", "source": "systemd", "state": "stopped", "status": "indirect" }, "systemd-readahead-drop.service": { "name": "systemd-readahead-drop.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "systemd-readahead-replay.service": { "name": "systemd-readahead-replay.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "systemd-reboot.service": { "name": "systemd-reboot.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-remount-fs.service": { "name": "systemd-remount-fs.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-rfkill@.service": { "name": "systemd-rfkill@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-shutdownd.service": { "name": "systemd-shutdownd.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-suspend.service": { "name": "systemd-suspend.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-sysctl.service": { "name": "systemd-sysctl.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-timedated.service": { "name": "systemd-timedated.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-tmpfiles-clean.service": { "name": "systemd-tmpfiles-clean.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles-setup-dev.service": { "name": "systemd-tmpfiles-setup-dev.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles-setup.service": { "name": "systemd-tmpfiles-setup.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udev-settle.service": { "name": "systemd-udev-settle.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udev-trigger.service": { "name": "systemd-udev-trigger.service", "source": "systemd", 
"state": "stopped", "status": "static" }, "systemd-udevd.service": { "name": "systemd-udevd.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-update-done.service": { "name": "systemd-update-done.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-update-utmp-runlevel.service": { "name": "systemd-update-utmp-runlevel.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-update-utmp.service": { "name": "systemd-update-utmp.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-user-sessions.service": { "name": "systemd-user-sessions.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-vconsole-setup.service": { "name": "systemd-vconsole-setup.service", "source": "systemd", "state": "stopped", "status": "static" }, "target.service": { "name": "target.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "targetclid.service": { "name": "targetclid.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "teamd@.service": { "name": "teamd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "tuned.service": { "name": "tuned.service", "source": "systemd", "state": "running", "status": "enabled" }, "wpa_supplicant.service": { "name": "wpa_supplicant.service", "source": "systemd", "state": "inactive", "status": "disabled" } } }, "changed": false } TASK [fedora.linux_system_roles.storage : Set storage_cryptsetup_services] ***** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:57 Friday 17 January 2025 04:19:57 -0500 (0:00:01.178) 0:09:50.152 ******** ok: [managed-node1] => { "ansible_facts": { "storage_cryptsetup_services": [ "systemd-cryptsetup@luks\\x2deb5a5901\\x2d32b6\\x2d427f\\x2d885e\\x2d6ee3011725ec.service" ] }, "changed": false } TASK [fedora.linux_system_roles.storage : Mask the systemd cryptsetup services] *** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:63 Friday 17 January 2025 04:19:57 -0500 (0:00:00.066) 0:09:50.219 ******** changed: [managed-node1] => (item=systemd-cryptsetup@luks\x2deb5a5901\x2d32b6\x2d427f\x2d885e\x2d6ee3011725ec.service) => { "ansible_loop_var": "item", "changed": true, "item": "systemd-cryptsetup@luks\\x2deb5a5901\\x2d32b6\\x2d427f\\x2d885e\\x2d6ee3011725ec.service", "name": "systemd-cryptsetup@luks\\x2deb5a5901\\x2d32b6\\x2d427f\\x2d885e\\x2d6ee3011725ec.service", "status": { "ActiveEnterTimestampMonotonic": "0", "ActiveExitTimestampMonotonic": "0", "ActiveState": "inactive", "After": "systemd-readahead-replay.service cryptsetup-pre.target systemd-journald.socket system-systemd\\x2dcryptsetup.slice systemd-readahead-collect.service dev-mapper-foo\\x2dtest1.device", "AllowIsolate": "no", "AmbientCapabilities": "0", "AssertResult": "no", "AssertTimestampMonotonic": "0", "Before": "cryptsetup.target umount.target", "BindsTo": "dev-mapper-foo\\x2dtest1.device", "BlockIOAccounting": "no", "BlockIOWeight": "18446744073709551615", "CPUAccounting": "no", "CPUQuotaPerSecUSec": "infinity", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUSchedulingResetOnFork": "no", "CPUShares": "18446744073709551615", "CanIsolate": "no", "CanReload": "no", "CanStart": "yes", "CanStop": "yes", "CapabilityBoundingSet": "18446744073709551615", "CollectMode": "inactive", "ConditionResult": "no", "ConditionTimestampMonotonic": "0", 
"Conflicts": "umount.target", "ControlPID": "0", "DefaultDependencies": "no", "Delegate": "no", "Description": "Cryptography Setup for luks-eb5a5901-32b6-427f-885e-6ee3011725ec", "DevicePolicy": "auto", "Documentation": "man:crypttab(5) man:systemd-cryptsetup-generator(8) man:systemd-cryptsetup@.service(8)", "ExecMainCode": "0", "ExecMainExitTimestampMonotonic": "0", "ExecMainPID": "0", "ExecMainStartTimestampMonotonic": "0", "ExecMainStatus": "0", "ExecStart": "{ path=/usr/lib/systemd/systemd-cryptsetup ; argv[]=/usr/lib/systemd/systemd-cryptsetup attach luks-eb5a5901-32b6-427f-885e-6ee3011725ec /dev/mapper/foo-test1 - ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecStop": "{ path=/usr/lib/systemd/systemd-cryptsetup ; argv[]=/usr/lib/systemd/systemd-cryptsetup detach luks-eb5a5901-32b6-427f-885e-6ee3011725ec ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "FailureAction": "none", "FileDescriptorStoreMax": "0", "FragmentPath": "/run/systemd/generator/systemd-cryptsetup@luks\\x2deb5a5901\\x2d32b6\\x2d427f\\x2d885e\\x2d6ee3011725ec.service", "GuessMainPID": "yes", "IOScheduling": "0", "Id": "systemd-cryptsetup@luks\\x2deb5a5901\\x2d32b6\\x2d427f\\x2d885e\\x2d6ee3011725ec.service", "IgnoreOnIsolate": "yes", "IgnoreOnSnapshot": "no", "IgnoreSIGPIPE": "yes", "InactiveEnterTimestampMonotonic": "0", "InactiveExitTimestampMonotonic": "0", "JobTimeoutAction": "none", "JobTimeoutUSec": "0", "KillMode": "control-group", "KillSignal": "15", "LimitAS": "18446744073709551615", "LimitCORE": "18446744073709551615", "LimitCPU": "18446744073709551615", "LimitDATA": "18446744073709551615", "LimitFSIZE": "18446744073709551615", "LimitLOCKS": "18446744073709551615", "LimitMEMLOCK": "65536", "LimitMSGQUEUE": "819200", "LimitNICE": "0", "LimitNOFILE": "4096", "LimitNPROC": "14311", "LimitRSS": "18446744073709551615", "LimitRTPRIO": "0", "LimitRTTIME": "18446744073709551615", "LimitSIGPENDING": "14311", "LimitSTACK": "18446744073709551615", "LoadState": "loaded", "MainPID": "0", "MemoryAccounting": "no", "MemoryCurrent": "18446744073709551615", "MemoryLimit": "18446744073709551615", "MountFlags": "0", "Names": "systemd-cryptsetup@luks\\x2deb5a5901\\x2d32b6\\x2d427f\\x2d885e\\x2d6ee3011725ec.service", "NeedDaemonReload": "yes", "Nice": "0", "NoNewPrivileges": "no", "NonBlocking": "no", "NotifyAccess": "none", "OOMScoreAdjust": "0", "OnFailureJobMode": "replace", "PermissionsStartOnly": "no", "PrivateDevices": "no", "PrivateNetwork": "no", "PrivateTmp": "no", "ProtectHome": "no", "ProtectSystem": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "RemainAfterExit": "yes", "RequiredBy": "cryptsetup.target", "Requires": "system-systemd\\x2dcryptsetup.slice", "Restart": "no", "RestartUSec": "100ms", "Result": "success", "RootDirectoryStartOnly": "no", "RuntimeDirectoryMode": "0755", "SameProcessGroup": "no", "SecureBits": "0", "SendSIGHUP": "no", "SendSIGKILL": "yes", "Slice": "system-systemd\\x2dcryptsetup.slice", "SourcePath": "/etc/crypttab", "StandardError": "inherit", "StandardInput": "null", "StandardOutput": "journal", "StartLimitAction": "none", "StartLimitBurst": "5", "StartLimitInterval": "10000000", "StartupBlockIOWeight": "18446744073709551615", "StartupCPUShares": "18446744073709551615", "StatusErrno": "0", "StopWhenUnneeded": "no", "SubState": "dead", "SyslogLevelPrefix": "yes", "SyslogPriority": "30", "SystemCallErrorNumber": "0", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", 
"TasksAccounting": "no", "TasksCurrent": "18446744073709551615", "TasksMax": "18446744073709551615", "TimeoutStartUSec": "0", "TimeoutStopUSec": "0", "TimerSlackNSec": "50000", "Transient": "no", "Type": "oneshot", "UMask": "0022", "UnitFilePreset": "disabled", "UnitFileState": "bad", "WantedBy": "dev-mapper-foo\\x2dtest1.device", "WatchdogTimestampMonotonic": "0", "WatchdogUSec": "0" } } TASK [fedora.linux_system_roles.storage : Manage the pools and volumes to match the specified state] *** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:69 Friday 17 January 2025 04:19:58 -0500 (0:00:00.509) 0:09:50.729 ******** fatal: [managed-node1]: FAILED! => { "actions": [], "changed": false, "crypts": [], "leaves": [], "mounts": [], "packages": [], "pools": [], "volumes": [] } MSG: cannot remove existing formatting on device 'test1' in safe mode due to adding encryption TASK [fedora.linux_system_roles.storage : Failed message] ********************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:109 Friday 17 January 2025 04:20:02 -0500 (0:00:04.373) 0:09:55.103 ******** fatal: [managed-node1]: FAILED! => { "changed": false } MSG: {u'_ansible_no_log': False, u'crypts': [], u'pools': [], u'leaves': [], u'changed': False, u'actions': [], u'failed': True, u'volumes': [], u'invocation': {u'module_args': {u'packages_only': False, u'disklabel_type': None, u'diskvolume_mkfs_option_map': {u'ext4': u'-F', u'ext3': u'-F', u'ext2': u'-F'}, u'safe_mode': True, u'pools': [{u'raid_metadata_version': None, u'encryption_key_size': None, u'encryption_key': None, u'encryption_luks_version': None, u'encryption_tang_url': None, u'raid_spare_count': None, u'grow_to_fill': False, u'encryption_tang_thumbprint': None, u'name': u'foo', u'encryption_password': None, u'encryption': False, u'disks': [u'sda'], u'raid_level': None, u'raid_device_count': None, u'state': u'present', u'volumes': [{u'raid_metadata_version': None, u'mount_device_identifier': u'uuid', u'fs_type': u'xfs', u'mount_options': u'defaults', u'size': u'4g', u'mount_point': u'/opt/test1', u'compression': None, u'encryption_password': u'VALUE_SPECIFIED_IN_NO_LOG_PARAMETER', u'encryption': True, u'raid_level': None, u'raid_device_count': None, u'state': u'present', u'vdo_pool_size': None, u'mount_mode': None, u'thin_pool_name': None, u'type': u'lvm', u'encryption_key_size': None, u'deduplication': None, u'encryption_cipher': None, u'encryption_key': None, u'fs_label': u'', u'encryption_luks_version': u'luks2', u'raid_stripe_size': None, u'mount_passno': 0, u'mount_user': None, u'raid_spare_count': None, u'name': u'test1', u'cache_mode': None, u'raid_disks': [], u'mount_group': None, u'fs_overwrite_existing': True, u'disks': [u'sda'], u'cached': False, u'thin_pool_size': None, u'thin': False, u'mount_check': 0, u'cache_size': 0, u'raid_chunk_size': None, u'cache_devices': [], u'fs_create_options': u''}], u'shared': False, u'encryption_clevis_pin': None, u'type': u'lvm', u'encryption_cipher': None, u'raid_chunk_size': None}], u'volumes': [], u'pool_defaults': {u'raid_metadata_version': None, u'encryption_cipher': None, u'encryption_key': None, u'encryption_luks_version': None, u'raid_spare_count': None, u'grow_to_fill': False, u'encryption_password': None, u'encryption': False, u'disks': [], u'raid_level': None, u'raid_device_count': None, u'state': u'present', u'volumes': [], u'shared': False, u'type': u'lvm', 
u'encryption_key_size': None, u'raid_chunk_size': None}, u'volume_defaults': {u'raid_metadata_version': None, u'raid_level': None, u'fs_type': u'xfs', u'mount_options': u'defaults', u'size': 0, u'mount_point': u'', u'compression': None, u'encryption_password': None, u'encryption': False, u'mount_device_identifier': u'uuid', u'raid_device_count': None, u'state': u'present', u'vdo_pool_size': None, u'thin_pool_name': None, u'type': u'lvm', u'encryption_key_size': None, u'encryption_cipher': None, u'encryption_key': None, u'fs_label': u'', u'encryption_luks_version': None, u'raid_stripe_size': None, u'cache_size': 0, u'raid_spare_count': None, u'cache_mode': None, u'deduplication': None, u'cached': False, u'fs_overwrite_existing': True, u'disks': [], u'thin_pool_size': None, u'thin': None, u'mount_check': 0, u'mount_passno': 0, u'raid_chunk_size': None, u'cache_devices': [], u'fs_create_options': u''}, u'use_partitions': None}}, u'mounts': [], u'packages': [], u'msg': u"cannot remove existing formatting on device 'test1' in safe mode due to adding encryption"} TASK [fedora.linux_system_roles.storage : Unmask the systemd cryptsetup services] *** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:113 Friday 17 January 2025 04:20:02 -0500 (0:00:00.181) 0:09:55.284 ******** changed: [managed-node1] => (item=systemd-cryptsetup@luks\x2deb5a5901\x2d32b6\x2d427f\x2d885e\x2d6ee3011725ec.service) => { "ansible_loop_var": "item", "changed": true, "item": "systemd-cryptsetup@luks\\x2deb5a5901\\x2d32b6\\x2d427f\\x2d885e\\x2d6ee3011725ec.service", "name": "systemd-cryptsetup@luks\\x2deb5a5901\\x2d32b6\\x2d427f\\x2d885e\\x2d6ee3011725ec.service", "status": { "ActiveEnterTimestampMonotonic": "0", "ActiveExitTimestampMonotonic": "0", "ActiveState": "inactive", "AllowIsolate": "no", "AmbientCapabilities": "0", "AssertResult": "no", "AssertTimestampMonotonic": "0", "BlockIOAccounting": "no", "BlockIOWeight": "18446744073709551615", "CPUAccounting": "no", "CPUQuotaPerSecUSec": "infinity", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUSchedulingResetOnFork": "no", "CPUShares": "18446744073709551615", "CanIsolate": "no", "CanReload": "no", "CanStart": "yes", "CanStop": "yes", "CapabilityBoundingSet": "18446744073709551615", "CollectMode": "inactive", "ConditionResult": "no", "ConditionTimestampMonotonic": "0", "ControlPID": "0", "DefaultDependencies": "yes", "Delegate": "no", "Description": "systemd-cryptsetup@luks\\x2deb5a5901\\x2d32b6\\x2d427f\\x2d885e\\x2d6ee3011725ec.service", "DevicePolicy": "auto", "ExecMainCode": "0", "ExecMainExitTimestampMonotonic": "0", "ExecMainPID": "0", "ExecMainStartTimestampMonotonic": "0", "ExecMainStatus": "0", "FailureAction": "none", "FileDescriptorStoreMax": "0", "FragmentPath": "/dev/null", "GuessMainPID": "yes", "IOScheduling": "0", "Id": "systemd-cryptsetup@luks\\x2deb5a5901\\x2d32b6\\x2d427f\\x2d885e\\x2d6ee3011725ec.service", "IgnoreOnIsolate": "no", "IgnoreOnSnapshot": "no", "IgnoreSIGPIPE": "yes", "InactiveEnterTimestampMonotonic": "0", "InactiveExitTimestampMonotonic": "0", "JobTimeoutAction": "none", "JobTimeoutUSec": "0", "KillMode": "control-group", "KillSignal": "15", "LimitAS": "18446744073709551615", "LimitCORE": "18446744073709551615", "LimitCPU": "18446744073709551615", "LimitDATA": "18446744073709551615", "LimitFSIZE": "18446744073709551615", "LimitLOCKS": "18446744073709551615", "LimitMEMLOCK": "65536", "LimitMSGQUEUE": "819200", "LimitNICE": "0", "LimitNOFILE": "1048576", 
"LimitNPROC": "14311", "LimitRSS": "18446744073709551615", "LimitRTPRIO": "0", "LimitRTTIME": "18446744073709551615", "LimitSIGPENDING": "14311", "LimitSTACK": "18446744073709551615", "LoadState": "masked", "MainPID": "0", "MemoryAccounting": "no", "MemoryCurrent": "18446744073709551615", "MemoryLimit": "18446744073709551615", "MountFlags": "0", "Names": "systemd-cryptsetup@luks\\x2deb5a5901\\x2d32b6\\x2d427f\\x2d885e\\x2d6ee3011725ec.service", "NeedDaemonReload": "no", "Nice": "0", "NoNewPrivileges": "no", "NonBlocking": "no", "NotifyAccess": "none", "OOMScoreAdjust": "0", "OnFailureJobMode": "replace", "PermissionsStartOnly": "no", "PrivateDevices": "no", "PrivateNetwork": "no", "PrivateTmp": "no", "ProtectHome": "no", "ProtectSystem": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "RemainAfterExit": "no", "Restart": "no", "RestartUSec": "100ms", "Result": "success", "RootDirectoryStartOnly": "no", "RuntimeDirectoryMode": "0755", "SameProcessGroup": "no", "SecureBits": "0", "SendSIGHUP": "no", "SendSIGKILL": "yes", "StandardError": "inherit", "StandardInput": "null", "StandardOutput": "inherit", "StartLimitAction": "none", "StartLimitBurst": "5", "StartLimitInterval": "10000000", "StartupBlockIOWeight": "18446744073709551615", "StartupCPUShares": "18446744073709551615", "StatusErrno": "0", "StopWhenUnneeded": "no", "SubState": "dead", "SyslogLevelPrefix": "yes", "SyslogPriority": "30", "SystemCallErrorNumber": "0", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "TasksAccounting": "no", "TasksCurrent": "18446744073709551615", "TasksMax": "18446744073709551615", "TimeoutStartUSec": "1min 30s", "TimeoutStopUSec": "1min 30s", "TimerSlackNSec": "50000", "Transient": "no", "UMask": "0022", "UnitFileState": "bad", "WatchdogTimestampMonotonic": "0", "WatchdogUSec": "0" } } TASK [Check that we failed in the role] **************************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-failed.yml:23 Friday 17 January 2025 04:20:03 -0500 (0:00:00.792) 0:09:56.077 ******** ok: [managed-node1] => { "changed": false } MSG: All assertions passed TASK [Verify the blivet output and error message are correct] ****************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-failed.yml:28 Friday 17 January 2025 04:20:03 -0500 (0:00:00.094) 0:09:56.171 ******** ok: [managed-node1] => { "changed": false } MSG: All assertions passed TASK [Verify correct exception or error message] ******************************* task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-failed.yml:39 Friday 17 January 2025 04:20:03 -0500 (0:00:00.158) 0:09:56.330 ******** skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Stat the file] *********************************************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-data-preservation.yml:11 Friday 17 January 2025 04:20:03 -0500 (0:00:00.101) 0:09:56.431 ******** ok: [managed-node1] => { "changed": false, "stat": { "atime": 1737105585.764237, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 0, "charset": "binary", "checksum": "da39a3ee5e6b4b0d3255bfef95601890afd80709", "ctime": 1737105585.764237, "dev": 64768, "device_type": 0, "executable": false, "exists": true, "gid": 0, "gr_name": "root", "inode": 67, "isblk": false, "ischr": 
false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": true, "issock": false, "isuid": false, "mimetype": "inode/x-empty", "mode": "0644", "mtime": 1737105585.764237, "nlink": 1, "path": "/opt/test1/quux", "pw_name": "root", "readable": true, "rgrp": true, "roth": true, "rusr": true, "size": 0, "uid": 0, "version": "18446744073575593800", "wgrp": false, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [Assert file presence] **************************************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-data-preservation.yml:16 Friday 17 January 2025 04:20:04 -0500 (0:00:00.516) 0:09:56.948 ******** ok: [managed-node1] => { "changed": false } MSG: All assertions passed TASK [Add encryption to the volume] ******************************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/tests_luks2.yml:507 Friday 17 January 2025 04:20:04 -0500 (0:00:00.099) 0:09:57.048 ******** TASK [fedora.linux_system_roles.storage : Set platform/version specific variables] *** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:2 Friday 17 January 2025 04:20:04 -0500 (0:00:00.203) 0:09:57.252 ******** included: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml for managed-node1 TASK [fedora.linux_system_roles.storage : Ensure ansible_facts used by role] *** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:2 Friday 17 January 2025 04:20:04 -0500 (0:00:00.150) 0:09:57.402 ******** skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Set platform/version specific variables] *** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:7 Friday 17 January 2025 04:20:04 -0500 (0:00:00.090) 0:09:57.493 ******** skipping: [managed-node1] => (item=RedHat.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat.yml", "skip_reason": "Conditional result was False" } skipping: [managed-node1] => (item=CentOS.yml) => { "ansible_loop_var": "item", "changed": false, "item": "CentOS.yml", "skip_reason": "Conditional result was False" } ok: [managed-node1] => (item=CentOS_7.yml) => { "ansible_facts": { "__storage_blivet_diskvolume_mkfs_option_map": { "ext2": "-F", "ext3": "-F", "ext4": "-F" }, "blivet_package_list": [ "python-enum34", "python-blivet3", "libblockdev-crypto", "libblockdev-dm", "libblockdev-lvm", "libblockdev-mdraid", "libblockdev-swap", "{{ 'libblockdev-s390' if ansible_architecture == 's390x' else 'libblockdev' }}" ] }, "ansible_included_var_files": [ "/tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/vars/CentOS_7.yml" ], "ansible_loop_var": "item", "changed": false, "item": "CentOS_7.yml" } skipping: [managed-node1] => (item=CentOS_7.9.yml) => { "ansible_loop_var": "item", "changed": false, "item": "CentOS_7.9.yml", "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Check if system is ostree] *********** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:25 Friday 17 January 2025 04:20:05 -0500 (0:00:00.257) 0:09:57.751 ******** skipping: [managed-node1] => { 
"changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Set flag to indicate system is ostree] *** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:30 Friday 17 January 2025 04:20:05 -0500 (0:00:00.058) 0:09:57.809 ******** skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Define an empty list of pools to be used in testing] *** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:5 Friday 17 January 2025 04:20:05 -0500 (0:00:00.100) 0:09:57.910 ******** ok: [managed-node1] => { "ansible_facts": { "_storage_pools_list": [] }, "changed": false } TASK [fedora.linux_system_roles.storage : Define an empty list of volumes to be used in testing] *** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:9 Friday 17 January 2025 04:20:05 -0500 (0:00:00.058) 0:09:57.968 ******** ok: [managed-node1] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false } TASK [fedora.linux_system_roles.storage : Include the appropriate provider tasks] *** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:13 Friday 17 January 2025 04:20:05 -0500 (0:00:00.099) 0:09:58.067 ******** included: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml for managed-node1 TASK [fedora.linux_system_roles.storage : Make sure blivet is available] ******* task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:2 Friday 17 January 2025 04:20:05 -0500 (0:00:00.178) 0:09:58.246 ******** ok: [managed-node1] => { "changed": false, "rc": 0, "results": [ "python-enum34-1.0.4-1.el7.noarch providing python-enum34 is already installed", "1:python2-blivet3-3.1.3-3.el7.noarch providing python-blivet3 is already installed", "libblockdev-crypto-2.18-5.el7.x86_64 providing libblockdev-crypto is already installed", "libblockdev-dm-2.18-5.el7.x86_64 providing libblockdev-dm is already installed", "libblockdev-lvm-2.18-5.el7.x86_64 providing libblockdev-lvm is already installed", "libblockdev-mdraid-2.18-5.el7.x86_64 providing libblockdev-mdraid is already installed", "libblockdev-swap-2.18-5.el7.x86_64 providing libblockdev-swap is already installed", "libblockdev-2.18-5.el7.x86_64 providing libblockdev is already installed" ] } TASK [fedora.linux_system_roles.storage : Show storage_pools] ****************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:9 Friday 17 January 2025 04:20:09 -0500 (0:00:03.982) 0:10:02.228 ******** ok: [managed-node1] => { "storage_pools": [ { "disks": [ "sda" ], "name": "foo", "type": "lvm", "volumes": [ { "encryption": true, "encryption_luks_version": "luks2", "encryption_password": "yabbadabbadoo", "mount_point": "/opt/test1", "name": "test1", "size": "4g" } ] } ] } TASK [fedora.linux_system_roles.storage : Show storage_volumes] **************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:14 Friday 17 January 2025 04:20:09 -0500 (0:00:00.087) 0:10:02.316 ******** ok: [managed-node1] => { "storage_volumes": "VARIABLE IS NOT DEFINED!: 'storage_volumes' is undefined" } TASK 
[fedora.linux_system_roles.storage : Get required packages] *************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:19 Friday 17 January 2025 04:20:09 -0500 (0:00:00.113) 0:10:02.429 ******** ok: [managed-node1] => { "actions": [], "changed": false, "crypts": [], "leaves": [], "mounts": [], "packages": [ "cryptsetup", "lvm2" ], "pools": [], "volumes": [] } TASK [fedora.linux_system_roles.storage : Enable copr repositories if needed] *** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:31 Friday 17 January 2025 04:20:13 -0500 (0:00:04.105) 0:10:06.535 ******** included: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml for managed-node1 TASK [fedora.linux_system_roles.storage : Check if the COPR support packages should be installed] *** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:2 Friday 17 January 2025 04:20:13 -0500 (0:00:00.142) 0:10:06.677 ******** TASK [fedora.linux_system_roles.storage : Make sure COPR support packages are present] *** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:13 Friday 17 January 2025 04:20:14 -0500 (0:00:00.053) 0:10:06.731 ******** skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Enable COPRs] ************************ task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:19 Friday 17 January 2025 04:20:14 -0500 (0:00:00.061) 0:10:06.792 ******** TASK [fedora.linux_system_roles.storage : Make sure required packages are installed] *** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:37 Friday 17 January 2025 04:20:14 -0500 (0:00:00.069) 0:10:06.862 ******** ok: [managed-node1] => { "changed": false, "rc": 0, "results": [ "cryptsetup-2.0.3-6.el7.x86_64 providing cryptsetup is already installed", "7:lvm2-2.02.187-6.el7_9.5.x86_64 providing lvm2 is already installed", "kpartx-0.4.9-136.el7_9.x86_64 providing kpartx is already installed" ] } TASK [fedora.linux_system_roles.storage : Get service facts] ******************* task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:51 Friday 17 January 2025 04:20:15 -0500 (0:00:00.938) 0:10:07.800 ******** ok: [managed-node1] => { "ansible_facts": { "services": { "NetworkManager-dispatcher.service": { "name": "NetworkManager-dispatcher.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "NetworkManager-wait-online.service": { "name": "NetworkManager-wait-online.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "NetworkManager.service": { "name": "NetworkManager.service", "source": "systemd", "state": "running", "status": "enabled" }, "arp-ethers.service": { "name": "arp-ethers.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "auditd.service": { "name": "auditd.service", "source": "systemd", "state": "running", "status": "enabled" }, "auth-rpcgss-module.service": { "name": "auth-rpcgss-module.service", "source": "systemd", "state": "stopped", "status": "static" }, "autovt@.service": { "name": "autovt@.service", "source": "systemd", "state": "unknown", 
"status": "enabled" }, "blivet.service": { "name": "blivet.service", "source": "systemd", "state": "inactive", "status": "static" }, "blk-availability.service": { "name": "blk-availability.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "brandbot.service": { "name": "brandbot.service", "source": "systemd", "state": "inactive", "status": "static" }, "chrony-dnssrv@.service": { "name": "chrony-dnssrv@.service", "source": "systemd", "state": "unknown", "status": "static" }, "chrony-wait.service": { "name": "chrony-wait.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "chronyd.service": { "name": "chronyd.service", "source": "systemd", "state": "running", "status": "enabled" }, "cloud-config.service": { "name": "cloud-config.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-final.service": { "name": "cloud-final.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-init-local.service": { "name": "cloud-init-local.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-init.service": { "name": "cloud-init.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "console-getty.service": { "name": "console-getty.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "console-shell.service": { "name": "console-shell.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "container-getty@.service": { "name": "container-getty@.service", "source": "systemd", "state": "unknown", "status": "static" }, "cpupower.service": { "name": "cpupower.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "crond.service": { "name": "crond.service", "source": "systemd", "state": "running", "status": "enabled" }, "dbus-org.freedesktop.hostname1.service": { "name": "dbus-org.freedesktop.hostname1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus-org.freedesktop.import1.service": { "name": "dbus-org.freedesktop.import1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus-org.freedesktop.locale1.service": { "name": "dbus-org.freedesktop.locale1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus-org.freedesktop.login1.service": { "name": "dbus-org.freedesktop.login1.service", "source": "systemd", "state": "active", "status": "static" }, "dbus-org.freedesktop.machine1.service": { "name": "dbus-org.freedesktop.machine1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus-org.freedesktop.nm-dispatcher.service": { "name": "dbus-org.freedesktop.nm-dispatcher.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "dbus-org.freedesktop.timedate1.service": { "name": "dbus-org.freedesktop.timedate1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus.service": { "name": "dbus.service", "source": "systemd", "state": "running", "status": "static" }, "debug-shell.service": { "name": "debug-shell.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "dm-event.service": { "name": "dm-event.service", "source": "systemd", "state": "stopped", "status": "static" }, "dmraid-activation.service": { "name": "dmraid-activation.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "dracut-cmdline.service": { "name": "dracut-cmdline.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-initqueue.service": 
{ "name": "dracut-initqueue.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-mount.service": { "name": "dracut-mount.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-mount.service": { "name": "dracut-pre-mount.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-pivot.service": { "name": "dracut-pre-pivot.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-trigger.service": { "name": "dracut-pre-trigger.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-udev.service": { "name": "dracut-pre-udev.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-shutdown.service": { "name": "dracut-shutdown.service", "source": "systemd", "state": "stopped", "status": "static" }, "ebtables.service": { "name": "ebtables.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "emergency.service": { "name": "emergency.service", "source": "systemd", "state": "stopped", "status": "static" }, "firewalld.service": { "name": "firewalld.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "fstrim.service": { "name": "fstrim.service", "source": "systemd", "state": "inactive", "status": "static" }, "getty@.service": { "name": "getty@.service", "source": "systemd", "state": "unknown", "status": "enabled" }, "getty@tty1.service": { "name": "getty@tty1.service", "source": "systemd", "state": "running", "status": "unknown" }, "gssproxy.service": { "name": "gssproxy.service", "source": "systemd", "state": "running", "status": "disabled" }, "halt-local.service": { "name": "halt-local.service", "source": "systemd", "state": "inactive", "status": "static" }, "initrd-cleanup.service": { "name": "initrd-cleanup.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-parse-etc.service": { "name": "initrd-parse-etc.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-switch-root.service": { "name": "initrd-switch-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-udevadm-cleanup-db.service": { "name": "initrd-udevadm-cleanup-db.service", "source": "systemd", "state": "stopped", "status": "static" }, "iprdump.service": { "name": "iprdump.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "iprinit.service": { "name": "iprinit.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "iprupdate.service": { "name": "iprupdate.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "irqbalance.service": { "name": "irqbalance.service", "source": "systemd", "state": "running", "status": "enabled" }, "kdump.service": { "name": "kdump.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "kmod-static-nodes.service": { "name": "kmod-static-nodes.service", "source": "systemd", "state": "stopped", "status": "static" }, "lvm2-lvmetad.service": { "name": "lvm2-lvmetad.service", "source": "systemd", "state": "running", "status": "static" }, "lvm2-lvmpolld.service": { "name": "lvm2-lvmpolld.service", "source": "systemd", "state": "stopped", "status": "static" }, "lvm2-monitor.service": { "name": "lvm2-monitor.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "lvm2-pvscan@.service": { "name": "lvm2-pvscan@.service", "source": "systemd", "state": "unknown", "status": "static" }, "lvm2-pvscan@8:0.service": { "name": 
"lvm2-pvscan@8:0.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "mdadm-grow-continue@.service": { "name": "mdadm-grow-continue@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdadm-last-resort@.service": { "name": "mdadm-last-resort@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdcheck_continue.service": { "name": "mdcheck_continue.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdcheck_start.service": { "name": "mdcheck_start.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdmon@.service": { "name": "mdmon@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdmonitor-oneshot.service": { "name": "mdmonitor-oneshot.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdmonitor.service": { "name": "mdmonitor.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "messagebus.service": { "name": "messagebus.service", "source": "systemd", "state": "active", "status": "static" }, "microcode.service": { "name": "microcode.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "netconsole": { "name": "netconsole", "source": "sysv", "state": "stopped", "status": "disabled" }, "network": { "name": "network", "source": "sysv", "state": "running", "status": "enabled" }, "network.service": { "name": "network.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "nfs-blkmap.service": { "name": "nfs-blkmap.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfs-config.service": { "name": "nfs-config.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-idmap.service": { "name": "nfs-idmap.service", "source": "systemd", "state": "inactive", "status": "static" }, "nfs-idmapd.service": { "name": "nfs-idmapd.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-lock.service": { "name": "nfs-lock.service", "source": "systemd", "state": "inactive", "status": "static" }, "nfs-mountd.service": { "name": "nfs-mountd.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-rquotad.service": { "name": "nfs-rquotad.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfs-secure.service": { "name": "nfs-secure.service", "source": "systemd", "state": "inactive", "status": "static" }, "nfs-server.service": { "name": "nfs-server.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "nfs-utils.service": { "name": "nfs-utils.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs.service": { "name": "nfs.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfslock.service": { "name": "nfslock.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-halt.service": { "name": "plymouth-halt.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "plymouth-kexec.service": { "name": "plymouth-kexec.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "plymouth-poweroff.service": { "name": "plymouth-poweroff.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "plymouth-quit-wait.service": { "name": "plymouth-quit-wait.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "plymouth-quit.service": { "name": "plymouth-quit.service", "source": "systemd", "state": "stopped", "status": "disabled" }, 
"plymouth-read-write.service": { "name": "plymouth-read-write.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "plymouth-reboot.service": { "name": "plymouth-reboot.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "plymouth-start.service": { "name": "plymouth-start.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "plymouth-switch-root.service": { "name": "plymouth-switch-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "polkit.service": { "name": "polkit.service", "source": "systemd", "state": "running", "status": "static" }, "postfix.service": { "name": "postfix.service", "source": "systemd", "state": "running", "status": "enabled" }, "qemu-guest-agent.service": { "name": "qemu-guest-agent.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "quotaon.service": { "name": "quotaon.service", "source": "systemd", "state": "inactive", "status": "static" }, "rc-local.service": { "name": "rc-local.service", "source": "systemd", "state": "stopped", "status": "static" }, "rdisc.service": { "name": "rdisc.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rescue.service": { "name": "rescue.service", "source": "systemd", "state": "stopped", "status": "static" }, "restraintd.service": { "name": "restraintd.service", "source": "systemd", "state": "running", "status": "enabled" }, "rhel-autorelabel-mark.service": { "name": "rhel-autorelabel-mark.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "rhel-autorelabel.service": { "name": "rhel-autorelabel.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "rhel-configure.service": { "name": "rhel-configure.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "rhel-dmesg.service": { "name": "rhel-dmesg.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "rhel-domainname.service": { "name": "rhel-domainname.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "rhel-import-state.service": { "name": "rhel-import-state.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "rhel-loadmodules.service": { "name": "rhel-loadmodules.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "rhel-readonly.service": { "name": "rhel-readonly.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "rngd.service": { "name": "rngd.service", "source": "systemd", "state": "running", "status": "enabled" }, "rpc-gssd.service": { "name": "rpc-gssd.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-rquotad.service": { "name": "rpc-rquotad.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rpc-statd-notify.service": { "name": "rpc-statd-notify.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-statd.service": { "name": "rpc-statd.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpcbind.service": { "name": "rpcbind.service", "source": "systemd", "state": "running", "status": "enabled" }, "rpcgssd.service": { "name": "rpcgssd.service", "source": "systemd", "state": "inactive", "status": "static" }, "rpcidmapd.service": { "name": "rpcidmapd.service", "source": "systemd", "state": "inactive", "status": "static" }, "rsyncd.service": { "name": "rsyncd.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rsyncd@.service": { "name": 
"rsyncd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "rsyslog.service": { "name": "rsyslog.service", "source": "systemd", "state": "running", "status": "enabled" }, "selinux-policy-migrate-local-changes@.service": { "name": "selinux-policy-migrate-local-changes@.service", "source": "systemd", "state": "unknown", "status": "static" }, "selinux-policy-migrate-local-changes@targeted.service": { "name": "selinux-policy-migrate-local-changes@targeted.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "serial-getty@.service": { "name": "serial-getty@.service", "source": "systemd", "state": "unknown", "status": "disabled" }, "serial-getty@ttyS0.service": { "name": "serial-getty@ttyS0.service", "source": "systemd", "state": "running", "status": "unknown" }, "sshd-keygen.service": { "name": "sshd-keygen.service", "source": "systemd", "state": "stopped", "status": "static" }, "sshd.service": { "name": "sshd.service", "source": "systemd", "state": "running", "status": "enabled" }, "sshd@.service": { "name": "sshd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-ask-password-console.service": { "name": "systemd-ask-password-console.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-ask-password-plymouth.service": { "name": "systemd-ask-password-plymouth.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-ask-password-wall.service": { "name": "systemd-ask-password-wall.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-backlight@.service": { "name": "systemd-backlight@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-binfmt.service": { "name": "systemd-binfmt.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-bootchart.service": { "name": "systemd-bootchart.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "systemd-firstboot.service": { "name": "systemd-firstboot.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-fsck-root.service": { "name": "systemd-fsck-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-fsck@.service": { "name": "systemd-fsck@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-halt.service": { "name": "systemd-halt.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hibernate-resume@.service": { "name": "systemd-hibernate-resume@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-hibernate.service": { "name": "systemd-hibernate.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hostnamed.service": { "name": "systemd-hostnamed.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hwdb-update.service": { "name": "systemd-hwdb-update.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-hybrid-sleep.service": { "name": "systemd-hybrid-sleep.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-importd.service": { "name": "systemd-importd.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-initctl.service": { "name": "systemd-initctl.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journal-catalog-update.service": { "name": "systemd-journal-catalog-update.service", "source": "systemd", 
"state": "stopped", "status": "static" }, "systemd-journal-flush.service": { "name": "systemd-journal-flush.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journald.service": { "name": "systemd-journald.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-kexec.service": { "name": "systemd-kexec.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-localed.service": { "name": "systemd-localed.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-logind.service": { "name": "systemd-logind.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-machine-id-commit.service": { "name": "systemd-machine-id-commit.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-machined.service": { "name": "systemd-machined.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-modules-load.service": { "name": "systemd-modules-load.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-nspawn@.service": { "name": "systemd-nspawn@.service", "source": "systemd", "state": "unknown", "status": "disabled" }, "systemd-poweroff.service": { "name": "systemd-poweroff.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-quotacheck.service": { "name": "systemd-quotacheck.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-random-seed.service": { "name": "systemd-random-seed.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-readahead-collect.service": { "name": "systemd-readahead-collect.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "systemd-readahead-done.service": { "name": "systemd-readahead-done.service", "source": "systemd", "state": "stopped", "status": "indirect" }, "systemd-readahead-drop.service": { "name": "systemd-readahead-drop.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "systemd-readahead-replay.service": { "name": "systemd-readahead-replay.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "systemd-reboot.service": { "name": "systemd-reboot.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-remount-fs.service": { "name": "systemd-remount-fs.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-rfkill@.service": { "name": "systemd-rfkill@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-shutdownd.service": { "name": "systemd-shutdownd.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-suspend.service": { "name": "systemd-suspend.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-sysctl.service": { "name": "systemd-sysctl.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-timedated.service": { "name": "systemd-timedated.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-tmpfiles-clean.service": { "name": "systemd-tmpfiles-clean.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles-setup-dev.service": { "name": "systemd-tmpfiles-setup-dev.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles-setup.service": { "name": "systemd-tmpfiles-setup.service", "source": "systemd", "state": "stopped", "status": "static" }, 
"systemd-udev-settle.service": { "name": "systemd-udev-settle.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udev-trigger.service": { "name": "systemd-udev-trigger.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udevd.service": { "name": "systemd-udevd.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-update-done.service": { "name": "systemd-update-done.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-update-utmp-runlevel.service": { "name": "systemd-update-utmp-runlevel.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-update-utmp.service": { "name": "systemd-update-utmp.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-user-sessions.service": { "name": "systemd-user-sessions.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-vconsole-setup.service": { "name": "systemd-vconsole-setup.service", "source": "systemd", "state": "stopped", "status": "static" }, "target.service": { "name": "target.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "targetclid.service": { "name": "targetclid.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "teamd@.service": { "name": "teamd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "tuned.service": { "name": "tuned.service", "source": "systemd", "state": "running", "status": "enabled" }, "wpa_supplicant.service": { "name": "wpa_supplicant.service", "source": "systemd", "state": "inactive", "status": "disabled" } } }, "changed": false } TASK [fedora.linux_system_roles.storage : Set storage_cryptsetup_services] ***** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:57 Friday 17 January 2025 04:20:16 -0500 (0:00:01.187) 0:10:08.988 ******** ok: [managed-node1] => { "ansible_facts": { "storage_cryptsetup_services": [] }, "changed": false } TASK [fedora.linux_system_roles.storage : Mask the systemd cryptsetup services] *** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:63 Friday 17 January 2025 04:20:16 -0500 (0:00:00.079) 0:10:09.067 ******** TASK [fedora.linux_system_roles.storage : Manage the pools and volumes to match the specified state] *** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:69 Friday 17 January 2025 04:20:16 -0500 (0:00:00.050) 0:10:09.118 ******** changed: [managed-node1] => { "actions": [ { "action": "destroy format", "device": "/dev/mapper/foo-test1", "fs_type": "xfs" }, { "action": "create format", "device": "/dev/mapper/foo-test1", "fs_type": "luks" }, { "action": "create device", "device": "/dev/mapper/luks-f2c4c01a-e23d-4ad3-a7f9-93d565703bdb", "fs_type": null }, { "action": "create format", "device": "/dev/mapper/luks-f2c4c01a-e23d-4ad3-a7f9-93d565703bdb", "fs_type": "xfs" } ], "changed": true, "crypts": [ { "backing_device": "/dev/mapper/foo-test1", "name": "luks-f2c4c01a-e23d-4ad3-a7f9-93d565703bdb", "password": "-", "state": "present" } ], "leaves": [ "/dev/sdb", "/dev/sdc", "/dev/sdd", "/dev/sde", "/dev/sdf", "/dev/sdg", "/dev/sdh", "/dev/sdi", "/dev/xvda1", "/dev/mapper/luks-f2c4c01a-e23d-4ad3-a7f9-93d565703bdb" ], "mounts": [ { "fstype": "xfs", "path": "/opt/test1", "src": "/dev/mapper/foo-test1", "state": "absent" }, { "dump": 0, 
"fstype": "xfs", "group": null, "mode": null, "opts": "defaults", "owner": null, "passno": 0, "path": "/opt/test1", "src": "/dev/mapper/luks-f2c4c01a-e23d-4ad3-a7f9-93d565703bdb", "state": "mounted" } ], "packages": [ "cryptsetup", "xfsprogs", "e2fsprogs", "lvm2" ], "pools": [ { "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_clevis_pin": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "encryption_tang_thumbprint": null, "encryption_tang_url": null, "grow_to_fill": false, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "shared": false, "state": "present", "type": "lvm", "volumes": [ { "_device": "/dev/mapper/luks-f2c4c01a-e23d-4ad3-a7f9-93d565703bdb", "_kernel_device": "/dev/dm-1", "_mount_id": "/dev/mapper/luks-f2c4c01a-e23d-4ad3-a7f9-93d565703bdb", "_raw_device": "/dev/mapper/foo-test1", "_raw_kernel_device": "/dev/dm-0", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": true, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": "luks2", "encryption_password": "VALUE_SPECIFIED_IN_NO_LOG_PARAMETER", "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "mount_user": null, "name": "test1", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": "4g", "state": "present", "thin": false, "thin_pool_name": null, "thin_pool_size": null, "type": "lvm", "vdo_pool_size": null } ] } ], "volumes": [] } TASK [fedora.linux_system_roles.storage : Workaround for udev issue on some platforms] *** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:83 Friday 17 January 2025 04:20:29 -0500 (0:00:12.792) 0:10:21.911 ******** skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Check if /etc/fstab is present] ****** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:90 Friday 17 January 2025 04:20:29 -0500 (0:00:00.057) 0:10:21.969 ******** ok: [managed-node1] => { "changed": false, "stat": { "atime": 1737105562.6842158, "attr_flags": "e", "attributes": [ "extents" ], "block_size": 4096, "blocks": 8, "charset": "us-ascii", "checksum": "3fceedeef6c619b69ada96279531b69ed89734ba", "ctime": 1737105562.6812158, "dev": 51713, "device_type": 0, "executable": false, "exists": true, "gid": 0, "gr_name": "root", "inode": 263644, "isblk": false, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": true, "issock": false, "isuid": false, "mimetype": "text/plain", "mode": "0644", "mtime": 1737105562.6812158, "nlink": 1, "path": "/etc/fstab", "pw_name": "root", "readable": true, "rgrp": true, "roth": true, "rusr": true, "size": 1279, "uid": 0, "version": "18446744072031193646", "wgrp": false, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK 
[fedora.linux_system_roles.storage : Add fingerprint to /etc/fstab if present] *** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:95 Friday 17 January 2025 04:20:29 -0500 (0:00:00.410) 0:10:22.379 ******** ok: [managed-node1] => { "backup": "", "changed": false } TASK [fedora.linux_system_roles.storage : Unmask the systemd cryptsetup services] *** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:113 Friday 17 January 2025 04:20:30 -0500 (0:00:00.406) 0:10:22.786 ******** TASK [fedora.linux_system_roles.storage : Show blivet_output] ****************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:119 Friday 17 January 2025 04:20:30 -0500 (0:00:00.054) 0:10:22.840 ******** ok: [managed-node1] => { "blivet_output": { "actions": [ { "action": "destroy format", "device": "/dev/mapper/foo-test1", "fs_type": "xfs" }, { "action": "create format", "device": "/dev/mapper/foo-test1", "fs_type": "luks" }, { "action": "create device", "device": "/dev/mapper/luks-f2c4c01a-e23d-4ad3-a7f9-93d565703bdb", "fs_type": null }, { "action": "create format", "device": "/dev/mapper/luks-f2c4c01a-e23d-4ad3-a7f9-93d565703bdb", "fs_type": "xfs" } ], "changed": true, "crypts": [ { "backing_device": "/dev/mapper/foo-test1", "name": "luks-f2c4c01a-e23d-4ad3-a7f9-93d565703bdb", "password": "-", "state": "present" } ], "failed": false, "leaves": [ "/dev/sdb", "/dev/sdc", "/dev/sdd", "/dev/sde", "/dev/sdf", "/dev/sdg", "/dev/sdh", "/dev/sdi", "/dev/xvda1", "/dev/mapper/luks-f2c4c01a-e23d-4ad3-a7f9-93d565703bdb" ], "mounts": [ { "fstype": "xfs", "path": "/opt/test1", "src": "/dev/mapper/foo-test1", "state": "absent" }, { "dump": 0, "fstype": "xfs", "group": null, "mode": null, "opts": "defaults", "owner": null, "passno": 0, "path": "/opt/test1", "src": "/dev/mapper/luks-f2c4c01a-e23d-4ad3-a7f9-93d565703bdb", "state": "mounted" } ], "packages": [ "cryptsetup", "xfsprogs", "e2fsprogs", "lvm2" ], "pools": [ { "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_clevis_pin": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "encryption_tang_thumbprint": null, "encryption_tang_url": null, "grow_to_fill": false, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "shared": false, "state": "present", "type": "lvm", "volumes": [ { "_device": "/dev/mapper/luks-f2c4c01a-e23d-4ad3-a7f9-93d565703bdb", "_kernel_device": "/dev/dm-1", "_mount_id": "/dev/mapper/luks-f2c4c01a-e23d-4ad3-a7f9-93d565703bdb", "_raw_device": "/dev/mapper/foo-test1", "_raw_kernel_device": "/dev/dm-0", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": true, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": "luks2", "encryption_password": "VALUE_SPECIFIED_IN_NO_LOG_PARAMETER", "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "mount_user": null, "name": "test1", "raid_chunk_size": null, "raid_device_count": null, 
"raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": "4g", "state": "present", "thin": false, "thin_pool_name": null, "thin_pool_size": null, "type": "lvm", "vdo_pool_size": null } ] } ], "volumes": [] } } TASK [fedora.linux_system_roles.storage : Set the list of pools for test verification] *** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:128 Friday 17 January 2025 04:20:30 -0500 (0:00:00.079) 0:10:22.920 ******** ok: [managed-node1] => { "ansible_facts": { "_storage_pools_list": [ { "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_clevis_pin": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "encryption_tang_thumbprint": null, "encryption_tang_url": null, "grow_to_fill": false, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "shared": false, "state": "present", "type": "lvm", "volumes": [ { "_device": "/dev/mapper/luks-f2c4c01a-e23d-4ad3-a7f9-93d565703bdb", "_kernel_device": "/dev/dm-1", "_mount_id": "/dev/mapper/luks-f2c4c01a-e23d-4ad3-a7f9-93d565703bdb", "_raw_device": "/dev/mapper/foo-test1", "_raw_kernel_device": "/dev/dm-0", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": true, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": "luks2", "encryption_password": "VALUE_SPECIFIED_IN_NO_LOG_PARAMETER", "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "mount_user": null, "name": "test1", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": "4g", "state": "present", "thin": false, "thin_pool_name": null, "thin_pool_size": null, "type": "lvm", "vdo_pool_size": null } ] } ] }, "changed": false } TASK [fedora.linux_system_roles.storage : Set the list of volumes for test verification] *** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:132 Friday 17 January 2025 04:20:30 -0500 (0:00:00.072) 0:10:22.992 ******** ok: [managed-node1] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false } TASK [fedora.linux_system_roles.storage : Remove obsolete mounts] ************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:148 Friday 17 January 2025 04:20:30 -0500 (0:00:00.068) 0:10:23.060 ******** changed: [managed-node1] => (item={u'src': u'/dev/mapper/foo-test1', u'state': u'absent', u'fstype': u'xfs', u'path': u'/opt/test1'}) => { "ansible_loop_var": "mount_info", "changed": true, "dump": "0", "fstab": "/etc/fstab", "fstype": "xfs", "mount_info": { "fstype": "xfs", "path": "/opt/test1", "src": "/dev/mapper/foo-test1", "state": "absent" }, "name": "/opt/test1", "opts": "defaults", "passno": "0", "src": "/dev/mapper/foo-test1" } TASK [fedora.linux_system_roles.storage : Tell systemd to refresh its view of /etc/fstab] *** task 
path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:159 Friday 17 January 2025 04:20:30 -0500 (0:00:00.416) 0:10:23.477 ******** ok: [managed-node1] => { "changed": false, "name": null, "status": {} } TASK [fedora.linux_system_roles.storage : Set up new/current mounts] *********** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:164 Friday 17 January 2025 04:20:31 -0500 (0:00:00.731) 0:10:24.208 ******** changed: [managed-node1] => (item={u'src': u'/dev/mapper/luks-f2c4c01a-e23d-4ad3-a7f9-93d565703bdb', u'group': None, u'dump': 0, u'passno': 0, u'fstype': u'xfs', u'state': u'mounted', u'mode': None, u'owner': None, u'path': u'/opt/test1', u'opts': u'defaults'}) => { "ansible_loop_var": "mount_info", "changed": true, "dump": "0", "fstab": "/etc/fstab", "fstype": "xfs", "mount_info": { "dump": 0, "fstype": "xfs", "group": null, "mode": null, "opts": "defaults", "owner": null, "passno": 0, "path": "/opt/test1", "src": "/dev/mapper/luks-f2c4c01a-e23d-4ad3-a7f9-93d565703bdb", "state": "mounted" }, "name": "/opt/test1", "opts": "defaults", "passno": "0", "src": "/dev/mapper/luks-f2c4c01a-e23d-4ad3-a7f9-93d565703bdb" } TASK [fedora.linux_system_roles.storage : Manage mount ownership/permissions] *** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:175 Friday 17 January 2025 04:20:32 -0500 (0:00:00.496) 0:10:24.705 ******** skipping: [managed-node1] => (item={u'src': u'/dev/mapper/luks-f2c4c01a-e23d-4ad3-a7f9-93d565703bdb', u'group': None, u'dump': 0, u'passno': 0, u'fstype': u'xfs', u'state': u'mounted', u'mode': None, u'owner': None, u'path': u'/opt/test1', u'opts': u'defaults'}) => { "ansible_loop_var": "mount_info", "changed": false, "mount_info": { "dump": 0, "fstype": "xfs", "group": null, "mode": null, "opts": "defaults", "owner": null, "passno": 0, "path": "/opt/test1", "src": "/dev/mapper/luks-f2c4c01a-e23d-4ad3-a7f9-93d565703bdb", "state": "mounted" }, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Tell systemd to refresh its view of /etc/fstab] *** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:187 Friday 17 January 2025 04:20:32 -0500 (0:00:00.065) 0:10:24.771 ******** ok: [managed-node1] => { "changed": false, "name": null, "status": {} } TASK [fedora.linux_system_roles.storage : Retrieve facts for the /etc/crypttab file] *** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:195 Friday 17 January 2025 04:20:32 -0500 (0:00:00.459) 0:10:25.230 ******** ok: [managed-node1] => { "changed": false, "stat": { "atime": 1737105567.80422, "attr_flags": "e", "attributes": [ "extents" ], "block_size": 4096, "blocks": 0, "charset": "binary", "checksum": "da39a3ee5e6b4b0d3255bfef95601890afd80709", "ctime": 1737105564.7222173, "dev": 51713, "device_type": 0, "executable": false, "exists": true, "gid": 0, "gr_name": "root", "inode": 263647, "isblk": false, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": true, "issock": false, "isuid": false, "mimetype": "inode/x-empty", "mode": "0600", "mtime": 1737105564.7212174, "nlink": 1, "path": "/etc/crypttab", "pw_name": "root", "readable": true, "rgrp": false, "roth": false, "rusr": true, "size": 0, "uid": 0, "version": "18446744072031195044", "wgrp": false, 
"woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [fedora.linux_system_roles.storage : Manage /etc/crypttab to account for changes we just made] *** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:200 Friday 17 January 2025 04:20:32 -0500 (0:00:00.354) 0:10:25.585 ******** changed: [managed-node1] => (item={u'state': u'present', u'password': u'-', u'name': u'luks-f2c4c01a-e23d-4ad3-a7f9-93d565703bdb', u'backing_device': u'/dev/mapper/foo-test1'}) => { "ansible_loop_var": "entry", "backup": "", "changed": true, "entry": { "backing_device": "/dev/mapper/foo-test1", "name": "luks-f2c4c01a-e23d-4ad3-a7f9-93d565703bdb", "password": "-", "state": "present" } } MSG: line added TASK [fedora.linux_system_roles.storage : Update facts] ************************ task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:222 Friday 17 January 2025 04:20:33 -0500 (0:00:00.360) 0:10:25.945 ******** ok: [managed-node1] TASK [Verify role results] ***************************************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/tests_luks2.yml:524 Friday 17 January 2025 04:20:33 -0500 (0:00:00.725) 0:10:26.670 ******** included: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml for managed-node1 TASK [Print out pool information] ********************************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:2 Friday 17 January 2025 04:20:34 -0500 (0:00:00.089) 0:10:26.760 ******** ok: [managed-node1] => { "_storage_pools_list": [ { "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_clevis_pin": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "encryption_tang_thumbprint": null, "encryption_tang_url": null, "grow_to_fill": false, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "shared": false, "state": "present", "type": "lvm", "volumes": [ { "_device": "/dev/mapper/luks-f2c4c01a-e23d-4ad3-a7f9-93d565703bdb", "_kernel_device": "/dev/dm-1", "_mount_id": "/dev/mapper/luks-f2c4c01a-e23d-4ad3-a7f9-93d565703bdb", "_raw_device": "/dev/mapper/foo-test1", "_raw_kernel_device": "/dev/dm-0", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": true, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": "luks2", "encryption_password": "VALUE_SPECIFIED_IN_NO_LOG_PARAMETER", "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "mount_user": null, "name": "test1", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": "4g", "state": "present", "thin": false, "thin_pool_name": null, "thin_pool_size": null, "type": "lvm", "vdo_pool_size": null } ] } ] } TASK [Print out volume 
information] ******************************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:7 Friday 17 January 2025 04:20:34 -0500 (0:00:00.052) 0:10:26.812 ******** skipping: [managed-node1] => {} TASK [Collect info about the volumes.] ***************************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:15 Friday 17 January 2025 04:20:34 -0500 (0:00:00.042) 0:10:26.854 ******** ok: [managed-node1] => { "changed": false, "info": { "/dev/mapper/foo-test1": { "fstype": "crypto_LUKS", "label": "", "mountpoint": "", "name": "/dev/mapper/foo-test1", "size": "4G", "type": "lvm", "uuid": "f2c4c01a-e23d-4ad3-a7f9-93d565703bdb" }, "/dev/mapper/luks-f2c4c01a-e23d-4ad3-a7f9-93d565703bdb": { "fstype": "xfs", "label": "", "mountpoint": "/opt/test1", "name": "/dev/mapper/luks-f2c4c01a-e23d-4ad3-a7f9-93d565703bdb", "size": "4G", "type": "crypt", "uuid": "36ee4f71-c5b1-4f1c-af2c-5de1373498ed" }, "/dev/sda": { "fstype": "LVM2_member", "label": "", "mountpoint": "", "name": "/dev/sda", "size": "10G", "type": "disk", "uuid": "30b7yk-4Y3c-DHs7-HP7U-2G7i-eOL6-HqzhXx" }, "/dev/sdb": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdb", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdc": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdc", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdd": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdd", "size": "1T", "type": "disk", "uuid": "" }, "/dev/sde": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sde", "size": "1T", "type": "disk", "uuid": "" }, "/dev/sdf": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdf", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdg": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdg", "size": "1T", "type": "disk", "uuid": "" }, "/dev/sdh": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdh", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdi": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdi", "size": "10G", "type": "disk", "uuid": "" }, "/dev/xvda": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/xvda", "size": "250G", "type": "disk", "uuid": "" }, "/dev/xvda1": { "fstype": "ext4", "label": "", "mountpoint": "/", "name": "/dev/xvda1", "size": "250G", "type": "partition", "uuid": "c7b7d6a5-fd01-4b9b-bcca-153eaff9d312" } } } TASK [Read the /etc/fstab file for volume existence] *************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:20 Friday 17 January 2025 04:20:34 -0500 (0:00:00.348) 0:10:27.202 ******** ok: [managed-node1] => { "changed": false, "cmd": [ "cat", "/etc/fstab" ], "delta": "0:00:00.002945", "end": "2025-01-17 04:20:34.771185", "rc": 0, "start": "2025-01-17 04:20:34.768240" } STDOUT: # system_role:storage # # /etc/fstab # Created by anaconda on Thu Jun 20 10:23:46 2024 # # Accessible filesystems, by reference, are maintained under '/dev/disk' # See man pages fstab(5), findfs(8), mount(8) and/or blkid(8) for more info # UUID=c7b7d6a5-fd01-4b9b-bcca-153eaff9d312 / ext4 defaults 1 1 ntap-bos-c01-eng01-nfs01b.storage.bos.redhat.com:/devops_engineering_nfs/devarchive/redhat /mnt/redhat nfs ro,rsize=32768,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0 nest.test.redhat.com:/mnt/qa /mnt/qa nfs 
defaults,rsize=8192,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0 vtap-eng01.storage.rdu2.redhat.com:/vol/engarchive /mnt/engarchive nfs ro,rsize=32768,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0 nest.test.redhat.com:/mnt/tpsdist /mnt/tpsdist nfs defaults,rsize=8192,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0 ntap-bos-c01-eng01-nfs01b.storage.bos.redhat.com:/devops_engineering_nfs/devarchive/redhat/brewroot /mnt/brew nfs ro,rsize=32768,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0 ntap-bos-c01-eng01-nfs01b.storage.bos.redhat.com:/devops_brew_scratch_nfs/scratch /mnt/brew_scratch nfs ro,rsize=32768,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0 /dev/mapper/luks-f2c4c01a-e23d-4ad3-a7f9-93d565703bdb /opt/test1 xfs defaults 0 0 TASK [Read the /etc/crypttab file] ********************************************* task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:25 Friday 17 January 2025 04:20:34 -0500 (0:00:00.321) 0:10:27.534 ******** ok: [managed-node1] => { "changed": false, "cmd": [ "cat", "/etc/crypttab" ], "delta": "0:00:00.002854", "end": "2025-01-17 04:20:35.095036", "failed_when_result": false, "rc": 0, "start": "2025-01-17 04:20:35.092182" } STDOUT: luks-f2c4c01a-e23d-4ad3-a7f9-93d565703bdb /dev/mapper/foo-test1 - TASK [Verify the volumes listed in storage_pools were correctly managed] ******* task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:34 Friday 17 January 2025 04:20:35 -0500 (0:00:00.321) 0:10:27.855 ******** included: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool.yml for managed-node1 TASK [Set _storage_pool_tests] ************************************************* task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool.yml:5 Friday 17 January 2025 04:20:35 -0500 (0:00:00.079) 0:10:27.935 ******** ok: [managed-node1] => { "ansible_facts": { "_storage_pool_tests": [ "members", "volumes" ] }, "changed": false } TASK [Get VG shared value status] ********************************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool.yml:18 Friday 17 January 2025 04:20:35 -0500 (0:00:00.039) 0:10:27.975 ******** ok: [managed-node1] => { "changed": false, "cmd": [ "vgs", "--noheadings", "--binary", "-o", "shared", "foo" ], "delta": "0:00:00.017965", "end": "2025-01-17 04:20:35.559835", "rc": 0, "start": "2025-01-17 04:20:35.541870" } STDOUT: 0 TASK [Verify that VG shared value checks out] ********************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool.yml:24 Friday 17 January 2025 04:20:35 -0500 (0:00:00.370) 0:10:28.346 ******** ok: [managed-node1] => { "changed": false } MSG: All assertions passed TASK [Verify pool subset] ****************************************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool.yml:34 Friday 17 January 2025 04:20:35 -0500 (0:00:00.078) 0:10:28.425 ******** included: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml for managed-node1 included: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-volumes.yml for managed-node1
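The crypttab line read back above has the standard three-field form: mapped name, backing device, key file. The trailing "-" means no key file is stored, so the passphrase must be supplied when the device is activated. A hypothetical assertion in the same spirit as the suite's checks (not its actual implementation):

    - name: Check for the expected /etc/crypttab entry (sketch)
      command: grep -c '^luks-f2c4c01a-e23d-4ad3-a7f9-93d565703bdb /dev/mapper/foo-test1 -' /etc/crypttab
      register: crypttab_entry
      changed_when: false
      failed_when: crypttab_entry.stdout != '1'  # exactly one entry expected

TASK [Set test variables]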
****************************************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:2 Friday 17 January 2025 04:20:35 -0500 (0:00:00.201) 0:10:28.627 ******** ok: [managed-node1] => { "ansible_facts": { "_storage_test_expected_pv_count": "1", "_storage_test_pool_pvs_lvm": [ "/dev/sda" ] }, "changed": false } TASK [Get the canonical device path for each member device] ******************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:8 Friday 17 January 2025 04:20:35 -0500 (0:00:00.056) 0:10:28.684 ******** ok: [managed-node1] => (item=/dev/sda) => { "ansible_loop_var": "pv", "changed": false, "device": "/dev/sda", "pv": "/dev/sda" } TASK [Set pvs lvm length] ****************************************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:17 Friday 17 January 2025 04:20:36 -0500 (0:00:00.363) 0:10:29.047 ******** ok: [managed-node1] => { "ansible_facts": { "__pvs_lvm_len": "1" }, "changed": false } TASK [Set pool pvs] ************************************************************ task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:22 Friday 17 January 2025 04:20:36 -0500 (0:00:00.046) 0:10:29.093 ******** ok: [managed-node1] => { "ansible_facts": { "_storage_test_pool_pvs": [ "/dev/sda" ] }, "changed": false } TASK [Verify PV count] ********************************************************* task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:27 Friday 17 January 2025 04:20:36 -0500 (0:00:00.047) 0:10:29.140 ******** ok: [managed-node1] => { "changed": false } MSG: All assertions passed TASK [Set expected pv type] **************************************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:36 Friday 17 January 2025 04:20:36 -0500 (0:00:00.058) 0:10:29.199 ******** ok: [managed-node1] => { "ansible_facts": { "_storage_test_expected_pv_type": "disk" }, "changed": false } TASK [Set expected pv type] **************************************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:41 Friday 17 January 2025 04:20:36 -0500 (0:00:00.069) 0:10:29.268 ******** ok: [managed-node1] => { "ansible_facts": { "_storage_test_expected_pv_type": "disk" }, "changed": false } TASK [Set expected pv type] **************************************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:46 Friday 17 January 2025 04:20:36 -0500 (0:00:00.060) 0:10:29.328 ******** skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check the type of each PV] *********************************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:51 Friday 17 January 2025 04:20:36 -0500 (0:00:00.051) 0:10:29.380 ******** ok: [managed-node1] => (item=/dev/sda) => { "ansible_loop_var": "pv", "changed": false, "pv": "/dev/sda" } MSG: All assertions passed TASK [Check that blivet supports PV grow to fill] ****************************** task 
path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:64 Friday 17 January 2025 04:20:36 -0500 (0:00:00.065) 0:10:29.446 ******** ok: [managed-node1] => { "changed": false, "rc": 0 } STDOUT: False STDERR: Shared connection to 10.31.46.65 closed. TASK [Verify that PVs fill the whole devices when they should] ***************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:73 Friday 17 January 2025 04:20:37 -0500 (0:00:00.340) 0:10:29.787 ******** skipping: [managed-node1] => (item=/dev/sda) => { "ansible_loop_var": "st_pool_pv", "changed": false, "skip_reason": "Conditional result was False", "st_pool_pv": "/dev/sda" } TASK [Check MD RAID] *********************************************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:83 Friday 17 January 2025 04:20:37 -0500 (0:00:00.107) 0:10:29.894 ******** included: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml for managed-node1 TASK [Get information about RAID] ********************************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:8 Friday 17 January 2025 04:20:37 -0500 (0:00:00.123) 0:10:30.017 ******** skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set active devices regex] ************************************************ task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:14 Friday 17 January 2025 04:20:37 -0500 (0:00:00.065) 0:10:30.083 ******** skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set spare devices regex] ************************************************* task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:19 Friday 17 January 2025 04:20:37 -0500 (0:00:00.056) 0:10:30.139 ******** skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set md version regex] **************************************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:24 Friday 17 January 2025 04:20:37 -0500 (0:00:00.058) 0:10:30.198 ******** skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set md chunk size regex] ************************************************* task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:29 Friday 17 January 2025 04:20:37 -0500 (0:00:00.038) 0:10:30.236 ******** skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Parse the chunk size] **************************************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:37 Friday 17 January 2025 04:20:37 -0500 (0:00:00.040) 0:10:30.276 ******** skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check RAID active devices count] ***************************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:46 
Friday 17 January 2025 04:20:37 -0500 (0:00:00.044) 0:10:30.321 ******** skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check RAID spare devices count] ****************************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:55 Friday 17 January 2025 04:20:37 -0500 (0:00:00.064) 0:10:30.385 ******** skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check RAID metadata version] ********************************************* task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:64 Friday 17 January 2025 04:20:37 -0500 (0:00:00.081) 0:10:30.467 ******** skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check RAID chunk size] *************************************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:74 Friday 17 January 2025 04:20:37 -0500 (0:00:00.146) 0:10:30.614 ******** skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Reset variables used by tests] ******************************************* task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:83 Friday 17 January 2025 04:20:38 -0500 (0:00:00.114) 0:10:30.731 ******** ok: [managed-node1] => { "ansible_facts": { "storage_test_md_active_devices_re": null, "storage_test_md_chunk_size_re": null, "storage_test_md_metadata_version_re": null, "storage_test_md_spare_devices_re": null }, "changed": false } TASK [Check LVM RAID] ********************************************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:86 Friday 17 January 2025 04:20:38 -0500 (0:00:00.065) 0:10:30.797 ******** included: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-lvmraid.yml for managed-node1 TASK [Validate pool member LVM RAID settings] ********************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-lvmraid.yml:2 Friday 17 January 2025 04:20:38 -0500 (0:00:00.143) 0:10:30.940 ******** included: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-lvmraid.yml for managed-node1 TASK [Get information about the LV] ******************************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-lvmraid.yml:8 Friday 17 January 2025 04:20:38 -0500 (0:00:00.091) 0:10:31.032 ******** skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set LV segment type] ***************************************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-lvmraid.yml:16 Friday 17 January 2025 04:20:38 -0500 (0:00:00.057) 0:10:31.089 ******** skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check segment type] ****************************************************** task path: 
/tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-lvmraid.yml:20 Friday 17 January 2025 04:20:38 -0500 (0:00:00.071) 0:10:31.161 ******** skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set LV stripe size] ****************************************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-lvmraid.yml:27 Friday 17 January 2025 04:20:38 -0500 (0:00:00.059) 0:10:31.220 ******** skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Parse the requested stripe size] ***************************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-lvmraid.yml:31 Friday 17 January 2025 04:20:38 -0500 (0:00:00.080) 0:10:31.301 ******** skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set expected stripe size] ************************************************ task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-lvmraid.yml:37 Friday 17 January 2025 04:20:38 -0500 (0:00:00.061) 0:10:31.362 ******** skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check stripe size] ******************************************************* task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-lvmraid.yml:42 Friday 17 January 2025 04:20:38 -0500 (0:00:00.070) 0:10:31.432 ******** skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check Thin Pools] ******************************************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:89 Friday 17 January 2025 04:20:38 -0500 (0:00:00.062) 0:10:31.495 ******** included: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-thin.yml for managed-node1 TASK [Validate pool member thinpool settings] ********************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-thin.yml:2 Friday 17 January 2025 04:20:38 -0500 (0:00:00.140) 0:10:31.635 ******** included: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-thin.yml for managed-node1 TASK [Get information about thinpool] ****************************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-thin.yml:8 Friday 17 January 2025 04:20:39 -0500 (0:00:00.158) 0:10:31.794 ******** skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check that volume is in correct thinpool (when thinp name is provided)] *** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-thin.yml:16 Friday 17 January 2025 04:20:39 -0500 (0:00:00.057) 0:10:31.851 ******** skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check that volume is in thinpool (when thinp name is not provided)] ****** task path: 
/tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-thin.yml:22 Friday 17 January 2025 04:20:39 -0500 (0:00:00.074) 0:10:31.925 ******** skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Reset variable used by test] ********************************************* task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-thin.yml:26 Friday 17 January 2025 04:20:39 -0500 (0:00:00.080) 0:10:32.006 ******** ok: [managed-node1] => { "ansible_facts": { "storage_test_thin_status": null }, "changed": false } TASK [Check member encryption] ************************************************* task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:92 Friday 17 January 2025 04:20:39 -0500 (0:00:00.065) 0:10:32.071 ******** included: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-encryption.yml for managed-node1 TASK [Set test variables] ****************************************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-encryption.yml:5 Friday 17 January 2025 04:20:39 -0500 (0:00:00.138) 0:10:32.209 ******** ok: [managed-node1] => { "ansible_facts": { "_storage_test_expected_crypttab_entries": "0", "_storage_test_expected_crypttab_key_file": "-" }, "changed": false } TASK [Validate pool member LUKS settings] ************************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-encryption.yml:10 Friday 17 January 2025 04:20:39 -0500 (0:00:00.114) 0:10:32.323 ******** skipping: [managed-node1] => (item=/dev/sda) => { "_storage_test_pool_member_path": "/dev/sda", "ansible_loop_var": "_storage_test_pool_member_path", "changed": false, "skip_reason": "Conditional result was False" } TASK [Validate pool member crypttab entries] *********************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-encryption.yml:17 Friday 17 January 2025 04:20:39 -0500 (0:00:00.068) 0:10:32.392 ******** included: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-crypttab.yml for managed-node1 TASK [Set variables used by tests] ********************************************* task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-crypttab.yml:2 Friday 17 January 2025 04:20:39 -0500 (0:00:00.132) 0:10:32.524 ******** ok: [managed-node1] => { "ansible_facts": { "_storage_test_crypttab_entries": [] }, "changed": false } TASK [Check for /etc/crypttab entry] ******************************************* task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-crypttab.yml:6 Friday 17 January 2025 04:20:39 -0500 (0:00:00.067) 0:10:32.592 ******** ok: [managed-node1] => { "changed": false } MSG: All assertions passed TASK [Validate the format of the crypttab entry] ******************************* task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-crypttab.yml:14 Friday 17 January 2025 04:20:39 -0500 (0:00:00.069) 0:10:32.661 ******** skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was 
False" } TASK [Check backing device of crypttab entry] ********************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-crypttab.yml:23 Friday 17 January 2025 04:20:40 -0500 (0:00:00.057) 0:10:32.719 ******** skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check key file of crypttab entry] **************************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-crypttab.yml:32 Friday 17 January 2025 04:20:40 -0500 (0:00:00.215) 0:10:32.934 ******** skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Clear test variables] **************************************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-crypttab.yml:41 Friday 17 January 2025 04:20:40 -0500 (0:00:00.074) 0:10:33.009 ******** ok: [managed-node1] => { "ansible_facts": { "_storage_test_crypttab_entries": null }, "changed": false } TASK [Clear test variables] **************************************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-encryption.yml:24 Friday 17 January 2025 04:20:40 -0500 (0:00:00.080) 0:10:33.089 ******** ok: [managed-node1] => { "ansible_facts": { "_storage_test_crypttab_entries": null, "_storage_test_crypttab_key_file": null }, "changed": false } TASK [Check VDO] *************************************************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:95 Friday 17 January 2025 04:20:40 -0500 (0:00:00.059) 0:10:33.149 ******** included: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-vdo.yml for managed-node1 TASK [Validate pool member VDO settings] *************************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-vdo.yml:2 Friday 17 January 2025 04:20:40 -0500 (0:00:00.142) 0:10:33.291 ******** included: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-vdo.yml for managed-node1 TASK [Get information about VDO deduplication] ********************************* task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-vdo.yml:8 Friday 17 January 2025 04:20:40 -0500 (0:00:00.108) 0:10:33.400 ******** skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check if VDO deduplication is off] *************************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-vdo.yml:15 Friday 17 January 2025 04:20:40 -0500 (0:00:00.081) 0:10:33.482 ******** skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check if VDO deduplication is on] **************************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-vdo.yml:21 Friday 17 January 2025 04:20:40 -0500 (0:00:00.069) 0:10:33.552 ******** skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Get information about 
VDO compression] *********************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-vdo.yml:27 Friday 17 January 2025 04:20:40 -0500 (0:00:00.073) 0:10:33.625 ******** skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check if VDO deduplication is off] *************************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-vdo.yml:34 Friday 17 January 2025 04:20:40 -0500 (0:00:00.066) 0:10:33.692 ******** skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check if VDO deduplication is on] **************************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-vdo.yml:40 Friday 17 January 2025 04:20:41 -0500 (0:00:00.092) 0:10:33.784 ******** skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Clear test variables] **************************************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-vdo.yml:46 Friday 17 January 2025 04:20:41 -0500 (0:00:00.061) 0:10:33.846 ******** ok: [managed-node1] => { "ansible_facts": { "storage_test_vdo_status": null }, "changed": false } TASK [Check Stratis] *********************************************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:98 Friday 17 January 2025 04:20:41 -0500 (0:00:00.059) 0:10:33.906 ******** included: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-stratis.yml for managed-node1 TASK [Run 'stratis report'] **************************************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-stratis.yml:6 Friday 17 January 2025 04:20:41 -0500 (0:00:00.189) 0:10:34.096 ******** skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Get information about Stratis] ******************************************* task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-stratis.yml:11 Friday 17 January 2025 04:20:41 -0500 (0:00:00.057) 0:10:34.153 ******** skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify that the pools was created] *************************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-stratis.yml:15 Friday 17 January 2025 04:20:41 -0500 (0:00:00.058) 0:10:34.212 ******** skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify that encryption is correctly set] ********************************* task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-stratis.yml:25 Friday 17 January 2025 04:20:41 -0500 (0:00:00.057) 0:10:34.269 ******** skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify that Clevis/Tang encryption is correctly set] ********************* task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-stratis.yml:34 
Friday 17 January 2025 04:20:41 -0500 (0:00:00.056) 0:10:34.326 ******** skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Reset variable used by test] ********************************************* task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-stratis.yml:44 Friday 17 January 2025 04:20:41 -0500 (0:00:00.061) 0:10:34.387 ******** ok: [managed-node1] => { "ansible_facts": { "storage_test_stratis_report": null }, "changed": false } TASK [Clean up test variables] ************************************************* task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:101 Friday 17 January 2025 04:20:41 -0500 (0:00:00.058) 0:10:34.445 ******** ok: [managed-node1] => { "ansible_facts": { "__pvs_lvm_len": null, "_storage_test_expected_pv_count": null, "_storage_test_expected_pv_type": null, "_storage_test_pool_pvs": [], "_storage_test_pool_pvs_lvm": [] }, "changed": false } TASK [Verify the volumes] ****************************************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-volumes.yml:3 Friday 17 January 2025 04:20:41 -0500 (0:00:00.062) 0:10:34.508 ******** included: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume.yml for managed-node1 TASK [Set storage volume test variables] *************************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume.yml:2 Friday 17 January 2025 04:20:41 -0500 (0:00:00.122) 0:10:34.631 ******** ok: [managed-node1] => { "ansible_facts": { "_storage_test_volume_present": true, "_storage_volume_tests": [ "mount", "fstab", "fs", "device", "encryption", "md", "size", "cache" ] }, "changed": false } TASK [Run test verify for {{ storage_test_volume_subset }}] ******************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume.yml:19 Friday 17 January 2025 04:20:42 -0500 (0:00:00.122) 0:10:34.753 ******** included: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml for managed-node1 included: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml for managed-node1 included: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fs.yml for managed-node1 included: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml for managed-node1 included: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml for managed-node1 included: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml for managed-node1 included: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml for managed-node1 included: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml for managed-node1 TASK [Get expected mount device based on device type] ************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:7 Friday 17 January 2025 04:20:42 
-0500 (0:00:00.454) 0:10:35.208 ******** ok: [managed-node1] => { "ansible_facts": { "storage_test_device_path": "/dev/mapper/luks-f2c4c01a-e23d-4ad3-a7f9-93d565703bdb" }, "changed": false } TASK [Set some facts] ********************************************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:11 Friday 17 January 2025 04:20:42 -0500 (0:00:00.064) 0:10:35.272 ******** ok: [managed-node1] => { "ansible_facts": { "storage_test_mount_expected_mount_point": "/opt/test1", "storage_test_swap_expected_matches": "0" }, "changed": false } TASK [Get information about the mountpoint directory] ************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:19 Friday 17 January 2025 04:20:42 -0500 (0:00:00.070) 0:10:35.343 ******** skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify the current mount state by device] ******************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:28 Friday 17 January 2025 04:20:42 -0500 (0:00:00.061) 0:10:35.404 ******** ok: [managed-node1] => { "changed": false } MSG: All assertions passed TASK [Verify mount directory user] ********************************************* task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:36 Friday 17 January 2025 04:20:42 -0500 (0:00:00.150) 0:10:35.555 ******** skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify mount directory group] ******************************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:42 Friday 17 January 2025 04:20:42 -0500 (0:00:00.104) 0:10:35.659 ******** skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify mount directory permissions] ************************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:48 Friday 17 January 2025 04:20:43 -0500 (0:00:00.061) 0:10:35.720 ******** skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Get path of test volume device] ****************************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:57 Friday 17 January 2025 04:20:43 -0500 (0:00:00.075) 0:10:35.796 ******** skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Gather swap info] ******************************************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:63 Friday 17 January 2025 04:20:43 -0500 (0:00:00.066) 0:10:35.862 ******** skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify swap status] ****************************************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:69 Friday 17 January 2025 04:20:43 -0500 (0:00:00.071) 0:10:35.934 ******** skipping: [managed-node1] => { "changed": 
false, "skip_reason": "Conditional result was False" } TASK [Unset facts] ************************************************************* task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:79 Friday 17 January 2025 04:20:43 -0500 (0:00:00.060) 0:10:35.994 ******** ok: [managed-node1] => { "ansible_facts": { "storage_test_found_mount_stat": null, "storage_test_mount_expected_mount_point": null, "storage_test_swap_expected_matches": null, "storage_test_swaps": null, "storage_test_sys_node": null }, "changed": false } TASK [Set some variables for fstab checking] *********************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:2 Friday 17 January 2025 04:20:43 -0500 (0:00:00.057) 0:10:36.052 ******** ok: [managed-node1] => { "ansible_facts": { "storage_test_fstab_expected_id_matches": "1", "storage_test_fstab_expected_mount_options_matches": "1", "storage_test_fstab_expected_mount_point_matches": "1", "storage_test_fstab_id_matches": [ "/dev/mapper/luks-f2c4c01a-e23d-4ad3-a7f9-93d565703bdb " ], "storage_test_fstab_mount_options_matches": [ " /opt/test1 xfs defaults " ], "storage_test_fstab_mount_point_matches": [ " /opt/test1 " ] }, "changed": false } TASK [Verify that the device identifier appears in /etc/fstab] ***************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:17 Friday 17 January 2025 04:20:43 -0500 (0:00:00.182) 0:10:36.234 ******** ok: [managed-node1] => { "changed": false } MSG: All assertions passed TASK [Verify the fstab mount point] ******************************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:24 Friday 17 January 2025 04:20:43 -0500 (0:00:00.069) 0:10:36.304 ******** ok: [managed-node1] => { "changed": false } MSG: All assertions passed TASK [Verify mount_options] **************************************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:33 Friday 17 January 2025 04:20:43 -0500 (0:00:00.074) 0:10:36.379 ******** skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify fingerprint] ****************************************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:45 Friday 17 January 2025 04:20:43 -0500 (0:00:00.061) 0:10:36.441 ******** ok: [managed-node1] => { "changed": false } MSG: All assertions passed TASK [Clean up variables] ****************************************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:52 Friday 17 January 2025 04:20:43 -0500 (0:00:00.080) 0:10:36.521 ******** ok: [managed-node1] => { "ansible_facts": { "storage_test_fstab_expected_id_matches": null, "storage_test_fstab_expected_mount_options_matches": null, "storage_test_fstab_expected_mount_point_matches": null, "storage_test_fstab_id_matches": null, "storage_test_fstab_mount_options_matches": null, "storage_test_fstab_mount_point_matches": null }, "changed": false } TASK [Verify fs type] ********************************************************** task path: 
/tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fs.yml:6 Friday 17 January 2025 04:20:43 -0500 (0:00:00.041) 0:10:36.562 ******** ok: [managed-node1] => { "changed": false } MSG: All assertions passed TASK [Verify fs label] ********************************************************* task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fs.yml:14 Friday 17 January 2025 04:20:43 -0500 (0:00:00.055) 0:10:36.617 ******** ok: [managed-node1] => { "changed": false } MSG: All assertions passed TASK [See whether the device node is present] ********************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:3 Friday 17 January 2025 04:20:43 -0500 (0:00:00.074) 0:10:36.692 ******** ok: [managed-node1] => { "changed": false, "stat": { "atime": 1737105628.957278, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 0, "charset": "binary", "ctime": 1737105628.957278, "dev": 5, "device_type": 64768, "executable": false, "exists": true, "gid": 6, "gr_name": "disk", "inode": 123121, "isblk": true, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": false, "issock": false, "isuid": false, "mimetype": "inode/symlink", "mode": "0660", "mtime": 1737105628.957278, "nlink": 1, "path": "/dev/mapper/foo-test1", "pw_name": "root", "readable": true, "rgrp": true, "roth": false, "rusr": true, "size": 0, "uid": 0, "version": null, "wgrp": true, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [Verify the presence/absence of the device node] ************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:9 Friday 17 January 2025 04:20:44 -0500 (0:00:00.504) 0:10:37.196 ******** ok: [managed-node1] => { "changed": false } MSG: All assertions passed TASK [Verify the presence/absence of the device node] ************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:16 Friday 17 January 2025 04:20:44 -0500 (0:00:00.178) 0:10:37.375 ******** skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Make sure we got info about this volume] ********************************* task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:23 Friday 17 January 2025 04:20:44 -0500 (0:00:00.129) 0:10:37.504 ******** ok: [managed-node1] => { "changed": false } MSG: All assertions passed TASK [Process volume type (set initial value) (1/2)] *************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:29 Friday 17 January 2025 04:20:44 -0500 (0:00:00.135) 0:10:37.640 ******** ok: [managed-node1] => { "ansible_facts": { "st_volume_type": "lvm" }, "changed": false } TASK [Process volume type (get RAID value) (2/2)] ****************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:33 Friday 17 January 2025 04:20:45 -0500 (0:00:00.068) 0:10:37.708 ******** skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" }
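Note the shape of the device stack being verified here: the LV /dev/mapper/foo-test1 (dm-0) carries the crypto_LUKS signature, while the dm-crypt mapping opened from it, /dev/mapper/luks-f2c4c01a-e23d-4ad3-a7f9-93d565703bdb (dm-1), carries the xfs filesystem mounted at /opt/test1. One way to eyeball the same stack by hand (a sketch, not part of the test suite):

    - name: Show the encrypted device stack on the pool disk
      command: lsblk --output NAME,TYPE,FSTYPE,MOUNTPOINT /dev/sda
      changed_when: false  # read-only inspection

TASK [Verify the volume's device type]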
TASK [Verify the volume's device type] *****************************************
task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:38
Friday 17 January 2025 04:20:45 -0500 (0:00:00.090) 0:10:37.799 ********
ok: [managed-node1] => { "changed": false }
MSG: All assertions passed

TASK [Stat the LUKS device, if encrypted] **************************************
task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:3
Friday 17 January 2025 04:20:45 -0500 (0:00:00.155) 0:10:37.954 ********
ok: [managed-node1] => { "changed": false, "stat": { "atime": 1737105629.0652783, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 0, "charset": "binary", "ctime": 1737105629.0652783, "dev": 5, "device_type": 64769, "executable": false, "exists": true, "gid": 6, "gr_name": "disk", "inode": 134375, "isblk": true, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": false, "issock": false, "isuid": false, "mimetype": "inode/symlink", "mode": "0660", "mtime": 1737105629.0652783, "nlink": 1, "path": "/dev/mapper/luks-f2c4c01a-e23d-4ad3-a7f9-93d565703bdb", "pw_name": "root", "readable": true, "rgrp": true, "roth": false, "rusr": true, "size": 0, "uid": 0, "version": null, "wgrp": true, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } }

TASK [Ensure cryptsetup is present] ********************************************
task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:10
Friday 17 January 2025 04:20:45 -0500 (0:00:00.676) 0:10:38.630 ********
ok: [managed-node1] => { "changed": false, "rc": 0, "results": [ "cryptsetup-2.0.3-6.el7.x86_64 providing cryptsetup is already installed" ] }

TASK [Collect LUKS info for this volume] ***************************************
task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:16
Friday 17 January 2025 04:20:47 -0500 (0:00:01.139) 0:10:39.770 ********
ok: [managed-node1] => { "changed": false, "cmd": [ "cryptsetup", "luksDump", "/dev/mapper/foo-test1" ], "delta": "0:00:00.026119", "end": "2025-01-17 04:20:47.612599", "rc": 0, "start": "2025-01-17 04:20:47.586480" }

STDOUT:

LUKS header information
Version:        2
Epoch:          3
Metadata area:  12288 bytes
UUID:           f2c4c01a-e23d-4ad3-a7f9-93d565703bdb
Label:          (no label)
Subsystem:      (no subsystem)
Flags:          (no flags)

Data segments:
  0: crypt
        offset: 4194304 [bytes]
        length: (whole device)
        cipher: aes-xts-plain64
        sector: 512 [bytes]

Keyslots:
  0: luks2
        Key:        512 bits
        Priority:   normal
        Cipher:     aes-xts-plain64
        PBKDF:      argon2i
        Time cost:  4
        Memory:     665047
        Threads:    2
        Salt:       ab 37 3d 96 c2 0c 4d c2 8d f7 c9 c1 50 43 cd 0d
                    f2 0a 33 84 35 f6 fb 17 89 c7 55 56 b8 0b 69 70
        AF stripes: 4000
        Area offset:32768 [bytes]
        Area length:258048 [bytes]
        Digest ID:  0
Tokens:
Digests:
  0: pbkdf2
        Hash:       sha256
        Iterations: 23173
        Salt:       9e 32 e5 c5 31 a8 af 42 2a a5 7b 39 28 f9 36 18
                    31 c6 06 0f 67 62 cd 4e 50 9d 05 c4 b0 ee e4 c2
        Digest:     24 68 d4 6b 5f 68 6f 29 92 1c 80 db 4d 63 43 65
                    ad 92 51 65 07 a8 f9 51 e2 34 bf 2b c5 ea a4 b7
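The luksDump header above is what the later "Check LUKS version" assertion inspects. A sketch of how that pair of tasks might be written, assuming a register name of luks_dump (hypothetical; the actual tasks live in test-verify-volume-encryption.yml and are not reproduced in this log):

    # Sketch under assumptions -- "luks_dump" is an invented register name.
    - name: Collect LUKS info for this volume (sketch)
      command: cryptsetup luksDump /dev/mapper/foo-test1
      register: luks_dump
      changed_when: false

    - name: Check LUKS version (sketch)
      assert:
        that:
          - luks_dump.stdout is search('Version:\s+2')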
TASK [Verify the presence/absence of the LUKS device node] *********************
task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:22
Friday 17 January 2025 04:20:47 -0500 (0:00:00.651) 0:10:40.421 ********
ok: [managed-node1] => { "changed": false }
MSG: All assertions passed

TASK [Verify that the raw device is the same as the device if not encrypted] ***
task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:29
Friday 17 January 2025 04:20:47 -0500 (0:00:00.099) 0:10:40.521 ********
ok: [managed-node1] => { "changed": false }
MSG: All assertions passed

TASK [Make sure we got info about the LUKS volume if encrypted] ****************
task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:40
Friday 17 January 2025 04:20:47 -0500 (0:00:00.091) 0:10:40.612 ********
ok: [managed-node1] => { "changed": false }
MSG: All assertions passed

TASK [Verify the LUKS volume's device type if encrypted] ***********************
task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:46
Friday 17 January 2025 04:20:47 -0500 (0:00:00.073) 0:10:40.686 ********
ok: [managed-node1] => { "changed": false }
MSG: All assertions passed

TASK [Check LUKS version] ******************************************************
task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:51
Friday 17 January 2025 04:20:48 -0500 (0:00:00.071) 0:10:40.758 ********
ok: [managed-node1] => { "changed": false }
MSG: All assertions passed

TASK [Check LUKS key size] *****************************************************
task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:63
Friday 17 January 2025 04:20:48 -0500 (0:00:00.088) 0:10:40.846 ********
skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Check LUKS cipher] *******************************************************
task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:75
Friday 17 January 2025 04:20:48 -0500 (0:00:00.062) 0:10:40.909 ********
skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Set test variables] ******************************************************
task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:87
Friday 17 January 2025 04:20:48 -0500 (0:00:00.058) 0:10:40.968 ********
ok: [managed-node1] => { "ansible_facts": { "_storage_test_crypttab_entries": [ "luks-f2c4c01a-e23d-4ad3-a7f9-93d565703bdb /dev/mapper/foo-test1 -" ], "_storage_test_expected_crypttab_entries": "1", "_storage_test_expected_crypttab_key_file": "-" }, "changed": false }

TASK [Check for /etc/crypttab entry] *******************************************
task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:93
Friday 17 January 2025 04:20:48 -0500 (0:00:00.072) 0:10:41.041 ********
ok: [managed-node1] => { "changed": false }
MSG: All assertions passed

TASK [Validate the format of the crypttab entry] *******************************
task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:100
Friday 17 January 2025 04:20:48 -0500 (0:00:00.066) 0:10:41.107 ********
ok: [managed-node1] => { "changed": false }
MSG: All assertions passed
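The crypttab entry being validated has the standard three-field crypttab(5) layout, "<name> <backing-device> <key-file>", where "-" in the key-file column means the passphrase is prompted for rather than read from a file. A field-level sketch of such a validation, built only from the facts set above (illustrative; not the task's actual source):

    # Illustrative only; field positions follow crypttab(5).
    - name: Validate the format of the crypttab entry (sketch)
      assert:
        that:
          - _storage_test_crypttab_entries | length == _storage_test_expected_crypttab_entries | int
          - _storage_test_crypttab_entries[0].split()[1] == '/dev/mapper/foo-test1'
          - _storage_test_crypttab_entries[0].split()[2] == _storage_test_expected_crypttab_key_file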
TASK [Check backing device of crypttab entry] **********************************
task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:108
Friday 17 January 2025 04:20:48 -0500 (0:00:00.069) 0:10:41.177 ********
ok: [managed-node1] => { "changed": false }
MSG: All assertions passed

TASK [Check key file of crypttab entry] ****************************************
task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:116
Friday 17 January 2025 04:20:48 -0500 (0:00:00.076) 0:10:41.253 ********
ok: [managed-node1] => { "changed": false }
MSG: All assertions passed

TASK [Clear test variables] ****************************************************
task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:124
Friday 17 January 2025 04:20:48 -0500 (0:00:00.102) 0:10:41.356 ********
ok: [managed-node1] => { "ansible_facts": { "_storage_test_crypttab_entries": null, "_storage_test_expected_crypttab_entries": null, "_storage_test_expected_crypttab_key_file": null }, "changed": false }

TASK [Get information about RAID] **********************************************
task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:8
Friday 17 January 2025 04:20:48 -0500 (0:00:00.147) 0:10:41.503 ********
skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Set active devices regex] ************************************************
task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:14
Friday 17 January 2025 04:20:48 -0500 (0:00:00.131) 0:10:41.635 ********
skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Set spare devices regex] *************************************************
task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:19
Friday 17 January 2025 04:20:49 -0500 (0:00:00.129) 0:10:41.765 ********
skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Set md version regex] ****************************************************
task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:24
Friday 17 January 2025 04:20:49 -0500 (0:00:00.060) 0:10:41.825 ********
skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Set chunk size regex] ****************************************************
task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:29
Friday 17 January 2025 04:20:49 -0500 (0:00:00.113) 0:10:41.939 ********
skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Parse the chunk size] ****************************************************
task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:37
Friday 17 January 2025 04:20:49 -0500 (0:00:00.101) 0:10:42.040 ********
skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Check RAID active devices count] *****************************************
task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:46
Friday 17 January 2025 04:20:49 -0500 (0:00:00.060) 0:10:42.100 ********
skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Check RAID spare devices count] ******************************************
task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:54
Friday 17 January 2025 04:20:49 -0500 (0:00:00.059) 0:10:42.160 ********
skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Check RAID metadata version] *********************************************
task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:62
Friday 17 January 2025 04:20:49 -0500 (0:00:00.064) 0:10:42.224 ********
skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Check RAID chunk size] ***************************************************
task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:70
Friday 17 January 2025 04:20:49 -0500 (0:00:00.068) 0:10:42.293 ********
skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Parse the actual size of the volume] *************************************
task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:3
Friday 17 January 2025 04:20:49 -0500 (0:00:00.105) 0:10:42.398 ********
ok: [managed-node1] => { "bytes": 4294967296, "changed": false, "lvm": "4g", "parted": "4GiB", "size": "4 GiB" }

TASK [Parse the requested size of the volume] **********************************
task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:11
Friday 17 January 2025 04:20:50 -0500 (0:00:00.703) 0:10:43.102 ********
ok: [managed-node1] => { "bytes": 4294967296, "changed": false, "lvm": "4g", "parted": "4GiB", "size": "4 GiB" }

TASK [Establish base value for expected size] **********************************
task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:20
Friday 17 January 2025 04:20:50 -0500 (0:00:00.566) 0:10:43.669 ********
ok: [managed-node1] => { "ansible_facts": { "storage_test_expected_size": "4294967296" }, "changed": false }

TASK [Show expected size] ******************************************************
task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:28
Friday 17 January 2025 04:20:51 -0500 (0:00:00.079) 0:10:43.748 ********
ok: [managed-node1] => { "storage_test_expected_size": "4294967296" }

TASK [Get the size of parent/pool device] **************************************
task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:32
Friday 17 January 2025 04:20:51 -0500 (0:00:00.134) 0:10:43.883 ********
ok: [managed-node1] => { "bytes": 10737418240, "changed": false, "lvm": "10g", "parted": "10GiB", "size": "10 GiB" }

TASK [Show test pool] **********************************************************
task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:46
Friday 17 January 2025 04:20:51 -0500 (0:00:00.669) 0:10:44.552 ********
skipping: [managed-node1] => {}

TASK [Show test blockinfo] *****************************************************
task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:50
Friday 17 January 2025 04:20:51 -0500 (0:00:00.108) 0:10:44.661 ********
skipping: [managed-node1] => {}

TASK [Show test pool size] *****************************************************
task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:54
Friday 17 January 2025 04:20:52 -0500 (0:00:00.081) 0:10:44.742 ********
skipping: [managed-node1] => {}

TASK [Calculate the expected size based on pool size and percentage value] *****
task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:58
Friday 17 January 2025 04:20:52 -0500 (0:00:00.090) 0:10:44.833 ********
skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Default thin pool reserved space values] *********************************
task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:67
Friday 17 January 2025 04:20:52 -0500 (0:00:00.079) 0:10:44.912 ********
skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Default minimal thin pool reserved space size] ***************************
task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:71
Friday 17 January 2025 04:20:52 -0500 (0:00:00.069) 0:10:44.982 ********
skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Default maximal thin pool reserved space size] ***************************
task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:76
Friday 17 January 2025 04:20:52 -0500 (0:00:00.058) 0:10:45.040 ********
skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Calculate maximum usable space in thin pool] *****************************
task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:82
Friday 17 January 2025 04:20:52 -0500 (0:00:00.062) 0:10:45.102 ********
skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Apply upper size limit to max usable thin pool space] ********************
task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:86
Friday 17 January 2025 04:20:52 -0500 (0:00:00.062) 0:10:45.165 ********
skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Apply lower size limit to max usable thin pool space] ********************
task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:91
Friday 17 January 2025 04:20:52 -0500 (0:00:00.082) 0:10:45.247 ********
skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Convert maximum usable thin pool space from int to Size] *****************
task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:96
Friday 17 January 2025 04:20:52 -0500 (0:00:00.059) 0:10:45.307 ********
skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Show max thin pool size] *************************************************
task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:101
Friday 17 January 2025 04:20:52 -0500 (0:00:00.103) 0:10:45.410 ********
skipping: [managed-node1] => {}

TASK [Show volume thin pool size] **********************************************
task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:105
Friday 17 January 2025 04:20:52 -0500 (0:00:00.120) 0:10:45.531 ********
skipping: [managed-node1] => {}

TASK [Show test volume size] ***************************************************
task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:109
Friday 17 January 2025 04:20:52 -0500 (0:00:00.131) 0:10:45.662 ********
skipping: [managed-node1] => {}

TASK [Establish base value for expected thin pool size] ************************
task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:113
Friday 17 January 2025 04:20:53 -0500 (0:00:00.056) 0:10:45.719 ********
skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Calculate the expected size based on pool size and percentage value] *****
task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:120
Friday 17 January 2025 04:20:53 -0500 (0:00:00.061) 0:10:45.781 ********
skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Establish base value for expected thin pool volume size] *****************
task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:127
Friday 17 January 2025 04:20:53 -0500 (0:00:00.062) 0:10:45.843 ********
skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Calculate the expected thin pool volume size based on percentage value] ***
task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:131
Friday 17 January 2025 04:20:53 -0500 (0:00:00.079) 0:10:45.922 ********
skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Replace expected volume size with calculated value] **********************
task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:137
Friday 17 January 2025 04:20:53 -0500 (0:00:00.075) 0:10:45.998 ********
skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Show actual size] ********************************************************
task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:143
Friday 17 January 2025 04:20:53 -0500 (0:00:00.095) 0:10:46.093 ********
ok: [managed-node1] => { "storage_test_actual_size": { "bytes": 4294967296, "changed": false, "failed": false, "lvm": "4g", "parted": "4GiB", "size": "4 GiB" } }

TASK [Show expected size] ******************************************************
task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:147
Friday 17 January 2025 04:20:53 -0500 (0:00:00.073) 0:10:46.167 ********
ok: [managed-node1] => { "storage_test_expected_size": "4294967296" }

TASK [Assert expected size is actual size] *************************************
task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:151
Friday 17 January 2025 04:20:53 -0500 (0:00:00.076) 0:10:46.243 ********
ok: [managed-node1] => { "changed": false }
MSG: All assertions passed

TASK [Get information about the LV] ********************************************
task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:5
Friday 17 January 2025 04:20:53 -0500 (0:00:00.084) 0:10:46.327 ********
ok: [managed-node1] => { "changed": false, "cmd": [ "lvs", "--noheadings", "--nameprefixes", "--units=b", "--nosuffix", "--unquoted", "-o", "name,attr,cache_total_blocks,chunk_size,segtype", "foo/test1" ], "delta": "0:00:00.017640", "end": "2025-01-17 04:20:54.166653", "rc": 0, "start": "2025-01-17 04:20:54.149013" }

STDOUT:

  LVM2_LV_NAME=test1 LVM2_LV_ATTR=-wi-ao---- LVM2_CACHE_TOTAL_BLOCKS= LVM2_CHUNK_SIZE=0 LVM2_SEGTYPE=linear

TASK [Set LV segment type] *****************************************************
task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:13
Friday 17 January 2025 04:20:54 -0500 (0:00:00.703) 0:10:47.031 ********
ok: [managed-node1] => { "ansible_facts": { "storage_test_lv_segtype": [ "linear" ] }, "changed": false }

TASK [Check segment type] ******************************************************
task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:17
Friday 17 January 2025 04:20:54 -0500 (0:00:00.113) 0:10:47.145 ********
ok: [managed-node1] => { "changed": false }
MSG: All assertions passed

TASK [Set LV cache size] *******************************************************
task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:24
Friday 17 January 2025 04:20:54 -0500 (0:00:00.119) 0:10:47.264 ********
skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Parse the requested cache size] ******************************************
task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:31
Friday 17 January 2025 04:20:54 -0500 (0:00:00.098) 0:10:47.362 ********
skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Set expected cache size] *************************************************
task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:37
Friday 17 January 2025 04:20:54 -0500 (0:00:00.121) 0:10:47.484 ********
skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Check cache size] ********************************************************
task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:42
Friday 17 January 2025 04:20:54 -0500 (0:00:00.139) 0:10:47.623 ********
skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" }
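With --nameprefixes, lvs emits shell-style KEY=VALUE pairs, which is what makes the segment-type fact above easy to derive. A sketch of that derivation, assuming a register named lvs_out for the lvs command (a hypothetical name; the real task file is not shown in this log):

    # Sketch; "lvs_out" is an invented register name for the lvs call above.
    # --nameprefixes output is KEY=VALUE, so split/select/map suffices.
    - name: Set LV segment type (sketch)
      set_fact:
        storage_test_lv_segtype: "{{ lvs_out.stdout.split()
          | select('match', '^LVM2_SEGTYPE=')
          | map('regex_replace', '^LVM2_SEGTYPE=', '')
          | list }}"

On the output shown above this yields ["linear"], matching the fact recorded in the log.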
TASK [Clean up facts] **********************************************************
task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume.yml:25
Friday 17 January 2025 04:20:55 -0500 (0:00:00.089) 0:10:47.713 ********
ok: [managed-node1] => { "ansible_facts": { "_storage_test_volume_present": null }, "changed": false }

TASK [Verify the volumes with no pool were correctly managed] ******************
task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:44
Friday 17 January 2025 04:20:55 -0500 (0:00:00.088) 0:10:47.802 ********

TASK [Clean up variable namespace] *********************************************
task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:54
Friday 17 January 2025 04:20:55 -0500 (0:00:00.061) 0:10:47.864 ********
ok: [managed-node1] => { "ansible_facts": { "storage_test_blkinfo": null, "storage_test_crypttab": null, "storage_test_fstab": null }, "changed": false }

TASK [Clean up] ****************************************************************
task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/tests_luks2.yml:527
Friday 17 January 2025 04:20:55 -0500 (0:00:00.091) 0:10:47.955 ********

TASK [fedora.linux_system_roles.storage : Set platform/version specific variables] ***
task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:2
Friday 17 January 2025 04:20:55 -0500 (0:00:00.210) 0:10:48.166 ********
included: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml for managed-node1

TASK [fedora.linux_system_roles.storage : Ensure ansible_facts used by role] ***
task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:2
Friday 17 January 2025 04:20:55 -0500 (0:00:00.093) 0:10:48.259 ********
skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [fedora.linux_system_roles.storage : Set platform/version specific variables] ***
task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:7
Friday 17 January 2025 04:20:55 -0500 (0:00:00.077) 0:10:48.337 ********
skipping: [managed-node1] => (item=RedHat.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat.yml", "skip_reason": "Conditional result was False" }
skipping: [managed-node1] => (item=CentOS.yml) => { "ansible_loop_var": "item", "changed": false, "item": "CentOS.yml", "skip_reason": "Conditional result was False" }
ok: [managed-node1] => (item=CentOS_7.yml) => { "ansible_facts": { "__storage_blivet_diskvolume_mkfs_option_map": { "ext2": "-F", "ext3": "-F", "ext4": "-F" }, "blivet_package_list": [ "python-enum34", "python-blivet3", "libblockdev-crypto", "libblockdev-dm", "libblockdev-lvm", "libblockdev-mdraid", "libblockdev-swap", "{{ 'libblockdev-s390' if ansible_architecture == 's390x' else 'libblockdev' }}" ] }, "ansible_included_var_files": [ "/tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/vars/CentOS_7.yml" ], "ansible_loop_var": "item", "changed": false, "item": "CentOS_7.yml" }
skipping: [managed-node1] => (item=CentOS_7.9.yml) => { "ansible_loop_var": "item", "changed": false, "item": "CentOS_7.9.yml", "skip_reason": "Conditional result was False" }

TASK [fedora.linux_system_roles.storage : Check if system is ostree] ***********
task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:25
Friday 17 January 2025 04:20:55 -0500 (0:00:00.151) 0:10:48.488 ********
skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [fedora.linux_system_roles.storage : Set flag to indicate system is ostree] ***
task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:30
Friday 17 January 2025 04:20:55 -0500 (0:00:00.075) 0:10:48.564 ********
skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [fedora.linux_system_roles.storage : Define an empty list of pools to be used in testing] ***
task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:5
Friday 17 January 2025 04:20:56 -0500 (0:00:00.181) 0:10:48.745 ********
ok: [managed-node1] => { "ansible_facts": { "_storage_pools_list": [] }, "changed": false }

TASK [fedora.linux_system_roles.storage : Define an empty list of volumes to be used in testing] ***
task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:9
Friday 17 January 2025 04:20:56 -0500 (0:00:00.057) 0:10:48.803 ********
ok: [managed-node1] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false }

TASK [fedora.linux_system_roles.storage : Include the appropriate provider tasks] ***
task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:13
Friday 17 January 2025 04:20:56 -0500 (0:00:00.067) 0:10:48.871 ********
included: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml for managed-node1

TASK [fedora.linux_system_roles.storage : Make sure blivet is available] *******
task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:2
Friday 17 January 2025 04:20:56 -0500 (0:00:00.245) 0:10:49.116 ********
ok: [managed-node1] => { "changed": false, "rc": 0, "results": [ "python-enum34-1.0.4-1.el7.noarch providing python-enum34 is already installed", "1:python2-blivet3-3.1.3-3.el7.noarch providing python-blivet3 is already installed", "libblockdev-crypto-2.18-5.el7.x86_64 providing libblockdev-crypto is already installed", "libblockdev-dm-2.18-5.el7.x86_64 providing libblockdev-dm is already installed", "libblockdev-lvm-2.18-5.el7.x86_64 providing libblockdev-lvm is already installed", "libblockdev-mdraid-2.18-5.el7.x86_64 providing libblockdev-mdraid is already installed", "libblockdev-swap-2.18-5.el7.x86_64 providing libblockdev-swap is already installed", "libblockdev-2.18-5.el7.x86_64 providing libblockdev is already installed" ] }

TASK [fedora.linux_system_roles.storage : Show storage_pools] ******************
task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:9
Friday 17 January 2025 04:21:00 -0500 (0:00:04.130) 0:10:53.247 ********
ok: [managed-node1] => { "storage_pools": "VARIABLE IS NOT DEFINED!: 'storage_pools' is undefined" }

TASK [fedora.linux_system_roles.storage : Show storage_volumes] ****************
task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:14
Friday 17 January 2025 04:21:00 -0500 (0:00:00.070) 0:10:53.318 ********
ok: [managed-node1] => { "storage_volumes": [ { "disks": [ "sda" ], "name": "foo", "state": "absent", "type": "disk" } ] }
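The storage_volumes value printed above implies a cleanup invocation along these lines; this is a sketch reconstructed from the logged variable, not the literal contents of tests_luks2.yml:527, which may differ in layout:

    # Reconstructed from the "Show storage_volumes" output above.
    - name: Clean up (sketch)
      include_role:
        name: fedora.linux_system_roles.storage
      vars:
        storage_volumes:
          - name: foo
            type: disk
            disks:
              - sda
            state: absent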
"type": "disk" } ] } TASK [fedora.linux_system_roles.storage : Get required packages] *************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:19 Friday 17 January 2025 04:21:00 -0500 (0:00:00.126) 0:10:53.444 ******** ok: [managed-node1] => { "actions": [], "changed": false, "crypts": [], "leaves": [], "mounts": [], "packages": [], "pools": [], "volumes": [] } TASK [fedora.linux_system_roles.storage : Enable copr repositories if needed] *** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:31 Friday 17 January 2025 04:21:04 -0500 (0:00:04.141) 0:10:57.585 ******** included: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml for managed-node1 TASK [fedora.linux_system_roles.storage : Check if the COPR support packages should be installed] *** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:2 Friday 17 January 2025 04:21:04 -0500 (0:00:00.109) 0:10:57.694 ******** TASK [fedora.linux_system_roles.storage : Make sure COPR support packages are present] *** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:13 Friday 17 January 2025 04:21:05 -0500 (0:00:00.057) 0:10:57.752 ******** skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Enable COPRs] ************************ task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:19 Friday 17 January 2025 04:21:05 -0500 (0:00:00.075) 0:10:57.828 ******** TASK [fedora.linux_system_roles.storage : Make sure required packages are installed] *** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:37 Friday 17 January 2025 04:21:05 -0500 (0:00:00.056) 0:10:57.884 ******** ok: [managed-node1] => { "changed": false, "rc": 0, "results": [ "kpartx-0.4.9-136.el7_9.x86_64 providing kpartx is already installed" ] } TASK [fedora.linux_system_roles.storage : Get service facts] ******************* task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:51 Friday 17 January 2025 04:21:05 -0500 (0:00:00.774) 0:10:58.659 ******** ok: [managed-node1] => { "ansible_facts": { "services": { "NetworkManager-dispatcher.service": { "name": "NetworkManager-dispatcher.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "NetworkManager-wait-online.service": { "name": "NetworkManager-wait-online.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "NetworkManager.service": { "name": "NetworkManager.service", "source": "systemd", "state": "running", "status": "enabled" }, "arp-ethers.service": { "name": "arp-ethers.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "auditd.service": { "name": "auditd.service", "source": "systemd", "state": "running", "status": "enabled" }, "auth-rpcgss-module.service": { "name": "auth-rpcgss-module.service", "source": "systemd", "state": "stopped", "status": "static" }, "autovt@.service": { "name": "autovt@.service", "source": "systemd", "state": "unknown", "status": "enabled" }, "blivet.service": { "name": "blivet.service", "source": "systemd", "state": "inactive", "status": "static" }, 
"blk-availability.service": { "name": "blk-availability.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "brandbot.service": { "name": "brandbot.service", "source": "systemd", "state": "inactive", "status": "static" }, "chrony-dnssrv@.service": { "name": "chrony-dnssrv@.service", "source": "systemd", "state": "unknown", "status": "static" }, "chrony-wait.service": { "name": "chrony-wait.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "chronyd.service": { "name": "chronyd.service", "source": "systemd", "state": "running", "status": "enabled" }, "cloud-config.service": { "name": "cloud-config.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-final.service": { "name": "cloud-final.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-init-local.service": { "name": "cloud-init-local.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-init.service": { "name": "cloud-init.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "console-getty.service": { "name": "console-getty.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "console-shell.service": { "name": "console-shell.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "container-getty@.service": { "name": "container-getty@.service", "source": "systemd", "state": "unknown", "status": "static" }, "cpupower.service": { "name": "cpupower.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "crond.service": { "name": "crond.service", "source": "systemd", "state": "running", "status": "enabled" }, "dbus-org.freedesktop.hostname1.service": { "name": "dbus-org.freedesktop.hostname1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus-org.freedesktop.import1.service": { "name": "dbus-org.freedesktop.import1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus-org.freedesktop.locale1.service": { "name": "dbus-org.freedesktop.locale1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus-org.freedesktop.login1.service": { "name": "dbus-org.freedesktop.login1.service", "source": "systemd", "state": "active", "status": "static" }, "dbus-org.freedesktop.machine1.service": { "name": "dbus-org.freedesktop.machine1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus-org.freedesktop.nm-dispatcher.service": { "name": "dbus-org.freedesktop.nm-dispatcher.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "dbus-org.freedesktop.timedate1.service": { "name": "dbus-org.freedesktop.timedate1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus.service": { "name": "dbus.service", "source": "systemd", "state": "running", "status": "static" }, "debug-shell.service": { "name": "debug-shell.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "dm-event.service": { "name": "dm-event.service", "source": "systemd", "state": "stopped", "status": "static" }, "dmraid-activation.service": { "name": "dmraid-activation.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "dracut-cmdline.service": { "name": "dracut-cmdline.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-initqueue.service": { "name": "dracut-initqueue.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-mount.service": { 
"name": "dracut-mount.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-mount.service": { "name": "dracut-pre-mount.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-pivot.service": { "name": "dracut-pre-pivot.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-trigger.service": { "name": "dracut-pre-trigger.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-udev.service": { "name": "dracut-pre-udev.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-shutdown.service": { "name": "dracut-shutdown.service", "source": "systemd", "state": "stopped", "status": "static" }, "ebtables.service": { "name": "ebtables.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "emergency.service": { "name": "emergency.service", "source": "systemd", "state": "stopped", "status": "static" }, "firewalld.service": { "name": "firewalld.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "fstrim.service": { "name": "fstrim.service", "source": "systemd", "state": "inactive", "status": "static" }, "getty@.service": { "name": "getty@.service", "source": "systemd", "state": "unknown", "status": "enabled" }, "getty@tty1.service": { "name": "getty@tty1.service", "source": "systemd", "state": "running", "status": "unknown" }, "gssproxy.service": { "name": "gssproxy.service", "source": "systemd", "state": "running", "status": "disabled" }, "halt-local.service": { "name": "halt-local.service", "source": "systemd", "state": "inactive", "status": "static" }, "initrd-cleanup.service": { "name": "initrd-cleanup.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-parse-etc.service": { "name": "initrd-parse-etc.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-switch-root.service": { "name": "initrd-switch-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-udevadm-cleanup-db.service": { "name": "initrd-udevadm-cleanup-db.service", "source": "systemd", "state": "stopped", "status": "static" }, "iprdump.service": { "name": "iprdump.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "iprinit.service": { "name": "iprinit.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "iprupdate.service": { "name": "iprupdate.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "irqbalance.service": { "name": "irqbalance.service", "source": "systemd", "state": "running", "status": "enabled" }, "kdump.service": { "name": "kdump.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "kmod-static-nodes.service": { "name": "kmod-static-nodes.service", "source": "systemd", "state": "stopped", "status": "static" }, "lvm2-lvmetad.service": { "name": "lvm2-lvmetad.service", "source": "systemd", "state": "running", "status": "static" }, "lvm2-lvmpolld.service": { "name": "lvm2-lvmpolld.service", "source": "systemd", "state": "stopped", "status": "static" }, "lvm2-monitor.service": { "name": "lvm2-monitor.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "lvm2-pvscan@.service": { "name": "lvm2-pvscan@.service", "source": "systemd", "state": "unknown", "status": "static" }, "lvm2-pvscan@8:0.service": { "name": "lvm2-pvscan@8:0.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "mdadm-grow-continue@.service": { "name": 
"mdadm-grow-continue@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdadm-last-resort@.service": { "name": "mdadm-last-resort@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdcheck_continue.service": { "name": "mdcheck_continue.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdcheck_start.service": { "name": "mdcheck_start.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdmon@.service": { "name": "mdmon@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdmonitor-oneshot.service": { "name": "mdmonitor-oneshot.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdmonitor.service": { "name": "mdmonitor.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "messagebus.service": { "name": "messagebus.service", "source": "systemd", "state": "active", "status": "static" }, "microcode.service": { "name": "microcode.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "netconsole": { "name": "netconsole", "source": "sysv", "state": "stopped", "status": "disabled" }, "network": { "name": "network", "source": "sysv", "state": "running", "status": "enabled" }, "network.service": { "name": "network.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "nfs-blkmap.service": { "name": "nfs-blkmap.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfs-config.service": { "name": "nfs-config.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-idmap.service": { "name": "nfs-idmap.service", "source": "systemd", "state": "inactive", "status": "static" }, "nfs-idmapd.service": { "name": "nfs-idmapd.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-lock.service": { "name": "nfs-lock.service", "source": "systemd", "state": "inactive", "status": "static" }, "nfs-mountd.service": { "name": "nfs-mountd.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-rquotad.service": { "name": "nfs-rquotad.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfs-secure.service": { "name": "nfs-secure.service", "source": "systemd", "state": "inactive", "status": "static" }, "nfs-server.service": { "name": "nfs-server.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "nfs-utils.service": { "name": "nfs-utils.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs.service": { "name": "nfs.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfslock.service": { "name": "nfslock.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-halt.service": { "name": "plymouth-halt.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "plymouth-kexec.service": { "name": "plymouth-kexec.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "plymouth-poweroff.service": { "name": "plymouth-poweroff.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "plymouth-quit-wait.service": { "name": "plymouth-quit-wait.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "plymouth-quit.service": { "name": "plymouth-quit.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "plymouth-read-write.service": { "name": "plymouth-read-write.service", "source": "systemd", "state": "stopped", "status": "disabled" }, 
"plymouth-reboot.service": { "name": "plymouth-reboot.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "plymouth-start.service": { "name": "plymouth-start.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "plymouth-switch-root.service": { "name": "plymouth-switch-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "polkit.service": { "name": "polkit.service", "source": "systemd", "state": "running", "status": "static" }, "postfix.service": { "name": "postfix.service", "source": "systemd", "state": "running", "status": "enabled" }, "qemu-guest-agent.service": { "name": "qemu-guest-agent.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "quotaon.service": { "name": "quotaon.service", "source": "systemd", "state": "inactive", "status": "static" }, "rc-local.service": { "name": "rc-local.service", "source": "systemd", "state": "stopped", "status": "static" }, "rdisc.service": { "name": "rdisc.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rescue.service": { "name": "rescue.service", "source": "systemd", "state": "stopped", "status": "static" }, "restraintd.service": { "name": "restraintd.service", "source": "systemd", "state": "running", "status": "enabled" }, "rhel-autorelabel-mark.service": { "name": "rhel-autorelabel-mark.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "rhel-autorelabel.service": { "name": "rhel-autorelabel.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "rhel-configure.service": { "name": "rhel-configure.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "rhel-dmesg.service": { "name": "rhel-dmesg.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "rhel-domainname.service": { "name": "rhel-domainname.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "rhel-import-state.service": { "name": "rhel-import-state.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "rhel-loadmodules.service": { "name": "rhel-loadmodules.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "rhel-readonly.service": { "name": "rhel-readonly.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "rngd.service": { "name": "rngd.service", "source": "systemd", "state": "running", "status": "enabled" }, "rpc-gssd.service": { "name": "rpc-gssd.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-rquotad.service": { "name": "rpc-rquotad.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rpc-statd-notify.service": { "name": "rpc-statd-notify.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-statd.service": { "name": "rpc-statd.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpcbind.service": { "name": "rpcbind.service", "source": "systemd", "state": "running", "status": "enabled" }, "rpcgssd.service": { "name": "rpcgssd.service", "source": "systemd", "state": "inactive", "status": "static" }, "rpcidmapd.service": { "name": "rpcidmapd.service", "source": "systemd", "state": "inactive", "status": "static" }, "rsyncd.service": { "name": "rsyncd.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rsyncd@.service": { "name": "rsyncd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "rsyslog.service": { "name": "rsyslog.service", "source": "systemd", 
"state": "running", "status": "enabled" }, "selinux-policy-migrate-local-changes@.service": { "name": "selinux-policy-migrate-local-changes@.service", "source": "systemd", "state": "unknown", "status": "static" }, "selinux-policy-migrate-local-changes@targeted.service": { "name": "selinux-policy-migrate-local-changes@targeted.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "serial-getty@.service": { "name": "serial-getty@.service", "source": "systemd", "state": "unknown", "status": "disabled" }, "serial-getty@ttyS0.service": { "name": "serial-getty@ttyS0.service", "source": "systemd", "state": "running", "status": "unknown" }, "sshd-keygen.service": { "name": "sshd-keygen.service", "source": "systemd", "state": "stopped", "status": "static" }, "sshd.service": { "name": "sshd.service", "source": "systemd", "state": "running", "status": "enabled" }, "sshd@.service": { "name": "sshd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-ask-password-console.service": { "name": "systemd-ask-password-console.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-ask-password-plymouth.service": { "name": "systemd-ask-password-plymouth.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-ask-password-wall.service": { "name": "systemd-ask-password-wall.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-backlight@.service": { "name": "systemd-backlight@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-binfmt.service": { "name": "systemd-binfmt.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-bootchart.service": { "name": "systemd-bootchart.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "systemd-firstboot.service": { "name": "systemd-firstboot.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-fsck-root.service": { "name": "systemd-fsck-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-fsck@.service": { "name": "systemd-fsck@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-halt.service": { "name": "systemd-halt.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hibernate-resume@.service": { "name": "systemd-hibernate-resume@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-hibernate.service": { "name": "systemd-hibernate.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hostnamed.service": { "name": "systemd-hostnamed.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hwdb-update.service": { "name": "systemd-hwdb-update.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-hybrid-sleep.service": { "name": "systemd-hybrid-sleep.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-importd.service": { "name": "systemd-importd.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-initctl.service": { "name": "systemd-initctl.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journal-catalog-update.service": { "name": "systemd-journal-catalog-update.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journal-flush.service": { "name": "systemd-journal-flush.service", "source": "systemd", "state": 
"stopped", "status": "static" }, "systemd-journald.service": { "name": "systemd-journald.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-kexec.service": { "name": "systemd-kexec.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-localed.service": { "name": "systemd-localed.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-logind.service": { "name": "systemd-logind.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-machine-id-commit.service": { "name": "systemd-machine-id-commit.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-machined.service": { "name": "systemd-machined.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-modules-load.service": { "name": "systemd-modules-load.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-nspawn@.service": { "name": "systemd-nspawn@.service", "source": "systemd", "state": "unknown", "status": "disabled" }, "systemd-poweroff.service": { "name": "systemd-poweroff.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-quotacheck.service": { "name": "systemd-quotacheck.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-random-seed.service": { "name": "systemd-random-seed.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-readahead-collect.service": { "name": "systemd-readahead-collect.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "systemd-readahead-done.service": { "name": "systemd-readahead-done.service", "source": "systemd", "state": "stopped", "status": "indirect" }, "systemd-readahead-drop.service": { "name": "systemd-readahead-drop.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "systemd-readahead-replay.service": { "name": "systemd-readahead-replay.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "systemd-reboot.service": { "name": "systemd-reboot.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-remount-fs.service": { "name": "systemd-remount-fs.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-rfkill@.service": { "name": "systemd-rfkill@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-shutdownd.service": { "name": "systemd-shutdownd.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-suspend.service": { "name": "systemd-suspend.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-sysctl.service": { "name": "systemd-sysctl.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-timedated.service": { "name": "systemd-timedated.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-tmpfiles-clean.service": { "name": "systemd-tmpfiles-clean.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles-setup-dev.service": { "name": "systemd-tmpfiles-setup-dev.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles-setup.service": { "name": "systemd-tmpfiles-setup.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udev-settle.service": { "name": "systemd-udev-settle.service", "source": "systemd", "state": "stopped", "status": "static" }, 
"systemd-udev-trigger.service": { "name": "systemd-udev-trigger.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udevd.service": { "name": "systemd-udevd.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-update-done.service": { "name": "systemd-update-done.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-update-utmp-runlevel.service": { "name": "systemd-update-utmp-runlevel.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-update-utmp.service": { "name": "systemd-update-utmp.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-user-sessions.service": { "name": "systemd-user-sessions.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-vconsole-setup.service": { "name": "systemd-vconsole-setup.service", "source": "systemd", "state": "stopped", "status": "static" }, "target.service": { "name": "target.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "targetclid.service": { "name": "targetclid.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "teamd@.service": { "name": "teamd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "tuned.service": { "name": "tuned.service", "source": "systemd", "state": "running", "status": "enabled" }, "wpa_supplicant.service": { "name": "wpa_supplicant.service", "source": "systemd", "state": "inactive", "status": "disabled" } } }, "changed": false } TASK [fedora.linux_system_roles.storage : Set storage_cryptsetup_services] ***** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:57 Friday 17 January 2025 04:21:07 -0500 (0:00:01.128) 0:10:59.788 ******** ok: [managed-node1] => { "ansible_facts": { "storage_cryptsetup_services": [] }, "changed": false } TASK [fedora.linux_system_roles.storage : Mask the systemd cryptsetup services] *** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:63 Friday 17 January 2025 04:21:07 -0500 (0:00:00.085) 0:10:59.874 ******** TASK [fedora.linux_system_roles.storage : Manage the pools and volumes to match the specified state] *** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:69 Friday 17 January 2025 04:21:07 -0500 (0:00:00.068) 0:10:59.943 ******** changed: [managed-node1] => { "actions": [ { "action": "destroy format", "device": "/dev/mapper/luks-f2c4c01a-e23d-4ad3-a7f9-93d565703bdb", "fs_type": "xfs" }, { "action": "destroy device", "device": "/dev/mapper/luks-f2c4c01a-e23d-4ad3-a7f9-93d565703bdb", "fs_type": null }, { "action": "destroy format", "device": "/dev/mapper/foo-test1", "fs_type": "luks" }, { "action": "destroy device", "device": "/dev/mapper/foo-test1", "fs_type": null }, { "action": "destroy device", "device": "/dev/foo", "fs_type": null }, { "action": "destroy format", "device": "/dev/sda", "fs_type": "lvmpv" } ], "changed": true, "crypts": [ { "backing_device": "/dev/mapper/foo-test1", "name": "luks-f2c4c01a-e23d-4ad3-a7f9-93d565703bdb", "password": "-", "state": "absent" } ], "leaves": [ "/dev/sda", "/dev/sdb", "/dev/sdc", "/dev/sdd", "/dev/sde", "/dev/sdf", "/dev/sdg", "/dev/sdh", "/dev/sdi", "/dev/xvda1" ], "mounts": [ { "fstype": "xfs", "path": "/opt/test1", "src": "/dev/mapper/luks-f2c4c01a-e23d-4ad3-a7f9-93d565703bdb", "state": "absent" } ], "packages": [ 
"e2fsprogs" ], "pools": [], "volumes": [ { "_device": "/dev/sda", "_mount_id": "UUID=30b7yk-4Y3c-DHs7-HP7U-2G7i-eOL6-HqzhXx", "_raw_device": "/dev/sda", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "lvmpv", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": null, "mount_user": null, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": 10737418240, "state": "absent", "thin": null, "thin_pool_name": null, "thin_pool_size": null, "type": "disk", "vdo_pool_size": null } ] } TASK [fedora.linux_system_roles.storage : Workaround for udev issue on some platforms] *** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:83 Friday 17 January 2025 04:21:42 -0500 (0:00:34.890) 0:11:34.833 ******** skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Check if /etc/fstab is present] ****** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:90 Friday 17 January 2025 04:21:42 -0500 (0:00:00.039) 0:11:34.873 ******** ok: [managed-node1] => { "changed": false, "stat": { "atime": 1737105631.919281, "attr_flags": "e", "attributes": [ "extents" ], "block_size": 4096, "blocks": 8, "charset": "us-ascii", "checksum": "a075f31a35f98b6235434fe7a3de0351c065a7bd", "ctime": 1737105631.916281, "dev": 51713, "device_type": 0, "executable": false, "exists": true, "gid": 0, "gr_name": "root", "inode": 263644, "isblk": false, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": true, "issock": false, "isuid": false, "mimetype": "text/plain", "mode": "0644", "mtime": 1737105631.916281, "nlink": 1, "path": "/etc/fstab", "pw_name": "root", "readable": true, "rgrp": true, "roth": true, "rusr": true, "size": 1311, "uid": 0, "version": "18446744072031193646", "wgrp": false, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [fedora.linux_system_roles.storage : Add fingerprint to /etc/fstab if present] *** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:95 Friday 17 January 2025 04:21:42 -0500 (0:00:00.393) 0:11:35.267 ******** ok: [managed-node1] => { "backup": "", "changed": false } TASK [fedora.linux_system_roles.storage : Unmask the systemd cryptsetup services] *** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:113 Friday 17 January 2025 04:21:43 -0500 (0:00:00.466) 0:11:35.733 ******** TASK [fedora.linux_system_roles.storage : Show blivet_output] ****************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:119 Friday 17 January 2025 04:21:43 -0500 (0:00:00.090) 0:11:35.824 ******** ok: [managed-node1] => { "blivet_output": { "actions": [ { "action": "destroy format", 
"device": "/dev/mapper/luks-f2c4c01a-e23d-4ad3-a7f9-93d565703bdb", "fs_type": "xfs" }, { "action": "destroy device", "device": "/dev/mapper/luks-f2c4c01a-e23d-4ad3-a7f9-93d565703bdb", "fs_type": null }, { "action": "destroy format", "device": "/dev/mapper/foo-test1", "fs_type": "luks" }, { "action": "destroy device", "device": "/dev/mapper/foo-test1", "fs_type": null }, { "action": "destroy device", "device": "/dev/foo", "fs_type": null }, { "action": "destroy format", "device": "/dev/sda", "fs_type": "lvmpv" } ], "changed": true, "crypts": [ { "backing_device": "/dev/mapper/foo-test1", "name": "luks-f2c4c01a-e23d-4ad3-a7f9-93d565703bdb", "password": "-", "state": "absent" } ], "failed": false, "leaves": [ "/dev/sda", "/dev/sdb", "/dev/sdc", "/dev/sdd", "/dev/sde", "/dev/sdf", "/dev/sdg", "/dev/sdh", "/dev/sdi", "/dev/xvda1" ], "mounts": [ { "fstype": "xfs", "path": "/opt/test1", "src": "/dev/mapper/luks-f2c4c01a-e23d-4ad3-a7f9-93d565703bdb", "state": "absent" } ], "packages": [ "e2fsprogs" ], "pools": [], "volumes": [ { "_device": "/dev/sda", "_mount_id": "UUID=30b7yk-4Y3c-DHs7-HP7U-2G7i-eOL6-HqzhXx", "_raw_device": "/dev/sda", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "lvmpv", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": null, "mount_user": null, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": 10737418240, "state": "absent", "thin": null, "thin_pool_name": null, "thin_pool_size": null, "type": "disk", "vdo_pool_size": null } ] } } TASK [fedora.linux_system_roles.storage : Set the list of pools for test verification] *** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:128 Friday 17 January 2025 04:21:43 -0500 (0:00:00.092) 0:11:35.917 ******** ok: [managed-node1] => { "ansible_facts": { "_storage_pools_list": [] }, "changed": false } TASK [fedora.linux_system_roles.storage : Set the list of volumes for test verification] *** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:132 Friday 17 January 2025 04:21:43 -0500 (0:00:00.080) 0:11:35.997 ******** ok: [managed-node1] => { "ansible_facts": { "_storage_volumes_list": [ { "_device": "/dev/sda", "_mount_id": "UUID=30b7yk-4Y3c-DHs7-HP7U-2G7i-eOL6-HqzhXx", "_raw_device": "/dev/sda", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "lvmpv", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": null, "mount_user": null, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, 
"raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": 10737418240, "state": "absent", "thin": null, "thin_pool_name": null, "thin_pool_size": null, "type": "disk", "vdo_pool_size": null } ] }, "changed": false } TASK [fedora.linux_system_roles.storage : Remove obsolete mounts] ************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:148 Friday 17 January 2025 04:21:43 -0500 (0:00:00.073) 0:11:36.071 ******** changed: [managed-node1] => (item={u'src': u'/dev/mapper/luks-f2c4c01a-e23d-4ad3-a7f9-93d565703bdb', u'state': u'absent', u'fstype': u'xfs', u'path': u'/opt/test1'}) => { "ansible_loop_var": "mount_info", "changed": true, "dump": "0", "fstab": "/etc/fstab", "fstype": "xfs", "mount_info": { "fstype": "xfs", "path": "/opt/test1", "src": "/dev/mapper/luks-f2c4c01a-e23d-4ad3-a7f9-93d565703bdb", "state": "absent" }, "name": "/opt/test1", "opts": "defaults", "passno": "0", "src": "/dev/mapper/luks-f2c4c01a-e23d-4ad3-a7f9-93d565703bdb" } TASK [fedora.linux_system_roles.storage : Tell systemd to refresh its view of /etc/fstab] *** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:159 Friday 17 January 2025 04:21:43 -0500 (0:00:00.452) 0:11:36.523 ******** ok: [managed-node1] => { "changed": false, "name": null, "status": {} } TASK [fedora.linux_system_roles.storage : Set up new/current mounts] *********** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:164 Friday 17 January 2025 04:21:44 -0500 (0:00:00.560) 0:11:37.083 ******** TASK [fedora.linux_system_roles.storage : Manage mount ownership/permissions] *** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:175 Friday 17 January 2025 04:21:44 -0500 (0:00:00.061) 0:11:37.145 ******** TASK [fedora.linux_system_roles.storage : Tell systemd to refresh its view of /etc/fstab] *** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:187 Friday 17 January 2025 04:21:44 -0500 (0:00:00.057) 0:11:37.202 ******** ok: [managed-node1] => { "changed": false, "name": null, "status": {} } TASK [fedora.linux_system_roles.storage : Retrieve facts for the /etc/crypttab file] *** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:195 Friday 17 January 2025 04:21:45 -0500 (0:00:00.558) 0:11:37.760 ******** ok: [managed-node1] => { "changed": false, "stat": { "atime": 1737105635.094284, "attr_flags": "e", "attributes": [ "extents" ], "block_size": 4096, "blocks": 8, "charset": "us-ascii", "checksum": "02c220ffacc346e181df3d514e1d63c88183ffbd", "ctime": 1737105633.175282, "dev": 51713, "device_type": 0, "executable": false, "exists": true, "gid": 0, "gr_name": "root", "inode": 263648, "isblk": false, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": true, "issock": false, "isuid": false, "mimetype": "text/plain", "mode": "0600", "mtime": 1737105633.174282, "nlink": 1, "path": "/etc/crypttab", "pw_name": "root", "readable": true, "rgrp": false, "roth": false, "rusr": true, "size": 66, "uid": 0, "version": "18446744072031195238", "wgrp": false, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [fedora.linux_system_roles.storage : Manage /etc/crypttab to 
account for changes we just made] *** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:200 Friday 17 January 2025 04:21:45 -0500 (0:00:00.421) 0:11:38.182 ******** changed: [managed-node1] => (item={u'state': u'absent', u'password': u'-', u'name': u'luks-f2c4c01a-e23d-4ad3-a7f9-93d565703bdb', u'backing_device': u'/dev/mapper/foo-test1'}) => { "ansible_loop_var": "entry", "backup": "", "changed": true, "entry": { "backing_device": "/dev/mapper/foo-test1", "name": "luks-f2c4c01a-e23d-4ad3-a7f9-93d565703bdb", "password": "-", "state": "absent" }, "found": 1 } MSG: 1 line(s) removed TASK [fedora.linux_system_roles.storage : Update facts] ************************ task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:222 Friday 17 January 2025 04:21:45 -0500 (0:00:00.445) 0:11:38.627 ******** ok: [managed-node1] TASK [Verify role results] ***************************************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/tests_luks2.yml:537 Friday 17 January 2025 04:21:46 -0500 (0:00:00.796) 0:11:39.424 ******** included: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml for managed-node1 TASK [Print out pool information] ********************************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:2 Friday 17 January 2025 04:21:46 -0500 (0:00:00.142) 0:11:39.566 ******** skipping: [managed-node1] => {} TASK [Print out volume information] ******************************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:7 Friday 17 January 2025 04:21:47 -0500 (0:00:00.157) 0:11:39.724 ******** ok: [managed-node1] => { "_storage_volumes_list": [ { "_device": "/dev/sda", "_mount_id": "UUID=30b7yk-4Y3c-DHs7-HP7U-2G7i-eOL6-HqzhXx", "_raw_device": "/dev/sda", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "lvmpv", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": null, "mount_user": null, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": 10737418240, "state": "absent", "thin": null, "thin_pool_name": null, "thin_pool_size": null, "type": "disk", "vdo_pool_size": null } ] } TASK [Collect info about the volumes.] 
***************************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:15 Friday 17 January 2025 04:21:47 -0500 (0:00:00.073) 0:11:39.797 ******** ok: [managed-node1] => { "changed": false, "info": { "/dev/sda": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sda", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdb": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdb", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdc": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdc", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdd": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdd", "size": "1T", "type": "disk", "uuid": "" }, "/dev/sde": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sde", "size": "1T", "type": "disk", "uuid": "" }, "/dev/sdf": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdf", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdg": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdg", "size": "1T", "type": "disk", "uuid": "" }, "/dev/sdh": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdh", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdi": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdi", "size": "10G", "type": "disk", "uuid": "" }, "/dev/xvda": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/xvda", "size": "250G", "type": "disk", "uuid": "" }, "/dev/xvda1": { "fstype": "ext4", "label": "", "mountpoint": "/", "name": "/dev/xvda1", "size": "250G", "type": "partition", "uuid": "c7b7d6a5-fd01-4b9b-bcca-153eaff9d312" } } } TASK [Read the /etc/fstab file for volume existence] *************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:20 Friday 17 January 2025 04:21:48 -0500 (0:00:01.407) 0:11:41.204 ******** ok: [managed-node1] => { "changed": false, "cmd": [ "cat", "/etc/fstab" ], "delta": "0:00:00.002879", "end": "2025-01-17 04:21:48.834579", "rc": 0, "start": "2025-01-17 04:21:48.831700" } STDOUT: # system_role:storage # # /etc/fstab # Created by anaconda on Thu Jun 20 10:23:46 2024 # # Accessible filesystems, by reference, are maintained under '/dev/disk' # See man pages fstab(5), findfs(8), mount(8) and/or blkid(8) for more info # UUID=c7b7d6a5-fd01-4b9b-bcca-153eaff9d312 / ext4 defaults 1 1 ntap-bos-c01-eng01-nfs01b.storage.bos.redhat.com:/devops_engineering_nfs/devarchive/redhat /mnt/redhat nfs ro,rsize=32768,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0 nest.test.redhat.com:/mnt/qa /mnt/qa nfs defaults,rsize=8192,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0 vtap-eng01.storage.rdu2.redhat.com:/vol/engarchive /mnt/engarchive nfs ro,rsize=32768,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0 nest.test.redhat.com:/mnt/tpsdist /mnt/tpsdist nfs defaults,rsize=8192,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0 ntap-bos-c01-eng01-nfs01b.storage.bos.redhat.com:/devops_engineering_nfs/devarchive/redhat/brewroot /mnt/brew nfs ro,rsize=32768,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0 ntap-bos-c01-eng01-nfs01b.storage.bos.redhat.com:/devops_brew_scratch_nfs/scratch /mnt/brew_scratch nfs ro,rsize=32768,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0 TASK [Read the /etc/crypttab file] ********************************************* task path: 
/tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:25 Friday 17 January 2025 04:21:48 -0500 (0:00:00.403) 0:11:41.608 ******** ok: [managed-node1] => { "changed": false, "cmd": [ "cat", "/etc/crypttab" ], "delta": "0:00:00.002835", "end": "2025-01-17 04:21:49.223906", "failed_when_result": false, "rc": 0, "start": "2025-01-17 04:21:49.221071" } TASK [Verify the volumes listed in storage_pools were correctly managed] ******* task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:34 Friday 17 January 2025 04:21:49 -0500 (0:00:00.394) 0:11:42.002 ******** TASK [Verify the volumes with no pool were correctly managed] ****************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:44 Friday 17 January 2025 04:21:49 -0500 (0:00:00.070) 0:11:42.073 ******** included: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume.yml for managed-node1 TASK [Set storage volume test variables] *************************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume.yml:2 Friday 17 January 2025 04:21:49 -0500 (0:00:00.165) 0:11:42.239 ******** ok: [managed-node1] => { "ansible_facts": { "_storage_test_volume_present": false, "_storage_volume_tests": [ "mount", "fstab", "fs", "device", "encryption", "md", "size", "cache" ] }, "changed": false } TASK [Run test verify for {{ storage_test_volume_subset }}] ******************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume.yml:19 Friday 17 January 2025 04:21:49 -0500 (0:00:00.085) 0:11:42.325 ******** included: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml for managed-node1 included: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml for managed-node1 included: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fs.yml for managed-node1 included: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml for managed-node1 included: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml for managed-node1 included: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml for managed-node1 included: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml for managed-node1 included: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml for managed-node1 TASK [Get expected mount device based on device type] ************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:7 Friday 17 January 2025 04:21:49 -0500 (0:00:00.347) 0:11:42.672 ******** ok: [managed-node1] => { "ansible_facts": { "storage_test_device_path": "/dev/sda" }, "changed": false } TASK [Set some facts] ********************************************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:11 Friday 17 January 2025 04:21:50 -0500 
(0:00:00.065) 0:11:42.738 ******** ok: [managed-node1] => { "ansible_facts": { "storage_test_mount_expected_mount_point": "", "storage_test_swap_expected_matches": "0" }, "changed": false } TASK [Get information about the mountpoint directory] ************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:19 Friday 17 January 2025 04:21:50 -0500 (0:00:00.072) 0:11:42.810 ******** skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify the current mount state by device] ******************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:28 Friday 17 January 2025 04:21:50 -0500 (0:00:00.058) 0:11:42.868 ******** skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify mount directory user] ********************************************* task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:36 Friday 17 January 2025 04:21:50 -0500 (0:00:00.051) 0:11:42.920 ******** skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify mount directory group] ******************************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:42 Friday 17 January 2025 04:21:50 -0500 (0:00:00.059) 0:11:42.980 ******** skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify mount directory permissions] ************************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:48 Friday 17 January 2025 04:21:50 -0500 (0:00:00.057) 0:11:43.037 ******** skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Get path of test volume device] ****************************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:57 Friday 17 January 2025 04:21:50 -0500 (0:00:00.055) 0:11:43.093 ******** skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Gather swap info] ******************************************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:63 Friday 17 January 2025 04:21:50 -0500 (0:00:00.056) 0:11:43.149 ******** skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify swap status] ****************************************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:69 Friday 17 January 2025 04:21:50 -0500 (0:00:00.051) 0:11:43.201 ******** skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Unset facts] ************************************************************* task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:79 Friday 17 January 2025 04:21:50 -0500 (0:00:00.052) 0:11:43.253 ******** ok: [managed-node1] => { "ansible_facts": { "storage_test_found_mount_stat": null, 
"storage_test_mount_expected_mount_point": null, "storage_test_swap_expected_matches": null, "storage_test_swaps": null, "storage_test_sys_node": null }, "changed": false } TASK [Set some variables for fstab checking] *********************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:2 Friday 17 January 2025 04:21:50 -0500 (0:00:00.071) 0:11:43.324 ******** ok: [managed-node1] => { "ansible_facts": { "storage_test_fstab_expected_id_matches": "0", "storage_test_fstab_expected_mount_options_matches": "0", "storage_test_fstab_expected_mount_point_matches": "0", "storage_test_fstab_id_matches": [], "storage_test_fstab_mount_options_matches": [], "storage_test_fstab_mount_point_matches": [] }, "changed": false } TASK [Verify that the device identifier appears in /etc/fstab] ***************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:17 Friday 17 January 2025 04:21:50 -0500 (0:00:00.099) 0:11:43.423 ******** skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify the fstab mount point] ******************************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:24 Friday 17 January 2025 04:21:50 -0500 (0:00:00.064) 0:11:43.488 ******** ok: [managed-node1] => { "changed": false } MSG: All assertions passed TASK [Verify mount_options] **************************************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:33 Friday 17 January 2025 04:21:50 -0500 (0:00:00.077) 0:11:43.565 ******** skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify fingerprint] ****************************************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:45 Friday 17 January 2025 04:21:50 -0500 (0:00:00.094) 0:11:43.660 ******** ok: [managed-node1] => { "changed": false } MSG: All assertions passed TASK [Clean up variables] ****************************************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:52 Friday 17 January 2025 04:21:51 -0500 (0:00:00.083) 0:11:43.743 ******** ok: [managed-node1] => { "ansible_facts": { "storage_test_fstab_expected_id_matches": null, "storage_test_fstab_expected_mount_options_matches": null, "storage_test_fstab_expected_mount_point_matches": null, "storage_test_fstab_id_matches": null, "storage_test_fstab_mount_options_matches": null, "storage_test_fstab_mount_point_matches": null }, "changed": false } TASK [Verify fs type] ********************************************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fs.yml:6 Friday 17 January 2025 04:21:51 -0500 (0:00:00.075) 0:11:43.819 ******** skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify fs label] ********************************************************* task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fs.yml:14 Friday 17 January 2025 04:21:51 -0500 (0:00:00.060) 0:11:43.880 ******** skipping: 
[managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [See whether the device node is present] ********************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:3 Friday 17 January 2025 04:21:51 -0500 (0:00:00.054) 0:11:43.934 ******** ok: [managed-node1] => { "changed": false, "stat": { "atime": 1737105701.978356, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 0, "charset": "binary", "ctime": 1737105701.978356, "dev": 5, "device_type": 2048, "executable": false, "exists": true, "gid": 6, "gr_name": "disk", "inode": 28267, "isblk": true, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": false, "issock": false, "isuid": false, "mimetype": "inode/blockdevice", "mode": "0660", "mtime": 1737105701.978356, "nlink": 1, "path": "/dev/sda", "pw_name": "root", "readable": true, "rgrp": true, "roth": false, "rusr": true, "size": 0, "uid": 0, "version": null, "wgrp": true, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [Verify the presence/absence of the device node] ************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:9 Friday 17 January 2025 04:21:51 -0500 (0:00:00.413) 0:11:44.348 ******** ok: [managed-node1] => { "changed": false } MSG: All assertions passed TASK [Verify the presence/absence of the device node] ************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:16 Friday 17 January 2025 04:21:51 -0500 (0:00:00.075) 0:11:44.424 ******** skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Make sure we got info about this volume] ********************************* task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:23 Friday 17 January 2025 04:21:51 -0500 (0:00:00.055) 0:11:44.479 ******** skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Process volume type (set initial value) (1/2)] *************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:29 Friday 17 January 2025 04:21:51 -0500 (0:00:00.048) 0:11:44.527 ******** ok: [managed-node1] => { "ansible_facts": { "st_volume_type": "disk" }, "changed": false } TASK [Process volume type (get RAID value) (2/2)] ****************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:33 Friday 17 January 2025 04:21:51 -0500 (0:00:00.053) 0:11:44.581 ******** skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify the volume's device type] ***************************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:38 Friday 17 January 2025 04:21:51 -0500 (0:00:00.065) 0:11:44.646 ******** skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Stat the LUKS device, if encrypted] ************************************** task path: 
/tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:3 Friday 17 January 2025 04:21:51 -0500 (0:00:00.044) 0:11:44.691 ******** skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Ensure cryptsetup is present] ******************************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:10 Friday 17 January 2025 04:21:52 -0500 (0:00:00.047) 0:11:44.738 ******** ok: [managed-node1] => { "changed": false, "rc": 0, "results": [ "cryptsetup-2.0.3-6.el7.x86_64 providing cryptsetup is already installed" ] } TASK [Collect LUKS info for this volume] *************************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:16 Friday 17 January 2025 04:21:52 -0500 (0:00:00.695) 0:11:45.434 ******** skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify the presence/absence of the LUKS device node] ********************* task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:22 Friday 17 January 2025 04:21:52 -0500 (0:00:00.062) 0:11:45.496 ******** skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify that the raw device is the same as the device if not encrypted] *** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:29 Friday 17 January 2025 04:21:52 -0500 (0:00:00.062) 0:11:45.559 ******** skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Make sure we got info about the LUKS volume if encrypted] **************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:40 Friday 17 January 2025 04:21:52 -0500 (0:00:00.054) 0:11:45.613 ******** skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify the LUKS volume's device type if encrypted] *********************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:46 Friday 17 January 2025 04:21:52 -0500 (0:00:00.078) 0:11:45.692 ******** skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check LUKS version] ****************************************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:51 Friday 17 January 2025 04:21:53 -0500 (0:00:00.057) 0:11:45.749 ******** skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check LUKS key size] ***************************************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:63 Friday 17 January 2025 04:21:53 -0500 (0:00:00.049) 0:11:45.799 ******** skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check LUKS cipher] ******************************************************* task path: 
/tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:75 Friday 17 January 2025 04:21:53 -0500 (0:00:00.047) 0:11:45.847 ******** skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set test variables] ****************************************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:87 Friday 17 January 2025 04:21:53 -0500 (0:00:00.052) 0:11:45.899 ******** ok: [managed-node1] => { "ansible_facts": { "_storage_test_crypttab_entries": [], "_storage_test_expected_crypttab_entries": "0", "_storage_test_expected_crypttab_key_file": "-" }, "changed": false } TASK [Check for /etc/crypttab entry] ******************************************* task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:93 Friday 17 January 2025 04:21:53 -0500 (0:00:00.072) 0:11:45.971 ******** ok: [managed-node1] => { "changed": false } MSG: All assertions passed TASK [Validate the format of the crypttab entry] ******************************* task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:100 Friday 17 January 2025 04:21:53 -0500 (0:00:00.067) 0:11:46.038 ******** skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check backing device of crypttab entry] ********************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:108 Friday 17 January 2025 04:21:53 -0500 (0:00:00.065) 0:11:46.104 ******** skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check key file of crypttab entry] **************************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:116 Friday 17 January 2025 04:21:53 -0500 (0:00:00.052) 0:11:46.156 ******** skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Clear test variables] **************************************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:124 Friday 17 January 2025 04:21:53 -0500 (0:00:00.046) 0:11:46.203 ******** ok: [managed-node1] => { "ansible_facts": { "_storage_test_crypttab_entries": null, "_storage_test_expected_crypttab_entries": null, "_storage_test_expected_crypttab_key_file": null }, "changed": false } TASK [Get information about RAID] ********************************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:8 Friday 17 January 2025 04:21:53 -0500 (0:00:00.046) 0:11:46.250 ******** skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set active devices regex] ************************************************ task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:14 Friday 17 January 2025 04:21:53 -0500 (0:00:00.043) 0:11:46.293 ******** skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set spare devices regex] 
************************************************* task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:19 Friday 17 January 2025 04:21:53 -0500 (0:00:00.039) 0:11:46.333 ******** skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set md version regex] **************************************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:24 Friday 17 January 2025 04:21:53 -0500 (0:00:00.040) 0:11:46.373 ******** skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set chunk size regex] **************************************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:29 Friday 17 January 2025 04:21:53 -0500 (0:00:00.039) 0:11:46.412 ******** skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Parse the chunk size] **************************************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:37 Friday 17 January 2025 04:21:53 -0500 (0:00:00.046) 0:11:46.459 ******** skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check RAID active devices count] ***************************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:46 Friday 17 January 2025 04:21:53 -0500 (0:00:00.053) 0:11:46.512 ******** skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check RAID spare devices count] ****************************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:54 Friday 17 January 2025 04:21:53 -0500 (0:00:00.063) 0:11:46.576 ******** skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check RAID metadata version] ********************************************* task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:62 Friday 17 January 2025 04:21:53 -0500 (0:00:00.056) 0:11:46.633 ******** skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check RAID chunk size] *************************************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:70 Friday 17 January 2025 04:21:53 -0500 (0:00:00.061) 0:11:46.694 ******** skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Parse the actual size of the volume] ************************************* task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:3 Friday 17 January 2025 04:21:54 -0500 (0:00:00.172) 0:11:46.866 ******** skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Parse the requested size of the volume] ********************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:11 Friday 17 January 2025 04:21:54 
-0500 (0:00:00.058) 0:11:46.925 ******** skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Establish base value for expected size] ********************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:20 Friday 17 January 2025 04:21:54 -0500 (0:00:00.060) 0:11:46.986 ******** skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Show expected size] ****************************************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:28 Friday 17 January 2025 04:21:54 -0500 (0:00:00.056) 0:11:47.042 ******** ok: [managed-node1] => { "storage_test_expected_size": "4294967296" } TASK [Get the size of parent/pool device] ************************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:32 Friday 17 January 2025 04:21:54 -0500 (0:00:00.061) 0:11:47.104 ******** skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Show test pool] ********************************************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:46 Friday 17 January 2025 04:21:54 -0500 (0:00:00.055) 0:11:47.159 ******** skipping: [managed-node1] => {} TASK [Show test blockinfo] ***************************************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:50 Friday 17 January 2025 04:21:54 -0500 (0:00:00.060) 0:11:47.220 ******** skipping: [managed-node1] => {} TASK [Show test pool size] ***************************************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:54 Friday 17 January 2025 04:21:54 -0500 (0:00:00.056) 0:11:47.277 ******** skipping: [managed-node1] => {} TASK [Calculate the expected size based on pool size and percentage value] ***** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:58 Friday 17 January 2025 04:21:54 -0500 (0:00:00.057) 0:11:47.334 ******** skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Default thin pool reserved space values] ********************************* task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:67 Friday 17 January 2025 04:21:54 -0500 (0:00:00.057) 0:11:47.392 ******** skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Default minimal thin pool reserved space size] *************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:71 Friday 17 January 2025 04:21:54 -0500 (0:00:00.061) 0:11:47.454 ******** skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Default maximal thin pool reserved space size] *************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:76 Friday 17 January 2025 04:21:54 -0500 (0:00:00.051) 0:11:47.505 
******** skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Calculate maximum usable space in thin pool] ***************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:82 Friday 17 January 2025 04:21:54 -0500 (0:00:00.059) 0:11:47.565 ******** skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Apply upper size limit to max usable thin pool space] ******************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:86 Friday 17 January 2025 04:21:54 -0500 (0:00:00.057) 0:11:47.622 ******** skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Apply lower size limit to max usable thin pool space] ******************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:91 Friday 17 January 2025 04:21:54 -0500 (0:00:00.057) 0:11:47.680 ******** skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Convert maximum usable thin pool space from int to Size] ***************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:96 Friday 17 January 2025 04:21:55 -0500 (0:00:00.058) 0:11:47.738 ******** skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Show max thin pool size] ************************************************* task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:101 Friday 17 January 2025 04:21:55 -0500 (0:00:00.074) 0:11:47.813 ******** skipping: [managed-node1] => {} TASK [Show volume thin pool size] ********************************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:105 Friday 17 January 2025 04:21:55 -0500 (0:00:00.054) 0:11:47.868 ******** skipping: [managed-node1] => {} TASK [Show test volume size] *************************************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:109 Friday 17 January 2025 04:21:55 -0500 (0:00:00.059) 0:11:47.928 ******** skipping: [managed-node1] => {} TASK [Establish base value for expected thin pool size] ************************ task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:113 Friday 17 January 2025 04:21:55 -0500 (0:00:00.053) 0:11:47.981 ******** skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Calculate the expected size based on pool size and percentage value] ***** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:120 Friday 17 January 2025 04:21:55 -0500 (0:00:00.058) 0:11:48.040 ******** skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Establish base value for expected thin pool volume size] ***************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:127 Friday 17 January 2025 04:21:55 -0500 (0:00:00.057) 0:11:48.097 
******** skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Calculate the expected thin pool volume size based on percentage value] *** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:131 Friday 17 January 2025 04:21:55 -0500 (0:00:00.059) 0:11:48.157 ******** skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Replace expected volume size with calculated value] ********************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:137 Friday 17 January 2025 04:21:55 -0500 (0:00:00.055) 0:11:48.212 ******** skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Show actual size] ******************************************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:143 Friday 17 January 2025 04:21:55 -0500 (0:00:00.062) 0:11:48.275 ******** ok: [managed-node1] => { "storage_test_actual_size": { "changed": false, "skip_reason": "Conditional result was False", "skipped": true } } TASK [Show expected size] ****************************************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:147 Friday 17 January 2025 04:21:55 -0500 (0:00:00.065) 0:11:48.340 ******** ok: [managed-node1] => { "storage_test_expected_size": "4294967296" } TASK [Assert expected size is actual size] ************************************* task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:151 Friday 17 January 2025 04:21:55 -0500 (0:00:00.067) 0:11:48.408 ******** skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Get information about the LV] ******************************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:5 Friday 17 January 2025 04:21:55 -0500 (0:00:00.059) 0:11:48.467 ******** skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set LV segment type] ***************************************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:13 Friday 17 January 2025 04:21:55 -0500 (0:00:00.045) 0:11:48.512 ******** skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check segment type] ****************************************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:17 Friday 17 January 2025 04:21:55 -0500 (0:00:00.049) 0:11:48.562 ******** skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set LV cache size] ******************************************************* task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:24 Friday 17 January 2025 04:21:55 -0500 (0:00:00.054) 0:11:48.616 ******** skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Parse the requested cache size] 
******************************************
task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:31
Friday 17 January 2025 04:21:55 -0500 (0:00:00.046) 0:11:48.662 ********
skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Set expected cache size] *************************************************
task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:37
Friday 17 January 2025 04:21:56 -0500 (0:00:00.041) 0:11:48.704 ********
skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Check cache size] ********************************************************
task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:42
Friday 17 January 2025 04:21:56 -0500 (0:00:00.040) 0:11:48.745 ********
skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Clean up facts] **********************************************************
task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume.yml:25
Friday 17 January 2025 04:21:56 -0500 (0:00:00.038) 0:11:48.783 ********
ok: [managed-node1] => { "ansible_facts": { "_storage_test_volume_present": null }, "changed": false }

TASK [Clean up variable namespace] *********************************************
task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:54
Friday 17 January 2025 04:21:56 -0500 (0:00:00.048) 0:11:48.832 ********
ok: [managed-node1] => { "ansible_facts": { "storage_test_blkinfo": null, "storage_test_crypttab": null, "storage_test_fstab": null }, "changed": false }

META: ran handlers
META: ran handlers

PLAY RECAP *********************************************************************
managed-node1 : ok=1229 changed=63 unreachable=0 failed=9 skipped=1061 rescued=9 ignored=0

Friday 17 January 2025 04:21:56 -0500 (0:00:00.022) 0:11:48.855 ********
===============================================================================
fedora.linux_system_roles.storage : Manage the pools and volumes to match the specified state -- 64.99s
/tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:69
fedora.linux_system_roles.storage : Manage the pools and volumes to match the specified state -- 34.89s
/tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:69
fedora.linux_system_roles.storage : Manage the pools and volumes to match the specified state -- 13.99s
/tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:69
fedora.linux_system_roles.storage : Manage the pools and volumes to match the specified state -- 13.44s
/tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:69
fedora.linux_system_roles.storage : Manage the pools and volumes to match the specified state -- 13.21s
/tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:69
fedora.linux_system_roles.storage : Manage the pools and volumes to match the specified state -- 12.93s
/tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:69
fedora.linux_system_roles.storage : Manage the pools and volumes to match the specified state -- 12.79s
/tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:69
fedora.linux_system_roles.storage : Manage the pools and volumes to match the specified state -- 12.53s
/tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:69
fedora.linux_system_roles.storage : Make sure blivet is available ------ 10.30s
/tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:2
fedora.linux_system_roles.storage : Make sure required packages are installed --- 8.14s
/tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:37
fedora.linux_system_roles.storage : Get required packages --------------- 4.66s
/tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:19
fedora.linux_system_roles.storage : Manage the pools and volumes to match the specified state --- 4.53s
/tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:69
fedora.linux_system_roles.storage : Get required packages --------------- 4.46s
/tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:19
fedora.linux_system_roles.storage : Manage the pools and volumes to match the specified state --- 4.45s
/tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:69
fedora.linux_system_roles.storage : Manage the pools and volumes to match the specified state --- 4.37s
/tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:69
fedora.linux_system_roles.storage : Manage the pools and volumes to match the specified state --- 4.31s
/tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:69
fedora.linux_system_roles.storage : Manage the pools and volumes to match the specified state --- 4.29s
/tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:69
fedora.linux_system_roles.storage : Get required packages --------------- 4.28s
/tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:19
fedora.linux_system_roles.storage : Get required packages --------------- 4.28s
/tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:19
fedora.linux_system_roles.storage : Make sure blivet is available ------- 4.28s
/tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:2
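
The final "Manage the pools and volumes to match the specified state" run above tears the stack down in reverse order of creation: the xfs signature on the LUKS mapping, the LUKS device itself, the logical volume foo-test1 and volume group foo, and finally the lvmpv signature on /dev/sda. A minimal playbook sketch that requests this end state, using the role's storage_volumes variable with the values taken from the "volumes" record in the blivet output; the play layout itself is illustrative:

# Sketch, not the test playbook; the storage_volumes entry mirrors the
# "volumes" record in the blivet output above.
- hosts: managed-node1
  tasks:
    - name: Remove the test volume and leave sda unformatted
      include_role:
        name: fedora.linux_system_roles.storage
      vars:
        storage_volumes:
          - name: foo        # volume name from the log
            type: disk       # whole-disk volume, as in "type": "disk"
            disks:
              - sda
            state: absent    # request removal of the device and its formatting

With state: absent the module also returns the crypts and mounts lists seen above, which drive the /etc/crypttab edit and the /opt/test1 mount removal in the tasks that follow it.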
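The "Manage /etc/crypttab" task reported found: 1 and "1 line(s) removed". Reassembled from that entry's fields (crypttab(5) columns: mapped name, backing device, key file, where "-" means no key file, i.e. prompt for the passphrase; the whitespace here is illustrative), the removed line would have read:

luks-f2c4c01a-e23d-4ad3-a7f9-93d565703bdb /dev/mapper/foo-test1 -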
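verify-role-results.yml then confirms the removal by collecting block-device info and reading back /etc/fstab and /etc/crypttab. A hypothetical stand-alone spot-check in the same spirit, using only stock Ansible; the task name and registered variable are made up for illustration:

# Hypothetical manual check, not part of the test suite.
- name: Assert the removed LUKS mapping is gone from /etc/crypttab
  command: cat /etc/crypttab
  register: crypttab_out    # illustrative variable name
  changed_when: false       # reading the file never changes state
  failed_when: "'luks-f2c4c01a-e23d-4ad3-a7f9-93d565703bdb' in crypttab_out.stdout"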