[DEPRECATION WARNING]: ANSIBLE_COLLECTIONS_PATHS option, does not fit var naming standard, use the singular form ANSIBLE_COLLECTIONS_PATH instead. This feature will be removed from ansible-core in version 2.19. Deprecation warnings can be disabled by setting deprecation_warnings=False in ansible.cfg.
ansible-playbook [core 2.17.2]
  config file = None
  configured module search path = ['/root/.ansible/plugins/modules', '/usr/share/ansible/plugins/modules']
  ansible python module location = /usr/local/lib/python3.12/site-packages/ansible
  ansible collection location = /tmp/tmp.kBSwYGXc3s
  executable location = /usr/local/bin/ansible-playbook
  python version = 3.12.4 (main, Jul 17 2024, 00:00:00) [GCC 14.1.1 20240607 (Red Hat 14.1.1-5)] (/usr/bin/python3.12)
  jinja version = 3.1.4
  libyaml = True
No config file found; using defaults
running playbook inside collection fedora.linux_system_roles
redirecting (type: callback) ansible.builtin.debug to ansible.posix.debug
redirecting (type: callback) ansible.builtin.debug to ansible.posix.debug
redirecting (type: callback) ansible.builtin.profile_tasks to ansible.posix.profile_tasks
Skipping callback 'default', as we already have a stdout callback.
Skipping callback 'minimal', as we already have a stdout callback.
Skipping callback 'oneline', as we already have a stdout callback.
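The deprecation warning above can be resolved by switching to the singular variable name before invoking ansible-playbook. A minimal sketch, reusing the collection path from this run (/tmp/tmp.kBSwYGXc3s):

```shell
# Deprecated plural spelling that triggers the warning (shown for contrast):
#   export ANSIBLE_COLLECTIONS_PATHS=/tmp/tmp.kBSwYGXc3s
# Singular replacement accepted by ansible-core:
export ANSIBLE_COLLECTIONS_PATH=/tmp/tmp.kBSwYGXc3s
echo "$ANSIBLE_COLLECTIONS_PATH"
```

The same rename applies to the `collections_path` key in ansible.cfg if the path is configured there instead of the environment.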
PLAYBOOK: tests_quadlet_basic.yml ********************************************** 1 plays in /tmp/tmp.kBSwYGXc3s/ansible_collections/fedora/linux_system_roles/tests/podman/tests_quadlet_basic.yml PLAY [Ensure that the role can manage quadlet specs] *************************** TASK [Gathering Facts] ********************************************************* task path: /tmp/tmp.kBSwYGXc3s/ansible_collections/fedora/linux_system_roles/tests/podman/tests_quadlet_basic.yml:3 Saturday 27 July 2024 12:35:49 -0400 (0:00:00.010) 0:00:00.010 ********* [WARNING]: Platform linux on host managed_node1 is using the discovered Python interpreter at /usr/bin/python3.12, but future installation of another Python interpreter could change the meaning of that path. See https://docs.ansible.com/ansible-core/2.17/reference_appendices/interpreter_discovery.html for more information. ok: [managed_node1] TASK [Run role - do not pull images] ******************************************* task path: /tmp/tmp.kBSwYGXc3s/ansible_collections/fedora/linux_system_roles/tests/podman/tests_quadlet_basic.yml:54 Saturday 27 July 2024 12:35:51 -0400 (0:00:01.224) 0:00:01.234 ********* included: fedora.linux_system_roles.podman for managed_node1 TASK [fedora.linux_system_roles.podman : Set platform/version specific variables] *** task path: /tmp/tmp.kBSwYGXc3s/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/main.yml:3 Saturday 27 July 2024 12:35:51 -0400 (0:00:00.048) 0:00:01.283 ********* included: /tmp/tmp.kBSwYGXc3s/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/set_vars.yml for managed_node1 TASK [fedora.linux_system_roles.podman : Ensure ansible_facts used by role] **** task path: /tmp/tmp.kBSwYGXc3s/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/set_vars.yml:3 Saturday 27 July 2024 12:35:51 -0400 (0:00:00.029) 0:00:01.313 ********* skipping: [managed_node1] => { "changed": false, "false_condition": "__podman_required_facts | 
difference(ansible_facts.keys() | list) | length > 0", "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.podman : Check if system is ostree] ************ task path: /tmp/tmp.kBSwYGXc3s/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/set_vars.yml:11 Saturday 27 July 2024 12:35:51 -0400 (0:00:00.023) 0:00:01.337 ********* ok: [managed_node1] => { "changed": false, "stat": { "exists": false } } TASK [fedora.linux_system_roles.podman : Set flag to indicate system is ostree] *** task path: /tmp/tmp.kBSwYGXc3s/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/set_vars.yml:16 Saturday 27 July 2024 12:35:51 -0400 (0:00:00.461) 0:00:01.799 ********* ok: [managed_node1] => { "ansible_facts": { "__podman_is_ostree": false }, "changed": false } TASK [fedora.linux_system_roles.podman : Set platform/version specific variables] *** task path: /tmp/tmp.kBSwYGXc3s/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/set_vars.yml:20 Saturday 27 July 2024 12:35:51 -0400 (0:00:00.025) 0:00:01.824 ********* ok: [managed_node1] => (item=RedHat.yml) => { "ansible_facts": { "__podman_packages": [ "podman", "shadow-utils-subid" ] }, "ansible_included_var_files": [ "/tmp/tmp.kBSwYGXc3s/ansible_collections/fedora/linux_system_roles/roles/podman/vars/RedHat.yml" ], "ansible_loop_var": "item", "changed": false, "item": "RedHat.yml" } skipping: [managed_node1] => (item=CentOS.yml) => { "ansible_loop_var": "item", "changed": false, "false_condition": "__vars_file is file", "item": "CentOS.yml", "skip_reason": "Conditional result was False" } ok: [managed_node1] => (item=CentOS_10.yml) => { "ansible_facts": { "__podman_packages": [ "iptables-nft", "podman", "shadow-utils-subid" ] }, "ansible_included_var_files": [ "/tmp/tmp.kBSwYGXc3s/ansible_collections/fedora/linux_system_roles/roles/podman/vars/CentOS_10.yml" ], "ansible_loop_var": "item", "changed": false, "item": "CentOS_10.yml" }
TASK [fedora.linux_system_roles.podman : Gather the package facts] ************* task path: /tmp/tmp.kBSwYGXc3s/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/main.yml:6 Saturday 27 July 2024 12:35:51 -0400 (0:00:00.047) 0:00:01.871 ********* ok: [managed_node1] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } TASK [fedora.linux_system_roles.podman : Enable copr if requested] ************* task path: /tmp/tmp.kBSwYGXc3s/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/main.yml:10 Saturday 27 July 2024 12:35:52 -0400 (0:00:01.010) 0:00:02.882 ********* skipping: [managed_node1] => { "changed": false, "false_condition": "podman_use_copr | d(false)", "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.podman : Ensure required packages are installed] *** task path: /tmp/tmp.kBSwYGXc3s/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/main.yml:14 Saturday 27 July 2024 12:35:52 -0400 (0:00:00.036) 0:00:02.918 ********* skipping: [managed_node1] => { "changed": false, "false_condition": "(__podman_packages | difference(ansible_facts.packages))", "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.podman : Get podman version] ******************* task path: /tmp/tmp.kBSwYGXc3s/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/main.yml:22 Saturday 27 July 2024 12:35:52 -0400 (0:00:00.038) 0:00:02.956 ********* ok: [managed_node1] => { "changed": false, "cmd": [ "podman", "--version" ], "delta": "0:00:00.028218", "end": "2024-07-27 12:35:53.249871", "rc": 
0, "start": "2024-07-27 12:35:53.221653" } STDOUT: podman version 5.1.2 TASK [fedora.linux_system_roles.podman : Set podman version] ******************* task path: /tmp/tmp.kBSwYGXc3s/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/main.yml:28 Saturday 27 July 2024 12:35:53 -0400 (0:00:00.500) 0:00:03.457 ********* ok: [managed_node1] => { "ansible_facts": { "podman_version": "5.1.2" }, "changed": false } TASK [fedora.linux_system_roles.podman : Podman package version must be 4.2 or later] *** task path: /tmp/tmp.kBSwYGXc3s/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/main.yml:32 Saturday 27 July 2024 12:35:53 -0400 (0:00:00.032) 0:00:03.490 ********* skipping: [managed_node1] => { "changed": false, "false_condition": "podman_version is version(\"4.2\", \"<\")", "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.podman : Podman package version must be 4.4 or later for quadlet, secrets] *** task path: /tmp/tmp.kBSwYGXc3s/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/main.yml:39 Saturday 27 July 2024 12:35:53 -0400 (0:00:00.033) 0:00:03.524 ********* skipping: [managed_node1] => { "changed": false, "false_condition": "podman_version is version(\"4.4\", \"<\")", "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.podman : Podman package version must be 4.4 or later for quadlet, secrets] *** task path: /tmp/tmp.kBSwYGXc3s/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/main.yml:49 Saturday 27 July 2024 12:35:53 -0400 (0:00:00.039) 0:00:03.564 ********* META: end_host conditional evaluated to False, continuing execution for managed_node1 skipping: [managed_node1] => { "skip_reason": "end_host conditional evaluated to False, continuing execution for managed_node1" } MSG: end_host conditional evaluated to false, continuing execution for managed_node1 TASK [fedora.linux_system_roles.podman : Check user and group information] ***** task path: 
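The two version gates above ("4.2 or later", "4.4 or later for quadlet, secrets") use Jinja's version() test against the detected podman_version of 5.1.2. A hypothetical minimal Python reimplementation of that comparison, for illustration only (names are assumptions, not part of the role):

```python
def version_lt(a: str, b: str) -> bool:
    """Numeric dotted-version compare, like Jinja's `a is version(b, "<")`."""
    pa = [int(x) for x in a.split(".")]
    pb = [int(x) for x in b.split(".")]
    return pa < pb  # lexicographic compare of integer components

# Mirrors the two gates in the log for podman_version == "5.1.2":
podman_version = "5.1.2"
assert not version_lt(podman_version, "4.2")  # base requirement passes
assert not version_lt(podman_version, "4.4")  # quadlet/secrets requirement passes
```

Comparing integer component lists (rather than raw strings) is what makes "5.1.2" sort after "4.4"; a plain string compare would get "10.0" vs "9.0" wrong.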
/tmp/tmp.kBSwYGXc3s/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/main.yml:56 Saturday 27 July 2024 12:35:53 -0400 (0:00:00.042) 0:00:03.607 ********* included: /tmp/tmp.kBSwYGXc3s/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/handle_user_group.yml for managed_node1 TASK [fedora.linux_system_roles.podman : Get user information] ***************** task path: /tmp/tmp.kBSwYGXc3s/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/handle_user_group.yml:2 Saturday 27 July 2024 12:35:53 -0400 (0:00:00.068) 0:00:03.675 ********* ok: [managed_node1] => { "ansible_facts": { "getent_passwd": { "root": [ "x", "0", "0", "Super User", "/root", "/bin/bash" ] } }, "changed": false } TASK [fedora.linux_system_roles.podman : Fail if user does not exist] ********** task path: /tmp/tmp.kBSwYGXc3s/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/handle_user_group.yml:9 Saturday 27 July 2024 12:35:54 -0400 (0:00:00.484) 0:00:04.159 ********* skipping: [managed_node1] => { "changed": false, "false_condition": "not ansible_facts[\"getent_passwd\"][__podman_user]", "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.podman : Set group for podman user] ************ task path: /tmp/tmp.kBSwYGXc3s/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/handle_user_group.yml:16 Saturday 27 July 2024 12:35:54 -0400 (0:00:00.033) 0:00:04.193 ********* ok: [managed_node1] => { "ansible_facts": { "__podman_group": "0" }, "changed": false } TASK [fedora.linux_system_roles.podman : Get group information] **************** task path: /tmp/tmp.kBSwYGXc3s/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/handle_user_group.yml:28 Saturday 27 July 2024 12:35:54 -0400 (0:00:00.041) 0:00:04.235 ********* ok: [managed_node1] => { "ansible_facts": { "getent_group": { "root": [ "x", "0", "" ] } }, "changed": false } TASK [fedora.linux_system_roles.podman : Set group name] 
*********************** task path: /tmp/tmp.kBSwYGXc3s/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/handle_user_group.yml:35 Saturday 27 July 2024 12:35:54 -0400 (0:00:00.418) 0:00:04.653 ********* ok: [managed_node1] => { "ansible_facts": { "__podman_group_name": "root" }, "changed": false } TASK [fedora.linux_system_roles.podman : See if getsubids exists] ************** task path: /tmp/tmp.kBSwYGXc3s/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/handle_user_group.yml:39 Saturday 27 July 2024 12:35:54 -0400 (0:00:00.040) 0:00:04.693 ********* ok: [managed_node1] => { "changed": false, "stat": { "atime": 1722097719.096431, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 32, "charset": "binary", "checksum": "86395ad7ce62834c967dc50f963a68f042029188", "ctime": 1722097690.6782997, "dev": 51714, "device_type": 0, "executable": true, "exists": true, "gid": 0, "gr_name": "root", "inode": 4755377, "isblk": false, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": true, "issock": false, "isuid": false, "mimetype": "application/x-pie-executable", "mode": "0755", "mtime": 1719187200.0, "nlink": 1, "path": "/usr/bin/getsubids", "pw_name": "root", "readable": true, "rgrp": true, "roth": true, "rusr": true, "size": 15728, "uid": 0, "version": "2656437169", "wgrp": false, "woth": false, "writeable": true, "wusr": true, "xgrp": true, "xoth": true, "xusr": true } } TASK [fedora.linux_system_roles.podman : Check user with getsubids] ************ task path: /tmp/tmp.kBSwYGXc3s/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/handle_user_group.yml:50 Saturday 27 July 2024 12:35:54 -0400 (0:00:00.387) 0:00:05.081 ********* skipping: [managed_node1] => { "changed": false, "false_condition": "__podman_user not in [\"root\", \"0\"]", "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.podman : Check group with getsubids] *********** task path: 
/tmp/tmp.kBSwYGXc3s/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/handle_user_group.yml:55 Saturday 27 July 2024 12:35:54 -0400 (0:00:00.029) 0:00:05.110 ********* skipping: [managed_node1] => { "changed": false, "false_condition": "__podman_user not in [\"root\", \"0\"]", "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.podman : Set user subuid and subgid info] ****** task path: /tmp/tmp.kBSwYGXc3s/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/handle_user_group.yml:60 Saturday 27 July 2024 12:35:55 -0400 (0:00:00.029) 0:00:05.140 ********* skipping: [managed_node1] => { "changed": false, "false_condition": "__podman_user not in [\"root\", \"0\"]", "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.podman : Get subuid file] ********************** task path: /tmp/tmp.kBSwYGXc3s/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/handle_user_group.yml:74 Saturday 27 July 2024 12:35:55 -0400 (0:00:00.029) 0:00:05.169 ********* skipping: [managed_node1] => { "changed": false, "false_condition": "not __podman_stat_getsubids.stat.exists", "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.podman : Get subgid file] ********************** task path: /tmp/tmp.kBSwYGXc3s/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/handle_user_group.yml:79 Saturday 27 July 2024 12:35:55 -0400 (0:00:00.029) 0:00:05.199 ********* skipping: [managed_node1] => { "changed": false, "false_condition": "not __podman_stat_getsubids.stat.exists", "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.podman : Set user subuid and subgid info] ****** task path: /tmp/tmp.kBSwYGXc3s/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/handle_user_group.yml:84 Saturday 27 July 2024 12:35:55 -0400 (0:00:00.028) 0:00:05.227 ********* skipping: [managed_node1] => { "changed": false, "false_condition": "not 
__podman_stat_getsubids.stat.exists", "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.podman : Fail if user not in subuid file] ****** task path: /tmp/tmp.kBSwYGXc3s/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/handle_user_group.yml:94 Saturday 27 July 2024 12:35:55 -0400 (0:00:00.029) 0:00:05.257 ********* skipping: [managed_node1] => { "changed": false, "false_condition": "not __podman_stat_getsubids.stat.exists", "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.podman : Fail if group not in subgid file] ***** task path: /tmp/tmp.kBSwYGXc3s/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/handle_user_group.yml:101 Saturday 27 July 2024 12:35:55 -0400 (0:00:00.028) 0:00:05.286 ********* skipping: [managed_node1] => { "changed": false, "false_condition": "not __podman_stat_getsubids.stat.exists", "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.podman : Set config file paths] **************** task path: /tmp/tmp.kBSwYGXc3s/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/main.yml:62 Saturday 27 July 2024 12:35:55 -0400 (0:00:00.029) 0:00:05.315 ********* ok: [managed_node1] => { "ansible_facts": { "__podman_container_conf_file": "/etc/containers/containers.conf.d/50-systemroles.conf", "__podman_policy_json_file": "/etc/containers/policy.json", "__podman_registries_conf_file": "/etc/containers/registries.conf.d/50-systemroles.conf", "__podman_storage_conf_file": "/etc/containers/storage.conf" }, "changed": false } TASK [fedora.linux_system_roles.podman : Handle container.conf.d] ************** task path: /tmp/tmp.kBSwYGXc3s/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/main.yml:71 Saturday 27 July 2024 12:35:55 -0400 (0:00:00.056) 0:00:05.371 ********* included: /tmp/tmp.kBSwYGXc3s/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/handle_container_conf_d.yml for managed_node1 TASK 
[fedora.linux_system_roles.podman : Ensure containers.d exists] *********** task path: /tmp/tmp.kBSwYGXc3s/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/handle_container_conf_d.yml:5 Saturday 27 July 2024 12:35:55 -0400 (0:00:00.062) 0:00:05.434 ********* skipping: [managed_node1] => { "changed": false, "false_condition": "podman_containers_conf | length > 0", "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.podman : Update container config file] ********* task path: /tmp/tmp.kBSwYGXc3s/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/handle_container_conf_d.yml:13 Saturday 27 July 2024 12:35:55 -0400 (0:00:00.031) 0:00:05.465 ********* skipping: [managed_node1] => { "changed": false, "false_condition": "podman_containers_conf | length > 0", "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.podman : Handle registries.conf.d] ************* task path: /tmp/tmp.kBSwYGXc3s/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/main.yml:74 Saturday 27 July 2024 12:35:55 -0400 (0:00:00.030) 0:00:05.496 ********* included: /tmp/tmp.kBSwYGXc3s/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/handle_registries_conf_d.yml for managed_node1 TASK [fedora.linux_system_roles.podman : Ensure registries.d exists] *********** task path: /tmp/tmp.kBSwYGXc3s/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/handle_registries_conf_d.yml:5 Saturday 27 July 2024 12:35:55 -0400 (0:00:00.063) 0:00:05.559 ********* skipping: [managed_node1] => { "changed": false, "false_condition": "podman_registries_conf | length > 0", "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.podman : Update registries config file] ******** task path: /tmp/tmp.kBSwYGXc3s/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/handle_registries_conf_d.yml:13 Saturday 27 July 2024 12:35:55 -0400 (0:00:00.031) 0:00:05.590 ********* skipping: 
[managed_node1] => { "changed": false, "false_condition": "podman_registries_conf | length > 0", "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.podman : Handle storage.conf] ****************** task path: /tmp/tmp.kBSwYGXc3s/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/main.yml:77 Saturday 27 July 2024 12:35:55 -0400 (0:00:00.030) 0:00:05.621 ********* included: /tmp/tmp.kBSwYGXc3s/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/handle_storage_conf.yml for managed_node1 TASK [fedora.linux_system_roles.podman : Ensure storage.conf parent dir exists] *** task path: /tmp/tmp.kBSwYGXc3s/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/handle_storage_conf.yml:5 Saturday 27 July 2024 12:35:55 -0400 (0:00:00.092) 0:00:05.714 ********* skipping: [managed_node1] => { "changed": false, "false_condition": "podman_storage_conf | length > 0", "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.podman : Update storage config file] *********** task path: /tmp/tmp.kBSwYGXc3s/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/handle_storage_conf.yml:13 Saturday 27 July 2024 12:35:55 -0400 (0:00:00.030) 0:00:05.745 ********* skipping: [managed_node1] => { "changed": false, "false_condition": "podman_storage_conf | length > 0", "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.podman : Handle policy.json] ******************* task path: /tmp/tmp.kBSwYGXc3s/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/main.yml:80 Saturday 27 July 2024 12:35:55 -0400 (0:00:00.031) 0:00:05.776 ********* included: /tmp/tmp.kBSwYGXc3s/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/handle_policy_json.yml for managed_node1 TASK [fedora.linux_system_roles.podman : Ensure policy.json parent dir exists] *** task path: 
/tmp/tmp.kBSwYGXc3s/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/handle_policy_json.yml:6 Saturday 27 July 2024 12:35:55 -0400 (0:00:00.068) 0:00:05.845 ********* skipping: [managed_node1] => { "changed": false, "false_condition": "podman_policy_json | length > 0", "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.podman : Stat the policy.json file] ************ task path: /tmp/tmp.kBSwYGXc3s/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/handle_policy_json.yml:14 Saturday 27 July 2024 12:35:55 -0400 (0:00:00.030) 0:00:05.876 ********* skipping: [managed_node1] => { "changed": false, "false_condition": "podman_policy_json | length > 0", "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.podman : Get the existing policy.json] ********* task path: /tmp/tmp.kBSwYGXc3s/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/handle_policy_json.yml:19 Saturday 27 July 2024 12:35:55 -0400 (0:00:00.031) 0:00:05.908 ********* skipping: [managed_node1] => { "changed": false, "false_condition": "podman_policy_json | length > 0", "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.podman : Write new policy.json file] *********** task path: /tmp/tmp.kBSwYGXc3s/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/handle_policy_json.yml:25 Saturday 27 July 2024 12:35:55 -0400 (0:00:00.031) 0:00:05.939 ********* skipping: [managed_node1] => { "changed": false, "false_condition": "podman_policy_json | length > 0", "skip_reason": "Conditional result was False" } TASK [Manage firewall for specified ports] ************************************* task path: /tmp/tmp.kBSwYGXc3s/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/main.yml:86 Saturday 27 July 2024 12:35:55 -0400 (0:00:00.031) 0:00:05.971 ********* skipping: [managed_node1] => { "changed": false, "false_condition": "podman_firewall | length > 0", "skip_reason": 
"Conditional result was False" } TASK [Manage selinux for specified ports] ************************************** task path: /tmp/tmp.kBSwYGXc3s/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/main.yml:93 Saturday 27 July 2024 12:35:55 -0400 (0:00:00.030) 0:00:06.001 ********* skipping: [managed_node1] => { "changed": false, "false_condition": "podman_selinux_ports | length > 0", "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.podman : Keep track of users that need to cancel linger] *** task path: /tmp/tmp.kBSwYGXc3s/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/main.yml:100 Saturday 27 July 2024 12:35:55 -0400 (0:00:00.031) 0:00:06.033 ********* ok: [managed_node1] => { "ansible_facts": { "__podman_cancel_user_linger": [] }, "changed": false } TASK [fedora.linux_system_roles.podman : Handle certs.d files - present] ******* task path: /tmp/tmp.kBSwYGXc3s/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/main.yml:104 Saturday 27 July 2024 12:35:55 -0400 (0:00:00.030) 0:00:06.064 ********* skipping: [managed_node1] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } TASK [fedora.linux_system_roles.podman : Handle credential files - present] **** task path: /tmp/tmp.kBSwYGXc3s/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/main.yml:113 Saturday 27 July 2024 12:35:55 -0400 (0:00:00.028) 0:00:06.092 ********* skipping: [managed_node1] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } TASK [fedora.linux_system_roles.podman : Handle secrets] *********************** task path: /tmp/tmp.kBSwYGXc3s/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/main.yml:122 Saturday 27 July 2024 12:35:55 -0400 (0:00:00.027) 0:00:06.120 ********* skipping: [managed_node1] => { "censored": "the output has been hidden 
due to the fact that 'no_log: true' was specified for this result", "changed": false } TASK [fedora.linux_system_roles.podman : Handle Kubernetes specifications] ***** task path: /tmp/tmp.kBSwYGXc3s/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/main.yml:129 Saturday 27 July 2024 12:35:56 -0400 (0:00:00.027) 0:00:06.148 ********* skipping: [managed_node1] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } TASK [fedora.linux_system_roles.podman : Handle Quadlet specifications] ******** task path: /tmp/tmp.kBSwYGXc3s/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/main.yml:136 Saturday 27 July 2024 12:35:56 -0400 (0:00:00.027) 0:00:06.175 ********* included: /tmp/tmp.kBSwYGXc3s/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/handle_quadlet_spec.yml for managed_node1 => (item=(censored due to no_log)) TASK [fedora.linux_system_roles.podman : Set per-container variables part 0] *** task path: /tmp/tmp.kBSwYGXc3s/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/handle_quadlet_spec.yml:14 Saturday 27 July 2024 12:35:56 -0400 (0:00:00.081) 0:00:06.257 ********* ok: [managed_node1] => { "ansible_facts": { "__podman_quadlet_file_src": "", "__podman_quadlet_spec": { "Container": { "ContainerName": "nopull", "Image": "quay.io/libpod/testimage:20210610" }, "Install": { "WantedBy": "default.target" } }, "__podman_quadlet_str": "", "__podman_quadlet_template_src": "" }, "changed": false } TASK [fedora.linux_system_roles.podman : Set per-container variables part 1] *** task path: /tmp/tmp.kBSwYGXc3s/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/handle_quadlet_spec.yml:25 Saturday 27 July 2024 12:35:56 -0400 (0:00:00.041) 0:00:06.298 ********* ok: [managed_node1] => { "ansible_facts": { "__podman_continue_if_pull_fails": false, "__podman_pull_image": false, "__podman_state": "created", "__podman_systemd_unit_scope": 
"", "__podman_user": "root" }, "changed": false } TASK [fedora.linux_system_roles.podman : Fail if no quadlet spec is given] ***** task path: /tmp/tmp.kBSwYGXc3s/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/handle_quadlet_spec.yml:35 Saturday 27 July 2024 12:35:56 -0400 (0:00:00.040) 0:00:06.338 ********* skipping: [managed_node1] => { "changed": false, "false_condition": "__podman_quadlet_spec | length == 0", "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.podman : Set per-container variables part 2] *** task path: /tmp/tmp.kBSwYGXc3s/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/handle_quadlet_spec.yml:48 Saturday 27 July 2024 12:35:56 -0400 (0:00:00.030) 0:00:06.369 ********* ok: [managed_node1] => { "ansible_facts": { "__podman_quadlet_name": "nopull", "__podman_quadlet_type": "container", "__podman_rootless": false }, "changed": false } TASK [fedora.linux_system_roles.podman : Check user and group information] ***** task path: /tmp/tmp.kBSwYGXc3s/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/handle_quadlet_spec.yml:57 Saturday 27 July 2024 12:35:56 -0400 (0:00:00.078) 0:00:06.447 ********* included: /tmp/tmp.kBSwYGXc3s/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/handle_user_group.yml for managed_node1 TASK [fedora.linux_system_roles.podman : Get user information] ***************** task path: /tmp/tmp.kBSwYGXc3s/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/handle_user_group.yml:2 Saturday 27 July 2024 12:35:56 -0400 (0:00:00.061) 0:00:06.509 ********* skipping: [managed_node1] => { "changed": false, "false_condition": "'getent_passwd' not in ansible_facts or __podman_user not in ansible_facts['getent_passwd']", "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.podman : Fail if user does not exist] ********** task path: 
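The quadlet spec in the log (__podman_quadlet_name "nopull", __podman_quadlet_type "container", Image quay.io/libpod/testimage:20210610, WantedBy default.target) corresponds to a systemd quadlet unit file. A hedged sketch of what the role would render from that spec; the root quadlet directory /etc/containers/systemd and the .container filename are assumptions based on the quadlet type:

```ini
# Hypothetical rendering of the spec as /etc/containers/systemd/nopull.container
[Container]
ContainerName=nopull
Image=quay.io/libpod/testimage:20210610

[Install]
WantedBy=default.target
```

Quadlet generates a nopull.service unit from such a file, matching the __podman_service_name computed later in the run.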
/tmp/tmp.kBSwYGXc3s/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/handle_user_group.yml:9 Saturday 27 July 2024 12:35:56 -0400 (0:00:00.034) 0:00:06.544 ********* skipping: [managed_node1] => { "changed": false, "false_condition": "not ansible_facts[\"getent_passwd\"][__podman_user]", "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.podman : Set group for podman user] ************ task path: /tmp/tmp.kBSwYGXc3s/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/handle_user_group.yml:16 Saturday 27 July 2024 12:35:56 -0400 (0:00:00.035) 0:00:06.579 ********* ok: [managed_node1] => { "ansible_facts": { "__podman_group": "0" }, "changed": false } TASK [fedora.linux_system_roles.podman : Get group information] **************** task path: /tmp/tmp.kBSwYGXc3s/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/handle_user_group.yml:28 Saturday 27 July 2024 12:35:56 -0400 (0:00:00.043) 0:00:06.623 ********* ok: [managed_node1] => { "ansible_facts": { "getent_group": { "root": [ "x", "0", "" ] } }, "changed": false } TASK [fedora.linux_system_roles.podman : Set group name] *********************** task path: /tmp/tmp.kBSwYGXc3s/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/handle_user_group.yml:35 Saturday 27 July 2024 12:35:56 -0400 (0:00:00.395) 0:00:07.019 ********* ok: [managed_node1] => { "ansible_facts": { "__podman_group_name": "root" }, "changed": false } TASK [fedora.linux_system_roles.podman : See if getsubids exists] ************** task path: /tmp/tmp.kBSwYGXc3s/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/handle_user_group.yml:39 Saturday 27 July 2024 12:35:56 -0400 (0:00:00.040) 0:00:07.059 ********* ok: [managed_node1] => { "changed": false, "stat": { "atime": 1722097719.096431, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 32, "charset": "binary", "checksum": "86395ad7ce62834c967dc50f963a68f042029188", "ctime": 
1722097690.6782997, "dev": 51714, "device_type": 0, "executable": true, "exists": true, "gid": 0, "gr_name": "root", "inode": 4755377, "isblk": false, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": true, "issock": false, "isuid": false, "mimetype": "application/x-pie-executable", "mode": "0755", "mtime": 1719187200.0, "nlink": 1, "path": "/usr/bin/getsubids", "pw_name": "root", "readable": true, "rgrp": true, "roth": true, "rusr": true, "size": 15728, "uid": 0, "version": "2656437169", "wgrp": false, "woth": false, "writeable": true, "wusr": true, "xgrp": true, "xoth": true, "xusr": true } } TASK [fedora.linux_system_roles.podman : Check user with getsubids] ************ task path: /tmp/tmp.kBSwYGXc3s/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/handle_user_group.yml:50 Saturday 27 July 2024 12:35:57 -0400 (0:00:00.397) 0:00:07.456 ********* skipping: [managed_node1] => { "changed": false, "false_condition": "__podman_user not in [\"root\", \"0\"]", "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.podman : Check group with getsubids] *********** task path: /tmp/tmp.kBSwYGXc3s/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/handle_user_group.yml:55 Saturday 27 July 2024 12:35:57 -0400 (0:00:00.029) 0:00:07.486 ********* skipping: [managed_node1] => { "changed": false, "false_condition": "__podman_user not in [\"root\", \"0\"]", "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.podman : Set user subuid and subgid info] ****** task path: /tmp/tmp.kBSwYGXc3s/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/handle_user_group.yml:60 Saturday 27 July 2024 12:35:57 -0400 (0:00:00.030) 0:00:07.517 ********* skipping: [managed_node1] => { "changed": false, "false_condition": "__podman_user not in [\"root\", \"0\"]", "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.podman : Get subuid file] 
********************** task path: /tmp/tmp.kBSwYGXc3s/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/handle_user_group.yml:74 Saturday 27 July 2024 12:35:57 -0400 (0:00:00.029) 0:00:07.547 ********* skipping: [managed_node1] => { "changed": false, "false_condition": "not __podman_stat_getsubids.stat.exists", "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.podman : Get subgid file] ********************** task path: /tmp/tmp.kBSwYGXc3s/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/handle_user_group.yml:79 Saturday 27 July 2024 12:35:57 -0400 (0:00:00.030) 0:00:07.578 ********* skipping: [managed_node1] => { "changed": false, "false_condition": "not __podman_stat_getsubids.stat.exists", "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.podman : Set user subuid and subgid info] ****** task path: /tmp/tmp.kBSwYGXc3s/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/handle_user_group.yml:84 Saturday 27 July 2024 12:35:57 -0400 (0:00:00.029) 0:00:07.607 ********* skipping: [managed_node1] => { "changed": false, "false_condition": "not __podman_stat_getsubids.stat.exists", "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.podman : Fail if user not in subuid file] ****** task path: /tmp/tmp.kBSwYGXc3s/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/handle_user_group.yml:94 Saturday 27 July 2024 12:35:57 -0400 (0:00:00.031) 0:00:07.639 ********* skipping: [managed_node1] => { "changed": false, "false_condition": "not __podman_stat_getsubids.stat.exists", "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.podman : Fail if group not in subgid file] ***** task path: /tmp/tmp.kBSwYGXc3s/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/handle_user_group.yml:101 Saturday 27 July 2024 12:35:57 -0400 (0:00:00.030) 0:00:07.669 ********* skipping: [managed_node1] => { "changed": 
false, "false_condition": "not __podman_stat_getsubids.stat.exists", "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.podman : Set per-container variables part 3] *** task path: /tmp/tmp.kBSwYGXc3s/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/handle_quadlet_spec.yml:62 Saturday 27 July 2024 12:35:57 -0400 (0:00:00.031) 0:00:07.700 ********* ok: [managed_node1] => { "ansible_facts": { "__podman_activate_systemd_unit": false, "__podman_images_found": [ "quay.io/libpod/testimage:20210610" ], "__podman_kube_yamls_raw": "", "__podman_service_name": "nopull.service", "__podman_systemd_scope": "system", "__podman_user_home_dir": "/root", "__podman_xdg_runtime_dir": "/run/user/0" }, "changed": false } TASK [fedora.linux_system_roles.podman : Set per-container variables part 4] *** task path: /tmp/tmp.kBSwYGXc3s/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/handle_quadlet_spec.yml:73 Saturday 27 July 2024 12:35:57 -0400 (0:00:00.053) 0:00:07.754 ********* ok: [managed_node1] => { "ansible_facts": { "__podman_quadlet_path": "/etc/containers/systemd" }, "changed": false } TASK [fedora.linux_system_roles.podman : Get kube yaml contents] *************** task path: /tmp/tmp.kBSwYGXc3s/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/handle_quadlet_spec.yml:77 Saturday 27 July 2024 12:35:57 -0400 (0:00:00.034) 0:00:07.789 ********* skipping: [managed_node1] => { "changed": false, "false_condition": "__podman_kube_yamls_raw | length > 0", "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.podman : Set per-container variables part 5] *** task path: /tmp/tmp.kBSwYGXc3s/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/handle_quadlet_spec.yml:87 Saturday 27 July 2024 12:35:57 -0400 (0:00:00.033) 0:00:07.823 ********* ok: [managed_node1] => { "ansible_facts": { "__podman_images": [ "quay.io/libpod/testimage:20210610" ], "__podman_quadlet_file": 
"/etc/containers/systemd/nopull.container", "__podman_volumes": [] }, "changed": false } TASK [fedora.linux_system_roles.podman : Set per-container variables part 6] *** task path: /tmp/tmp.kBSwYGXc3s/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/handle_quadlet_spec.yml:103 Saturday 27 July 2024 12:35:57 -0400 (0:00:00.083) 0:00:07.906 ********* ok: [managed_node1] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } TASK [fedora.linux_system_roles.podman : Cleanup quadlets] ********************* task path: /tmp/tmp.kBSwYGXc3s/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/handle_quadlet_spec.yml:110 Saturday 27 July 2024 12:35:57 -0400 (0:00:00.041) 0:00:07.947 ********* skipping: [managed_node1] => { "changed": false, "false_condition": "__podman_state == \"absent\"", "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.podman : Create and update quadlets] *********** task path: /tmp/tmp.kBSwYGXc3s/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/handle_quadlet_spec.yml:114 Saturday 27 July 2024 12:35:57 -0400 (0:00:00.073) 0:00:08.021 ********* included: /tmp/tmp.kBSwYGXc3s/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/create_update_quadlet_spec.yml for managed_node1 TASK [fedora.linux_system_roles.podman : Manage linger] ************************ task path: /tmp/tmp.kBSwYGXc3s/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/create_update_quadlet_spec.yml:2 Saturday 27 July 2024 12:35:57 -0400 (0:00:00.075) 0:00:08.096 ********* included: /tmp/tmp.kBSwYGXc3s/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/manage_linger.yml for managed_node1 TASK [fedora.linux_system_roles.podman : Enable linger if needed] ************** task path: /tmp/tmp.kBSwYGXc3s/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/manage_linger.yml:12 Saturday 27 July 
2024 12:35:58 -0400 (0:00:00.054) 0:00:08.151 ********* skipping: [managed_node1] => { "changed": false, "false_condition": "__podman_rootless | bool", "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.podman : Mark user as not yet needing to cancel linger] *** task path: /tmp/tmp.kBSwYGXc3s/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/manage_linger.yml:18 Saturday 27 July 2024 12:35:58 -0400 (0:00:00.031) 0:00:08.182 ********* skipping: [managed_node1] => { "changed": false, "false_condition": "__podman_rootless | bool", "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.podman : Mark user for possible linger cancel] *** task path: /tmp/tmp.kBSwYGXc3s/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/manage_linger.yml:22 Saturday 27 July 2024 12:35:58 -0400 (0:00:00.029) 0:00:08.212 ********* skipping: [managed_node1] => { "changed": false, "false_condition": "__podman_rootless | bool", "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.podman : Create host directories] ************** task path: /tmp/tmp.kBSwYGXc3s/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/create_update_quadlet_spec.yml:7 Saturday 27 July 2024 12:35:58 -0400 (0:00:00.031) 0:00:08.244 ********* skipping: [managed_node1] => { "changed": false, "skipped_reason": "No items in the list" } TASK [fedora.linux_system_roles.podman : Ensure container images are present] *** task path: /tmp/tmp.kBSwYGXc3s/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/create_update_quadlet_spec.yml:18 Saturday 27 July 2024 12:35:58 -0400 (0:00:00.028) 0:00:08.272 ********* skipping: [managed_node1] => (item=None) => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } skipping: [managed_node1] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified 
for this result", "changed": false } TASK [fedora.linux_system_roles.podman : Ensure the quadlet directory is present] *** task path: /tmp/tmp.kBSwYGXc3s/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/create_update_quadlet_spec.yml:39 Saturday 27 July 2024 12:35:58 -0400 (0:00:00.038) 0:00:08.311 ********* ok: [managed_node1] => { "changed": false, "gid": 0, "group": "root", "mode": "0755", "owner": "root", "path": "/etc/containers/systemd", "secontext": "system_u:object_r:etc_t:s0", "size": 6, "state": "directory", "uid": 0 } TASK [fedora.linux_system_roles.podman : Ensure quadlet file is copied] ******** task path: /tmp/tmp.kBSwYGXc3s/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/create_update_quadlet_spec.yml:48 Saturday 27 July 2024 12:35:58 -0400 (0:00:00.516) 0:00:08.827 ********* skipping: [managed_node1] => { "changed": false, "false_condition": "__podman_quadlet_file_src | length > 0", "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.podman : Ensure quadlet file content is present] *** task path: /tmp/tmp.kBSwYGXc3s/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/create_update_quadlet_spec.yml:58 Saturday 27 July 2024 12:35:58 -0400 (0:00:00.031) 0:00:08.858 ********* skipping: [managed_node1] => { "changed": false, "false_condition": "__podman_quadlet_str | length > 0", "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.podman : Ensure quadlet file is present] ******* task path: /tmp/tmp.kBSwYGXc3s/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/create_update_quadlet_spec.yml:70 Saturday 27 July 2024 12:35:58 -0400 (0:00:00.031) 0:00:08.890 ********* changed: [managed_node1] => { "changed": true, "checksum": "670d64fc68a9768edb20cad26df2acc703542d85", "dest": "/etc/containers/systemd/nopull.container", "gid": 0, "group": "root", "md5sum": "cedb6667f6cd1b033fe06e2810fe6b19", "mode": "0644", "owner": "root", "secontext": 
"system_u:object_r:etc_t:s0", "size": 151, "src": "/root/.ansible/tmp/ansible-tmp-1722098158.794963-11918-214056238671650/.source.container", "state": "file", "uid": 0 } TASK [fedora.linux_system_roles.podman : Reload systemctl] ********************* task path: /tmp/tmp.kBSwYGXc3s/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/create_update_quadlet_spec.yml:82 Saturday 27 July 2024 12:35:59 -0400 (0:00:00.845) 0:00:09.735 ********* skipping: [managed_node1] => { "changed": false, "false_condition": "__podman_activate_systemd_unit | bool", "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.podman : Start service] ************************ task path: /tmp/tmp.kBSwYGXc3s/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/create_update_quadlet_spec.yml:110 Saturday 27 July 2024 12:35:59 -0400 (0:00:00.033) 0:00:09.768 ********* skipping: [managed_node1] => { "changed": false, "false_condition": "__podman_activate_systemd_unit | bool", "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.podman : Restart service] ********************** task path: /tmp/tmp.kBSwYGXc3s/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/create_update_quadlet_spec.yml:125 Saturday 27 July 2024 12:35:59 -0400 (0:00:00.036) 0:00:09.804 ********* skipping: [managed_node1] => { "changed": false, "false_condition": "__podman_activate_systemd_unit | bool", "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.podman : Cancel linger] ************************ task path: /tmp/tmp.kBSwYGXc3s/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/main.yml:143 Saturday 27 July 2024 12:35:59 -0400 (0:00:00.036) 0:00:09.840 ********* skipping: [managed_node1] => { "changed": false, "skipped_reason": "No items in the list" } TASK [fedora.linux_system_roles.podman : Handle credential files - absent] ***** task path: 
/tmp/tmp.kBSwYGXc3s/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/main.yml:149 Saturday 27 July 2024 12:35:59 -0400 (0:00:00.027) 0:00:09.868 ********* skipping: [managed_node1] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } TASK [fedora.linux_system_roles.podman : Handle certs.d files - absent] ******** task path: /tmp/tmp.kBSwYGXc3s/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/main.yml:158 Saturday 27 July 2024 12:35:59 -0400 (0:00:00.027) 0:00:09.895 ********* skipping: [managed_node1] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } TASK [Verify image not pulled] ************************************************* task path: /tmp/tmp.kBSwYGXc3s/ansible_collections/fedora/linux_system_roles/tests/podman/tests_quadlet_basic.yml:70 Saturday 27 July 2024 12:35:59 -0400 (0:00:00.043) 0:00:09.939 ********* ok: [managed_node1] => { "changed": false } MSG: All assertions passed TASK [Run role - try to pull bogus image] ************************************** task path: /tmp/tmp.kBSwYGXc3s/ansible_collections/fedora/linux_system_roles/tests/podman/tests_quadlet_basic.yml:74 Saturday 27 July 2024 12:35:59 -0400 (0:00:00.066) 0:00:10.005 ********* included: fedora.linux_system_roles.podman for managed_node1 TASK [fedora.linux_system_roles.podman : Set platform/version specific variables] *** task path: /tmp/tmp.kBSwYGXc3s/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/main.yml:3 Saturday 27 July 2024 12:35:59 -0400 (0:00:00.089) 0:00:10.094 ********* included: /tmp/tmp.kBSwYGXc3s/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/set_vars.yml for managed_node1 TASK [fedora.linux_system_roles.podman : Ensure ansible_facts used by role] **** task path: 
/tmp/tmp.kBSwYGXc3s/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/set_vars.yml:3 Saturday 27 July 2024 12:36:00 -0400 (0:00:00.056) 0:00:10.150 ********* skipping: [managed_node1] => { "changed": false, "false_condition": "__podman_required_facts | difference(ansible_facts.keys() | list) | length > 0", "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.podman : Check if system is ostree] ************ task path: /tmp/tmp.kBSwYGXc3s/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/set_vars.yml:11 Saturday 27 July 2024 12:36:00 -0400 (0:00:00.038) 0:00:10.189 ********* skipping: [managed_node1] => { "changed": false, "false_condition": "not __podman_is_ostree is defined", "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.podman : Set flag to indicate system is ostree] *** task path: /tmp/tmp.kBSwYGXc3s/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/set_vars.yml:16 Saturday 27 July 2024 12:36:00 -0400 (0:00:00.030) 0:00:10.220 ********* skipping: [managed_node1] => { "changed": false, "false_condition": "not __podman_is_ostree is defined", "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.podman : Set platform/version specific variables] *** task path: /tmp/tmp.kBSwYGXc3s/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/set_vars.yml:20 Saturday 27 July 2024 12:36:00 -0400 (0:00:00.031) 0:00:10.251 ********* ok: [managed_node1] => (item=RedHat.yml) => { "ansible_facts": { "__podman_packages": [ "podman", "shadow-utils-subid" ] }, "ansible_included_var_files": [ "/tmp/tmp.kBSwYGXc3s/ansible_collections/fedora/linux_system_roles/roles/podman/vars/RedHat.yml" ], "ansible_loop_var": "item", "changed": false, "item": "RedHat.yml" } skipping: [managed_node1] => (item=CentOS.yml) => { "ansible_loop_var": "item", "changed": false, "false_condition": "__vars_file is file", "item": "CentOS.yml", "skip_reason": "Conditional 
result was False" } ok: [managed_node1] => (item=CentOS_10.yml) => { "ansible_facts": { "__podman_packages": [ "iptables-nft", "podman", "shadow-utils-subid" ] }, "ansible_included_var_files": [ "/tmp/tmp.kBSwYGXc3s/ansible_collections/fedora/linux_system_roles/roles/podman/vars/CentOS_10.yml" ], "ansible_loop_var": "item", "changed": false, "item": "CentOS_10.yml" } ok: [managed_node1] => (item=CentOS_10.yml) => { "ansible_facts": { "__podman_packages": [ "iptables-nft", "podman", "shadow-utils-subid" ] }, "ansible_included_var_files": [ "/tmp/tmp.kBSwYGXc3s/ansible_collections/fedora/linux_system_roles/roles/podman/vars/CentOS_10.yml" ], "ansible_loop_var": "item", "changed": false, "item": "CentOS_10.yml" } TASK [fedora.linux_system_roles.podman : Gather the package facts] ************* task path: /tmp/tmp.kBSwYGXc3s/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/main.yml:6 Saturday 27 July 2024 12:36:00 -0400 (0:00:00.071) 0:00:10.322 ********* ok: [managed_node1] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } TASK [fedora.linux_system_roles.podman : Enable copr if requested] ************* task path: /tmp/tmp.kBSwYGXc3s/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/main.yml:10 Saturday 27 July 2024 12:36:00 -0400 (0:00:00.810) 0:00:11.133 ********* skipping: [managed_node1] => { "changed": false, "false_condition": "podman_use_copr | d(false)", "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.podman : Ensure required packages are installed] *** task path: /tmp/tmp.kBSwYGXc3s/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/main.yml:14 Saturday 27 July 2024 12:36:01 -0400 (0:00:00.031) 0:00:11.164 ********* skipping: [managed_node1] => { "changed": false, "false_condition": "(__podman_packages | difference(ansible_facts.packages))", "skip_reason": "Conditional result was False" } TASK 
[fedora.linux_system_roles.podman : Get podman version] ******************* task path: /tmp/tmp.kBSwYGXc3s/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/main.yml:22 Saturday 27 July 2024 12:36:01 -0400 (0:00:00.035) 0:00:11.200 ********* ok: [managed_node1] => { "changed": false, "cmd": [ "podman", "--version" ], "delta": "0:00:00.030430", "end": "2024-07-27 12:36:01.410457", "rc": 0, "start": "2024-07-27 12:36:01.380027" } STDOUT: podman version 5.1.2 TASK [fedora.linux_system_roles.podman : Set podman version] ******************* task path: /tmp/tmp.kBSwYGXc3s/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/main.yml:28 Saturday 27 July 2024 12:36:01 -0400 (0:00:00.418) 0:00:11.619 ********* ok: [managed_node1] => { "ansible_facts": { "podman_version": "5.1.2" }, "changed": false } TASK [fedora.linux_system_roles.podman : Podman package version must be 4.2 or later] *** task path: /tmp/tmp.kBSwYGXc3s/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/main.yml:32 Saturday 27 July 2024 12:36:01 -0400 (0:00:00.034) 0:00:11.653 ********* skipping: [managed_node1] => { "changed": false, "false_condition": "podman_version is version(\"4.2\", \"<\")", "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.podman : Podman package version must be 4.4 or later for quadlet, secrets] *** task path: /tmp/tmp.kBSwYGXc3s/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/main.yml:39 Saturday 27 July 2024 12:36:01 -0400 (0:00:00.029) 0:00:11.683 ********* skipping: [managed_node1] => { "changed": false, "false_condition": "podman_version is version(\"4.4\", \"<\")", "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.podman : Podman package version must be 4.4 or later for quadlet, secrets] *** task path: /tmp/tmp.kBSwYGXc3s/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/main.yml:49 Saturday 27 July 2024 12:36:01 -0400 (0:00:00.033) 0:00:11.716 
********* META: end_host conditional evaluated to False, continuing execution for managed_node1 skipping: [managed_node1] => { "skip_reason": "end_host conditional evaluated to False, continuing execution for managed_node1" } MSG: end_host conditional evaluated to false, continuing execution for managed_node1 TASK [fedora.linux_system_roles.podman : Check user and group information] ***** task path: /tmp/tmp.kBSwYGXc3s/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/main.yml:56 Saturday 27 July 2024 12:36:01 -0400 (0:00:00.035) 0:00:11.751 ********* included: /tmp/tmp.kBSwYGXc3s/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/handle_user_group.yml for managed_node1 TASK [fedora.linux_system_roles.podman : Get user information] ***************** task path: /tmp/tmp.kBSwYGXc3s/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/handle_user_group.yml:2 Saturday 27 July 2024 12:36:01 -0400 (0:00:00.102) 0:00:11.854 ********* skipping: [managed_node1] => { "changed": false, "false_condition": "'getent_passwd' not in ansible_facts or __podman_user not in ansible_facts['getent_passwd']", "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.podman : Fail if user does not exist] ********** task path: /tmp/tmp.kBSwYGXc3s/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/handle_user_group.yml:9 Saturday 27 July 2024 12:36:01 -0400 (0:00:00.036) 0:00:11.890 ********* skipping: [managed_node1] => { "changed": false, "false_condition": "not ansible_facts[\"getent_passwd\"][__podman_user]", "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.podman : Set group for podman user] ************ task path: /tmp/tmp.kBSwYGXc3s/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/handle_user_group.yml:16 Saturday 27 July 2024 12:36:01 -0400 (0:00:00.037) 0:00:11.927 ********* ok: [managed_node1] => { "ansible_facts": { "__podman_group": "0" }, "changed": false 
} TASK [fedora.linux_system_roles.podman : Get group information] **************** task path: /tmp/tmp.kBSwYGXc3s/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/handle_user_group.yml:28 Saturday 27 July 2024 12:36:01 -0400 (0:00:00.043) 0:00:11.971 ********* ok: [managed_node1] => { "ansible_facts": { "getent_group": { "root": [ "x", "0", "" ] } }, "changed": false } TASK [fedora.linux_system_roles.podman : Set group name] *********************** task path: /tmp/tmp.kBSwYGXc3s/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/handle_user_group.yml:35 Saturday 27 July 2024 12:36:02 -0400 (0:00:00.398) 0:00:12.370 ********* ok: [managed_node1] => { "ansible_facts": { "__podman_group_name": "root" }, "changed": false } TASK [fedora.linux_system_roles.podman : See if getsubids exists] ************** task path: /tmp/tmp.kBSwYGXc3s/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/handle_user_group.yml:39 Saturday 27 July 2024 12:36:02 -0400 (0:00:00.040) 0:00:12.410 ********* ok: [managed_node1] => { "changed": false, "stat": { "atime": 1722097719.096431, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 32, "charset": "binary", "checksum": "86395ad7ce62834c967dc50f963a68f042029188", "ctime": 1722097690.6782997, "dev": 51714, "device_type": 0, "executable": true, "exists": true, "gid": 0, "gr_name": "root", "inode": 4755377, "isblk": false, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": true, "issock": false, "isuid": false, "mimetype": "application/x-pie-executable", "mode": "0755", "mtime": 1719187200.0, "nlink": 1, "path": "/usr/bin/getsubids", "pw_name": "root", "readable": true, "rgrp": true, "roth": true, "rusr": true, "size": 15728, "uid": 0, "version": "2656437169", "wgrp": false, "woth": false, "writeable": true, "wusr": true, "xgrp": true, "xoth": true, "xusr": true } } TASK [fedora.linux_system_roles.podman : Check user with getsubids] ************ task 
path: /tmp/tmp.kBSwYGXc3s/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/handle_user_group.yml:50 Saturday 27 July 2024 12:36:02 -0400 (0:00:00.402) 0:00:12.813 ********* skipping: [managed_node1] => { "changed": false, "false_condition": "__podman_user not in [\"root\", \"0\"]", "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.podman : Check group with getsubids] *********** task path: /tmp/tmp.kBSwYGXc3s/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/handle_user_group.yml:55 Saturday 27 July 2024 12:36:02 -0400 (0:00:00.030) 0:00:12.843 ********* skipping: [managed_node1] => { "changed": false, "false_condition": "__podman_user not in [\"root\", \"0\"]", "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.podman : Set user subuid and subgid info] ****** task path: /tmp/tmp.kBSwYGXc3s/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/handle_user_group.yml:60 Saturday 27 July 2024 12:36:02 -0400 (0:00:00.030) 0:00:12.874 ********* skipping: [managed_node1] => { "changed": false, "false_condition": "__podman_user not in [\"root\", \"0\"]", "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.podman : Get subuid file] ********************** task path: /tmp/tmp.kBSwYGXc3s/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/handle_user_group.yml:74 Saturday 27 July 2024 12:36:02 -0400 (0:00:00.029) 0:00:12.904 ********* skipping: [managed_node1] => { "changed": false, "false_condition": "not __podman_stat_getsubids.stat.exists", "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.podman : Get subgid file] ********************** task path: /tmp/tmp.kBSwYGXc3s/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/handle_user_group.yml:79 Saturday 27 July 2024 12:36:02 -0400 (0:00:00.030) 0:00:12.934 ********* skipping: [managed_node1] => { "changed": false, "false_condition": "not 
__podman_stat_getsubids.stat.exists", "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.podman : Set user subuid and subgid info] ****** task path: /tmp/tmp.kBSwYGXc3s/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/handle_user_group.yml:84 Saturday 27 July 2024 12:36:02 -0400 (0:00:00.029) 0:00:12.964 ********* skipping: [managed_node1] => { "changed": false, "false_condition": "not __podman_stat_getsubids.stat.exists", "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.podman : Fail if user not in subuid file] ****** task path: /tmp/tmp.kBSwYGXc3s/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/handle_user_group.yml:94 Saturday 27 July 2024 12:36:02 -0400 (0:00:00.030) 0:00:12.995 ********* skipping: [managed_node1] => { "changed": false, "false_condition": "not __podman_stat_getsubids.stat.exists", "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.podman : Fail if group not in subgid file] ***** task path: /tmp/tmp.kBSwYGXc3s/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/handle_user_group.yml:101 Saturday 27 July 2024 12:36:02 -0400 (0:00:00.029) 0:00:13.024 ********* skipping: [managed_node1] => { "changed": false, "false_condition": "not __podman_stat_getsubids.stat.exists", "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.podman : Set config file paths] **************** task path: /tmp/tmp.kBSwYGXc3s/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/main.yml:62 Saturday 27 July 2024 12:36:02 -0400 (0:00:00.030) 0:00:13.054 ********* ok: [managed_node1] => { "ansible_facts": { "__podman_container_conf_file": "/etc/containers/containers.conf.d/50-systemroles.conf", "__podman_policy_json_file": "/etc/containers/policy.json", "__podman_registries_conf_file": "/etc/containers/registries.conf.d/50-systemroles.conf", "__podman_storage_conf_file": "/etc/containers/storage.conf" }, 
"changed": false } TASK [fedora.linux_system_roles.podman : Handle container.conf.d] ************** task path: /tmp/tmp.kBSwYGXc3s/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/main.yml:71 Saturday 27 July 2024 12:36:02 -0400 (0:00:00.040) 0:00:13.095 ********* included: /tmp/tmp.kBSwYGXc3s/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/handle_container_conf_d.yml for managed_node1 TASK [fedora.linux_system_roles.podman : Ensure containers.d exists] *********** task path: /tmp/tmp.kBSwYGXc3s/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/handle_container_conf_d.yml:5 Saturday 27 July 2024 12:36:03 -0400 (0:00:00.062) 0:00:13.158 ********* skipping: [managed_node1] => { "changed": false, "false_condition": "podman_containers_conf | length > 0", "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.podman : Update container config file] ********* task path: /tmp/tmp.kBSwYGXc3s/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/handle_container_conf_d.yml:13 Saturday 27 July 2024 12:36:03 -0400 (0:00:00.032) 0:00:13.190 ********* skipping: [managed_node1] => { "changed": false, "false_condition": "podman_containers_conf | length > 0", "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.podman : Handle registries.conf.d] ************* task path: /tmp/tmp.kBSwYGXc3s/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/main.yml:74 Saturday 27 July 2024 12:36:03 -0400 (0:00:00.030) 0:00:13.221 ********* included: /tmp/tmp.kBSwYGXc3s/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/handle_registries_conf_d.yml for managed_node1 TASK [fedora.linux_system_roles.podman : Ensure registries.d exists] *********** task path: /tmp/tmp.kBSwYGXc3s/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/handle_registries_conf_d.yml:5 Saturday 27 July 2024 12:36:03 -0400 (0:00:00.097) 0:00:13.318 ********* skipping: 
[managed_node1] => { "changed": false, "false_condition": "podman_registries_conf | length > 0", "skip_reason": "Conditional result was False" }
TASK [fedora.linux_system_roles.podman : Update registries config file] ********
task path: /tmp/tmp.kBSwYGXc3s/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/handle_registries_conf_d.yml:13
Saturday 27 July 2024 12:36:03 -0400 (0:00:00.032) 0:00:13.350 *********
skipping: [managed_node1] => { "changed": false, "false_condition": "podman_registries_conf | length > 0", "skip_reason": "Conditional result was False" }
TASK [fedora.linux_system_roles.podman : Handle storage.conf] ******************
task path: /tmp/tmp.kBSwYGXc3s/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/main.yml:77
Saturday 27 July 2024 12:36:03 -0400 (0:00:00.031) 0:00:13.381 *********
included: /tmp/tmp.kBSwYGXc3s/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/handle_storage_conf.yml for managed_node1
TASK [fedora.linux_system_roles.podman : Ensure storage.conf parent dir exists] ***
task path: /tmp/tmp.kBSwYGXc3s/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/handle_storage_conf.yml:5
Saturday 27 July 2024 12:36:03 -0400 (0:00:00.065) 0:00:13.446 *********
skipping: [managed_node1] => { "changed": false, "false_condition": "podman_storage_conf | length > 0", "skip_reason": "Conditional result was False" }
TASK [fedora.linux_system_roles.podman : Update storage config file] ***********
task path: /tmp/tmp.kBSwYGXc3s/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/handle_storage_conf.yml:13
Saturday 27 July 2024 12:36:03 -0400 (0:00:00.031) 0:00:13.478 *********
skipping: [managed_node1] => { "changed": false, "false_condition": "podman_storage_conf | length > 0", "skip_reason": "Conditional result was False" }
TASK [fedora.linux_system_roles.podman : Handle policy.json] *******************
task path: /tmp/tmp.kBSwYGXc3s/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/main.yml:80
Saturday 27 July 2024 12:36:03 -0400 (0:00:00.032) 0:00:13.510 *********
included: /tmp/tmp.kBSwYGXc3s/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/handle_policy_json.yml for managed_node1
TASK [fedora.linux_system_roles.podman : Ensure policy.json parent dir exists] ***
task path: /tmp/tmp.kBSwYGXc3s/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/handle_policy_json.yml:6
Saturday 27 July 2024 12:36:03 -0400 (0:00:00.067) 0:00:13.578 *********
skipping: [managed_node1] => { "changed": false, "false_condition": "podman_policy_json | length > 0", "skip_reason": "Conditional result was False" }
TASK [fedora.linux_system_roles.podman : Stat the policy.json file] ************
task path: /tmp/tmp.kBSwYGXc3s/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/handle_policy_json.yml:14
Saturday 27 July 2024 12:36:03 -0400 (0:00:00.031) 0:00:13.609 *********
skipping: [managed_node1] => { "changed": false, "false_condition": "podman_policy_json | length > 0", "skip_reason": "Conditional result was False" }
TASK [fedora.linux_system_roles.podman : Get the existing policy.json] *********
task path: /tmp/tmp.kBSwYGXc3s/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/handle_policy_json.yml:19
Saturday 27 July 2024 12:36:03 -0400 (0:00:00.032) 0:00:13.641 *********
skipping: [managed_node1] => { "changed": false, "false_condition": "podman_policy_json | length > 0", "skip_reason": "Conditional result was False" }
TASK [fedora.linux_system_roles.podman : Write new policy.json file] ***********
task path: /tmp/tmp.kBSwYGXc3s/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/handle_policy_json.yml:25
Saturday 27 July 2024 12:36:03 -0400 (0:00:00.031) 0:00:13.673 *********
skipping: [managed_node1] => { "changed": false, "false_condition": "podman_policy_json | length > 0", "skip_reason": "Conditional result was False" }
TASK [Manage firewall for specified ports] *************************************
task path: /tmp/tmp.kBSwYGXc3s/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/main.yml:86
Saturday 27 July 2024 12:36:03 -0400 (0:00:00.032) 0:00:13.705 *********
skipping: [managed_node1] => { "changed": false, "false_condition": "podman_firewall | length > 0", "skip_reason": "Conditional result was False" }
TASK [Manage selinux for specified ports] **************************************
task path: /tmp/tmp.kBSwYGXc3s/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/main.yml:93
Saturday 27 July 2024 12:36:03 -0400 (0:00:00.031) 0:00:13.736 *********
skipping: [managed_node1] => { "changed": false, "false_condition": "podman_selinux_ports | length > 0", "skip_reason": "Conditional result was False" }
TASK [fedora.linux_system_roles.podman : Keep track of users that need to cancel linger] ***
task path: /tmp/tmp.kBSwYGXc3s/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/main.yml:100
Saturday 27 July 2024 12:36:03 -0400 (0:00:00.032) 0:00:13.768 *********
ok: [managed_node1] => { "ansible_facts": { "__podman_cancel_user_linger": [] }, "changed": false }
TASK [fedora.linux_system_roles.podman : Handle certs.d files - present] *******
task path: /tmp/tmp.kBSwYGXc3s/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/main.yml:104
Saturday 27 July 2024 12:36:03 -0400 (0:00:00.030) 0:00:13.799 *********
skipping: [managed_node1] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false }
TASK [fedora.linux_system_roles.podman : Handle credential files - present] ****
task path: /tmp/tmp.kBSwYGXc3s/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/main.yml:113
Saturday 27 July 2024 12:36:03 -0400 (0:00:00.028) 0:00:13.828 *********
skipping: [managed_node1] => { "censored": "the output has been hidden due to the fact that
'no_log: true' was specified for this result", "changed": false }
TASK [fedora.linux_system_roles.podman : Handle secrets] ***********************
task path: /tmp/tmp.kBSwYGXc3s/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/main.yml:122
Saturday 27 July 2024 12:36:03 -0400 (0:00:00.027) 0:00:13.856 *********
skipping: [managed_node1] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false }
TASK [fedora.linux_system_roles.podman : Handle Kubernetes specifications] *****
task path: /tmp/tmp.kBSwYGXc3s/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/main.yml:129
Saturday 27 July 2024 12:36:03 -0400 (0:00:00.028) 0:00:13.884 *********
skipping: [managed_node1] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false }
TASK [fedora.linux_system_roles.podman : Handle Quadlet specifications] ********
task path: /tmp/tmp.kBSwYGXc3s/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/main.yml:136
Saturday 27 July 2024 12:36:03 -0400 (0:00:00.027) 0:00:13.912 *********
included: /tmp/tmp.kBSwYGXc3s/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/handle_quadlet_spec.yml for managed_node1 => (item=(censored due to no_log))
TASK [fedora.linux_system_roles.podman : Set per-container variables part 0] ***
task path: /tmp/tmp.kBSwYGXc3s/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/handle_quadlet_spec.yml:14
Saturday 27 July 2024 12:36:03 -0400 (0:00:00.117) 0:00:14.029 *********
ok: [managed_node1] => { "ansible_facts": { "__podman_quadlet_file_src": "", "__podman_quadlet_spec": { "Container": { "ContainerName": "bogus", "Image": "this_is_a_bogus_image" }, "Install": { "WantedBy": "default.target" } }, "__podman_quadlet_str": "", "__podman_quadlet_template_src": "" }, "changed": false }
TASK [fedora.linux_system_roles.podman : Set per-container variables part 1] ***
task path: /tmp/tmp.kBSwYGXc3s/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/handle_quadlet_spec.yml:25
Saturday 27 July 2024 12:36:03 -0400 (0:00:00.043) 0:00:14.072 *********
ok: [managed_node1] => { "ansible_facts": { "__podman_continue_if_pull_fails": true, "__podman_pull_image": true, "__podman_state": "created", "__podman_systemd_unit_scope": "", "__podman_user": "root" }, "changed": false }
TASK [fedora.linux_system_roles.podman : Fail if no quadlet spec is given] *****
task path: /tmp/tmp.kBSwYGXc3s/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/handle_quadlet_spec.yml:35
Saturday 27 July 2024 12:36:03 -0400 (0:00:00.041) 0:00:14.114 *********
skipping: [managed_node1] => { "changed": false, "false_condition": "__podman_quadlet_spec | length == 0", "skip_reason": "Conditional result was False" }
TASK [fedora.linux_system_roles.podman : Set per-container variables part 2] ***
task path: /tmp/tmp.kBSwYGXc3s/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/handle_quadlet_spec.yml:48
Saturday 27 July 2024 12:36:04 -0400 (0:00:00.033) 0:00:14.147 *********
ok: [managed_node1] => { "ansible_facts": { "__podman_quadlet_name": "bogus", "__podman_quadlet_type": "container", "__podman_rootless": false }, "changed": false }
TASK [fedora.linux_system_roles.podman : Check user and group information] *****
task path: /tmp/tmp.kBSwYGXc3s/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/handle_quadlet_spec.yml:57
Saturday 27 July 2024 12:36:04 -0400 (0:00:00.049) 0:00:14.196 *********
included: /tmp/tmp.kBSwYGXc3s/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/handle_user_group.yml for managed_node1
TASK [fedora.linux_system_roles.podman : Get user information] *****************
task path: /tmp/tmp.kBSwYGXc3s/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/handle_user_group.yml:2
Saturday 27 July 2024 12:36:04 -0400 (0:00:00.063) 0:00:14.259
********* skipping: [managed_node1] => { "changed": false, "false_condition": "'getent_passwd' not in ansible_facts or __podman_user not in ansible_facts['getent_passwd']", "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.podman : Fail if user does not exist] ********** task path: /tmp/tmp.kBSwYGXc3s/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/handle_user_group.yml:9 Saturday 27 July 2024 12:36:04 -0400 (0:00:00.035) 0:00:14.294 ********* skipping: [managed_node1] => { "changed": false, "false_condition": "not ansible_facts[\"getent_passwd\"][__podman_user]", "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.podman : Set group for podman user] ************ task path: /tmp/tmp.kBSwYGXc3s/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/handle_user_group.yml:16 Saturday 27 July 2024 12:36:04 -0400 (0:00:00.035) 0:00:14.330 ********* ok: [managed_node1] => { "ansible_facts": { "__podman_group": "0" }, "changed": false } TASK [fedora.linux_system_roles.podman : Get group information] **************** task path: /tmp/tmp.kBSwYGXc3s/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/handle_user_group.yml:28 Saturday 27 July 2024 12:36:04 -0400 (0:00:00.044) 0:00:14.374 ********* ok: [managed_node1] => { "ansible_facts": { "getent_group": { "root": [ "x", "0", "" ] } }, "changed": false } TASK [fedora.linux_system_roles.podman : Set group name] *********************** task path: /tmp/tmp.kBSwYGXc3s/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/handle_user_group.yml:35 Saturday 27 July 2024 12:36:04 -0400 (0:00:00.402) 0:00:14.777 ********* ok: [managed_node1] => { "ansible_facts": { "__podman_group_name": "root" }, "changed": false } TASK [fedora.linux_system_roles.podman : See if getsubids exists] ************** task path: /tmp/tmp.kBSwYGXc3s/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/handle_user_group.yml:39 Saturday 27 
July 2024 12:36:04 -0400 (0:00:00.041) 0:00:14.818 ********* ok: [managed_node1] => { "changed": false, "stat": { "atime": 1722097719.096431, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 32, "charset": "binary", "checksum": "86395ad7ce62834c967dc50f963a68f042029188", "ctime": 1722097690.6782997, "dev": 51714, "device_type": 0, "executable": true, "exists": true, "gid": 0, "gr_name": "root", "inode": 4755377, "isblk": false, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": true, "issock": false, "isuid": false, "mimetype": "application/x-pie-executable", "mode": "0755", "mtime": 1719187200.0, "nlink": 1, "path": "/usr/bin/getsubids", "pw_name": "root", "readable": true, "rgrp": true, "roth": true, "rusr": true, "size": 15728, "uid": 0, "version": "2656437169", "wgrp": false, "woth": false, "writeable": true, "wusr": true, "xgrp": true, "xoth": true, "xusr": true } } TASK [fedora.linux_system_roles.podman : Check user with getsubids] ************ task path: /tmp/tmp.kBSwYGXc3s/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/handle_user_group.yml:50 Saturday 27 July 2024 12:36:05 -0400 (0:00:00.395) 0:00:15.214 ********* skipping: [managed_node1] => { "changed": false, "false_condition": "__podman_user not in [\"root\", \"0\"]", "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.podman : Check group with getsubids] *********** task path: /tmp/tmp.kBSwYGXc3s/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/handle_user_group.yml:55 Saturday 27 July 2024 12:36:05 -0400 (0:00:00.029) 0:00:15.243 ********* skipping: [managed_node1] => { "changed": false, "false_condition": "__podman_user not in [\"root\", \"0\"]", "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.podman : Set user subuid and subgid info] ****** task path: 
/tmp/tmp.kBSwYGXc3s/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/handle_user_group.yml:60 Saturday 27 July 2024 12:36:05 -0400 (0:00:00.030) 0:00:15.274 ********* skipping: [managed_node1] => { "changed": false, "false_condition": "__podman_user not in [\"root\", \"0\"]", "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.podman : Get subuid file] ********************** task path: /tmp/tmp.kBSwYGXc3s/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/handle_user_group.yml:74 Saturday 27 July 2024 12:36:05 -0400 (0:00:00.029) 0:00:15.304 ********* skipping: [managed_node1] => { "changed": false, "false_condition": "not __podman_stat_getsubids.stat.exists", "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.podman : Get subgid file] ********************** task path: /tmp/tmp.kBSwYGXc3s/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/handle_user_group.yml:79 Saturday 27 July 2024 12:36:05 -0400 (0:00:00.031) 0:00:15.335 ********* skipping: [managed_node1] => { "changed": false, "false_condition": "not __podman_stat_getsubids.stat.exists", "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.podman : Set user subuid and subgid info] ****** task path: /tmp/tmp.kBSwYGXc3s/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/handle_user_group.yml:84 Saturday 27 July 2024 12:36:05 -0400 (0:00:00.029) 0:00:15.365 ********* skipping: [managed_node1] => { "changed": false, "false_condition": "not __podman_stat_getsubids.stat.exists", "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.podman : Fail if user not in subuid file] ****** task path: /tmp/tmp.kBSwYGXc3s/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/handle_user_group.yml:94 Saturday 27 July 2024 12:36:05 -0400 (0:00:00.030) 0:00:15.396 ********* skipping: [managed_node1] => { "changed": false, "false_condition": "not 
__podman_stat_getsubids.stat.exists", "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.podman : Fail if group not in subgid file] ***** task path: /tmp/tmp.kBSwYGXc3s/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/handle_user_group.yml:101 Saturday 27 July 2024 12:36:05 -0400 (0:00:00.029) 0:00:15.426 ********* skipping: [managed_node1] => { "changed": false, "false_condition": "not __podman_stat_getsubids.stat.exists", "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.podman : Set per-container variables part 3] *** task path: /tmp/tmp.kBSwYGXc3s/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/handle_quadlet_spec.yml:62 Saturday 27 July 2024 12:36:05 -0400 (0:00:00.066) 0:00:15.492 ********* ok: [managed_node1] => { "ansible_facts": { "__podman_activate_systemd_unit": false, "__podman_images_found": [ "this_is_a_bogus_image" ], "__podman_kube_yamls_raw": "", "__podman_service_name": "bogus.service", "__podman_systemd_scope": "system", "__podman_user_home_dir": "/root", "__podman_xdg_runtime_dir": "/run/user/0" }, "changed": false } TASK [fedora.linux_system_roles.podman : Set per-container variables part 4] *** task path: /tmp/tmp.kBSwYGXc3s/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/handle_quadlet_spec.yml:73 Saturday 27 July 2024 12:36:05 -0400 (0:00:00.054) 0:00:15.546 ********* ok: [managed_node1] => { "ansible_facts": { "__podman_quadlet_path": "/etc/containers/systemd" }, "changed": false } TASK [fedora.linux_system_roles.podman : Get kube yaml contents] *************** task path: /tmp/tmp.kBSwYGXc3s/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/handle_quadlet_spec.yml:77 Saturday 27 July 2024 12:36:05 -0400 (0:00:00.035) 0:00:15.582 ********* skipping: [managed_node1] => { "changed": false, "false_condition": "__podman_kube_yamls_raw | length > 0", "skip_reason": "Conditional result was False" } TASK 
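The facts logged above (`__podman_quadlet_spec` with `Container`/`Install` sections, `__podman_user: root`, `__podman_state: "created"`, service name `bogus.service`) come from a quadlet spec the test passes to the role; the actual test input is hidden by `no_log`. A hypothetical reconstruction of what that input plausibly looks like, assuming the role's documented `podman_quadlet_specs` variable — the real test vars may differ:

```yaml
# Hypothetical sketch reconstructed from the logged facts; the real test
# input is censored by no_log. Key names under Container/Install are taken
# verbatim from the __podman_quadlet_spec fact above.
- name: Run role - do not pull images
  include_role:
    name: fedora.linux_system_roles.podman
  vars:
    podman_quadlet_specs:
      - name: bogus                     # yields bogus.container / bogus.service
        type: container
        state: created                  # matches __podman_state: "created"
        continue_if_pull_fails: true    # matches __podman_continue_if_pull_fails
        Container:
          ContainerName: bogus
          Image: this_is_a_bogus_image
        Install:
          WantedBy: default.target
```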
[fedora.linux_system_roles.podman : Set per-container variables part 5] *** task path: /tmp/tmp.kBSwYGXc3s/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/handle_quadlet_spec.yml:87 Saturday 27 July 2024 12:36:05 -0400 (0:00:00.032) 0:00:15.614 ********* ok: [managed_node1] => { "ansible_facts": { "__podman_images": [ "this_is_a_bogus_image" ], "__podman_quadlet_file": "/etc/containers/systemd/bogus.container", "__podman_volumes": [] }, "changed": false } TASK [fedora.linux_system_roles.podman : Set per-container variables part 6] *** task path: /tmp/tmp.kBSwYGXc3s/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/handle_quadlet_spec.yml:103 Saturday 27 July 2024 12:36:05 -0400 (0:00:00.079) 0:00:15.693 ********* ok: [managed_node1] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } TASK [fedora.linux_system_roles.podman : Cleanup quadlets] ********************* task path: /tmp/tmp.kBSwYGXc3s/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/handle_quadlet_spec.yml:110 Saturday 27 July 2024 12:36:05 -0400 (0:00:00.038) 0:00:15.732 ********* skipping: [managed_node1] => { "changed": false, "false_condition": "__podman_state == \"absent\"", "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.podman : Create and update quadlets] *********** task path: /tmp/tmp.kBSwYGXc3s/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/handle_quadlet_spec.yml:114 Saturday 27 July 2024 12:36:05 -0400 (0:00:00.030) 0:00:15.763 ********* included: /tmp/tmp.kBSwYGXc3s/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/create_update_quadlet_spec.yml for managed_node1 TASK [fedora.linux_system_roles.podman : Manage linger] ************************ task path: /tmp/tmp.kBSwYGXc3s/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/create_update_quadlet_spec.yml:2 Saturday 27 July 2024 12:36:05 
-0400 (0:00:00.069) 0:00:15.833 ********* included: /tmp/tmp.kBSwYGXc3s/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/manage_linger.yml for managed_node1 TASK [fedora.linux_system_roles.podman : Enable linger if needed] ************** task path: /tmp/tmp.kBSwYGXc3s/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/manage_linger.yml:12 Saturday 27 July 2024 12:36:05 -0400 (0:00:00.053) 0:00:15.887 ********* skipping: [managed_node1] => { "changed": false, "false_condition": "__podman_rootless | bool", "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.podman : Mark user as not yet needing to cancel linger] *** task path: /tmp/tmp.kBSwYGXc3s/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/manage_linger.yml:18 Saturday 27 July 2024 12:36:05 -0400 (0:00:00.030) 0:00:15.917 ********* skipping: [managed_node1] => { "changed": false, "false_condition": "__podman_rootless | bool", "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.podman : Mark user for possible linger cancel] *** task path: /tmp/tmp.kBSwYGXc3s/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/manage_linger.yml:22 Saturday 27 July 2024 12:36:05 -0400 (0:00:00.030) 0:00:15.948 ********* skipping: [managed_node1] => { "changed": false, "false_condition": "__podman_rootless | bool", "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.podman : Create host directories] ************** task path: /tmp/tmp.kBSwYGXc3s/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/create_update_quadlet_spec.yml:7 Saturday 27 July 2024 12:36:05 -0400 (0:00:00.030) 0:00:15.978 ********* skipping: [managed_node1] => { "changed": false, "skipped_reason": "No items in the list" } TASK [fedora.linux_system_roles.podman : Ensure container images are present] *** task path: 
/tmp/tmp.kBSwYGXc3s/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/create_update_quadlet_spec.yml:18 Saturday 27 July 2024 12:36:05 -0400 (0:00:00.028) 0:00:16.007 ********* ok: [managed_node1] => (item=None) => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } ok: [managed_node1] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } TASK [fedora.linux_system_roles.podman : Ensure the quadlet directory is present] *** task path: /tmp/tmp.kBSwYGXc3s/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/create_update_quadlet_spec.yml:39 Saturday 27 July 2024 12:36:06 -0400 (0:00:00.698) 0:00:16.706 ********* ok: [managed_node1] => { "changed": false, "gid": 0, "group": "root", "mode": "0755", "owner": "root", "path": "/etc/containers/systemd", "secontext": "system_u:object_r:etc_t:s0", "size": 30, "state": "directory", "uid": 0 } TASK [fedora.linux_system_roles.podman : Ensure quadlet file is copied] ******** task path: /tmp/tmp.kBSwYGXc3s/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/create_update_quadlet_spec.yml:48 Saturday 27 July 2024 12:36:06 -0400 (0:00:00.412) 0:00:17.119 ********* skipping: [managed_node1] => { "changed": false, "false_condition": "__podman_quadlet_file_src | length > 0", "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.podman : Ensure quadlet file content is present] *** task path: /tmp/tmp.kBSwYGXc3s/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/create_update_quadlet_spec.yml:58 Saturday 27 July 2024 12:36:07 -0400 (0:00:00.032) 0:00:17.151 ********* skipping: [managed_node1] => { "changed": false, "false_condition": "__podman_quadlet_str | length > 0", "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.podman : Ensure quadlet file is present] ******* task 
path: /tmp/tmp.kBSwYGXc3s/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/create_update_quadlet_spec.yml:70 Saturday 27 July 2024 12:36:07 -0400 (0:00:00.034) 0:00:17.186 ********* changed: [managed_node1] => { "changed": true, "checksum": "1d087e679d135214e8ac9ccaf33b2222916efb7f", "dest": "/etc/containers/systemd/bogus.container", "gid": 0, "group": "root", "md5sum": "97480a9a73734d9f8007d2c06e7fed1f", "mode": "0644", "owner": "root", "secontext": "system_u:object_r:etc_t:s0", "size": 138, "src": "/root/.ansible/tmp/ansible-tmp-1722098167.0931985-12082-65274845531540/.source.container", "state": "file", "uid": 0 } TASK [fedora.linux_system_roles.podman : Reload systemctl] ********************* task path: /tmp/tmp.kBSwYGXc3s/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/create_update_quadlet_spec.yml:82 Saturday 27 July 2024 12:36:07 -0400 (0:00:00.752) 0:00:17.939 ********* skipping: [managed_node1] => { "changed": false, "false_condition": "__podman_activate_systemd_unit | bool", "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.podman : Start service] ************************ task path: /tmp/tmp.kBSwYGXc3s/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/create_update_quadlet_spec.yml:110 Saturday 27 July 2024 12:36:07 -0400 (0:00:00.035) 0:00:17.974 ********* skipping: [managed_node1] => { "changed": false, "false_condition": "__podman_activate_systemd_unit | bool", "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.podman : Restart service] ********************** task path: /tmp/tmp.kBSwYGXc3s/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/create_update_quadlet_spec.yml:125 Saturday 27 July 2024 12:36:07 -0400 (0:00:00.036) 0:00:18.011 ********* skipping: [managed_node1] => { "changed": false, "false_condition": "__podman_activate_systemd_unit | bool", "skip_reason": "Conditional result was False" } TASK 
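The "Ensure quadlet file is present" task above wrote a 138-byte `/etc/containers/systemd/bogus.container`. An approximate sketch of that unit file, assuming the role renders the `Container` and `Install` sections of the spec directly as quadlet INI sections (the role may prepend a header comment, so this is not the byte-exact file):

```ini
# /etc/containers/systemd/bogus.container -- approximate reconstruction
[Container]
ContainerName=bogus
Image=this_is_a_bogus_image

[Install]
WantedBy=default.target
```

With `__podman_activate_systemd_unit: false`, the daemon-reload/start/restart tasks were skipped, so this file is written but `bogus.service` is never generated into a running unit during this test.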
[fedora.linux_system_roles.podman : Cancel linger] ************************ task path: /tmp/tmp.kBSwYGXc3s/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/main.yml:143 Saturday 27 July 2024 12:36:07 -0400 (0:00:00.079) 0:00:18.090 ********* skipping: [managed_node1] => { "changed": false, "skipped_reason": "No items in the list" } TASK [fedora.linux_system_roles.podman : Handle credential files - absent] ***** task path: /tmp/tmp.kBSwYGXc3s/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/main.yml:149 Saturday 27 July 2024 12:36:07 -0400 (0:00:00.028) 0:00:18.119 ********* skipping: [managed_node1] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } TASK [fedora.linux_system_roles.podman : Handle certs.d files - absent] ******** task path: /tmp/tmp.kBSwYGXc3s/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/main.yml:158 Saturday 27 July 2024 12:36:08 -0400 (0:00:00.030) 0:00:18.149 ********* skipping: [managed_node1] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } TASK [Verify image not pulled and no error] ************************************ task path: /tmp/tmp.kBSwYGXc3s/ansible_collections/fedora/linux_system_roles/tests/podman/tests_quadlet_basic.yml:90 Saturday 27 July 2024 12:36:08 -0400 (0:00:00.046) 0:00:18.196 ********* ok: [managed_node1] => { "changed": false } MSG: All assertions passed TASK [Cleanup] ***************************************************************** task path: /tmp/tmp.kBSwYGXc3s/ansible_collections/fedora/linux_system_roles/tests/podman/tests_quadlet_basic.yml:97 Saturday 27 July 2024 12:36:08 -0400 (0:00:00.037) 0:00:18.233 ********* included: fedora.linux_system_roles.podman for managed_node1 => (item=nopull) included: fedora.linux_system_roles.podman for managed_node1 => (item=bogus) TASK [fedora.linux_system_roles.podman 
: Set platform/version specific variables] ***
task path: /tmp/tmp.kBSwYGXc3s/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/main.yml:3
Saturday 27 July 2024 12:36:08 -0400 (0:00:00.059) 0:00:18.393 *********
included: /tmp/tmp.kBSwYGXc3s/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/set_vars.yml for managed_node1
TASK [fedora.linux_system_roles.podman : Ensure ansible_facts used by role] ****
task path: /tmp/tmp.kBSwYGXc3s/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/set_vars.yml:3
Saturday 27 July 2024 12:36:08 -0400 (0:00:00.059) 0:00:18.453 *********
skipping: [managed_node1] => { "changed": false, "false_condition": "__podman_required_facts | difference(ansible_facts.keys() | list) | length > 0", "skip_reason": "Conditional result was False" }
TASK [fedora.linux_system_roles.podman : Check if system is ostree] ************
task path: /tmp/tmp.kBSwYGXc3s/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/set_vars.yml:11
Saturday 27 July 2024 12:36:08 -0400 (0:00:00.039) 0:00:18.492 *********
skipping: [managed_node1] => { "changed": false, "false_condition": "not __podman_is_ostree is defined", "skip_reason": "Conditional result was False" }
TASK [fedora.linux_system_roles.podman : Set flag to indicate system is ostree] ***
task path: /tmp/tmp.kBSwYGXc3s/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/set_vars.yml:16
Saturday 27 July 2024 12:36:08 -0400 (0:00:00.033) 0:00:18.525 *********
skipping: [managed_node1] => { "changed": false, "false_condition": "not __podman_is_ostree is defined", "skip_reason": "Conditional result was False" }
TASK [fedora.linux_system_roles.podman : Set platform/version specific variables] ***
task path: /tmp/tmp.kBSwYGXc3s/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/set_vars.yml:20
Saturday 27 July 2024 12:36:08 -0400 (0:00:00.032) 0:00:18.557 *********
[WARNING]: TASK: fedora.linux_system_roles.podman : Set platform/version specific variables: The loop variable 'item' is already in use. You should set the `loop_var` value in the `loop_control` option for the task to something else to avoid variable collisions and unexpected behavior.
ok: [managed_node1] => (item=RedHat.yml) => { "ansible_facts": { "__podman_packages": [ "podman", "shadow-utils-subid" ] }, "ansible_included_var_files": [ "/tmp/tmp.kBSwYGXc3s/ansible_collections/fedora/linux_system_roles/roles/podman/vars/RedHat.yml" ], "ansible_loop_var": "item", "changed": false, "item": "RedHat.yml" }
skipping: [managed_node1] => (item=CentOS.yml) => { "ansible_loop_var": "item", "changed": false, "false_condition": "__vars_file is file", "item": "CentOS.yml", "skip_reason": "Conditional result was False" }
ok: [managed_node1] => (item=CentOS_10.yml) => { "ansible_facts": { "__podman_packages": [ "iptables-nft", "podman", "shadow-utils-subid" ] }, "ansible_included_var_files": [ "/tmp/tmp.kBSwYGXc3s/ansible_collections/fedora/linux_system_roles/roles/podman/vars/CentOS_10.yml" ], "ansible_loop_var": "item", "changed": false, "item": "CentOS_10.yml" }
ok: [managed_node1] => (item=CentOS_10.yml) => { "ansible_facts": { "__podman_packages": [ "iptables-nft", "podman", "shadow-utils-subid" ] }, "ansible_included_var_files": [ "/tmp/tmp.kBSwYGXc3s/ansible_collections/fedora/linux_system_roles/roles/podman/vars/CentOS_10.yml" ], "ansible_loop_var": "item", "changed": false, "item": "CentOS_10.yml" }
TASK [fedora.linux_system_roles.podman : Gather the package facts] *************
task path: /tmp/tmp.kBSwYGXc3s/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/main.yml:6
Saturday 27 July 2024 12:36:08 -0400 (0:00:00.077) 0:00:18.635 *********
ok: [managed_node1] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false }
TASK [fedora.linux_system_roles.podman : Enable copr if requested] *************
task path:
/tmp/tmp.kBSwYGXc3s/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/main.yml:10 Saturday 27 July 2024 12:36:09 -0400 (0:00:00.809) 0:00:19.444 ********* skipping: [managed_node1] => { "changed": false, "false_condition": "podman_use_copr | d(false)", "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.podman : Ensure required packages are installed] *** task path: /tmp/tmp.kBSwYGXc3s/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/main.yml:14 Saturday 27 July 2024 12:36:09 -0400 (0:00:00.031) 0:00:19.476 ********* skipping: [managed_node1] => { "changed": false, "false_condition": "(__podman_packages | difference(ansible_facts.packages))", "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.podman : Get podman version] ******************* task path: /tmp/tmp.kBSwYGXc3s/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/main.yml:22 Saturday 27 July 2024 12:36:09 -0400 (0:00:00.074) 0:00:19.550 ********* ok: [managed_node1] => { "changed": false, "cmd": [ "podman", "--version" ], "delta": "0:00:00.029365", "end": "2024-07-27 12:36:09.758269", "rc": 0, "start": "2024-07-27 12:36:09.728904" } STDOUT: podman version 5.1.2 TASK [fedora.linux_system_roles.podman : Set podman version] ******************* task path: /tmp/tmp.kBSwYGXc3s/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/main.yml:28 Saturday 27 July 2024 12:36:09 -0400 (0:00:00.415) 0:00:19.965 ********* ok: [managed_node1] => { "ansible_facts": { "podman_version": "5.1.2" }, "changed": false } TASK [fedora.linux_system_roles.podman : Podman package version must be 4.2 or later] *** task path: /tmp/tmp.kBSwYGXc3s/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/main.yml:32 Saturday 27 July 2024 12:36:09 -0400 (0:00:00.036) 0:00:20.002 ********* skipping: [managed_node1] => { "changed": false, "false_condition": "podman_version is version(\"4.2\", \"<\")", "skip_reason": 
"Conditional result was False" }

TASK [fedora.linux_system_roles.podman : Podman package version must be 4.4 or later for quadlet, secrets] ***
task path: /tmp/tmp.kBSwYGXc3s/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/main.yml:39
Saturday 27 July 2024 12:36:09 -0400 (0:00:00.031) 0:00:20.033 *********
skipping: [managed_node1] => { "changed": false, "false_condition": "podman_version is version(\"4.4\", \"<\")", "skip_reason": "Conditional result was False" }

TASK [fedora.linux_system_roles.podman : Podman package version must be 4.4 or later for quadlet, secrets] ***
task path: /tmp/tmp.kBSwYGXc3s/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/main.yml:49
Saturday 27 July 2024 12:36:09 -0400 (0:00:00.035) 0:00:20.069 *********
META: end_host conditional evaluated to False, continuing execution for managed_node1
skipping: [managed_node1] => { "skip_reason": "end_host conditional evaluated to False, continuing execution for managed_node1" }
MSG: end_host conditional evaluated to false, continuing execution for managed_node1

TASK [fedora.linux_system_roles.podman : Check user and group information] *****
task path: /tmp/tmp.kBSwYGXc3s/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/main.yml:56
Saturday 27 July 2024 12:36:09 -0400 (0:00:00.036) 0:00:20.106 *********
included: /tmp/tmp.kBSwYGXc3s/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/handle_user_group.yml for managed_node1

TASK [fedora.linux_system_roles.podman : Get user information] *****************
task path: /tmp/tmp.kBSwYGXc3s/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/handle_user_group.yml:2
Saturday 27 July 2024 12:36:10 -0400 (0:00:00.069) 0:00:20.175 *********
skipping: [managed_node1] => { "changed": false, "false_condition": "'getent_passwd' not in ansible_facts or __podman_user not in ansible_facts['getent_passwd']", "skip_reason": "Conditional result was False" }

TASK [fedora.linux_system_roles.podman : Fail if user does not exist] **********
task path: /tmp/tmp.kBSwYGXc3s/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/handle_user_group.yml:9
Saturday 27 July 2024 12:36:10 -0400 (0:00:00.036) 0:00:20.212 *********
skipping: [managed_node1] => { "changed": false, "false_condition": "not ansible_facts[\"getent_passwd\"][__podman_user]", "skip_reason": "Conditional result was False" }

TASK [fedora.linux_system_roles.podman : Set group for podman user] ************
task path: /tmp/tmp.kBSwYGXc3s/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/handle_user_group.yml:16
Saturday 27 July 2024 12:36:10 -0400 (0:00:00.037) 0:00:20.249 *********
ok: [managed_node1] => { "ansible_facts": { "__podman_group": "0" }, "changed": false }

TASK [fedora.linux_system_roles.podman : Get group information] ****************
task path: /tmp/tmp.kBSwYGXc3s/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/handle_user_group.yml:28
Saturday 27 July 2024 12:36:10 -0400 (0:00:00.044) 0:00:20.293 *********
ok: [managed_node1] => { "ansible_facts": { "getent_group": { "root": [ "x", "0", "" ] } }, "changed": false }

TASK [fedora.linux_system_roles.podman : Set group name] ***********************
task path: /tmp/tmp.kBSwYGXc3s/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/handle_user_group.yml:35
Saturday 27 July 2024 12:36:10 -0400 (0:00:00.411) 0:00:20.705 *********
ok: [managed_node1] => { "ansible_facts": { "__podman_group_name": "root" }, "changed": false }

TASK [fedora.linux_system_roles.podman : See if getsubids exists] **************
task path: /tmp/tmp.kBSwYGXc3s/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/handle_user_group.yml:39
Saturday 27 July 2024 12:36:10 -0400 (0:00:00.041) 0:00:20.747 *********
ok: [managed_node1] => { "changed": false, "stat": { "atime": 1722097719.096431, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 32, "charset": "binary", "checksum": "86395ad7ce62834c967dc50f963a68f042029188", "ctime": 1722097690.6782997, "dev": 51714, "device_type": 0, "executable": true, "exists": true, "gid": 0, "gr_name": "root", "inode": 4755377, "isblk": false, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": true, "issock": false, "isuid": false, "mimetype": "application/x-pie-executable", "mode": "0755", "mtime": 1719187200.0, "nlink": 1, "path": "/usr/bin/getsubids", "pw_name": "root", "readable": true, "rgrp": true, "roth": true, "rusr": true, "size": 15728, "uid": 0, "version": "2656437169", "wgrp": false, "woth": false, "writeable": true, "wusr": true, "xgrp": true, "xoth": true, "xusr": true } }

TASK [fedora.linux_system_roles.podman : Check user with getsubids] ************
task path: /tmp/tmp.kBSwYGXc3s/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/handle_user_group.yml:50
Saturday 27 July 2024 12:36:10 -0400 (0:00:00.390) 0:00:21.138 *********
skipping: [managed_node1] => { "changed": false, "false_condition": "__podman_user not in [\"root\", \"0\"]", "skip_reason": "Conditional result was False" }

TASK [fedora.linux_system_roles.podman : Check group with getsubids] ***********
task path: /tmp/tmp.kBSwYGXc3s/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/handle_user_group.yml:55
Saturday 27 July 2024 12:36:11 -0400 (0:00:00.030) 0:00:21.169 *********
skipping: [managed_node1] => { "changed": false, "false_condition": "__podman_user not in [\"root\", \"0\"]", "skip_reason": "Conditional result was False" }

TASK [fedora.linux_system_roles.podman : Set user subuid and subgid info] ******
task path: /tmp/tmp.kBSwYGXc3s/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/handle_user_group.yml:60
Saturday 27 July 2024 12:36:11 -0400 (0:00:00.031) 0:00:21.200 *********
skipping: [managed_node1] => { "changed": false, "false_condition": "__podman_user not in [\"root\", \"0\"]", "skip_reason": "Conditional result was False" }

TASK [fedora.linux_system_roles.podman : Get subuid file] **********************
task path: /tmp/tmp.kBSwYGXc3s/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/handle_user_group.yml:74
Saturday 27 July 2024 12:36:11 -0400 (0:00:00.030) 0:00:21.231 *********
skipping: [managed_node1] => { "changed": false, "false_condition": "not __podman_stat_getsubids.stat.exists", "skip_reason": "Conditional result was False" }

TASK [fedora.linux_system_roles.podman : Get subgid file] **********************
task path: /tmp/tmp.kBSwYGXc3s/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/handle_user_group.yml:79
Saturday 27 July 2024 12:36:11 -0400 (0:00:00.031) 0:00:21.262 *********
skipping: [managed_node1] => { "changed": false, "false_condition": "not __podman_stat_getsubids.stat.exists", "skip_reason": "Conditional result was False" }

TASK [fedora.linux_system_roles.podman : Set user subuid and subgid info] ******
task path: /tmp/tmp.kBSwYGXc3s/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/handle_user_group.yml:84
Saturday 27 July 2024 12:36:11 -0400 (0:00:00.029) 0:00:21.292 *********
skipping: [managed_node1] => { "changed": false, "false_condition": "not __podman_stat_getsubids.stat.exists", "skip_reason": "Conditional result was False" }

TASK [fedora.linux_system_roles.podman : Fail if user not in subuid file] ******
task path: /tmp/tmp.kBSwYGXc3s/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/handle_user_group.yml:94
Saturday 27 July 2024 12:36:11 -0400 (0:00:00.030) 0:00:21.323 *********
skipping: [managed_node1] => { "changed": false, "false_condition": "not __podman_stat_getsubids.stat.exists", "skip_reason": "Conditional result was False" }

TASK [fedora.linux_system_roles.podman : Fail if group not in subgid file] *****
task path: /tmp/tmp.kBSwYGXc3s/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/handle_user_group.yml:101
Saturday 27 July 2024 12:36:11 -0400 (0:00:00.030) 0:00:21.353 *********
skipping: [managed_node1] => { "changed": false, "false_condition": "not __podman_stat_getsubids.stat.exists", "skip_reason": "Conditional result was False" }

TASK [fedora.linux_system_roles.podman : Set config file paths] ****************
task path: /tmp/tmp.kBSwYGXc3s/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/main.yml:62
Saturday 27 July 2024 12:36:11 -0400 (0:00:00.070) 0:00:21.424 *********
ok: [managed_node1] => { "ansible_facts": { "__podman_container_conf_file": "/etc/containers/containers.conf.d/50-systemroles.conf", "__podman_policy_json_file": "/etc/containers/policy.json", "__podman_registries_conf_file": "/etc/containers/registries.conf.d/50-systemroles.conf", "__podman_storage_conf_file": "/etc/containers/storage.conf" }, "changed": false }

TASK [fedora.linux_system_roles.podman : Handle container.conf.d] **************
task path: /tmp/tmp.kBSwYGXc3s/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/main.yml:71
Saturday 27 July 2024 12:36:11 -0400 (0:00:00.042) 0:00:21.466 *********
included: /tmp/tmp.kBSwYGXc3s/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/handle_container_conf_d.yml for managed_node1

TASK [fedora.linux_system_roles.podman : Ensure containers.d exists] ***********
task path: /tmp/tmp.kBSwYGXc3s/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/handle_container_conf_d.yml:5
Saturday 27 July 2024 12:36:11 -0400 (0:00:00.065) 0:00:21.532 *********
skipping: [managed_node1] => { "changed": false, "false_condition": "podman_containers_conf | length > 0", "skip_reason": "Conditional result was False" }

TASK [fedora.linux_system_roles.podman : Update container config file] *********
task path: /tmp/tmp.kBSwYGXc3s/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/handle_container_conf_d.yml:13
Saturday 27 July 2024 12:36:11 -0400 (0:00:00.032) 0:00:21.564 *********
skipping: [managed_node1] => { "changed": false, "false_condition": "podman_containers_conf | length > 0", "skip_reason": "Conditional result was False" }

TASK [fedora.linux_system_roles.podman : Handle registries.conf.d] *************
task path: /tmp/tmp.kBSwYGXc3s/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/main.yml:74
Saturday 27 July 2024 12:36:11 -0400 (0:00:00.033) 0:00:21.598 *********
included: /tmp/tmp.kBSwYGXc3s/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/handle_registries_conf_d.yml for managed_node1

TASK [fedora.linux_system_roles.podman : Ensure registries.d exists] ***********
task path: /tmp/tmp.kBSwYGXc3s/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/handle_registries_conf_d.yml:5
Saturday 27 July 2024 12:36:11 -0400 (0:00:00.064) 0:00:21.663 *********
skipping: [managed_node1] => { "changed": false, "false_condition": "podman_registries_conf | length > 0", "skip_reason": "Conditional result was False" }

TASK [fedora.linux_system_roles.podman : Update registries config file] ********
task path: /tmp/tmp.kBSwYGXc3s/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/handle_registries_conf_d.yml:13
Saturday 27 July 2024 12:36:11 -0400 (0:00:00.031) 0:00:21.695 *********
skipping: [managed_node1] => { "changed": false, "false_condition": "podman_registries_conf | length > 0", "skip_reason": "Conditional result was False" }

TASK [fedora.linux_system_roles.podman : Handle storage.conf] ******************
task path: /tmp/tmp.kBSwYGXc3s/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/main.yml:77
Saturday 27 July 2024 12:36:11 -0400 (0:00:00.033) 0:00:21.728 *********
included: /tmp/tmp.kBSwYGXc3s/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/handle_storage_conf.yml for managed_node1

TASK [fedora.linux_system_roles.podman : Ensure storage.conf parent dir exists] ***
task path: /tmp/tmp.kBSwYGXc3s/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/handle_storage_conf.yml:5
Saturday 27 July 2024 12:36:11 -0400 (0:00:00.066) 0:00:21.795 *********
skipping: [managed_node1] => { "changed": false, "false_condition": "podman_storage_conf | length > 0", "skip_reason": "Conditional result was False" }

TASK [fedora.linux_system_roles.podman : Update storage config file] ***********
task path: /tmp/tmp.kBSwYGXc3s/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/handle_storage_conf.yml:13
Saturday 27 July 2024 12:36:11 -0400 (0:00:00.031) 0:00:21.827 *********
skipping: [managed_node1] => { "changed": false, "false_condition": "podman_storage_conf | length > 0", "skip_reason": "Conditional result was False" }

TASK [fedora.linux_system_roles.podman : Handle policy.json] *******************
task path: /tmp/tmp.kBSwYGXc3s/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/main.yml:80
Saturday 27 July 2024 12:36:11 -0400 (0:00:00.032) 0:00:21.859 *********
included: /tmp/tmp.kBSwYGXc3s/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/handle_policy_json.yml for managed_node1

TASK [fedora.linux_system_roles.podman : Ensure policy.json parent dir exists] ***
task path: /tmp/tmp.kBSwYGXc3s/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/handle_policy_json.yml:6
Saturday 27 July 2024 12:36:11 -0400 (0:00:00.068) 0:00:21.927 *********
skipping: [managed_node1] => { "changed": false, "false_condition": "podman_policy_json | length > 0", "skip_reason": "Conditional result was False" }

TASK [fedora.linux_system_roles.podman : Stat the policy.json file] ************
task path: /tmp/tmp.kBSwYGXc3s/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/handle_policy_json.yml:14
Saturday 27 July 2024 12:36:11 -0400 (0:00:00.032) 0:00:21.960 *********
skipping: [managed_node1] => { "changed": false, "false_condition": "podman_policy_json | length > 0", "skip_reason": "Conditional result was False" }

TASK [fedora.linux_system_roles.podman : Get the existing policy.json] *********
task path: /tmp/tmp.kBSwYGXc3s/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/handle_policy_json.yml:19
Saturday 27 July 2024 12:36:11 -0400 (0:00:00.031) 0:00:21.991 *********
skipping: [managed_node1] => { "changed": false, "false_condition": "podman_policy_json | length > 0", "skip_reason": "Conditional result was False" }

TASK [fedora.linux_system_roles.podman : Write new policy.json file] ***********
task path: /tmp/tmp.kBSwYGXc3s/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/handle_policy_json.yml:25
Saturday 27 July 2024 12:36:11 -0400 (0:00:00.032) 0:00:22.024 *********
skipping: [managed_node1] => { "changed": false, "false_condition": "podman_policy_json | length > 0", "skip_reason": "Conditional result was False" }

TASK [Manage firewall for specified ports] *************************************
task path: /tmp/tmp.kBSwYGXc3s/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/main.yml:86
Saturday 27 July 2024 12:36:11 -0400 (0:00:00.031) 0:00:22.055 *********
skipping: [managed_node1] => { "changed": false, "false_condition": "podman_firewall | length > 0", "skip_reason": "Conditional result was False" }

TASK [Manage selinux for specified ports] **************************************
task path: /tmp/tmp.kBSwYGXc3s/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/main.yml:93
Saturday 27 July 2024 12:36:11 -0400 (0:00:00.032) 0:00:22.088 *********
skipping: [managed_node1] => { "changed": false, "false_condition": "podman_selinux_ports | length > 0", "skip_reason": "Conditional result was False" }

TASK [fedora.linux_system_roles.podman : Keep track of users that need to cancel linger] ***
task path: /tmp/tmp.kBSwYGXc3s/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/main.yml:100
Saturday 27 July 2024 12:36:12 -0400 (0:00:00.069) 0:00:22.158 *********
ok: [managed_node1] => { "ansible_facts": { "__podman_cancel_user_linger": [] }, "changed": false }

TASK [fedora.linux_system_roles.podman : Handle certs.d files - present] *******
task path: /tmp/tmp.kBSwYGXc3s/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/main.yml:104
Saturday 27 July 2024 12:36:12 -0400 (0:00:00.033) 0:00:22.191 *********
skipping: [managed_node1] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false }

TASK [fedora.linux_system_roles.podman : Handle credential files - present] ****
task path: /tmp/tmp.kBSwYGXc3s/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/main.yml:113
Saturday 27 July 2024 12:36:12 -0400 (0:00:00.028) 0:00:22.219 *********
skipping: [managed_node1] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false }

TASK [fedora.linux_system_roles.podman : Handle secrets] ***********************
task path: /tmp/tmp.kBSwYGXc3s/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/main.yml:122
Saturday 27 July 2024 12:36:12 -0400 (0:00:00.029) 0:00:22.249 *********
skipping: [managed_node1] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false }

TASK [fedora.linux_system_roles.podman : Handle Kubernetes specifications] *****
task path: /tmp/tmp.kBSwYGXc3s/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/main.yml:129
Saturday 27 July 2024 12:36:12 -0400 (0:00:00.028) 0:00:22.277 *********
skipping: [managed_node1] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false }

TASK [fedora.linux_system_roles.podman : Handle Quadlet specifications] ********
task path: /tmp/tmp.kBSwYGXc3s/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/main.yml:136
Saturday 27 July 2024 12:36:12 -0400 (0:00:00.029) 0:00:22.307 *********
included: /tmp/tmp.kBSwYGXc3s/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/handle_quadlet_spec.yml for managed_node1 => (item=(censored due to no_log))

TASK [fedora.linux_system_roles.podman : Set per-container variables part 0] ***
task path: /tmp/tmp.kBSwYGXc3s/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/handle_quadlet_spec.yml:14
Saturday 27 July 2024 12:36:12 -0400 (0:00:00.081) 0:00:22.389 *********
ok: [managed_node1] => { "ansible_facts": { "__podman_quadlet_file_src": "", "__podman_quadlet_spec": {}, "__podman_quadlet_str": "", "__podman_quadlet_template_src": "" }, "changed": false }

TASK [fedora.linux_system_roles.podman : Set per-container variables part 1] ***
task path: /tmp/tmp.kBSwYGXc3s/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/handle_quadlet_spec.yml:25
Saturday 27 July 2024 12:36:12 -0400 (0:00:00.042) 0:00:22.431 *********
ok: [managed_node1] => { "ansible_facts": { "__podman_continue_if_pull_fails": false, "__podman_pull_image": true, "__podman_state": "absent", "__podman_systemd_unit_scope": "", "__podman_user": "root" }, "changed": false }

TASK [fedora.linux_system_roles.podman : Fail if no quadlet spec is given] *****
task path: /tmp/tmp.kBSwYGXc3s/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/handle_quadlet_spec.yml:35
Saturday 27 July 2024 12:36:12 -0400 (0:00:00.041) 0:00:22.473 *********
skipping: [managed_node1] => { "changed": false, "false_condition": "__podman_state != \"absent\"", "skip_reason": "Conditional result was False" }

TASK [fedora.linux_system_roles.podman : Set per-container variables part 2] ***
task path: /tmp/tmp.kBSwYGXc3s/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/handle_quadlet_spec.yml:48
Saturday 27 July 2024 12:36:12 -0400 (0:00:00.034) 0:00:22.508 *********
ok: [managed_node1] => { "ansible_facts": { "__podman_quadlet_name": "nopull", "__podman_quadlet_type": "container", "__podman_rootless": false }, "changed": false }

TASK [fedora.linux_system_roles.podman : Check user and group information] *****
task path: /tmp/tmp.kBSwYGXc3s/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/handle_quadlet_spec.yml:57
Saturday 27 July 2024 12:36:12 -0400 (0:00:00.050) 0:00:22.559 *********
included: /tmp/tmp.kBSwYGXc3s/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/handle_user_group.yml for managed_node1

TASK [fedora.linux_system_roles.podman : Get user information] *****************
task path: /tmp/tmp.kBSwYGXc3s/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/handle_user_group.yml:2
Saturday 27 July 2024 12:36:12 -0400 (0:00:00.062) 0:00:22.621 *********
skipping: [managed_node1] => { "changed": false, "false_condition": "'getent_passwd' not in ansible_facts or __podman_user not in ansible_facts['getent_passwd']", "skip_reason": "Conditional result was False" }

TASK [fedora.linux_system_roles.podman : Fail if user does not exist] **********
task path: /tmp/tmp.kBSwYGXc3s/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/handle_user_group.yml:9
Saturday 27 July 2024 12:36:12 -0400 (0:00:00.035) 0:00:22.657 *********
skipping: [managed_node1] => { "changed": false, "false_condition": "not ansible_facts[\"getent_passwd\"][__podman_user]", "skip_reason": "Conditional result was False" }

TASK [fedora.linux_system_roles.podman : Set group for podman user] ************
task path: /tmp/tmp.kBSwYGXc3s/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/handle_user_group.yml:16
Saturday 27 July 2024 12:36:12 -0400 (0:00:00.037) 0:00:22.694 *********
ok: [managed_node1] => { "ansible_facts": { "__podman_group": "0" }, "changed": false }

TASK [fedora.linux_system_roles.podman : Get group information] ****************
task path: /tmp/tmp.kBSwYGXc3s/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/handle_user_group.yml:28
Saturday 27 July 2024 12:36:12 -0400 (0:00:00.044) 0:00:22.738 *********
ok: [managed_node1] => { "ansible_facts": { "getent_group": { "root": [ "x", "0", "" ] } }, "changed": false }

TASK [fedora.linux_system_roles.podman : Set group name] ***********************
task path: /tmp/tmp.kBSwYGXc3s/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/handle_user_group.yml:35
Saturday 27 July 2024 12:36:13 -0400 (0:00:00.402) 0:00:23.141 *********
ok: [managed_node1] => { "ansible_facts": { "__podman_group_name": "root" }, "changed": false }

TASK [fedora.linux_system_roles.podman : See if getsubids exists] **************
task path: /tmp/tmp.kBSwYGXc3s/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/handle_user_group.yml:39
Saturday 27 July 2024 12:36:13 -0400 (0:00:00.041) 0:00:23.183 *********
ok: [managed_node1] => { "changed": false, "stat": { "atime": 1722097719.096431, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 32, "charset": "binary", "checksum": "86395ad7ce62834c967dc50f963a68f042029188", "ctime": 1722097690.6782997, "dev": 51714, "device_type": 0, "executable": true, "exists": true, "gid": 0, "gr_name": "root", "inode": 4755377, "isblk": false, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": true, "issock": false, "isuid": false, "mimetype": "application/x-pie-executable", "mode": "0755", "mtime": 1719187200.0, "nlink": 1, "path": "/usr/bin/getsubids", "pw_name": "root", "readable": true, "rgrp": true, "roth": true, "rusr": true, "size": 15728, "uid": 0, "version": "2656437169", "wgrp": false, "woth": false, "writeable": true, "wusr": true, "xgrp": true, "xoth": true, "xusr": true } }

TASK [fedora.linux_system_roles.podman : Check user with getsubids] ************
task path: /tmp/tmp.kBSwYGXc3s/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/handle_user_group.yml:50
Saturday 27 July 2024 12:36:13 -0400 (0:00:00.399) 0:00:23.583 *********
skipping: [managed_node1] => { "changed": false, "false_condition": "__podman_user not in [\"root\", \"0\"]", "skip_reason": "Conditional result was False" }

TASK [fedora.linux_system_roles.podman : Check group with getsubids] ***********
task path: /tmp/tmp.kBSwYGXc3s/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/handle_user_group.yml:55
Saturday 27 July 2024 12:36:13 -0400 (0:00:00.030) 0:00:23.614 *********
skipping: [managed_node1] => { "changed": false, "false_condition": "__podman_user not in [\"root\", \"0\"]", "skip_reason": "Conditional result was False" }

TASK [fedora.linux_system_roles.podman : Set user subuid and subgid info] ******
task path: /tmp/tmp.kBSwYGXc3s/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/handle_user_group.yml:60
Saturday 27 July 2024 12:36:13 -0400 (0:00:00.070) 0:00:23.684 *********
skipping: [managed_node1] => { "changed": false, "false_condition": "__podman_user not in [\"root\", \"0\"]", "skip_reason": "Conditional result was False" }

TASK [fedora.linux_system_roles.podman : Get subuid file] **********************
task path: /tmp/tmp.kBSwYGXc3s/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/handle_user_group.yml:74
Saturday 27 July 2024 12:36:13 -0400 (0:00:00.031) 0:00:23.716 *********
skipping: [managed_node1] => { "changed": false, "false_condition": "not __podman_stat_getsubids.stat.exists", "skip_reason": "Conditional result was False" }

TASK [fedora.linux_system_roles.podman : Get subgid file] **********************
task path: /tmp/tmp.kBSwYGXc3s/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/handle_user_group.yml:79
Saturday 27 July 2024 12:36:13 -0400 (0:00:00.030) 0:00:23.747 *********
skipping: [managed_node1] => { "changed": false, "false_condition": "not __podman_stat_getsubids.stat.exists", "skip_reason": "Conditional result was False" }

TASK [fedora.linux_system_roles.podman : Set user subuid and subgid info] ******
task path: /tmp/tmp.kBSwYGXc3s/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/handle_user_group.yml:84
Saturday 27 July 2024 12:36:13 -0400 (0:00:00.032) 0:00:23.779 *********
skipping: [managed_node1] => { "changed": false, "false_condition": "not __podman_stat_getsubids.stat.exists", "skip_reason": "Conditional result was False" }

TASK [fedora.linux_system_roles.podman : Fail if user not in subuid file] ******
task path: /tmp/tmp.kBSwYGXc3s/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/handle_user_group.yml:94
Saturday 27 July 2024 12:36:13 -0400 (0:00:00.031) 0:00:23.810 *********
skipping: [managed_node1] => { "changed": false, "false_condition": "not __podman_stat_getsubids.stat.exists", "skip_reason": "Conditional result was False" }

TASK [fedora.linux_system_roles.podman : Fail if group not in subgid file] *****
task path: /tmp/tmp.kBSwYGXc3s/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/handle_user_group.yml:101
Saturday 27 July 2024 12:36:13 -0400 (0:00:00.032) 0:00:23.843 *********
skipping: [managed_node1] => { "changed": false, "false_condition": "not __podman_stat_getsubids.stat.exists", "skip_reason": "Conditional result was False" }

TASK [fedora.linux_system_roles.podman : Set per-container variables part 3] ***
task path: /tmp/tmp.kBSwYGXc3s/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/handle_quadlet_spec.yml:62
Saturday 27 July 2024 12:36:13 -0400 (0:00:00.030) 0:00:23.874 *********
ok: [managed_node1] => { "ansible_facts": { "__podman_activate_systemd_unit": true, "__podman_images_found": [], "__podman_kube_yamls_raw": "", "__podman_service_name": "nopull.service", "__podman_systemd_scope": "system", "__podman_user_home_dir": "/root", "__podman_xdg_runtime_dir": "/run/user/0" }, "changed": false }

TASK [fedora.linux_system_roles.podman : Set per-container variables part 4] ***
task path: /tmp/tmp.kBSwYGXc3s/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/handle_quadlet_spec.yml:73
Saturday 27 July 2024 12:36:13 -0400 (0:00:00.056) 0:00:23.930 *********
ok: [managed_node1] => { "ansible_facts": { "__podman_quadlet_path": "/etc/containers/systemd" }, "changed": false }

TASK [fedora.linux_system_roles.podman : Get kube yaml contents] ***************
task path: /tmp/tmp.kBSwYGXc3s/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/handle_quadlet_spec.yml:77
Saturday 27 July 2024 12:36:13 -0400 (0:00:00.034) 0:00:23.965 *********
skipping: [managed_node1] => { "changed": false, "false_condition": "__podman_state != \"absent\"", "skip_reason": "Conditional result was False" }

TASK [fedora.linux_system_roles.podman : Set per-container variables part 5] ***
task path: /tmp/tmp.kBSwYGXc3s/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/handle_quadlet_spec.yml:87
Saturday 27 July 2024 12:36:13 -0400 (0:00:00.031) 0:00:23.996 *********
ok: [managed_node1] => { "ansible_facts": { "__podman_images": [], "__podman_quadlet_file": "/etc/containers/systemd/nopull.container", "__podman_volumes": [] }, "changed": false }

TASK [fedora.linux_system_roles.podman : Set per-container variables part 6] ***
task path: /tmp/tmp.kBSwYGXc3s/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/handle_quadlet_spec.yml:103
Saturday 27 July 2024 12:36:13 -0400 (0:00:00.079) 0:00:24.076 *********
ok: [managed_node1] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false }

TASK [fedora.linux_system_roles.podman : Cleanup quadlets] *********************
task path: /tmp/tmp.kBSwYGXc3s/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/handle_quadlet_spec.yml:110
Saturday 27 July 2024 12:36:13 -0400 (0:00:00.041) 0:00:24.117 *********
included: /tmp/tmp.kBSwYGXc3s/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/cleanup_quadlet_spec.yml for managed_node1

TASK [fedora.linux_system_roles.podman : Stat XDG_RUNTIME_DIR] *****************
task path: /tmp/tmp.kBSwYGXc3s/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/cleanup_quadlet_spec.yml:4
Saturday 27 July 2024 12:36:14 -0400 (0:00:00.079) 0:00:24.197 *********
skipping: [managed_node1] => { "changed": false, "false_condition": "__podman_rootless | bool", "skip_reason": "Conditional result was False" }

TASK [fedora.linux_system_roles.podman : Stop and disable service] *************
task path: /tmp/tmp.kBSwYGXc3s/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/cleanup_quadlet_spec.yml:12
Saturday 27 July 2024 12:36:14 -0400 (0:00:00.031) 0:00:24.228 *********
ok: [managed_node1] => { "changed": false, "failed_when_result": false }
MSG: Could not find the requested service nopull.service: host

TASK [fedora.linux_system_roles.podman : See if quadlet file exists] ***********
task path: /tmp/tmp.kBSwYGXc3s/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/cleanup_quadlet_spec.yml:33
Saturday 27 July 2024 12:36:14 -0400 (0:00:00.832) 0:00:25.061 *********
ok: [managed_node1] => { "changed": false, "stat": { "atime": 1722098159.5199149, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 8, "charset": "us-ascii", "checksum": "670d64fc68a9768edb20cad26df2acc703542d85", "ctime": 1722098159.5269148, "dev": 51714, "device_type": 0, "executable": false, "exists": true, "gid": 0, "gr_name": "root", "inode": 157286595, "isblk": false, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": true, "issock": false, "isuid": false, "mimetype": "text/plain", "mode": "0644", "mtime": 1722098159.1399264, "nlink": 1, "path": "/etc/containers/systemd/nopull.container", "pw_name": "root", "readable": true, "rgrp": true, "roth": true, "rusr": true, "size": 151, "uid": 0, "version": "3585088134", "wgrp": false, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } }

TASK [fedora.linux_system_roles.podman : Parse quadlet file] *******************
task path: /tmp/tmp.kBSwYGXc3s/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/cleanup_quadlet_spec.yml:38
Saturday 27 July 2024 12:36:15 -0400 (0:00:00.407) 0:00:25.468 *********
included: /tmp/tmp.kBSwYGXc3s/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/parse_quadlet_file.yml for managed_node1

TASK [fedora.linux_system_roles.podman : Slurp quadlet file] *******************
task path: /tmp/tmp.kBSwYGXc3s/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/parse_quadlet_file.yml:6
Saturday 27 July 2024 12:36:15 -0400 (0:00:00.060) 0:00:25.529 *********
ok: [managed_node1] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false }

TASK [fedora.linux_system_roles.podman : Parse quadlet file] *******************
task path: /tmp/tmp.kBSwYGXc3s/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/parse_quadlet_file.yml:12
Saturday 27 July 2024 12:36:15 -0400 (0:00:00.537) 0:00:26.067 *********
ok: [managed_node1] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false }

TASK [fedora.linux_system_roles.podman : Parse quadlet yaml file] **************
task path: /tmp/tmp.kBSwYGXc3s/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/parse_quadlet_file.yml:44
Saturday 27 July 2024 12:36:15 -0400 (0:00:00.052) 0:00:26.119 *********
skipping: [managed_node1] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false }

TASK [fedora.linux_system_roles.podman : Reset raw variable] *******************
task path: /tmp/tmp.kBSwYGXc3s/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/parse_quadlet_file.yml:52
Saturday 27 July 2024 12:36:16 -0400 (0:00:00.074) 0:00:26.194 *********
ok: [managed_node1] => { "ansible_facts": { "__podman_quadlet_raw": null }, "changed": false }

TASK [fedora.linux_system_roles.podman : Remove quadlet file] ******************
task path: /tmp/tmp.kBSwYGXc3s/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/cleanup_quadlet_spec.yml:42
Saturday 27 July 2024 12:36:16 -0400 (0:00:00.034) 0:00:26.228 *********
changed: [managed_node1] => { "changed": true, "path": "/etc/containers/systemd/nopull.container", "state": "absent" }

TASK [fedora.linux_system_roles.podman : Refresh systemd] **********************
task path: /tmp/tmp.kBSwYGXc3s/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/cleanup_quadlet_spec.yml:48
Saturday 27 July 2024 12:36:16 -0400 (0:00:00.408) 0:00:26.636 *********
ok: [managed_node1] => { "changed": false, "name": null, "status": {} }

TASK [fedora.linux_system_roles.podman : Remove managed resource] **************
task path: /tmp/tmp.kBSwYGXc3s/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/cleanup_quadlet_spec.yml:58
Saturday 27 July 2024 12:36:17 -0400 (0:00:00.798) 0:00:27.435 *********
ok: [managed_node1] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false }

TASK [fedora.linux_system_roles.podman : Remove volumes] ***********************
task path: /tmp/tmp.kBSwYGXc3s/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/cleanup_quadlet_spec.yml:95
Saturday 27 July 2024 12:36:17 -0400 (0:00:00.452) 0:00:27.887 *********
skipping: [managed_node1] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false }

TASK [fedora.linux_system_roles.podman : Clear parsed podman variable] *********
task path: /tmp/tmp.kBSwYGXc3s/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/cleanup_quadlet_spec.yml:112
Saturday 27 July 2024 12:36:17 -0400 (0:00:00.048) 0:00:27.936 *********
ok: [managed_node1] => { "ansible_facts": { "__podman_quadlet_parsed": null }, "changed": false }

TASK [fedora.linux_system_roles.podman : Prune images no longer in use] ********
task path: /tmp/tmp.kBSwYGXc3s/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/cleanup_quadlet_spec.yml:116
Saturday 27 July 2024 12:36:17 -0400 (0:00:00.032) 0:00:27.969 *********
skipping: [managed_node1] => { "changed": false, "false_condition": "podman_prune_images | bool", "skip_reason": "Conditional result was False" }

TASK [fedora.linux_system_roles.podman : Manage linger] ************************
task path: /tmp/tmp.kBSwYGXc3s/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/cleanup_quadlet_spec.yml:127
Saturday 27 July 2024 12:36:17 -0400 (0:00:00.034) 0:00:28.003 *********
included: /tmp/tmp.kBSwYGXc3s/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/manage_linger.yml for managed_node1

TASK [fedora.linux_system_roles.podman : Enable linger if needed] **************
task path: /tmp/tmp.kBSwYGXc3s/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/manage_linger.yml:12
Saturday 27 July 2024 12:36:17 -0400 (0:00:00.058) 0:00:28.062 *********
skipping: [managed_node1] => { "changed": false, "false_condition": "__podman_rootless | bool", "skip_reason": "Conditional result was False" }

TASK [fedora.linux_system_roles.podman : Mark user as not yet needing to cancel linger] ***
task path: /tmp/tmp.kBSwYGXc3s/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/manage_linger.yml:18
Saturday 27 July 2024 12:36:17 -0400 (0:00:00.031) 0:00:28.093 *********
skipping: [managed_node1] => { "changed": false, "false_condition": "__podman_rootless | bool", "skip_reason": "Conditional result was False" }

TASK [fedora.linux_system_roles.podman : Mark user for possible linger cancel] ***
task path: /tmp/tmp.kBSwYGXc3s/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/manage_linger.yml:22
Saturday 27 July 2024 12:36:17 -0400 (0:00:00.032) 0:00:28.125 *********
skipping: [managed_node1] => { "changed": false, "false_condition": "__podman_rootless | bool", "skip_reason": "Conditional result was False" }

TASK [fedora.linux_system_roles.podman : For testing and debugging - images] ***
task path: /tmp/tmp.kBSwYGXc3s/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/cleanup_quadlet_spec.yml:137
Saturday 27 July 2024 12:36:18 -0400 (0:00:00.031) 0:00:28.156 *********
skipping: [managed_node1] => { "changed": false, "false_condition": "__podman_test_debug | d(false)", "skip_reason": "Conditional result was False" }

TASK [fedora.linux_system_roles.podman : For testing and debugging - volumes] ***
task path: /tmp/tmp.kBSwYGXc3s/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/cleanup_quadlet_spec.yml:146
Saturday 27 July 2024 12:36:18 -0400 (0:00:00.035) 0:00:28.192 *********
skipping: [managed_node1] => { "changed": false, "false_condition": "__podman_test_debug | d(false)", "skip_reason": "Conditional result was False" }

TASK [fedora.linux_system_roles.podman : For testing and debugging - containers] ***
task path: /tmp/tmp.kBSwYGXc3s/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/cleanup_quadlet_spec.yml:155
Saturday 27 July 2024 12:36:18 -0400 (0:00:00.034) 0:00:28.226 *********
skipping: [managed_node1] => { "changed": false, "false_condition": "__podman_test_debug | d(false)", "skip_reason": "Conditional result was False" }

TASK [fedora.linux_system_roles.podman : For testing and debugging - networks] ***
task path: /tmp/tmp.kBSwYGXc3s/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/cleanup_quadlet_spec.yml:164
Saturday 27 July 2024 12:36:18 -0400 (0:00:00.035) 0:00:28.262 *********
skipping: [managed_node1] => { "changed": false, "false_condition": "__podman_test_debug | d(false)", "skip_reason": "Conditional result was False" }

TASK [fedora.linux_system_roles.podman : For testing and debugging - secrets] ***
task path: /tmp/tmp.kBSwYGXc3s/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/cleanup_quadlet_spec.yml:173
Saturday 27 July 2024 12:36:18 -0400 (0:00:00.034) 0:00:28.297 *********
skipping: [managed_node1] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false }

TASK [fedora.linux_system_roles.podman : For testing and debugging - services] ***
task path: /tmp/tmp.kBSwYGXc3s/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/cleanup_quadlet_spec.yml:183
Saturday 27 July 2024 12:36:18 -0400 (0:00:00.035) 0:00:28.332 *********
skipping: [managed_node1] => { "changed": false, "false_condition": "__podman_test_debug | d(false)", "skip_reason": "Conditional result was False" }

TASK [fedora.linux_system_roles.podman : Create and update quadlets] ***********
task path: /tmp/tmp.kBSwYGXc3s/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/handle_quadlet_spec.yml:114
Saturday 27 July 2024 12:36:18 -0400 (0:00:00.034) 0:00:28.367 *********
skipping: [managed_node1] => { "changed": false, "false_condition": "__podman_state != \"absent\"", "skip_reason": "Conditional result was False" }

TASK [fedora.linux_system_roles.podman : Cancel linger] ************************
task path: /tmp/tmp.kBSwYGXc3s/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/main.yml:143
Saturday 27 July 2024 12:36:18 -0400 (0:00:00.031) 0:00:28.399 *********
skipping: [managed_node1] => { "changed": false, "skipped_reason": "No items in the list" }

TASK [fedora.linux_system_roles.podman : Handle credential files - absent] *****
task path: /tmp/tmp.kBSwYGXc3s/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/main.yml:149
Saturday 27 July 2024 12:36:18 -0400 (0:00:00.028) 0:00:28.428 *********
skipping: [managed_node1] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false }

TASK
[fedora.linux_system_roles.podman : Handle certs.d files - absent] ******** task path: /tmp/tmp.kBSwYGXc3s/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/main.yml:158 Saturday 27 July 2024 12:36:18 -0400 (0:00:00.074) 0:00:28.502 ********* skipping: [managed_node1] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } TASK [fedora.linux_system_roles.podman : Set platform/version specific variables] *** task path: /tmp/tmp.kBSwYGXc3s/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/main.yml:3 Saturday 27 July 2024 12:36:18 -0400 (0:00:00.046) 0:00:28.549 ********* included: /tmp/tmp.kBSwYGXc3s/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/set_vars.yml for managed_node1 TASK [fedora.linux_system_roles.podman : Ensure ansible_facts used by role] **** task path: /tmp/tmp.kBSwYGXc3s/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/set_vars.yml:3 Saturday 27 July 2024 12:36:18 -0400 (0:00:00.059) 0:00:28.608 ********* skipping: [managed_node1] => { "changed": false, "false_condition": "__podman_required_facts | difference(ansible_facts.keys() | list) | length > 0", "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.podman : Check if system is ostree] ************ task path: /tmp/tmp.kBSwYGXc3s/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/set_vars.yml:11 Saturday 27 July 2024 12:36:18 -0400 (0:00:00.039) 0:00:28.648 ********* skipping: [managed_node1] => { "changed": false, "false_condition": "not __podman_is_ostree is defined", "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.podman : Set flag to indicate system is ostree] *** task path: /tmp/tmp.kBSwYGXc3s/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/set_vars.yml:16 Saturday 27 July 2024 12:36:18 -0400 (0:00:00.033) 0:00:28.681 ********* skipping: [managed_node1] => { 
"changed": false, "false_condition": "not __podman_is_ostree is defined", "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.podman : Set platform/version specific variables] *** task path: /tmp/tmp.kBSwYGXc3s/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/set_vars.yml:20 Saturday 27 July 2024 12:36:18 -0400 (0:00:00.033) 0:00:28.715 ********* ok: [managed_node1] => (item=RedHat.yml) => { "ansible_facts": { "__podman_packages": [ "podman", "shadow-utils-subid" ] }, "ansible_included_var_files": [ "/tmp/tmp.kBSwYGXc3s/ansible_collections/fedora/linux_system_roles/roles/podman/vars/RedHat.yml" ], "ansible_loop_var": "item", "changed": false, "item": "RedHat.yml" } skipping: [managed_node1] => (item=CentOS.yml) => { "ansible_loop_var": "item", "changed": false, "false_condition": "__vars_file is file", "item": "CentOS.yml", "skip_reason": "Conditional result was False" } ok: [managed_node1] => (item=CentOS_10.yml) => { "ansible_facts": { "__podman_packages": [ "iptables-nft", "podman", "shadow-utils-subid" ] }, "ansible_included_var_files": [ "/tmp/tmp.kBSwYGXc3s/ansible_collections/fedora/linux_system_roles/roles/podman/vars/CentOS_10.yml" ], "ansible_loop_var": "item", "changed": false, "item": "CentOS_10.yml" } ok: [managed_node1] => (item=CentOS_10.yml) => { "ansible_facts": { "__podman_packages": [ "iptables-nft", "podman", "shadow-utils-subid" ] }, "ansible_included_var_files": [ "/tmp/tmp.kBSwYGXc3s/ansible_collections/fedora/linux_system_roles/roles/podman/vars/CentOS_10.yml" ], "ansible_loop_var": "item", "changed": false, "item": "CentOS_10.yml" } TASK [fedora.linux_system_roles.podman : Gather the package facts] ************* task path: /tmp/tmp.kBSwYGXc3s/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/main.yml:6 Saturday 27 July 2024 12:36:18 -0400 (0:00:00.077) 0:00:28.793 ********* ok: [managed_node1] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was 
specified for this result", "changed": false } TASK [fedora.linux_system_roles.podman : Enable copr if requested] ************* task path: /tmp/tmp.kBSwYGXc3s/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/main.yml:10 Saturday 27 July 2024 12:36:19 -0400 (0:00:00.806) 0:00:29.599 ********* skipping: [managed_node1] => { "changed": false, "false_condition": "podman_use_copr | d(false)", "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.podman : Ensure required packages are installed] *** task path: /tmp/tmp.kBSwYGXc3s/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/main.yml:14 Saturday 27 July 2024 12:36:19 -0400 (0:00:00.032) 0:00:29.632 ********* skipping: [managed_node1] => { "changed": false, "false_condition": "(__podman_packages | difference(ansible_facts.packages))", "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.podman : Get podman version] ******************* task path: /tmp/tmp.kBSwYGXc3s/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/main.yml:22 Saturday 27 July 2024 12:36:19 -0400 (0:00:00.036) 0:00:29.668 ********* ok: [managed_node1] => { "changed": false, "cmd": [ "podman", "--version" ], "delta": "0:00:00.029555", "end": "2024-07-27 12:36:19.878808", "rc": 0, "start": "2024-07-27 12:36:19.849253" } STDOUT: podman version 5.1.2 TASK [fedora.linux_system_roles.podman : Set podman version] ******************* task path: /tmp/tmp.kBSwYGXc3s/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/main.yml:28 Saturday 27 July 2024 12:36:19 -0400 (0:00:00.417) 0:00:30.086 ********* ok: [managed_node1] => { "ansible_facts": { "podman_version": "5.1.2" }, "changed": false } TASK [fedora.linux_system_roles.podman : Podman package version must be 4.2 or later] *** task path: /tmp/tmp.kBSwYGXc3s/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/main.yml:32 Saturday 27 July 2024 12:36:19 -0400 (0:00:00.035) 0:00:30.122 
********* skipping: [managed_node1] => { "changed": false, "false_condition": "podman_version is version(\"4.2\", \"<\")", "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.podman : Podman package version must be 4.4 or later for quadlet, secrets] *** task path: /tmp/tmp.kBSwYGXc3s/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/main.yml:39 Saturday 27 July 2024 12:36:20 -0400 (0:00:00.031) 0:00:30.153 ********* skipping: [managed_node1] => { "changed": false, "false_condition": "podman_version is version(\"4.4\", \"<\")", "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.podman : Podman package version must be 4.4 or later for quadlet, secrets] *** task path: /tmp/tmp.kBSwYGXc3s/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/main.yml:49 Saturday 27 July 2024 12:36:20 -0400 (0:00:00.035) 0:00:30.189 ********* META: end_host conditional evaluated to False, continuing execution for managed_node1 skipping: [managed_node1] => { "skip_reason": "end_host conditional evaluated to False, continuing execution for managed_node1" } MSG: end_host conditional evaluated to false, continuing execution for managed_node1 TASK [fedora.linux_system_roles.podman : Check user and group information] ***** task path: /tmp/tmp.kBSwYGXc3s/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/main.yml:56 Saturday 27 July 2024 12:36:20 -0400 (0:00:00.035) 0:00:30.225 ********* included: /tmp/tmp.kBSwYGXc3s/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/handle_user_group.yml for managed_node1 TASK [fedora.linux_system_roles.podman : Get user information] ***************** task path: /tmp/tmp.kBSwYGXc3s/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/handle_user_group.yml:2 Saturday 27 July 2024 12:36:20 -0400 (0:00:00.110) 0:00:30.336 ********* skipping: [managed_node1] => { "changed": false, "false_condition": "'getent_passwd' not in ansible_facts or 
__podman_user not in ansible_facts['getent_passwd']", "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.podman : Fail if user does not exist] ********** task path: /tmp/tmp.kBSwYGXc3s/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/handle_user_group.yml:9 Saturday 27 July 2024 12:36:20 -0400 (0:00:00.036) 0:00:30.373 ********* skipping: [managed_node1] => { "changed": false, "false_condition": "not ansible_facts[\"getent_passwd\"][__podman_user]", "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.podman : Set group for podman user] ************ task path: /tmp/tmp.kBSwYGXc3s/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/handle_user_group.yml:16 Saturday 27 July 2024 12:36:20 -0400 (0:00:00.038) 0:00:30.411 ********* ok: [managed_node1] => { "ansible_facts": { "__podman_group": "0" }, "changed": false } TASK [fedora.linux_system_roles.podman : Get group information] **************** task path: /tmp/tmp.kBSwYGXc3s/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/handle_user_group.yml:28 Saturday 27 July 2024 12:36:20 -0400 (0:00:00.044) 0:00:30.456 ********* ok: [managed_node1] => { "ansible_facts": { "getent_group": { "root": [ "x", "0", "" ] } }, "changed": false } TASK [fedora.linux_system_roles.podman : Set group name] *********************** task path: /tmp/tmp.kBSwYGXc3s/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/handle_user_group.yml:35 Saturday 27 July 2024 12:36:20 -0400 (0:00:00.394) 0:00:30.851 ********* ok: [managed_node1] => { "ansible_facts": { "__podman_group_name": "root" }, "changed": false } TASK [fedora.linux_system_roles.podman : See if getsubids exists] ************** task path: /tmp/tmp.kBSwYGXc3s/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/handle_user_group.yml:39 Saturday 27 July 2024 12:36:20 -0400 (0:00:00.042) 0:00:30.894 ********* ok: [managed_node1] => { "changed": false, "stat": { 
"atime": 1722097719.096431, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 32, "charset": "binary", "checksum": "86395ad7ce62834c967dc50f963a68f042029188", "ctime": 1722097690.6782997, "dev": 51714, "device_type": 0, "executable": true, "exists": true, "gid": 0, "gr_name": "root", "inode": 4755377, "isblk": false, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": true, "issock": false, "isuid": false, "mimetype": "application/x-pie-executable", "mode": "0755", "mtime": 1719187200.0, "nlink": 1, "path": "/usr/bin/getsubids", "pw_name": "root", "readable": true, "rgrp": true, "roth": true, "rusr": true, "size": 15728, "uid": 0, "version": "2656437169", "wgrp": false, "woth": false, "writeable": true, "wusr": true, "xgrp": true, "xoth": true, "xusr": true } } TASK [fedora.linux_system_roles.podman : Check user with getsubids] ************ task path: /tmp/tmp.kBSwYGXc3s/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/handle_user_group.yml:50 Saturday 27 July 2024 12:36:21 -0400 (0:00:00.393) 0:00:31.287 ********* skipping: [managed_node1] => { "changed": false, "false_condition": "__podman_user not in [\"root\", \"0\"]", "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.podman : Check group with getsubids] *********** task path: /tmp/tmp.kBSwYGXc3s/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/handle_user_group.yml:55 Saturday 27 July 2024 12:36:21 -0400 (0:00:00.032) 0:00:31.320 ********* skipping: [managed_node1] => { "changed": false, "false_condition": "__podman_user not in [\"root\", \"0\"]", "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.podman : Set user subuid and subgid info] ****** task path: /tmp/tmp.kBSwYGXc3s/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/handle_user_group.yml:60 Saturday 27 July 2024 12:36:21 -0400 (0:00:00.031) 0:00:31.352 ********* skipping: [managed_node1] => { 
"changed": false, "false_condition": "__podman_user not in [\"root\", \"0\"]", "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.podman : Get subuid file] ********************** task path: /tmp/tmp.kBSwYGXc3s/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/handle_user_group.yml:74 Saturday 27 July 2024 12:36:21 -0400 (0:00:00.033) 0:00:31.385 ********* skipping: [managed_node1] => { "changed": false, "false_condition": "not __podman_stat_getsubids.stat.exists", "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.podman : Get subgid file] ********************** task path: /tmp/tmp.kBSwYGXc3s/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/handle_user_group.yml:79 Saturday 27 July 2024 12:36:21 -0400 (0:00:00.032) 0:00:31.417 ********* skipping: [managed_node1] => { "changed": false, "false_condition": "not __podman_stat_getsubids.stat.exists", "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.podman : Set user subuid and subgid info] ****** task path: /tmp/tmp.kBSwYGXc3s/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/handle_user_group.yml:84 Saturday 27 July 2024 12:36:21 -0400 (0:00:00.033) 0:00:31.451 ********* skipping: [managed_node1] => { "changed": false, "false_condition": "not __podman_stat_getsubids.stat.exists", "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.podman : Fail if user not in subuid file] ****** task path: /tmp/tmp.kBSwYGXc3s/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/handle_user_group.yml:94 Saturday 27 July 2024 12:36:21 -0400 (0:00:00.031) 0:00:31.483 ********* skipping: [managed_node1] => { "changed": false, "false_condition": "not __podman_stat_getsubids.stat.exists", "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.podman : Fail if group not in subgid file] ***** task path: 
/tmp/tmp.kBSwYGXc3s/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/handle_user_group.yml:101 Saturday 27 July 2024 12:36:21 -0400 (0:00:00.032) 0:00:31.515 ********* skipping: [managed_node1] => { "changed": false, "false_condition": "not __podman_stat_getsubids.stat.exists", "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.podman : Set config file paths] **************** task path: /tmp/tmp.kBSwYGXc3s/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/main.yml:62 Saturday 27 July 2024 12:36:21 -0400 (0:00:00.031) 0:00:31.547 ********* ok: [managed_node1] => { "ansible_facts": { "__podman_container_conf_file": "/etc/containers/containers.conf.d/50-systemroles.conf", "__podman_policy_json_file": "/etc/containers/policy.json", "__podman_registries_conf_file": "/etc/containers/registries.conf.d/50-systemroles.conf", "__podman_storage_conf_file": "/etc/containers/storage.conf" }, "changed": false } TASK [fedora.linux_system_roles.podman : Handle container.conf.d] ************** task path: /tmp/tmp.kBSwYGXc3s/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/main.yml:71 Saturday 27 July 2024 12:36:21 -0400 (0:00:00.043) 0:00:31.591 ********* included: /tmp/tmp.kBSwYGXc3s/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/handle_container_conf_d.yml for managed_node1 TASK [fedora.linux_system_roles.podman : Ensure containers.d exists] *********** task path: /tmp/tmp.kBSwYGXc3s/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/handle_container_conf_d.yml:5 Saturday 27 July 2024 12:36:21 -0400 (0:00:00.066) 0:00:31.657 ********* skipping: [managed_node1] => { "changed": false, "false_condition": "podman_containers_conf | length > 0", "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.podman : Update container config file] ********* task path: 
/tmp/tmp.kBSwYGXc3s/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/handle_container_conf_d.yml:13 Saturday 27 July 2024 12:36:21 -0400 (0:00:00.033) 0:00:31.690 ********* skipping: [managed_node1] => { "changed": false, "false_condition": "podman_containers_conf | length > 0", "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.podman : Handle registries.conf.d] ************* task path: /tmp/tmp.kBSwYGXc3s/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/main.yml:74 Saturday 27 July 2024 12:36:21 -0400 (0:00:00.034) 0:00:31.725 ********* included: /tmp/tmp.kBSwYGXc3s/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/handle_registries_conf_d.yml for managed_node1 TASK [fedora.linux_system_roles.podman : Ensure registries.d exists] *********** task path: /tmp/tmp.kBSwYGXc3s/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/handle_registries_conf_d.yml:5 Saturday 27 July 2024 12:36:21 -0400 (0:00:00.115) 0:00:31.841 ********* skipping: [managed_node1] => { "changed": false, "false_condition": "podman_registries_conf | length > 0", "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.podman : Update registries config file] ******** task path: /tmp/tmp.kBSwYGXc3s/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/handle_registries_conf_d.yml:13 Saturday 27 July 2024 12:36:21 -0400 (0:00:00.032) 0:00:31.874 ********* skipping: [managed_node1] => { "changed": false, "false_condition": "podman_registries_conf | length > 0", "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.podman : Handle storage.conf] ****************** task path: /tmp/tmp.kBSwYGXc3s/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/main.yml:77 Saturday 27 July 2024 12:36:21 -0400 (0:00:00.034) 0:00:31.908 ********* included: 
/tmp/tmp.kBSwYGXc3s/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/handle_storage_conf.yml for managed_node1 TASK [fedora.linux_system_roles.podman : Ensure storage.conf parent dir exists] *** task path: /tmp/tmp.kBSwYGXc3s/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/handle_storage_conf.yml:5 Saturday 27 July 2024 12:36:21 -0400 (0:00:00.068) 0:00:31.976 ********* skipping: [managed_node1] => { "changed": false, "false_condition": "podman_storage_conf | length > 0", "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.podman : Update storage config file] *********** task path: /tmp/tmp.kBSwYGXc3s/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/handle_storage_conf.yml:13 Saturday 27 July 2024 12:36:21 -0400 (0:00:00.033) 0:00:32.009 ********* skipping: [managed_node1] => { "changed": false, "false_condition": "podman_storage_conf | length > 0", "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.podman : Handle policy.json] ******************* task path: /tmp/tmp.kBSwYGXc3s/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/main.yml:80 Saturday 27 July 2024 12:36:21 -0400 (0:00:00.034) 0:00:32.044 ********* included: /tmp/tmp.kBSwYGXc3s/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/handle_policy_json.yml for managed_node1 TASK [fedora.linux_system_roles.podman : Ensure policy.json parent dir exists] *** task path: /tmp/tmp.kBSwYGXc3s/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/handle_policy_json.yml:6 Saturday 27 July 2024 12:36:21 -0400 (0:00:00.071) 0:00:32.115 ********* skipping: [managed_node1] => { "changed": false, "false_condition": "podman_policy_json | length > 0", "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.podman : Stat the policy.json file] ************ task path: 
/tmp/tmp.kBSwYGXc3s/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/handle_policy_json.yml:14 Saturday 27 July 2024 12:36:22 -0400 (0:00:00.033) 0:00:32.149 ********* skipping: [managed_node1] => { "changed": false, "false_condition": "podman_policy_json | length > 0", "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.podman : Get the existing policy.json] ********* task path: /tmp/tmp.kBSwYGXc3s/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/handle_policy_json.yml:19 Saturday 27 July 2024 12:36:22 -0400 (0:00:00.032) 0:00:32.182 ********* skipping: [managed_node1] => { "changed": false, "false_condition": "podman_policy_json | length > 0", "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.podman : Write new policy.json file] *********** task path: /tmp/tmp.kBSwYGXc3s/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/handle_policy_json.yml:25 Saturday 27 July 2024 12:36:22 -0400 (0:00:00.034) 0:00:32.216 ********* skipping: [managed_node1] => { "changed": false, "false_condition": "podman_policy_json | length > 0", "skip_reason": "Conditional result was False" } TASK [Manage firewall for specified ports] ************************************* task path: /tmp/tmp.kBSwYGXc3s/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/main.yml:86 Saturday 27 July 2024 12:36:22 -0400 (0:00:00.033) 0:00:32.249 ********* skipping: [managed_node1] => { "changed": false, "false_condition": "podman_firewall | length > 0", "skip_reason": "Conditional result was False" } TASK [Manage selinux for specified ports] ************************************** task path: /tmp/tmp.kBSwYGXc3s/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/main.yml:93 Saturday 27 July 2024 12:36:22 -0400 (0:00:00.034) 0:00:32.284 ********* skipping: [managed_node1] => { "changed": false, "false_condition": "podman_selinux_ports | length > 0", "skip_reason": "Conditional 
result was False" } TASK [fedora.linux_system_roles.podman : Keep track of users that need to cancel linger] *** task path: /tmp/tmp.kBSwYGXc3s/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/main.yml:100 Saturday 27 July 2024 12:36:22 -0400 (0:00:00.033) 0:00:32.317 ********* ok: [managed_node1] => { "ansible_facts": { "__podman_cancel_user_linger": [] }, "changed": false } TASK [fedora.linux_system_roles.podman : Handle certs.d files - present] ******* task path: /tmp/tmp.kBSwYGXc3s/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/main.yml:104 Saturday 27 July 2024 12:36:22 -0400 (0:00:00.035) 0:00:32.352 ********* skipping: [managed_node1] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } TASK [fedora.linux_system_roles.podman : Handle credential files - present] **** task path: /tmp/tmp.kBSwYGXc3s/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/main.yml:113 Saturday 27 July 2024 12:36:22 -0400 (0:00:00.029) 0:00:32.382 ********* skipping: [managed_node1] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } TASK [fedora.linux_system_roles.podman : Handle secrets] *********************** task path: /tmp/tmp.kBSwYGXc3s/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/main.yml:122 Saturday 27 July 2024 12:36:22 -0400 (0:00:00.030) 0:00:32.413 ********* skipping: [managed_node1] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } TASK [fedora.linux_system_roles.podman : Handle Kubernetes specifications] ***** task path: /tmp/tmp.kBSwYGXc3s/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/main.yml:129 Saturday 27 July 2024 12:36:22 -0400 (0:00:00.029) 0:00:32.442 ********* skipping: [managed_node1] => { "censored": "the output has been hidden due to 
the fact that 'no_log: true' was specified for this result", "changed": false } TASK [fedora.linux_system_roles.podman : Handle Quadlet specifications] ******** task path: /tmp/tmp.kBSwYGXc3s/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/main.yml:136 Saturday 27 July 2024 12:36:22 -0400 (0:00:00.030) 0:00:32.472 ********* included: /tmp/tmp.kBSwYGXc3s/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/handle_quadlet_spec.yml for managed_node1 => (item=(censored due to no_log)) TASK [fedora.linux_system_roles.podman : Set per-container variables part 0] *** task path: /tmp/tmp.kBSwYGXc3s/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/handle_quadlet_spec.yml:14 Saturday 27 July 2024 12:36:22 -0400 (0:00:00.135) 0:00:32.607 ********* ok: [managed_node1] => { "ansible_facts": { "__podman_quadlet_file_src": "", "__podman_quadlet_spec": {}, "__podman_quadlet_str": "", "__podman_quadlet_template_src": "" }, "changed": false } TASK [fedora.linux_system_roles.podman : Set per-container variables part 1] *** task path: /tmp/tmp.kBSwYGXc3s/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/handle_quadlet_spec.yml:25 Saturday 27 July 2024 12:36:22 -0400 (0:00:00.044) 0:00:32.652 ********* ok: [managed_node1] => { "ansible_facts": { "__podman_continue_if_pull_fails": false, "__podman_pull_image": true, "__podman_state": "absent", "__podman_systemd_unit_scope": "", "__podman_user": "root" }, "changed": false } TASK [fedora.linux_system_roles.podman : Fail if no quadlet spec is given] ***** task path: /tmp/tmp.kBSwYGXc3s/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/handle_quadlet_spec.yml:35 Saturday 27 July 2024 12:36:22 -0400 (0:00:00.043) 0:00:32.696 ********* skipping: [managed_node1] => { "changed": false, "false_condition": "__podman_state != \"absent\"", "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.podman : Set per-container variables part 2] *** task 
path: /tmp/tmp.kBSwYGXc3s/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/handle_quadlet_spec.yml:48
Saturday 27 July 2024 12:36:22 -0400 (0:00:00.037) 0:00:32.733 *********
ok: [managed_node1] => { "ansible_facts": { "__podman_quadlet_name": "bogus", "__podman_quadlet_type": "container", "__podman_rootless": false }, "changed": false }

TASK [fedora.linux_system_roles.podman : Check user and group information] *****
task path: /tmp/tmp.kBSwYGXc3s/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/handle_quadlet_spec.yml:57
Saturday 27 July 2024 12:36:22 -0400 (0:00:00.052) 0:00:32.786 *********
included: /tmp/tmp.kBSwYGXc3s/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/handle_user_group.yml for managed_node1

TASK [fedora.linux_system_roles.podman : Get user information] *****************
task path: /tmp/tmp.kBSwYGXc3s/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/handle_user_group.yml:2
Saturday 27 July 2024 12:36:22 -0400 (0:00:00.065) 0:00:32.851 *********
skipping: [managed_node1] => { "changed": false, "false_condition": "'getent_passwd' not in ansible_facts or __podman_user not in ansible_facts['getent_passwd']", "skip_reason": "Conditional result was False" }

TASK [fedora.linux_system_roles.podman : Fail if user does not exist] **********
task path: /tmp/tmp.kBSwYGXc3s/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/handle_user_group.yml:9
Saturday 27 July 2024 12:36:22 -0400 (0:00:00.036) 0:00:32.888 *********
skipping: [managed_node1] => { "changed": false, "false_condition": "not ansible_facts[\"getent_passwd\"][__podman_user]", "skip_reason": "Conditional result was False" }

TASK [fedora.linux_system_roles.podman : Set group for podman user] ************
task path: /tmp/tmp.kBSwYGXc3s/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/handle_user_group.yml:16
Saturday 27 July 2024 12:36:22 -0400 (0:00:00.038) 0:00:32.927 *********
ok: [managed_node1] => { "ansible_facts": { "__podman_group": "0" }, "changed": false }

TASK [fedora.linux_system_roles.podman : Get group information] ****************
task path: /tmp/tmp.kBSwYGXc3s/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/handle_user_group.yml:28
Saturday 27 July 2024 12:36:22 -0400 (0:00:00.045) 0:00:32.972 *********
ok: [managed_node1] => { "ansible_facts": { "getent_group": { "root": [ "x", "0", "" ] } }, "changed": false }

TASK [fedora.linux_system_roles.podman : Set group name] ***********************
task path: /tmp/tmp.kBSwYGXc3s/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/handle_user_group.yml:35
Saturday 27 July 2024 12:36:23 -0400 (0:00:00.399) 0:00:33.372 *********
ok: [managed_node1] => { "ansible_facts": { "__podman_group_name": "root" }, "changed": false }

TASK [fedora.linux_system_roles.podman : See if getsubids exists] **************
task path: /tmp/tmp.kBSwYGXc3s/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/handle_user_group.yml:39
Saturday 27 July 2024 12:36:23 -0400 (0:00:00.042) 0:00:33.415 *********
ok: [managed_node1] => { "changed": false, "stat": { "atime": 1722097719.096431, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 32, "charset": "binary", "checksum": "86395ad7ce62834c967dc50f963a68f042029188", "ctime": 1722097690.6782997, "dev": 51714, "device_type": 0, "executable": true, "exists": true, "gid": 0, "gr_name": "root", "inode": 4755377, "isblk": false, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": true, "issock": false, "isuid": false, "mimetype": "application/x-pie-executable", "mode": "0755", "mtime": 1719187200.0, "nlink": 1, "path": "/usr/bin/getsubids", "pw_name": "root", "readable": true, "rgrp": true, "roth": true, "rusr": true, "size": 15728, "uid": 0, "version": "2656437169", "wgrp": false, "woth": false, "writeable": true, "wusr": true, "xgrp": true, "xoth": true, "xusr": true } }

TASK [fedora.linux_system_roles.podman : Check user with getsubids] ************
task path: /tmp/tmp.kBSwYGXc3s/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/handle_user_group.yml:50
Saturday 27 July 2024 12:36:23 -0400 (0:00:00.396) 0:00:33.811 *********
skipping: [managed_node1] => { "changed": false, "false_condition": "__podman_user not in [\"root\", \"0\"]", "skip_reason": "Conditional result was False" }

TASK [fedora.linux_system_roles.podman : Check group with getsubids] ***********
task path: /tmp/tmp.kBSwYGXc3s/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/handle_user_group.yml:55
Saturday 27 July 2024 12:36:23 -0400 (0:00:00.031) 0:00:33.843 *********
skipping: [managed_node1] => { "changed": false, "false_condition": "__podman_user not in [\"root\", \"0\"]", "skip_reason": "Conditional result was False" }

TASK [fedora.linux_system_roles.podman : Set user subuid and subgid info] ******
task path: /tmp/tmp.kBSwYGXc3s/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/handle_user_group.yml:60
Saturday 27 July 2024 12:36:23 -0400 (0:00:00.030) 0:00:33.874 *********
skipping: [managed_node1] => { "changed": false, "false_condition": "__podman_user not in [\"root\", \"0\"]", "skip_reason": "Conditional result was False" }

TASK [fedora.linux_system_roles.podman : Get subuid file] **********************
task path: /tmp/tmp.kBSwYGXc3s/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/handle_user_group.yml:74
Saturday 27 July 2024 12:36:23 -0400 (0:00:00.032) 0:00:33.906 *********
skipping: [managed_node1] => { "changed": false, "false_condition": "not __podman_stat_getsubids.stat.exists", "skip_reason": "Conditional result was False" }

TASK [fedora.linux_system_roles.podman : Get subgid file] **********************
task path: /tmp/tmp.kBSwYGXc3s/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/handle_user_group.yml:79
Saturday 27 July 2024 12:36:23 -0400 (0:00:00.031) 0:00:33.937 *********
skipping: [managed_node1] => { "changed": false, "false_condition": "not __podman_stat_getsubids.stat.exists", "skip_reason": "Conditional result was False" }

TASK [fedora.linux_system_roles.podman : Set user subuid and subgid info] ******
task path: /tmp/tmp.kBSwYGXc3s/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/handle_user_group.yml:84
Saturday 27 July 2024 12:36:23 -0400 (0:00:00.032) 0:00:33.969 *********
skipping: [managed_node1] => { "changed": false, "false_condition": "not __podman_stat_getsubids.stat.exists", "skip_reason": "Conditional result was False" }

TASK [fedora.linux_system_roles.podman : Fail if user not in subuid file] ******
task path: /tmp/tmp.kBSwYGXc3s/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/handle_user_group.yml:94
Saturday 27 July 2024 12:36:23 -0400 (0:00:00.030) 0:00:34.000 *********
skipping: [managed_node1] => { "changed": false, "false_condition": "not __podman_stat_getsubids.stat.exists", "skip_reason": "Conditional result was False" }

TASK [fedora.linux_system_roles.podman : Fail if group not in subgid file] *****
task path: /tmp/tmp.kBSwYGXc3s/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/handle_user_group.yml:101
Saturday 27 July 2024 12:36:23 -0400 (0:00:00.031) 0:00:34.032 *********
skipping: [managed_node1] => { "changed": false, "false_condition": "not __podman_stat_getsubids.stat.exists", "skip_reason": "Conditional result was False" }

TASK [fedora.linux_system_roles.podman : Set per-container variables part 3] ***
task path: /tmp/tmp.kBSwYGXc3s/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/handle_quadlet_spec.yml:62
Saturday 27 July 2024 12:36:23 -0400 (0:00:00.030) 0:00:34.062 *********
ok: [managed_node1] => { "ansible_facts": { "__podman_activate_systemd_unit": true, "__podman_images_found": [], "__podman_kube_yamls_raw": "", "__podman_service_name": "bogus.service", "__podman_systemd_scope": "system", "__podman_user_home_dir": "/root", "__podman_xdg_runtime_dir": "/run/user/0" }, "changed": false }

TASK [fedora.linux_system_roles.podman : Set per-container variables part 4] ***
task path: /tmp/tmp.kBSwYGXc3s/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/handle_quadlet_spec.yml:73
Saturday 27 July 2024 12:36:23 -0400 (0:00:00.056) 0:00:34.119 *********
ok: [managed_node1] => { "ansible_facts": { "__podman_quadlet_path": "/etc/containers/systemd" }, "changed": false }

TASK [fedora.linux_system_roles.podman : Get kube yaml contents] ***************
task path: /tmp/tmp.kBSwYGXc3s/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/handle_quadlet_spec.yml:77
Saturday 27 July 2024 12:36:24 -0400 (0:00:00.078) 0:00:34.197 *********
skipping: [managed_node1] => { "changed": false, "false_condition": "__podman_state != \"absent\"", "skip_reason": "Conditional result was False" }

TASK [fedora.linux_system_roles.podman : Set per-container variables part 5] ***
task path: /tmp/tmp.kBSwYGXc3s/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/handle_quadlet_spec.yml:87
Saturday 27 July 2024 12:36:24 -0400 (0:00:00.031) 0:00:34.229 *********
ok: [managed_node1] => { "ansible_facts": { "__podman_images": [], "__podman_quadlet_file": "/etc/containers/systemd/bogus.container", "__podman_volumes": [] }, "changed": false }

TASK [fedora.linux_system_roles.podman : Set per-container variables part 6] ***
task path: /tmp/tmp.kBSwYGXc3s/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/handle_quadlet_spec.yml:103
Saturday 27 July 2024 12:36:24 -0400 (0:00:00.080) 0:00:34.309 *********
ok: [managed_node1] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false }

TASK [fedora.linux_system_roles.podman : Cleanup quadlets] *********************
task path: /tmp/tmp.kBSwYGXc3s/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/handle_quadlet_spec.yml:110
Saturday 27 July 2024 12:36:24 -0400 (0:00:00.041) 0:00:34.351 *********
included: /tmp/tmp.kBSwYGXc3s/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/cleanup_quadlet_spec.yml for managed_node1

TASK [fedora.linux_system_roles.podman : Stat XDG_RUNTIME_DIR] *****************
task path: /tmp/tmp.kBSwYGXc3s/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/cleanup_quadlet_spec.yml:4
Saturday 27 July 2024 12:36:24 -0400 (0:00:00.078) 0:00:34.429 *********
skipping: [managed_node1] => { "changed": false, "false_condition": "__podman_rootless | bool", "skip_reason": "Conditional result was False" }

TASK [fedora.linux_system_roles.podman : Stop and disable service] *************
task path: /tmp/tmp.kBSwYGXc3s/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/cleanup_quadlet_spec.yml:12
Saturday 27 July 2024 12:36:24 -0400 (0:00:00.031) 0:00:34.460 *********
changed: [managed_node1] => { "changed": true, "enabled": false, "failed_when_result": false, "name": "bogus.service", "state": "stopped", "status": { "AccessSELinuxContext": "system_u:object_r:systemd_unit_file_t:s0", "ActiveEnterTimestampMonotonic": "0", "ActiveExitTimestampMonotonic": "0", "ActiveState": "inactive", "After": "sysinit.target basic.target system.slice -.mount systemd-journald.socket", "AllowIsolate": "no", "AssertResult": "no", "AssertTimestampMonotonic": "0", "Before": "multi-user.target shutdown.target", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "CPUAccounting": "yes", "CPUAffinityFromNUMA": "no", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUSchedulingResetOnFork": "no", "CPUShares": "[not set]", "CPUUsageNSec": "[not set]", "CPUWeight": "[not set]", "CacheDirectoryMode": "0755", "CanFreeze": "yes", "CanIsolate": "no", "CanReload": "no", "CanStart": "yes", "CanStop": "yes", "CapabilityBoundingSet": "cap_chown cap_dac_override cap_dac_read_search cap_fowner cap_fsetid cap_kill cap_setgid cap_setuid cap_setpcap cap_linux_immutable cap_net_bind_service cap_net_broadcast cap_net_admin cap_net_raw cap_ipc_lock cap_ipc_owner cap_sys_module cap_sys_rawio cap_sys_chroot cap_sys_ptrace cap_sys_pacct cap_sys_admin cap_sys_boot cap_sys_nice cap_sys_resource cap_sys_time cap_sys_tty_config cap_mknod cap_lease cap_audit_write cap_audit_control cap_setfcap cap_mac_override cap_mac_admin cap_syslog cap_wake_alarm cap_block_suspend cap_audit_read cap_perfmon cap_bpf cap_checkpoint_restore", "CleanResult": "success", "CollectMode": "inactive", "ConditionResult": "no", "ConditionTimestampMonotonic": "0", "ConfigurationDirectoryMode": "0755", "Conflicts": "shutdown.target", "ControlGroupId": "0", "ControlPID": "0", "CoredumpFilter": "0x33", "CoredumpReceive": "no", "DefaultDependencies": "yes", "DefaultMemoryLow": "0", "DefaultMemoryMin": "0", "DefaultStartupMemoryLow": "0", "Delegate": "yes", "DelegateControllers": "cpu cpuset io memory pids", "Description": "bogus.service", "DevicePolicy": "auto", "DynamicUser": "no", "EffectiveMemoryHigh": "3700936704", "EffectiveMemoryMax": "3700936704", "EffectiveTasksMax": "22402", "Environment": "PODMAN_SYSTEMD_UNIT=bogus.service", "ExecMainCode": "0", "ExecMainExitTimestampMonotonic": "0", "ExecMainHandoffTimestampMonotonic": "0", "ExecMainPID": "0", "ExecMainStartTimestampMonotonic": "0", "ExecMainStatus": "0", "ExecStart": "{ path=/usr/bin/podman ; argv[]=/usr/bin/podman run --name=bogus --cidfile=/run/bogus.cid --replace --rm --cgroups=split --sdnotify=conmon -d this_is_a_bogus_image ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecStartEx": "{ path=/usr/bin/podman ; argv[]=/usr/bin/podman run --name=bogus --cidfile=/run/bogus.cid --replace --rm --cgroups=split --sdnotify=conmon -d this_is_a_bogus_image ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecStop": "{ path=/usr/bin/podman ; argv[]=/usr/bin/podman rm -v -f -i --cidfile=/run/bogus.cid ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecStopEx": "{ path=/usr/bin/podman ; argv[]=/usr/bin/podman rm -v -f -i --cidfile=/run/bogus.cid ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecStopPost": "{ path=/usr/bin/podman ; argv[]=/usr/bin/podman rm -v -f -i --cidfile=/run/bogus.cid ; ignore_errors=yes ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecStopPostEx": "{ path=/usr/bin/podman ; argv[]=/usr/bin/podman rm -v -f -i --cidfile=/run/bogus.cid ; flags=ignore-failure ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExitType": "main", "ExtensionImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "FailureAction": "none", "FileDescriptorStoreMax": "0", "FileDescriptorStorePreserve": "restart", "FinalKillSignal": "9", "FragmentPath": "/run/systemd/generator/bogus.service", "FreezerState": "running", "GID": "[not set]", "GuessMainPID": "yes", "IOAccounting": "no", "IOReadBytes": "[not set]", "IOReadOperations": "[not set]", "IOSchedulingClass": "2", "IOSchedulingPriority": "4", "IOWeight": "[not set]", "IOWriteBytes": "[not set]", "IOWriteOperations": "[not set]", "IPAccounting": "no", "IPEgressBytes": "[no data]", "IPEgressPackets": "[no data]", "IPIngressBytes": "[no data]", "IPIngressPackets": "[no data]", "Id": "bogus.service", "IgnoreOnIsolate": "no", "IgnoreSIGPIPE": "yes", "InactiveEnterTimestampMonotonic": "0", "InactiveExitTimestampMonotonic": "0", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "JobTimeoutUSec": "infinity", "KeyringMode": "private", "KillMode": "mixed", "KillSignal": "15", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitCORE": "infinity", "LimitCORESoft": "infinity", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitMEMLOCK": "8388608", "LimitMEMLOCKSoft": "8388608", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitNOFILE": "524288", "LimitNOFILESoft": "1024", "LimitNPROC": "14001", "LimitNPROCSoft": "14001", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "LimitSIGPENDING": "14001", "LimitSIGPENDINGSoft": "14001", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LoadState": "loaded", "LockPersonality": "no", "LogLevelMax": "-1", "LogRateLimitBurst": "0", "LogRateLimitIntervalUSec": "0", "LogsDirectoryMode": "0755", "MainPID": "0", "ManagedOOMMemoryPressure": "auto", "ManagedOOMMemoryPressureLimit": "0", "ManagedOOMPreference": "none", "ManagedOOMSwap": "auto", "MemoryAccounting": "yes", "MemoryAvailable": "3165241344", "MemoryCurrent": "[not set]", "MemoryDenyWriteExecute": "no", "MemoryHigh": "infinity", "MemoryKSM": "no", "MemoryLimit": "infinity", "MemoryLow": "0", "MemoryMax": "infinity", "MemoryMin": "0", "MemoryPeak": "[not set]", "MemoryPressureThresholdUSec": "200ms", "MemoryPressureWatch": "auto", "MemorySwapCurrent": "[not set]", "MemorySwapMax": "infinity", "MemorySwapPeak": "[not set]", "MemoryZSwapCurrent": "[not set]", "MemoryZSwapMax": "infinity", "MemoryZSwapWriteback": "yes", "MountAPIVFS": "no", "MountImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "NFileDescriptorStore": "0", "NRestarts": "0", "NUMAPolicy": "n/a", "Names": "bogus.service", "NeedDaemonReload": "no", "Nice": "0", "NoNewPrivileges": "no", "NonBlocking": "no", "NotifyAccess": "all", "OOMPolicy": "continue", "OOMScoreAdjust": "0", "OnFailureJobMode": "replace", "OnSuccessJobMode": "fail", "Perpetual": "no", "PrivateDevices": "no", "PrivateIPC": "no", "PrivateMounts": "no", "PrivateNetwork": "no", "PrivateTmp": "no", "PrivateUsers": "no", "ProcSubset": "all", "ProtectClock": "no", "ProtectControlGroups": "no", "ProtectHome": "no", "ProtectHostname": "no", "ProtectKernelLogs": "no", "ProtectKernelModules": "no", "ProtectKernelTunables": "no", "ProtectProc": "default", "ProtectSystem": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "ReloadResult": "success", "ReloadSignal": "1", "RemainAfterExit": "no", "RemoveIPC": "no", "Requires": "-.mount sysinit.target system.slice", "RequiresMountsFor": "/run/containers", "Restart": "no", "RestartKillSignal": "15", "RestartMaxDelayUSec": "infinity", "RestartMode": "normal", "RestartSteps": "0", "RestartUSec": "100ms", "RestartUSecNext": "100ms", "RestrictNamespaces": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "Result": "success", "RootDirectoryStartOnly": "no", "RootEphemeral": "no", "RootImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "RuntimeDirectoryMode": "0755", "RuntimeDirectoryPreserve": "no", "RuntimeMaxUSec": "infinity", "RuntimeRandomizedExtraUSec": "0", "SameProcessGroup": "no", "SecureBits": "0", "SendSIGHUP": "no", "SendSIGKILL": "yes", "SetLoginEnvironment": "no", "Slice": "system.slice", "SourcePath": "/etc/containers/systemd/bogus.container", "StandardError": "inherit", "StandardInput": "null", "StandardOutput": "journal", "StartLimitAction": "none", "StartLimitBurst": "5", "StartLimitIntervalUSec": "10s", "StartupBlockIOWeight": "[not set]", "StartupCPUShares": "[not set]", "StartupCPUWeight": "[not set]", "StartupIOWeight": "[not set]", "StartupMemoryHigh": "infinity", "StartupMemoryLow": "0", "StartupMemoryMax": "infinity", "StartupMemorySwapMax": "infinity", "StartupMemoryZSwapMax": "infinity", "StateChangeTimestampMonotonic": "0", "StateDirectoryMode": "0755", "StatusErrno": "0", "StopWhenUnneeded": "no", "SubState": "dead", "SuccessAction": "none", "SurviveFinalKillSignal": "no", "SyslogFacility": "3", "SyslogIdentifier": "bogus", "SyslogLevel": "6", "SyslogLevelPrefix": "yes", "SyslogPriority": "30", "SystemCallErrorNumber": "2147483646", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "TasksAccounting": "yes", "TasksCurrent": "[not set]", "TasksMax": "22402", "TimeoutAbortUSec": "1min 30s", "TimeoutCleanUSec": "infinity", "TimeoutStartFailureMode": "terminate", "TimeoutStartUSec": "1min 30s", "TimeoutStopFailureMode": "terminate", "TimeoutStopUSec": "1min 30s", "TimerSlackNSec": "50000", "Transient": "no", "Type": "notify", "UID": "[not set]", "UMask": "0022", "UnitFilePreset": "disabled", "UnitFileState": "generated", "UtmpMode": "init", "WantedBy": "multi-user.target", "WatchdogSignal": "6", "WatchdogTimestampMonotonic": "0", "WatchdogUSec": "infinity" } }

TASK [fedora.linux_system_roles.podman : See if quadlet file exists] ***********
task path: /tmp/tmp.kBSwYGXc3s/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/cleanup_quadlet_spec.yml:33
Saturday 27 July 2024 12:36:25 -0400 (0:00:00.828) 0:00:35.288 *********
ok: [managed_node1] => { "changed": false, "stat": { "atime": 1722098167.7246652, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 8, "charset": "us-ascii", "checksum": "1d087e679d135214e8ac9ccaf33b2222916efb7f", "ctime": 1722098167.730665, "dev": 51714, "device_type": 0, "executable": false, "exists": true, "gid": 0, "gr_name": "root", "inode": 178258114, "isblk": false, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": true, "issock": false, "isuid": false, "mimetype": "text/plain", "mode": "0644", "mtime": 1722098167.4416738, "nlink": 1, "path": "/etc/containers/systemd/bogus.container", "pw_name": "root", "readable": true, "rgrp": true, "roth": true, "rusr": true, "size": 138, "uid": 0, "version": "2240788748", "wgrp": false, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } }

TASK [fedora.linux_system_roles.podman : Parse quadlet file] *******************
task path: /tmp/tmp.kBSwYGXc3s/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/cleanup_quadlet_spec.yml:38
Saturday 27 July 2024 12:36:25 -0400 (0:00:00.399) 0:00:35.688 *********
included: /tmp/tmp.kBSwYGXc3s/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/parse_quadlet_file.yml for managed_node1

TASK [fedora.linux_system_roles.podman : Slurp quadlet file] *******************
task path: /tmp/tmp.kBSwYGXc3s/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/parse_quadlet_file.yml:6
Saturday 27 July 2024 12:36:25 -0400 (0:00:00.059) 0:00:35.747 *********
ok: [managed_node1] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false }

TASK [fedora.linux_system_roles.podman : Parse quadlet file] *******************
task path: /tmp/tmp.kBSwYGXc3s/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/parse_quadlet_file.yml:12
Saturday 27 July 2024 12:36:25 -0400 (0:00:00.379) 0:00:36.126 *********
ok: [managed_node1] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false }

TASK [fedora.linux_system_roles.podman : Parse quadlet yaml file] **************
task path: /tmp/tmp.kBSwYGXc3s/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/parse_quadlet_file.yml:44
Saturday 27 July 2024 12:36:26 -0400 (0:00:00.054) 0:00:36.180 *********
skipping: [managed_node1] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false }

TASK [fedora.linux_system_roles.podman : Reset raw variable] *******************
task path: /tmp/tmp.kBSwYGXc3s/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/parse_quadlet_file.yml:52
Saturday 27 July 2024 12:36:26 -0400 (0:00:00.034) 0:00:36.214 *********
ok: [managed_node1] => { "ansible_facts": { "__podman_quadlet_raw": null }, "changed": false }

TASK [fedora.linux_system_roles.podman : Remove quadlet file] ******************
task path: /tmp/tmp.kBSwYGXc3s/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/cleanup_quadlet_spec.yml:42
Saturday 27 July 2024 12:36:26 -0400 (0:00:00.036) 0:00:36.251 *********
changed: [managed_node1] => { "changed": true, "path": "/etc/containers/systemd/bogus.container", "state": "absent" }

TASK [fedora.linux_system_roles.podman : Refresh systemd] **********************
task path: /tmp/tmp.kBSwYGXc3s/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/cleanup_quadlet_spec.yml:48
Saturday 27 July 2024 12:36:26 -0400 (0:00:00.400) 0:00:36.652 *********
ok: [managed_node1] => { "changed": false, "name": null, "status": {} }

TASK [fedora.linux_system_roles.podman : Remove managed resource] **************
task path: /tmp/tmp.kBSwYGXc3s/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/cleanup_quadlet_spec.yml:58
Saturday 27 July 2024 12:36:27 -0400 (0:00:00.775) 0:00:37.428 *********
ok: [managed_node1] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false }

TASK [fedora.linux_system_roles.podman : Remove volumes] ***********************
task path: /tmp/tmp.kBSwYGXc3s/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/cleanup_quadlet_spec.yml:95
Saturday 27 July 2024 12:36:27 -0400 (0:00:00.464) 0:00:37.892 *********
skipping: [managed_node1] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false }

TASK [fedora.linux_system_roles.podman : Clear parsed podman variable] *********
task path: /tmp/tmp.kBSwYGXc3s/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/cleanup_quadlet_spec.yml:112
Saturday 27 July 2024 12:36:27 -0400 (0:00:00.049) 0:00:37.942 *********
ok: [managed_node1] => { "ansible_facts": { "__podman_quadlet_parsed": null }, "changed": false }

TASK [fedora.linux_system_roles.podman : Prune images no longer in use] ********
task path: /tmp/tmp.kBSwYGXc3s/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/cleanup_quadlet_spec.yml:116
Saturday 27 July 2024 12:36:27 -0400 (0:00:00.033) 0:00:37.975 *********
skipping: [managed_node1] => { "changed": false, "false_condition": "podman_prune_images | bool", "skip_reason": "Conditional result was False" }

TASK [fedora.linux_system_roles.podman : Manage linger] ************************
task path: /tmp/tmp.kBSwYGXc3s/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/cleanup_quadlet_spec.yml:127
Saturday 27 July 2024 12:36:27 -0400 (0:00:00.082) 0:00:38.058 *********
included: /tmp/tmp.kBSwYGXc3s/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/manage_linger.yml for managed_node1

TASK [fedora.linux_system_roles.podman : Enable linger if needed] **************
task path: /tmp/tmp.kBSwYGXc3s/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/manage_linger.yml:12
Saturday 27 July 2024 12:36:27 -0400 (0:00:00.059) 0:00:38.118 *********
skipping: [managed_node1] => { "changed": false, "false_condition": "__podman_rootless | bool", "skip_reason": "Conditional result was False" }

TASK [fedora.linux_system_roles.podman : Mark user as not yet needing to cancel linger] ***
task path: /tmp/tmp.kBSwYGXc3s/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/manage_linger.yml:18
Saturday 27 July 2024 12:36:27 -0400 (0:00:00.031) 0:00:38.150 *********
skipping: [managed_node1] => { "changed": false, "false_condition": "__podman_rootless | bool", "skip_reason": "Conditional result was False" }

TASK [fedora.linux_system_roles.podman : Mark user for possible linger cancel] ***
task path: /tmp/tmp.kBSwYGXc3s/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/manage_linger.yml:22
Saturday 27 July 2024 12:36:28 -0400 (0:00:00.033) 0:00:38.184 *********
skipping: [managed_node1] => { "changed": false, "false_condition": "__podman_rootless | bool", "skip_reason": "Conditional result was False" }

TASK [fedora.linux_system_roles.podman : For testing and debugging - images] ***
task path: /tmp/tmp.kBSwYGXc3s/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/cleanup_quadlet_spec.yml:137
Saturday 27 July 2024 12:36:28 -0400 (0:00:00.032) 0:00:38.216 *********
skipping: [managed_node1] => { "changed": false, "false_condition": "__podman_test_debug | d(false)", "skip_reason": "Conditional result was False" }

TASK [fedora.linux_system_roles.podman : For testing and debugging - volumes] ***
task path: /tmp/tmp.kBSwYGXc3s/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/cleanup_quadlet_spec.yml:146
Saturday 27 July 2024 12:36:28 -0400 (0:00:00.037) 0:00:38.254 *********
skipping: [managed_node1] => { "changed": false, "false_condition": "__podman_test_debug | d(false)", "skip_reason": "Conditional result was False" }

TASK [fedora.linux_system_roles.podman : For testing and debugging - containers] ***
task path: /tmp/tmp.kBSwYGXc3s/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/cleanup_quadlet_spec.yml:155
Saturday 27 July 2024 12:36:28 -0400 (0:00:00.036) 0:00:38.290 *********
skipping: [managed_node1] => { "changed": false, "false_condition": "__podman_test_debug | d(false)", "skip_reason": "Conditional result was False" }

TASK [fedora.linux_system_roles.podman : For testing and debugging - networks] ***
task path: /tmp/tmp.kBSwYGXc3s/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/cleanup_quadlet_spec.yml:164
Saturday 27 July 2024 12:36:28 -0400 (0:00:00.036) 0:00:38.326 *********
skipping: [managed_node1] => { "changed": false, "false_condition": "__podman_test_debug | d(false)", "skip_reason": "Conditional result was False" }

TASK [fedora.linux_system_roles.podman : For testing and debugging - secrets] ***
task path: /tmp/tmp.kBSwYGXc3s/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/cleanup_quadlet_spec.yml:173
Saturday 27 July 2024 12:36:28 -0400 (0:00:00.035) 0:00:38.362 *********
skipping: [managed_node1] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false }

TASK [fedora.linux_system_roles.podman : For testing and debugging - services] ***
task path: /tmp/tmp.kBSwYGXc3s/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/cleanup_quadlet_spec.yml:183
Saturday 27 July 2024 12:36:28 -0400 (0:00:00.036) 0:00:38.399 *********
skipping: [managed_node1] => { "changed": false, "false_condition": "__podman_test_debug | d(false)", "skip_reason": "Conditional result was False" }

TASK [fedora.linux_system_roles.podman : Create and update quadlets] ***********
task path: /tmp/tmp.kBSwYGXc3s/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/handle_quadlet_spec.yml:114
Saturday 27 July 2024 12:36:28 -0400 (0:00:00.035) 0:00:38.434 *********
skipping: [managed_node1] => { "changed": false, "false_condition": "__podman_state != \"absent\"", "skip_reason": "Conditional result was False" }

TASK [fedora.linux_system_roles.podman : Cancel linger] ************************
task path: /tmp/tmp.kBSwYGXc3s/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/main.yml:143
Saturday 27 July 2024 12:36:28 -0400 (0:00:00.033) 0:00:38.467 *********
skipping: [managed_node1] => { "changed": false, "skipped_reason": "No items in the list" }

TASK [fedora.linux_system_roles.podman : Handle credential files - absent] *****
task path: /tmp/tmp.kBSwYGXc3s/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/main.yml:149
Saturday 27 July 2024 12:36:28 -0400 (0:00:00.029) 0:00:38.497 *********
skipping: [managed_node1] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false }

TASK [fedora.linux_system_roles.podman : Handle certs.d files - absent] ********
task path: /tmp/tmp.kBSwYGXc3s/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/main.yml:158
Saturday 27 July 2024 12:36:28 -0400 (0:00:00.030) 0:00:38.528 *********
skipping: [managed_node1] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false }

TASK [Create user for testing] *************************************************
task path: /tmp/tmp.kBSwYGXc3s/ansible_collections/fedora/linux_system_roles/tests/podman/tests_quadlet_basic.yml:109
Saturday 27 July 2024 12:36:28 -0400 (0:00:00.046) 0:00:38.575 *********
changed: [managed_node1] => { "changed": true, "comment": "", "create_home": true, "group": 1111, "home": "/home/user_quadlet_basic", "name": "user_quadlet_basic", "shell": "/bin/bash", "state": "present", "system": false, "uid": 1111 }

TASK [Get local machine ID] ****************************************************
task path: /tmp/tmp.kBSwYGXc3s/ansible_collections/fedora/linux_system_roles/tests/podman/tests_quadlet_basic.yml:122
Saturday 27 July 2024 12:36:29 -0400 (0:00:00.657) 0:00:39.232 *********
skipping: [managed_node1] => { "changed": false, "false_condition": "ansible_facts[\"distribution_version\"] is version(\"9\", \"<\")", "skip_reason": "Conditional result was False" }

TASK [Skip test if cannot reboot] **********************************************
task path: /tmp/tmp.kBSwYGXc3s/ansible_collections/fedora/linux_system_roles/tests/podman/tests_quadlet_basic.yml:128
Saturday 27 July 2024 12:36:29 -0400 (0:00:00.043) 0:00:39.275 *********
META: end_host conditional evaluated to False, continuing execution for managed_node1
skipping: [managed_node1] => { "skip_reason": "end_host conditional evaluated to False, continuing execution for managed_node1" }
MSG: end_host conditional evaluated to false, continuing execution for managed_node1

TASK [Enable cgroup controllers] ***********************************************
task path: /tmp/tmp.kBSwYGXc3s/ansible_collections/fedora/linux_system_roles/tests/podman/tests_quadlet_basic.yml:134
Saturday 27 July 2024 12:36:29 -0400 (0:00:00.041) 0:00:39.316 *********
skipping: [managed_node1] => { "changed": false, "false_condition": "ansible_facts[\"distribution_version\"] is version(\"9\", \"<\")", "skip_reason": "Conditional result was False" }

TASK [Configure cgroups in kernel] *********************************************
task path: /tmp/tmp.kBSwYGXc3s/ansible_collections/fedora/linux_system_roles/tests/podman/tests_quadlet_basic.yml:166
Saturday 27 July 2024 12:36:29 -0400 (0:00:00.040) 0:00:39.357 *********
skipping: [managed_node1] => { "changed": false, "false_condition": "ansible_facts[\"distribution_version\"] is version(\"9\", \"<\")", "skip_reason": "Conditional result was False" }

TASK [Reboot] ******************************************************************
task path: /tmp/tmp.kBSwYGXc3s/ansible_collections/fedora/linux_system_roles/tests/podman/tests_quadlet_basic.yml:172
Saturday 27 July 2024 12:36:29 -0400 (0:00:00.096) 0:00:39.453 *********
skipping: [managed_node1] => { "changed": false, "false_condition": "ansible_facts[\"distribution_version\"] is version(\"9\", \"<\")", "skip_reason": "Conditional result was False" }

TASK [Run the role - user] *****************************************************
task path: /tmp/tmp.kBSwYGXc3s/ansible_collections/fedora/linux_system_roles/tests/podman/tests_quadlet_basic.yml:175
Saturday 27 July 2024 12:36:29 -0400 (0:00:00.041) 0:00:39.494 *********
included: fedora.linux_system_roles.podman for managed_node1

TASK [fedora.linux_system_roles.podman : Set platform/version specific variables] ***
task path: /tmp/tmp.kBSwYGXc3s/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/main.yml:3
Saturday 27 July 2024 12:36:29 -0400 (0:00:00.083) 0:00:39.577 *********
included: /tmp/tmp.kBSwYGXc3s/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/set_vars.yml for managed_node1

TASK [fedora.linux_system_roles.podman : Ensure ansible_facts used by role] ****
task path: /tmp/tmp.kBSwYGXc3s/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/set_vars.yml:3
Saturday 27 July 2024 12:36:29 -0400 (0:00:00.058) 0:00:39.636 *********
skipping: [managed_node1] => { "changed": false, "false_condition": "__podman_required_facts | difference(ansible_facts.keys() | list) | length > 0", "skip_reason": "Conditional result was False" }

TASK [fedora.linux_system_roles.podman : Check if system is ostree] ************
task path: /tmp/tmp.kBSwYGXc3s/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/set_vars.yml:11
Saturday 27 July 2024 12:36:29 -0400 (0:00:00.039) 0:00:39.675 *********
skipping: [managed_node1] => { "changed": false, "false_condition": "not __podman_is_ostree is defined", "skip_reason": "Conditional result was False" }

TASK [fedora.linux_system_roles.podman : Set flag to indicate system is ostree] ***
task path: /tmp/tmp.kBSwYGXc3s/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/set_vars.yml:16
Saturday 27 July 2024 12:36:29 -0400 (0:00:00.033) 0:00:39.709 *********
skipping: [managed_node1] => { "changed": false, "false_condition": "not __podman_is_ostree is defined", "skip_reason": "Conditional result was False" }

TASK [fedora.linux_system_roles.podman : Set platform/version specific variables] ***
task path: /tmp/tmp.kBSwYGXc3s/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/set_vars.yml:20
Saturday 27 July 2024 12:36:29 -0400 (0:00:00.032) 0:00:39.742 *********
ok: [managed_node1] => (item=RedHat.yml) => { "ansible_facts": { "__podman_packages": [ "podman", "shadow-utils-subid" ] }, "ansible_included_var_files": [ "/tmp/tmp.kBSwYGXc3s/ansible_collections/fedora/linux_system_roles/roles/podman/vars/RedHat.yml" ], "ansible_loop_var": "item", "changed": false, "item": "RedHat.yml" }
skipping: [managed_node1] => (item=CentOS.yml) => { "ansible_loop_var": "item", "changed": false, "false_condition": "__vars_file is file", "item": "CentOS.yml", "skip_reason": "Conditional result was False" }
ok: [managed_node1] => (item=CentOS_10.yml) => { "ansible_facts": { "__podman_packages": [ "iptables-nft", "podman", "shadow-utils-subid" ] }, "ansible_included_var_files": [ "/tmp/tmp.kBSwYGXc3s/ansible_collections/fedora/linux_system_roles/roles/podman/vars/CentOS_10.yml" ], "ansible_loop_var": "item", "changed": false, "item": "CentOS_10.yml" }
ok: [managed_node1] => (item=CentOS_10.yml) => { "ansible_facts": { "__podman_packages": [ "iptables-nft", "podman", "shadow-utils-subid" ] }, "ansible_included_var_files": [ "/tmp/tmp.kBSwYGXc3s/ansible_collections/fedora/linux_system_roles/roles/podman/vars/CentOS_10.yml" ], "ansible_loop_var": "item", "changed": false, "item": "CentOS_10.yml" }

TASK [fedora.linux_system_roles.podman : Gather the package facts] *************
task path: /tmp/tmp.kBSwYGXc3s/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/main.yml:6
Saturday 27 July 2024 12:36:29 -0400 (0:00:00.075) 0:00:39.817 *********
ok: [managed_node1] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false }

TASK [fedora.linux_system_roles.podman : Enable copr if requested] *************
task path: /tmp/tmp.kBSwYGXc3s/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/main.yml:10
Saturday 27 July 2024 12:36:30 -0400 (0:00:00.804) 0:00:40.622 *********
skipping: [managed_node1] => { "changed": false, "false_condition": "podman_use_copr | d(false)", "skip_reason": "Conditional result was False" }

TASK [fedora.linux_system_roles.podman : Ensure required packages are installed] ***
task path: /tmp/tmp.kBSwYGXc3s/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/main.yml:14
Saturday 27 July 2024 12:36:30 -0400 (0:00:00.034) 0:00:40.656 *********
skipping: [managed_node1] => { "changed": false, "false_condition": "(__podman_packages | difference(ansible_facts.packages))", "skip_reason": "Conditional result was False" }

TASK [fedora.linux_system_roles.podman : Get podman version] *******************
task path: /tmp/tmp.kBSwYGXc3s/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/main.yml:22
Saturday 27 July 2024 12:36:30 -0400 (0:00:00.036) 0:00:40.692 *********
ok: [managed_node1] => { "changed": false, "cmd": [ "podman", "--version" ], "delta": "0:00:00.029686", "end": "2024-07-27 12:36:30.900538", "rc": 0, "start": "2024-07-27 12:36:30.870852" }
STDOUT:
podman version 5.1.2

TASK [fedora.linux_system_roles.podman : Set podman version] *******************
task path: /tmp/tmp.kBSwYGXc3s/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/main.yml:28
Saturday 27 July 2024 12:36:30 -0400 (0:00:00.416) 0:00:41.109 *********
ok: [managed_node1] => { "ansible_facts": { "podman_version": "5.1.2" }, "changed": false }

TASK [fedora.linux_system_roles.podman : Podman package version must be 4.2 or later] ***
task path: /tmp/tmp.kBSwYGXc3s/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/main.yml:32
Saturday 27 July 2024 12:36:31 -0400 (0:00:00.034) 0:00:41.143 *********
skipping: [managed_node1] => { "changed": false, "false_condition": "podman_version is version(\"4.2\", \"<\")", "skip_reason":
"Conditional result was False" } TASK [fedora.linux_system_roles.podman : Podman package version must be 4.4 or later for quadlet, secrets] *** task path: /tmp/tmp.kBSwYGXc3s/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/main.yml:39 Saturday 27 July 2024 12:36:31 -0400 (0:00:00.031) 0:00:41.175 ********* skipping: [managed_node1] => { "changed": false, "false_condition": "podman_version is version(\"4.4\", \"<\")", "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.podman : Podman package version must be 4.4 or later for quadlet, secrets] *** task path: /tmp/tmp.kBSwYGXc3s/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/main.yml:49 Saturday 27 July 2024 12:36:31 -0400 (0:00:00.041) 0:00:41.217 ********* META: end_host conditional evaluated to False, continuing execution for managed_node1 skipping: [managed_node1] => { "skip_reason": "end_host conditional evaluated to False, continuing execution for managed_node1" } MSG: end_host conditional evaluated to false, continuing execution for managed_node1 TASK [fedora.linux_system_roles.podman : Check user and group information] ***** task path: /tmp/tmp.kBSwYGXc3s/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/main.yml:56 Saturday 27 July 2024 12:36:31 -0400 (0:00:00.087) 0:00:41.304 ********* included: /tmp/tmp.kBSwYGXc3s/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/handle_user_group.yml for managed_node1 TASK [fedora.linux_system_roles.podman : Get user information] ***************** task path: /tmp/tmp.kBSwYGXc3s/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/handle_user_group.yml:2 Saturday 27 July 2024 12:36:31 -0400 (0:00:00.067) 0:00:41.371 ********* ok: [managed_node1] => { "ansible_facts": { "getent_passwd": { "user_quadlet_basic": [ "x", "1111", "1111", "", "/home/user_quadlet_basic", "/bin/bash" ] } }, "changed": false } TASK [fedora.linux_system_roles.podman : Fail if user does not exist] 
********** task path: /tmp/tmp.kBSwYGXc3s/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/handle_user_group.yml:9 Saturday 27 July 2024 12:36:31 -0400 (0:00:00.400) 0:00:41.772 ********* skipping: [managed_node1] => { "changed": false, "false_condition": "not ansible_facts[\"getent_passwd\"][__podman_user]", "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.podman : Set group for podman user] ************ task path: /tmp/tmp.kBSwYGXc3s/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/handle_user_group.yml:16 Saturday 27 July 2024 12:36:31 -0400 (0:00:00.036) 0:00:41.808 ********* ok: [managed_node1] => { "ansible_facts": { "__podman_group": "1111" }, "changed": false } TASK [fedora.linux_system_roles.podman : Get group information] **************** task path: /tmp/tmp.kBSwYGXc3s/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/handle_user_group.yml:28 Saturday 27 July 2024 12:36:31 -0400 (0:00:00.044) 0:00:41.853 ********* ok: [managed_node1] => { "ansible_facts": { "getent_group": { "user_quadlet_basic": [ "x", "1111", "" ] } }, "changed": false } TASK [fedora.linux_system_roles.podman : Set group name] *********************** task path: /tmp/tmp.kBSwYGXc3s/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/handle_user_group.yml:35 Saturday 27 July 2024 12:36:32 -0400 (0:00:00.391) 0:00:42.245 ********* ok: [managed_node1] => { "ansible_facts": { "__podman_group_name": "user_quadlet_basic" }, "changed": false } TASK [fedora.linux_system_roles.podman : See if getsubids exists] ************** task path: /tmp/tmp.kBSwYGXc3s/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/handle_user_group.yml:39 Saturday 27 July 2024 12:36:32 -0400 (0:00:00.043) 0:00:42.288 ********* ok: [managed_node1] => { "changed": false, "stat": { "atime": 1722097719.096431, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 32, "charset": "binary", "checksum": 
"86395ad7ce62834c967dc50f963a68f042029188", "ctime": 1722097690.6782997, "dev": 51714, "device_type": 0, "executable": true, "exists": true, "gid": 0, "gr_name": "root", "inode": 4755377, "isblk": false, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": true, "issock": false, "isuid": false, "mimetype": "application/x-pie-executable", "mode": "0755", "mtime": 1719187200.0, "nlink": 1, "path": "/usr/bin/getsubids", "pw_name": "root", "readable": true, "rgrp": true, "roth": true, "rusr": true, "size": 15728, "uid": 0, "version": "2656437169", "wgrp": false, "woth": false, "writeable": true, "wusr": true, "xgrp": true, "xoth": true, "xusr": true } } TASK [fedora.linux_system_roles.podman : Check user with getsubids] ************ task path: /tmp/tmp.kBSwYGXc3s/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/handle_user_group.yml:50 Saturday 27 July 2024 12:36:32 -0400 (0:00:00.392) 0:00:42.681 ********* ok: [managed_node1] => { "changed": false, "cmd": [ "getsubids", "user_quadlet_basic" ], "delta": "0:00:00.004289", "end": "2024-07-27 12:36:32.869808", "rc": 0, "start": "2024-07-27 12:36:32.865519" } STDOUT: 0: user_quadlet_basic 655360 65536 TASK [fedora.linux_system_roles.podman : Check group with getsubids] *********** task path: /tmp/tmp.kBSwYGXc3s/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/handle_user_group.yml:55 Saturday 27 July 2024 12:36:32 -0400 (0:00:00.398) 0:00:43.079 ********* ok: [managed_node1] => { "changed": false, "cmd": [ "getsubids", "-g", "user_quadlet_basic" ], "delta": "0:00:00.005567", "end": "2024-07-27 12:36:33.274199", "rc": 0, "start": "2024-07-27 12:36:33.268632" } STDOUT: 0: user_quadlet_basic 655360 65536 TASK [fedora.linux_system_roles.podman : Set user subuid and subgid info] ****** task path: /tmp/tmp.kBSwYGXc3s/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/handle_user_group.yml:60 Saturday 27 July 2024 12:36:33 -0400 (0:00:00.403) 
0:00:43.482 ********* ok: [managed_node1] => { "ansible_facts": { "podman_subgid_info": { "user_quadlet_basic": { "range": 65536, "start": 655360 } }, "podman_subuid_info": { "user_quadlet_basic": { "range": 65536, "start": 655360 } } }, "changed": false } TASK [fedora.linux_system_roles.podman : Get subuid file] ********************** task path: /tmp/tmp.kBSwYGXc3s/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/handle_user_group.yml:74 Saturday 27 July 2024 12:36:33 -0400 (0:00:00.051) 0:00:43.534 ********* skipping: [managed_node1] => { "changed": false, "false_condition": "not __podman_stat_getsubids.stat.exists", "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.podman : Get subgid file] ********************** task path: /tmp/tmp.kBSwYGXc3s/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/handle_user_group.yml:79 Saturday 27 July 2024 12:36:33 -0400 (0:00:00.031) 0:00:43.565 ********* skipping: [managed_node1] => { "changed": false, "false_condition": "not __podman_stat_getsubids.stat.exists", "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.podman : Set user subuid and subgid info] ****** task path: /tmp/tmp.kBSwYGXc3s/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/handle_user_group.yml:84 Saturday 27 July 2024 12:36:33 -0400 (0:00:00.032) 0:00:43.598 ********* skipping: [managed_node1] => { "changed": false, "false_condition": "not __podman_stat_getsubids.stat.exists", "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.podman : Fail if user not in subuid file] ****** task path: /tmp/tmp.kBSwYGXc3s/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/handle_user_group.yml:94 Saturday 27 July 2024 12:36:33 -0400 (0:00:00.031) 0:00:43.630 ********* skipping: [managed_node1] => { "changed": false, "false_condition": "not __podman_stat_getsubids.stat.exists", "skip_reason": "Conditional result was False" } TASK 
[fedora.linux_system_roles.podman : Fail if group not in subgid file] ***** task path: /tmp/tmp.kBSwYGXc3s/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/handle_user_group.yml:101 Saturday 27 July 2024 12:36:33 -0400 (0:00:00.032) 0:00:43.663 ********* skipping: [managed_node1] => { "changed": false, "false_condition": "not __podman_stat_getsubids.stat.exists", "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.podman : Set config file paths] **************** task path: /tmp/tmp.kBSwYGXc3s/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/main.yml:62 Saturday 27 July 2024 12:36:33 -0400 (0:00:00.031) 0:00:43.694 ********* ok: [managed_node1] => { "ansible_facts": { "__podman_container_conf_file": "/root/.config/containers/containers.conf.d/50-systemroles.conf", "__podman_policy_json_file": "/root/.config/containers/policy.json", "__podman_registries_conf_file": "/root/.config/containers/registries.conf.d/50-systemroles.conf", "__podman_storage_conf_file": "/root/.config/containers/storage.conf" }, "changed": false } TASK [fedora.linux_system_roles.podman : Handle container.conf.d] ************** task path: /tmp/tmp.kBSwYGXc3s/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/main.yml:71 Saturday 27 July 2024 12:36:33 -0400 (0:00:00.044) 0:00:43.738 ********* included: /tmp/tmp.kBSwYGXc3s/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/handle_container_conf_d.yml for managed_node1 TASK [fedora.linux_system_roles.podman : Ensure containers.d exists] *********** task path: /tmp/tmp.kBSwYGXc3s/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/handle_container_conf_d.yml:5 Saturday 27 July 2024 12:36:33 -0400 (0:00:00.063) 0:00:43.801 ********* skipping: [managed_node1] => { "changed": false, "false_condition": "podman_containers_conf | length > 0", "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.podman : Update container config 
file] ********* task path: /tmp/tmp.kBSwYGXc3s/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/handle_container_conf_d.yml:13 Saturday 27 July 2024 12:36:33 -0400 (0:00:00.034) 0:00:43.836 ********* skipping: [managed_node1] => { "changed": false, "false_condition": "podman_containers_conf | length > 0", "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.podman : Handle registries.conf.d] ************* task path: /tmp/tmp.kBSwYGXc3s/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/main.yml:74 Saturday 27 July 2024 12:36:33 -0400 (0:00:00.083) 0:00:43.920 ********* included: /tmp/tmp.kBSwYGXc3s/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/handle_registries_conf_d.yml for managed_node1 TASK [fedora.linux_system_roles.podman : Ensure registries.d exists] *********** task path: /tmp/tmp.kBSwYGXc3s/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/handle_registries_conf_d.yml:5 Saturday 27 July 2024 12:36:33 -0400 (0:00:00.065) 0:00:43.985 ********* skipping: [managed_node1] => { "changed": false, "false_condition": "podman_registries_conf | length > 0", "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.podman : Update registries config file] ******** task path: /tmp/tmp.kBSwYGXc3s/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/handle_registries_conf_d.yml:13 Saturday 27 July 2024 12:36:33 -0400 (0:00:00.034) 0:00:44.019 ********* skipping: [managed_node1] => { "changed": false, "false_condition": "podman_registries_conf | length > 0", "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.podman : Handle storage.conf] ****************** task path: /tmp/tmp.kBSwYGXc3s/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/main.yml:77 Saturday 27 July 2024 12:36:33 -0400 (0:00:00.033) 0:00:44.052 ********* included: 
/tmp/tmp.kBSwYGXc3s/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/handle_storage_conf.yml for managed_node1 TASK [fedora.linux_system_roles.podman : Ensure storage.conf parent dir exists] *** task path: /tmp/tmp.kBSwYGXc3s/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/handle_storage_conf.yml:5 Saturday 27 July 2024 12:36:33 -0400 (0:00:00.065) 0:00:44.118 ********* skipping: [managed_node1] => { "changed": false, "false_condition": "podman_storage_conf | length > 0", "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.podman : Update storage config file] *********** task path: /tmp/tmp.kBSwYGXc3s/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/handle_storage_conf.yml:13 Saturday 27 July 2024 12:36:34 -0400 (0:00:00.033) 0:00:44.152 ********* skipping: [managed_node1] => { "changed": false, "false_condition": "podman_storage_conf | length > 0", "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.podman : Handle policy.json] ******************* task path: /tmp/tmp.kBSwYGXc3s/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/main.yml:80 Saturday 27 July 2024 12:36:34 -0400 (0:00:00.032) 0:00:44.185 ********* included: /tmp/tmp.kBSwYGXc3s/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/handle_policy_json.yml for managed_node1 TASK [fedora.linux_system_roles.podman : Ensure policy.json parent dir exists] *** task path: /tmp/tmp.kBSwYGXc3s/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/handle_policy_json.yml:6 Saturday 27 July 2024 12:36:34 -0400 (0:00:00.068) 0:00:44.254 ********* skipping: [managed_node1] => { "changed": false, "false_condition": "podman_policy_json | length > 0", "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.podman : Stat the policy.json file] ************ task path: 
/tmp/tmp.kBSwYGXc3s/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/handle_policy_json.yml:14 Saturday 27 July 2024 12:36:34 -0400 (0:00:00.032) 0:00:44.287 ********* skipping: [managed_node1] => { "changed": false, "false_condition": "podman_policy_json | length > 0", "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.podman : Get the existing policy.json] ********* task path: /tmp/tmp.kBSwYGXc3s/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/handle_policy_json.yml:19 Saturday 27 July 2024 12:36:34 -0400 (0:00:00.034) 0:00:44.321 ********* skipping: [managed_node1] => { "changed": false, "false_condition": "podman_policy_json | length > 0", "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.podman : Write new policy.json file] *********** task path: /tmp/tmp.kBSwYGXc3s/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/handle_policy_json.yml:25 Saturday 27 July 2024 12:36:34 -0400 (0:00:00.032) 0:00:44.354 ********* skipping: [managed_node1] => { "changed": false, "false_condition": "podman_policy_json | length > 0", "skip_reason": "Conditional result was False" } TASK [Manage firewall for specified ports] ************************************* task path: /tmp/tmp.kBSwYGXc3s/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/main.yml:86 Saturday 27 July 2024 12:36:34 -0400 (0:00:00.034) 0:00:44.388 ********* skipping: [managed_node1] => { "changed": false, "false_condition": "podman_firewall | length > 0", "skip_reason": "Conditional result was False" } TASK [Manage selinux for specified ports] ************************************** task path: /tmp/tmp.kBSwYGXc3s/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/main.yml:93 Saturday 27 July 2024 12:36:34 -0400 (0:00:00.032) 0:00:44.421 ********* skipping: [managed_node1] => { "changed": false, "false_condition": "podman_selinux_ports | length > 0", "skip_reason": "Conditional 
result was False" } TASK [fedora.linux_system_roles.podman : Keep track of users that need to cancel linger] *** task path: /tmp/tmp.kBSwYGXc3s/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/main.yml:100 Saturday 27 July 2024 12:36:34 -0400 (0:00:00.033) 0:00:44.454 ********* ok: [managed_node1] => { "ansible_facts": { "__podman_cancel_user_linger": [] }, "changed": false } TASK [fedora.linux_system_roles.podman : Handle certs.d files - present] ******* task path: /tmp/tmp.kBSwYGXc3s/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/main.yml:104 Saturday 27 July 2024 12:36:34 -0400 (0:00:00.032) 0:00:44.487 ********* skipping: [managed_node1] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } TASK [fedora.linux_system_roles.podman : Handle credential files - present] **** task path: /tmp/tmp.kBSwYGXc3s/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/main.yml:113 Saturday 27 July 2024 12:36:34 -0400 (0:00:00.030) 0:00:44.518 ********* skipping: [managed_node1] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } TASK [fedora.linux_system_roles.podman : Handle secrets] *********************** task path: /tmp/tmp.kBSwYGXc3s/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/main.yml:122 Saturday 27 July 2024 12:36:34 -0400 (0:00:00.029) 0:00:44.547 ********* fatal: [managed_node1]: FAILED! => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result" } TASK [Debug3] ****************************************************************** task path: /tmp/tmp.kBSwYGXc3s/ansible_collections/fedora/linux_system_roles/tests/podman/tests_quadlet_basic.yml:253 Saturday 27 July 2024 12:36:34 -0400 (0:00:00.083) 0:00:44.631 ********* fatal: [managed_node1]: FAILED! 
=> { "changed": false, "cmd": "set -x\nset -o pipefail\nexec 1>&2\n#podman volume rm --all\n#podman network prune -f\npodman volume ls\npodman network ls\npodman secret ls\npodman container ls\npodman pod ls\npodman images\nsystemctl list-units | grep quadlet\n", "delta": "0:00:00.217232", "end": "2024-07-27 12:36:35.025221", "rc": 1, "start": "2024-07-27 12:36:34.807989" } STDERR: + set -o pipefail + exec + podman volume ls + podman network ls NETWORK ID NAME DRIVER 2f259bab93aa podman bridge 8f5134bcda5f podman-default-kube-network bridge + podman secret ls ID NAME DRIVER CREATED UPDATED + podman container ls CONTAINER ID IMAGE COMMAND CREATED STATUS PORTS NAMES + podman pod ls POD ID NAME STATUS CREATED INFRA ID # OF CONTAINERS + podman images REPOSITORY TAG IMAGE ID CREATED SIZE localhost/podman-pause 5.1.2-1720656000 dd68369a4c98 6 minutes ago 705 kB quay.io/libpod/testimage 20210610 9f9ec7f2fdef 3 years ago 7.99 MB + systemctl list-units + grep quadlet MSG: non-zero return code TASK [Cleanup user] ************************************************************ task path: /tmp/tmp.kBSwYGXc3s/ansible_collections/fedora/linux_system_roles/tests/podman/tests_quadlet_basic.yml:282 Saturday 27 July 2024 12:36:35 -0400 (0:00:00.600) 0:00:45.231 ********* included: fedora.linux_system_roles.podman for managed_node1 TASK [fedora.linux_system_roles.podman : Set platform/version specific variables] *** task path: /tmp/tmp.kBSwYGXc3s/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/main.yml:3 Saturday 27 July 2024 12:36:35 -0400 (0:00:00.075) 0:00:45.307 ********* included: /tmp/tmp.kBSwYGXc3s/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/set_vars.yml for managed_node1 TASK [fedora.linux_system_roles.podman : Ensure ansible_facts used by role] **** task path: /tmp/tmp.kBSwYGXc3s/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/set_vars.yml:3 Saturday 27 July 2024 12:36:35 -0400 (0:00:00.058) 0:00:45.366 ********* skipping: 
[managed_node1] => { "changed": false, "false_condition": "__podman_required_facts | difference(ansible_facts.keys() | list) | length > 0", "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.podman : Check if system is ostree] ************ task path: /tmp/tmp.kBSwYGXc3s/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/set_vars.yml:11 Saturday 27 July 2024 12:36:35 -0400 (0:00:00.039) 0:00:45.405 ********* skipping: [managed_node1] => { "changed": false, "false_condition": "not __podman_is_ostree is defined", "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.podman : Set flag to indicate system is ostree] *** task path: /tmp/tmp.kBSwYGXc3s/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/set_vars.yml:16 Saturday 27 July 2024 12:36:35 -0400 (0:00:00.033) 0:00:45.438 ********* skipping: [managed_node1] => { "changed": false, "false_condition": "not __podman_is_ostree is defined", "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.podman : Set platform/version specific variables] *** task path: /tmp/tmp.kBSwYGXc3s/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/set_vars.yml:20 Saturday 27 July 2024 12:36:35 -0400 (0:00:00.032) 0:00:45.471 ********* ok: [managed_node1] => (item=RedHat.yml) => { "ansible_facts": { "__podman_packages": [ "podman", "shadow-utils-subid" ] }, "ansible_included_var_files": [ "/tmp/tmp.kBSwYGXc3s/ansible_collections/fedora/linux_system_roles/roles/podman/vars/RedHat.yml" ], "ansible_loop_var": "item", "changed": false, "item": "RedHat.yml" } skipping: [managed_node1] => (item=CentOS.yml) => { "ansible_loop_var": "item", "changed": false, "false_condition": "__vars_file is file", "item": "CentOS.yml", "skip_reason": "Conditional result was False" } ok: [managed_node1] => (item=CentOS_10.yml) => { "ansible_facts": { "__podman_packages": [ "iptables-nft", "podman", "shadow-utils-subid" ] }, 
"ansible_included_var_files": [ "/tmp/tmp.kBSwYGXc3s/ansible_collections/fedora/linux_system_roles/roles/podman/vars/CentOS_10.yml" ], "ansible_loop_var": "item", "changed": false, "item": "CentOS_10.yml" } ok: [managed_node1] => (item=CentOS_10.yml) => { "ansible_facts": { "__podman_packages": [ "iptables-nft", "podman", "shadow-utils-subid" ] }, "ansible_included_var_files": [ "/tmp/tmp.kBSwYGXc3s/ansible_collections/fedora/linux_system_roles/roles/podman/vars/CentOS_10.yml" ], "ansible_loop_var": "item", "changed": false, "item": "CentOS_10.yml" } TASK [fedora.linux_system_roles.podman : Gather the package facts] ************* task path: /tmp/tmp.kBSwYGXc3s/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/main.yml:6 Saturday 27 July 2024 12:36:35 -0400 (0:00:00.074) 0:00:45.546 ********* ok: [managed_node1] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } TASK [fedora.linux_system_roles.podman : Enable copr if requested] ************* task path: /tmp/tmp.kBSwYGXc3s/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/main.yml:10 Saturday 27 July 2024 12:36:36 -0400 (0:00:00.809) 0:00:46.355 ********* skipping: [managed_node1] => { "changed": false, "false_condition": "podman_use_copr | d(false)", "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.podman : Ensure required packages are installed] *** task path: /tmp/tmp.kBSwYGXc3s/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/main.yml:14 Saturday 27 July 2024 12:36:36 -0400 (0:00:00.033) 0:00:46.389 ********* skipping: [managed_node1] => { "changed": false, "false_condition": "(__podman_packages | difference(ansible_facts.packages))", "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.podman : Get podman version] ******************* task path: 
/tmp/tmp.kBSwYGXc3s/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/main.yml:22 Saturday 27 July 2024 12:36:36 -0400 (0:00:00.038) 0:00:46.427 ********* ok: [managed_node1] => { "changed": false, "cmd": [ "podman", "--version" ], "delta": "0:00:00.029739", "end": "2024-07-27 12:36:36.642493", "rc": 0, "start": "2024-07-27 12:36:36.612754" } STDOUT: podman version 5.1.2 TASK [fedora.linux_system_roles.podman : Set podman version] ******************* task path: /tmp/tmp.kBSwYGXc3s/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/main.yml:28 Saturday 27 July 2024 12:36:36 -0400 (0:00:00.423) 0:00:46.851 ********* ok: [managed_node1] => { "ansible_facts": { "podman_version": "5.1.2" }, "changed": false } TASK [fedora.linux_system_roles.podman : Podman package version must be 4.2 or later] *** task path: /tmp/tmp.kBSwYGXc3s/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/main.yml:32 Saturday 27 July 2024 12:36:36 -0400 (0:00:00.036) 0:00:46.887 ********* skipping: [managed_node1] => { "changed": false, "false_condition": "podman_version is version(\"4.2\", \"<\")", "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.podman : Podman package version must be 4.4 or later for quadlet, secrets] *** task path: /tmp/tmp.kBSwYGXc3s/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/main.yml:39 Saturday 27 July 2024 12:36:36 -0400 (0:00:00.031) 0:00:46.919 ********* skipping: [managed_node1] => { "changed": false, "false_condition": "podman_version is version(\"4.4\", \"<\")", "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.podman : Podman package version must be 4.4 or later for quadlet, secrets] *** task path: /tmp/tmp.kBSwYGXc3s/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/main.yml:49 Saturday 27 July 2024 12:36:36 -0400 (0:00:00.045) 0:00:46.964 ********* META: end_host conditional evaluated to False, continuing execution for 
managed_node1
skipping: [managed_node1] => {
    "skip_reason": "end_host conditional evaluated to False, continuing execution for managed_node1"
}
MSG:

end_host conditional evaluated to false, continuing execution for managed_node1

TASK [fedora.linux_system_roles.podman : Check user and group information] *****
task path: /tmp/tmp.kBSwYGXc3s/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/main.yml:56
Saturday 27 July 2024 12:36:36 -0400 (0:00:00.098) 0:00:47.063 *********
included: /tmp/tmp.kBSwYGXc3s/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/handle_user_group.yml for managed_node1

TASK [fedora.linux_system_roles.podman : Get user information] *****************
task path: /tmp/tmp.kBSwYGXc3s/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/handle_user_group.yml:2
Saturday 27 July 2024 12:36:36 -0400 (0:00:00.071) 0:00:47.135 *********
skipping: [managed_node1] => {
    "changed": false,
    "false_condition": "'getent_passwd' not in ansible_facts or __podman_user not in ansible_facts['getent_passwd']",
    "skip_reason": "Conditional result was False"
}

TASK [fedora.linux_system_roles.podman : Fail if user does not exist] **********
task path: /tmp/tmp.kBSwYGXc3s/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/handle_user_group.yml:9
Saturday 27 July 2024 12:36:37 -0400 (0:00:00.037) 0:00:47.172 *********
skipping: [managed_node1] => {
    "changed": false,
    "false_condition": "not ansible_facts[\"getent_passwd\"][__podman_user]",
    "skip_reason": "Conditional result was False"
}

TASK [fedora.linux_system_roles.podman : Set group for podman user] ************
task path: /tmp/tmp.kBSwYGXc3s/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/handle_user_group.yml:16
Saturday 27 July 2024 12:36:37 -0400 (0:00:00.038) 0:00:47.211 *********
ok: [managed_node1] => {
    "ansible_facts": {
        "__podman_group": "1111"
    },
    "changed": false
}

TASK [fedora.linux_system_roles.podman : Get group information] ****************
task path: /tmp/tmp.kBSwYGXc3s/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/handle_user_group.yml:28
Saturday 27 July 2024 12:36:37 -0400 (0:00:00.045) 0:00:47.256 *********
ok: [managed_node1] => {
    "ansible_facts": {
        "getent_group": {
            "user_quadlet_basic": [
                "x",
                "1111",
                ""
            ]
        }
    },
    "changed": false
}

TASK [fedora.linux_system_roles.podman : Set group name] ***********************
task path: /tmp/tmp.kBSwYGXc3s/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/handle_user_group.yml:35
Saturday 27 July 2024 12:36:37 -0400 (0:00:00.399) 0:00:47.655 *********
ok: [managed_node1] => {
    "ansible_facts": {
        "__podman_group_name": "user_quadlet_basic"
    },
    "changed": false
}

TASK [fedora.linux_system_roles.podman : See if getsubids exists] **************
task path: /tmp/tmp.kBSwYGXc3s/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/handle_user_group.yml:39
Saturday 27 July 2024 12:36:37 -0400 (0:00:00.042) 0:00:47.698 *********
ok: [managed_node1] => {
    "changed": false,
    "stat": {
        "atime": 1722097719.096431,
        "attr_flags": "",
        "attributes": [],
        "block_size": 4096,
        "blocks": 32,
        "charset": "binary",
        "checksum": "86395ad7ce62834c967dc50f963a68f042029188",
        "ctime": 1722097690.6782997,
        "dev": 51714,
        "device_type": 0,
        "executable": true,
        "exists": true,
        "gid": 0,
        "gr_name": "root",
        "inode": 4755377,
        "isblk": false,
        "ischr": false,
        "isdir": false,
        "isfifo": false,
        "isgid": false,
        "islnk": false,
        "isreg": true,
        "issock": false,
        "isuid": false,
        "mimetype": "application/x-pie-executable",
        "mode": "0755",
        "mtime": 1719187200.0,
        "nlink": 1,
        "path": "/usr/bin/getsubids",
        "pw_name": "root",
        "readable": true,
        "rgrp": true,
        "roth": true,
        "rusr": true,
        "size": 15728,
        "uid": 0,
        "version": "2656437169",
        "wgrp": false,
        "woth": false,
        "writeable": true,
        "wusr": true,
        "xgrp": true,
        "xoth": true,
        "xusr": true
    }
}

TASK [fedora.linux_system_roles.podman : Check user with getsubids] ************
task path: /tmp/tmp.kBSwYGXc3s/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/handle_user_group.yml:50
Saturday 27 July 2024 12:36:37 -0400 (0:00:00.401) 0:00:48.100 *********
ok: [managed_node1] => {
    "changed": false,
    "cmd": [
        "getsubids",
        "user_quadlet_basic"
    ],
    "delta": "0:00:00.004291",
    "end": "2024-07-27 12:36:38.293744",
    "rc": 0,
    "start": "2024-07-27 12:36:38.289453"
}

STDOUT:

0: user_quadlet_basic 655360 65536

TASK [fedora.linux_system_roles.podman : Check group with getsubids] ***********
task path: /tmp/tmp.kBSwYGXc3s/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/handle_user_group.yml:55
Saturday 27 July 2024 12:36:38 -0400 (0:00:00.402) 0:00:48.503 *********
ok: [managed_node1] => {
    "changed": false,
    "cmd": [
        "getsubids",
        "-g",
        "user_quadlet_basic"
    ],
    "delta": "0:00:00.005750",
    "end": "2024-07-27 12:36:38.699004",
    "rc": 0,
    "start": "2024-07-27 12:36:38.693254"
}

STDOUT:

0: user_quadlet_basic 655360 65536

TASK [fedora.linux_system_roles.podman : Set user subuid and subgid info] ******
task path: /tmp/tmp.kBSwYGXc3s/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/handle_user_group.yml:60
Saturday 27 July 2024 12:36:38 -0400 (0:00:00.404) 0:00:48.907 *********
ok: [managed_node1] => {
    "ansible_facts": {
        "podman_subgid_info": {
            "user_quadlet_basic": {
                "range": 65536,
                "start": 655360
            }
        },
        "podman_subuid_info": {
            "user_quadlet_basic": {
                "range": 65536,
                "start": 655360
            }
        }
    },
    "changed": false
}

TASK [fedora.linux_system_roles.podman : Get subuid file] **********************
task path: /tmp/tmp.kBSwYGXc3s/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/handle_user_group.yml:74
Saturday 27 July 2024 12:36:38 -0400 (0:00:00.050) 0:00:48.957 *********
skipping: [managed_node1] => {
    "changed": false,
    "false_condition": "not __podman_stat_getsubids.stat.exists",
    "skip_reason": "Conditional result was False"
}

TASK [fedora.linux_system_roles.podman : Get subgid file] **********************
task path: /tmp/tmp.kBSwYGXc3s/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/handle_user_group.yml:79
Saturday 27 July 2024 12:36:38 -0400 (0:00:00.032) 0:00:48.989 *********
skipping: [managed_node1] => {
    "changed": false,
    "false_condition": "not __podman_stat_getsubids.stat.exists",
    "skip_reason": "Conditional result was False"
}

TASK [fedora.linux_system_roles.podman : Set user subuid and subgid info] ******
task path: /tmp/tmp.kBSwYGXc3s/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/handle_user_group.yml:84
Saturday 27 July 2024 12:36:38 -0400 (0:00:00.030) 0:00:49.020 *********
skipping: [managed_node1] => {
    "changed": false,
    "false_condition": "not __podman_stat_getsubids.stat.exists",
    "skip_reason": "Conditional result was False"
}

TASK [fedora.linux_system_roles.podman : Fail if user not in subuid file] ******
task path: /tmp/tmp.kBSwYGXc3s/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/handle_user_group.yml:94
Saturday 27 July 2024 12:36:38 -0400 (0:00:00.031) 0:00:49.052 *********
skipping: [managed_node1] => {
    "changed": false,
    "false_condition": "not __podman_stat_getsubids.stat.exists",
    "skip_reason": "Conditional result was False"
}

TASK [fedora.linux_system_roles.podman : Fail if group not in subgid file] *****
task path: /tmp/tmp.kBSwYGXc3s/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/handle_user_group.yml:101
Saturday 27 July 2024 12:36:38 -0400 (0:00:00.030) 0:00:49.082 *********
skipping: [managed_node1] => {
    "changed": false,
    "false_condition": "not __podman_stat_getsubids.stat.exists",
    "skip_reason": "Conditional result was False"
}

TASK [fedora.linux_system_roles.podman : Set config file paths] ****************
task path: /tmp/tmp.kBSwYGXc3s/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/main.yml:62
Saturday 27 July 2024 12:36:38 -0400 (0:00:00.031) 0:00:49.114 *********
ok: [managed_node1] => {
    "ansible_facts": {
        "__podman_container_conf_file": "/root/.config/containers/containers.conf.d/50-systemroles.conf",
        "__podman_policy_json_file": "/root/.config/containers/policy.json",
        "__podman_registries_conf_file": "/root/.config/containers/registries.conf.d/50-systemroles.conf",
        "__podman_storage_conf_file": "/root/.config/containers/storage.conf"
    },
    "changed": false
}

TASK [fedora.linux_system_roles.podman : Handle container.conf.d] **************
task path: /tmp/tmp.kBSwYGXc3s/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/main.yml:71
Saturday 27 July 2024 12:36:39 -0400 (0:00:00.042) 0:00:49.156 *********
included: /tmp/tmp.kBSwYGXc3s/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/handle_container_conf_d.yml for managed_node1

TASK [fedora.linux_system_roles.podman : Ensure containers.d exists] ***********
task path: /tmp/tmp.kBSwYGXc3s/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/handle_container_conf_d.yml:5
Saturday 27 July 2024 12:36:39 -0400 (0:00:00.065) 0:00:49.221 *********
skipping: [managed_node1] => {
    "changed": false,
    "false_condition": "podman_containers_conf | length > 0",
    "skip_reason": "Conditional result was False"
}

TASK [fedora.linux_system_roles.podman : Update container config file] *********
task path: /tmp/tmp.kBSwYGXc3s/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/handle_container_conf_d.yml:13
Saturday 27 July 2024 12:36:39 -0400 (0:00:00.032) 0:00:49.254 *********
skipping: [managed_node1] => {
    "changed": false,
    "false_condition": "podman_containers_conf | length > 0",
    "skip_reason": "Conditional result was False"
}

TASK [fedora.linux_system_roles.podman : Handle registries.conf.d] *************
task path: /tmp/tmp.kBSwYGXc3s/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/main.yml:74
Saturday 27 July 2024 12:36:39 -0400 (0:00:00.080) 0:00:49.335 *********
included: /tmp/tmp.kBSwYGXc3s/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/handle_registries_conf_d.yml for managed_node1

TASK [fedora.linux_system_roles.podman : Ensure registries.d exists] ***********
task path: /tmp/tmp.kBSwYGXc3s/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/handle_registries_conf_d.yml:5
Saturday 27 July 2024 12:36:39 -0400 (0:00:00.066) 0:00:49.401 *********
skipping: [managed_node1] => {
    "changed": false,
    "false_condition": "podman_registries_conf | length > 0",
    "skip_reason": "Conditional result was False"
}

TASK [fedora.linux_system_roles.podman : Update registries config file] ********
task path: /tmp/tmp.kBSwYGXc3s/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/handle_registries_conf_d.yml:13
Saturday 27 July 2024 12:36:39 -0400 (0:00:00.032) 0:00:49.434 *********
skipping: [managed_node1] => {
    "changed": false,
    "false_condition": "podman_registries_conf | length > 0",
    "skip_reason": "Conditional result was False"
}

TASK [fedora.linux_system_roles.podman : Handle storage.conf] ******************
task path: /tmp/tmp.kBSwYGXc3s/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/main.yml:77
Saturday 27 July 2024 12:36:39 -0400 (0:00:00.033) 0:00:49.468 *********
included: /tmp/tmp.kBSwYGXc3s/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/handle_storage_conf.yml for managed_node1

TASK [fedora.linux_system_roles.podman : Ensure storage.conf parent dir exists] ***
task path: /tmp/tmp.kBSwYGXc3s/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/handle_storage_conf.yml:5
Saturday 27 July 2024 12:36:39 -0400 (0:00:00.066) 0:00:49.534 *********
skipping: [managed_node1] => {
    "changed": false,
    "false_condition": "podman_storage_conf | length > 0",
    "skip_reason": "Conditional result was False"
}

TASK [fedora.linux_system_roles.podman : Update storage config file] ***********
task path: /tmp/tmp.kBSwYGXc3s/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/handle_storage_conf.yml:13
Saturday 27 July 2024 12:36:39 -0400 (0:00:00.032) 0:00:49.567 *********
skipping: [managed_node1] => {
    "changed": false,
    "false_condition": "podman_storage_conf | length > 0",
    "skip_reason": "Conditional result was False"
}

TASK [fedora.linux_system_roles.podman : Handle policy.json] *******************
task path: /tmp/tmp.kBSwYGXc3s/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/main.yml:80
Saturday 27 July 2024 12:36:39 -0400 (0:00:00.034) 0:00:49.601 *********
included: /tmp/tmp.kBSwYGXc3s/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/handle_policy_json.yml for managed_node1

TASK [fedora.linux_system_roles.podman : Ensure policy.json parent dir exists] ***
task path: /tmp/tmp.kBSwYGXc3s/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/handle_policy_json.yml:6
Saturday 27 July 2024 12:36:39 -0400 (0:00:00.069) 0:00:49.671 *********
skipping: [managed_node1] => {
    "changed": false,
    "false_condition": "podman_policy_json | length > 0",
    "skip_reason": "Conditional result was False"
}

TASK [fedora.linux_system_roles.podman : Stat the policy.json file] ************
task path: /tmp/tmp.kBSwYGXc3s/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/handle_policy_json.yml:14
Saturday 27 July 2024 12:36:39 -0400 (0:00:00.034) 0:00:49.705 *********
skipping: [managed_node1] => {
    "changed": false,
    "false_condition": "podman_policy_json | length > 0",
    "skip_reason": "Conditional result was False"
}

TASK [fedora.linux_system_roles.podman : Get the existing policy.json] *********
task path: /tmp/tmp.kBSwYGXc3s/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/handle_policy_json.yml:19
Saturday 27 July 2024 12:36:39 -0400 (0:00:00.032) 0:00:49.738 *********
skipping: [managed_node1] => {
    "changed": false,
    "false_condition": "podman_policy_json | length > 0",
    "skip_reason": "Conditional result was False"
}

TASK [fedora.linux_system_roles.podman : Write new policy.json file] ***********
task path: /tmp/tmp.kBSwYGXc3s/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/handle_policy_json.yml:25
Saturday 27 July 2024 12:36:39 -0400 (0:00:00.034) 0:00:49.772 *********
skipping: [managed_node1] => {
    "changed": false,
    "false_condition": "podman_policy_json | length > 0",
    "skip_reason": "Conditional result was False"
}

TASK [Manage firewall for specified ports] *************************************
task path: /tmp/tmp.kBSwYGXc3s/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/main.yml:86
Saturday 27 July 2024 12:36:39 -0400 (0:00:00.032) 0:00:49.805 *********
skipping: [managed_node1] => {
    "changed": false,
    "false_condition": "podman_firewall | length > 0",
    "skip_reason": "Conditional result was False"
}

TASK [Manage selinux for specified ports] **************************************
task path: /tmp/tmp.kBSwYGXc3s/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/main.yml:93
Saturday 27 July 2024 12:36:39 -0400 (0:00:00.033) 0:00:49.839 *********
skipping: [managed_node1] => {
    "changed": false,
    "false_condition": "podman_selinux_ports | length > 0",
    "skip_reason": "Conditional result was False"
}

TASK [fedora.linux_system_roles.podman : Keep track of users that need to cancel linger] ***
task path: /tmp/tmp.kBSwYGXc3s/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/main.yml:100
Saturday 27 July 2024 12:36:39 -0400 (0:00:00.033) 0:00:49.872 *********
ok: [managed_node1] => {
    "ansible_facts": {
        "__podman_cancel_user_linger": []
    },
    "changed": false
}

TASK [fedora.linux_system_roles.podman : Handle certs.d files - present] *******
task path: /tmp/tmp.kBSwYGXc3s/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/main.yml:104
Saturday 27 July 2024 12:36:39 -0400 (0:00:00.033) 0:00:49.905 *********
skipping: [managed_node1] => {
    "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result",
    "changed": false
}

TASK [fedora.linux_system_roles.podman : Handle credential files - present] ****
task path: /tmp/tmp.kBSwYGXc3s/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/main.yml:113
Saturday 27 July 2024 12:36:39 -0400 (0:00:00.028) 0:00:49.934 *********
skipping: [managed_node1] => {
    "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result",
    "changed": false
}

TASK [fedora.linux_system_roles.podman : Handle secrets] ***********************
task path: /tmp/tmp.kBSwYGXc3s/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/main.yml:122
Saturday 27 July 2024 12:36:39 -0400 (0:00:00.029) 0:00:49.964 *********
fatal: [managed_node1]: FAILED! => {
    "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result"
}

TASK [Dump journal] ************************************************************
task path: /tmp/tmp.kBSwYGXc3s/ansible_collections/fedora/linux_system_roles/tests/podman/tests_quadlet_basic.yml:319
Saturday 27 July 2024 12:36:39 -0400 (0:00:00.037) 0:00:50.001 *********
fatal: [managed_node1]: FAILED!
=> {
    "changed": false,
    "cmd": [
        "journalctl",
        "-ex"
    ],
    "delta": "0:00:00.037433",
    "end": "2024-07-27 12:36:40.218081",
    "failed_when_result": true,
    "rc": 0,
    "start": "2024-07-27 12:36:40.180648"
}

STDOUT:

Jul 27 12:31:16 ip-10-31-14-141.us-east-1.aws.redhat.com /usr/bin/podman[24971]: time="2024-07-27T12:31:16-04:00" level=debug msg="Using run root /run/user/3001/containers"
Jul 27 12:31:16 ip-10-31-14-141.us-east-1.aws.redhat.com /usr/bin/podman[24971]: time="2024-07-27T12:31:16-04:00" level=debug msg="Using static dir /home/podman_basic_user/.local/share/containers/storage/libpod"
Jul 27 12:31:16 ip-10-31-14-141.us-east-1.aws.redhat.com /usr/bin/podman[24971]: time="2024-07-27T12:31:16-04:00" level=debug msg="Using tmp dir /run/user/3001/libpod/tmp"
Jul 27 12:31:16 ip-10-31-14-141.us-east-1.aws.redhat.com /usr/bin/podman[24971]: time="2024-07-27T12:31:16-04:00" level=debug msg="Using volume path /home/podman_basic_user/.local/share/containers/storage/volumes"
Jul 27 12:31:16 ip-10-31-14-141.us-east-1.aws.redhat.com /usr/bin/podman[24971]: time="2024-07-27T12:31:16-04:00" level=debug msg="Using transient store: false"
Jul 27 12:31:16 ip-10-31-14-141.us-east-1.aws.redhat.com /usr/bin/podman[24971]: time="2024-07-27T12:31:16-04:00" level=debug msg="[graphdriver] trying provided driver \"overlay\""
Jul 27 12:31:16 ip-10-31-14-141.us-east-1.aws.redhat.com /usr/bin/podman[24971]: time="2024-07-27T12:31:16-04:00" level=debug msg="Cached value indicated that overlay is supported"
Jul 27 12:31:16 ip-10-31-14-141.us-east-1.aws.redhat.com /usr/bin/podman[24971]: time="2024-07-27T12:31:16-04:00" level=debug msg="Cached value indicated that overlay is supported"
Jul 27 12:31:16 ip-10-31-14-141.us-east-1.aws.redhat.com /usr/bin/podman[24971]: time="2024-07-27T12:31:16-04:00" level=debug msg="Cached value indicated that metacopy is not being used"
Jul 27 12:31:16 ip-10-31-14-141.us-east-1.aws.redhat.com /usr/bin/podman[24971]: time="2024-07-27T12:31:16-04:00" level=debug msg="Cached value indicated that native-diff is usable"
Jul 27 12:31:16 ip-10-31-14-141.us-east-1.aws.redhat.com /usr/bin/podman[24971]: time="2024-07-27T12:31:16-04:00" level=debug msg="backingFs=xfs, projectQuotaSupported=false, useNativeDiff=true, usingMetacopy=false"
Jul 27 12:31:16 ip-10-31-14-141.us-east-1.aws.redhat.com /usr/bin/podman[24971]: time="2024-07-27T12:31:16-04:00" level=debug msg="Initializing event backend file"
Jul 27 12:31:16 ip-10-31-14-141.us-east-1.aws.redhat.com /usr/bin/podman[24971]: time="2024-07-27T12:31:16-04:00" level=debug msg="Configured OCI runtime runsc initialization failed: no valid executable found for OCI runtime runsc: invalid argument"
Jul 27 12:31:16 ip-10-31-14-141.us-east-1.aws.redhat.com /usr/bin/podman[24971]: time="2024-07-27T12:31:16-04:00" level=debug msg="Configured OCI runtime ocijail initialization failed: no valid executable found for OCI runtime ocijail: invalid argument"
Jul 27 12:31:16 ip-10-31-14-141.us-east-1.aws.redhat.com /usr/bin/podman[24971]: time="2024-07-27T12:31:16-04:00" level=debug msg="Configured OCI runtime runc initialization failed: no valid executable found for OCI runtime runc: invalid argument"
Jul 27 12:31:16 ip-10-31-14-141.us-east-1.aws.redhat.com /usr/bin/podman[24971]: time="2024-07-27T12:31:16-04:00" level=debug msg="Configured OCI runtime runj initialization failed: no valid executable found for OCI runtime runj: invalid argument"
Jul 27 12:31:16 ip-10-31-14-141.us-east-1.aws.redhat.com /usr/bin/podman[24971]: time="2024-07-27T12:31:16-04:00" level=debug msg="Configured OCI runtime kata initialization failed: no valid executable found for OCI runtime kata: invalid argument"
Jul 27 12:31:16 ip-10-31-14-141.us-east-1.aws.redhat.com /usr/bin/podman[24971]: time="2024-07-27T12:31:16-04:00" level=debug msg="Configured OCI runtime youki initialization failed: no valid executable found for OCI runtime youki: invalid argument"
Jul 27 12:31:16 ip-10-31-14-141.us-east-1.aws.redhat.com /usr/bin/podman[24971]: time="2024-07-27T12:31:16-04:00" level=debug msg="Configured OCI runtime krun initialization failed: no valid executable found for OCI runtime krun: invalid argument"
Jul 27 12:31:16 ip-10-31-14-141.us-east-1.aws.redhat.com /usr/bin/podman[24971]: time="2024-07-27T12:31:16-04:00" level=debug msg="Configured OCI runtime crun-vm initialization failed: no valid executable found for OCI runtime crun-vm: invalid argument"
Jul 27 12:31:16 ip-10-31-14-141.us-east-1.aws.redhat.com /usr/bin/podman[24971]: time="2024-07-27T12:31:16-04:00" level=debug msg="Configured OCI runtime crun-wasm initialization failed: no valid executable found for OCI runtime crun-wasm: invalid argument"
Jul 27 12:31:16 ip-10-31-14-141.us-east-1.aws.redhat.com /usr/bin/podman[24971]: time="2024-07-27T12:31:16-04:00" level=debug msg="Using OCI runtime \"/usr/bin/crun\""
Jul 27 12:31:16 ip-10-31-14-141.us-east-1.aws.redhat.com /usr/bin/podman[24971]: time="2024-07-27T12:31:16-04:00" level=info msg="Setting parallel job count to 7"
Jul 27 12:31:16 ip-10-31-14-141.us-east-1.aws.redhat.com /usr/bin/podman[24971]: time="2024-07-27T12:31:16-04:00" level=debug msg="Called cleanup.PersistentPostRunE(/usr/bin/podman --root /home/podman_basic_user/.local/share/containers/storage --runroot /run/user/3001/containers --log-level debug --cgroup-manager systemd --tmpdir /run/user/3001/libpod/tmp --network-config-dir --network-backend netavark --volumepath /home/podman_basic_user/.local/share/containers/storage/volumes --db-backend sqlite --transient-store=false --runtime crun --storage-driver overlay --events-backend file --syslog container cleanup 76c131a2b32a52a9eaadea10a40ace519e973b74ff69ea52e0b11c23c34611de)"
Jul 27 12:31:16 ip-10-31-14-141.us-east-1.aws.redhat.com /usr/bin/podman[24971]: time="2024-07-27T12:31:16-04:00" level=debug msg="Shutting down engines"
Jul 27 12:31:26 ip-10-31-14-141.us-east-1.aws.redhat.com podman[24960]: time="2024-07-27T12:31:26-04:00" level=warning msg="StopSignal SIGTERM failed to stop container httpd1-httpd1 in 10 seconds, resorting to SIGKILL"
Jul 27 12:31:26 ip-10-31-14-141.us-east-1.aws.redhat.com conmon[24476]: conmon dc15e8e7f8dd137dfb1c : container 24478 exited with status 137
Jul 27 12:31:26 ip-10-31-14-141.us-east-1.aws.redhat.com /usr/bin/podman[24989]: time="2024-07-27T12:31:26-04:00" level=debug msg="Called cleanup.PersistentPreRunE(/usr/bin/podman --root /home/podman_basic_user/.local/share/containers/storage --runroot /run/user/3001/containers --log-level debug --cgroup-manager systemd --tmpdir /run/user/3001/libpod/tmp --network-config-dir --network-backend netavark --volumepath /home/podman_basic_user/.local/share/containers/storage/volumes --db-backend sqlite --transient-store=false --runtime crun --storage-driver overlay --events-backend file --syslog container cleanup dc15e8e7f8dd137dfb1cdda98da4f4285f4beb4b0f7b827d72e1d53a2cc2b6fd)"
Jul 27 12:31:26 ip-10-31-14-141.us-east-1.aws.redhat.com /usr/bin/podman[24989]: time="2024-07-27T12:31:26-04:00" level=debug msg="Setting custom database backend: \"sqlite\""
Jul 27 12:31:26 ip-10-31-14-141.us-east-1.aws.redhat.com /usr/bin/podman[24989]: time="2024-07-27T12:31:26-04:00" level=debug msg="Using conmon: \"/usr/bin/conmon\""
Jul 27 12:31:26 ip-10-31-14-141.us-east-1.aws.redhat.com /usr/bin/podman[24989]: time="2024-07-27T12:31:26-04:00" level=info msg="Using sqlite as database backend"
Jul 27 12:31:26 ip-10-31-14-141.us-east-1.aws.redhat.com systemd[23350]: Stopping libpod-conmon-dc15e8e7f8dd137dfb1cdda98da4f4285f4beb4b0f7b827d72e1d53a2cc2b6fd.scope...
░░ Subject: A stop job for unit UNIT has begun execution
░░ Defined-By: systemd
░░ Support: https://access.redhat.com/support
░░
░░ A stop job for unit UNIT has begun execution.
░░
░░ The job identifier is 89.
Jul 27 12:31:26 ip-10-31-14-141.us-east-1.aws.redhat.com /usr/bin/podman[24989]: time="2024-07-27T12:31:26-04:00" level=info msg="Received shutdown signal \"terminated\", terminating!" PID=24989
Jul 27 12:31:26 ip-10-31-14-141.us-east-1.aws.redhat.com /usr/bin/podman[24989]: time="2024-07-27T12:31:26-04:00" level=info msg="Invoking shutdown handler \"libpod\"" PID=24989
Jul 27 12:31:26 ip-10-31-14-141.us-east-1.aws.redhat.com systemd[23350]: Stopped libpod-conmon-dc15e8e7f8dd137dfb1cdda98da4f4285f4beb4b0f7b827d72e1d53a2cc2b6fd.scope.
░░ Subject: A stop job for unit UNIT has finished
░░ Defined-By: systemd
░░ Support: https://access.redhat.com/support
░░
░░ A stop job for unit UNIT has finished.
░░
░░ The job identifier is 89 and the job result is done.
Jul 27 12:31:26 ip-10-31-14-141.us-east-1.aws.redhat.com systemd[23350]: Removed slice user-libpod_pod_c2a70b574e2ebb5c1899aa1f54c69ac32999587dc9dccd553a33bb4100100c51.slice - cgroup user-libpod_pod_c2a70b574e2ebb5c1899aa1f54c69ac32999587dc9dccd553a33bb4100100c51.slice.
░░ Subject: A stop job for unit UNIT has finished
░░ Defined-By: systemd
░░ Support: https://access.redhat.com/support
░░
░░ A stop job for unit UNIT has finished.
░░
░░ The job identifier is 88 and the job result is done.
Jul 27 12:31:26 ip-10-31-14-141.us-east-1.aws.redhat.com systemd[23350]: user-libpod_pod_c2a70b574e2ebb5c1899aa1f54c69ac32999587dc9dccd553a33bb4100100c51.slice: Failed to open /run/user/3001/systemd/transient/user-libpod_pod_c2a70b574e2ebb5c1899aa1f54c69ac32999587dc9dccd553a33bb4100100c51.slice: No such file or directory
Jul 27 12:31:26 ip-10-31-14-141.us-east-1.aws.redhat.com podman[24960]: time="2024-07-27T12:31:26-04:00" level=error msg="Checking if infra needs to be stopped: removing pod c2a70b574e2ebb5c1899aa1f54c69ac32999587dc9dccd553a33bb4100100c51 cgroup: Unit user-libpod_pod_c2a70b574e2ebb5c1899aa1f54c69ac32999587dc9dccd553a33bb4100100c51.slice not loaded."
Jul 27 12:31:26 ip-10-31-14-141.us-east-1.aws.redhat.com systemd[23350]: user-libpod_pod_c2a70b574e2ebb5c1899aa1f54c69ac32999587dc9dccd553a33bb4100100c51.slice: Failed to open /run/user/3001/systemd/transient/user-libpod_pod_c2a70b574e2ebb5c1899aa1f54c69ac32999587dc9dccd553a33bb4100100c51.slice: No such file or directory
Jul 27 12:31:26 ip-10-31-14-141.us-east-1.aws.redhat.com podman[24960]: Pods stopped:
Jul 27 12:31:26 ip-10-31-14-141.us-east-1.aws.redhat.com podman[24960]: c2a70b574e2ebb5c1899aa1f54c69ac32999587dc9dccd553a33bb4100100c51
Jul 27 12:31:26 ip-10-31-14-141.us-east-1.aws.redhat.com podman[24960]: Pods removed:
Jul 27 12:31:26 ip-10-31-14-141.us-east-1.aws.redhat.com podman[24960]: Error: removing pod c2a70b574e2ebb5c1899aa1f54c69ac32999587dc9dccd553a33bb4100100c51 cgroup: removing pod c2a70b574e2ebb5c1899aa1f54c69ac32999587dc9dccd553a33bb4100100c51 cgroup: Unit user-libpod_pod_c2a70b574e2ebb5c1899aa1f54c69ac32999587dc9dccd553a33bb4100100c51.slice not loaded.
Jul 27 12:31:26 ip-10-31-14-141.us-east-1.aws.redhat.com podman[24960]: Secrets removed:
Jul 27 12:31:26 ip-10-31-14-141.us-east-1.aws.redhat.com podman[24960]: Error: %!s()
Jul 27 12:31:26 ip-10-31-14-141.us-east-1.aws.redhat.com podman[24960]: Volumes removed:
Jul 27 12:31:26 ip-10-31-14-141.us-east-1.aws.redhat.com systemd[23350]: Created slice user-libpod_pod_eb0fd24ab262a446ab756535a25f500b5ec073e9a28a2dc378eb03b2412f15c9.slice - cgroup user-libpod_pod_eb0fd24ab262a446ab756535a25f500b5ec073e9a28a2dc378eb03b2412f15c9.slice.
░░ Subject: A start job for unit UNIT has finished successfully
░░ Defined-By: systemd
░░ Support: https://access.redhat.com/support
░░
░░ A start job for unit UNIT has finished successfully.
░░
░░ The job identifier is 90.
Jul 27 12:31:26 ip-10-31-14-141.us-east-1.aws.redhat.com systemd[23350]: Started libpod-0d75ad12c35c94688926ef6462d99017b60a05a0ac7d54ba33afb9c5ba9a32d9.scope - libcrun container.
░░ Subject: A start job for unit UNIT has finished successfully
░░ Defined-By: systemd
░░ Support: https://access.redhat.com/support
░░
░░ A start job for unit UNIT has finished successfully.
░░
░░ The job identifier is 94.
Jul 27 12:31:26 ip-10-31-14-141.us-east-1.aws.redhat.com systemd[23350]: Started rootless-netns-f99ab1a0.scope.
░░ Subject: A start job for unit UNIT has finished successfully
░░ Defined-By: systemd
░░ Support: https://access.redhat.com/support
░░
░░ A start job for unit UNIT has finished successfully.
░░
░░ The job identifier is 98.
Jul 27 12:31:26 ip-10-31-14-141.us-east-1.aws.redhat.com kernel: podman1: port 1(veth0) entered blocking state
Jul 27 12:31:26 ip-10-31-14-141.us-east-1.aws.redhat.com kernel: podman1: port 1(veth0) entered disabled state
Jul 27 12:31:26 ip-10-31-14-141.us-east-1.aws.redhat.com kernel: veth0: entered allmulticast mode
Jul 27 12:31:26 ip-10-31-14-141.us-east-1.aws.redhat.com kernel: veth0: entered promiscuous mode
Jul 27 12:31:26 ip-10-31-14-141.us-east-1.aws.redhat.com kernel: podman1: port 1(veth0) entered blocking state
Jul 27 12:31:26 ip-10-31-14-141.us-east-1.aws.redhat.com kernel: podman1: port 1(veth0) entered forwarding state
Jul 27 12:31:26 ip-10-31-14-141.us-east-1.aws.redhat.com systemd[23350]: Started run-rdcf39d7dc492494fa8727ea1eb5b9554.scope - /usr/libexec/podman/aardvark-dns --config /run/user/3001/containers/networks/aardvark-dns -p 53 run.
░░ Subject: A start job for unit UNIT has finished successfully
░░ Defined-By: systemd
░░ Support: https://access.redhat.com/support
░░
░░ A start job for unit UNIT has finished successfully.
░░
░░ The job identifier is 102.
Jul 27 12:31:26 ip-10-31-14-141.us-east-1.aws.redhat.com systemd[23350]: Started libpod-4d3f746ddfdc8e839e07191dc7d9454e765282c5fa18837b0c9d6e959dc28147.scope - libcrun container.
░░ Subject: A start job for unit UNIT has finished successfully
░░ Defined-By: systemd
░░ Support: https://access.redhat.com/support
░░
░░ A start job for unit UNIT has finished successfully.
░░
░░ The job identifier is 106.
Jul 27 12:31:27 ip-10-31-14-141.us-east-1.aws.redhat.com systemd[23350]: Started libpod-76f2ca2651d698039f05d1d26188e9285c257012702c15c7df24f3cff7078f97.scope - libcrun container.
░░ Subject: A start job for unit UNIT has finished successfully
░░ Defined-By: systemd
░░ Support: https://access.redhat.com/support
░░
░░ A start job for unit UNIT has finished successfully.
░░
░░ The job identifier is 111.
Jul 27 12:31:27 ip-10-31-14-141.us-east-1.aws.redhat.com podman[24960]: Pod:
Jul 27 12:31:27 ip-10-31-14-141.us-east-1.aws.redhat.com podman[24960]: eb0fd24ab262a446ab756535a25f500b5ec073e9a28a2dc378eb03b2412f15c9
Jul 27 12:31:27 ip-10-31-14-141.us-east-1.aws.redhat.com podman[24960]: Container:
Jul 27 12:31:27 ip-10-31-14-141.us-east-1.aws.redhat.com podman[24960]: 76f2ca2651d698039f05d1d26188e9285c257012702c15c7df24f3cff7078f97
Jul 27 12:31:27 ip-10-31-14-141.us-east-1.aws.redhat.com systemd[23350]: Started podman-kube@-home-podman_basic_user-.config-containers-ansible\x2dkubernetes.d-httpd1.yml.service - A template for running K8s workloads via podman-kube-play.
░░ Subject: A start job for unit UNIT has finished successfully
░░ Defined-By: systemd
░░ Support: https://access.redhat.com/support
░░
░░ A start job for unit UNIT has finished successfully.
░░
░░ The job identifier is 75.
Jul 27 12:31:27 ip-10-31-14-141.us-east-1.aws.redhat.com sudo[24954]: pam_unix(sudo:session): session closed for user podman_basic_user
Jul 27 12:31:27 ip-10-31-14-141.us-east-1.aws.redhat.com python3.12[25160]: ansible-getent Invoked with database=passwd key=root fail_key=False service=None split=None
Jul 27 12:31:28 ip-10-31-14-141.us-east-1.aws.redhat.com python3.12[25274]: ansible-getent Invoked with database=group key=0 fail_key=False service=None split=None
Jul 27 12:31:28 ip-10-31-14-141.us-east-1.aws.redhat.com python3.12[25388]: ansible-stat Invoked with path=/usr/bin/getsubids follow=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Jul 27 12:31:29 ip-10-31-14-141.us-east-1.aws.redhat.com python3.12[25503]: ansible-ansible.legacy.command Invoked with _raw_params=systemd-escape --template podman-kube@.service /etc/containers/ansible-kubernetes.d/httpd2.yml _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Jul 27 12:31:30 ip-10-31-14-141.us-east-1.aws.redhat.com python3.12[25617]: ansible-file Invoked with path=/tmp/lsr_ykmfaaw1_podman/httpd2 state=directory owner=root group=root recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None seuser=None serole=None selevel=None setype=None attributes=None
Jul 27 12:31:31 ip-10-31-14-141.us-east-1.aws.redhat.com python3.12[25730]: ansible-file Invoked with path=/tmp/lsr_ykmfaaw1_podman/httpd2-create state=directory owner=root group=root recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None seuser=None serole=None selevel=None setype=None attributes=None
Jul 27 12:31:31 ip-10-31-14-141.us-east-1.aws.redhat.com systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
░░ Subject: Unit succeeded
░░ Defined-By: systemd
░░ Support: https://access.redhat.com/support
░░
░░ The unit var-lib-containers-storage-overlay.mount has successfully entered the 'dead' state.
Jul 27 12:31:31 ip-10-31-14-141.us-east-1.aws.redhat.com systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
░░ Subject: Unit succeeded
░░ Defined-By: systemd
░░ Support: https://access.redhat.com/support
░░
░░ The unit var-lib-containers-storage-overlay.mount has successfully entered the 'dead' state.
Jul 27 12:31:32 ip-10-31-14-141.us-east-1.aws.redhat.com podman[25859]: 2024-07-27 12:31:32.240860999 -0400 EDT m=+0.341601332 image pull 9f9ec7f2fdef9168f74e9d057f307955db14d782cff22ded51d277d74798cb2f quay.io/libpod/testimage:20210610
Jul 27 12:31:32 ip-10-31-14-141.us-east-1.aws.redhat.com python3.12[25987]: ansible-stat Invoked with path=/etc/containers/ansible-kubernetes.d/httpd2.yml follow=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Jul 27 12:31:32 ip-10-31-14-141.us-east-1.aws.redhat.com systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
░░ Subject: Unit succeeded
░░ Defined-By: systemd
░░ Support: https://access.redhat.com/support
░░
░░ The unit var-lib-containers-storage-overlay.mount has successfully entered the 'dead' state.
Jul 27 12:31:33 ip-10-31-14-141.us-east-1.aws.redhat.com python3.12[26100]: ansible-file Invoked with path=/etc/containers/ansible-kubernetes.d state=directory owner=root group=0 mode=0755 recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jul 27 12:31:33 ip-10-31-14-141.us-east-1.aws.redhat.com python3.12[26213]: ansible-ansible.legacy.stat Invoked with path=/etc/containers/ansible-kubernetes.d/httpd2.yml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Jul 27 12:31:33 ip-10-31-14-141.us-east-1.aws.redhat.com python3.12[26303]: ansible-ansible.legacy.copy Invoked with dest=/etc/containers/ansible-kubernetes.d/httpd2.yml owner=root group=0 mode=0644 src=/root/.ansible/tmp/ansible-tmp-1722097893.2574754-8233-220178835220762/.source.yml _original_basename=.b8zurbx6 follow=False checksum=091c16d925d6727a426e671d7cddd074cf4d4a44 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jul 27 12:31:34 ip-10-31-14-141.us-east-1.aws.redhat.com python3.12[26416]: ansible-containers.podman.podman_play Invoked with state=started debug=True log_level=debug kube_file=/etc/containers/ansible-kubernetes.d/httpd2.yml executable=podman annotation=None authfile=None build=None cert_dir=None configmap=None context_dir=None seccomp_profile_root=None username=None password=NOT_LOGGING_PARAMETER log_driver=None log_opt=None network=None tls_verify=None quiet=None recreate=None userns=None quadlet_dir=None quadlet_filename=None quadlet_options=None
Jul 27 12:31:34 ip-10-31-14-141.us-east-1.aws.redhat.com systemd[1]: Created slice machine-libpod_pod_ab3f47105f53d1730873f106426ed61a89d93228bfacb6ee670475b2d79fdeba.slice - cgroup machine-libpod_pod_ab3f47105f53d1730873f106426ed61a89d93228bfacb6ee670475b2d79fdeba.slice.
░░ Subject: A start job for unit machine-libpod_pod_ab3f47105f53d1730873f106426ed61a89d93228bfacb6ee670475b2d79fdeba.slice has finished successfully
░░ Defined-By: systemd
░░ Support: https://access.redhat.com/support
░░
░░ A start job for unit machine-libpod_pod_ab3f47105f53d1730873f106426ed61a89d93228bfacb6ee670475b2d79fdeba.slice has finished successfully.
░░
░░ The job identifier is 2400.
Jul 27 12:31:34 ip-10-31-14-141.us-east-1.aws.redhat.com podman[26424]: 2024-07-27 12:31:34.480997315 -0400 EDT m=+0.092060451 container create dbbab1074644820422d30320b7d51b465feadc09461c0f66b072f67d201c5833 (image=localhost/podman-pause:5.1.2-1720656000, name=ab3f47105f53-infra, pod_id=ab3f47105f53d1730873f106426ed61a89d93228bfacb6ee670475b2d79fdeba, io.buildah.version=1.36.0)
Jul 27 12:31:34 ip-10-31-14-141.us-east-1.aws.redhat.com podman[26424]: 2024-07-27 12:31:34.486968544 -0400 EDT m=+0.098031812 pod create ab3f47105f53d1730873f106426ed61a89d93228bfacb6ee670475b2d79fdeba (image=, name=httpd2)
Jul 27 12:31:34 ip-10-31-14-141.us-east-1.aws.redhat.com podman[26424]: 2024-07-27 12:31:34.491218115 -0400 EDT m=+0.102281418 image pull 9f9ec7f2fdef9168f74e9d057f307955db14d782cff22ded51d277d74798cb2f quay.io/libpod/testimage:20210610
Jul 27 12:31:34 ip-10-31-14-141.us-east-1.aws.redhat.com podman[26424]: 2024-07-27 12:31:34.5195566 -0400 EDT m=+0.130619732 container create 93532dadc1783be2948aef80b74ba1a5371faf20d5d5253e0eb8cacae7fd37bf (image=quay.io/libpod/testimage:20210610, name=httpd2-httpd2, pod_id=ab3f47105f53d1730873f106426ed61a89d93228bfacb6ee670475b2d79fdeba, io.containers.autoupdate=registry, created_at=2021-06-10T18:55:36Z, created_by=test/system/build-testimage, io.buildah.version=1.21.0, app=test)
Jul 27 12:31:34 ip-10-31-14-141.us-east-1.aws.redhat.com NetworkManager[678]: [1722097894.5330] manager: (podman1): new Bridge device (/org/freedesktop/NetworkManager/Devices/3)
Jul 27 12:31:34 ip-10-31-14-141.us-east-1.aws.redhat.com kernel: podman1: port 1(veth0) entered blocking state
Jul 27 12:31:34 ip-10-31-14-141.us-east-1.aws.redhat.com kernel: podman1: port 1(veth0) entered disabled state
Jul 27 12:31:34 ip-10-31-14-141.us-east-1.aws.redhat.com kernel: veth0: entered allmulticast mode
Jul 27 12:31:34 ip-10-31-14-141.us-east-1.aws.redhat.com kernel: veth0: entered promiscuous mode
Jul 27 12:31:34 ip-10-31-14-141.us-east-1.aws.redhat.com NetworkManager[678]: [1722097894.5556] manager: (veth0): new Veth device (/org/freedesktop/NetworkManager/Devices/4)
Jul 27 12:31:34 ip-10-31-14-141.us-east-1.aws.redhat.com kernel: podman1: port 1(veth0) entered blocking state
Jul 27 12:31:34 ip-10-31-14-141.us-east-1.aws.redhat.com kernel: podman1: port 1(veth0) entered forwarding state
Jul 27 12:31:34 ip-10-31-14-141.us-east-1.aws.redhat.com NetworkManager[678]: [1722097894.5613] device (veth0): carrier: link connected
Jul 27 12:31:34 ip-10-31-14-141.us-east-1.aws.redhat.com NetworkManager[678]: [1722097894.5615] device (podman1): carrier: link connected
Jul 27 12:31:34 ip-10-31-14-141.us-east-1.aws.redhat.com (udev-worker)[26438]: Network interface NamePolicy= disabled on kernel command line.
Jul 27 12:31:34 ip-10-31-14-141.us-east-1.aws.redhat.com (udev-worker)[26437]: Network interface NamePolicy= disabled on kernel command line.
Jul 27 12:31:34 ip-10-31-14-141.us-east-1.aws.redhat.com NetworkManager[678]: [1722097894.6273] device (podman1): state change: unmanaged -> unavailable (reason 'connection-assumed', sys-iface-state: 'external') Jul 27 12:31:34 ip-10-31-14-141.us-east-1.aws.redhat.com NetworkManager[678]: [1722097894.6280] device (podman1): state change: unavailable -> disconnected (reason 'connection-assumed', sys-iface-state: 'external') Jul 27 12:31:34 ip-10-31-14-141.us-east-1.aws.redhat.com NetworkManager[678]: [1722097894.6293] device (podman1): Activation: starting connection 'podman1' (dc037794-997c-44a1-b081-d822a2c4882e) Jul 27 12:31:34 ip-10-31-14-141.us-east-1.aws.redhat.com NetworkManager[678]: [1722097894.6296] device (podman1): state change: disconnected -> prepare (reason 'none', sys-iface-state: 'external') Jul 27 12:31:34 ip-10-31-14-141.us-east-1.aws.redhat.com NetworkManager[678]: [1722097894.6304] device (podman1): state change: prepare -> config (reason 'none', sys-iface-state: 'external') Jul 27 12:31:34 ip-10-31-14-141.us-east-1.aws.redhat.com NetworkManager[678]: [1722097894.6308] device (podman1): state change: config -> ip-config (reason 'none', sys-iface-state: 'external') Jul 27 12:31:34 ip-10-31-14-141.us-east-1.aws.redhat.com NetworkManager[678]: [1722097894.6311] device (podman1): state change: ip-config -> ip-check (reason 'none', sys-iface-state: 'external') Jul 27 12:31:34 ip-10-31-14-141.us-east-1.aws.redhat.com systemd[1]: Starting NetworkManager-dispatcher.service - Network Manager Script Dispatcher Service... ░░ Subject: A start job for unit NetworkManager-dispatcher.service has begun execution ░░ Defined-By: systemd ░░ Support: https://access.redhat.com/support ░░ ░░ A start job for unit NetworkManager-dispatcher.service has begun execution. ░░ ░░ The job identifier is 2407. Jul 27 12:31:34 ip-10-31-14-141.us-east-1.aws.redhat.com systemd[1]: Started NetworkManager-dispatcher.service - Network Manager Script Dispatcher Service. 
░░ Subject: A start job for unit NetworkManager-dispatcher.service has finished successfully ░░ Defined-By: systemd ░░ Support: https://access.redhat.com/support ░░ ░░ A start job for unit NetworkManager-dispatcher.service has finished successfully. ░░ ░░ The job identifier is 2407. Jul 27 12:31:34 ip-10-31-14-141.us-east-1.aws.redhat.com NetworkManager[678]: [1722097894.6707] device (podman1): state change: ip-check -> secondaries (reason 'none', sys-iface-state: 'external') Jul 27 12:31:34 ip-10-31-14-141.us-east-1.aws.redhat.com NetworkManager[678]: [1722097894.6710] device (podman1): state change: secondaries -> activated (reason 'none', sys-iface-state: 'external') Jul 27 12:31:34 ip-10-31-14-141.us-east-1.aws.redhat.com NetworkManager[678]: [1722097894.6716] device (podman1): Activation: successful, device activated. Jul 27 12:31:34 ip-10-31-14-141.us-east-1.aws.redhat.com systemd[1]: Started run-r195f68d412d64ef19135845924417b7c.scope - /usr/libexec/podman/aardvark-dns --config /run/containers/networks/aardvark-dns -p 53 run. ░░ Subject: A start job for unit run-r195f68d412d64ef19135845924417b7c.scope has finished successfully ░░ Defined-By: systemd ░░ Support: https://access.redhat.com/support ░░ ░░ A start job for unit run-r195f68d412d64ef19135845924417b7c.scope has finished successfully. ░░ ░░ The job identifier is 2487. 
Jul 27 12:31:34 ip-10-31-14-141.us-east-1.aws.redhat.com aardvark-dns[26463]: starting aardvark on a child with pid 26474 Jul 27 12:31:34 ip-10-31-14-141.us-east-1.aws.redhat.com aardvark-dns[26474]: Successfully parsed config Jul 27 12:31:34 ip-10-31-14-141.us-east-1.aws.redhat.com aardvark-dns[26474]: Listen v4 ip {"podman-default-kube-network": [10.89.0.1]} Jul 27 12:31:34 ip-10-31-14-141.us-east-1.aws.redhat.com aardvark-dns[26474]: Listen v6 ip {} Jul 27 12:31:34 ip-10-31-14-141.us-east-1.aws.redhat.com aardvark-dns[26474]: Will Forward dns requests to udp://1.1.1.1:53 Jul 27 12:31:34 ip-10-31-14-141.us-east-1.aws.redhat.com aardvark-dns[26474]: Starting listen on udp 10.89.0.1:53 Jul 27 12:31:34 ip-10-31-14-141.us-east-1.aws.redhat.com systemd[1]: Started libpod-conmon-dbbab1074644820422d30320b7d51b465feadc09461c0f66b072f67d201c5833.scope. ░░ Subject: A start job for unit libpod-conmon-dbbab1074644820422d30320b7d51b465feadc09461c0f66b072f67d201c5833.scope has finished successfully ░░ Defined-By: systemd ░░ Support: https://access.redhat.com/support ░░ ░░ A start job for unit libpod-conmon-dbbab1074644820422d30320b7d51b465feadc09461c0f66b072f67d201c5833.scope has finished successfully. ░░ ░░ The job identifier is 2493. Jul 27 12:31:34 ip-10-31-14-141.us-east-1.aws.redhat.com conmon[26480]: conmon dbbab1074644820422d3 : addr{sun_family=AF_UNIX, sun_path=/proc/self/fd/12/attach} Jul 27 12:31:34 ip-10-31-14-141.us-east-1.aws.redhat.com conmon[26480]: conmon dbbab1074644820422d3 : terminal_ctrl_fd: 12 Jul 27 12:31:34 ip-10-31-14-141.us-east-1.aws.redhat.com conmon[26480]: conmon dbbab1074644820422d3 : winsz read side: 16, winsz write side: 17 Jul 27 12:31:34 ip-10-31-14-141.us-east-1.aws.redhat.com systemd[1]: tmp-crun.mO1K4v.mount: Deactivated successfully. ░░ Subject: Unit succeeded ░░ Defined-By: systemd ░░ Support: https://access.redhat.com/support ░░ ░░ The unit tmp-crun.mO1K4v.mount has successfully entered the 'dead' state. 
Jul 27 12:31:34 ip-10-31-14-141.us-east-1.aws.redhat.com systemd[1]: Started libpod-dbbab1074644820422d30320b7d51b465feadc09461c0f66b072f67d201c5833.scope - libcrun container. ░░ Subject: A start job for unit libpod-dbbab1074644820422d30320b7d51b465feadc09461c0f66b072f67d201c5833.scope has finished successfully ░░ Defined-By: systemd ░░ Support: https://access.redhat.com/support ░░ ░░ A start job for unit libpod-dbbab1074644820422d30320b7d51b465feadc09461c0f66b072f67d201c5833.scope has finished successfully. ░░ ░░ The job identifier is 2500. Jul 27 12:31:34 ip-10-31-14-141.us-east-1.aws.redhat.com conmon[26480]: conmon dbbab1074644820422d3 : container PID: 26482 Jul 27 12:31:34 ip-10-31-14-141.us-east-1.aws.redhat.com podman[26424]: 2024-07-27 12:31:34.779301625 -0400 EDT m=+0.390365067 container init dbbab1074644820422d30320b7d51b465feadc09461c0f66b072f67d201c5833 (image=localhost/podman-pause:5.1.2-1720656000, name=ab3f47105f53-infra, pod_id=ab3f47105f53d1730873f106426ed61a89d93228bfacb6ee670475b2d79fdeba, io.buildah.version=1.36.0) Jul 27 12:31:34 ip-10-31-14-141.us-east-1.aws.redhat.com podman[26424]: 2024-07-27 12:31:34.783331928 -0400 EDT m=+0.394395361 container start dbbab1074644820422d30320b7d51b465feadc09461c0f66b072f67d201c5833 (image=localhost/podman-pause:5.1.2-1720656000, name=ab3f47105f53-infra, pod_id=ab3f47105f53d1730873f106426ed61a89d93228bfacb6ee670475b2d79fdeba, io.buildah.version=1.36.0) Jul 27 12:31:34 ip-10-31-14-141.us-east-1.aws.redhat.com systemd[1]: Started libpod-conmon-93532dadc1783be2948aef80b74ba1a5371faf20d5d5253e0eb8cacae7fd37bf.scope. ░░ Subject: A start job for unit libpod-conmon-93532dadc1783be2948aef80b74ba1a5371faf20d5d5253e0eb8cacae7fd37bf.scope has finished successfully ░░ Defined-By: systemd ░░ Support: https://access.redhat.com/support ░░ ░░ A start job for unit libpod-conmon-93532dadc1783be2948aef80b74ba1a5371faf20d5d5253e0eb8cacae7fd37bf.scope has finished successfully. ░░ ░░ The job identifier is 2507. 
Jul 27 12:31:34 ip-10-31-14-141.us-east-1.aws.redhat.com conmon[26486]: conmon 93532dadc1783be2948a : addr{sun_family=AF_UNIX, sun_path=/proc/self/fd/11/attach} Jul 27 12:31:34 ip-10-31-14-141.us-east-1.aws.redhat.com conmon[26486]: conmon 93532dadc1783be2948a : terminal_ctrl_fd: 11 Jul 27 12:31:34 ip-10-31-14-141.us-east-1.aws.redhat.com conmon[26486]: conmon 93532dadc1783be2948a : winsz read side: 15, winsz write side: 16 Jul 27 12:31:34 ip-10-31-14-141.us-east-1.aws.redhat.com systemd[1]: Started libpod-93532dadc1783be2948aef80b74ba1a5371faf20d5d5253e0eb8cacae7fd37bf.scope - libcrun container. ░░ Subject: A start job for unit libpod-93532dadc1783be2948aef80b74ba1a5371faf20d5d5253e0eb8cacae7fd37bf.scope has finished successfully ░░ Defined-By: systemd ░░ Support: https://access.redhat.com/support ░░ ░░ A start job for unit libpod-93532dadc1783be2948aef80b74ba1a5371faf20d5d5253e0eb8cacae7fd37bf.scope has finished successfully. ░░ ░░ The job identifier is 2514. Jul 27 12:31:34 ip-10-31-14-141.us-east-1.aws.redhat.com conmon[26486]: conmon 93532dadc1783be2948a : container PID: 26488 Jul 27 12:31:34 ip-10-31-14-141.us-east-1.aws.redhat.com podman[26424]: 2024-07-27 12:31:34.84980643 -0400 EDT m=+0.460869953 container init 93532dadc1783be2948aef80b74ba1a5371faf20d5d5253e0eb8cacae7fd37bf (image=quay.io/libpod/testimage:20210610, name=httpd2-httpd2, pod_id=ab3f47105f53d1730873f106426ed61a89d93228bfacb6ee670475b2d79fdeba, io.buildah.version=1.21.0, io.containers.autoupdate=registry, app=test, created_at=2021-06-10T18:55:36Z, created_by=test/system/build-testimage) Jul 27 12:31:34 ip-10-31-14-141.us-east-1.aws.redhat.com podman[26424]: 2024-07-27 12:31:34.853382187 -0400 EDT m=+0.464445620 container start 93532dadc1783be2948aef80b74ba1a5371faf20d5d5253e0eb8cacae7fd37bf (image=quay.io/libpod/testimage:20210610, name=httpd2-httpd2, pod_id=ab3f47105f53d1730873f106426ed61a89d93228bfacb6ee670475b2d79fdeba, app=test, created_at=2021-06-10T18:55:36Z, 
created_by=test/system/build-testimage, io.buildah.version=1.21.0, io.containers.autoupdate=registry) Jul 27 12:31:34 ip-10-31-14-141.us-east-1.aws.redhat.com podman[26424]: 2024-07-27 12:31:34.859125838 -0400 EDT m=+0.470188983 pod start ab3f47105f53d1730873f106426ed61a89d93228bfacb6ee670475b2d79fdeba (image=, name=httpd2) Jul 27 12:31:34 ip-10-31-14-141.us-east-1.aws.redhat.com python3.12[26416]: ansible-containers.podman.podman_play PODMAN-PLAY-KUBE command: /usr/bin/podman play kube --start=true --log-level=debug /etc/containers/ansible-kubernetes.d/httpd2.yml Jul 27 12:31:34 ip-10-31-14-141.us-east-1.aws.redhat.com python3.12[26416]: ansible-containers.podman.podman_play PODMAN-PLAY-KUBE stdout: Pod: ab3f47105f53d1730873f106426ed61a89d93228bfacb6ee670475b2d79fdeba Container: 93532dadc1783be2948aef80b74ba1a5371faf20d5d5253e0eb8cacae7fd37bf Jul 27 12:31:34 ip-10-31-14-141.us-east-1.aws.redhat.com python3.12[26416]: ansible-containers.podman.podman_play PODMAN-PLAY-KUBE stderr: time="2024-07-27T12:31:34-04:00" level=info msg="/usr/bin/podman filtering at log level debug" time="2024-07-27T12:31:34-04:00" level=debug msg="Called kube.PersistentPreRunE(/usr/bin/podman play kube --start=true --log-level=debug /etc/containers/ansible-kubernetes.d/httpd2.yml)" time="2024-07-27T12:31:34-04:00" level=debug msg="Using conmon: \"/usr/bin/conmon\"" time="2024-07-27T12:31:34-04:00" level=info msg="Using sqlite as database backend" time="2024-07-27T12:31:34-04:00" level=debug msg="Using graph driver overlay" time="2024-07-27T12:31:34-04:00" level=debug msg="Using graph root /var/lib/containers/storage" time="2024-07-27T12:31:34-04:00" level=debug msg="Using run root /run/containers/storage" time="2024-07-27T12:31:34-04:00" level=debug msg="Using static dir /var/lib/containers/storage/libpod" time="2024-07-27T12:31:34-04:00" level=debug msg="Using tmp dir /run/libpod" time="2024-07-27T12:31:34-04:00" level=debug msg="Using volume path /var/lib/containers/storage/volumes" 
time="2024-07-27T12:31:34-04:00" level=debug msg="Using transient store: false" time="2024-07-27T12:31:34-04:00" level=debug msg="[graphdriver] trying provided driver \"overlay\"" time="2024-07-27T12:31:34-04:00" level=debug msg="Cached value indicated that overlay is supported" time="2024-07-27T12:31:34-04:00" level=debug msg="Cached value indicated that overlay is supported" time="2024-07-27T12:31:34-04:00" level=debug msg="Cached value indicated that metacopy is being used" time="2024-07-27T12:31:34-04:00" level=debug msg="Cached value indicated that native-diff is not being used" time="2024-07-27T12:31:34-04:00" level=info msg="Not using native diff for overlay, this may cause degraded performance for building images: kernel has CONFIG_OVERLAY_FS_REDIRECT_DIR enabled" time="2024-07-27T12:31:34-04:00" level=debug msg="backingFs=xfs, projectQuotaSupported=false, useNativeDiff=false, usingMetacopy=true" time="2024-07-27T12:31:34-04:00" level=debug msg="Initializing event backend journald" time="2024-07-27T12:31:34-04:00" level=debug msg="Configured OCI runtime runsc initialization failed: no valid executable found for OCI runtime runsc: invalid argument" time="2024-07-27T12:31:34-04:00" level=debug msg="Configured OCI runtime ocijail initialization failed: no valid executable found for OCI runtime ocijail: invalid argument" time="2024-07-27T12:31:34-04:00" level=debug msg="Configured OCI runtime crun-vm initialization failed: no valid executable found for OCI runtime crun-vm: invalid argument" time="2024-07-27T12:31:34-04:00" level=debug msg="Configured OCI runtime runc initialization failed: no valid executable found for OCI runtime runc: invalid argument" time="2024-07-27T12:31:34-04:00" level=debug msg="Configured OCI runtime runj initialization failed: no valid executable found for OCI runtime runj: invalid argument" time="2024-07-27T12:31:34-04:00" level=debug msg="Configured OCI runtime kata initialization failed: no valid executable found for OCI runtime 
kata: invalid argument" time="2024-07-27T12:31:34-04:00" level=debug msg="Configured OCI runtime crun-wasm initialization failed: no valid executable found for OCI runtime crun-wasm: invalid argument" time="2024-07-27T12:31:34-04:00" level=debug msg="Configured OCI runtime youki initialization failed: no valid executable found for OCI runtime youki: invalid argument" time="2024-07-27T12:31:34-04:00" level=debug msg="Configured OCI runtime krun initialization failed: no valid executable found for OCI runtime krun: invalid argument" time="2024-07-27T12:31:34-04:00" level=debug msg="Using OCI runtime \"/usr/bin/crun\"" time="2024-07-27T12:31:34-04:00" level=info msg="Setting parallel job count to 7" time="2024-07-27T12:31:34-04:00" level=debug msg="Successfully loaded network podman-default-kube-network: &{podman-default-kube-network 8f5134bcda5f516a62e074f65b89e1dc156f1d967eb9102e712a946d948ca3ab bridge podman1 2024-07-27 12:29:35.997811221 -0400 EDT [{{{10.89.0.0 ffffff00}} 10.89.0.1 }] [] false false true [] map[] map[] map[driver:host-local]}" time="2024-07-27T12:31:34-04:00" level=debug msg="Successfully loaded 2 networks" time="2024-07-27T12:31:34-04:00" level=debug msg="Looking up image \"localhost/podman-pause:5.1.2-1720656000\" in local containers storage" time="2024-07-27T12:31:34-04:00" level=debug msg="Normalized platform linux/amd64 to {amd64 linux [] }" time="2024-07-27T12:31:34-04:00" level=debug msg="Trying \"localhost/podman-pause:5.1.2-1720656000\" ..." 
time="2024-07-27T12:31:34-04:00" level=debug msg="parsed reference into \"[overlay@/var/lib/containers/storage+/run/containers/storage:overlay.mountopt=nodev,metacopy=on]@dd68369a4c98aa462789d69687e5025b82f278ebe66afc63c07c145727b83b12\"" time="2024-07-27T12:31:34-04:00" level=debug msg="Found image \"localhost/podman-pause:5.1.2-1720656000\" as \"localhost/podman-pause:5.1.2-1720656000\" in local containers storage" time="2024-07-27T12:31:34-04:00" level=debug msg="Found image \"localhost/podman-pause:5.1.2-1720656000\" as \"localhost/podman-pause:5.1.2-1720656000\" in local containers storage ([overlay@/var/lib/containers/storage+/run/containers/storage:overlay.mountopt=nodev,metacopy=on]@dd68369a4c98aa462789d69687e5025b82f278ebe66afc63c07c145727b83b12)" time="2024-07-27T12:31:34-04:00" level=debug msg="exporting opaque data as blob \"sha256:dd68369a4c98aa462789d69687e5025b82f278ebe66afc63c07c145727b83b12\"" time="2024-07-27T12:31:34-04:00" level=debug msg="Pod using bridge network mode" time="2024-07-27T12:31:34-04:00" level=debug msg="Created cgroup path machine.slice/machine-libpod_pod_ab3f47105f53d1730873f106426ed61a89d93228bfacb6ee670475b2d79fdeba.slice for parent machine.slice and name libpod_pod_ab3f47105f53d1730873f106426ed61a89d93228bfacb6ee670475b2d79fdeba" time="2024-07-27T12:31:34-04:00" level=debug msg="Created cgroup machine.slice/machine-libpod_pod_ab3f47105f53d1730873f106426ed61a89d93228bfacb6ee670475b2d79fdeba.slice" time="2024-07-27T12:31:34-04:00" level=debug msg="Got pod cgroup as machine.slice/machine-libpod_pod_ab3f47105f53d1730873f106426ed61a89d93228bfacb6ee670475b2d79fdeba.slice" time="2024-07-27T12:31:34-04:00" level=debug msg="Looking up image \"localhost/podman-pause:5.1.2-1720656000\" in local containers storage" time="2024-07-27T12:31:34-04:00" level=debug msg="Normalized platform linux/amd64 to {amd64 linux [] }" time="2024-07-27T12:31:34-04:00" level=debug msg="Trying \"localhost/podman-pause:5.1.2-1720656000\" ..." 
time="2024-07-27T12:31:34-04:00" level=debug msg="parsed reference into \"[overlay@/var/lib/containers/storage+/run/containers/storage:overlay.mountopt=nodev,metacopy=on]@dd68369a4c98aa462789d69687e5025b82f278ebe66afc63c07c145727b83b12\"" time="2024-07-27T12:31:34-04:00" level=debug msg="Found image \"localhost/podman-pause:5.1.2-1720656000\" as \"localhost/podman-pause:5.1.2-1720656000\" in local containers storage" time="2024-07-27T12:31:34-04:00" level=debug msg="Found image \"localhost/podman-pause:5.1.2-1720656000\" as \"localhost/podman-pause:5.1.2-1720656000\" in local containers storage ([overlay@/var/lib/containers/storage+/run/containers/storage:overlay.mountopt=nodev,metacopy=on]@dd68369a4c98aa462789d69687e5025b82f278ebe66afc63c07c145727b83b12)" time="2024-07-27T12:31:34-04:00" level=debug msg="exporting opaque data as blob \"sha256:dd68369a4c98aa462789d69687e5025b82f278ebe66afc63c07c145727b83b12\"" time="2024-07-27T12:31:34-04:00" level=debug msg="Inspecting image dd68369a4c98aa462789d69687e5025b82f278ebe66afc63c07c145727b83b12" time="2024-07-27T12:31:34-04:00" level=debug msg="exporting opaque data as blob \"sha256:dd68369a4c98aa462789d69687e5025b82f278ebe66afc63c07c145727b83b12\"" time="2024-07-27T12:31:34-04:00" level=debug msg="Inspecting image dd68369a4c98aa462789d69687e5025b82f278ebe66afc63c07c145727b83b12" time="2024-07-27T12:31:34-04:00" level=debug msg="Inspecting image dd68369a4c98aa462789d69687e5025b82f278ebe66afc63c07c145727b83b12" time="2024-07-27T12:31:34-04:00" level=debug msg="using systemd mode: false" time="2024-07-27T12:31:34-04:00" level=debug msg="setting container name ab3f47105f53-infra" time="2024-07-27T12:31:34-04:00" level=debug msg="Loading seccomp profile from \"/usr/share/containers/seccomp.json\"" time="2024-07-27T12:31:34-04:00" level=debug msg="Allocated lock 1 for container dbbab1074644820422d30320b7d51b465feadc09461c0f66b072f67d201c5833" time="2024-07-27T12:31:34-04:00" level=debug msg="exporting opaque data as blob 
\"sha256:dd68369a4c98aa462789d69687e5025b82f278ebe66afc63c07c145727b83b12\"" time="2024-07-27T12:31:34-04:00" level=debug msg="Cached value indicated that idmapped mounts for overlay are supported" time="2024-07-27T12:31:34-04:00" level=debug msg="Created container \"dbbab1074644820422d30320b7d51b465feadc09461c0f66b072f67d201c5833\"" time="2024-07-27T12:31:34-04:00" level=debug msg="Container \"dbbab1074644820422d30320b7d51b465feadc09461c0f66b072f67d201c5833\" has work directory \"/var/lib/containers/storage/overlay-containers/dbbab1074644820422d30320b7d51b465feadc09461c0f66b072f67d201c5833/userdata\"" time="2024-07-27T12:31:34-04:00" level=debug msg="Container \"dbbab1074644820422d30320b7d51b465feadc09461c0f66b072f67d201c5833\" has run directory \"/run/containers/storage/overlay-containers/dbbab1074644820422d30320b7d51b465feadc09461c0f66b072f67d201c5833/userdata\"" time="2024-07-27T12:31:34-04:00" level=debug msg="Looking up image \"quay.io/libpod/testimage:20210610\" in local containers storage" time="2024-07-27T12:31:34-04:00" level=debug msg="Normalized platform linux/amd64 to {amd64 linux [] }" time="2024-07-27T12:31:34-04:00" level=debug msg="Trying \"quay.io/libpod/testimage:20210610\" ..." 
time="2024-07-27T12:31:34-04:00" level=debug msg="parsed reference into \"[overlay@/var/lib/containers/storage+/run/containers/storage:overlay.mountopt=nodev,metacopy=on]@9f9ec7f2fdef9168f74e9d057f307955db14d782cff22ded51d277d74798cb2f\"" time="2024-07-27T12:31:34-04:00" level=debug msg="Found image \"quay.io/libpod/testimage:20210610\" as \"quay.io/libpod/testimage:20210610\" in local containers storage" time="2024-07-27T12:31:34-04:00" level=debug msg="Found image \"quay.io/libpod/testimage:20210610\" as \"quay.io/libpod/testimage:20210610\" in local containers storage ([overlay@/var/lib/containers/storage+/run/containers/storage:overlay.mountopt=nodev,metacopy=on]@9f9ec7f2fdef9168f74e9d057f307955db14d782cff22ded51d277d74798cb2f)" time="2024-07-27T12:31:34-04:00" level=debug msg="exporting opaque data as blob \"sha256:9f9ec7f2fdef9168f74e9d057f307955db14d782cff22ded51d277d74798cb2f\"" time="2024-07-27T12:31:34-04:00" level=debug msg="Pulling image quay.io/libpod/testimage:20210610 (policy: missing)" time="2024-07-27T12:31:34-04:00" level=debug msg="Looking up image \"quay.io/libpod/testimage:20210610\" in local containers storage" time="2024-07-27T12:31:34-04:00" level=debug msg="Normalized platform linux/amd64 to {amd64 linux [] }" time="2024-07-27T12:31:34-04:00" level=debug msg="Trying \"quay.io/libpod/testimage:20210610\" ..." 
time="2024-07-27T12:31:34-04:00" level=debug msg="parsed reference into \"[overlay@/var/lib/containers/storage+/run/containers/storage:overlay.mountopt=nodev,metacopy=on]@9f9ec7f2fdef9168f74e9d057f307955db14d782cff22ded51d277d74798cb2f\"" time="2024-07-27T12:31:34-04:00" level=debug msg="Found image \"quay.io/libpod/testimage:20210610\" as \"quay.io/libpod/testimage:20210610\" in local containers storage" time="2024-07-27T12:31:34-04:00" level=debug msg="Found image \"quay.io/libpod/testimage:20210610\" as \"quay.io/libpod/testimage:20210610\" in local containers storage ([overlay@/var/lib/containers/storage+/run/containers/storage:overlay.mountopt=nodev,metacopy=on]@9f9ec7f2fdef9168f74e9d057f307955db14d782cff22ded51d277d74798cb2f)" time="2024-07-27T12:31:34-04:00" level=debug msg="exporting opaque data as blob \"sha256:9f9ec7f2fdef9168f74e9d057f307955db14d782cff22ded51d277d74798cb2f\"" time="2024-07-27T12:31:34-04:00" level=debug msg="Looking up image \"quay.io/libpod/testimage:20210610\" in local containers storage" time="2024-07-27T12:31:34-04:00" level=debug msg="Normalized platform linux/amd64 to {amd64 linux [] }" time="2024-07-27T12:31:34-04:00" level=debug msg="Trying \"quay.io/libpod/testimage:20210610\" ..." 
time="2024-07-27T12:31:34-04:00" level=debug msg="parsed reference into \"[overlay@/var/lib/containers/storage+/run/containers/storage:overlay.mountopt=nodev,metacopy=on]@9f9ec7f2fdef9168f74e9d057f307955db14d782cff22ded51d277d74798cb2f\"" time="2024-07-27T12:31:34-04:00" level=debug msg="Found image \"quay.io/libpod/testimage:20210610\" as \"quay.io/libpod/testimage:20210610\" in local containers storage" time="2024-07-27T12:31:34-04:00" level=debug msg="Found image \"quay.io/libpod/testimage:20210610\" as \"quay.io/libpod/testimage:20210610\" in local containers storage ([overlay@/var/lib/containers/storage+/run/containers/storage:overlay.mountopt=nodev,metacopy=on]@9f9ec7f2fdef9168f74e9d057f307955db14d782cff22ded51d277d74798cb2f)" time="2024-07-27T12:31:34-04:00" level=debug msg="exporting opaque data as blob \"sha256:9f9ec7f2fdef9168f74e9d057f307955db14d782cff22ded51d277d74798cb2f\"" time="2024-07-27T12:31:34-04:00" level=debug msg="Inspecting image 9f9ec7f2fdef9168f74e9d057f307955db14d782cff22ded51d277d74798cb2f" time="2024-07-27T12:31:34-04:00" level=debug msg="exporting opaque data as blob \"sha256:9f9ec7f2fdef9168f74e9d057f307955db14d782cff22ded51d277d74798cb2f\"" time="2024-07-27T12:31:34-04:00" level=debug msg="exporting opaque data as blob \"sha256:9f9ec7f2fdef9168f74e9d057f307955db14d782cff22ded51d277d74798cb2f\"" time="2024-07-27T12:31:34-04:00" level=debug msg="Looking up image \"quay.io/libpod/testimage:20210610\" in local containers storage" time="2024-07-27T12:31:34-04:00" level=debug msg="Normalized platform linux/amd64 to {amd64 linux [] }" time="2024-07-27T12:31:34-04:00" level=debug msg="Trying \"quay.io/libpod/testimage:20210610\" ..." 
time="2024-07-27T12:31:34-04:00" level=debug msg="parsed reference into \"[overlay@/var/lib/containers/storage+/run/containers/storage:overlay.mountopt=nodev,metacopy=on]@9f9ec7f2fdef9168f74e9d057f307955db14d782cff22ded51d277d74798cb2f\"" time="2024-07-27T12:31:34-04:00" level=debug msg="Found image \"quay.io/libpod/testimage:20210610\" as \"quay.io/libpod/testimage:20210610\" in local containers storage" time="2024-07-27T12:31:34-04:00" level=debug msg="Found image \"quay.io/libpod/testimage:20210610\" as \"quay.io/libpod/testimage:20210610\" in local containers storage ([overlay@/var/lib/containers/storage+/run/containers/storage:overlay.mountopt=nodev,metacopy=on]@9f9ec7f2fdef9168f74e9d057f307955db14d782cff22ded51d277d74798cb2f)" time="2024-07-27T12:31:34-04:00" level=debug msg="exporting opaque data as blob \"sha256:9f9ec7f2fdef9168f74e9d057f307955db14d782cff22ded51d277d74798cb2f\"" time="2024-07-27T12:31:34-04:00" level=debug msg="Inspecting image 9f9ec7f2fdef9168f74e9d057f307955db14d782cff22ded51d277d74798cb2f" time="2024-07-27T12:31:34-04:00" level=debug msg="exporting opaque data as blob \"sha256:9f9ec7f2fdef9168f74e9d057f307955db14d782cff22ded51d277d74798cb2f\"" time="2024-07-27T12:31:34-04:00" level=debug msg="exporting opaque data as blob \"sha256:9f9ec7f2fdef9168f74e9d057f307955db14d782cff22ded51d277d74798cb2f\"" time="2024-07-27T12:31:34-04:00" level=debug msg="Inspecting image 9f9ec7f2fdef9168f74e9d057f307955db14d782cff22ded51d277d74798cb2f" time="2024-07-27T12:31:34-04:00" level=debug msg="Inspecting image 9f9ec7f2fdef9168f74e9d057f307955db14d782cff22ded51d277d74798cb2f" time="2024-07-27T12:31:34-04:00" level=debug msg="using systemd mode: false" time="2024-07-27T12:31:34-04:00" level=debug msg="adding container to pod httpd2" time="2024-07-27T12:31:34-04:00" level=debug msg="setting container name httpd2-httpd2" time="2024-07-27T12:31:34-04:00" level=debug msg="Loading seccomp profile from \"/usr/share/containers/seccomp.json\"" 
time="2024-07-27T12:31:34-04:00" level=info msg="Sysctl net.ipv4.ping_group_range=0 0 ignored in containers.conf, since Network Namespace set to host" time="2024-07-27T12:31:34-04:00" level=debug msg="Adding mount /proc" time="2024-07-27T12:31:34-04:00" level=debug msg="Adding mount /dev" time="2024-07-27T12:31:34-04:00" level=debug msg="Adding mount /dev/pts" time="2024-07-27T12:31:34-04:00" level=debug msg="Adding mount /dev/mqueue" time="2024-07-27T12:31:34-04:00" level=debug msg="Adding mount /sys" time="2024-07-27T12:31:34-04:00" level=debug msg="Adding mount /sys/fs/cgroup" time="2024-07-27T12:31:34-04:00" level=debug msg="Allocated lock 2 for container 93532dadc1783be2948aef80b74ba1a5371faf20d5d5253e0eb8cacae7fd37bf" time="2024-07-27T12:31:34-04:00" level=debug msg="exporting opaque data as blob \"sha256:9f9ec7f2fdef9168f74e9d057f307955db14d782cff22ded51d277d74798cb2f\"" time="2024-07-27T12:31:34-04:00" level=debug msg="Created container \"93532dadc1783be2948aef80b74ba1a5371faf20d5d5253e0eb8cacae7fd37bf\"" time="2024-07-27T12:31:34-04:00" level=debug msg="Container \"93532dadc1783be2948aef80b74ba1a5371faf20d5d5253e0eb8cacae7fd37bf\" has work directory \"/var/lib/containers/storage/overlay-containers/93532dadc1783be2948aef80b74ba1a5371faf20d5d5253e0eb8cacae7fd37bf/userdata\"" time="2024-07-27T12:31:34-04:00" level=debug msg="Container \"93532dadc1783be2948aef80b74ba1a5371faf20d5d5253e0eb8cacae7fd37bf\" has run directory \"/run/containers/storage/overlay-containers/93532dadc1783be2948aef80b74ba1a5371faf20d5d5253e0eb8cacae7fd37bf/userdata\"" time="2024-07-27T12:31:34-04:00" level=debug msg="Strongconnecting node dbbab1074644820422d30320b7d51b465feadc09461c0f66b072f67d201c5833" time="2024-07-27T12:31:34-04:00" level=debug msg="Pushed dbbab1074644820422d30320b7d51b465feadc09461c0f66b072f67d201c5833 onto stack" time="2024-07-27T12:31:34-04:00" level=debug msg="Finishing node dbbab1074644820422d30320b7d51b465feadc09461c0f66b072f67d201c5833. 
Popped dbbab1074644820422d30320b7d51b465feadc09461c0f66b072f67d201c5833 off stack" time="2024-07-27T12:31:34-04:00" level=debug msg="Strongconnecting node 93532dadc1783be2948aef80b74ba1a5371faf20d5d5253e0eb8cacae7fd37bf" time="2024-07-27T12:31:34-04:00" level=debug msg="Pushed 93532dadc1783be2948aef80b74ba1a5371faf20d5d5253e0eb8cacae7fd37bf onto stack" time="2024-07-27T12:31:34-04:00" level=debug msg="Finishing node 93532dadc1783be2948aef80b74ba1a5371faf20d5d5253e0eb8cacae7fd37bf. Popped 93532dadc1783be2948aef80b74ba1a5371faf20d5d5253e0eb8cacae7fd37bf off stack" time="2024-07-27T12:31:34-04:00" level=debug msg="overlay: mount_data=lowerdir=/var/lib/containers/storage/overlay/l/XHP5KYIFQTOBEZIWH55P4XSJMS,upperdir=/var/lib/containers/storage/overlay/41f4fbe1d2e8d3fc1965435eec797ddd0ee4533f30fb8fafae206940b3c6c4b1/diff,workdir=/var/lib/containers/storage/overlay/41f4fbe1d2e8d3fc1965435eec797ddd0ee4533f30fb8fafae206940b3c6c4b1/work,nodev,metacopy=on,context=\"system_u:object_r:container_file_t:s0:c508,c881\"" time="2024-07-27T12:31:34-04:00" level=debug msg="Mounted container \"dbbab1074644820422d30320b7d51b465feadc09461c0f66b072f67d201c5833\" at \"/var/lib/containers/storage/overlay/41f4fbe1d2e8d3fc1965435eec797ddd0ee4533f30fb8fafae206940b3c6c4b1/merged\"" time="2024-07-27T12:31:34-04:00" level=debug msg="Created root filesystem for container dbbab1074644820422d30320b7d51b465feadc09461c0f66b072f67d201c5833 at /var/lib/containers/storage/overlay/41f4fbe1d2e8d3fc1965435eec797ddd0ee4533f30fb8fafae206940b3c6c4b1/merged" time="2024-07-27T12:31:34-04:00" level=debug msg="Made network namespace at /run/netns/netns-ad67feaf-bee7-774e-b54b-8987f6ef7689 for container dbbab1074644820422d30320b7d51b465feadc09461c0f66b072f67d201c5833" [DEBUG netavark::network::validation] "Validating network namespace..." [DEBUG netavark::commands::setup] "Setting up..." 
[INFO netavark::firewall] Using nftables firewall driver [DEBUG netavark::network::bridge] Setup network podman-default-kube-network [DEBUG netavark::network::bridge] Container interface name: eth0 with IP addresses [10.89.0.2/24] [DEBUG netavark::network::bridge] Bridge name: podman1 with IP addresses [10.89.0.1/24] [DEBUG netavark::network::core_utils] Setting sysctl value for net.ipv4.ip_forward to 1 [DEBUG netavark::network::core_utils] Setting sysctl value for /proc/sys/net/ipv4/conf/podman1/rp_filter to 2 [DEBUG netavark::network::core_utils] Setting sysctl value for /proc/sys/net/ipv6/conf/eth0/autoconf to 0 [DEBUG netavark::network::core_utils] Setting sysctl value for /proc/sys/net/ipv4/conf/eth0/arp_notify to 1 [DEBUG netavark::network::core_utils] Setting sysctl value for /proc/sys/net/ipv4/conf/eth0/rp_filter to 2 [INFO netavark::network::netlink] Adding route (dest: 0.0.0.0/0 ,gw: 10.89.0.1, metric 100) [DEBUG netavark::firewall::firewalld] Adding firewalld rules for network 10.89.0.0/24 [DEBUG netavark::firewall::firewalld] Adding subnet 10.89.0.0/24 to zone trusted as source [INFO netavark::firewall::nft] Creating container chain nv_8f5134bc_10_89_0_0_nm24 [DEBUG netavark::network::core_utils] Setting sysctl value for net.ipv4.conf.podman1.route_localnet to 1 [DEBUG netavark::dns::aardvark] Spawning aardvark server [DEBUG netavark::dns::aardvark] start aardvark-dns: ["systemd-run", "-q", "--scope", "/usr/libexec/podman/aardvark-dns", "--config", "/run/containers/networks/aardvark-dns", "-p", "53", "run"] [DEBUG netavark::commands::setup] { "podman-default-kube-network": StatusBlock { dns_search_domains: Some( [ "dns.podman", ], ), dns_server_ips: Some( [ 10.89.0.1, ], ), interfaces: Some( { "eth0": NetInterface { mac_address: "4e:2d:a7:87:2c:f2", subnets: Some( [ NetAddress { gateway: Some( 10.89.0.1, ), ipnet: 10.89.0.2/24, }, ], ), }, }, ), }, } [DEBUG netavark::commands::setup] "Setup complete" time="2024-07-27T12:31:34-04:00" level=debug 
msg="/etc/system-fips does not exist on host, not mounting FIPS mode subscription" time="2024-07-27T12:31:34-04:00" level=debug msg="Setting Cgroups for container dbbab1074644820422d30320b7d51b465feadc09461c0f66b072f67d201c5833 to machine-libpod_pod_ab3f47105f53d1730873f106426ed61a89d93228bfacb6ee670475b2d79fdeba.slice:libpod:dbbab1074644820422d30320b7d51b465feadc09461c0f66b072f67d201c5833" time="2024-07-27T12:31:34-04:00" level=debug msg="reading hooks from /usr/share/containers/oci/hooks.d" time="2024-07-27T12:31:34-04:00" level=debug msg="Workdir \"/\" resolved to host path \"/var/lib/containers/storage/overlay/41f4fbe1d2e8d3fc1965435eec797ddd0ee4533f30fb8fafae206940b3c6c4b1/merged\"" time="2024-07-27T12:31:34-04:00" level=debug msg="Created OCI spec for container dbbab1074644820422d30320b7d51b465feadc09461c0f66b072f67d201c5833 at /var/lib/containers/storage/overlay-containers/dbbab1074644820422d30320b7d51b465feadc09461c0f66b072f67d201c5833/userdata/config.json" time="2024-07-27T12:31:34-04:00" level=debug msg="Created cgroup path machine.slice/machine-libpod_pod_ab3f47105f53d1730873f106426ed61a89d93228bfacb6ee670475b2d79fdeba.slice for parent machine.slice and name libpod_pod_ab3f47105f53d1730873f106426ed61a89d93228bfacb6ee670475b2d79fdeba" time="2024-07-27T12:31:34-04:00" level=debug msg="Created cgroup machine.slice/machine-libpod_pod_ab3f47105f53d1730873f106426ed61a89d93228bfacb6ee670475b2d79fdeba.slice" time="2024-07-27T12:31:34-04:00" level=debug msg="Got pod cgroup as machine.slice/machine-libpod_pod_ab3f47105f53d1730873f106426ed61a89d93228bfacb6ee670475b2d79fdeba.slice" time="2024-07-27T12:31:34-04:00" level=debug msg="/usr/bin/conmon messages will be logged to syslog" time="2024-07-27T12:31:34-04:00" level=debug msg="running conmon: /usr/bin/conmon" args="[--api-version 1 -c dbbab1074644820422d30320b7d51b465feadc09461c0f66b072f67d201c5833 -u dbbab1074644820422d30320b7d51b465feadc09461c0f66b072f67d201c5833 -r /usr/bin/crun -b 
/var/lib/containers/storage/overlay-containers/dbbab1074644820422d30320b7d51b465feadc09461c0f66b072f67d201c5833/userdata -p /run/containers/storage/overlay-containers/dbbab1074644820422d30320b7d51b465feadc09461c0f66b072f67d201c5833/userdata/pidfile -n ab3f47105f53-infra --exit-dir /run/libpod/exits --persist-dir /run/libpod/persist/dbbab1074644820422d30320b7d51b465feadc09461c0f66b072f67d201c5833 --full-attach -s -l journald --log-level debug --syslog --conmon-pidfile /run/containers/storage/overlay-containers/dbbab1074644820422d30320b7d51b465feadc09461c0f66b072f67d201c5833/userdata/conmon.pid --exit-command /usr/bin/podman --exit-command-arg --root --exit-command-arg /var/lib/containers/storage --exit-command-arg --runroot --exit-command-arg /run/containers/storage --exit-command-arg --log-level --exit-command-arg debug --exit-command-arg --cgroup-manager --exit-command-arg systemd --exit-command-arg --tmpdir --exit-command-arg /run/libpod --exit-command-arg --network-config-dir --exit-command-arg --exit-command-arg --network-backend --exit-command-arg netavark --exit-command-arg --volumepath --exit-command-arg /var/lib/containers/storage/volumes --exit-command-arg --db-backend --exit-command-arg sqlite --exit-command-arg --transient-store=false --exit-command-arg --runtime --exit-command-arg crun --exit-command-arg --storage-driver --exit-command-arg overlay --exit-command-arg --storage-opt --exit-command-arg overlay.mountopt=nodev,metacopy=on --exit-command-arg --events-backend --exit-command-arg journald --exit-command-arg --syslog --exit-command-arg container --exit-command-arg cleanup --exit-command-arg dbbab1074644820422d30320b7d51b465feadc09461c0f66b072f67d201c5833]" time="2024-07-27T12:31:34-04:00" level=info msg="Running conmon under slice machine-libpod_pod_ab3f47105f53d1730873f106426ed61a89d93228bfacb6ee670475b2d79fdeba.slice and unitName libpod-conmon-dbbab1074644820422d30320b7d51b465feadc09461c0f66b072f67d201c5833.scope" 
time="2024-07-27T12:31:34-04:00" level=debug msg="Received: 26482" time="2024-07-27T12:31:34-04:00" level=info msg="Got Conmon PID as 26480" time="2024-07-27T12:31:34-04:00" level=debug msg="Created container dbbab1074644820422d30320b7d51b465feadc09461c0f66b072f67d201c5833 in OCI runtime" time="2024-07-27T12:31:34-04:00" level=debug msg="Adding nameserver(s) from network status of '[\"10.89.0.1\"]'" time="2024-07-27T12:31:34-04:00" level=debug msg="Adding search domain(s) from network status of '[\"dns.podman\"]'" time="2024-07-27T12:31:34-04:00" level=debug msg="Starting container dbbab1074644820422d30320b7d51b465feadc09461c0f66b072f67d201c5833 with command [/catatonit -P]" time="2024-07-27T12:31:34-04:00" level=debug msg="Started container dbbab1074644820422d30320b7d51b465feadc09461c0f66b072f67d201c5833" time="2024-07-27T12:31:34-04:00" level=debug msg="overlay: mount_data=lowerdir=/var/lib/containers/storage/overlay/l/BZ2UWY2RJNF2TFCOFMW4W4NBVO,upperdir=/var/lib/containers/storage/overlay/3ca3df692edc254075fdbd4c8f1f9612973492e52c45047ed57ead77a27e28fd/diff,workdir=/var/lib/containers/storage/overlay/3ca3df692edc254075fdbd4c8f1f9612973492e52c45047ed57ead77a27e28fd/work,nodev,metacopy=on,context=\"system_u:object_r:container_file_t:s0:c508,c881\"" time="2024-07-27T12:31:34-04:00" level=debug msg="Mounted container \"93532dadc1783be2948aef80b74ba1a5371faf20d5d5253e0eb8cacae7fd37bf\" at \"/var/lib/containers/storage/overlay/3ca3df692edc254075fdbd4c8f1f9612973492e52c45047ed57ead77a27e28fd/merged\"" time="2024-07-27T12:31:34-04:00" level=debug msg="Created root filesystem for container 93532dadc1783be2948aef80b74ba1a5371faf20d5d5253e0eb8cacae7fd37bf at /var/lib/containers/storage/overlay/3ca3df692edc254075fdbd4c8f1f9612973492e52c45047ed57ead77a27e28fd/merged" time="2024-07-27T12:31:34-04:00" level=debug msg="/etc/system-fips does not exist on host, not mounting FIPS mode subscription" time="2024-07-27T12:31:34-04:00" level=debug msg="Setting Cgroups for container 
93532dadc1783be2948aef80b74ba1a5371faf20d5d5253e0eb8cacae7fd37bf to machine-libpod_pod_ab3f47105f53d1730873f106426ed61a89d93228bfacb6ee670475b2d79fdeba.slice:libpod:93532dadc1783be2948aef80b74ba1a5371faf20d5d5253e0eb8cacae7fd37bf" time="2024-07-27T12:31:34-04:00" level=debug msg="reading hooks from /usr/share/containers/oci/hooks.d" time="2024-07-27T12:31:34-04:00" level=debug msg="Workdir \"/var/www\" resolved to a volume or mount" time="2024-07-27T12:31:34-04:00" level=debug msg="Created OCI spec for container 93532dadc1783be2948aef80b74ba1a5371faf20d5d5253e0eb8cacae7fd37bf at /var/lib/containers/storage/overlay-containers/93532dadc1783be2948aef80b74ba1a5371faf20d5d5253e0eb8cacae7fd37bf/userdata/config.json" time="2024-07-27T12:31:34-04:00" level=debug msg="Created cgroup path machine.slice/machine-libpod_pod_ab3f47105f53d1730873f106426ed61a89d93228bfacb6ee670475b2d79fdeba.slice for parent machine.slice and name libpod_pod_ab3f47105f53d1730873f106426ed61a89d93228bfacb6ee670475b2d79fdeba" time="2024-07-27T12:31:34-04:00" level=debug msg="Created cgroup machine.slice/machine-libpod_pod_ab3f47105f53d1730873f106426ed61a89d93228bfacb6ee670475b2d79fdeba.slice" time="2024-07-27T12:31:34-04:00" level=debug msg="Got pod cgroup as machine.slice/machine-libpod_pod_ab3f47105f53d1730873f106426ed61a89d93228bfacb6ee670475b2d79fdeba.slice" time="2024-07-27T12:31:34-04:00" level=debug msg="/usr/bin/conmon messages will be logged to syslog" time="2024-07-27T12:31:34-04:00" level=debug msg="running conmon: /usr/bin/conmon" args="[--api-version 1 -c 93532dadc1783be2948aef80b74ba1a5371faf20d5d5253e0eb8cacae7fd37bf -u 93532dadc1783be2948aef80b74ba1a5371faf20d5d5253e0eb8cacae7fd37bf -r /usr/bin/crun -b /var/lib/containers/storage/overlay-containers/93532dadc1783be2948aef80b74ba1a5371faf20d5d5253e0eb8cacae7fd37bf/userdata -p /run/containers/storage/overlay-containers/93532dadc1783be2948aef80b74ba1a5371faf20d5d5253e0eb8cacae7fd37bf/userdata/pidfile -n httpd2-httpd2 --exit-dir 
/run/libpod/exits --persist-dir /run/libpod/persist/93532dadc1783be2948aef80b74ba1a5371faf20d5d5253e0eb8cacae7fd37bf --full-attach -s -l journald --log-level debug --syslog --conmon-pidfile /run/containers/storage/overlay-containers/93532dadc1783be2948aef80b74ba1a5371faf20d5d5253e0eb8cacae7fd37bf/userdata/conmon.pid --exit-command /usr/bin/podman --exit-command-arg --root --exit-command-arg /var/lib/containers/storage --exit-command-arg --runroot --exit-command-arg /run/containers/storage --exit-command-arg --log-level --exit-command-arg debug --exit-command-arg --cgroup-manager --exit-command-arg systemd --exit-command-arg --tmpdir --exit-command-arg /run/libpod --exit-command-arg --network-config-dir --exit-command-arg --exit-command-arg --network-backend --exit-command-arg netavark --exit-command-arg --volumepath --exit-command-arg /var/lib/containers/storage/volumes --exit-command-arg --db-backend --exit-command-arg sqlite --exit-command-arg --transient-store=false --exit-command-arg --runtime --exit-command-arg crun --exit-command-arg --storage-driver --exit-command-arg overlay --exit-command-arg --storage-opt --exit-command-arg overlay.mountopt=nodev,metacopy=on --exit-command-arg --events-backend --exit-command-arg journald --exit-command-arg --syslog --exit-command-arg container --exit-command-arg cleanup --exit-command-arg 93532dadc1783be2948aef80b74ba1a5371faf20d5d5253e0eb8cacae7fd37bf]" time="2024-07-27T12:31:34-04:00" level=info msg="Running conmon under slice machine-libpod_pod_ab3f47105f53d1730873f106426ed61a89d93228bfacb6ee670475b2d79fdeba.slice and unitName libpod-conmon-93532dadc1783be2948aef80b74ba1a5371faf20d5d5253e0eb8cacae7fd37bf.scope" time="2024-07-27T12:31:34-04:00" level=debug msg="Received: 26488" time="2024-07-27T12:31:34-04:00" level=info msg="Got Conmon PID as 26486" time="2024-07-27T12:31:34-04:00" level=debug msg="Created container 93532dadc1783be2948aef80b74ba1a5371faf20d5d5253e0eb8cacae7fd37bf in OCI runtime" 
time="2024-07-27T12:31:34-04:00" level=debug msg="Starting container 93532dadc1783be2948aef80b74ba1a5371faf20d5d5253e0eb8cacae7fd37bf with command [/bin/busybox-extras httpd -f -p 80]" time="2024-07-27T12:31:34-04:00" level=debug msg="Started container 93532dadc1783be2948aef80b74ba1a5371faf20d5d5253e0eb8cacae7fd37bf" time="2024-07-27T12:31:34-04:00" level=debug msg="Called kube.PersistentPostRunE(/usr/bin/podman play kube --start=true --log-level=debug /etc/containers/ansible-kubernetes.d/httpd2.yml)" time="2024-07-27T12:31:34-04:00" level=debug msg="Shutting down engines" Jul 27 12:31:34 ip-10-31-14-141.us-east-1.aws.redhat.com python3.12[26416]: ansible-containers.podman.podman_play PODMAN-PLAY-KUBE rc: 0 Jul 27 12:31:35 ip-10-31-14-141.us-east-1.aws.redhat.com python3.12[26602]: ansible-systemd Invoked with daemon_reload=True scope=system daemon_reexec=False no_block=False name=None state=None enabled=None force=None masked=None Jul 27 12:31:35 ip-10-31-14-141.us-east-1.aws.redhat.com systemd[1]: Reload requested from client PID 26603 ('systemctl') (unit session-5.scope)... Jul 27 12:31:35 ip-10-31-14-141.us-east-1.aws.redhat.com systemd[1]: Reloading... Jul 27 12:31:35 ip-10-31-14-141.us-east-1.aws.redhat.com systemd[1]: Reloading finished in 246 ms. Jul 27 12:31:36 ip-10-31-14-141.us-east-1.aws.redhat.com python3.12[26771]: ansible-systemd Invoked with name=podman-kube@-etc-containers-ansible\x2dkubernetes.d-httpd2.yml.service scope=system enabled=True daemon_reload=False daemon_reexec=False no_block=False state=None force=None masked=None Jul 27 12:31:36 ip-10-31-14-141.us-east-1.aws.redhat.com systemd[1]: Reload requested from client PID 26774 ('systemctl') (unit session-5.scope)... Jul 27 12:31:36 ip-10-31-14-141.us-east-1.aws.redhat.com systemd[1]: Reloading... Jul 27 12:31:36 ip-10-31-14-141.us-east-1.aws.redhat.com systemd[1]: Reloading finished in 230 ms. 
Jul 27 12:31:37 ip-10-31-14-141.us-east-1.aws.redhat.com python3.12[26942]: ansible-systemd Invoked with name=podman-kube@-etc-containers-ansible\x2dkubernetes.d-httpd2.yml.service scope=system state=started daemon_reload=False daemon_reexec=False no_block=False enabled=None force=None masked=None Jul 27 12:31:37 ip-10-31-14-141.us-east-1.aws.redhat.com systemd[1]: Created slice system-podman\x2dkube.slice - Slice /system/podman-kube. ░░ Subject: A start job for unit system-podman\x2dkube.slice has finished successfully ░░ Defined-By: systemd ░░ Support: https://access.redhat.com/support ░░ ░░ A start job for unit system-podman\x2dkube.slice has finished successfully. ░░ ░░ The job identifier is 2524. Jul 27 12:31:37 ip-10-31-14-141.us-east-1.aws.redhat.com systemd[1]: Starting podman-kube@-etc-containers-ansible\x2dkubernetes.d-httpd2.yml.service - A template for running K8s workloads via podman-kube-play... ░░ Subject: A start job for unit podman-kube@-etc-containers-ansible\x2dkubernetes.d-httpd2.yml.service has begun execution ░░ Defined-By: systemd ░░ Support: https://access.redhat.com/support ░░ ░░ A start job for unit podman-kube@-etc-containers-ansible\x2dkubernetes.d-httpd2.yml.service has begun execution. ░░ ░░ The job identifier is 2521. Jul 27 12:31:37 ip-10-31-14-141.us-east-1.aws.redhat.com podman[26946]: 2024-07-27 12:31:37.349867414 -0400 EDT m=+0.033394270 pod stop ab3f47105f53d1730873f106426ed61a89d93228bfacb6ee670475b2d79fdeba (image=, name=httpd2) Jul 27 12:31:37 ip-10-31-14-141.us-east-1.aws.redhat.com systemd[1]: tmp-crun.yWbfva.mount: Deactivated successfully. ░░ Subject: Unit succeeded ░░ Defined-By: systemd ░░ Support: https://access.redhat.com/support ░░ ░░ The unit tmp-crun.yWbfva.mount has successfully entered the 'dead' state. Jul 27 12:31:37 ip-10-31-14-141.us-east-1.aws.redhat.com systemd[1]: libpod-dbbab1074644820422d30320b7d51b465feadc09461c0f66b072f67d201c5833.scope: Deactivated successfully. 
░░ Subject: Unit succeeded ░░ Defined-By: systemd ░░ Support: https://access.redhat.com/support ░░ ░░ The unit libpod-dbbab1074644820422d30320b7d51b465feadc09461c0f66b072f67d201c5833.scope has successfully entered the 'dead' state. Jul 27 12:31:37 ip-10-31-14-141.us-east-1.aws.redhat.com podman[26946]: 2024-07-27 12:31:37.38440358 -0400 EDT m=+0.067930666 container died dbbab1074644820422d30320b7d51b465feadc09461c0f66b072f67d201c5833 (image=localhost/podman-pause:5.1.2-1720656000, name=ab3f47105f53-infra, io.buildah.version=1.36.0) Jul 27 12:31:37 ip-10-31-14-141.us-east-1.aws.redhat.com aardvark-dns[26474]: Successfully parsed config Jul 27 12:31:37 ip-10-31-14-141.us-east-1.aws.redhat.com aardvark-dns[26474]: Listen v4 ip {} Jul 27 12:31:37 ip-10-31-14-141.us-east-1.aws.redhat.com aardvark-dns[26474]: Listen v6 ip {} Jul 27 12:31:37 ip-10-31-14-141.us-east-1.aws.redhat.com aardvark-dns[26474]: No configuration found stopping the sever Jul 27 12:31:37 ip-10-31-14-141.us-east-1.aws.redhat.com kernel: podman1: port 1(veth0) entered disabled state Jul 27 12:31:37 ip-10-31-14-141.us-east-1.aws.redhat.com kernel: veth0 (unregistering): left allmulticast mode Jul 27 12:31:37 ip-10-31-14-141.us-east-1.aws.redhat.com kernel: veth0 (unregistering): left promiscuous mode Jul 27 12:31:37 ip-10-31-14-141.us-east-1.aws.redhat.com kernel: podman1: port 1(veth0) entered disabled state Jul 27 12:31:37 ip-10-31-14-141.us-east-1.aws.redhat.com systemd[1]: run-r195f68d412d64ef19135845924417b7c.scope: Deactivated successfully. ░░ Subject: Unit succeeded ░░ Defined-By: systemd ░░ Support: https://access.redhat.com/support ░░ ░░ The unit run-r195f68d412d64ef19135845924417b7c.scope has successfully entered the 'dead' state. 
Jul 27 12:31:37 ip-10-31-14-141.us-east-1.aws.redhat.com /usr/bin/podman[26957]: time="2024-07-27T12:31:37-04:00" level=debug msg="Called cleanup.PersistentPreRunE(/usr/bin/podman --root /var/lib/containers/storage --runroot /run/containers/storage --log-level debug --cgroup-manager systemd --tmpdir /run/libpod --network-config-dir --network-backend netavark --volumepath /var/lib/containers/storage/volumes --db-backend sqlite --transient-store=false --runtime crun --storage-driver overlay --storage-opt overlay.mountopt=nodev,metacopy=on --events-backend journald --syslog container cleanup dbbab1074644820422d30320b7d51b465feadc09461c0f66b072f67d201c5833)" Jul 27 12:31:37 ip-10-31-14-141.us-east-1.aws.redhat.com /usr/bin/podman[26957]: time="2024-07-27T12:31:37-04:00" level=debug msg="Setting custom database backend: \"sqlite\"" Jul 27 12:31:37 ip-10-31-14-141.us-east-1.aws.redhat.com /usr/bin/podman[26957]: time="2024-07-27T12:31:37-04:00" level=debug msg="Using conmon: \"/usr/bin/conmon\"" Jul 27 12:31:37 ip-10-31-14-141.us-east-1.aws.redhat.com /usr/bin/podman[26957]: time="2024-07-27T12:31:37-04:00" level=info msg="Using sqlite as database backend" Jul 27 12:31:37 ip-10-31-14-141.us-east-1.aws.redhat.com /usr/bin/podman[26957]: time="2024-07-27T12:31:37-04:00" level=debug msg="Using graph driver overlay" Jul 27 12:31:37 ip-10-31-14-141.us-east-1.aws.redhat.com /usr/bin/podman[26957]: time="2024-07-27T12:31:37-04:00" level=debug msg="Using graph root /var/lib/containers/storage" Jul 27 12:31:37 ip-10-31-14-141.us-east-1.aws.redhat.com /usr/bin/podman[26957]: time="2024-07-27T12:31:37-04:00" level=debug msg="Using run root /run/containers/storage" Jul 27 12:31:37 ip-10-31-14-141.us-east-1.aws.redhat.com /usr/bin/podman[26957]: time="2024-07-27T12:31:37-04:00" level=debug msg="Using static dir /var/lib/containers/storage/libpod" Jul 27 12:31:37 ip-10-31-14-141.us-east-1.aws.redhat.com /usr/bin/podman[26957]: time="2024-07-27T12:31:37-04:00" level=debug msg="Using 
tmp dir /run/libpod" Jul 27 12:31:37 ip-10-31-14-141.us-east-1.aws.redhat.com /usr/bin/podman[26957]: time="2024-07-27T12:31:37-04:00" level=debug msg="Using volume path /var/lib/containers/storage/volumes" Jul 27 12:31:37 ip-10-31-14-141.us-east-1.aws.redhat.com /usr/bin/podman[26957]: time="2024-07-27T12:31:37-04:00" level=debug msg="Using transient store: false" Jul 27 12:31:37 ip-10-31-14-141.us-east-1.aws.redhat.com /usr/bin/podman[26957]: time="2024-07-27T12:31:37-04:00" level=debug msg="[graphdriver] trying provided driver \"overlay\"" Jul 27 12:31:37 ip-10-31-14-141.us-east-1.aws.redhat.com /usr/bin/podman[26957]: time="2024-07-27T12:31:37-04:00" level=debug msg="Cached value indicated that overlay is supported" Jul 27 12:31:37 ip-10-31-14-141.us-east-1.aws.redhat.com /usr/bin/podman[26957]: time="2024-07-27T12:31:37-04:00" level=debug msg="Cached value indicated that overlay is supported" Jul 27 12:31:37 ip-10-31-14-141.us-east-1.aws.redhat.com /usr/bin/podman[26957]: time="2024-07-27T12:31:37-04:00" level=debug msg="Cached value indicated that metacopy is being used" Jul 27 12:31:37 ip-10-31-14-141.us-east-1.aws.redhat.com /usr/bin/podman[26957]: time="2024-07-27T12:31:37-04:00" level=debug msg="Cached value indicated that native-diff is not being used" Jul 27 12:31:37 ip-10-31-14-141.us-east-1.aws.redhat.com /usr/bin/podman[26957]: time="2024-07-27T12:31:37-04:00" level=info msg="Not using native diff for overlay, this may cause degraded performance for building images: kernel has CONFIG_OVERLAY_FS_REDIRECT_DIR enabled" Jul 27 12:31:37 ip-10-31-14-141.us-east-1.aws.redhat.com /usr/bin/podman[26957]: time="2024-07-27T12:31:37-04:00" level=debug msg="backingFs=xfs, projectQuotaSupported=false, useNativeDiff=false, usingMetacopy=true" Jul 27 12:31:37 ip-10-31-14-141.us-east-1.aws.redhat.com /usr/bin/podman[26957]: time="2024-07-27T12:31:37-04:00" level=debug msg="Initializing event backend journald" Jul 27 12:31:37 ip-10-31-14-141.us-east-1.aws.redhat.com 
/usr/bin/podman[26957]: time="2024-07-27T12:31:37-04:00" level=debug msg="Configured OCI runtime crun-wasm initialization failed: no valid executable found for OCI runtime crun-wasm: invalid argument" Jul 27 12:31:37 ip-10-31-14-141.us-east-1.aws.redhat.com /usr/bin/podman[26957]: time="2024-07-27T12:31:37-04:00" level=debug msg="Configured OCI runtime runc initialization failed: no valid executable found for OCI runtime runc: invalid argument" Jul 27 12:31:37 ip-10-31-14-141.us-east-1.aws.redhat.com /usr/bin/podman[26957]: time="2024-07-27T12:31:37-04:00" level=debug msg="Configured OCI runtime crun-vm initialization failed: no valid executable found for OCI runtime crun-vm: invalid argument" Jul 27 12:31:37 ip-10-31-14-141.us-east-1.aws.redhat.com /usr/bin/podman[26957]: time="2024-07-27T12:31:37-04:00" level=debug msg="Configured OCI runtime runj initialization failed: no valid executable found for OCI runtime runj: invalid argument" Jul 27 12:31:37 ip-10-31-14-141.us-east-1.aws.redhat.com /usr/bin/podman[26957]: time="2024-07-27T12:31:37-04:00" level=debug msg="Configured OCI runtime kata initialization failed: no valid executable found for OCI runtime kata: invalid argument" Jul 27 12:31:37 ip-10-31-14-141.us-east-1.aws.redhat.com /usr/bin/podman[26957]: time="2024-07-27T12:31:37-04:00" level=debug msg="Configured OCI runtime runsc initialization failed: no valid executable found for OCI runtime runsc: invalid argument" Jul 27 12:31:37 ip-10-31-14-141.us-east-1.aws.redhat.com /usr/bin/podman[26957]: time="2024-07-27T12:31:37-04:00" level=debug msg="Configured OCI runtime youki initialization failed: no valid executable found for OCI runtime youki: invalid argument" Jul 27 12:31:37 ip-10-31-14-141.us-east-1.aws.redhat.com /usr/bin/podman[26957]: time="2024-07-27T12:31:37-04:00" level=debug msg="Configured OCI runtime krun initialization failed: no valid executable found for OCI runtime krun: invalid argument" Jul 27 12:31:37 
ip-10-31-14-141.us-east-1.aws.redhat.com /usr/bin/podman[26957]: time="2024-07-27T12:31:37-04:00" level=debug msg="Configured OCI runtime ocijail initialization failed: no valid executable found for OCI runtime ocijail: invalid argument" Jul 27 12:31:37 ip-10-31-14-141.us-east-1.aws.redhat.com /usr/bin/podman[26957]: time="2024-07-27T12:31:37-04:00" level=debug msg="Using OCI runtime \"/usr/bin/crun\"" Jul 27 12:31:37 ip-10-31-14-141.us-east-1.aws.redhat.com /usr/bin/podman[26957]: time="2024-07-27T12:31:37-04:00" level=info msg="Setting parallel job count to 7" Jul 27 12:31:37 ip-10-31-14-141.us-east-1.aws.redhat.com NetworkManager[678]: [1722097897.4404] device (podman1): state change: activated -> unmanaged (reason 'unmanaged-external-down', sys-iface-state: 'external') Jul 27 12:31:37 ip-10-31-14-141.us-east-1.aws.redhat.com systemd[1]: run-netns-netns\x2dad67feaf\x2dbee7\x2d774e\x2db54b\x2d8987f6ef7689.mount: Deactivated successfully. ░░ Subject: Unit succeeded ░░ Defined-By: systemd ░░ Support: https://access.redhat.com/support ░░ ░░ The unit run-netns-netns\x2dad67feaf\x2dbee7\x2d774e\x2db54b\x2d8987f6ef7689.mount has successfully entered the 'dead' state. 
Jul 27 12:31:37 ip-10-31-14-141.us-east-1.aws.redhat.com podman[26946]: 2024-07-27 12:31:37.537865961 -0400 EDT m=+0.221392646 container cleanup dbbab1074644820422d30320b7d51b465feadc09461c0f66b072f67d201c5833 (image=localhost/podman-pause:5.1.2-1720656000, name=ab3f47105f53-infra, pod_id=ab3f47105f53d1730873f106426ed61a89d93228bfacb6ee670475b2d79fdeba, io.buildah.version=1.36.0) Jul 27 12:31:37 ip-10-31-14-141.us-east-1.aws.redhat.com /usr/bin/podman[26957]: time="2024-07-27T12:31:37-04:00" level=debug msg="Called cleanup.PersistentPostRunE(/usr/bin/podman --root /var/lib/containers/storage --runroot /run/containers/storage --log-level debug --cgroup-manager systemd --tmpdir /run/libpod --network-config-dir --network-backend netavark --volumepath /var/lib/containers/storage/volumes --db-backend sqlite --transient-store=false --runtime crun --storage-driver overlay --storage-opt overlay.mountopt=nodev,metacopy=on --events-backend journald --syslog container cleanup dbbab1074644820422d30320b7d51b465feadc09461c0f66b072f67d201c5833)" Jul 27 12:31:37 ip-10-31-14-141.us-east-1.aws.redhat.com /usr/bin/podman[26957]: time="2024-07-27T12:31:37-04:00" level=debug msg="Shutting down engines" Jul 27 12:31:37 ip-10-31-14-141.us-east-1.aws.redhat.com systemd[1]: libpod-conmon-dbbab1074644820422d30320b7d51b465feadc09461c0f66b072f67d201c5833.scope: Deactivated successfully. ░░ Subject: Unit succeeded ░░ Defined-By: systemd ░░ Support: https://access.redhat.com/support ░░ ░░ The unit libpod-conmon-dbbab1074644820422d30320b7d51b465feadc09461c0f66b072f67d201c5833.scope has successfully entered the 'dead' state. Jul 27 12:31:38 ip-10-31-14-141.us-east-1.aws.redhat.com systemd[1]: var-lib-containers-storage-overlay-41f4fbe1d2e8d3fc1965435eec797ddd0ee4533f30fb8fafae206940b3c6c4b1-merged.mount: Deactivated successfully. 
░░ Subject: Unit succeeded ░░ Defined-By: systemd ░░ Support: https://access.redhat.com/support ░░ ░░ The unit var-lib-containers-storage-overlay-41f4fbe1d2e8d3fc1965435eec797ddd0ee4533f30fb8fafae206940b3c6c4b1-merged.mount has successfully entered the 'dead' state. Jul 27 12:31:38 ip-10-31-14-141.us-east-1.aws.redhat.com systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-dbbab1074644820422d30320b7d51b465feadc09461c0f66b072f67d201c5833-userdata-shm.mount: Deactivated successfully. ░░ Subject: Unit succeeded ░░ Defined-By: systemd ░░ Support: https://access.redhat.com/support ░░ ░░ The unit var-lib-containers-storage-overlay\x2dcontainers-dbbab1074644820422d30320b7d51b465feadc09461c0f66b072f67d201c5833-userdata-shm.mount has successfully entered the 'dead' state. Jul 27 12:31:47 ip-10-31-14-141.us-east-1.aws.redhat.com podman[26946]: time="2024-07-27T12:31:47-04:00" level=warning msg="StopSignal SIGTERM failed to stop container httpd2-httpd2 in 10 seconds, resorting to SIGKILL" Jul 27 12:31:47 ip-10-31-14-141.us-east-1.aws.redhat.com systemd[1]: libpod-93532dadc1783be2948aef80b74ba1a5371faf20d5d5253e0eb8cacae7fd37bf.scope: Deactivated successfully. ░░ Subject: Unit succeeded ░░ Defined-By: systemd ░░ Support: https://access.redhat.com/support ░░ ░░ The unit libpod-93532dadc1783be2948aef80b74ba1a5371faf20d5d5253e0eb8cacae7fd37bf.scope has successfully entered the 'dead' state. 
Jul 27 12:31:47 ip-10-31-14-141.us-east-1.aws.redhat.com conmon[26486]: conmon 93532dadc1783be2948a : container 26488 exited with status 137 Jul 27 12:31:47 ip-10-31-14-141.us-east-1.aws.redhat.com podman[26946]: 2024-07-27 12:31:47.402780786 -0400 EDT m=+10.086308009 container died 93532dadc1783be2948aef80b74ba1a5371faf20d5d5253e0eb8cacae7fd37bf (image=quay.io/libpod/testimage:20210610, name=httpd2-httpd2, created_at=2021-06-10T18:55:36Z, created_by=test/system/build-testimage, io.buildah.version=1.21.0, io.containers.autoupdate=registry, app=test) Jul 27 12:31:47 ip-10-31-14-141.us-east-1.aws.redhat.com conmon[26486]: conmon 93532dadc1783be2948a : Failed to open cgroups file: /sys/fs/cgroup/machine.slice/machine-libpod_pod_ab3f47105f53d1730873f106426ed61a89d93228bfacb6ee670475b2d79fdeba.slice/libpod-93532dadc1783be2948aef80b74ba1a5371faf20d5d5253e0eb8cacae7fd37bf.scope/container/memory.events Jul 27 12:31:47 ip-10-31-14-141.us-east-1.aws.redhat.com /usr/bin/podman[26979]: time="2024-07-27T12:31:47-04:00" level=debug msg="Called cleanup.PersistentPreRunE(/usr/bin/podman --root /var/lib/containers/storage --runroot /run/containers/storage --log-level debug --cgroup-manager systemd --tmpdir /run/libpod --network-config-dir --network-backend netavark --volumepath /var/lib/containers/storage/volumes --db-backend sqlite --transient-store=false --runtime crun --storage-driver overlay --storage-opt overlay.mountopt=nodev,metacopy=on --events-backend journald --syslog container cleanup 93532dadc1783be2948aef80b74ba1a5371faf20d5d5253e0eb8cacae7fd37bf)" Jul 27 12:31:47 ip-10-31-14-141.us-east-1.aws.redhat.com /usr/bin/podman[26979]: time="2024-07-27T12:31:47-04:00" level=debug msg="Setting custom database backend: \"sqlite\"" Jul 27 12:31:47 ip-10-31-14-141.us-east-1.aws.redhat.com /usr/bin/podman[26979]: time="2024-07-27T12:31:47-04:00" level=debug msg="Using conmon: \"/usr/bin/conmon\"" Jul 27 12:31:47 ip-10-31-14-141.us-east-1.aws.redhat.com /usr/bin/podman[26979]: 
time="2024-07-27T12:31:47-04:00" level=info msg="Using sqlite as database backend" Jul 27 12:31:47 ip-10-31-14-141.us-east-1.aws.redhat.com systemd[1]: var-lib-containers-storage-overlay-3ca3df692edc254075fdbd4c8f1f9612973492e52c45047ed57ead77a27e28fd-merged.mount: Deactivated successfully. ░░ Subject: Unit succeeded ░░ Defined-By: systemd ░░ Support: https://access.redhat.com/support ░░ ░░ The unit var-lib-containers-storage-overlay-3ca3df692edc254075fdbd4c8f1f9612973492e52c45047ed57ead77a27e28fd-merged.mount has successfully entered the 'dead' state. Jul 27 12:31:47 ip-10-31-14-141.us-east-1.aws.redhat.com /usr/bin/podman[26979]: time="2024-07-27T12:31:47-04:00" level=debug msg="Using graph driver overlay" Jul 27 12:31:47 ip-10-31-14-141.us-east-1.aws.redhat.com /usr/bin/podman[26979]: time="2024-07-27T12:31:47-04:00" level=debug msg="Using graph root /var/lib/containers/storage" Jul 27 12:31:47 ip-10-31-14-141.us-east-1.aws.redhat.com /usr/bin/podman[26979]: time="2024-07-27T12:31:47-04:00" level=debug msg="Using run root /run/containers/storage" Jul 27 12:31:47 ip-10-31-14-141.us-east-1.aws.redhat.com /usr/bin/podman[26979]: time="2024-07-27T12:31:47-04:00" level=debug msg="Using static dir /var/lib/containers/storage/libpod" Jul 27 12:31:47 ip-10-31-14-141.us-east-1.aws.redhat.com /usr/bin/podman[26979]: time="2024-07-27T12:31:47-04:00" level=debug msg="Using tmp dir /run/libpod" Jul 27 12:31:47 ip-10-31-14-141.us-east-1.aws.redhat.com /usr/bin/podman[26979]: time="2024-07-27T12:31:47-04:00" level=debug msg="Using volume path /var/lib/containers/storage/volumes" Jul 27 12:31:47 ip-10-31-14-141.us-east-1.aws.redhat.com /usr/bin/podman[26979]: time="2024-07-27T12:31:47-04:00" level=debug msg="Using transient store: false" Jul 27 12:31:47 ip-10-31-14-141.us-east-1.aws.redhat.com /usr/bin/podman[26979]: time="2024-07-27T12:31:47-04:00" level=debug msg="[graphdriver] trying provided driver \"overlay\"" Jul 27 12:31:47 ip-10-31-14-141.us-east-1.aws.redhat.com 
/usr/bin/podman[26979]: time="2024-07-27T12:31:47-04:00" level=debug msg="Cached value indicated that overlay is supported" Jul 27 12:31:47 ip-10-31-14-141.us-east-1.aws.redhat.com /usr/bin/podman[26979]: time="2024-07-27T12:31:47-04:00" level=debug msg="Cached value indicated that overlay is supported" Jul 27 12:31:47 ip-10-31-14-141.us-east-1.aws.redhat.com /usr/bin/podman[26979]: time="2024-07-27T12:31:47-04:00" level=debug msg="Cached value indicated that metacopy is being used" Jul 27 12:31:47 ip-10-31-14-141.us-east-1.aws.redhat.com /usr/bin/podman[26979]: time="2024-07-27T12:31:47-04:00" level=debug msg="Cached value indicated that native-diff is not being used" Jul 27 12:31:47 ip-10-31-14-141.us-east-1.aws.redhat.com /usr/bin/podman[26979]: time="2024-07-27T12:31:47-04:00" level=info msg="Not using native diff for overlay, this may cause degraded performance for building images: kernel has CONFIG_OVERLAY_FS_REDIRECT_DIR enabled" Jul 27 12:31:47 ip-10-31-14-141.us-east-1.aws.redhat.com /usr/bin/podman[26979]: time="2024-07-27T12:31:47-04:00" level=debug msg="backingFs=xfs, projectQuotaSupported=false, useNativeDiff=false, usingMetacopy=true" Jul 27 12:31:47 ip-10-31-14-141.us-east-1.aws.redhat.com /usr/bin/podman[26979]: time="2024-07-27T12:31:47-04:00" level=debug msg="Initializing event backend journald" Jul 27 12:31:47 ip-10-31-14-141.us-east-1.aws.redhat.com /usr/bin/podman[26979]: time="2024-07-27T12:31:47-04:00" level=debug msg="Configured OCI runtime runj initialization failed: no valid executable found for OCI runtime runj: invalid argument" Jul 27 12:31:47 ip-10-31-14-141.us-east-1.aws.redhat.com /usr/bin/podman[26979]: time="2024-07-27T12:31:47-04:00" level=debug msg="Configured OCI runtime kata initialization failed: no valid executable found for OCI runtime kata: invalid argument" Jul 27 12:31:47 ip-10-31-14-141.us-east-1.aws.redhat.com /usr/bin/podman[26979]: time="2024-07-27T12:31:47-04:00" level=debug msg="Configured OCI runtime runsc 
initialization failed: no valid executable found for OCI runtime runsc: invalid argument" Jul 27 12:31:47 ip-10-31-14-141.us-east-1.aws.redhat.com /usr/bin/podman[26979]: time="2024-07-27T12:31:47-04:00" level=debug msg="Configured OCI runtime youki initialization failed: no valid executable found for OCI runtime youki: invalid argument" Jul 27 12:31:47 ip-10-31-14-141.us-east-1.aws.redhat.com /usr/bin/podman[26979]: time="2024-07-27T12:31:47-04:00" level=debug msg="Configured OCI runtime krun initialization failed: no valid executable found for OCI runtime krun: invalid argument" Jul 27 12:31:47 ip-10-31-14-141.us-east-1.aws.redhat.com /usr/bin/podman[26979]: time="2024-07-27T12:31:47-04:00" level=debug msg="Configured OCI runtime crun-vm initialization failed: no valid executable found for OCI runtime crun-vm: invalid argument" Jul 27 12:31:47 ip-10-31-14-141.us-east-1.aws.redhat.com /usr/bin/podman[26979]: time="2024-07-27T12:31:47-04:00" level=debug msg="Configured OCI runtime runc initialization failed: no valid executable found for OCI runtime runc: invalid argument" Jul 27 12:31:47 ip-10-31-14-141.us-east-1.aws.redhat.com /usr/bin/podman[26979]: time="2024-07-27T12:31:47-04:00" level=debug msg="Configured OCI runtime ocijail initialization failed: no valid executable found for OCI runtime ocijail: invalid argument" Jul 27 12:31:47 ip-10-31-14-141.us-east-1.aws.redhat.com /usr/bin/podman[26979]: time="2024-07-27T12:31:47-04:00" level=debug msg="Configured OCI runtime crun-wasm initialization failed: no valid executable found for OCI runtime crun-wasm: invalid argument" Jul 27 12:31:47 ip-10-31-14-141.us-east-1.aws.redhat.com /usr/bin/podman[26979]: time="2024-07-27T12:31:47-04:00" level=debug msg="Using OCI runtime \"/usr/bin/crun\"" Jul 27 12:31:47 ip-10-31-14-141.us-east-1.aws.redhat.com /usr/bin/podman[26979]: time="2024-07-27T12:31:47-04:00" level=info msg="Setting parallel job count to 7" Jul 27 12:31:47 ip-10-31-14-141.us-east-1.aws.redhat.com 
podman[26946]: 2024-07-27 12:31:47.455732041 -0400 EDT m=+10.139258823 container cleanup 93532dadc1783be2948aef80b74ba1a5371faf20d5d5253e0eb8cacae7fd37bf (image=quay.io/libpod/testimage:20210610, name=httpd2-httpd2, pod_id=ab3f47105f53d1730873f106426ed61a89d93228bfacb6ee670475b2d79fdeba, app=test, created_at=2021-06-10T18:55:36Z, created_by=test/system/build-testimage, io.buildah.version=1.21.0, io.containers.autoupdate=registry) Jul 27 12:31:47 ip-10-31-14-141.us-east-1.aws.redhat.com /usr/bin/podman[26979]: time="2024-07-27T12:31:47-04:00" level=info msg="Received shutdown signal \"terminated\", terminating!" PID=26979 Jul 27 12:31:47 ip-10-31-14-141.us-east-1.aws.redhat.com /usr/bin/podman[26979]: time="2024-07-27T12:31:47-04:00" level=info msg="Invoking shutdown handler \"libpod\"" PID=26979 Jul 27 12:31:47 ip-10-31-14-141.us-east-1.aws.redhat.com systemd[1]: Stopping libpod-conmon-93532dadc1783be2948aef80b74ba1a5371faf20d5d5253e0eb8cacae7fd37bf.scope... ░░ Subject: A stop job for unit libpod-conmon-93532dadc1783be2948aef80b74ba1a5371faf20d5d5253e0eb8cacae7fd37bf.scope has begun execution ░░ Defined-By: systemd ░░ Support: https://access.redhat.com/support ░░ ░░ A stop job for unit libpod-conmon-93532dadc1783be2948aef80b74ba1a5371faf20d5d5253e0eb8cacae7fd37bf.scope has begun execution. ░░ ░░ The job identifier is 2608. Jul 27 12:31:47 ip-10-31-14-141.us-east-1.aws.redhat.com systemd[1]: libpod-conmon-93532dadc1783be2948aef80b74ba1a5371faf20d5d5253e0eb8cacae7fd37bf.scope: Deactivated successfully. ░░ Subject: Unit succeeded ░░ Defined-By: systemd ░░ Support: https://access.redhat.com/support ░░ ░░ The unit libpod-conmon-93532dadc1783be2948aef80b74ba1a5371faf20d5d5253e0eb8cacae7fd37bf.scope has successfully entered the 'dead' state. Jul 27 12:31:47 ip-10-31-14-141.us-east-1.aws.redhat.com systemd[1]: Stopped libpod-conmon-93532dadc1783be2948aef80b74ba1a5371faf20d5d5253e0eb8cacae7fd37bf.scope. 
░░ Subject: A stop job for unit libpod-conmon-93532dadc1783be2948aef80b74ba1a5371faf20d5d5253e0eb8cacae7fd37bf.scope has finished ░░ Defined-By: systemd ░░ Support: https://access.redhat.com/support ░░ ░░ A stop job for unit libpod-conmon-93532dadc1783be2948aef80b74ba1a5371faf20d5d5253e0eb8cacae7fd37bf.scope has finished. ░░ ░░ The job identifier is 2608 and the job result is done. Jul 27 12:31:47 ip-10-31-14-141.us-east-1.aws.redhat.com systemd[1]: NetworkManager-dispatcher.service: Deactivated successfully. ░░ Subject: Unit succeeded ░░ Defined-By: systemd ░░ Support: https://access.redhat.com/support ░░ ░░ The unit NetworkManager-dispatcher.service has successfully entered the 'dead' state. Jul 27 12:31:47 ip-10-31-14-141.us-east-1.aws.redhat.com systemd[1]: Removed slice machine-libpod_pod_ab3f47105f53d1730873f106426ed61a89d93228bfacb6ee670475b2d79fdeba.slice - cgroup machine-libpod_pod_ab3f47105f53d1730873f106426ed61a89d93228bfacb6ee670475b2d79fdeba.slice. ░░ Subject: A stop job for unit machine-libpod_pod_ab3f47105f53d1730873f106426ed61a89d93228bfacb6ee670475b2d79fdeba.slice has finished ░░ Defined-By: systemd ░░ Support: https://access.redhat.com/support ░░ ░░ A stop job for unit machine-libpod_pod_ab3f47105f53d1730873f106426ed61a89d93228bfacb6ee670475b2d79fdeba.slice has finished. ░░ ░░ The job identifier is 2607 and the job result is done. 
Jul 27 12:31:47 ip-10-31-14-141.us-east-1.aws.redhat.com podman[26946]: 2024-07-27 12:31:47.504057511 -0400 EDT m=+10.187584371 pod stop ab3f47105f53d1730873f106426ed61a89d93228bfacb6ee670475b2d79fdeba (image=, name=httpd2) Jul 27 12:31:47 ip-10-31-14-141.us-east-1.aws.redhat.com systemd[1]: machine-libpod_pod_ab3f47105f53d1730873f106426ed61a89d93228bfacb6ee670475b2d79fdeba.slice: Failed to open /run/systemd/transient/machine-libpod_pod_ab3f47105f53d1730873f106426ed61a89d93228bfacb6ee670475b2d79fdeba.slice: No such file or directory Jul 27 12:31:47 ip-10-31-14-141.us-east-1.aws.redhat.com podman[26946]: time="2024-07-27T12:31:47-04:00" level=error msg="Checking if infra needs to be stopped: removing pod ab3f47105f53d1730873f106426ed61a89d93228bfacb6ee670475b2d79fdeba cgroup: Unit machine-libpod_pod_ab3f47105f53d1730873f106426ed61a89d93228bfacb6ee670475b2d79fdeba.slice not loaded." Jul 27 12:31:47 ip-10-31-14-141.us-east-1.aws.redhat.com podman[26946]: 2024-07-27 12:31:47.731824803 -0400 EDT m=+10.415351611 container remove 93532dadc1783be2948aef80b74ba1a5371faf20d5d5253e0eb8cacae7fd37bf (image=quay.io/libpod/testimage:20210610, name=httpd2-httpd2, pod_id=ab3f47105f53d1730873f106426ed61a89d93228bfacb6ee670475b2d79fdeba, created_by=test/system/build-testimage, io.buildah.version=1.21.0, io.containers.autoupdate=registry, app=test, created_at=2021-06-10T18:55:36Z) Jul 27 12:31:47 ip-10-31-14-141.us-east-1.aws.redhat.com podman[26946]: 2024-07-27 12:31:47.771664803 -0400 EDT m=+10.455191504 container remove dbbab1074644820422d30320b7d51b465feadc09461c0f66b072f67d201c5833 (image=localhost/podman-pause:5.1.2-1720656000, name=ab3f47105f53-infra, pod_id=ab3f47105f53d1730873f106426ed61a89d93228bfacb6ee670475b2d79fdeba, io.buildah.version=1.36.0) Jul 27 12:31:47 ip-10-31-14-141.us-east-1.aws.redhat.com systemd[1]: machine-libpod_pod_ab3f47105f53d1730873f106426ed61a89d93228bfacb6ee670475b2d79fdeba.slice: Failed to open 
/run/systemd/transient/machine-libpod_pod_ab3f47105f53d1730873f106426ed61a89d93228bfacb6ee670475b2d79fdeba.slice: No such file or directory Jul 27 12:31:47 ip-10-31-14-141.us-east-1.aws.redhat.com podman[26946]: 2024-07-27 12:31:47.783059784 -0400 EDT m=+10.466586472 pod remove ab3f47105f53d1730873f106426ed61a89d93228bfacb6ee670475b2d79fdeba (image=, name=httpd2) Jul 27 12:31:47 ip-10-31-14-141.us-east-1.aws.redhat.com podman[26946]: Pods stopped: Jul 27 12:31:47 ip-10-31-14-141.us-east-1.aws.redhat.com podman[26946]: ab3f47105f53d1730873f106426ed61a89d93228bfacb6ee670475b2d79fdeba Jul 27 12:31:47 ip-10-31-14-141.us-east-1.aws.redhat.com podman[26946]: Pods removed: Jul 27 12:31:47 ip-10-31-14-141.us-east-1.aws.redhat.com podman[26946]: Error: removing pod ab3f47105f53d1730873f106426ed61a89d93228bfacb6ee670475b2d79fdeba cgroup: removing pod ab3f47105f53d1730873f106426ed61a89d93228bfacb6ee670475b2d79fdeba cgroup: Unit machine-libpod_pod_ab3f47105f53d1730873f106426ed61a89d93228bfacb6ee670475b2d79fdeba.slice not loaded. Jul 27 12:31:47 ip-10-31-14-141.us-east-1.aws.redhat.com podman[26946]: Secrets removed: Jul 27 12:31:47 ip-10-31-14-141.us-east-1.aws.redhat.com podman[26946]: Error: %!s() Jul 27 12:31:47 ip-10-31-14-141.us-east-1.aws.redhat.com podman[26946]: Volumes removed: Jul 27 12:31:47 ip-10-31-14-141.us-east-1.aws.redhat.com podman[26946]: 2024-07-27 12:31:47.812062355 -0400 EDT m=+10.495589406 container create cfa46864c7ec0493ae037af522f39cd0a603f4dc849eb026c04a5797807e00ad (image=localhost/podman-pause:5.1.2-1720656000, name=42d68b4496fe-service, PODMAN_SYSTEMD_UNIT=podman-kube@-etc-containers-ansible\x2dkubernetes.d-httpd2.yml.service) Jul 27 12:31:47 ip-10-31-14-141.us-east-1.aws.redhat.com systemd[1]: Created slice machine-libpod_pod_3fc987afc0fcfab30e8777f8ebf27fce38969878d374f1ad00ca4b1107c6d265.slice - cgroup machine-libpod_pod_3fc987afc0fcfab30e8777f8ebf27fce38969878d374f1ad00ca4b1107c6d265.slice. 
░░ Subject: A start job for unit machine-libpod_pod_3fc987afc0fcfab30e8777f8ebf27fce38969878d374f1ad00ca4b1107c6d265.slice has finished successfully ░░ Defined-By: systemd ░░ Support: https://access.redhat.com/support ░░ ░░ A start job for unit machine-libpod_pod_3fc987afc0fcfab30e8777f8ebf27fce38969878d374f1ad00ca4b1107c6d265.slice has finished successfully. ░░ ░░ The job identifier is 2609. Jul 27 12:31:47 ip-10-31-14-141.us-east-1.aws.redhat.com podman[26946]: 2024-07-27 12:31:47.874309115 -0400 EDT m=+10.557836015 container create 21934980ccc10ecb78c2d3633f5f38d2dd8ef60d679cc00813b06946c70a7e97 (image=localhost/podman-pause:5.1.2-1720656000, name=3fc987afc0fc-infra, pod_id=3fc987afc0fcfab30e8777f8ebf27fce38969878d374f1ad00ca4b1107c6d265, PODMAN_SYSTEMD_UNIT=podman-kube@-etc-containers-ansible\x2dkubernetes.d-httpd2.yml.service, io.buildah.version=1.36.0) Jul 27 12:31:47 ip-10-31-14-141.us-east-1.aws.redhat.com podman[26946]: 2024-07-27 12:31:47.88469321 -0400 EDT m=+10.568219907 pod create 3fc987afc0fcfab30e8777f8ebf27fce38969878d374f1ad00ca4b1107c6d265 (image=, name=httpd2) Jul 27 12:31:47 ip-10-31-14-141.us-east-1.aws.redhat.com podman[26946]: 2024-07-27 12:31:47.922415355 -0400 EDT m=+10.605942045 container create 28a07a6e8a801cd6f84385e3d31d70834b0d17347f7424b4c81938b703393bad (image=quay.io/libpod/testimage:20210610, name=httpd2-httpd2, pod_id=3fc987afc0fcfab30e8777f8ebf27fce38969878d374f1ad00ca4b1107c6d265, created_by=test/system/build-testimage, io.buildah.version=1.21.0, app=test, io.containers.autoupdate=registry, PODMAN_SYSTEMD_UNIT=podman-kube@-etc-containers-ansible\x2dkubernetes.d-httpd2.yml.service, created_at=2021-06-10T18:55:36Z) Jul 27 12:31:47 ip-10-31-14-141.us-east-1.aws.redhat.com podman[26946]: 2024-07-27 12:31:47.922850983 -0400 EDT m=+10.606377701 container restart cfa46864c7ec0493ae037af522f39cd0a603f4dc849eb026c04a5797807e00ad (image=localhost/podman-pause:5.1.2-1720656000, name=42d68b4496fe-service, 
PODMAN_SYSTEMD_UNIT=podman-kube@-etc-containers-ansible\x2dkubernetes.d-httpd2.yml.service) Jul 27 12:31:47 ip-10-31-14-141.us-east-1.aws.redhat.com podman[26946]: 2024-07-27 12:31:47.88754201 -0400 EDT m=+10.571068945 image pull 9f9ec7f2fdef9168f74e9d057f307955db14d782cff22ded51d277d74798cb2f quay.io/libpod/testimage:20210610 Jul 27 12:31:47 ip-10-31-14-141.us-east-1.aws.redhat.com systemd[1]: Started libpod-cfa46864c7ec0493ae037af522f39cd0a603f4dc849eb026c04a5797807e00ad.scope - libcrun container. ░░ Subject: A start job for unit libpod-cfa46864c7ec0493ae037af522f39cd0a603f4dc849eb026c04a5797807e00ad.scope has finished successfully ░░ Defined-By: systemd ░░ Support: https://access.redhat.com/support ░░ ░░ A start job for unit libpod-cfa46864c7ec0493ae037af522f39cd0a603f4dc849eb026c04a5797807e00ad.scope has finished successfully. ░░ ░░ The job identifier is 2615. Jul 27 12:31:47 ip-10-31-14-141.us-east-1.aws.redhat.com podman[26946]: 2024-07-27 12:31:47.996015765 -0400 EDT m=+10.679542710 container init cfa46864c7ec0493ae037af522f39cd0a603f4dc849eb026c04a5797807e00ad (image=localhost/podman-pause:5.1.2-1720656000, name=42d68b4496fe-service, PODMAN_SYSTEMD_UNIT=podman-kube@-etc-containers-ansible\x2dkubernetes.d-httpd2.yml.service) Jul 27 12:31:47 ip-10-31-14-141.us-east-1.aws.redhat.com podman[26946]: 2024-07-27 12:31:47.998894892 -0400 EDT m=+10.682421681 container start cfa46864c7ec0493ae037af522f39cd0a603f4dc849eb026c04a5797807e00ad (image=localhost/podman-pause:5.1.2-1720656000, name=42d68b4496fe-service, PODMAN_SYSTEMD_UNIT=podman-kube@-etc-containers-ansible\x2dkubernetes.d-httpd2.yml.service) Jul 27 12:31:48 ip-10-31-14-141.us-east-1.aws.redhat.com NetworkManager[678]: [1722097908.0151] manager: (podman1): new Bridge device (/org/freedesktop/NetworkManager/Devices/5) Jul 27 12:31:48 ip-10-31-14-141.us-east-1.aws.redhat.com kernel: podman1: port 1(veth0) entered blocking state Jul 27 12:31:48 ip-10-31-14-141.us-east-1.aws.redhat.com kernel: podman1: port 
1(veth0) entered disabled state Jul 27 12:31:48 ip-10-31-14-141.us-east-1.aws.redhat.com kernel: veth0: entered allmulticast mode Jul 27 12:31:48 ip-10-31-14-141.us-east-1.aws.redhat.com kernel: veth0: entered promiscuous mode Jul 27 12:31:48 ip-10-31-14-141.us-east-1.aws.redhat.com kernel: podman1: port 1(veth0) entered blocking state Jul 27 12:31:48 ip-10-31-14-141.us-east-1.aws.redhat.com kernel: podman1: port 1(veth0) entered forwarding state Jul 27 12:31:48 ip-10-31-14-141.us-east-1.aws.redhat.com NetworkManager[678]: [1722097908.0232] manager: (veth0): new Veth device (/org/freedesktop/NetworkManager/Devices/6) Jul 27 12:31:48 ip-10-31-14-141.us-east-1.aws.redhat.com NetworkManager[678]: [1722097908.0267] device (veth0): carrier: link connected Jul 27 12:31:48 ip-10-31-14-141.us-east-1.aws.redhat.com NetworkManager[678]: [1722097908.0271] device (podman1): carrier: link connected Jul 27 12:31:48 ip-10-31-14-141.us-east-1.aws.redhat.com (udev-worker)[26998]: Network interface NamePolicy= disabled on kernel command line. Jul 27 12:31:48 ip-10-31-14-141.us-east-1.aws.redhat.com (udev-worker)[26996]: Network interface NamePolicy= disabled on kernel command line. 
Jul 27 12:31:48 ip-10-31-14-141.us-east-1.aws.redhat.com NetworkManager[678]: [1722097908.1027] device (podman1): state change: unmanaged -> unavailable (reason 'connection-assumed', sys-iface-state: 'external') Jul 27 12:31:48 ip-10-31-14-141.us-east-1.aws.redhat.com NetworkManager[678]: [1722097908.1036] device (podman1): state change: unavailable -> disconnected (reason 'connection-assumed', sys-iface-state: 'external') Jul 27 12:31:48 ip-10-31-14-141.us-east-1.aws.redhat.com NetworkManager[678]: [1722097908.1045] device (podman1): Activation: starting connection 'podman1' (597d342f-9147-44d9-82b7-4336562fc291) Jul 27 12:31:48 ip-10-31-14-141.us-east-1.aws.redhat.com NetworkManager[678]: [1722097908.1048] device (podman1): state change: disconnected -> prepare (reason 'none', sys-iface-state: 'external') Jul 27 12:31:48 ip-10-31-14-141.us-east-1.aws.redhat.com NetworkManager[678]: [1722097908.1052] device (podman1): state change: prepare -> config (reason 'none', sys-iface-state: 'external') Jul 27 12:31:48 ip-10-31-14-141.us-east-1.aws.redhat.com NetworkManager[678]: [1722097908.1056] device (podman1): state change: config -> ip-config (reason 'none', sys-iface-state: 'external') Jul 27 12:31:48 ip-10-31-14-141.us-east-1.aws.redhat.com NetworkManager[678]: [1722097908.1060] device (podman1): state change: ip-config -> ip-check (reason 'none', sys-iface-state: 'external') Jul 27 12:31:48 ip-10-31-14-141.us-east-1.aws.redhat.com systemd[1]: Starting NetworkManager-dispatcher.service - Network Manager Script Dispatcher Service... ░░ Subject: A start job for unit NetworkManager-dispatcher.service has begun execution ░░ Defined-By: systemd ░░ Support: https://access.redhat.com/support ░░ ░░ A start job for unit NetworkManager-dispatcher.service has begun execution. ░░ ░░ The job identifier is 2622. Jul 27 12:31:48 ip-10-31-14-141.us-east-1.aws.redhat.com systemd[1]: Started NetworkManager-dispatcher.service - Network Manager Script Dispatcher Service. 
░░ Subject: A start job for unit NetworkManager-dispatcher.service has finished successfully ░░ Defined-By: systemd ░░ Support: https://access.redhat.com/support ░░ ░░ A start job for unit NetworkManager-dispatcher.service has finished successfully. ░░ ░░ The job identifier is 2622. Jul 27 12:31:48 ip-10-31-14-141.us-east-1.aws.redhat.com NetworkManager[678]: [1722097908.1424] device (podman1): state change: ip-check -> secondaries (reason 'none', sys-iface-state: 'external') Jul 27 12:31:48 ip-10-31-14-141.us-east-1.aws.redhat.com NetworkManager[678]: [1722097908.1426] device (podman1): state change: secondaries -> activated (reason 'none', sys-iface-state: 'external') Jul 27 12:31:48 ip-10-31-14-141.us-east-1.aws.redhat.com NetworkManager[678]: [1722097908.1432] device (podman1): Activation: successful, device activated. Jul 27 12:31:48 ip-10-31-14-141.us-east-1.aws.redhat.com systemd[1]: Started run-r438989e88bd74f25b2e11a3ff478396f.scope - /usr/libexec/podman/aardvark-dns --config /run/containers/networks/aardvark-dns -p 53 run. ░░ Subject: A start job for unit run-r438989e88bd74f25b2e11a3ff478396f.scope has finished successfully ░░ Defined-By: systemd ░░ Support: https://access.redhat.com/support ░░ ░░ A start job for unit run-r438989e88bd74f25b2e11a3ff478396f.scope has finished successfully. ░░ ░░ The job identifier is 2702. Jul 27 12:31:48 ip-10-31-14-141.us-east-1.aws.redhat.com systemd[1]: Started libpod-21934980ccc10ecb78c2d3633f5f38d2dd8ef60d679cc00813b06946c70a7e97.scope - libcrun container. ░░ Subject: A start job for unit libpod-21934980ccc10ecb78c2d3633f5f38d2dd8ef60d679cc00813b06946c70a7e97.scope has finished successfully ░░ Defined-By: systemd ░░ Support: https://access.redhat.com/support ░░ ░░ A start job for unit libpod-21934980ccc10ecb78c2d3633f5f38d2dd8ef60d679cc00813b06946c70a7e97.scope has finished successfully. ░░ ░░ The job identifier is 2708. 
Jul 27 12:31:48 ip-10-31-14-141.us-east-1.aws.redhat.com podman[26946]: 2024-07-27 12:31:48.22965057 -0400 EDT m=+10.913177417 container init 21934980ccc10ecb78c2d3633f5f38d2dd8ef60d679cc00813b06946c70a7e97 (image=localhost/podman-pause:5.1.2-1720656000, name=3fc987afc0fc-infra, pod_id=3fc987afc0fcfab30e8777f8ebf27fce38969878d374f1ad00ca4b1107c6d265, PODMAN_SYSTEMD_UNIT=podman-kube@-etc-containers-ansible\x2dkubernetes.d-httpd2.yml.service, io.buildah.version=1.36.0) Jul 27 12:31:48 ip-10-31-14-141.us-east-1.aws.redhat.com podman[26946]: 2024-07-27 12:31:48.232878953 -0400 EDT m=+10.916405735 container start 21934980ccc10ecb78c2d3633f5f38d2dd8ef60d679cc00813b06946c70a7e97 (image=localhost/podman-pause:5.1.2-1720656000, name=3fc987afc0fc-infra, pod_id=3fc987afc0fcfab30e8777f8ebf27fce38969878d374f1ad00ca4b1107c6d265, PODMAN_SYSTEMD_UNIT=podman-kube@-etc-containers-ansible\x2dkubernetes.d-httpd2.yml.service, io.buildah.version=1.36.0) Jul 27 12:31:48 ip-10-31-14-141.us-east-1.aws.redhat.com systemd[1]: Started libpod-28a07a6e8a801cd6f84385e3d31d70834b0d17347f7424b4c81938b703393bad.scope - libcrun container. ░░ Subject: A start job for unit libpod-28a07a6e8a801cd6f84385e3d31d70834b0d17347f7424b4c81938b703393bad.scope has finished successfully ░░ Defined-By: systemd ░░ Support: https://access.redhat.com/support ░░ ░░ A start job for unit libpod-28a07a6e8a801cd6f84385e3d31d70834b0d17347f7424b4c81938b703393bad.scope has finished successfully. ░░ ░░ The job identifier is 2715. 
Jul 27 12:31:48 ip-10-31-14-141.us-east-1.aws.redhat.com podman[26946]: 2024-07-27 12:31:48.291283612 -0400 EDT m=+10.974810527 container init 28a07a6e8a801cd6f84385e3d31d70834b0d17347f7424b4c81938b703393bad (image=quay.io/libpod/testimage:20210610, name=httpd2-httpd2, pod_id=3fc987afc0fcfab30e8777f8ebf27fce38969878d374f1ad00ca4b1107c6d265, io.containers.autoupdate=registry, PODMAN_SYSTEMD_UNIT=podman-kube@-etc-containers-ansible\x2dkubernetes.d-httpd2.yml.service, app=test, created_at=2021-06-10T18:55:36Z, created_by=test/system/build-testimage, io.buildah.version=1.21.0) Jul 27 12:31:48 ip-10-31-14-141.us-east-1.aws.redhat.com podman[26946]: 2024-07-27 12:31:48.294305812 -0400 EDT m=+10.977832574 container start 28a07a6e8a801cd6f84385e3d31d70834b0d17347f7424b4c81938b703393bad (image=quay.io/libpod/testimage:20210610, name=httpd2-httpd2, pod_id=3fc987afc0fcfab30e8777f8ebf27fce38969878d374f1ad00ca4b1107c6d265, created_by=test/system/build-testimage, io.buildah.version=1.21.0, io.containers.autoupdate=registry, PODMAN_SYSTEMD_UNIT=podman-kube@-etc-containers-ansible\x2dkubernetes.d-httpd2.yml.service, app=test, created_at=2021-06-10T18:55:36Z) Jul 27 12:31:48 ip-10-31-14-141.us-east-1.aws.redhat.com podman[26946]: 2024-07-27 12:31:48.301206475 -0400 EDT m=+10.984733271 pod start 3fc987afc0fcfab30e8777f8ebf27fce38969878d374f1ad00ca4b1107c6d265 (image=, name=httpd2) Jul 27 12:31:48 ip-10-31-14-141.us-east-1.aws.redhat.com podman[26946]: Pod: Jul 27 12:31:48 ip-10-31-14-141.us-east-1.aws.redhat.com podman[26946]: 3fc987afc0fcfab30e8777f8ebf27fce38969878d374f1ad00ca4b1107c6d265 Jul 27 12:31:48 ip-10-31-14-141.us-east-1.aws.redhat.com podman[26946]: Container: Jul 27 12:31:48 ip-10-31-14-141.us-east-1.aws.redhat.com podman[26946]: 28a07a6e8a801cd6f84385e3d31d70834b0d17347f7424b4c81938b703393bad Jul 27 12:31:48 ip-10-31-14-141.us-east-1.aws.redhat.com systemd[1]: Started podman-kube@-etc-containers-ansible\x2dkubernetes.d-httpd2.yml.service - A template for running K8s 
workloads via podman-kube-play. ░░ Subject: A start job for unit podman-kube@-etc-containers-ansible\x2dkubernetes.d-httpd2.yml.service has finished successfully ░░ Defined-By: systemd ░░ Support: https://access.redhat.com/support ░░ ░░ A start job for unit podman-kube@-etc-containers-ansible\x2dkubernetes.d-httpd2.yml.service has finished successfully. ░░ ░░ The job identifier is 2521. Jul 27 12:31:49 ip-10-31-14-141.us-east-1.aws.redhat.com python3.12[27162]: ansible-getent Invoked with database=group key=0 fail_key=False service=None split=None Jul 27 12:31:49 ip-10-31-14-141.us-east-1.aws.redhat.com python3.12[27276]: ansible-stat Invoked with path=/usr/bin/getsubids follow=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1 Jul 27 12:31:50 ip-10-31-14-141.us-east-1.aws.redhat.com python3.12[27391]: ansible-ansible.legacy.command Invoked with _raw_params=systemd-escape --template podman-kube@.service /etc/containers/ansible-kubernetes.d/httpd3.yml _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None Jul 27 12:31:51 ip-10-31-14-141.us-east-1.aws.redhat.com python3.12[27505]: ansible-file Invoked with path=/tmp/lsr_ykmfaaw1_podman/httpd3 state=directory owner=root group=root recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None seuser=None serole=None selevel=None setype=None attributes=None Jul 27 12:31:52 ip-10-31-14-141.us-east-1.aws.redhat.com python3.12[27618]: ansible-file Invoked with path=/tmp/lsr_ykmfaaw1_podman/httpd3-create state=directory owner=root group=root recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None 
modification_time=None access_time=None mode=None seuser=None serole=None selevel=None setype=None attributes=None Jul 27 12:31:52 ip-10-31-14-141.us-east-1.aws.redhat.com podman[27747]: 2024-07-27 12:31:52.975709585 -0400 EDT m=+0.319000869 image pull 9f9ec7f2fdef9168f74e9d057f307955db14d782cff22ded51d277d74798cb2f quay.io/libpod/testimage:20210610 Jul 27 12:31:53 ip-10-31-14-141.us-east-1.aws.redhat.com python3.12[27874]: ansible-stat Invoked with path=/etc/containers/ansible-kubernetes.d/httpd3.yml follow=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1 Jul 27 12:31:53 ip-10-31-14-141.us-east-1.aws.redhat.com python3.12[27987]: ansible-file Invoked with path=/etc/containers/ansible-kubernetes.d state=directory owner=root group=0 mode=0755 recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None Jul 27 12:31:54 ip-10-31-14-141.us-east-1.aws.redhat.com python3.12[28100]: ansible-ansible.legacy.stat Invoked with path=/etc/containers/ansible-kubernetes.d/httpd3.yml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True Jul 27 12:31:54 ip-10-31-14-141.us-east-1.aws.redhat.com python3.12[28190]: ansible-ansible.legacy.copy Invoked with dest=/etc/containers/ansible-kubernetes.d/httpd3.yml owner=root group=0 mode=0644 src=/root/.ansible/tmp/ansible-tmp-1722097913.9955175-8372-237534320432247/.source.yml _original_basename=.t6d2o5p5 follow=False checksum=7767c5f8bacb5de840129bb71124239716453343 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None Jul 27 12:31:55 ip-10-31-14-141.us-east-1.aws.redhat.com 
python3.12[28303]: ansible-containers.podman.podman_play Invoked with state=started kube_file=/etc/containers/ansible-kubernetes.d/httpd3.yml executable=podman annotation=None authfile=None build=None cert_dir=None configmap=None context_dir=None seccomp_profile_root=None username=None password=NOT_LOGGING_PARAMETER log_driver=None log_opt=None network=None tls_verify=None debug=None quiet=None recreate=None userns=None log_level=None quadlet_dir=None quadlet_filename=None quadlet_options=None Jul 27 12:31:55 ip-10-31-14-141.us-east-1.aws.redhat.com systemd[1]: Created slice machine-libpod_pod_51e54c7f30426f36d3559784a8c8bbc4b3f498176e5f23a0774597e750d3f6ae.slice - cgroup machine-libpod_pod_51e54c7f30426f36d3559784a8c8bbc4b3f498176e5f23a0774597e750d3f6ae.slice. ░░ Subject: A start job for unit machine-libpod_pod_51e54c7f30426f36d3559784a8c8bbc4b3f498176e5f23a0774597e750d3f6ae.slice has finished successfully ░░ Defined-By: systemd ░░ Support: https://access.redhat.com/support ░░ ░░ A start job for unit machine-libpod_pod_51e54c7f30426f36d3559784a8c8bbc4b3f498176e5f23a0774597e750d3f6ae.slice has finished successfully. ░░ ░░ The job identifier is 2722. 
Jul 27 12:31:55 ip-10-31-14-141.us-east-1.aws.redhat.com podman[28310]: 2024-07-27 12:31:55.33431411 -0400 EDT m=+0.211685672 container create daada585ee8ec8c6af2a44c62ff5e665877088e370f3186af224d93b45274886 (image=localhost/podman-pause:5.1.2-1720656000, name=51e54c7f3042-infra, pod_id=51e54c7f30426f36d3559784a8c8bbc4b3f498176e5f23a0774597e750d3f6ae, io.buildah.version=1.36.0) Jul 27 12:31:55 ip-10-31-14-141.us-east-1.aws.redhat.com podman[28310]: 2024-07-27 12:31:55.340964088 -0400 EDT m=+0.218335431 pod create 51e54c7f30426f36d3559784a8c8bbc4b3f498176e5f23a0774597e750d3f6ae (image=, name=httpd3) Jul 27 12:31:55 ip-10-31-14-141.us-east-1.aws.redhat.com podman[28310]: 2024-07-27 12:31:55.343781908 -0400 EDT m=+0.221153482 image pull 9f9ec7f2fdef9168f74e9d057f307955db14d782cff22ded51d277d74798cb2f quay.io/libpod/testimage:20210610 Jul 27 12:31:55 ip-10-31-14-141.us-east-1.aws.redhat.com podman[28310]: 2024-07-27 12:31:55.373519612 -0400 EDT m=+0.250891059 container create 030999d8e1f2ce202ca3845ef1e18268a7318f66b0ebfeda66866bb8082444ad (image=quay.io/libpod/testimage:20210610, name=httpd3-httpd3, pod_id=51e54c7f30426f36d3559784a8c8bbc4b3f498176e5f23a0774597e750d3f6ae, io.containers.autoupdate=registry, created_at=2021-06-10T18:55:36Z, created_by=test/system/build-testimage, io.buildah.version=1.21.0, app=test) Jul 27 12:31:55 ip-10-31-14-141.us-east-1.aws.redhat.com kernel: podman1: port 2(veth1) entered blocking state Jul 27 12:31:55 ip-10-31-14-141.us-east-1.aws.redhat.com kernel: podman1: port 2(veth1) entered disabled state Jul 27 12:31:55 ip-10-31-14-141.us-east-1.aws.redhat.com kernel: veth1: entered allmulticast mode Jul 27 12:31:55 ip-10-31-14-141.us-east-1.aws.redhat.com kernel: veth1: entered promiscuous mode Jul 27 12:31:55 ip-10-31-14-141.us-east-1.aws.redhat.com kernel: podman1: port 2(veth1) entered blocking state Jul 27 12:31:55 ip-10-31-14-141.us-east-1.aws.redhat.com kernel: podman1: port 2(veth1) entered forwarding state Jul 27 12:31:55 
ip-10-31-14-141.us-east-1.aws.redhat.com NetworkManager[678]: [1722097915.4016] manager: (veth1): new Veth device (/org/freedesktop/NetworkManager/Devices/7) Jul 27 12:31:55 ip-10-31-14-141.us-east-1.aws.redhat.com NetworkManager[678]: [1722097915.4080] device (veth1): carrier: link connected Jul 27 12:31:55 ip-10-31-14-141.us-east-1.aws.redhat.com (udev-worker)[28321]: Network interface NamePolicy= disabled on kernel command line. Jul 27 12:31:55 ip-10-31-14-141.us-east-1.aws.redhat.com systemd[1]: Started libpod-conmon-daada585ee8ec8c6af2a44c62ff5e665877088e370f3186af224d93b45274886.scope. ░░ Subject: A start job for unit libpod-conmon-daada585ee8ec8c6af2a44c62ff5e665877088e370f3186af224d93b45274886.scope has finished successfully ░░ Defined-By: systemd ░░ Support: https://access.redhat.com/support ░░ ░░ A start job for unit libpod-conmon-daada585ee8ec8c6af2a44c62ff5e665877088e370f3186af224d93b45274886.scope has finished successfully. ░░ ░░ The job identifier is 2729. Jul 27 12:31:55 ip-10-31-14-141.us-east-1.aws.redhat.com systemd[1]: tmp-crun.C1B2RW.mount: Deactivated successfully. ░░ Subject: Unit succeeded ░░ Defined-By: systemd ░░ Support: https://access.redhat.com/support ░░ ░░ The unit tmp-crun.C1B2RW.mount has successfully entered the 'dead' state. Jul 27 12:31:55 ip-10-31-14-141.us-east-1.aws.redhat.com systemd[1]: Started libpod-daada585ee8ec8c6af2a44c62ff5e665877088e370f3186af224d93b45274886.scope - libcrun container. ░░ Subject: A start job for unit libpod-daada585ee8ec8c6af2a44c62ff5e665877088e370f3186af224d93b45274886.scope has finished successfully ░░ Defined-By: systemd ░░ Support: https://access.redhat.com/support ░░ ░░ A start job for unit libpod-daada585ee8ec8c6af2a44c62ff5e665877088e370f3186af224d93b45274886.scope has finished successfully. ░░ ░░ The job identifier is 2736. 
Jul 27 12:31:55 ip-10-31-14-141.us-east-1.aws.redhat.com podman[28310]: 2024-07-27 12:31:55.536700541 -0400 EDT m=+0.414072178 container init daada585ee8ec8c6af2a44c62ff5e665877088e370f3186af224d93b45274886 (image=localhost/podman-pause:5.1.2-1720656000, name=51e54c7f3042-infra, pod_id=51e54c7f30426f36d3559784a8c8bbc4b3f498176e5f23a0774597e750d3f6ae, io.buildah.version=1.36.0) Jul 27 12:31:55 ip-10-31-14-141.us-east-1.aws.redhat.com podman[28310]: 2024-07-27 12:31:55.541067417 -0400 EDT m=+0.418438913 container start daada585ee8ec8c6af2a44c62ff5e665877088e370f3186af224d93b45274886 (image=localhost/podman-pause:5.1.2-1720656000, name=51e54c7f3042-infra, pod_id=51e54c7f30426f36d3559784a8c8bbc4b3f498176e5f23a0774597e750d3f6ae, io.buildah.version=1.36.0) Jul 27 12:31:55 ip-10-31-14-141.us-east-1.aws.redhat.com systemd[1]: Started libpod-conmon-030999d8e1f2ce202ca3845ef1e18268a7318f66b0ebfeda66866bb8082444ad.scope. ░░ Subject: A start job for unit libpod-conmon-030999d8e1f2ce202ca3845ef1e18268a7318f66b0ebfeda66866bb8082444ad.scope has finished successfully ░░ Defined-By: systemd ░░ Support: https://access.redhat.com/support ░░ ░░ A start job for unit libpod-conmon-030999d8e1f2ce202ca3845ef1e18268a7318f66b0ebfeda66866bb8082444ad.scope has finished successfully. ░░ ░░ The job identifier is 2743. Jul 27 12:31:55 ip-10-31-14-141.us-east-1.aws.redhat.com systemd[1]: Started libpod-030999d8e1f2ce202ca3845ef1e18268a7318f66b0ebfeda66866bb8082444ad.scope - libcrun container. ░░ Subject: A start job for unit libpod-030999d8e1f2ce202ca3845ef1e18268a7318f66b0ebfeda66866bb8082444ad.scope has finished successfully ░░ Defined-By: systemd ░░ Support: https://access.redhat.com/support ░░ ░░ A start job for unit libpod-030999d8e1f2ce202ca3845ef1e18268a7318f66b0ebfeda66866bb8082444ad.scope has finished successfully. ░░ ░░ The job identifier is 2750. 
Jul 27 12:31:55 ip-10-31-14-141.us-east-1.aws.redhat.com podman[28310]: 2024-07-27 12:31:55.613827977 -0400 EDT m=+0.491199603 container init 030999d8e1f2ce202ca3845ef1e18268a7318f66b0ebfeda66866bb8082444ad (image=quay.io/libpod/testimage:20210610, name=httpd3-httpd3, pod_id=51e54c7f30426f36d3559784a8c8bbc4b3f498176e5f23a0774597e750d3f6ae, app=test, created_at=2021-06-10T18:55:36Z, created_by=test/system/build-testimage, io.buildah.version=1.21.0, io.containers.autoupdate=registry) Jul 27 12:31:55 ip-10-31-14-141.us-east-1.aws.redhat.com podman[28310]: 2024-07-27 12:31:55.617027388 -0400 EDT m=+0.494398924 container start 030999d8e1f2ce202ca3845ef1e18268a7318f66b0ebfeda66866bb8082444ad (image=quay.io/libpod/testimage:20210610, name=httpd3-httpd3, pod_id=51e54c7f30426f36d3559784a8c8bbc4b3f498176e5f23a0774597e750d3f6ae, created_at=2021-06-10T18:55:36Z, created_by=test/system/build-testimage, io.buildah.version=1.21.0, io.containers.autoupdate=registry, app=test) Jul 27 12:31:55 ip-10-31-14-141.us-east-1.aws.redhat.com podman[28310]: 2024-07-27 12:31:55.624042322 -0400 EDT m=+0.501413678 pod start 51e54c7f30426f36d3559784a8c8bbc4b3f498176e5f23a0774597e750d3f6ae (image=, name=httpd3) Jul 27 12:31:56 ip-10-31-14-141.us-east-1.aws.redhat.com python3.12[28470]: ansible-systemd Invoked with daemon_reload=True scope=system daemon_reexec=False no_block=False name=None state=None enabled=None force=None masked=None Jul 27 12:31:56 ip-10-31-14-141.us-east-1.aws.redhat.com systemd[1]: Reload requested from client PID 28471 ('systemctl') (unit session-5.scope)... Jul 27 12:31:56 ip-10-31-14-141.us-east-1.aws.redhat.com systemd[1]: Reloading... Jul 27 12:31:56 ip-10-31-14-141.us-east-1.aws.redhat.com systemd[1]: Reloading finished in 246 ms. 
Jul 27 12:31:57 ip-10-31-14-141.us-east-1.aws.redhat.com python3.12[28639]: ansible-systemd Invoked with name=podman-kube@-etc-containers-ansible\x2dkubernetes.d-httpd3.yml.service scope=system enabled=True daemon_reload=False daemon_reexec=False no_block=False state=None force=None masked=None Jul 27 12:31:57 ip-10-31-14-141.us-east-1.aws.redhat.com systemd[1]: Reload requested from client PID 28642 ('systemctl') (unit session-5.scope)... Jul 27 12:31:57 ip-10-31-14-141.us-east-1.aws.redhat.com systemd[1]: Reloading... Jul 27 12:31:57 ip-10-31-14-141.us-east-1.aws.redhat.com systemd[1]: Reloading finished in 244 ms. Jul 27 12:31:57 ip-10-31-14-141.us-east-1.aws.redhat.com python3.12[28810]: ansible-systemd Invoked with name=podman-kube@-etc-containers-ansible\x2dkubernetes.d-httpd3.yml.service scope=system state=started daemon_reload=False daemon_reexec=False no_block=False enabled=None force=None masked=None Jul 27 12:31:57 ip-10-31-14-141.us-east-1.aws.redhat.com systemd[1]: Starting podman-kube@-etc-containers-ansible\x2dkubernetes.d-httpd3.yml.service - A template for running K8s workloads via podman-kube-play... ░░ Subject: A start job for unit podman-kube@-etc-containers-ansible\x2dkubernetes.d-httpd3.yml.service has begun execution ░░ Defined-By: systemd ░░ Support: https://access.redhat.com/support ░░ ░░ A start job for unit podman-kube@-etc-containers-ansible\x2dkubernetes.d-httpd3.yml.service has begun execution. ░░ ░░ The job identifier is 2757. Jul 27 12:31:58 ip-10-31-14-141.us-east-1.aws.redhat.com podman[28814]: 2024-07-27 12:31:58.02647191 -0400 EDT m=+0.033549678 pod stop 51e54c7f30426f36d3559784a8c8bbc4b3f498176e5f23a0774597e750d3f6ae (image=, name=httpd3) Jul 27 12:31:58 ip-10-31-14-141.us-east-1.aws.redhat.com systemd[1]: tmp-crun.yauA7R.mount: Deactivated successfully. ░░ Subject: Unit succeeded ░░ Defined-By: systemd ░░ Support: https://access.redhat.com/support ░░ ░░ The unit tmp-crun.yauA7R.mount has successfully entered the 'dead' state. 
Jul 27 12:31:58 ip-10-31-14-141.us-east-1.aws.redhat.com systemd[1]: libpod-daada585ee8ec8c6af2a44c62ff5e665877088e370f3186af224d93b45274886.scope: Deactivated successfully. ░░ Subject: Unit succeeded ░░ Defined-By: systemd ░░ Support: https://access.redhat.com/support ░░ ░░ The unit libpod-daada585ee8ec8c6af2a44c62ff5e665877088e370f3186af224d93b45274886.scope has successfully entered the 'dead' state. Jul 27 12:31:58 ip-10-31-14-141.us-east-1.aws.redhat.com podman[28814]: 2024-07-27 12:31:58.059289555 -0400 EDT m=+0.066367248 container died daada585ee8ec8c6af2a44c62ff5e665877088e370f3186af224d93b45274886 (image=localhost/podman-pause:5.1.2-1720656000, name=51e54c7f3042-infra, io.buildah.version=1.36.0) Jul 27 12:31:58 ip-10-31-14-141.us-east-1.aws.redhat.com systemd[1]: tmp-crun.pMdNa2.mount: Deactivated successfully. ░░ Subject: Unit succeeded ░░ Defined-By: systemd ░░ Support: https://access.redhat.com/support ░░ ░░ The unit tmp-crun.pMdNa2.mount has successfully entered the 'dead' state. Jul 27 12:31:58 ip-10-31-14-141.us-east-1.aws.redhat.com kernel: podman1: port 2(veth1) entered disabled state Jul 27 12:31:58 ip-10-31-14-141.us-east-1.aws.redhat.com kernel: veth1 (unregistering): left allmulticast mode Jul 27 12:31:58 ip-10-31-14-141.us-east-1.aws.redhat.com kernel: veth1 (unregistering): left promiscuous mode Jul 27 12:31:58 ip-10-31-14-141.us-east-1.aws.redhat.com kernel: podman1: port 2(veth1) entered disabled state Jul 27 12:31:58 ip-10-31-14-141.us-east-1.aws.redhat.com systemd[1]: run-netns-netns\x2dce489bf5\x2d85ef\x2d501a\x2d3443\x2d01e8362eda34.mount: Deactivated successfully. ░░ Subject: Unit succeeded ░░ Defined-By: systemd ░░ Support: https://access.redhat.com/support ░░ ░░ The unit run-netns-netns\x2dce489bf5\x2d85ef\x2d501a\x2d3443\x2d01e8362eda34.mount has successfully entered the 'dead' state. 
Jul 27 12:31:58 ip-10-31-14-141.us-east-1.aws.redhat.com podman[28814]: 2024-07-27 12:31:58.156941831 -0400 EDT m=+0.164019206 container cleanup daada585ee8ec8c6af2a44c62ff5e665877088e370f3186af224d93b45274886 (image=localhost/podman-pause:5.1.2-1720656000, name=51e54c7f3042-infra, pod_id=51e54c7f30426f36d3559784a8c8bbc4b3f498176e5f23a0774597e750d3f6ae, io.buildah.version=1.36.0) Jul 27 12:31:58 ip-10-31-14-141.us-east-1.aws.redhat.com systemd[1]: libpod-conmon-daada585ee8ec8c6af2a44c62ff5e665877088e370f3186af224d93b45274886.scope: Deactivated successfully. ░░ Subject: Unit succeeded ░░ Defined-By: systemd ░░ Support: https://access.redhat.com/support ░░ ░░ The unit libpod-conmon-daada585ee8ec8c6af2a44c62ff5e665877088e370f3186af224d93b45274886.scope has successfully entered the 'dead' state. Jul 27 12:31:58 ip-10-31-14-141.us-east-1.aws.redhat.com systemd[1]: NetworkManager-dispatcher.service: Deactivated successfully. ░░ Subject: Unit succeeded ░░ Defined-By: systemd ░░ Support: https://access.redhat.com/support ░░ ░░ The unit NetworkManager-dispatcher.service has successfully entered the 'dead' state. Jul 27 12:31:59 ip-10-31-14-141.us-east-1.aws.redhat.com systemd[1]: var-lib-containers-storage-overlay-f6c156948585188fb8ba9c7554ec284c006f6c771ac0831d6c74c7be4400feb5-merged.mount: Deactivated successfully. ░░ Subject: Unit succeeded ░░ Defined-By: systemd ░░ Support: https://access.redhat.com/support ░░ ░░ The unit var-lib-containers-storage-overlay-f6c156948585188fb8ba9c7554ec284c006f6c771ac0831d6c74c7be4400feb5-merged.mount has successfully entered the 'dead' state. Jul 27 12:31:59 ip-10-31-14-141.us-east-1.aws.redhat.com systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-daada585ee8ec8c6af2a44c62ff5e665877088e370f3186af224d93b45274886-userdata-shm.mount: Deactivated successfully. 
░░ Subject: Unit succeeded ░░ Defined-By: systemd ░░ Support: https://access.redhat.com/support ░░ ░░ The unit var-lib-containers-storage-overlay\x2dcontainers-daada585ee8ec8c6af2a44c62ff5e665877088e370f3186af224d93b45274886-userdata-shm.mount has successfully entered the 'dead' state. Jul 27 12:32:08 ip-10-31-14-141.us-east-1.aws.redhat.com podman[28814]: time="2024-07-27T12:32:08-04:00" level=warning msg="StopSignal SIGTERM failed to stop container httpd3-httpd3 in 10 seconds, resorting to SIGKILL" Jul 27 12:32:08 ip-10-31-14-141.us-east-1.aws.redhat.com systemd[1]: libpod-030999d8e1f2ce202ca3845ef1e18268a7318f66b0ebfeda66866bb8082444ad.scope: Deactivated successfully. ░░ Subject: Unit succeeded ░░ Defined-By: systemd ░░ Support: https://access.redhat.com/support ░░ ░░ The unit libpod-030999d8e1f2ce202ca3845ef1e18268a7318f66b0ebfeda66866bb8082444ad.scope has successfully entered the 'dead' state. Jul 27 12:32:08 ip-10-31-14-141.us-east-1.aws.redhat.com podman[28814]: 2024-07-27 12:32:08.083023875 -0400 EDT m=+10.090101456 container died 030999d8e1f2ce202ca3845ef1e18268a7318f66b0ebfeda66866bb8082444ad (image=quay.io/libpod/testimage:20210610, name=httpd3-httpd3, created_at=2021-06-10T18:55:36Z, created_by=test/system/build-testimage, io.buildah.version=1.21.0, io.containers.autoupdate=registry, app=test) Jul 27 12:32:08 ip-10-31-14-141.us-east-1.aws.redhat.com systemd[1]: var-lib-containers-storage-overlay-8862b3c043b48e30eea977b7aef7985f37de3169dba05dc91b86c318229883f3-merged.mount: Deactivated successfully. ░░ Subject: Unit succeeded ░░ Defined-By: systemd ░░ Support: https://access.redhat.com/support ░░ ░░ The unit var-lib-containers-storage-overlay-8862b3c043b48e30eea977b7aef7985f37de3169dba05dc91b86c318229883f3-merged.mount has successfully entered the 'dead' state. 
Jul 27 12:32:08 ip-10-31-14-141.us-east-1.aws.redhat.com podman[28814]: 2024-07-27 12:32:08.122996557 -0400 EDT m=+10.130074034 container cleanup 030999d8e1f2ce202ca3845ef1e18268a7318f66b0ebfeda66866bb8082444ad (image=quay.io/libpod/testimage:20210610, name=httpd3-httpd3, pod_id=51e54c7f30426f36d3559784a8c8bbc4b3f498176e5f23a0774597e750d3f6ae, created_at=2021-06-10T18:55:36Z, created_by=test/system/build-testimage, io.buildah.version=1.21.0, io.containers.autoupdate=registry, app=test) Jul 27 12:32:08 ip-10-31-14-141.us-east-1.aws.redhat.com systemd[1]: Stopping libpod-conmon-030999d8e1f2ce202ca3845ef1e18268a7318f66b0ebfeda66866bb8082444ad.scope... ░░ Subject: A stop job for unit libpod-conmon-030999d8e1f2ce202ca3845ef1e18268a7318f66b0ebfeda66866bb8082444ad.scope has begun execution ░░ Defined-By: systemd ░░ Support: https://access.redhat.com/support ░░ ░░ A stop job for unit libpod-conmon-030999d8e1f2ce202ca3845ef1e18268a7318f66b0ebfeda66866bb8082444ad.scope has begun execution. ░░ ░░ The job identifier is 2844. Jul 27 12:32:08 ip-10-31-14-141.us-east-1.aws.redhat.com systemd[1]: libpod-conmon-030999d8e1f2ce202ca3845ef1e18268a7318f66b0ebfeda66866bb8082444ad.scope: Deactivated successfully. ░░ Subject: Unit succeeded ░░ Defined-By: systemd ░░ Support: https://access.redhat.com/support ░░ ░░ The unit libpod-conmon-030999d8e1f2ce202ca3845ef1e18268a7318f66b0ebfeda66866bb8082444ad.scope has successfully entered the 'dead' state. Jul 27 12:32:08 ip-10-31-14-141.us-east-1.aws.redhat.com systemd[1]: Stopped libpod-conmon-030999d8e1f2ce202ca3845ef1e18268a7318f66b0ebfeda66866bb8082444ad.scope. ░░ Subject: A stop job for unit libpod-conmon-030999d8e1f2ce202ca3845ef1e18268a7318f66b0ebfeda66866bb8082444ad.scope has finished ░░ Defined-By: systemd ░░ Support: https://access.redhat.com/support ░░ ░░ A stop job for unit libpod-conmon-030999d8e1f2ce202ca3845ef1e18268a7318f66b0ebfeda66866bb8082444ad.scope has finished. ░░ ░░ The job identifier is 2844 and the job result is done. 
Jul 27 12:32:08 ip-10-31-14-141.us-east-1.aws.redhat.com systemd[1]: Removed slice machine-libpod_pod_51e54c7f30426f36d3559784a8c8bbc4b3f498176e5f23a0774597e750d3f6ae.slice - cgroup machine-libpod_pod_51e54c7f30426f36d3559784a8c8bbc4b3f498176e5f23a0774597e750d3f6ae.slice. ░░ Subject: A stop job for unit machine-libpod_pod_51e54c7f30426f36d3559784a8c8bbc4b3f498176e5f23a0774597e750d3f6ae.slice has finished ░░ Defined-By: systemd ░░ Support: https://access.redhat.com/support ░░ ░░ A stop job for unit machine-libpod_pod_51e54c7f30426f36d3559784a8c8bbc4b3f498176e5f23a0774597e750d3f6ae.slice has finished. ░░ ░░ The job identifier is 2843 and the job result is done. Jul 27 12:32:08 ip-10-31-14-141.us-east-1.aws.redhat.com podman[28814]: 2024-07-27 12:32:08.135581215 -0400 EDT m=+10.142658735 pod stop 51e54c7f30426f36d3559784a8c8bbc4b3f498176e5f23a0774597e750d3f6ae (image=, name=httpd3) Jul 27 12:32:08 ip-10-31-14-141.us-east-1.aws.redhat.com systemd[1]: machine-libpod_pod_51e54c7f30426f36d3559784a8c8bbc4b3f498176e5f23a0774597e750d3f6ae.slice: Failed to open /run/systemd/transient/machine-libpod_pod_51e54c7f30426f36d3559784a8c8bbc4b3f498176e5f23a0774597e750d3f6ae.slice: No such file or directory Jul 27 12:32:08 ip-10-31-14-141.us-east-1.aws.redhat.com podman[28814]: time="2024-07-27T12:32:08-04:00" level=error msg="Checking if infra needs to be stopped: removing pod 51e54c7f30426f36d3559784a8c8bbc4b3f498176e5f23a0774597e750d3f6ae cgroup: Unit machine-libpod_pod_51e54c7f30426f36d3559784a8c8bbc4b3f498176e5f23a0774597e750d3f6ae.slice not loaded." 
Jul 27 12:32:08 ip-10-31-14-141.us-east-1.aws.redhat.com podman[28814]: 2024-07-27 12:32:08.166070444 -0400 EDT m=+10.173148843 pod stop 51e54c7f30426f36d3559784a8c8bbc4b3f498176e5f23a0774597e750d3f6ae (image=, name=httpd3) Jul 27 12:32:08 ip-10-31-14-141.us-east-1.aws.redhat.com systemd[1]: machine-libpod_pod_51e54c7f30426f36d3559784a8c8bbc4b3f498176e5f23a0774597e750d3f6ae.slice: Failed to open /run/systemd/transient/machine-libpod_pod_51e54c7f30426f36d3559784a8c8bbc4b3f498176e5f23a0774597e750d3f6ae.slice: No such file or directory Jul 27 12:32:08 ip-10-31-14-141.us-east-1.aws.redhat.com podman[28814]: time="2024-07-27T12:32:08-04:00" level=error msg="Checking if infra needs to be stopped: removing pod 51e54c7f30426f36d3559784a8c8bbc4b3f498176e5f23a0774597e750d3f6ae cgroup: Unit machine-libpod_pod_51e54c7f30426f36d3559784a8c8bbc4b3f498176e5f23a0774597e750d3f6ae.slice not loaded." Jul 27 12:32:08 ip-10-31-14-141.us-east-1.aws.redhat.com podman[28814]: 2024-07-27 12:32:08.196197386 -0400 EDT m=+10.203274858 container remove 030999d8e1f2ce202ca3845ef1e18268a7318f66b0ebfeda66866bb8082444ad (image=quay.io/libpod/testimage:20210610, name=httpd3-httpd3, pod_id=51e54c7f30426f36d3559784a8c8bbc4b3f498176e5f23a0774597e750d3f6ae, app=test, created_at=2021-06-10T18:55:36Z, created_by=test/system/build-testimage, io.buildah.version=1.21.0, io.containers.autoupdate=registry) Jul 27 12:32:08 ip-10-31-14-141.us-east-1.aws.redhat.com podman[28814]: 2024-07-27 12:32:08.224778396 -0400 EDT m=+10.231855787 container remove daada585ee8ec8c6af2a44c62ff5e665877088e370f3186af224d93b45274886 (image=localhost/podman-pause:5.1.2-1720656000, name=51e54c7f3042-infra, pod_id=51e54c7f30426f36d3559784a8c8bbc4b3f498176e5f23a0774597e750d3f6ae, io.buildah.version=1.36.0) Jul 27 12:32:08 ip-10-31-14-141.us-east-1.aws.redhat.com systemd[1]: machine-libpod_pod_51e54c7f30426f36d3559784a8c8bbc4b3f498176e5f23a0774597e750d3f6ae.slice: Failed to open 
/run/systemd/transient/machine-libpod_pod_51e54c7f30426f36d3559784a8c8bbc4b3f498176e5f23a0774597e750d3f6ae.slice: No such file or directory Jul 27 12:32:08 ip-10-31-14-141.us-east-1.aws.redhat.com podman[28814]: 2024-07-27 12:32:08.236058403 -0400 EDT m=+10.243135780 pod remove 51e54c7f30426f36d3559784a8c8bbc4b3f498176e5f23a0774597e750d3f6ae (image=, name=httpd3) Jul 27 12:32:08 ip-10-31-14-141.us-east-1.aws.redhat.com podman[28814]: Pods stopped: Jul 27 12:32:08 ip-10-31-14-141.us-east-1.aws.redhat.com podman[28814]: 51e54c7f30426f36d3559784a8c8bbc4b3f498176e5f23a0774597e750d3f6ae Jul 27 12:32:08 ip-10-31-14-141.us-east-1.aws.redhat.com podman[28814]: Pods removed: Jul 27 12:32:08 ip-10-31-14-141.us-east-1.aws.redhat.com podman[28814]: Error: removing pod 51e54c7f30426f36d3559784a8c8bbc4b3f498176e5f23a0774597e750d3f6ae cgroup: removing pod 51e54c7f30426f36d3559784a8c8bbc4b3f498176e5f23a0774597e750d3f6ae cgroup: Unit machine-libpod_pod_51e54c7f30426f36d3559784a8c8bbc4b3f498176e5f23a0774597e750d3f6ae.slice not loaded. Jul 27 12:32:08 ip-10-31-14-141.us-east-1.aws.redhat.com podman[28814]: Secrets removed: Jul 27 12:32:08 ip-10-31-14-141.us-east-1.aws.redhat.com podman[28814]: Error: %!s() Jul 27 12:32:08 ip-10-31-14-141.us-east-1.aws.redhat.com podman[28814]: Volumes removed: Jul 27 12:32:08 ip-10-31-14-141.us-east-1.aws.redhat.com podman[28814]: 2024-07-27 12:32:08.264562524 -0400 EDT m=+10.271640111 container create 3d4d558c0f2fa33094c0bd8d5e83414ebd5d14339c39e2ae713129f7e70714f8 (image=localhost/podman-pause:5.1.2-1720656000, name=b46e3d5b107f-service, PODMAN_SYSTEMD_UNIT=podman-kube@-etc-containers-ansible\x2dkubernetes.d-httpd3.yml.service) Jul 27 12:32:08 ip-10-31-14-141.us-east-1.aws.redhat.com systemd[1]: Created slice machine-libpod_pod_d934b8f7de34bc54d46a16a6adb769f71e105fd9b95764f463833eec9b2c48c6.slice - cgroup machine-libpod_pod_d934b8f7de34bc54d46a16a6adb769f71e105fd9b95764f463833eec9b2c48c6.slice. 
░░ Subject: A start job for unit machine-libpod_pod_d934b8f7de34bc54d46a16a6adb769f71e105fd9b95764f463833eec9b2c48c6.slice has finished successfully ░░ Defined-By: systemd ░░ Support: https://access.redhat.com/support ░░ ░░ A start job for unit machine-libpod_pod_d934b8f7de34bc54d46a16a6adb769f71e105fd9b95764f463833eec9b2c48c6.slice has finished successfully. ░░ ░░ The job identifier is 2845. Jul 27 12:32:08 ip-10-31-14-141.us-east-1.aws.redhat.com podman[28814]: 2024-07-27 12:32:08.360077721 -0400 EDT m=+10.367155344 container create b752438f066ce9faf605c7486f1f6ff867a81d90ea1198b0c44ff91e2c7d59d1 (image=localhost/podman-pause:5.1.2-1720656000, name=d934b8f7de34-infra, pod_id=d934b8f7de34bc54d46a16a6adb769f71e105fd9b95764f463833eec9b2c48c6, PODMAN_SYSTEMD_UNIT=podman-kube@-etc-containers-ansible\x2dkubernetes.d-httpd3.yml.service, io.buildah.version=1.36.0) Jul 27 12:32:08 ip-10-31-14-141.us-east-1.aws.redhat.com podman[28814]: 2024-07-27 12:32:08.369627659 -0400 EDT m=+10.376705026 pod create d934b8f7de34bc54d46a16a6adb769f71e105fd9b95764f463833eec9b2c48c6 (image=, name=httpd3) Jul 27 12:32:08 ip-10-31-14-141.us-east-1.aws.redhat.com podman[28814]: 2024-07-27 12:32:08.372527657 -0400 EDT m=+10.379605403 image pull 9f9ec7f2fdef9168f74e9d057f307955db14d782cff22ded51d277d74798cb2f quay.io/libpod/testimage:20210610 Jul 27 12:32:08 ip-10-31-14-141.us-east-1.aws.redhat.com podman[28814]: 2024-07-27 12:32:08.4054002 -0400 EDT m=+10.412477575 container create cf272fb64f907d0bdeb087821bcab3f874e448e85b816fb7d9962878c5c5b1f8 (image=quay.io/libpod/testimage:20210610, name=httpd3-httpd3, pod_id=d934b8f7de34bc54d46a16a6adb769f71e105fd9b95764f463833eec9b2c48c6, created_by=test/system/build-testimage, io.buildah.version=1.21.0, io.containers.autoupdate=registry, app=test, PODMAN_SYSTEMD_UNIT=podman-kube@-etc-containers-ansible\x2dkubernetes.d-httpd3.yml.service, created_at=2021-06-10T18:55:36Z) Jul 27 12:32:08 ip-10-31-14-141.us-east-1.aws.redhat.com podman[28814]: 2024-07-27 
12:32:08.405809332 -0400 EDT m=+10.412886731 container restart 3d4d558c0f2fa33094c0bd8d5e83414ebd5d14339c39e2ae713129f7e70714f8 (image=localhost/podman-pause:5.1.2-1720656000, name=b46e3d5b107f-service, PODMAN_SYSTEMD_UNIT=podman-kube@-etc-containers-ansible\x2dkubernetes.d-httpd3.yml.service) Jul 27 12:32:08 ip-10-31-14-141.us-east-1.aws.redhat.com systemd[1]: Started libpod-3d4d558c0f2fa33094c0bd8d5e83414ebd5d14339c39e2ae713129f7e70714f8.scope - libcrun container. ░░ Subject: A start job for unit libpod-3d4d558c0f2fa33094c0bd8d5e83414ebd5d14339c39e2ae713129f7e70714f8.scope has finished successfully ░░ Defined-By: systemd ░░ Support: https://access.redhat.com/support ░░ ░░ A start job for unit libpod-3d4d558c0f2fa33094c0bd8d5e83414ebd5d14339c39e2ae713129f7e70714f8.scope has finished successfully. ░░ ░░ The job identifier is 2851. Jul 27 12:32:08 ip-10-31-14-141.us-east-1.aws.redhat.com podman[28814]: 2024-07-27 12:32:08.459755476 -0400 EDT m=+10.466833073 container init 3d4d558c0f2fa33094c0bd8d5e83414ebd5d14339c39e2ae713129f7e70714f8 (image=localhost/podman-pause:5.1.2-1720656000, name=b46e3d5b107f-service, PODMAN_SYSTEMD_UNIT=podman-kube@-etc-containers-ansible\x2dkubernetes.d-httpd3.yml.service) Jul 27 12:32:08 ip-10-31-14-141.us-east-1.aws.redhat.com podman[28814]: 2024-07-27 12:32:08.462820362 -0400 EDT m=+10.469897916 container start 3d4d558c0f2fa33094c0bd8d5e83414ebd5d14339c39e2ae713129f7e70714f8 (image=localhost/podman-pause:5.1.2-1720656000, name=b46e3d5b107f-service, PODMAN_SYSTEMD_UNIT=podman-kube@-etc-containers-ansible\x2dkubernetes.d-httpd3.yml.service) Jul 27 12:32:08 ip-10-31-14-141.us-east-1.aws.redhat.com kernel: podman1: port 2(veth1) entered blocking state Jul 27 12:32:08 ip-10-31-14-141.us-east-1.aws.redhat.com kernel: podman1: port 2(veth1) entered disabled state Jul 27 12:32:08 ip-10-31-14-141.us-east-1.aws.redhat.com kernel: veth1: entered allmulticast mode Jul 27 12:32:08 ip-10-31-14-141.us-east-1.aws.redhat.com kernel: veth1: entered 
promiscuous mode Jul 27 12:32:08 ip-10-31-14-141.us-east-1.aws.redhat.com kernel: podman1: port 2(veth1) entered blocking state Jul 27 12:32:08 ip-10-31-14-141.us-east-1.aws.redhat.com kernel: podman1: port 2(veth1) entered forwarding state Jul 27 12:32:08 ip-10-31-14-141.us-east-1.aws.redhat.com NetworkManager[678]: [1722097928.4928] device (veth1): carrier: link connected Jul 27 12:32:08 ip-10-31-14-141.us-east-1.aws.redhat.com NetworkManager[678]: [1722097928.4932] manager: (veth1): new Veth device (/org/freedesktop/NetworkManager/Devices/8) Jul 27 12:32:08 ip-10-31-14-141.us-east-1.aws.redhat.com (udev-worker)[28859]: Network interface NamePolicy= disabled on kernel command line. Jul 27 12:32:08 ip-10-31-14-141.us-east-1.aws.redhat.com systemd[1]: Started libpod-b752438f066ce9faf605c7486f1f6ff867a81d90ea1198b0c44ff91e2c7d59d1.scope - libcrun container. ░░ Subject: A start job for unit libpod-b752438f066ce9faf605c7486f1f6ff867a81d90ea1198b0c44ff91e2c7d59d1.scope has finished successfully ░░ Defined-By: systemd ░░ Support: https://access.redhat.com/support ░░ ░░ A start job for unit libpod-b752438f066ce9faf605c7486f1f6ff867a81d90ea1198b0c44ff91e2c7d59d1.scope has finished successfully. ░░ ░░ The job identifier is 2858. 
Jul 27 12:32:08 ip-10-31-14-141.us-east-1.aws.redhat.com podman[28814]: 2024-07-27 12:32:08.606572131 -0400 EDT m=+10.613649725 container init b752438f066ce9faf605c7486f1f6ff867a81d90ea1198b0c44ff91e2c7d59d1 (image=localhost/podman-pause:5.1.2-1720656000, name=d934b8f7de34-infra, pod_id=d934b8f7de34bc54d46a16a6adb769f71e105fd9b95764f463833eec9b2c48c6, PODMAN_SYSTEMD_UNIT=podman-kube@-etc-containers-ansible\x2dkubernetes.d-httpd3.yml.service, io.buildah.version=1.36.0) Jul 27 12:32:08 ip-10-31-14-141.us-east-1.aws.redhat.com podman[28814]: 2024-07-27 12:32:08.609511076 -0400 EDT m=+10.616588544 container start b752438f066ce9faf605c7486f1f6ff867a81d90ea1198b0c44ff91e2c7d59d1 (image=localhost/podman-pause:5.1.2-1720656000, name=d934b8f7de34-infra, pod_id=d934b8f7de34bc54d46a16a6adb769f71e105fd9b95764f463833eec9b2c48c6, io.buildah.version=1.36.0, PODMAN_SYSTEMD_UNIT=podman-kube@-etc-containers-ansible\x2dkubernetes.d-httpd3.yml.service) Jul 27 12:32:08 ip-10-31-14-141.us-east-1.aws.redhat.com systemd[1]: Started libpod-cf272fb64f907d0bdeb087821bcab3f874e448e85b816fb7d9962878c5c5b1f8.scope - libcrun container. ░░ Subject: A start job for unit libpod-cf272fb64f907d0bdeb087821bcab3f874e448e85b816fb7d9962878c5c5b1f8.scope has finished successfully ░░ Defined-By: systemd ░░ Support: https://access.redhat.com/support ░░ ░░ A start job for unit libpod-cf272fb64f907d0bdeb087821bcab3f874e448e85b816fb7d9962878c5c5b1f8.scope has finished successfully. ░░ ░░ The job identifier is 2865. 
Jul 27 12:32:08 ip-10-31-14-141.us-east-1.aws.redhat.com podman[28814]: 2024-07-27 12:32:08.665829681 -0400 EDT m=+10.672907207 container init cf272fb64f907d0bdeb087821bcab3f874e448e85b816fb7d9962878c5c5b1f8 (image=quay.io/libpod/testimage:20210610, name=httpd3-httpd3, pod_id=d934b8f7de34bc54d46a16a6adb769f71e105fd9b95764f463833eec9b2c48c6, PODMAN_SYSTEMD_UNIT=podman-kube@-etc-containers-ansible\x2dkubernetes.d-httpd3.yml.service, app=test, created_at=2021-06-10T18:55:36Z, created_by=test/system/build-testimage, io.buildah.version=1.21.0, io.containers.autoupdate=registry) Jul 27 12:32:08 ip-10-31-14-141.us-east-1.aws.redhat.com podman[28814]: 2024-07-27 12:32:08.668895792 -0400 EDT m=+10.675973313 container start cf272fb64f907d0bdeb087821bcab3f874e448e85b816fb7d9962878c5c5b1f8 (image=quay.io/libpod/testimage:20210610, name=httpd3-httpd3, pod_id=d934b8f7de34bc54d46a16a6adb769f71e105fd9b95764f463833eec9b2c48c6, created_by=test/system/build-testimage, io.buildah.version=1.21.0, io.containers.autoupdate=registry, PODMAN_SYSTEMD_UNIT=podman-kube@-etc-containers-ansible\x2dkubernetes.d-httpd3.yml.service, app=test, created_at=2021-06-10T18:55:36Z) Jul 27 12:32:08 ip-10-31-14-141.us-east-1.aws.redhat.com podman[28814]: 2024-07-27 12:32:08.674987063 -0400 EDT m=+10.682064538 pod start d934b8f7de34bc54d46a16a6adb769f71e105fd9b95764f463833eec9b2c48c6 (image=, name=httpd3) Jul 27 12:32:08 ip-10-31-14-141.us-east-1.aws.redhat.com podman[28814]: Pod: Jul 27 12:32:08 ip-10-31-14-141.us-east-1.aws.redhat.com podman[28814]: d934b8f7de34bc54d46a16a6adb769f71e105fd9b95764f463833eec9b2c48c6 Jul 27 12:32:08 ip-10-31-14-141.us-east-1.aws.redhat.com podman[28814]: Container: Jul 27 12:32:08 ip-10-31-14-141.us-east-1.aws.redhat.com podman[28814]: cf272fb64f907d0bdeb087821bcab3f874e448e85b816fb7d9962878c5c5b1f8 Jul 27 12:32:08 ip-10-31-14-141.us-east-1.aws.redhat.com systemd[1]: Started podman-kube@-etc-containers-ansible\x2dkubernetes.d-httpd3.yml.service - A template for running K8s 
workloads via podman-kube-play. ░░ Subject: A start job for unit podman-kube@-etc-containers-ansible\x2dkubernetes.d-httpd3.yml.service has finished successfully ░░ Defined-By: systemd ░░ Support: https://access.redhat.com/support ░░ ░░ A start job for unit podman-kube@-etc-containers-ansible\x2dkubernetes.d-httpd3.yml.service has finished successfully. ░░ ░░ The job identifier is 2757. Jul 27 12:32:09 ip-10-31-14-141.us-east-1.aws.redhat.com sudo[29040]: root : TTY=pts/0 ; PWD=/root ; USER=podman_basic_user ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-blszdxwsybxzezcwpimtgciwezqmqtlh ; /usr/bin/python3.12 /var/tmp/ansible-tmp-1722097929.1266143-8424-270429973512865/AnsiballZ_command.py' Jul 27 12:32:09 ip-10-31-14-141.us-east-1.aws.redhat.com sudo[29040]: pam_systemd(sudo:session): New sd-bus connection (system-bus-pam-systemd-29040) opened. Jul 27 12:32:09 ip-10-31-14-141.us-east-1.aws.redhat.com sudo[29040]: pam_unix(sudo:session): session opened for user podman_basic_user(uid=3001) by root(uid=0) Jul 27 12:32:09 ip-10-31-14-141.us-east-1.aws.redhat.com python3.12[29043]: ansible-ansible.legacy.command Invoked with _raw_params=podman pod inspect httpd1 --format '{{.State}}' _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None Jul 27 12:32:09 ip-10-31-14-141.us-east-1.aws.redhat.com systemd[23350]: Started podman-29051.scope. ░░ Subject: A start job for unit UNIT has finished successfully ░░ Defined-By: systemd ░░ Support: https://access.redhat.com/support ░░ ░░ A start job for unit UNIT has finished successfully. ░░ ░░ The job identifier is 116. 
Jul 27 12:32:09 ip-10-31-14-141.us-east-1.aws.redhat.com sudo[29040]: pam_unix(sudo:session): session closed for user podman_basic_user Jul 27 12:32:09 ip-10-31-14-141.us-east-1.aws.redhat.com python3.12[29173]: ansible-ansible.legacy.command Invoked with _raw_params=podman pod inspect httpd2 --format '{{.State}}' _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None Jul 27 12:32:10 ip-10-31-14-141.us-east-1.aws.redhat.com python3.12[29293]: ansible-ansible.legacy.command Invoked with _raw_params=podman pod inspect httpd3 --format '{{.State}}' _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None Jul 27 12:32:10 ip-10-31-14-141.us-east-1.aws.redhat.com sudo[29450]: root : TTY=pts/0 ; PWD=/root ; USER=podman_basic_user ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xzxaxwicyvauzwcykyrjazqeadtmffvy ; XDG_RUNTIME_DIR=/run/user/3001 /usr/bin/python3.12 /var/tmp/ansible-tmp-1722097930.562536-8450-185007968080719/AnsiballZ_command.py' Jul 27 12:32:10 ip-10-31-14-141.us-east-1.aws.redhat.com sudo[29450]: pam_systemd(sudo:session): New sd-bus connection (system-bus-pam-systemd-29450) opened. 
Jul 27 12:32:10 ip-10-31-14-141.us-east-1.aws.redhat.com sudo[29450]: pam_unix(sudo:session): session opened for user podman_basic_user(uid=3001) by root(uid=0)
Jul 27 12:32:10 ip-10-31-14-141.us-east-1.aws.redhat.com python3.12[29453]: ansible-ansible.legacy.command Invoked with _raw_params=set -euo pipefail systemctl --user list-units -a -l --plain | grep -E '^[ ]*podman-kube@.+-httpd1[.]yml[.]service[ ]+loaded[ ]+active ' _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Jul 27 12:32:10 ip-10-31-14-141.us-east-1.aws.redhat.com sudo[29450]: pam_unix(sudo:session): session closed for user podman_basic_user
Jul 27 12:32:11 ip-10-31-14-141.us-east-1.aws.redhat.com python3.12[29569]: ansible-ansible.legacy.command Invoked with _raw_params=set -euo pipefail systemctl --system list-units -a -l --plain | grep -E '^[ ]*podman-kube@.+-httpd2[.]yml[.]service[ ]+loaded[ ]+active ' _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Jul 27 12:32:11 ip-10-31-14-141.us-east-1.aws.redhat.com python3.12[29685]: ansible-ansible.legacy.command Invoked with _raw_params=set -euo pipefail systemctl --system list-units -a -l --plain | grep -E '^[ ]*podman-kube@.+-httpd3[.]yml[.]service[ ]+loaded[ ]+active ' _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Jul 27 12:32:12 ip-10-31-14-141.us-east-1.aws.redhat.com python3.12[29801]: ansible-ansible.legacy.uri Invoked with url=http://localhost:15001/index.txt return_content=True force=False http_agent=ansible-httpget use_proxy=True validate_certs=True force_basic_auth=False use_gssapi=False body_format=raw method=GET follow_redirects=safe status_code=[200] timeout=30 headers={} remote_src=False unredirected_headers=[] decompress=True use_netrc=True unsafe_writes=False url_username=None url_password=NOT_LOGGING_PARAMETER client_cert=None client_key=None dest=None body=None src=None creates=None removes=None unix_socket=None ca_path=None ciphers=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jul 27 12:32:12 ip-10-31-14-141.us-east-1.aws.redhat.com python3.12[29915]: ansible-ansible.legacy.uri Invoked with url=http://localhost:15002/index.txt return_content=True force=False http_agent=ansible-httpget use_proxy=True validate_certs=True force_basic_auth=False use_gssapi=False body_format=raw method=GET follow_redirects=safe status_code=[200] timeout=30 headers={} remote_src=False unredirected_headers=[] decompress=True use_netrc=True unsafe_writes=False url_username=None url_password=NOT_LOGGING_PARAMETER client_cert=None client_key=None dest=None body=None src=None creates=None removes=None unix_socket=None ca_path=None ciphers=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jul 27 12:32:13 ip-10-31-14-141.us-east-1.aws.redhat.com python3.12[30029]: ansible-ansible.legacy.command Invoked with _raw_params=ls -alrtF /tmp/lsr_ykmfaaw1_podman/httpd1-create _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Jul 27 12:32:13 ip-10-31-14-141.us-east-1.aws.redhat.com python3.12[30143]: ansible-ansible.legacy.command Invoked with _raw_params=ls -alrtF /tmp/lsr_ykmfaaw1_podman/httpd2-create _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Jul 27 12:32:14 ip-10-31-14-141.us-east-1.aws.redhat.com python3.12[30257]: ansible-ansible.legacy.command Invoked with _raw_params=ls -alrtF /tmp/lsr_ykmfaaw1_podman/httpd3-create _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Jul 27 12:32:16 ip-10-31-14-141.us-east-1.aws.redhat.com python3.12[30484]: ansible-ansible.legacy.command Invoked with _raw_params=podman --version _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Jul 27 12:32:16 ip-10-31-14-141.us-east-1.aws.redhat.com python3.12[30603]: ansible-getent Invoked with database=group key=0 fail_key=False service=None split=None
Jul 27 12:32:17 ip-10-31-14-141.us-east-1.aws.redhat.com python3.12[30717]: ansible-stat Invoked with path=/usr/bin/getsubids follow=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Jul 27 12:32:19 ip-10-31-14-141.us-east-1.aws.redhat.com python3.12[30832]: ansible-ansible.legacy.dnf Invoked with name=['firewalld'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_repoquery=True install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Jul 27 12:32:21 ip-10-31-14-141.us-east-1.aws.redhat.com python3.12[30946]: ansible-systemd Invoked with name=firewalld masked=False daemon_reload=False daemon_reexec=False scope=system no_block=False state=None enabled=None force=None
Jul 27 12:32:21 ip-10-31-14-141.us-east-1.aws.redhat.com python3.12[31061]: ansible-ansible.legacy.systemd Invoked with name=firewalld state=started enabled=True daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Jul 27 12:32:22 ip-10-31-14-141.us-east-1.aws.redhat.com python3.12[31176]: ansible-fedora.linux_system_roles.firewall_lib Invoked with port=['15001-15003/tcp'] permanent=True runtime=True state=enabled __report_changed=True service=[] source_port=[] forward_port=[] rich_rule=[] source=[] interface=[] interface_pci_id=[] icmp_block=[] timeout=0 ipset_entries=[] protocol=[] helper_module=[] destination=[] firewalld_conf=None masquerade=None icmp_block_inversion=None target=None zone=None set_default_zone=None ipset=None ipset_type=None description=None short=None
Jul 27 12:32:24 ip-10-31-14-141.us-east-1.aws.redhat.com python3.12[31289]: ansible-ansible.legacy.dnf Invoked with name=['python3-libselinux', 'python3-policycoreutils'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_repoquery=True install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Jul 27 12:32:24 ip-10-31-14-141.us-east-1.aws.redhat.com python3.12[31403]: ansible-ansible.legacy.dnf Invoked with name=['policycoreutils-python-utils'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_repoquery=True install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Jul 27 12:32:25 ip-10-31-14-141.us-east-1.aws.redhat.com python3.12[31517]: ansible-setup Invoked with filter=['ansible_selinux'] gather_subset=['all'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Jul 27 12:32:27 ip-10-31-14-141.us-east-1.aws.redhat.com python3.12[31672]: ansible-fedora.linux_system_roles.local_seport Invoked with ports=['15001-15003'] proto=tcp setype=http_port_t state=present local=False ignore_selinux_state=False reload=True
Jul 27 12:32:27 ip-10-31-14-141.us-east-1.aws.redhat.com python3.12[31785]: ansible-fedora.linux_system_roles.selinux_modules_facts Invoked
Jul 27 12:32:32 ip-10-31-14-141.us-east-1.aws.redhat.com python3.12[31898]: ansible-getent Invoked with database=passwd key=podman_basic_user fail_key=False service=None split=None
Jul 27 12:32:33 ip-10-31-14-141.us-east-1.aws.redhat.com python3.12[32012]: ansible-getent Invoked with database=group key=3001 fail_key=False service=None split=None
Jul 27 12:32:33 ip-10-31-14-141.us-east-1.aws.redhat.com python3.12[32126]: ansible-stat Invoked with path=/usr/bin/getsubids follow=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Jul 27 12:32:34 ip-10-31-14-141.us-east-1.aws.redhat.com python3.12[32241]: ansible-ansible.legacy.command Invoked with _raw_params=getsubids podman_basic_user _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Jul 27 12:32:34 ip-10-31-14-141.us-east-1.aws.redhat.com python3.12[32355]: ansible-ansible.legacy.command Invoked with _raw_params=getsubids -g podman_basic_user _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Jul 27 12:32:35 ip-10-31-14-141.us-east-1.aws.redhat.com python3.12[32469]: ansible-ansible.legacy.command Invoked with _raw_params=systemd-escape --template podman-kube@.service /home/podman_basic_user/.config/containers/ansible-kubernetes.d/httpd1.yml _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Jul 27 12:32:36 ip-10-31-14-141.us-east-1.aws.redhat.com python3.12[32583]: ansible-ansible.legacy.command Invoked with creates=/var/lib/systemd/linger/podman_basic_user _raw_params=loginctl enable-linger podman_basic_user _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None removes=None stdin=None
Jul 27 12:32:36 ip-10-31-14-141.us-east-1.aws.redhat.com python3.12[32696]: ansible-file Invoked with path=/tmp/lsr_ykmfaaw1_podman/httpd1 state=directory owner=podman_basic_user group=3001 recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None seuser=None serole=None selevel=None setype=None attributes=None
Jul 27 12:32:37 ip-10-31-14-141.us-east-1.aws.redhat.com python3.12[32809]: ansible-file Invoked with path=/tmp/lsr_ykmfaaw1_podman/httpd1-create state=directory owner=podman_basic_user group=3001 recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None seuser=None serole=None selevel=None setype=None attributes=None
Jul 27 12:32:37 ip-10-31-14-141.us-east-1.aws.redhat.com sudo[32958]: root : TTY=pts/0 ; PWD=/root ; USER=podman_basic_user ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-cgkkrfautrsiyjmlykataepqqcsrmkxd ; XDG_RUNTIME_DIR=/run/user/3001 /usr/bin/python3.12 /var/tmp/ansible-tmp-1722097957.434248-8806-260387462283130/AnsiballZ_podman_image.py'
Jul 27 12:32:37 ip-10-31-14-141.us-east-1.aws.redhat.com sudo[32958]: pam_systemd(sudo:session): New sd-bus connection (system-bus-pam-systemd-32958) opened.
Jul 27 12:32:37 ip-10-31-14-141.us-east-1.aws.redhat.com sudo[32958]: pam_unix(sudo:session): session opened for user podman_basic_user(uid=3001) by root(uid=0)
Jul 27 12:32:37 ip-10-31-14-141.us-east-1.aws.redhat.com systemd[23350]: Started podman-32962.scope.
░░ Subject: A start job for unit UNIT has finished successfully
░░ Defined-By: systemd
░░ Support: https://access.redhat.com/support
░░
░░ A start job for unit UNIT has finished successfully.
░░
░░ The job identifier is 120.
Jul 27 12:32:37 ip-10-31-14-141.us-east-1.aws.redhat.com systemd[23350]: Started podman-32969.scope.
░░ Subject: A start job for unit UNIT has finished successfully
░░ Defined-By: systemd
░░ Support: https://access.redhat.com/support
░░
░░ A start job for unit UNIT has finished successfully.
░░
░░ The job identifier is 124.
Jul 27 12:32:38 ip-10-31-14-141.us-east-1.aws.redhat.com systemd[23350]: Started podman-32977.scope.
░░ Subject: A start job for unit UNIT has finished successfully
░░ Defined-By: systemd
░░ Support: https://access.redhat.com/support
░░
░░ A start job for unit UNIT has finished successfully.
░░
░░ The job identifier is 128.
Jul 27 12:32:38 ip-10-31-14-141.us-east-1.aws.redhat.com systemd[23350]: Started podman-32984.scope.
░░ Subject: A start job for unit UNIT has finished successfully
░░ Defined-By: systemd
░░ Support: https://access.redhat.com/support
░░
░░ A start job for unit UNIT has finished successfully.
░░
░░ The job identifier is 132.
Jul 27 12:32:38 ip-10-31-14-141.us-east-1.aws.redhat.com sudo[32958]: pam_unix(sudo:session): session closed for user podman_basic_user
Jul 27 12:32:39 ip-10-31-14-141.us-east-1.aws.redhat.com python3.12[33103]: ansible-stat Invoked with path=/home/podman_basic_user/.config/containers/ansible-kubernetes.d/httpd1.yml follow=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Jul 27 12:32:39 ip-10-31-14-141.us-east-1.aws.redhat.com python3.12[33218]: ansible-file Invoked with path=/home/podman_basic_user/.config/containers/ansible-kubernetes.d state=directory owner=podman_basic_user group=3001 mode=0755 recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jul 27 12:32:39 ip-10-31-14-141.us-east-1.aws.redhat.com python3.12[33331]: ansible-ansible.legacy.stat Invoked with path=/home/podman_basic_user/.config/containers/ansible-kubernetes.d/httpd1.yml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Jul 27 12:32:40 ip-10-31-14-141.us-east-1.aws.redhat.com python3.12[33388]: ansible-ansible.legacy.file Invoked with owner=podman_basic_user group=3001 mode=0644 dest=/home/podman_basic_user/.config/containers/ansible-kubernetes.d/httpd1.yml _original_basename=.v5zrrp0z recurse=False state=file path=/home/podman_basic_user/.config/containers/ansible-kubernetes.d/httpd1.yml force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jul 27 12:32:40 ip-10-31-14-141.us-east-1.aws.redhat.com sudo[33537]: root : TTY=pts/0 ; PWD=/root ; USER=podman_basic_user ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ivcffrsxtvjebfoawbvariijrhndwqow ; XDG_RUNTIME_DIR=/run/user/3001 /usr/bin/python3.12 /var/tmp/ansible-tmp-1722097960.3111765-8844-180340567266965/AnsiballZ_podman_play.py'
Jul 27 12:32:40 ip-10-31-14-141.us-east-1.aws.redhat.com sudo[33537]: pam_systemd(sudo:session): New sd-bus connection (system-bus-pam-systemd-33537) opened.
Jul 27 12:32:40 ip-10-31-14-141.us-east-1.aws.redhat.com sudo[33537]: pam_unix(sudo:session): session opened for user podman_basic_user(uid=3001) by root(uid=0)
Jul 27 12:32:40 ip-10-31-14-141.us-east-1.aws.redhat.com python3.12[33540]: ansible-containers.podman.podman_play Invoked with state=started debug=True log_level=debug kube_file=/home/podman_basic_user/.config/containers/ansible-kubernetes.d/httpd1.yml executable=podman annotation=None authfile=None build=None cert_dir=None configmap=None context_dir=None seccomp_profile_root=None username=None password=NOT_LOGGING_PARAMETER log_driver=None log_opt=None network=None tls_verify=None quiet=None recreate=None userns=None quadlet_dir=None quadlet_filename=None quadlet_options=None
Jul 27 12:32:40 ip-10-31-14-141.us-east-1.aws.redhat.com systemd[23350]: Started podman-33547.scope.
░░ Subject: A start job for unit UNIT has finished successfully
░░ Defined-By: systemd
░░ Support: https://access.redhat.com/support
░░
░░ A start job for unit UNIT has finished successfully.
░░
░░ The job identifier is 136.
Jul 27 12:32:40 ip-10-31-14-141.us-east-1.aws.redhat.com systemd[23350]: Created slice user-libpod_pod_12b0ee8b5925a02a9e98f79e5783b87536d5cdb6f2ed4e8f46f928359df94879.slice - cgroup user-libpod_pod_12b0ee8b5925a02a9e98f79e5783b87536d5cdb6f2ed4e8f46f928359df94879.slice.
░░ Subject: A start job for unit UNIT has finished successfully
░░ Defined-By: systemd
░░ Support: https://access.redhat.com/support
░░
░░ A start job for unit UNIT has finished successfully.
░░
░░ The job identifier is 140.
Jul 27 12:32:40 ip-10-31-14-141.us-east-1.aws.redhat.com python3.12[33540]: ansible-containers.podman.podman_play PODMAN-PLAY-KUBE command: /usr/bin/podman play kube --start=true --log-level=debug /home/podman_basic_user/.config/containers/ansible-kubernetes.d/httpd1.yml
Jul 27 12:32:40 ip-10-31-14-141.us-east-1.aws.redhat.com python3.12[33540]: ansible-containers.podman.podman_play PODMAN-PLAY-KUBE stdout:
Jul 27 12:32:40 ip-10-31-14-141.us-east-1.aws.redhat.com python3.12[33540]: ansible-containers.podman.podman_play PODMAN-PLAY-KUBE stderr:
time="2024-07-27T12:32:40-04:00" level=info msg="/usr/bin/podman filtering at log level debug"
time="2024-07-27T12:32:40-04:00" level=debug msg="Called kube.PersistentPreRunE(/usr/bin/podman play kube --start=true --log-level=debug /home/podman_basic_user/.config/containers/ansible-kubernetes.d/httpd1.yml)"
time="2024-07-27T12:32:40-04:00" level=debug msg="Using conmon: \"/usr/bin/conmon\""
time="2024-07-27T12:32:40-04:00" level=info msg="Using sqlite as database backend"
time="2024-07-27T12:32:40-04:00" level=debug msg="systemd-logind: Unknown object '/'."
time="2024-07-27T12:32:40-04:00" level=debug msg="Using graph driver overlay"
time="2024-07-27T12:32:40-04:00" level=debug msg="Using graph root /home/podman_basic_user/.local/share/containers/storage"
time="2024-07-27T12:32:40-04:00" level=debug msg="Using run root /run/user/3001/containers"
time="2024-07-27T12:32:40-04:00" level=debug msg="Using static dir /home/podman_basic_user/.local/share/containers/storage/libpod"
time="2024-07-27T12:32:40-04:00" level=debug msg="Using tmp dir /run/user/3001/libpod/tmp"
time="2024-07-27T12:32:40-04:00" level=debug msg="Using volume path /home/podman_basic_user/.local/share/containers/storage/volumes"
time="2024-07-27T12:32:40-04:00" level=debug msg="Using transient store: false"
time="2024-07-27T12:32:40-04:00" level=debug msg="[graphdriver] trying provided driver \"overlay\""
time="2024-07-27T12:32:40-04:00" level=debug msg="Cached value indicated that overlay is supported"
time="2024-07-27T12:32:40-04:00" level=debug msg="Cached value indicated that overlay is supported"
time="2024-07-27T12:32:40-04:00" level=debug msg="Cached value indicated that metacopy is not being used"
time="2024-07-27T12:32:40-04:00" level=debug msg="Cached value indicated that native-diff is usable"
time="2024-07-27T12:32:40-04:00" level=debug msg="backingFs=xfs, projectQuotaSupported=false, useNativeDiff=true, usingMetacopy=false"
time="2024-07-27T12:32:40-04:00" level=debug msg="Initializing event backend file"
time="2024-07-27T12:32:40-04:00" level=debug msg="Configured OCI runtime krun initialization failed: no valid executable found for OCI runtime krun: invalid argument"
time="2024-07-27T12:32:40-04:00" level=debug msg="Configured OCI runtime ocijail initialization failed: no valid executable found for OCI runtime ocijail: invalid argument"
time="2024-07-27T12:32:40-04:00" level=debug msg="Configured OCI runtime crun-vm initialization failed: no valid executable found for OCI runtime crun-vm: invalid argument"
time="2024-07-27T12:32:40-04:00" level=debug msg="Configured OCI runtime runc initialization failed: no valid executable found for OCI runtime runc: invalid argument"
time="2024-07-27T12:32:40-04:00" level=debug msg="Configured OCI runtime runj initialization failed: no valid executable found for OCI runtime runj: invalid argument"
time="2024-07-27T12:32:40-04:00" level=debug msg="Configured OCI runtime kata initialization failed: no valid executable found for OCI runtime kata: invalid argument"
time="2024-07-27T12:32:40-04:00" level=debug msg="Configured OCI runtime runsc initialization failed: no valid executable found for OCI runtime runsc: invalid argument"
time="2024-07-27T12:32:40-04:00" level=debug msg="Configured OCI runtime crun-wasm initialization failed: no valid executable found for OCI runtime crun-wasm: invalid argument"
time="2024-07-27T12:32:40-04:00" level=debug msg="Configured OCI runtime youki initialization failed: no valid executable found for OCI runtime youki: invalid argument"
time="2024-07-27T12:32:40-04:00" level=debug msg="Using OCI runtime \"/usr/bin/crun\""
time="2024-07-27T12:32:40-04:00" level=info msg="Setting parallel job count to 7"
time="2024-07-27T12:32:40-04:00" level=debug msg="Successfully loaded network podman-default-kube-network: &{podman-default-kube-network f351134fdf5176cce5c68a14f62be5f152324834e113401839991ccbf6fbc72a bridge podman1 2024-07-27 12:31:13.375699494 -0400 EDT [{{{10.89.0.0 ffffff00}} 10.89.0.1 }] [] false false true [] map[] map[] map[driver:host-local]}"
time="2024-07-27T12:32:40-04:00" level=debug msg="Successfully loaded 2 networks"
time="2024-07-27T12:32:40-04:00" level=debug msg="Looking up image \"localhost/podman-pause:5.1.2-1720656000\" in local containers storage"
time="2024-07-27T12:32:40-04:00" level=debug msg="Normalized platform linux/amd64 to {amd64 linux [] }"
time="2024-07-27T12:32:40-04:00" level=debug msg="Trying \"localhost/podman-pause:5.1.2-1720656000\" ..."
time="2024-07-27T12:32:40-04:00" level=debug msg="parsed reference into \"[overlay@/home/podman_basic_user/.local/share/containers/storage+/run/user/3001/containers]@cd62bd3c2881107301605a8726f142eaa1c0465e93dd923b4030f0ac91a5d3c4\""
time="2024-07-27T12:32:40-04:00" level=debug msg="Found image \"localhost/podman-pause:5.1.2-1720656000\" as \"localhost/podman-pause:5.1.2-1720656000\" in local containers storage"
time="2024-07-27T12:32:40-04:00" level=debug msg="Found image \"localhost/podman-pause:5.1.2-1720656000\" as \"localhost/podman-pause:5.1.2-1720656000\" in local containers storage ([overlay@/home/podman_basic_user/.local/share/containers/storage+/run/user/3001/containers]@cd62bd3c2881107301605a8726f142eaa1c0465e93dd923b4030f0ac91a5d3c4)"
time="2024-07-27T12:32:40-04:00" level=debug msg="exporting opaque data as blob \"sha256:cd62bd3c2881107301605a8726f142eaa1c0465e93dd923b4030f0ac91a5d3c4\""
time="2024-07-27T12:32:40-04:00" level=debug msg="Pod using bridge network mode"
time="2024-07-27T12:32:40-04:00" level=debug msg="Created cgroup path user.slice/user-libpod_pod_12b0ee8b5925a02a9e98f79e5783b87536d5cdb6f2ed4e8f46f928359df94879.slice for parent user.slice and name libpod_pod_12b0ee8b5925a02a9e98f79e5783b87536d5cdb6f2ed4e8f46f928359df94879"
time="2024-07-27T12:32:40-04:00" level=debug msg="Created cgroup user.slice/user-libpod_pod_12b0ee8b5925a02a9e98f79e5783b87536d5cdb6f2ed4e8f46f928359df94879.slice"
time="2024-07-27T12:32:40-04:00" level=debug msg="Got pod cgroup as user.slice/user-3001.slice/user@3001.service/user.slice/user-libpod_pod_12b0ee8b5925a02a9e98f79e5783b87536d5cdb6f2ed4e8f46f928359df94879.slice"
Error: adding pod to state: name "httpd1" is in use: pod already exists
time="2024-07-27T12:32:40-04:00" level=debug msg="Shutting down engines"
Jul 27 12:32:40 ip-10-31-14-141.us-east-1.aws.redhat.com python3.12[33540]: ansible-containers.podman.podman_play PODMAN-PLAY-KUBE rc: 125
Jul 27 12:32:40 ip-10-31-14-141.us-east-1.aws.redhat.com sudo[33537]: pam_unix(sudo:session): session closed for user podman_basic_user
Jul 27 12:32:41 ip-10-31-14-141.us-east-1.aws.redhat.com python3.12[33666]: ansible-getent Invoked with database=passwd key=root fail_key=False service=None split=None
Jul 27 12:32:42 ip-10-31-14-141.us-east-1.aws.redhat.com python3.12[33780]: ansible-getent Invoked with database=group key=0 fail_key=False service=None split=None
Jul 27 12:32:42 ip-10-31-14-141.us-east-1.aws.redhat.com python3.12[33894]: ansible-stat Invoked with path=/usr/bin/getsubids follow=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Jul 27 12:32:43 ip-10-31-14-141.us-east-1.aws.redhat.com python3.12[34009]: ansible-ansible.legacy.command Invoked with _raw_params=systemd-escape --template podman-kube@.service /etc/containers/ansible-kubernetes.d/httpd2.yml _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Jul 27 12:32:44 ip-10-31-14-141.us-east-1.aws.redhat.com python3.12[34123]: ansible-file Invoked with path=/tmp/lsr_ykmfaaw1_podman/httpd2 state=directory owner=root group=root recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None seuser=None serole=None selevel=None setype=None attributes=None
Jul 27 12:32:45 ip-10-31-14-141.us-east-1.aws.redhat.com python3.12[34236]: ansible-file Invoked with path=/tmp/lsr_ykmfaaw1_podman/httpd2-create state=directory owner=root group=root recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None seuser=None serole=None selevel=None setype=None attributes=None
Jul 27 12:32:46 ip-10-31-14-141.us-east-1.aws.redhat.com podman[34366]: 2024-07-27 12:32:46.173987426 -0400 EDT m=+0.380705744 image pull 9f9ec7f2fdef9168f74e9d057f307955db14d782cff22ded51d277d74798cb2f quay.io/libpod/testimage:20210610
Jul 27 12:32:46 ip-10-31-14-141.us-east-1.aws.redhat.com python3.12[34493]: ansible-stat Invoked with path=/etc/containers/ansible-kubernetes.d/httpd2.yml follow=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Jul 27 12:32:47 ip-10-31-14-141.us-east-1.aws.redhat.com python3.12[34608]: ansible-file Invoked with path=/etc/containers/ansible-kubernetes.d state=directory owner=root group=0 mode=0755 recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jul 27 12:32:47 ip-10-31-14-141.us-east-1.aws.redhat.com python3.12[34721]: ansible-ansible.legacy.stat Invoked with path=/etc/containers/ansible-kubernetes.d/httpd2.yml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Jul 27 12:32:47 ip-10-31-14-141.us-east-1.aws.redhat.com python3.12[34778]: ansible-ansible.legacy.file Invoked with owner=root group=0 mode=0644 dest=/etc/containers/ansible-kubernetes.d/httpd2.yml _original_basename=.toywz0ue recurse=False state=file path=/etc/containers/ansible-kubernetes.d/httpd2.yml force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jul 27 12:32:48 ip-10-31-14-141.us-east-1.aws.redhat.com python3.12[34891]: ansible-containers.podman.podman_play Invoked with state=started debug=True log_level=debug kube_file=/etc/containers/ansible-kubernetes.d/httpd2.yml executable=podman annotation=None authfile=None build=None cert_dir=None configmap=None context_dir=None seccomp_profile_root=None username=None password=NOT_LOGGING_PARAMETER log_driver=None log_opt=None network=None tls_verify=None quiet=None recreate=None userns=None quadlet_dir=None quadlet_filename=None quadlet_options=None
Jul 27 12:32:48 ip-10-31-14-141.us-east-1.aws.redhat.com systemd[1]: Created slice machine-libpod_pod_b01d25e8cbcd8151db950a0699def71734e6d2551138c460a248a0ffcaaea3fd.slice - cgroup machine-libpod_pod_b01d25e8cbcd8151db950a0699def71734e6d2551138c460a248a0ffcaaea3fd.slice.
░░ Subject: A start job for unit machine-libpod_pod_b01d25e8cbcd8151db950a0699def71734e6d2551138c460a248a0ffcaaea3fd.slice has finished successfully
░░ Defined-By: systemd
░░ Support: https://access.redhat.com/support
░░
░░ A start job for unit machine-libpod_pod_b01d25e8cbcd8151db950a0699def71734e6d2551138c460a248a0ffcaaea3fd.slice has finished successfully.
░░
░░ The job identifier is 2872.
Jul 27 12:32:48 ip-10-31-14-141.us-east-1.aws.redhat.com python3.12[34891]: ansible-containers.podman.podman_play PODMAN-PLAY-KUBE command: /usr/bin/podman play kube --start=true --log-level=debug /etc/containers/ansible-kubernetes.d/httpd2.yml
Jul 27 12:32:48 ip-10-31-14-141.us-east-1.aws.redhat.com python3.12[34891]: ansible-containers.podman.podman_play PODMAN-PLAY-KUBE stdout:
Jul 27 12:32:48 ip-10-31-14-141.us-east-1.aws.redhat.com python3.12[34891]: ansible-containers.podman.podman_play PODMAN-PLAY-KUBE stderr:
time="2024-07-27T12:32:48-04:00" level=info msg="/usr/bin/podman filtering at log level debug"
time="2024-07-27T12:32:48-04:00" level=debug msg="Called kube.PersistentPreRunE(/usr/bin/podman play kube --start=true --log-level=debug /etc/containers/ansible-kubernetes.d/httpd2.yml)"
time="2024-07-27T12:32:48-04:00" level=debug msg="Using conmon: \"/usr/bin/conmon\""
time="2024-07-27T12:32:48-04:00" level=info msg="Using sqlite as database backend"
time="2024-07-27T12:32:48-04:00" level=debug msg="Using graph driver overlay"
time="2024-07-27T12:32:48-04:00" level=debug msg="Using graph root /var/lib/containers/storage"
time="2024-07-27T12:32:48-04:00" level=debug msg="Using run root /run/containers/storage"
time="2024-07-27T12:32:48-04:00" level=debug msg="Using static dir /var/lib/containers/storage/libpod"
time="2024-07-27T12:32:48-04:00" level=debug msg="Using tmp dir /run/libpod"
time="2024-07-27T12:32:48-04:00" level=debug msg="Using volume path /var/lib/containers/storage/volumes"
time="2024-07-27T12:32:48-04:00" level=debug msg="Using transient store: false"
time="2024-07-27T12:32:48-04:00" level=debug msg="[graphdriver] trying provided driver \"overlay\""
time="2024-07-27T12:32:48-04:00" level=debug msg="Cached value indicated that overlay is supported"
time="2024-07-27T12:32:48-04:00" level=debug msg="Cached value indicated that overlay is supported"
time="2024-07-27T12:32:48-04:00" level=debug msg="Cached value indicated that metacopy is being used"
time="2024-07-27T12:32:48-04:00" level=debug msg="Cached value indicated that native-diff is not being used"
time="2024-07-27T12:32:48-04:00" level=info msg="Not using native diff for overlay, this may cause degraded performance for building images: kernel has CONFIG_OVERLAY_FS_REDIRECT_DIR enabled"
time="2024-07-27T12:32:48-04:00" level=debug msg="backingFs=xfs, projectQuotaSupported=false, useNativeDiff=false, usingMetacopy=true"
time="2024-07-27T12:32:48-04:00" level=debug msg="Initializing event backend journald"
time="2024-07-27T12:32:48-04:00" level=debug msg="Configured OCI runtime runc initialization failed: no valid executable found for OCI runtime runc: invalid argument"
time="2024-07-27T12:32:48-04:00" level=debug msg="Configured OCI runtime runj initialization failed: no valid executable found for OCI runtime runj: invalid argument"
time="2024-07-27T12:32:48-04:00" level=debug msg="Configured OCI runtime kata initialization failed: no valid executable found for OCI runtime kata: invalid argument"
time="2024-07-27T12:32:48-04:00" level=debug msg="Configured OCI runtime runsc initialization failed: no valid executable found for OCI runtime runsc: invalid argument"
time="2024-07-27T12:32:48-04:00" level=debug msg="Configured OCI runtime youki initialization failed: no valid executable found for OCI runtime youki: invalid argument"
time="2024-07-27T12:32:48-04:00" level=debug msg="Configured OCI runtime krun initialization failed: no valid executable found for OCI runtime krun: invalid argument"
time="2024-07-27T12:32:48-04:00" level=debug msg="Configured OCI runtime ocijail initialization failed: no valid executable found for OCI runtime ocijail: invalid argument"
time="2024-07-27T12:32:48-04:00" level=debug msg="Configured OCI runtime crun-vm initialization failed: no valid executable found for OCI runtime crun-vm: invalid argument"
time="2024-07-27T12:32:48-04:00" level=debug msg="Configured OCI runtime crun-wasm initialization failed: no valid executable found for OCI runtime crun-wasm: invalid argument"
time="2024-07-27T12:32:48-04:00" level=debug msg="Using OCI runtime \"/usr/bin/crun\""
time="2024-07-27T12:32:48-04:00" level=info msg="Setting parallel job count to 7"
time="2024-07-27T12:32:48-04:00" level=debug msg="Successfully loaded network podman-default-kube-network: &{podman-default-kube-network 8f5134bcda5f516a62e074f65b89e1dc156f1d967eb9102e712a946d948ca3ab bridge podman1 2024-07-27 12:29:35.997811221 -0400 EDT [{{{10.89.0.0 ffffff00}} 10.89.0.1 }] [] false false true [] map[] map[] map[driver:host-local]}"
time="2024-07-27T12:32:48-04:00" level=debug msg="Successfully loaded 2 networks"
time="2024-07-27T12:32:48-04:00" level=debug msg="Looking up image \"localhost/podman-pause:5.1.2-1720656000\" in local containers storage"
time="2024-07-27T12:32:48-04:00" level=debug msg="Normalized platform linux/amd64 to {amd64 linux [] }"
time="2024-07-27T12:32:48-04:00" level=debug msg="Trying \"localhost/podman-pause:5.1.2-1720656000\" ..."
time="2024-07-27T12:32:48-04:00" level=debug msg="parsed reference into \"[overlay@/var/lib/containers/storage+/run/containers/storage:overlay.mountopt=nodev,metacopy=on]@dd68369a4c98aa462789d69687e5025b82f278ebe66afc63c07c145727b83b12\""
time="2024-07-27T12:32:48-04:00" level=debug msg="Found image \"localhost/podman-pause:5.1.2-1720656000\" as \"localhost/podman-pause:5.1.2-1720656000\" in local containers storage"
time="2024-07-27T12:32:48-04:00" level=debug msg="Found image \"localhost/podman-pause:5.1.2-1720656000\" as \"localhost/podman-pause:5.1.2-1720656000\" in local containers storage ([overlay@/var/lib/containers/storage+/run/containers/storage:overlay.mountopt=nodev,metacopy=on]@dd68369a4c98aa462789d69687e5025b82f278ebe66afc63c07c145727b83b12)"
time="2024-07-27T12:32:48-04:00" level=debug msg="exporting opaque data as blob \"sha256:dd68369a4c98aa462789d69687e5025b82f278ebe66afc63c07c145727b83b12\""
time="2024-07-27T12:32:48-04:00" level=debug msg="Pod using bridge network mode"
time="2024-07-27T12:32:48-04:00" level=debug msg="Created cgroup path machine.slice/machine-libpod_pod_b01d25e8cbcd8151db950a0699def71734e6d2551138c460a248a0ffcaaea3fd.slice for parent machine.slice and name libpod_pod_b01d25e8cbcd8151db950a0699def71734e6d2551138c460a248a0ffcaaea3fd"
time="2024-07-27T12:32:48-04:00" level=debug msg="Created cgroup machine.slice/machine-libpod_pod_b01d25e8cbcd8151db950a0699def71734e6d2551138c460a248a0ffcaaea3fd.slice"
time="2024-07-27T12:32:48-04:00" level=debug msg="Got pod cgroup as machine.slice/machine-libpod_pod_b01d25e8cbcd8151db950a0699def71734e6d2551138c460a248a0ffcaaea3fd.slice"
Error: adding pod to state: name "httpd2" is in use: pod already exists
time="2024-07-27T12:32:48-04:00" level=debug msg="Shutting down engines"
Jul 27 12:32:48 ip-10-31-14-141.us-east-1.aws.redhat.com python3.12[34891]: ansible-containers.podman.podman_play PODMAN-PLAY-KUBE rc: 125
Jul 27 12:32:49 ip-10-31-14-141.us-east-1.aws.redhat.com python3.12[35017]: ansible-getent Invoked with database=group key=0 fail_key=False service=None split=None
Jul 27 12:32:49 ip-10-31-14-141.us-east-1.aws.redhat.com python3.12[35131]: ansible-stat Invoked with path=/usr/bin/getsubids follow=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Jul 27 12:32:51 ip-10-31-14-141.us-east-1.aws.redhat.com python3.12[35246]: ansible-ansible.legacy.command Invoked with _raw_params=systemd-escape --template podman-kube@.service /etc/containers/ansible-kubernetes.d/httpd3.yml _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Jul 27 12:32:52 ip-10-31-14-141.us-east-1.aws.redhat.com python3.12[35360]: ansible-file Invoked with path=/tmp/lsr_ykmfaaw1_podman/httpd3 state=directory owner=root group=root recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None seuser=None serole=None selevel=None setype=None attributes=None
Jul 27 12:32:52 ip-10-31-14-141.us-east-1.aws.redhat.com python3.12[35473]: ansible-file Invoked with path=/tmp/lsr_ykmfaaw1_podman/httpd3-create state=directory owner=root group=root recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None seuser=None serole=None selevel=None setype=None attributes=None
Jul 27 12:32:53 ip-10-31-14-141.us-east-1.aws.redhat.com podman[35601]: 2024-07-27 12:32:53.288798562 -0400 EDT m=+0.340932956 image pull 9f9ec7f2fdef9168f74e9d057f307955db14d782cff22ded51d277d74798cb2f quay.io/libpod/testimage:20210610
Jul 27 12:32:53 ip-10-31-14-141.us-east-1.aws.redhat.com python3.12[35728]: ansible-stat Invoked with path=/etc/containers/ansible-kubernetes.d/httpd3.yml follow=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Jul 27 12:32:54 ip-10-31-14-141.us-east-1.aws.redhat.com python3.12[35843]: ansible-file Invoked with path=/etc/containers/ansible-kubernetes.d state=directory owner=root group=0 mode=0755 recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jul 27 12:32:54 ip-10-31-14-141.us-east-1.aws.redhat.com python3.12[35956]: ansible-ansible.legacy.stat Invoked with path=/etc/containers/ansible-kubernetes.d/httpd3.yml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Jul 27 12:32:54 ip-10-31-14-141.us-east-1.aws.redhat.com python3.12[36013]: ansible-ansible.legacy.file Invoked with owner=root group=0 mode=0644 dest=/etc/containers/ansible-kubernetes.d/httpd3.yml _original_basename=.y465hhuq recurse=False state=file path=/etc/containers/ansible-kubernetes.d/httpd3.yml force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jul 27 12:32:55 ip-10-31-14-141.us-east-1.aws.redhat.com python3.12[36126]: ansible-containers.podman.podman_play Invoked with state=started kube_file=/etc/containers/ansible-kubernetes.d/httpd3.yml executable=podman annotation=None authfile=None build=None cert_dir=None configmap=None context_dir=None seccomp_profile_root=None username=None password=NOT_LOGGING_PARAMETER log_driver=None log_opt=None network=None tls_verify=None debug=None quiet=None recreate=None userns=None log_level=None quadlet_dir=None
quadlet_filename=None quadlet_options=None Jul 27 12:32:55 ip-10-31-14-141.us-east-1.aws.redhat.com systemd[1]: Created slice machine-libpod_pod_9c708f57c195dcb81690a52eca5d59580019e3bb8e3f1da72c0406894d1e2199.slice - cgroup machine-libpod_pod_9c708f57c195dcb81690a52eca5d59580019e3bb8e3f1da72c0406894d1e2199.slice. ░░ Subject: A start job for unit machine-libpod_pod_9c708f57c195dcb81690a52eca5d59580019e3bb8e3f1da72c0406894d1e2199.slice has finished successfully ░░ Defined-By: systemd ░░ Support: https://access.redhat.com/support ░░ ░░ A start job for unit machine-libpod_pod_9c708f57c195dcb81690a52eca5d59580019e3bb8e3f1da72c0406894d1e2199.slice has finished successfully. ░░ ░░ The job identifier is 2878. Jul 27 12:32:56 ip-10-31-14-141.us-east-1.aws.redhat.com sudo[36289]: root : TTY=pts/0 ; PWD=/root ; USER=podman_basic_user ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-sfhzskanemupfvbtjfdfaojclkqsrsbl ; /usr/bin/python3.12 /var/tmp/ansible-tmp-1722097976.030087-9099-50490068000412/AnsiballZ_command.py' Jul 27 12:32:56 ip-10-31-14-141.us-east-1.aws.redhat.com sudo[36289]: pam_systemd(sudo:session): New sd-bus connection (system-bus-pam-systemd-36289) opened. Jul 27 12:32:56 ip-10-31-14-141.us-east-1.aws.redhat.com sudo[36289]: pam_unix(sudo:session): session opened for user podman_basic_user(uid=3001) by root(uid=0) Jul 27 12:32:56 ip-10-31-14-141.us-east-1.aws.redhat.com python3.12[36292]: ansible-ansible.legacy.command Invoked with _raw_params=podman pod inspect httpd1 --format '{{.State}}' _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None Jul 27 12:32:56 ip-10-31-14-141.us-east-1.aws.redhat.com systemd[23350]: Started podman-36300.scope. ░░ Subject: A start job for unit UNIT has finished successfully ░░ Defined-By: systemd ░░ Support: https://access.redhat.com/support ░░ ░░ A start job for unit UNIT has finished successfully. 
░░ ░░ The job identifier is 144. Jul 27 12:32:56 ip-10-31-14-141.us-east-1.aws.redhat.com sudo[36289]: pam_unix(sudo:session): session closed for user podman_basic_user Jul 27 12:32:56 ip-10-31-14-141.us-east-1.aws.redhat.com python3.12[36421]: ansible-ansible.legacy.command Invoked with _raw_params=podman pod inspect httpd2 --format '{{.State}}' _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None Jul 27 12:32:57 ip-10-31-14-141.us-east-1.aws.redhat.com python3.12[36542]: ansible-ansible.legacy.command Invoked with _raw_params=podman pod inspect httpd3 --format '{{.State}}' _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None Jul 27 12:32:57 ip-10-31-14-141.us-east-1.aws.redhat.com sudo[36699]: root : TTY=pts/0 ; PWD=/root ; USER=podman_basic_user ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wadmgcflojtyghnaykpgblpmewnyduex ; XDG_RUNTIME_DIR=/run/user/3001 /usr/bin/python3.12 /var/tmp/ansible-tmp-1722097977.4558182-9123-46239639450111/AnsiballZ_command.py' Jul 27 12:32:57 ip-10-31-14-141.us-east-1.aws.redhat.com sudo[36699]: pam_systemd(sudo:session): New sd-bus connection (system-bus-pam-systemd-36699) opened. 
Jul 27 12:32:57 ip-10-31-14-141.us-east-1.aws.redhat.com sudo[36699]: pam_unix(sudo:session): session opened for user podman_basic_user(uid=3001) by root(uid=0) Jul 27 12:32:57 ip-10-31-14-141.us-east-1.aws.redhat.com python3.12[36702]: ansible-ansible.legacy.command Invoked with _raw_params=set -euo pipefail systemctl --user list-units -a -l --plain | grep -E '^[ ]*podman-kube@.+-httpd1[.]yml[.]service[ ]+loaded[ ]+active ' _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None Jul 27 12:32:57 ip-10-31-14-141.us-east-1.aws.redhat.com sudo[36699]: pam_unix(sudo:session): session closed for user podman_basic_user Jul 27 12:32:58 ip-10-31-14-141.us-east-1.aws.redhat.com python3.12[36818]: ansible-ansible.legacy.command Invoked with _raw_params=set -euo pipefail systemctl --system list-units -a -l --plain | grep -E '^[ ]*podman-kube@.+-httpd2[.]yml[.]service[ ]+loaded[ ]+active ' _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None Jul 27 12:32:58 ip-10-31-14-141.us-east-1.aws.redhat.com python3.12[36934]: ansible-ansible.legacy.command Invoked with _raw_params=set -euo pipefail systemctl --system list-units -a -l --plain | grep -E '^[ ]*podman-kube@.+-httpd3[.]yml[.]service[ ]+loaded[ ]+active ' _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None Jul 27 12:32:59 ip-10-31-14-141.us-east-1.aws.redhat.com python3.12[37050]: ansible-ansible.legacy.uri Invoked with url=http://localhost:15001/index.txt return_content=True force=False http_agent=ansible-httpget use_proxy=True validate_certs=True force_basic_auth=False use_gssapi=False body_format=raw method=GET follow_redirects=safe status_code=[200] timeout=30 headers={} remote_src=False 
unredirected_headers=[] decompress=True use_netrc=True unsafe_writes=False url_username=None url_password=NOT_LOGGING_PARAMETER client_cert=None client_key=None dest=None body=None src=None creates=None removes=None unix_socket=None ca_path=None ciphers=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Jul 27 12:32:59 ip-10-31-14-141.us-east-1.aws.redhat.com python3.12[37164]: ansible-ansible.legacy.uri Invoked with url=http://localhost:15002/index.txt return_content=True force=False http_agent=ansible-httpget use_proxy=True validate_certs=True force_basic_auth=False use_gssapi=False body_format=raw method=GET follow_redirects=safe status_code=[200] timeout=30 headers={} remote_src=False unredirected_headers=[] decompress=True use_netrc=True unsafe_writes=False url_username=None url_password=NOT_LOGGING_PARAMETER client_cert=None client_key=None dest=None body=None src=None creates=None removes=None unix_socket=None ca_path=None ciphers=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Jul 27 12:33:00 ip-10-31-14-141.us-east-1.aws.redhat.com python3.12[37278]: ansible-ansible.legacy.uri Invoked with url=http://localhost:15003/index.txt return_content=True force=False http_agent=ansible-httpget use_proxy=True validate_certs=True force_basic_auth=False use_gssapi=False body_format=raw method=GET follow_redirects=safe status_code=[200] timeout=30 headers={} remote_src=False unredirected_headers=[] decompress=True use_netrc=True unsafe_writes=False url_username=None url_password=NOT_LOGGING_PARAMETER client_cert=None client_key=None dest=None body=None src=None creates=None removes=None unix_socket=None ca_path=None ciphers=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Jul 27 12:33:02 ip-10-31-14-141.us-east-1.aws.redhat.com python3.12[37505]: ansible-ansible.legacy.command Invoked with _raw_params=podman 
--version _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None Jul 27 12:33:03 ip-10-31-14-141.us-east-1.aws.redhat.com python3.12[37624]: ansible-getent Invoked with database=group key=0 fail_key=False service=None split=None Jul 27 12:33:03 ip-10-31-14-141.us-east-1.aws.redhat.com python3.12[37738]: ansible-stat Invoked with path=/usr/bin/getsubids follow=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1 Jul 27 12:33:06 ip-10-31-14-141.us-east-1.aws.redhat.com python3.12[37853]: ansible-getent Invoked with database=passwd key=podman_basic_user fail_key=False service=None split=None Jul 27 12:33:06 ip-10-31-14-141.us-east-1.aws.redhat.com python3.12[37967]: ansible-getent Invoked with database=group key=3001 fail_key=False service=None split=None Jul 27 12:33:07 ip-10-31-14-141.us-east-1.aws.redhat.com python3.12[38081]: ansible-stat Invoked with path=/usr/bin/getsubids follow=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1 Jul 27 12:33:07 ip-10-31-14-141.us-east-1.aws.redhat.com python3.12[38196]: ansible-ansible.legacy.command Invoked with _raw_params=getsubids podman_basic_user _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None Jul 27 12:33:08 ip-10-31-14-141.us-east-1.aws.redhat.com python3.12[38310]: ansible-ansible.legacy.command Invoked with _raw_params=getsubids -g podman_basic_user _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None Jul 27 12:33:09 ip-10-31-14-141.us-east-1.aws.redhat.com python3.12[38424]: ansible-ansible.legacy.command Invoked with _raw_params=systemd-escape --template podman-kube@.service 
/home/podman_basic_user/.config/containers/ansible-kubernetes.d/httpd1.yml _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None Jul 27 12:33:09 ip-10-31-14-141.us-east-1.aws.redhat.com python3.12[38538]: ansible-stat Invoked with path=/run/user/3001 follow=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1 Jul 27 12:33:10 ip-10-31-14-141.us-east-1.aws.redhat.com sudo[38689]: root : TTY=pts/0 ; PWD=/root ; USER=podman_basic_user ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ciqbrqkfjmgkkozkoeeszxgooyhhcuvb ; XDG_RUNTIME_DIR=/run/user/3001 /usr/bin/python3.12 /var/tmp/ansible-tmp-1722097989.9294028-9322-192888159902137/AnsiballZ_systemd.py' Jul 27 12:33:10 ip-10-31-14-141.us-east-1.aws.redhat.com sudo[38689]: pam_systemd(sudo:session): New sd-bus connection (system-bus-pam-systemd-38689) opened. Jul 27 12:33:10 ip-10-31-14-141.us-east-1.aws.redhat.com sudo[38689]: pam_unix(sudo:session): session opened for user podman_basic_user(uid=3001) by root(uid=0) Jul 27 12:33:10 ip-10-31-14-141.us-east-1.aws.redhat.com python3.12[38692]: ansible-systemd Invoked with name=podman-kube@-home-podman_basic_user-.config-containers-ansible\x2dkubernetes.d-httpd1.yml.service scope=user state=stopped enabled=False daemon_reload=False daemon_reexec=False no_block=False force=None masked=None Jul 27 12:33:10 ip-10-31-14-141.us-east-1.aws.redhat.com systemd[23350]: Reload requested from client PID 38695 ('systemctl')... Jul 27 12:33:10 ip-10-31-14-141.us-east-1.aws.redhat.com systemd[23350]: Reloading... Jul 27 12:33:10 ip-10-31-14-141.us-east-1.aws.redhat.com systemd[23350]: Reloading finished in 72 ms. Jul 27 12:33:10 ip-10-31-14-141.us-east-1.aws.redhat.com systemd[23350]: Stopping podman-kube@-home-podman_basic_user-.config-containers-ansible\x2dkubernetes.d-httpd1.yml.service - A template for running K8s workloads via podman-kube-play... 
░░ Subject: A stop job for unit UNIT has begun execution ░░ Defined-By: systemd ░░ Support: https://access.redhat.com/support ░░ ░░ A stop job for unit UNIT has begun execution. ░░ ░░ The job identifier is 148. Jul 27 12:33:10 ip-10-31-14-141.us-east-1.aws.redhat.com kernel: podman1: port 1(veth0) entered disabled state Jul 27 12:33:10 ip-10-31-14-141.us-east-1.aws.redhat.com kernel: veth0 (unregistering): left allmulticast mode Jul 27 12:33:10 ip-10-31-14-141.us-east-1.aws.redhat.com kernel: veth0 (unregistering): left promiscuous mode Jul 27 12:33:10 ip-10-31-14-141.us-east-1.aws.redhat.com kernel: podman1: port 1(veth0) entered disabled state Jul 27 12:33:20 ip-10-31-14-141.us-east-1.aws.redhat.com podman[38706]: time="2024-07-27T12:33:20-04:00" level=warning msg="StopSignal SIGTERM failed to stop container httpd1-httpd1 in 10 seconds, resorting to SIGKILL" Jul 27 12:33:20 ip-10-31-14-141.us-east-1.aws.redhat.com systemd[23350]: Removed slice user-libpod_pod_eb0fd24ab262a446ab756535a25f500b5ec073e9a28a2dc378eb03b2412f15c9.slice - cgroup user-libpod_pod_eb0fd24ab262a446ab756535a25f500b5ec073e9a28a2dc378eb03b2412f15c9.slice. ░░ Subject: A stop job for unit UNIT has finished ░░ Defined-By: systemd ░░ Support: https://access.redhat.com/support ░░ ░░ A stop job for unit UNIT has finished. ░░ ░░ The job identifier is 149 and the job result is done. 
Jul 27 12:33:20 ip-10-31-14-141.us-east-1.aws.redhat.com systemd[23350]: user-libpod_pod_eb0fd24ab262a446ab756535a25f500b5ec073e9a28a2dc378eb03b2412f15c9.slice: Failed to open /run/user/3001/systemd/transient/user-libpod_pod_eb0fd24ab262a446ab756535a25f500b5ec073e9a28a2dc378eb03b2412f15c9.slice: No such file or directory Jul 27 12:33:21 ip-10-31-14-141.us-east-1.aws.redhat.com podman[38706]: Pods stopped: Jul 27 12:33:21 ip-10-31-14-141.us-east-1.aws.redhat.com podman[38706]: eb0fd24ab262a446ab756535a25f500b5ec073e9a28a2dc378eb03b2412f15c9 Jul 27 12:33:21 ip-10-31-14-141.us-east-1.aws.redhat.com podman[38706]: Pods removed: Jul 27 12:33:21 ip-10-31-14-141.us-east-1.aws.redhat.com podman[38706]: Error: removing pod eb0fd24ab262a446ab756535a25f500b5ec073e9a28a2dc378eb03b2412f15c9 cgroup: removing pod eb0fd24ab262a446ab756535a25f500b5ec073e9a28a2dc378eb03b2412f15c9 cgroup: Unit user-libpod_pod_eb0fd24ab262a446ab756535a25f500b5ec073e9a28a2dc378eb03b2412f15c9.slice not loaded. Jul 27 12:33:21 ip-10-31-14-141.us-east-1.aws.redhat.com podman[38706]: Secrets removed: Jul 27 12:33:21 ip-10-31-14-141.us-east-1.aws.redhat.com podman[38706]: Error: %!s() Jul 27 12:33:21 ip-10-31-14-141.us-east-1.aws.redhat.com podman[38706]: Volumes removed: Jul 27 12:33:21 ip-10-31-14-141.us-east-1.aws.redhat.com systemd[23350]: Stopped podman-kube@-home-podman_basic_user-.config-containers-ansible\x2dkubernetes.d-httpd1.yml.service - A template for running K8s workloads via podman-kube-play. ░░ Subject: A stop job for unit UNIT has finished ░░ Defined-By: systemd ░░ Support: https://access.redhat.com/support ░░ ░░ A stop job for unit UNIT has finished. ░░ ░░ The job identifier is 148 and the job result is done. 
Jul 27 12:33:21 ip-10-31-14-141.us-east-1.aws.redhat.com sudo[38689]: pam_unix(sudo:session): session closed for user podman_basic_user Jul 27 12:33:21 ip-10-31-14-141.us-east-1.aws.redhat.com python3.12[38865]: ansible-stat Invoked with path=/home/podman_basic_user/.config/containers/ansible-kubernetes.d/httpd1.yml follow=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1 Jul 27 12:33:21 ip-10-31-14-141.us-east-1.aws.redhat.com sudo[39016]: root : TTY=pts/0 ; PWD=/root ; USER=podman_basic_user ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fyrxltxcvhtvvxpdayrpjoxxtejldmey ; XDG_RUNTIME_DIR=/run/user/3001 /usr/bin/python3.12 /var/tmp/ansible-tmp-1722098001.7130353-9340-181628971819968/AnsiballZ_podman_play.py' Jul 27 12:33:21 ip-10-31-14-141.us-east-1.aws.redhat.com sudo[39016]: pam_systemd(sudo:session): New sd-bus connection (system-bus-pam-systemd-39016) opened. Jul 27 12:33:21 ip-10-31-14-141.us-east-1.aws.redhat.com sudo[39016]: pam_unix(sudo:session): session opened for user podman_basic_user(uid=3001) by root(uid=0) Jul 27 12:33:22 ip-10-31-14-141.us-east-1.aws.redhat.com python3.12[39019]: ansible-containers.podman.podman_play Invoked with state=absent debug=True log_level=debug kube_file=/home/podman_basic_user/.config/containers/ansible-kubernetes.d/httpd1.yml executable=podman annotation=None authfile=None build=None cert_dir=None configmap=None context_dir=None seccomp_profile_root=None username=None password=NOT_LOGGING_PARAMETER log_driver=None log_opt=None network=None tls_verify=None quiet=None recreate=None userns=None quadlet_dir=None quadlet_filename=None quadlet_options=None Jul 27 12:33:22 ip-10-31-14-141.us-east-1.aws.redhat.com python3.12[39019]: ansible-containers.podman.podman_play version: 5.1.2, kube file /home/podman_basic_user/.config/containers/ansible-kubernetes.d/httpd1.yml Jul 27 12:33:22 ip-10-31-14-141.us-east-1.aws.redhat.com systemd[23350]: Started podman-39026.scope. 
░░ Subject: A start job for unit UNIT has finished successfully ░░ Defined-By: systemd ░░ Support: https://access.redhat.com/support ░░ ░░ A start job for unit UNIT has finished successfully. ░░ ░░ The job identifier is 150. Jul 27 12:33:22 ip-10-31-14-141.us-east-1.aws.redhat.com python3.12[39019]: ansible-containers.podman.podman_play PODMAN-PLAY-KUBE command: /usr/bin/podman kube play --down /home/podman_basic_user/.config/containers/ansible-kubernetes.d/httpd1.yml Jul 27 12:33:22 ip-10-31-14-141.us-east-1.aws.redhat.com python3.12[39019]: ansible-containers.podman.podman_play PODMAN-PLAY-KUBE stdout: Pods stopped: Pods removed: Secrets removed: Volumes removed: Jul 27 12:33:22 ip-10-31-14-141.us-east-1.aws.redhat.com python3.12[39019]: ansible-containers.podman.podman_play PODMAN-PLAY-KUBE stderr: Jul 27 12:33:22 ip-10-31-14-141.us-east-1.aws.redhat.com python3.12[39019]: ansible-containers.podman.podman_play PODMAN-PLAY-KUBE rc: 0 Jul 27 12:33:22 ip-10-31-14-141.us-east-1.aws.redhat.com sudo[39016]: pam_unix(sudo:session): session closed for user podman_basic_user Jul 27 12:33:22 ip-10-31-14-141.us-east-1.aws.redhat.com python3.12[39146]: ansible-file Invoked with path=/home/podman_basic_user/.config/containers/ansible-kubernetes.d/httpd1.yml state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Jul 27 12:33:22 ip-10-31-14-141.us-east-1.aws.redhat.com sudo[39295]: root : TTY=pts/0 ; PWD=/root ; USER=podman_basic_user ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-paddzoimqahqdhwgsrnnqvwcpyvmesws ; XDG_RUNTIME_DIR=/run/user/3001 /usr/bin/python3.12 /var/tmp/ansible-tmp-1722098002.7582216-9358-84588193523613/AnsiballZ_command.py' Jul 27 12:33:22 ip-10-31-14-141.us-east-1.aws.redhat.com 
sudo[39295]: pam_systemd(sudo:session): New sd-bus connection (system-bus-pam-systemd-39295) opened. Jul 27 12:33:22 ip-10-31-14-141.us-east-1.aws.redhat.com sudo[39295]: pam_unix(sudo:session): session opened for user podman_basic_user(uid=3001) by root(uid=0) Jul 27 12:33:23 ip-10-31-14-141.us-east-1.aws.redhat.com python3.12[39298]: ansible-ansible.legacy.command Invoked with _raw_params=podman image prune -f _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None Jul 27 12:33:23 ip-10-31-14-141.us-east-1.aws.redhat.com systemd[23350]: Started podman-39299.scope. ░░ Subject: A start job for unit UNIT has finished successfully ░░ Defined-By: systemd ░░ Support: https://access.redhat.com/support ░░ ░░ A start job for unit UNIT has finished successfully. ░░ ░░ The job identifier is 154. Jul 27 12:33:23 ip-10-31-14-141.us-east-1.aws.redhat.com sudo[39295]: pam_unix(sudo:session): session closed for user podman_basic_user Jul 27 12:33:24 ip-10-31-14-141.us-east-1.aws.redhat.com python3.12[39418]: ansible-getent Invoked with database=passwd key=root fail_key=False service=None split=None Jul 27 12:33:24 ip-10-31-14-141.us-east-1.aws.redhat.com python3.12[39532]: ansible-getent Invoked with database=group key=0 fail_key=False service=None split=None Jul 27 12:33:25 ip-10-31-14-141.us-east-1.aws.redhat.com python3.12[39646]: ansible-stat Invoked with path=/usr/bin/getsubids follow=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1 Jul 27 12:33:26 ip-10-31-14-141.us-east-1.aws.redhat.com python3.12[39761]: ansible-ansible.legacy.command Invoked with _raw_params=systemd-escape --template podman-kube@.service /etc/containers/ansible-kubernetes.d/httpd2.yml _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None Jul 27 12:33:27 
ip-10-31-14-141.us-east-1.aws.redhat.com python3.12[39875]: ansible-systemd Invoked with name=podman-kube@-etc-containers-ansible\x2dkubernetes.d-httpd2.yml.service scope=system state=stopped enabled=False daemon_reload=False daemon_reexec=False no_block=False force=None masked=None Jul 27 12:33:27 ip-10-31-14-141.us-east-1.aws.redhat.com systemd[1]: Reload requested from client PID 39878 ('systemctl') (unit session-5.scope)... Jul 27 12:33:27 ip-10-31-14-141.us-east-1.aws.redhat.com systemd[1]: Reloading... Jul 27 12:33:27 ip-10-31-14-141.us-east-1.aws.redhat.com systemd[1]: Reloading finished in 246 ms. Jul 27 12:33:27 ip-10-31-14-141.us-east-1.aws.redhat.com systemd[1]: Stopping podman-kube@-etc-containers-ansible\x2dkubernetes.d-httpd2.yml.service - A template for running K8s workloads via podman-kube-play... ░░ Subject: A stop job for unit podman-kube@-etc-containers-ansible\x2dkubernetes.d-httpd2.yml.service has begun execution ░░ Defined-By: systemd ░░ Support: https://access.redhat.com/support ░░ ░░ A stop job for unit podman-kube@-etc-containers-ansible\x2dkubernetes.d-httpd2.yml.service has begun execution. ░░ ░░ The job identifier is 2885. Jul 27 12:33:27 ip-10-31-14-141.us-east-1.aws.redhat.com podman[39935]: 2024-07-27 12:33:27.508879913 -0400 EDT m=+0.033288001 pod stop 3fc987afc0fcfab30e8777f8ebf27fce38969878d374f1ad00ca4b1107c6d265 (image=, name=httpd2) Jul 27 12:33:27 ip-10-31-14-141.us-east-1.aws.redhat.com systemd[1]: tmp-crun.Xnz3jw.mount: Deactivated successfully. ░░ Subject: Unit succeeded ░░ Defined-By: systemd ░░ Support: https://access.redhat.com/support ░░ ░░ The unit tmp-crun.Xnz3jw.mount has successfully entered the 'dead' state. Jul 27 12:33:27 ip-10-31-14-141.us-east-1.aws.redhat.com systemd[1]: libpod-21934980ccc10ecb78c2d3633f5f38d2dd8ef60d679cc00813b06946c70a7e97.scope: Deactivated successfully. 
░░ Subject: Unit succeeded ░░ Defined-By: systemd ░░ Support: https://access.redhat.com/support ░░ ░░ The unit libpod-21934980ccc10ecb78c2d3633f5f38d2dd8ef60d679cc00813b06946c70a7e97.scope has successfully entered the 'dead' state. Jul 27 12:33:27 ip-10-31-14-141.us-east-1.aws.redhat.com podman[39935]: 2024-07-27 12:33:27.542254668 -0400 EDT m=+0.066662918 container died 21934980ccc10ecb78c2d3633f5f38d2dd8ef60d679cc00813b06946c70a7e97 (image=localhost/podman-pause:5.1.2-1720656000, name=3fc987afc0fc-infra, PODMAN_SYSTEMD_UNIT=podman-kube@-etc-containers-ansible\x2dkubernetes.d-httpd2.yml.service, io.buildah.version=1.36.0) Jul 27 12:33:27 ip-10-31-14-141.us-east-1.aws.redhat.com kernel: podman1: port 1(veth0) entered disabled state Jul 27 12:33:27 ip-10-31-14-141.us-east-1.aws.redhat.com kernel: veth0 (unregistering): left allmulticast mode Jul 27 12:33:27 ip-10-31-14-141.us-east-1.aws.redhat.com kernel: veth0 (unregistering): left promiscuous mode Jul 27 12:33:27 ip-10-31-14-141.us-east-1.aws.redhat.com kernel: podman1: port 1(veth0) entered disabled state Jul 27 12:33:27 ip-10-31-14-141.us-east-1.aws.redhat.com systemd[1]: run-netns-netns\x2dc1fd9291\x2d1ee0\x2d6c8d\x2d7868\x2d8e0d222da8f6.mount: Deactivated successfully. ░░ Subject: Unit succeeded ░░ Defined-By: systemd ░░ Support: https://access.redhat.com/support ░░ ░░ The unit run-netns-netns\x2dc1fd9291\x2d1ee0\x2d6c8d\x2d7868\x2d8e0d222da8f6.mount has successfully entered the 'dead' state. 
Jul 27 12:33:27 ip-10-31-14-141.us-east-1.aws.redhat.com podman[39935]: 2024-07-27 12:33:27.645807774 -0400 EDT m=+0.170215713 container cleanup 21934980ccc10ecb78c2d3633f5f38d2dd8ef60d679cc00813b06946c70a7e97 (image=localhost/podman-pause:5.1.2-1720656000, name=3fc987afc0fc-infra, pod_id=3fc987afc0fcfab30e8777f8ebf27fce38969878d374f1ad00ca4b1107c6d265, io.buildah.version=1.36.0, PODMAN_SYSTEMD_UNIT=podman-kube@-etc-containers-ansible\x2dkubernetes.d-httpd2.yml.service) Jul 27 12:33:28 ip-10-31-14-141.us-east-1.aws.redhat.com systemd[1]: var-lib-containers-storage-overlay-6ed095a417e8f0a482152436f83644e275252114c32ea202df372f994582bf1b-merged.mount: Deactivated successfully. ░░ Subject: Unit succeeded ░░ Defined-By: systemd ░░ Support: https://access.redhat.com/support ░░ ░░ The unit var-lib-containers-storage-overlay-6ed095a417e8f0a482152436f83644e275252114c32ea202df372f994582bf1b-merged.mount has successfully entered the 'dead' state. Jul 27 12:33:28 ip-10-31-14-141.us-east-1.aws.redhat.com systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-21934980ccc10ecb78c2d3633f5f38d2dd8ef60d679cc00813b06946c70a7e97-userdata-shm.mount: Deactivated successfully. ░░ Subject: Unit succeeded ░░ Defined-By: systemd ░░ Support: https://access.redhat.com/support ░░ ░░ The unit var-lib-containers-storage-overlay\x2dcontainers-21934980ccc10ecb78c2d3633f5f38d2dd8ef60d679cc00813b06946c70a7e97-userdata-shm.mount has successfully entered the 'dead' state. Jul 27 12:33:37 ip-10-31-14-141.us-east-1.aws.redhat.com podman[39935]: time="2024-07-27T12:33:37-04:00" level=warning msg="StopSignal SIGTERM failed to stop container httpd2-httpd2 in 10 seconds, resorting to SIGKILL" Jul 27 12:33:37 ip-10-31-14-141.us-east-1.aws.redhat.com systemd[1]: libpod-28a07a6e8a801cd6f84385e3d31d70834b0d17347f7424b4c81938b703393bad.scope: Deactivated successfully. 
░░ Subject: Unit succeeded ░░ Defined-By: systemd ░░ Support: https://access.redhat.com/support ░░ ░░ The unit libpod-28a07a6e8a801cd6f84385e3d31d70834b0d17347f7424b4c81938b703393bad.scope has successfully entered the 'dead' state. Jul 27 12:33:37 ip-10-31-14-141.us-east-1.aws.redhat.com podman[39935]: 2024-07-27 12:33:37.566215056 -0400 EDT m=+10.090623353 container died 28a07a6e8a801cd6f84385e3d31d70834b0d17347f7424b4c81938b703393bad (image=quay.io/libpod/testimage:20210610, name=httpd2-httpd2, app=test, created_at=2021-06-10T18:55:36Z, created_by=test/system/build-testimage, io.buildah.version=1.21.0, io.containers.autoupdate=registry, PODMAN_SYSTEMD_UNIT=podman-kube@-etc-containers-ansible\x2dkubernetes.d-httpd2.yml.service) Jul 27 12:33:37 ip-10-31-14-141.us-east-1.aws.redhat.com systemd[1]: var-lib-containers-storage-overlay-be22017918fa5afa4e8e4e7a2b4fb5ce0fec83d715114bfa9d310b394fc77263-merged.mount: Deactivated successfully. ░░ Subject: Unit succeeded ░░ Defined-By: systemd ░░ Support: https://access.redhat.com/support ░░ ░░ The unit var-lib-containers-storage-overlay-be22017918fa5afa4e8e4e7a2b4fb5ce0fec83d715114bfa9d310b394fc77263-merged.mount has successfully entered the 'dead' state. 
Jul 27 12:33:37 ip-10-31-14-141.us-east-1.aws.redhat.com podman[39935]: 2024-07-27 12:33:37.619498022 -0400 EDT m=+10.143906081 container cleanup 28a07a6e8a801cd6f84385e3d31d70834b0d17347f7424b4c81938b703393bad (image=quay.io/libpod/testimage:20210610, name=httpd2-httpd2, pod_id=3fc987afc0fcfab30e8777f8ebf27fce38969878d374f1ad00ca4b1107c6d265, PODMAN_SYSTEMD_UNIT=podman-kube@-etc-containers-ansible\x2dkubernetes.d-httpd2.yml.service, app=test, created_at=2021-06-10T18:55:36Z, created_by=test/system/build-testimage, io.buildah.version=1.21.0, io.containers.autoupdate=registry) Jul 27 12:33:37 ip-10-31-14-141.us-east-1.aws.redhat.com systemd[1]: Removed slice machine-libpod_pod_3fc987afc0fcfab30e8777f8ebf27fce38969878d374f1ad00ca4b1107c6d265.slice - cgroup machine-libpod_pod_3fc987afc0fcfab30e8777f8ebf27fce38969878d374f1ad00ca4b1107c6d265.slice. ░░ Subject: A stop job for unit machine-libpod_pod_3fc987afc0fcfab30e8777f8ebf27fce38969878d374f1ad00ca4b1107c6d265.slice has finished ░░ Defined-By: systemd ░░ Support: https://access.redhat.com/support ░░ ░░ A stop job for unit machine-libpod_pod_3fc987afc0fcfab30e8777f8ebf27fce38969878d374f1ad00ca4b1107c6d265.slice has finished. ░░ ░░ The job identifier is 2887 and the job result is done. 
Jul 27 12:33:37 ip-10-31-14-141.us-east-1.aws.redhat.com podman[39935]: 2024-07-27 12:33:37.652550525 -0400 EDT m=+10.176958687 container remove 28a07a6e8a801cd6f84385e3d31d70834b0d17347f7424b4c81938b703393bad (image=quay.io/libpod/testimage:20210610, name=httpd2-httpd2, pod_id=3fc987afc0fcfab30e8777f8ebf27fce38969878d374f1ad00ca4b1107c6d265, created_by=test/system/build-testimage, io.buildah.version=1.21.0, io.containers.autoupdate=registry, PODMAN_SYSTEMD_UNIT=podman-kube@-etc-containers-ansible\x2dkubernetes.d-httpd2.yml.service, app=test, created_at=2021-06-10T18:55:36Z) Jul 27 12:33:37 ip-10-31-14-141.us-east-1.aws.redhat.com podman[39935]: 2024-07-27 12:33:37.691063242 -0400 EDT m=+10.215471196 container remove 21934980ccc10ecb78c2d3633f5f38d2dd8ef60d679cc00813b06946c70a7e97 (image=localhost/podman-pause:5.1.2-1720656000, name=3fc987afc0fc-infra, pod_id=3fc987afc0fcfab30e8777f8ebf27fce38969878d374f1ad00ca4b1107c6d265, PODMAN_SYSTEMD_UNIT=podman-kube@-etc-containers-ansible\x2dkubernetes.d-httpd2.yml.service, io.buildah.version=1.36.0) Jul 27 12:33:37 ip-10-31-14-141.us-east-1.aws.redhat.com systemd[1]: machine-libpod_pod_3fc987afc0fcfab30e8777f8ebf27fce38969878d374f1ad00ca4b1107c6d265.slice: Failed to open /run/systemd/transient/machine-libpod_pod_3fc987afc0fcfab30e8777f8ebf27fce38969878d374f1ad00ca4b1107c6d265.slice: No such file or directory Jul 27 12:33:37 ip-10-31-14-141.us-east-1.aws.redhat.com podman[39935]: 2024-07-27 12:33:37.704019265 -0400 EDT m=+10.228427404 pod remove 3fc987afc0fcfab30e8777f8ebf27fce38969878d374f1ad00ca4b1107c6d265 (image=, name=httpd2) Jul 27 12:33:37 ip-10-31-14-141.us-east-1.aws.redhat.com podman[39935]: 2024-07-27 12:33:37.70829221 -0400 EDT m=+10.232700351 container kill cfa46864c7ec0493ae037af522f39cd0a603f4dc849eb026c04a5797807e00ad (image=localhost/podman-pause:5.1.2-1720656000, name=42d68b4496fe-service, PODMAN_SYSTEMD_UNIT=podman-kube@-etc-containers-ansible\x2dkubernetes.d-httpd2.yml.service) Jul 27 12:33:37 
ip-10-31-14-141.us-east-1.aws.redhat.com systemd[1]: libpod-cfa46864c7ec0493ae037af522f39cd0a603f4dc849eb026c04a5797807e00ad.scope: Deactivated successfully. ░░ Subject: Unit succeeded ░░ Defined-By: systemd ░░ Support: https://access.redhat.com/support ░░ ░░ The unit libpod-cfa46864c7ec0493ae037af522f39cd0a603f4dc849eb026c04a5797807e00ad.scope has successfully entered the 'dead' state. Jul 27 12:33:37 ip-10-31-14-141.us-east-1.aws.redhat.com conmon[26990]: conmon cfa46864c7ec0493ae03 : Failed to open cgroups file: /sys/fs/cgroup/machine.slice/libpod-cfa46864c7ec0493ae037af522f39cd0a603f4dc849eb026c04a5797807e00ad.scope/container/memory.events Jul 27 12:33:37 ip-10-31-14-141.us-east-1.aws.redhat.com podman[39935]: 2024-07-27 12:33:37.718384278 -0400 EDT m=+10.242792573 container died cfa46864c7ec0493ae037af522f39cd0a603f4dc849eb026c04a5797807e00ad (image=localhost/podman-pause:5.1.2-1720656000, name=42d68b4496fe-service, PODMAN_SYSTEMD_UNIT=podman-kube@-etc-containers-ansible\x2dkubernetes.d-httpd2.yml.service) Jul 27 12:33:37 ip-10-31-14-141.us-east-1.aws.redhat.com podman[39935]: 2024-07-27 12:33:37.797193155 -0400 EDT m=+10.321601108 container remove cfa46864c7ec0493ae037af522f39cd0a603f4dc849eb026c04a5797807e00ad (image=localhost/podman-pause:5.1.2-1720656000, name=42d68b4496fe-service, PODMAN_SYSTEMD_UNIT=podman-kube@-etc-containers-ansible\x2dkubernetes.d-httpd2.yml.service) Jul 27 12:33:37 ip-10-31-14-141.us-east-1.aws.redhat.com podman[39935]: Pods stopped: Jul 27 12:33:37 ip-10-31-14-141.us-east-1.aws.redhat.com podman[39935]: 3fc987afc0fcfab30e8777f8ebf27fce38969878d374f1ad00ca4b1107c6d265 Jul 27 12:33:37 ip-10-31-14-141.us-east-1.aws.redhat.com podman[39935]: Pods removed: Jul 27 12:33:37 ip-10-31-14-141.us-east-1.aws.redhat.com podman[39935]: Error: removing pod 3fc987afc0fcfab30e8777f8ebf27fce38969878d374f1ad00ca4b1107c6d265 cgroup: removing pod 3fc987afc0fcfab30e8777f8ebf27fce38969878d374f1ad00ca4b1107c6d265 cgroup: Unit 
machine-libpod_pod_3fc987afc0fcfab30e8777f8ebf27fce38969878d374f1ad00ca4b1107c6d265.slice not loaded.
Jul 27 12:33:37 ip-10-31-14-141.us-east-1.aws.redhat.com podman[39935]: Secrets removed:
Jul 27 12:33:37 ip-10-31-14-141.us-east-1.aws.redhat.com podman[39935]: Error: %!s()
Jul 27 12:33:37 ip-10-31-14-141.us-east-1.aws.redhat.com podman[39935]: Volumes removed:
Jul 27 12:33:37 ip-10-31-14-141.us-east-1.aws.redhat.com systemd[1]: podman-kube@-etc-containers-ansible\x2dkubernetes.d-httpd2.yml.service: Deactivated successfully.
░░ Subject: Unit succeeded
░░ Defined-By: systemd
░░ Support: https://access.redhat.com/support
░░
░░ The unit podman-kube@-etc-containers-ansible\x2dkubernetes.d-httpd2.yml.service has successfully entered the 'dead' state.
Jul 27 12:33:37 ip-10-31-14-141.us-east-1.aws.redhat.com systemd[1]: Stopped podman-kube@-etc-containers-ansible\x2dkubernetes.d-httpd2.yml.service - A template for running K8s workloads via podman-kube-play.
░░ Subject: A stop job for unit podman-kube@-etc-containers-ansible\x2dkubernetes.d-httpd2.yml.service has finished
░░ Defined-By: systemd
░░ Support: https://access.redhat.com/support
░░
░░ A stop job for unit podman-kube@-etc-containers-ansible\x2dkubernetes.d-httpd2.yml.service has finished.
░░
░░ The job identifier is 2885 and the job result is done.
Jul 27 12:33:38 ip-10-31-14-141.us-east-1.aws.redhat.com python3.12[40094]: ansible-stat Invoked with path=/etc/containers/ansible-kubernetes.d/httpd2.yml follow=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Jul 27 12:33:38 ip-10-31-14-141.us-east-1.aws.redhat.com systemd[1]: var-lib-containers-storage-overlay-c0f48c65598cd39955de2ec76b67f125e8b50d209acdb5eb066aeee985f1ce34-merged.mount: Deactivated successfully.
░░ Subject: Unit succeeded ░░ Defined-By: systemd ░░ Support: https://access.redhat.com/support ░░ ░░ The unit var-lib-containers-storage-overlay-c0f48c65598cd39955de2ec76b67f125e8b50d209acdb5eb066aeee985f1ce34-merged.mount has successfully entered the 'dead' state. Jul 27 12:33:38 ip-10-31-14-141.us-east-1.aws.redhat.com systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-cfa46864c7ec0493ae037af522f39cd0a603f4dc849eb026c04a5797807e00ad-userdata-shm.mount: Deactivated successfully. ░░ Subject: Unit succeeded ░░ Defined-By: systemd ░░ Support: https://access.redhat.com/support ░░ ░░ The unit var-lib-containers-storage-overlay\x2dcontainers-cfa46864c7ec0493ae037af522f39cd0a603f4dc849eb026c04a5797807e00ad-userdata-shm.mount has successfully entered the 'dead' state. Jul 27 12:33:38 ip-10-31-14-141.us-east-1.aws.redhat.com python3.12[40209]: ansible-containers.podman.podman_play Invoked with state=absent debug=True log_level=debug kube_file=/etc/containers/ansible-kubernetes.d/httpd2.yml executable=podman annotation=None authfile=None build=None cert_dir=None configmap=None context_dir=None seccomp_profile_root=None username=None password=NOT_LOGGING_PARAMETER log_driver=None log_opt=None network=None tls_verify=None quiet=None recreate=None userns=None quadlet_dir=None quadlet_filename=None quadlet_options=None Jul 27 12:33:38 ip-10-31-14-141.us-east-1.aws.redhat.com python3.12[40209]: ansible-containers.podman.podman_play version: 5.1.2, kube file /etc/containers/ansible-kubernetes.d/httpd2.yml Jul 27 12:33:38 ip-10-31-14-141.us-east-1.aws.redhat.com python3.12[40209]: ansible-containers.podman.podman_play PODMAN-PLAY-KUBE command: /usr/bin/podman kube play --down /etc/containers/ansible-kubernetes.d/httpd2.yml Jul 27 12:33:38 ip-10-31-14-141.us-east-1.aws.redhat.com python3.12[40209]: ansible-containers.podman.podman_play PODMAN-PLAY-KUBE stdout: Pods stopped: Pods removed: Secrets removed: Volumes removed: Jul 27 12:33:38 
ip-10-31-14-141.us-east-1.aws.redhat.com python3.12[40209]: ansible-containers.podman.podman_play PODMAN-PLAY-KUBE stderr: Jul 27 12:33:38 ip-10-31-14-141.us-east-1.aws.redhat.com python3.12[40209]: ansible-containers.podman.podman_play PODMAN-PLAY-KUBE rc: 0 Jul 27 12:33:39 ip-10-31-14-141.us-east-1.aws.redhat.com python3.12[40336]: ansible-file Invoked with path=/etc/containers/ansible-kubernetes.d/httpd2.yml state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Jul 27 12:33:39 ip-10-31-14-141.us-east-1.aws.redhat.com python3.12[40449]: ansible-ansible.legacy.command Invoked with _raw_params=podman image prune -f _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None Jul 27 12:33:40 ip-10-31-14-141.us-east-1.aws.redhat.com python3.12[40570]: ansible-getent Invoked with database=group key=0 fail_key=False service=None split=None Jul 27 12:33:41 ip-10-31-14-141.us-east-1.aws.redhat.com python3.12[40684]: ansible-stat Invoked with path=/usr/bin/getsubids follow=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1 Jul 27 12:33:42 ip-10-31-14-141.us-east-1.aws.redhat.com python3.12[40799]: ansible-ansible.legacy.command Invoked with _raw_params=systemd-escape --template podman-kube@.service /etc/containers/ansible-kubernetes.d/httpd3.yml _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None Jul 27 12:33:43 ip-10-31-14-141.us-east-1.aws.redhat.com python3.12[40913]: ansible-systemd Invoked with 
name=podman-kube@-etc-containers-ansible\x2dkubernetes.d-httpd3.yml.service scope=system state=stopped enabled=False daemon_reload=False daemon_reexec=False no_block=False force=None masked=None Jul 27 12:33:43 ip-10-31-14-141.us-east-1.aws.redhat.com systemd[1]: Reload requested from client PID 40916 ('systemctl') (unit session-5.scope)... Jul 27 12:33:43 ip-10-31-14-141.us-east-1.aws.redhat.com systemd[1]: Reloading... Jul 27 12:33:43 ip-10-31-14-141.us-east-1.aws.redhat.com systemd[1]: Reloading finished in 225 ms. Jul 27 12:33:43 ip-10-31-14-141.us-east-1.aws.redhat.com systemd[1]: Stopping podman-kube@-etc-containers-ansible\x2dkubernetes.d-httpd3.yml.service - A template for running K8s workloads via podman-kube-play... ░░ Subject: A stop job for unit podman-kube@-etc-containers-ansible\x2dkubernetes.d-httpd3.yml.service has begun execution ░░ Defined-By: systemd ░░ Support: https://access.redhat.com/support ░░ ░░ A stop job for unit podman-kube@-etc-containers-ansible\x2dkubernetes.d-httpd3.yml.service has begun execution. ░░ ░░ The job identifier is 2888. Jul 27 12:33:43 ip-10-31-14-141.us-east-1.aws.redhat.com podman[40973]: 2024-07-27 12:33:43.472230437 -0400 EDT m=+0.030222026 pod stop d934b8f7de34bc54d46a16a6adb769f71e105fd9b95764f463833eec9b2c48c6 (image=, name=httpd3) Jul 27 12:33:43 ip-10-31-14-141.us-east-1.aws.redhat.com systemd[1]: tmp-crun.Fi3dHc.mount: Deactivated successfully. ░░ Subject: Unit succeeded ░░ Defined-By: systemd ░░ Support: https://access.redhat.com/support ░░ ░░ The unit tmp-crun.Fi3dHc.mount has successfully entered the 'dead' state. Jul 27 12:33:43 ip-10-31-14-141.us-east-1.aws.redhat.com systemd[1]: libpod-b752438f066ce9faf605c7486f1f6ff867a81d90ea1198b0c44ff91e2c7d59d1.scope: Deactivated successfully. 
░░ Subject: Unit succeeded ░░ Defined-By: systemd ░░ Support: https://access.redhat.com/support ░░ ░░ The unit libpod-b752438f066ce9faf605c7486f1f6ff867a81d90ea1198b0c44ff91e2c7d59d1.scope has successfully entered the 'dead' state. Jul 27 12:33:43 ip-10-31-14-141.us-east-1.aws.redhat.com podman[40973]: 2024-07-27 12:33:43.505576914 -0400 EDT m=+0.063568908 container died b752438f066ce9faf605c7486f1f6ff867a81d90ea1198b0c44ff91e2c7d59d1 (image=localhost/podman-pause:5.1.2-1720656000, name=d934b8f7de34-infra, PODMAN_SYSTEMD_UNIT=podman-kube@-etc-containers-ansible\x2dkubernetes.d-httpd3.yml.service, io.buildah.version=1.36.0) Jul 27 12:33:43 ip-10-31-14-141.us-east-1.aws.redhat.com kernel: podman1: port 2(veth1) entered disabled state Jul 27 12:33:43 ip-10-31-14-141.us-east-1.aws.redhat.com systemd[1]: run-r438989e88bd74f25b2e11a3ff478396f.scope: Deactivated successfully. ░░ Subject: Unit succeeded ░░ Defined-By: systemd ░░ Support: https://access.redhat.com/support ░░ ░░ The unit run-r438989e88bd74f25b2e11a3ff478396f.scope has successfully entered the 'dead' state. Jul 27 12:33:43 ip-10-31-14-141.us-east-1.aws.redhat.com kernel: veth1 (unregistering): left allmulticast mode Jul 27 12:33:43 ip-10-31-14-141.us-east-1.aws.redhat.com kernel: veth1 (unregistering): left promiscuous mode Jul 27 12:33:43 ip-10-31-14-141.us-east-1.aws.redhat.com kernel: podman1: port 2(veth1) entered disabled state Jul 27 12:33:43 ip-10-31-14-141.us-east-1.aws.redhat.com NetworkManager[678]: [1722098023.5585] device (podman1): state change: activated -> unmanaged (reason 'unmanaged-external-down', sys-iface-state: 'external') Jul 27 12:33:43 ip-10-31-14-141.us-east-1.aws.redhat.com systemd[1]: Starting NetworkManager-dispatcher.service - Network Manager Script Dispatcher Service... 
░░ Subject: A start job for unit NetworkManager-dispatcher.service has begun execution ░░ Defined-By: systemd ░░ Support: https://access.redhat.com/support ░░ ░░ A start job for unit NetworkManager-dispatcher.service has begun execution. ░░ ░░ The job identifier is 2890. Jul 27 12:33:43 ip-10-31-14-141.us-east-1.aws.redhat.com systemd[1]: Started NetworkManager-dispatcher.service - Network Manager Script Dispatcher Service. ░░ Subject: A start job for unit NetworkManager-dispatcher.service has finished successfully ░░ Defined-By: systemd ░░ Support: https://access.redhat.com/support ░░ ░░ A start job for unit NetworkManager-dispatcher.service has finished successfully. ░░ ░░ The job identifier is 2890. Jul 27 12:33:43 ip-10-31-14-141.us-east-1.aws.redhat.com systemd[1]: run-netns-netns\x2d3dfb464e\x2d725e\x2d20a2\x2db8cf\x2de52b030db400.mount: Deactivated successfully. ░░ Subject: Unit succeeded ░░ Defined-By: systemd ░░ Support: https://access.redhat.com/support ░░ ░░ The unit run-netns-netns\x2d3dfb464e\x2d725e\x2d20a2\x2db8cf\x2de52b030db400.mount has successfully entered the 'dead' state. Jul 27 12:33:43 ip-10-31-14-141.us-east-1.aws.redhat.com podman[40973]: 2024-07-27 12:33:43.678893399 -0400 EDT m=+0.236884881 container cleanup b752438f066ce9faf605c7486f1f6ff867a81d90ea1198b0c44ff91e2c7d59d1 (image=localhost/podman-pause:5.1.2-1720656000, name=d934b8f7de34-infra, pod_id=d934b8f7de34bc54d46a16a6adb769f71e105fd9b95764f463833eec9b2c48c6, PODMAN_SYSTEMD_UNIT=podman-kube@-etc-containers-ansible\x2dkubernetes.d-httpd3.yml.service, io.buildah.version=1.36.0) Jul 27 12:33:44 ip-10-31-14-141.us-east-1.aws.redhat.com systemd[1]: var-lib-containers-storage-overlay-113e6845a6a3283be45d8451f3060693f0f6e8dde2b9aa84c23ee8749e9437f1-merged.mount: Deactivated successfully. 
░░ Subject: Unit succeeded ░░ Defined-By: systemd ░░ Support: https://access.redhat.com/support ░░ ░░ The unit var-lib-containers-storage-overlay-113e6845a6a3283be45d8451f3060693f0f6e8dde2b9aa84c23ee8749e9437f1-merged.mount has successfully entered the 'dead' state. Jul 27 12:33:44 ip-10-31-14-141.us-east-1.aws.redhat.com systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-b752438f066ce9faf605c7486f1f6ff867a81d90ea1198b0c44ff91e2c7d59d1-userdata-shm.mount: Deactivated successfully. ░░ Subject: Unit succeeded ░░ Defined-By: systemd ░░ Support: https://access.redhat.com/support ░░ ░░ The unit var-lib-containers-storage-overlay\x2dcontainers-b752438f066ce9faf605c7486f1f6ff867a81d90ea1198b0c44ff91e2c7d59d1-userdata-shm.mount has successfully entered the 'dead' state. Jul 27 12:33:53 ip-10-31-14-141.us-east-1.aws.redhat.com podman[40973]: time="2024-07-27T12:33:53-04:00" level=warning msg="StopSignal SIGTERM failed to stop container httpd3-httpd3 in 10 seconds, resorting to SIGKILL" Jul 27 12:33:53 ip-10-31-14-141.us-east-1.aws.redhat.com podman[40973]: 2024-07-27 12:33:53.519648124 -0400 EDT m=+10.077639913 container died cf272fb64f907d0bdeb087821bcab3f874e448e85b816fb7d9962878c5c5b1f8 (image=quay.io/libpod/testimage:20210610, name=httpd3-httpd3, PODMAN_SYSTEMD_UNIT=podman-kube@-etc-containers-ansible\x2dkubernetes.d-httpd3.yml.service, app=test, created_at=2021-06-10T18:55:36Z, created_by=test/system/build-testimage, io.buildah.version=1.21.0, io.containers.autoupdate=registry) Jul 27 12:33:53 ip-10-31-14-141.us-east-1.aws.redhat.com systemd[1]: libpod-cf272fb64f907d0bdeb087821bcab3f874e448e85b816fb7d9962878c5c5b1f8.scope: Deactivated successfully. ░░ Subject: Unit succeeded ░░ Defined-By: systemd ░░ Support: https://access.redhat.com/support ░░ ░░ The unit libpod-cf272fb64f907d0bdeb087821bcab3f874e448e85b816fb7d9962878c5c5b1f8.scope has successfully entered the 'dead' state. 
Jul 27 12:33:53 ip-10-31-14-141.us-east-1.aws.redhat.com systemd[1]: var-lib-containers-storage-overlay-7dfeb4e5c5d86267a74cfadcc3a1936317249f7413341c2b3549606919360ba3-merged.mount: Deactivated successfully. ░░ Subject: Unit succeeded ░░ Defined-By: systemd ░░ Support: https://access.redhat.com/support ░░ ░░ The unit var-lib-containers-storage-overlay-7dfeb4e5c5d86267a74cfadcc3a1936317249f7413341c2b3549606919360ba3-merged.mount has successfully entered the 'dead' state. Jul 27 12:33:53 ip-10-31-14-141.us-east-1.aws.redhat.com podman[40973]: 2024-07-27 12:33:53.572742926 -0400 EDT m=+10.130734513 container cleanup cf272fb64f907d0bdeb087821bcab3f874e448e85b816fb7d9962878c5c5b1f8 (image=quay.io/libpod/testimage:20210610, name=httpd3-httpd3, pod_id=d934b8f7de34bc54d46a16a6adb769f71e105fd9b95764f463833eec9b2c48c6, created_at=2021-06-10T18:55:36Z, created_by=test/system/build-testimage, io.buildah.version=1.21.0, io.containers.autoupdate=registry, PODMAN_SYSTEMD_UNIT=podman-kube@-etc-containers-ansible\x2dkubernetes.d-httpd3.yml.service, app=test) Jul 27 12:33:53 ip-10-31-14-141.us-east-1.aws.redhat.com systemd[1]: Removed slice machine-libpod_pod_d934b8f7de34bc54d46a16a6adb769f71e105fd9b95764f463833eec9b2c48c6.slice - cgroup machine-libpod_pod_d934b8f7de34bc54d46a16a6adb769f71e105fd9b95764f463833eec9b2c48c6.slice. ░░ Subject: A stop job for unit machine-libpod_pod_d934b8f7de34bc54d46a16a6adb769f71e105fd9b95764f463833eec9b2c48c6.slice has finished ░░ Defined-By: systemd ░░ Support: https://access.redhat.com/support ░░ ░░ A stop job for unit machine-libpod_pod_d934b8f7de34bc54d46a16a6adb769f71e105fd9b95764f463833eec9b2c48c6.slice has finished. ░░ ░░ The job identifier is 2970 and the job result is done. 
Jul 27 12:33:53 ip-10-31-14-141.us-east-1.aws.redhat.com podman[40973]: 2024-07-27 12:33:53.581110894 -0400 EDT m=+10.139102531 pod stop d934b8f7de34bc54d46a16a6adb769f71e105fd9b95764f463833eec9b2c48c6 (image=, name=httpd3) Jul 27 12:33:53 ip-10-31-14-141.us-east-1.aws.redhat.com systemd[1]: machine-libpod_pod_d934b8f7de34bc54d46a16a6adb769f71e105fd9b95764f463833eec9b2c48c6.slice: Failed to open /run/systemd/transient/machine-libpod_pod_d934b8f7de34bc54d46a16a6adb769f71e105fd9b95764f463833eec9b2c48c6.slice: No such file or directory Jul 27 12:33:53 ip-10-31-14-141.us-east-1.aws.redhat.com podman[40973]: time="2024-07-27T12:33:53-04:00" level=error msg="Checking if infra needs to be stopped: removing pod d934b8f7de34bc54d46a16a6adb769f71e105fd9b95764f463833eec9b2c48c6 cgroup: Unit machine-libpod_pod_d934b8f7de34bc54d46a16a6adb769f71e105fd9b95764f463833eec9b2c48c6.slice not loaded." Jul 27 12:33:53 ip-10-31-14-141.us-east-1.aws.redhat.com podman[40973]: 2024-07-27 12:33:53.609670617 -0400 EDT m=+10.167662295 pod stop d934b8f7de34bc54d46a16a6adb769f71e105fd9b95764f463833eec9b2c48c6 (image=, name=httpd3) Jul 27 12:33:53 ip-10-31-14-141.us-east-1.aws.redhat.com systemd[1]: machine-libpod_pod_d934b8f7de34bc54d46a16a6adb769f71e105fd9b95764f463833eec9b2c48c6.slice: Failed to open /run/systemd/transient/machine-libpod_pod_d934b8f7de34bc54d46a16a6adb769f71e105fd9b95764f463833eec9b2c48c6.slice: No such file or directory Jul 27 12:33:53 ip-10-31-14-141.us-east-1.aws.redhat.com podman[40973]: time="2024-07-27T12:33:53-04:00" level=error msg="Checking if infra needs to be stopped: removing pod d934b8f7de34bc54d46a16a6adb769f71e105fd9b95764f463833eec9b2c48c6 cgroup: Unit machine-libpod_pod_d934b8f7de34bc54d46a16a6adb769f71e105fd9b95764f463833eec9b2c48c6.slice not loaded." 
Jul 27 12:33:53 ip-10-31-14-141.us-east-1.aws.redhat.com podman[40973]: 2024-07-27 12:33:53.625199102 -0400 EDT m=+10.183190816 container kill 3d4d558c0f2fa33094c0bd8d5e83414ebd5d14339c39e2ae713129f7e70714f8 (image=localhost/podman-pause:5.1.2-1720656000, name=b46e3d5b107f-service, PODMAN_SYSTEMD_UNIT=podman-kube@-etc-containers-ansible\x2dkubernetes.d-httpd3.yml.service) Jul 27 12:33:53 ip-10-31-14-141.us-east-1.aws.redhat.com systemd[1]: tmp-crun.bAaH6x.mount: Deactivated successfully. ░░ Subject: Unit succeeded ░░ Defined-By: systemd ░░ Support: https://access.redhat.com/support ░░ ░░ The unit tmp-crun.bAaH6x.mount has successfully entered the 'dead' state. Jul 27 12:33:53 ip-10-31-14-141.us-east-1.aws.redhat.com systemd[1]: libpod-3d4d558c0f2fa33094c0bd8d5e83414ebd5d14339c39e2ae713129f7e70714f8.scope: Deactivated successfully. ░░ Subject: Unit succeeded ░░ Defined-By: systemd ░░ Support: https://access.redhat.com/support ░░ ░░ The unit libpod-3d4d558c0f2fa33094c0bd8d5e83414ebd5d14339c39e2ae713129f7e70714f8.scope has successfully entered the 'dead' state. Jul 27 12:33:53 ip-10-31-14-141.us-east-1.aws.redhat.com podman[40973]: 2024-07-27 12:33:53.64334047 -0400 EDT m=+10.201332183 container died 3d4d558c0f2fa33094c0bd8d5e83414ebd5d14339c39e2ae713129f7e70714f8 (image=localhost/podman-pause:5.1.2-1720656000, name=b46e3d5b107f-service, PODMAN_SYSTEMD_UNIT=podman-kube@-etc-containers-ansible\x2dkubernetes.d-httpd3.yml.service) Jul 27 12:33:53 ip-10-31-14-141.us-east-1.aws.redhat.com systemd[1]: NetworkManager-dispatcher.service: Deactivated successfully. ░░ Subject: Unit succeeded ░░ Defined-By: systemd ░░ Support: https://access.redhat.com/support ░░ ░░ The unit NetworkManager-dispatcher.service has successfully entered the 'dead' state. 
Jul 27 12:33:53 ip-10-31-14-141.us-east-1.aws.redhat.com podman[40973]: 2024-07-27 12:33:53.672832378 -0400 EDT m=+10.230823870 container remove cf272fb64f907d0bdeb087821bcab3f874e448e85b816fb7d9962878c5c5b1f8 (image=quay.io/libpod/testimage:20210610, name=httpd3-httpd3, pod_id=d934b8f7de34bc54d46a16a6adb769f71e105fd9b95764f463833eec9b2c48c6, PODMAN_SYSTEMD_UNIT=podman-kube@-etc-containers-ansible\x2dkubernetes.d-httpd3.yml.service, app=test, created_at=2021-06-10T18:55:36Z, created_by=test/system/build-testimage, io.buildah.version=1.21.0, io.containers.autoupdate=registry) Jul 27 12:33:53 ip-10-31-14-141.us-east-1.aws.redhat.com podman[40973]: 2024-07-27 12:33:53.706274978 -0400 EDT m=+10.264266468 container remove b752438f066ce9faf605c7486f1f6ff867a81d90ea1198b0c44ff91e2c7d59d1 (image=localhost/podman-pause:5.1.2-1720656000, name=d934b8f7de34-infra, pod_id=d934b8f7de34bc54d46a16a6adb769f71e105fd9b95764f463833eec9b2c48c6, PODMAN_SYSTEMD_UNIT=podman-kube@-etc-containers-ansible\x2dkubernetes.d-httpd3.yml.service, io.buildah.version=1.36.0) Jul 27 12:33:53 ip-10-31-14-141.us-east-1.aws.redhat.com systemd[1]: machine-libpod_pod_d934b8f7de34bc54d46a16a6adb769f71e105fd9b95764f463833eec9b2c48c6.slice: Failed to open /run/systemd/transient/machine-libpod_pod_d934b8f7de34bc54d46a16a6adb769f71e105fd9b95764f463833eec9b2c48c6.slice: No such file or directory Jul 27 12:33:53 ip-10-31-14-141.us-east-1.aws.redhat.com podman[40973]: 2024-07-27 12:33:53.716159594 -0400 EDT m=+10.274151080 pod remove d934b8f7de34bc54d46a16a6adb769f71e105fd9b95764f463833eec9b2c48c6 (image=, name=httpd3) Jul 27 12:33:53 ip-10-31-14-141.us-east-1.aws.redhat.com podman[40973]: 2024-07-27 12:33:53.766680486 -0400 EDT m=+10.324671979 container remove 3d4d558c0f2fa33094c0bd8d5e83414ebd5d14339c39e2ae713129f7e70714f8 (image=localhost/podman-pause:5.1.2-1720656000, name=b46e3d5b107f-service, PODMAN_SYSTEMD_UNIT=podman-kube@-etc-containers-ansible\x2dkubernetes.d-httpd3.yml.service) Jul 27 12:33:53 
ip-10-31-14-141.us-east-1.aws.redhat.com podman[40973]: Pods stopped:
Jul 27 12:33:53 ip-10-31-14-141.us-east-1.aws.redhat.com podman[40973]: d934b8f7de34bc54d46a16a6adb769f71e105fd9b95764f463833eec9b2c48c6
Jul 27 12:33:53 ip-10-31-14-141.us-east-1.aws.redhat.com podman[40973]: Pods removed:
Jul 27 12:33:53 ip-10-31-14-141.us-east-1.aws.redhat.com podman[40973]: Error: removing pod d934b8f7de34bc54d46a16a6adb769f71e105fd9b95764f463833eec9b2c48c6 cgroup: removing pod d934b8f7de34bc54d46a16a6adb769f71e105fd9b95764f463833eec9b2c48c6 cgroup: Unit machine-libpod_pod_d934b8f7de34bc54d46a16a6adb769f71e105fd9b95764f463833eec9b2c48c6.slice not loaded.
Jul 27 12:33:53 ip-10-31-14-141.us-east-1.aws.redhat.com podman[40973]: Secrets removed:
Jul 27 12:33:53 ip-10-31-14-141.us-east-1.aws.redhat.com podman[40973]: Error: %!s()
Jul 27 12:33:53 ip-10-31-14-141.us-east-1.aws.redhat.com podman[40973]: Volumes removed:
Jul 27 12:33:53 ip-10-31-14-141.us-east-1.aws.redhat.com systemd[1]: podman-kube@-etc-containers-ansible\x2dkubernetes.d-httpd3.yml.service: Deactivated successfully.
░░ Subject: Unit succeeded
░░ Defined-By: systemd
░░ Support: https://access.redhat.com/support
░░
░░ The unit podman-kube@-etc-containers-ansible\x2dkubernetes.d-httpd3.yml.service has successfully entered the 'dead' state.
Jul 27 12:33:53 ip-10-31-14-141.us-east-1.aws.redhat.com systemd[1]: Stopped podman-kube@-etc-containers-ansible\x2dkubernetes.d-httpd3.yml.service - A template for running K8s workloads via podman-kube-play.
░░ Subject: A stop job for unit podman-kube@-etc-containers-ansible\x2dkubernetes.d-httpd3.yml.service has finished
░░ Defined-By: systemd
░░ Support: https://access.redhat.com/support
░░
░░ A stop job for unit podman-kube@-etc-containers-ansible\x2dkubernetes.d-httpd3.yml.service has finished.
░░
░░ The job identifier is 2888 and the job result is done.
Jul 27 12:33:54 ip-10-31-14-141.us-east-1.aws.redhat.com python3.12[41146]: ansible-stat Invoked with path=/etc/containers/ansible-kubernetes.d/httpd3.yml follow=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1 Jul 27 12:33:54 ip-10-31-14-141.us-east-1.aws.redhat.com systemd[1]: var-lib-containers-storage-overlay-a431caa463f71f2b50b4afe74a9d2c0f739b4dcc7ff334225aeff25db75bd4f2-merged.mount: Deactivated successfully. ░░ Subject: Unit succeeded ░░ Defined-By: systemd ░░ Support: https://access.redhat.com/support ░░ ░░ The unit var-lib-containers-storage-overlay-a431caa463f71f2b50b4afe74a9d2c0f739b4dcc7ff334225aeff25db75bd4f2-merged.mount has successfully entered the 'dead' state. Jul 27 12:33:54 ip-10-31-14-141.us-east-1.aws.redhat.com systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-3d4d558c0f2fa33094c0bd8d5e83414ebd5d14339c39e2ae713129f7e70714f8-userdata-shm.mount: Deactivated successfully. ░░ Subject: Unit succeeded ░░ Defined-By: systemd ░░ Support: https://access.redhat.com/support ░░ ░░ The unit var-lib-containers-storage-overlay\x2dcontainers-3d4d558c0f2fa33094c0bd8d5e83414ebd5d14339c39e2ae713129f7e70714f8-userdata-shm.mount has successfully entered the 'dead' state. 
Jul 27 12:33:54 ip-10-31-14-141.us-east-1.aws.redhat.com python3.12[41261]: ansible-containers.podman.podman_play Invoked with state=absent kube_file=/etc/containers/ansible-kubernetes.d/httpd3.yml executable=podman annotation=None authfile=None build=None cert_dir=None configmap=None context_dir=None seccomp_profile_root=None username=None password=NOT_LOGGING_PARAMETER log_driver=None log_opt=None network=None tls_verify=None debug=None quiet=None recreate=None userns=None log_level=None quadlet_dir=None quadlet_filename=None quadlet_options=None Jul 27 12:33:54 ip-10-31-14-141.us-east-1.aws.redhat.com python3.12[41261]: ansible-containers.podman.podman_play version: 5.1.2, kube file /etc/containers/ansible-kubernetes.d/httpd3.yml Jul 27 12:33:54 ip-10-31-14-141.us-east-1.aws.redhat.com systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully. ░░ Subject: Unit succeeded ░░ Defined-By: systemd ░░ Support: https://access.redhat.com/support ░░ ░░ The unit var-lib-containers-storage-overlay.mount has successfully entered the 'dead' state. 
Jul 27 12:33:55 ip-10-31-14-141.us-east-1.aws.redhat.com python3.12[41388]: ansible-file Invoked with path=/etc/containers/ansible-kubernetes.d/httpd3.yml state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Jul 27 12:33:55 ip-10-31-14-141.us-east-1.aws.redhat.com python3.12[41501]: ansible-ansible.legacy.command Invoked with _raw_params=podman image prune -f _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None Jul 27 12:33:55 ip-10-31-14-141.us-east-1.aws.redhat.com systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully. ░░ Subject: Unit succeeded ░░ Defined-By: systemd ░░ Support: https://access.redhat.com/support ░░ ░░ The unit var-lib-containers-storage-overlay.mount has successfully entered the 'dead' state. 
Jul 27 12:33:56 ip-10-31-14-141.us-east-1.aws.redhat.com python3.12[41622]: ansible-getent Invoked with database=passwd key=podman_basic_user fail_key=True service=None split=None Jul 27 12:33:57 ip-10-31-14-141.us-east-1.aws.redhat.com python3.12[41736]: ansible-stat Invoked with path=/run/user/3001 follow=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1 Jul 27 12:33:57 ip-10-31-14-141.us-east-1.aws.redhat.com sudo[41887]: root : TTY=pts/0 ; PWD=/root ; USER=podman_basic_user ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ehmxutokkggciktqnztiomxcuvjhjqqh ; XDG_RUNTIME_DIR=/run/user/3001 /usr/bin/python3.12 /var/tmp/ansible-tmp-1722098037.2598891-9584-137277880787959/AnsiballZ_podman_container_info.py' Jul 27 12:33:57 ip-10-31-14-141.us-east-1.aws.redhat.com sudo[41887]: pam_systemd(sudo:session): New sd-bus connection (system-bus-pam-systemd-41887) opened. Jul 27 12:33:57 ip-10-31-14-141.us-east-1.aws.redhat.com sudo[41887]: pam_unix(sudo:session): session opened for user podman_basic_user(uid=3001) by root(uid=0) Jul 27 12:33:57 ip-10-31-14-141.us-east-1.aws.redhat.com python3.12[41890]: ansible-containers.podman.podman_container_info Invoked with executable=podman name=None Jul 27 12:33:57 ip-10-31-14-141.us-east-1.aws.redhat.com systemd[23350]: Started podman-41891.scope. ░░ Subject: A start job for unit UNIT has finished successfully ░░ Defined-By: systemd ░░ Support: https://access.redhat.com/support ░░ ░░ A start job for unit UNIT has finished successfully. ░░ ░░ The job identifier is 158. 
Jul 27 12:33:57 ip-10-31-14-141.us-east-1.aws.redhat.com sudo[41887]: pam_unix(sudo:session): session closed for user podman_basic_user Jul 27 12:33:58 ip-10-31-14-141.us-east-1.aws.redhat.com sudo[42046]: root : TTY=pts/0 ; PWD=/root ; USER=podman_basic_user ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-cyisvsnzepriocmcdxkzdstaaaohxfri ; XDG_RUNTIME_DIR=/run/user/3001 /usr/bin/python3.12 /var/tmp/ansible-tmp-1722098037.8880062-9594-269555214208535/AnsiballZ_command.py' Jul 27 12:33:58 ip-10-31-14-141.us-east-1.aws.redhat.com sudo[42046]: pam_systemd(sudo:session): New sd-bus connection (system-bus-pam-systemd-42046) opened. Jul 27 12:33:58 ip-10-31-14-141.us-east-1.aws.redhat.com sudo[42046]: pam_unix(sudo:session): session opened for user podman_basic_user(uid=3001) by root(uid=0) Jul 27 12:33:58 ip-10-31-14-141.us-east-1.aws.redhat.com python3.12[42049]: ansible-ansible.legacy.command Invoked with _raw_params=podman network ls -q _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None Jul 27 12:33:58 ip-10-31-14-141.us-east-1.aws.redhat.com systemd[23350]: Started podman-42050.scope. ░░ Subject: A start job for unit UNIT has finished successfully ░░ Defined-By: systemd ░░ Support: https://access.redhat.com/support ░░ ░░ A start job for unit UNIT has finished successfully. ░░ ░░ The job identifier is 162. 
Jul 27 12:33:58 ip-10-31-14-141.us-east-1.aws.redhat.com sudo[42046]: pam_unix(sudo:session): session closed for user podman_basic_user Jul 27 12:33:58 ip-10-31-14-141.us-east-1.aws.redhat.com sudo[42205]: root : TTY=pts/0 ; PWD=/root ; USER=podman_basic_user ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xsvavmubqeckhpiwdvzjoqgesprjwhaf ; XDG_RUNTIME_DIR=/run/user/3001 /usr/bin/python3.12 /var/tmp/ansible-tmp-1722098038.411431-9604-37186754175288/AnsiballZ_command.py' Jul 27 12:33:58 ip-10-31-14-141.us-east-1.aws.redhat.com sudo[42205]: pam_systemd(sudo:session): New sd-bus connection (system-bus-pam-systemd-42205) opened. Jul 27 12:33:58 ip-10-31-14-141.us-east-1.aws.redhat.com sudo[42205]: pam_unix(sudo:session): session opened for user podman_basic_user(uid=3001) by root(uid=0) Jul 27 12:33:58 ip-10-31-14-141.us-east-1.aws.redhat.com python3.12[42208]: ansible-ansible.legacy.command Invoked with _raw_params=podman secret ls -n -q _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None Jul 27 12:33:58 ip-10-31-14-141.us-east-1.aws.redhat.com systemd[23350]: Started podman-42209.scope. ░░ Subject: A start job for unit UNIT has finished successfully ░░ Defined-By: systemd ░░ Support: https://access.redhat.com/support ░░ ░░ A start job for unit UNIT has finished successfully. ░░ ░░ The job identifier is 166. 
Jul 27 12:33:58 ip-10-31-14-141.us-east-1.aws.redhat.com sudo[42205]: pam_unix(sudo:session): session closed for user podman_basic_user Jul 27 12:33:59 ip-10-31-14-141.us-east-1.aws.redhat.com python3.12[42329]: ansible-ansible.legacy.command Invoked with removes=/var/lib/systemd/linger/podman_basic_user _raw_params=loginctl disable-linger podman_basic_user _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None stdin=None Jul 27 12:33:59 ip-10-31-14-141.us-east-1.aws.redhat.com sudo[42479]: root : TTY=pts/0 ; PWD=/root ; USER=podman_basic_user ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lpxkatyrnuepulprpiuhwbfyqelyxnqc ; /usr/bin/python3.12 /var/tmp/ansible-tmp-1722098039.4663327-9624-95555905864496/AnsiballZ_command.py' Jul 27 12:33:59 ip-10-31-14-141.us-east-1.aws.redhat.com sudo[42479]: pam_systemd(sudo:session): New sd-bus connection (system-bus-pam-systemd-42479) opened. Jul 27 12:33:59 ip-10-31-14-141.us-east-1.aws.redhat.com sudo[42479]: pam_unix(sudo:session): session opened for user podman_basic_user(uid=3001) by root(uid=0) Jul 27 12:33:59 ip-10-31-14-141.us-east-1.aws.redhat.com python3.12[42482]: ansible-ansible.legacy.command Invoked with _raw_params=podman pod exists httpd1 _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None Jul 27 12:33:59 ip-10-31-14-141.us-east-1.aws.redhat.com systemd[23350]: Started podman-42490.scope. ░░ Subject: A start job for unit UNIT has finished successfully ░░ Defined-By: systemd ░░ Support: https://access.redhat.com/support ░░ ░░ A start job for unit UNIT has finished successfully. ░░ ░░ The job identifier is 170. 
Jul 27 12:33:59 ip-10-31-14-141.us-east-1.aws.redhat.com sudo[42479]: pam_unix(sudo:session): session closed for user podman_basic_user
Jul 27 12:34:00 ip-10-31-14-141.us-east-1.aws.redhat.com python3.12[42610]: ansible-ansible.legacy.command Invoked with _raw_params=podman pod exists httpd2 _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Jul 27 12:34:00 ip-10-31-14-141.us-east-1.aws.redhat.com systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
░░ Subject: Unit succeeded
░░ Defined-By: systemd
░░ Support: https://access.redhat.com/support
░░
░░ The unit var-lib-containers-storage-overlay.mount has successfully entered the 'dead' state.
Jul 27 12:34:00 ip-10-31-14-141.us-east-1.aws.redhat.com python3.12[42731]: ansible-ansible.legacy.command Invoked with _raw_params=podman pod exists httpd3 _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Jul 27 12:34:00 ip-10-31-14-141.us-east-1.aws.redhat.com systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
░░ Subject: Unit succeeded
░░ Defined-By: systemd
░░ Support: https://access.redhat.com/support
░░
░░ The unit var-lib-containers-storage-overlay.mount has successfully entered the 'dead' state.
Jul 27 12:34:01 ip-10-31-14-141.us-east-1.aws.redhat.com sudo[42888]: root : TTY=pts/0 ; PWD=/root ; USER=podman_basic_user ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dsadlykbrfvpashwwscrdqiwopancssi ; XDG_RUNTIME_DIR=/run/user/3001 /usr/bin/python3.12 /var/tmp/ansible-tmp-1722098040.8511503-9648-214609914408569/AnsiballZ_command.py'
Jul 27 12:34:01 ip-10-31-14-141.us-east-1.aws.redhat.com sudo[42888]: pam_systemd(sudo:session): New sd-bus connection (system-bus-pam-systemd-42888) opened.
Jul 27 12:34:01 ip-10-31-14-141.us-east-1.aws.redhat.com sudo[42888]: pam_unix(sudo:session): session opened for user podman_basic_user(uid=3001) by root(uid=0)
Jul 27 12:34:01 ip-10-31-14-141.us-east-1.aws.redhat.com python3.12[42891]: ansible-ansible.legacy.command Invoked with _raw_params=set -euo pipefail systemctl --user list-units -a -l --plain | grep -E '^[ ]*podman-kube@.+-httpd1[.]yml[.]service[ ]+loaded[ ]+active ' _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Jul 27 12:34:01 ip-10-31-14-141.us-east-1.aws.redhat.com sudo[42888]: pam_unix(sudo:session): session closed for user podman_basic_user
Jul 27 12:34:01 ip-10-31-14-141.us-east-1.aws.redhat.com python3.12[43007]: ansible-ansible.legacy.command Invoked with _raw_params=set -euo pipefail systemctl --system list-units -a -l --plain | grep -E '^[ ]*podman-kube@.+-httpd2[.]yml[.]service[ ]+loaded[ ]+active ' _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Jul 27 12:34:01 ip-10-31-14-141.us-east-1.aws.redhat.com python3.12[43123]: ansible-ansible.legacy.command Invoked with _raw_params=set -euo pipefail systemctl --system list-units -a -l --plain | grep -E '^[ ]*podman-kube@.+-httpd3[.]yml[.]service[ ]+loaded[ ]+active ' _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Jul 27 12:34:02 ip-10-31-14-141.us-east-1.aws.redhat.com python3.12[43239]: ansible-stat Invoked with path=/var/lib/systemd/linger/podman_basic_user follow=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Jul 27 12:34:04 ip-10-31-14-141.us-east-1.aws.redhat.com python3.12[43465]: ansible-ansible.legacy.command Invoked with _raw_params=podman --version _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Jul 27 12:34:05 ip-10-31-14-141.us-east-1.aws.redhat.com python3.12[43584]: ansible-getent Invoked with database=passwd key=root fail_key=False service=None split=None
Jul 27 12:34:05 ip-10-31-14-141.us-east-1.aws.redhat.com python3.12[43698]: ansible-getent Invoked with database=group key=0 fail_key=False service=None split=None
Jul 27 12:34:06 ip-10-31-14-141.us-east-1.aws.redhat.com python3.12[43812]: ansible-stat Invoked with path=/usr/bin/getsubids follow=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Jul 27 12:34:08 ip-10-31-14-141.us-east-1.aws.redhat.com python3.12[43927]: ansible-getent Invoked with database=passwd key=podman_basic_user fail_key=False service=None split=None
Jul 27 12:34:09 ip-10-31-14-141.us-east-1.aws.redhat.com python3.12[44041]: ansible-getent Invoked with database=group key=3001 fail_key=False service=None split=None
Jul 27 12:34:09 ip-10-31-14-141.us-east-1.aws.redhat.com python3.12[44155]: ansible-stat Invoked with path=/usr/bin/getsubids follow=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Jul 27 12:34:10 ip-10-31-14-141.us-east-1.aws.redhat.com python3.12[44270]: ansible-ansible.legacy.command Invoked with _raw_params=getsubids podman_basic_user _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Jul 27 12:34:10 ip-10-31-14-141.us-east-1.aws.redhat.com python3.12[44384]: ansible-ansible.legacy.command Invoked with _raw_params=getsubids -g podman_basic_user _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Jul 27 12:34:11 ip-10-31-14-141.us-east-1.aws.redhat.com python3.12[44498]: ansible-ansible.legacy.command Invoked with _raw_params=systemd-escape --template podman-kube@.service /home/podman_basic_user/.config/containers/ansible-kubernetes.d/httpd1.yml _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Jul 27 12:34:12 ip-10-31-14-141.us-east-1.aws.redhat.com python3.12[44612]: ansible-stat Invoked with path=/run/user/3001 follow=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Jul 27 12:34:12 ip-10-31-14-141.us-east-1.aws.redhat.com sudo[44763]: root : TTY=pts/0 ; PWD=/root ; USER=podman_basic_user ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-scdhwojqbvpxnspmbsmzrmsvzozodgxp ; XDG_RUNTIME_DIR=/run/user/3001 /usr/bin/python3.12 /var/tmp/ansible-tmp-1722098052.586484-9840-107139446055076/AnsiballZ_systemd.py'
Jul 27 12:34:12 ip-10-31-14-141.us-east-1.aws.redhat.com sudo[44763]: pam_systemd(sudo:session): New sd-bus connection (system-bus-pam-systemd-44763) opened.
Jul 27 12:34:12 ip-10-31-14-141.us-east-1.aws.redhat.com sudo[44763]: pam_unix(sudo:session): session opened for user podman_basic_user(uid=3001) by root(uid=0)
Jul 27 12:34:13 ip-10-31-14-141.us-east-1.aws.redhat.com python3.12[44766]: ansible-systemd Invoked with name=podman-kube@-home-podman_basic_user-.config-containers-ansible\x2dkubernetes.d-httpd1.yml.service scope=user state=stopped enabled=False daemon_reload=False daemon_reexec=False no_block=False force=None masked=None
Jul 27 12:34:13 ip-10-31-14-141.us-east-1.aws.redhat.com sudo[44763]: pam_unix(sudo:session): session closed for user podman_basic_user
Jul 27 12:34:13 ip-10-31-14-141.us-east-1.aws.redhat.com python3.12[44881]: ansible-stat Invoked with path=/home/podman_basic_user/.config/containers/ansible-kubernetes.d/httpd1.yml follow=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Jul 27 12:34:13 ip-10-31-14-141.us-east-1.aws.redhat.com python3.12[44994]: ansible-file Invoked with path=/home/podman_basic_user/.config/containers/ansible-kubernetes.d/httpd1.yml state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jul 27 12:34:15 ip-10-31-14-141.us-east-1.aws.redhat.com python3.12[45107]: ansible-getent Invoked with database=passwd key=root fail_key=False service=None split=None
Jul 27 12:34:15 ip-10-31-14-141.us-east-1.aws.redhat.com python3.12[45221]: ansible-getent Invoked with database=group key=0 fail_key=False service=None split=None
Jul 27 12:34:16 ip-10-31-14-141.us-east-1.aws.redhat.com python3.12[45335]: ansible-stat Invoked with path=/usr/bin/getsubids follow=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Jul 27 12:34:17 ip-10-31-14-141.us-east-1.aws.redhat.com python3.12[45450]: ansible-ansible.legacy.command Invoked with _raw_params=systemd-escape --template podman-kube@.service /etc/containers/ansible-kubernetes.d/httpd2.yml _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Jul 27 12:34:17 ip-10-31-14-141.us-east-1.aws.redhat.com python3.12[45564]: ansible-systemd Invoked with name=podman-kube@-etc-containers-ansible\x2dkubernetes.d-httpd2.yml.service scope=system state=stopped enabled=False daemon_reload=False daemon_reexec=False no_block=False force=None masked=None
Jul 27 12:34:18 ip-10-31-14-141.us-east-1.aws.redhat.com python3.12[45679]: ansible-stat Invoked with path=/etc/containers/ansible-kubernetes.d/httpd2.yml follow=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Jul 27 12:34:18 ip-10-31-14-141.us-east-1.aws.redhat.com python3.12[45792]: ansible-file Invoked with path=/etc/containers/ansible-kubernetes.d/httpd2.yml state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jul 27 12:34:20 ip-10-31-14-141.us-east-1.aws.redhat.com python3.12[45905]: ansible-getent Invoked with database=group key=0 fail_key=False service=None split=None
Jul 27 12:34:20 ip-10-31-14-141.us-east-1.aws.redhat.com python3.12[46019]: ansible-stat Invoked with path=/usr/bin/getsubids follow=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Jul 27 12:34:21 ip-10-31-14-141.us-east-1.aws.redhat.com python3.12[46134]: ansible-ansible.legacy.command Invoked with _raw_params=systemd-escape --template podman-kube@.service /etc/containers/ansible-kubernetes.d/httpd3.yml _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Jul 27 12:34:22 ip-10-31-14-141.us-east-1.aws.redhat.com python3.12[46248]: ansible-systemd Invoked with name=podman-kube@-etc-containers-ansible\x2dkubernetes.d-httpd3.yml.service scope=system state=stopped enabled=False daemon_reload=False daemon_reexec=False no_block=False force=None masked=None
Jul 27 12:34:22 ip-10-31-14-141.us-east-1.aws.redhat.com python3.12[46363]: ansible-stat Invoked with path=/etc/containers/ansible-kubernetes.d/httpd3.yml follow=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Jul 27 12:34:23 ip-10-31-14-141.us-east-1.aws.redhat.com python3.12[46476]: ansible-file Invoked with path=/etc/containers/ansible-kubernetes.d/httpd3.yml state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jul 27 12:34:24 ip-10-31-14-141.us-east-1.aws.redhat.com python3.12[46589]: ansible-getent Invoked with database=passwd key=podman_basic_user fail_key=True service=None split=None
Jul 27 12:34:24 ip-10-31-14-141.us-east-1.aws.redhat.com python3.12[46703]: ansible-stat Invoked with path=/run/user/3001 follow=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Jul 27 12:34:25 ip-10-31-14-141.us-east-1.aws.redhat.com sudo[46854]: root : TTY=pts/0 ; PWD=/root ; USER=podman_basic_user ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-pekcurrahborqpcffinfbotxxkvxiurm ; XDG_RUNTIME_DIR=/run/user/3001 /usr/bin/python3.12 /var/tmp/ansible-tmp-1722098064.9924238-10055-53014654928287/AnsiballZ_podman_container_info.py'
Jul 27 12:34:25 ip-10-31-14-141.us-east-1.aws.redhat.com sudo[46854]: pam_systemd(sudo:session): New sd-bus connection (system-bus-pam-systemd-46854) opened.
Jul 27 12:34:25 ip-10-31-14-141.us-east-1.aws.redhat.com sudo[46854]: pam_unix(sudo:session): session opened for user podman_basic_user(uid=3001) by root(uid=0)
Jul 27 12:34:25 ip-10-31-14-141.us-east-1.aws.redhat.com python3.12[46857]: ansible-containers.podman.podman_container_info Invoked with executable=podman name=None
Jul 27 12:34:25 ip-10-31-14-141.us-east-1.aws.redhat.com systemd[23350]: Started podman-46858.scope.
░░ Subject: A start job for unit UNIT has finished successfully
░░ Defined-By: systemd
░░ Support: https://access.redhat.com/support
░░
░░ A start job for unit UNIT has finished successfully.
░░
░░ The job identifier is 174.
Jul 27 12:34:25 ip-10-31-14-141.us-east-1.aws.redhat.com sudo[46854]: pam_unix(sudo:session): session closed for user podman_basic_user
Jul 27 12:34:25 ip-10-31-14-141.us-east-1.aws.redhat.com sudo[47013]: root : TTY=pts/0 ; PWD=/root ; USER=podman_basic_user ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gjxwumqfxciapyfxwbxyccogfhdfckzf ; XDG_RUNTIME_DIR=/run/user/3001 /usr/bin/python3.12 /var/tmp/ansible-tmp-1722098065.5211716-10066-257583667683822/AnsiballZ_command.py'
Jul 27 12:34:25 ip-10-31-14-141.us-east-1.aws.redhat.com sudo[47013]: pam_systemd(sudo:session): New sd-bus connection (system-bus-pam-systemd-47013) opened.
Jul 27 12:34:25 ip-10-31-14-141.us-east-1.aws.redhat.com sudo[47013]: pam_unix(sudo:session): session opened for user podman_basic_user(uid=3001) by root(uid=0)
Jul 27 12:34:25 ip-10-31-14-141.us-east-1.aws.redhat.com python3.12[47016]: ansible-ansible.legacy.command Invoked with _raw_params=podman network ls -q _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Jul 27 12:34:25 ip-10-31-14-141.us-east-1.aws.redhat.com systemd[23350]: Started podman-47017.scope.
░░ Subject: A start job for unit UNIT has finished successfully
░░ Defined-By: systemd
░░ Support: https://access.redhat.com/support
░░
░░ A start job for unit UNIT has finished successfully.
░░
░░ The job identifier is 178.
Jul 27 12:34:25 ip-10-31-14-141.us-east-1.aws.redhat.com sudo[47013]: pam_unix(sudo:session): session closed for user podman_basic_user
Jul 27 12:34:26 ip-10-31-14-141.us-east-1.aws.redhat.com sudo[47173]: root : TTY=pts/0 ; PWD=/root ; USER=podman_basic_user ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-izwtnhgfshrocejzipvlsvggqjlelypr ; XDG_RUNTIME_DIR=/run/user/3001 /usr/bin/python3.12 /var/tmp/ansible-tmp-1722098066.0516913-10076-240045495406512/AnsiballZ_command.py'
Jul 27 12:34:26 ip-10-31-14-141.us-east-1.aws.redhat.com sudo[47173]: pam_systemd(sudo:session): New sd-bus connection (system-bus-pam-systemd-47173) opened.
Jul 27 12:34:26 ip-10-31-14-141.us-east-1.aws.redhat.com sudo[47173]: pam_unix(sudo:session): session opened for user podman_basic_user(uid=3001) by root(uid=0)
Jul 27 12:34:26 ip-10-31-14-141.us-east-1.aws.redhat.com python3.12[47176]: ansible-ansible.legacy.command Invoked with _raw_params=podman secret ls -n -q _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Jul 27 12:34:26 ip-10-31-14-141.us-east-1.aws.redhat.com systemd[23350]: Started podman-47177.scope.
░░ Subject: A start job for unit UNIT has finished successfully
░░ Defined-By: systemd
░░ Support: https://access.redhat.com/support
░░
░░ A start job for unit UNIT has finished successfully.
░░
░░ The job identifier is 182.
Jul 27 12:34:26 ip-10-31-14-141.us-east-1.aws.redhat.com sudo[47173]: pam_unix(sudo:session): session closed for user podman_basic_user
Jul 27 12:34:26 ip-10-31-14-141.us-east-1.aws.redhat.com python3.12[47297]: ansible-ansible.legacy.command Invoked with removes=/var/lib/systemd/linger/podman_basic_user _raw_params=loginctl disable-linger podman_basic_user _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None stdin=None
Jul 27 12:34:27 ip-10-31-14-141.us-east-1.aws.redhat.com python3.12[47410]: ansible-file Invoked with path=/etc/containers/storage.conf state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jul 27 12:34:27 ip-10-31-14-141.us-east-1.aws.redhat.com python3.12[47523]: ansible-file Invoked with path=/tmp/lsr_ykmfaaw1_podman state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jul 27 12:34:31 ip-10-31-14-141.us-east-1.aws.redhat.com python3.12[47673]: ansible-ansible.legacy.setup Invoked with gather_subset=['all'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Jul 27 12:34:32 ip-10-31-14-141.us-east-1.aws.redhat.com python3.12[47815]: ansible-stat Invoked with path=/run/ostree-booted follow=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Jul 27 12:34:33 ip-10-31-14-141.us-east-1.aws.redhat.com python3.12[48041]: ansible-ansible.legacy.command Invoked with _raw_params=podman --version _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Jul 27 12:34:34 ip-10-31-14-141.us-east-1.aws.redhat.com python3.12[48160]: ansible-getent Invoked with database=passwd key=root fail_key=False service=None split=None
Jul 27 12:34:34 ip-10-31-14-141.us-east-1.aws.redhat.com python3.12[48274]: ansible-getent Invoked with database=group key=0 fail_key=False service=None split=None
Jul 27 12:34:35 ip-10-31-14-141.us-east-1.aws.redhat.com python3.12[48388]: ansible-stat Invoked with path=/usr/bin/getsubids follow=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Jul 27 12:34:37 ip-10-31-14-141.us-east-1.aws.redhat.com python3.12[48503]: ansible-tempfile Invoked with state=directory prefix=lsr_podman_config_ suffix= path=None
Jul 27 12:34:37 ip-10-31-14-141.us-east-1.aws.redhat.com python3.12[48616]: ansible-ansible.legacy.command Invoked with _raw_params=tar --ignore-failed-read -c -P -v -p -f /tmp/lsr_podman_config_w171idou/backup.tar /etc/containers/containers.conf.d/50-systemroles.conf /etc/containers/registries.conf.d/50-systemroles.conf /etc/containers/storage.conf /etc/containers/policy.json _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Jul 27 12:34:38 ip-10-31-14-141.us-east-1.aws.redhat.com python3.12[48730]: ansible-user Invoked with name=user1 state=present non_unique=False force=False remove=False create_home=True system=False move_home=False append=False ssh_key_bits=0 ssh_key_type=rsa ssh_key_comment=ansible-generated on ip-10-31-14-141.us-east-1.aws.redhat.com update_password=always uid=None group=None groups=None comment=None home=None shell=None password=NOT_LOGGING_PARAMETER login_class=None password_expire_max=None password_expire_min=None password_expire_warn=None hidden=None seuser=None skeleton=None generate_ssh_key=None ssh_key_file=None ssh_key_passphrase=NOT_LOGGING_PARAMETER expires=None password_lock=None local=None profile=None authorization=None role=None umask=None
Jul 27 12:34:38 ip-10-31-14-141.us-east-1.aws.redhat.com useradd[48732]: new group: name=user1, GID=3002
Jul 27 12:34:38 ip-10-31-14-141.us-east-1.aws.redhat.com useradd[48732]: new user: name=user1, UID=3002, GID=3002, home=/home/user1, shell=/bin/bash, from=/dev/pts/0
Jul 27 12:34:38 ip-10-31-14-141.us-east-1.aws.redhat.com rsyslogd[613]: imjournal: journal files changed, reloading... [v8.2312.0-2.el10 try https://www.rsyslog.com/e/0 ]
Jul 27 12:34:40 ip-10-31-14-141.us-east-1.aws.redhat.com python3.12[48963]: ansible-ansible.legacy.command Invoked with _raw_params=podman --version _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Jul 27 12:34:40 ip-10-31-14-141.us-east-1.aws.redhat.com python3.12[49082]: ansible-getent Invoked with database=passwd key=user1 fail_key=False service=None split=None
Jul 27 12:34:41 ip-10-31-14-141.us-east-1.aws.redhat.com python3.12[49196]: ansible-getent Invoked with database=group key=3002 fail_key=False service=None split=None
Jul 27 12:34:41 ip-10-31-14-141.us-east-1.aws.redhat.com python3.12[49310]: ansible-stat Invoked with path=/usr/bin/getsubids follow=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Jul 27 12:34:41 ip-10-31-14-141.us-east-1.aws.redhat.com python3.12[49425]: ansible-ansible.legacy.command Invoked with _raw_params=getsubids user1 _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Jul 27 12:34:42 ip-10-31-14-141.us-east-1.aws.redhat.com python3.12[49539]: ansible-ansible.legacy.command Invoked with _raw_params=getsubids -g user1 _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Jul 27 12:34:43 ip-10-31-14-141.us-east-1.aws.redhat.com python3.12[49653]: ansible-file Invoked with path=/home/user1/.config/containers/containers.conf.d state=directory owner=user1 mode=0755 recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jul 27 12:34:43 ip-10-31-14-141.us-east-1.aws.redhat.com python3.12[49766]: ansible-ansible.legacy.stat Invoked with path=/home/user1/.config/containers/containers.conf.d/50-systemroles.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Jul 27 12:34:44 ip-10-31-14-141.us-east-1.aws.redhat.com python3.12[49856]: ansible-ansible.legacy.copy Invoked with src=/root/.ansible/tmp/ansible-tmp-1722098083.4056299-10394-219612045307049/.source.conf dest=/home/user1/.config/containers/containers.conf.d/50-systemroles.conf owner=user1 mode=0644 follow=False _original_basename=toml.j2 checksum=94370d6e765779f1c58daf02f667b8f0b74d91f6 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jul 27 12:34:44 ip-10-31-14-141.us-east-1.aws.redhat.com python3.12[49969]: ansible-file Invoked with path=/home/user1/.config/containers/registries.conf.d state=directory owner=user1 mode=0755 recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jul 27 12:34:45 ip-10-31-14-141.us-east-1.aws.redhat.com python3.12[50082]: ansible-ansible.legacy.stat Invoked with path=/home/user1/.config/containers/registries.conf.d/50-systemroles.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Jul 27 12:34:45 ip-10-31-14-141.us-east-1.aws.redhat.com python3.12[50172]: ansible-ansible.legacy.copy Invoked with src=/root/.ansible/tmp/ansible-tmp-1722098084.747849-10418-146732810650340/.source.conf dest=/home/user1/.config/containers/registries.conf.d/50-systemroles.conf owner=user1 mode=0644 follow=False _original_basename=toml.j2 checksum=dfb9cd7094a81b3d1bb06512cc9b49a09c75639b backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jul 27 12:34:45 ip-10-31-14-141.us-east-1.aws.redhat.com python3.12[50285]: ansible-file Invoked with path=/home/user1/.config/containers state=directory owner=user1 mode=0755 recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jul 27 12:34:46 ip-10-31-14-141.us-east-1.aws.redhat.com python3.12[50398]: ansible-ansible.legacy.stat Invoked with path=/home/user1/.config/containers/storage.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Jul 27 12:34:46 ip-10-31-14-141.us-east-1.aws.redhat.com python3.12[50488]: ansible-ansible.legacy.copy Invoked with src=/root/.ansible/tmp/ansible-tmp-1722098085.95819-10442-131737842277408/.source.conf dest=/home/user1/.config/containers/storage.conf owner=user1 mode=0644 follow=False _original_basename=toml.j2 checksum=d08574b6a1df63dbe1c939ff0bcc7c0b61d03044 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jul 27 12:34:47 ip-10-31-14-141.us-east-1.aws.redhat.com python3.12[50601]: ansible-file Invoked with path=/home/user1/.config/containers state=directory owner=user1 mode=0755 recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jul 27 12:34:47 ip-10-31-14-141.us-east-1.aws.redhat.com python3.12[50714]: ansible-stat Invoked with path=/home/user1/.config/containers/policy.json follow=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Jul 27 12:34:47 ip-10-31-14-141.us-east-1.aws.redhat.com python3.12[50827]: ansible-ansible.legacy.stat Invoked with path=/home/user1/.config/containers/policy.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Jul 27 12:34:48 ip-10-31-14-141.us-east-1.aws.redhat.com python3.12[50917]: ansible-ansible.legacy.copy Invoked with dest=/home/user1/.config/containers/policy.json owner=user1 mode=0644 src=/root/.ansible/tmp/ansible-tmp-1722098087.5861802-10475-262484125048733/.source.json _original_basename=.82v8i689 follow=False checksum=6746c079ad563b735fc39f73d4876654b80b0a0d backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jul 27 12:34:48 ip-10-31-14-141.us-east-1.aws.redhat.com python3.12[51030]: ansible-stat Invoked with path=/home/user1/.config/containers/containers.conf.d/50-systemroles.conf follow=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Jul 27 12:34:49 ip-10-31-14-141.us-east-1.aws.redhat.com python3.12[51145]: ansible-stat Invoked with path=/home/user1/.config/containers/registries.conf.d/50-systemroles.conf follow=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Jul 27 12:34:49 ip-10-31-14-141.us-east-1.aws.redhat.com python3.12[51260]: ansible-stat Invoked with path=/home/user1/.config/containers/storage.conf follow=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Jul 27 12:34:50 ip-10-31-14-141.us-east-1.aws.redhat.com python3.12[51375]: ansible-stat Invoked with path=/home/user1/.config/containers/policy.json follow=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Jul 27 12:34:51 ip-10-31-14-141.us-east-1.aws.redhat.com python3.12[51603]: ansible-ansible.legacy.command Invoked with _raw_params=podman --version _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Jul 27 12:34:52 ip-10-31-14-141.us-east-1.aws.redhat.com python3.12[51722]: ansible-getent Invoked with database=group key=3002 fail_key=False service=None split=None
Jul 27 12:34:52 ip-10-31-14-141.us-east-1.aws.redhat.com python3.12[51836]: ansible-stat Invoked with path=/usr/bin/getsubids follow=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Jul 27 12:34:53 ip-10-31-14-141.us-east-1.aws.redhat.com python3.12[51951]: ansible-ansible.legacy.command Invoked with _raw_params=getsubids user1 _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Jul 27 12:34:53 ip-10-31-14-141.us-east-1.aws.redhat.com python3.12[52065]: ansible-ansible.legacy.command Invoked with _raw_params=getsubids -g user1 _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Jul 27 12:34:54 ip-10-31-14-141.us-east-1.aws.redhat.com python3.12[52179]: ansible-file Invoked with path=/home/user1/.config/containers/containers.conf.d state=directory owner=user1 mode=0755 recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jul 27 12:34:54 ip-10-31-14-141.us-east-1.aws.redhat.com python3.12[52292]: ansible-ansible.legacy.stat Invoked with path=/home/user1/.config/containers/containers.conf.d/50-systemroles.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Jul 27 12:34:55 ip-10-31-14-141.us-east-1.aws.redhat.com python3.12[52349]: ansible-ansible.legacy.file Invoked with owner=user1 mode=0644 dest=/home/user1/.config/containers/containers.conf.d/50-systemroles.conf _original_basename=toml.j2 recurse=False state=file path=/home/user1/.config/containers/containers.conf.d/50-systemroles.conf force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jul 27 12:34:55 ip-10-31-14-141.us-east-1.aws.redhat.com python3.12[52462]: ansible-file Invoked with path=/home/user1/.config/containers/registries.conf.d state=directory owner=user1 mode=0755 recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jul 27 12:34:56 ip-10-31-14-141.us-east-1.aws.redhat.com python3.12[52575]: ansible-ansible.legacy.stat Invoked with path=/home/user1/.config/containers/registries.conf.d/50-systemroles.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Jul 27 12:34:56 ip-10-31-14-141.us-east-1.aws.redhat.com python3.12[52632]: ansible-ansible.legacy.file Invoked with owner=user1 mode=0644 dest=/home/user1/.config/containers/registries.conf.d/50-systemroles.conf _original_basename=toml.j2 recurse=False state=file path=/home/user1/.config/containers/registries.conf.d/50-systemroles.conf force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jul 27 12:34:56 ip-10-31-14-141.us-east-1.aws.redhat.com python3.12[52745]: ansible-file Invoked with path=/home/user1/.config/containers state=directory owner=user1 mode=0755 recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jul 27 12:34:57 ip-10-31-14-141.us-east-1.aws.redhat.com python3.12[52858]: ansible-ansible.legacy.stat Invoked with path=/home/user1/.config/containers/storage.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Jul 27 12:34:57 ip-10-31-14-141.us-east-1.aws.redhat.com python3.12[52915]: ansible-ansible.legacy.file Invoked with owner=user1 mode=0644 dest=/home/user1/.config/containers/storage.conf _original_basename=toml.j2 recurse=False state=file path=/home/user1/.config/containers/storage.conf force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jul 27 12:34:58 ip-10-31-14-141.us-east-1.aws.redhat.com python3.12[53028]: ansible-file Invoked with path=/home/user1/.config/containers state=directory owner=user1 mode=0755 recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jul 27 12:34:58 ip-10-31-14-141.us-east-1.aws.redhat.com python3.12[53141]: ansible-stat Invoked with path=/home/user1/.config/containers/policy.json follow=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Jul 27 12:34:59 ip-10-31-14-141.us-east-1.aws.redhat.com python3.12[53256]: ansible-slurp Invoked with path=/home/user1/.config/containers/policy.json src=/home/user1/.config/containers/policy.json
Jul 27 12:34:59 ip-10-31-14-141.us-east-1.aws.redhat.com python3.12[53369]: ansible-stat Invoked with path=/home/user1/.config/containers/containers.conf.d/50-systemroles.conf follow=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Jul 27 12:35:00 ip-10-31-14-141.us-east-1.aws.redhat.com python3.12[53484]: ansible-stat Invoked with path=/home/user1/.config/containers/registries.conf.d/50-systemroles.conf follow=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Jul 27 12:35:00 ip-10-31-14-141.us-east-1.aws.redhat.com python3.12[53599]: ansible-stat Invoked with path=/home/user1/.config/containers/storage.conf follow=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Jul 27 12:35:00 ip-10-31-14-141.us-east-1.aws.redhat.com python3.12[53714]: ansible-stat Invoked with path=/home/user1/.config/containers/policy.json follow=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Jul 27 12:35:02 ip-10-31-14-141.us-east-1.aws.redhat.com python3.12[53942]: ansible-ansible.legacy.command Invoked with _raw_params=podman --version _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Jul 27 12:35:03
ip-10-31-14-141.us-east-1.aws.redhat.com python3.12[54061]: ansible-getent Invoked with database=passwd key=root fail_key=False service=None split=None
Jul 27 12:35:03 ip-10-31-14-141.us-east-1.aws.redhat.com python3.12[54175]: ansible-getent Invoked with database=group key=0 fail_key=False service=None split=None
Jul 27 12:35:04 ip-10-31-14-141.us-east-1.aws.redhat.com python3.12[54289]: ansible-stat Invoked with path=/usr/bin/getsubids follow=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Jul 27 12:35:05 ip-10-31-14-141.us-east-1.aws.redhat.com python3.12[54404]: ansible-file Invoked with path=/etc/containers/containers.conf.d state=directory owner=root mode=0755 recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jul 27 12:35:05 ip-10-31-14-141.us-east-1.aws.redhat.com python3.12[54517]: ansible-ansible.legacy.stat Invoked with path=/etc/containers/containers.conf.d/50-systemroles.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Jul 27 12:35:05 ip-10-31-14-141.us-east-1.aws.redhat.com python3.12[54607]: ansible-ansible.legacy.copy Invoked with src=/root/.ansible/tmp/ansible-tmp-1722098105.21415-10803-193350502916514/.source.conf dest=/etc/containers/containers.conf.d/50-systemroles.conf owner=root mode=0644 follow=False _original_basename=toml.j2 checksum=94370d6e765779f1c58daf02f667b8f0b74d91f6 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jul 27 12:35:06 ip-10-31-14-141.us-east-1.aws.redhat.com python3.12[54720]: ansible-file Invoked with path=/etc/containers/registries.conf.d state=directory owner=root mode=0755 recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jul 27 12:35:06 ip-10-31-14-141.us-east-1.aws.redhat.com python3.12[54833]: ansible-ansible.legacy.stat Invoked with path=/etc/containers/registries.conf.d/50-systemroles.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Jul 27 12:35:07 ip-10-31-14-141.us-east-1.aws.redhat.com python3.12[54923]: ansible-ansible.legacy.copy Invoked with src=/root/.ansible/tmp/ansible-tmp-1722098106.4464886-10827-228693572255732/.source.conf dest=/etc/containers/registries.conf.d/50-systemroles.conf owner=root mode=0644 follow=False _original_basename=toml.j2 checksum=dfb9cd7094a81b3d1bb06512cc9b49a09c75639b backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jul 27 12:35:07 ip-10-31-14-141.us-east-1.aws.redhat.com python3.12[55036]: ansible-file Invoked with path=/etc/containers state=directory owner=root mode=0755 recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jul 27 12:35:07 ip-10-31-14-141.us-east-1.aws.redhat.com python3.12[55149]: ansible-ansible.legacy.stat Invoked with path=/etc/containers/storage.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Jul 27 12:35:08 ip-10-31-14-141.us-east-1.aws.redhat.com python3.12[55239]: ansible-ansible.legacy.copy Invoked with src=/root/.ansible/tmp/ansible-tmp-1722098107.6736925-10851-11656137412744/.source.conf dest=/etc/containers/storage.conf owner=root mode=0644 follow=False _original_basename=toml.j2 checksum=d08574b6a1df63dbe1c939ff0bcc7c0b61d03044 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jul 27 12:35:08 ip-10-31-14-141.us-east-1.aws.redhat.com python3.12[55352]: ansible-file Invoked with path=/etc/containers state=directory owner=root mode=0755 recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jul 27 12:35:09 ip-10-31-14-141.us-east-1.aws.redhat.com python3.12[55465]: ansible-stat Invoked with path=/etc/containers/policy.json follow=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Jul 27 12:35:09 ip-10-31-14-141.us-east-1.aws.redhat.com python3.12[55580]: ansible-slurp Invoked with path=/etc/containers/policy.json src=/etc/containers/policy.json
Jul 27 12:35:10 ip-10-31-14-141.us-east-1.aws.redhat.com python3.12[55693]: ansible-ansible.legacy.stat Invoked with path=/etc/containers/policy.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Jul 27 12:35:10 ip-10-31-14-141.us-east-1.aws.redhat.com python3.12[55785]: ansible-ansible.legacy.copy Invoked with dest=/etc/containers/policy.json owner=root mode=0644 src=/root/.ansible/tmp/ansible-tmp-1722098109.7571483-10891-203787733281543/.source.json _original_basename=.l1nrv80h follow=False checksum=6746c079ad563b735fc39f73d4876654b80b0a0d backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jul 27 12:35:11 ip-10-31-14-141.us-east-1.aws.redhat.com python3.12[55898]: ansible-stat Invoked with path=/etc/containers/containers.conf.d/50-systemroles.conf follow=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Jul 27 12:35:11 ip-10-31-14-141.us-east-1.aws.redhat.com python3.12[56013]: ansible-stat Invoked with path=/etc/containers/registries.conf.d/50-systemroles.conf follow=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Jul 27 12:35:11 ip-10-31-14-141.us-east-1.aws.redhat.com python3.12[56128]: ansible-stat Invoked with path=/etc/containers/storage.conf follow=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Jul 27 12:35:12 ip-10-31-14-141.us-east-1.aws.redhat.com python3.12[56243]: ansible-stat Invoked with path=/etc/containers/policy.json follow=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Jul 27 12:35:13 ip-10-31-14-141.us-east-1.aws.redhat.com python3.12[56471]: ansible-ansible.legacy.command Invoked with _raw_params=podman --version _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Jul 27 12:35:14 ip-10-31-14-141.us-east-1.aws.redhat.com python3.12[56590]: ansible-getent Invoked with database=group key=0 fail_key=False service=None split=None
Jul 27 12:35:15 ip-10-31-14-141.us-east-1.aws.redhat.com python3.12[56704]: ansible-stat Invoked with path=/usr/bin/getsubids follow=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Jul 27 12:35:16 ip-10-31-14-141.us-east-1.aws.redhat.com python3.12[56819]: ansible-file
Invoked with path=/etc/containers/containers.conf.d state=directory owner=root mode=0755 recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jul 27 12:35:16 ip-10-31-14-141.us-east-1.aws.redhat.com python3.12[56932]: ansible-ansible.legacy.stat Invoked with path=/etc/containers/containers.conf.d/50-systemroles.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Jul 27 12:35:16 ip-10-31-14-141.us-east-1.aws.redhat.com python3.12[56989]: ansible-ansible.legacy.file Invoked with owner=root mode=0644 dest=/etc/containers/containers.conf.d/50-systemroles.conf _original_basename=toml.j2 recurse=False state=file path=/etc/containers/containers.conf.d/50-systemroles.conf force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jul 27 12:35:17 ip-10-31-14-141.us-east-1.aws.redhat.com python3.12[57102]: ansible-file Invoked with path=/etc/containers/registries.conf.d state=directory owner=root mode=0755 recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jul 27 12:35:17 ip-10-31-14-141.us-east-1.aws.redhat.com python3.12[57215]: ansible-ansible.legacy.stat Invoked with path=/etc/containers/registries.conf.d/50-systemroles.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Jul 27 12:35:17 ip-10-31-14-141.us-east-1.aws.redhat.com python3.12[57272]: ansible-ansible.legacy.file Invoked with owner=root mode=0644 dest=/etc/containers/registries.conf.d/50-systemroles.conf _original_basename=toml.j2 recurse=False state=file path=/etc/containers/registries.conf.d/50-systemroles.conf force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jul 27 12:35:18 ip-10-31-14-141.us-east-1.aws.redhat.com python3.12[57385]: ansible-file Invoked with path=/etc/containers state=directory owner=root mode=0755 recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jul 27 12:35:18 ip-10-31-14-141.us-east-1.aws.redhat.com python3.12[57498]: ansible-ansible.legacy.stat Invoked with path=/etc/containers/storage.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Jul 27 12:35:19 ip-10-31-14-141.us-east-1.aws.redhat.com python3.12[57555]: ansible-ansible.legacy.file Invoked with owner=root mode=0644 dest=/etc/containers/storage.conf _original_basename=toml.j2 recurse=False state=file path=/etc/containers/storage.conf force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jul 27 12:35:19 ip-10-31-14-141.us-east-1.aws.redhat.com python3.12[57668]: ansible-file Invoked with path=/etc/containers state=directory owner=root mode=0755 recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jul 27 12:35:20 ip-10-31-14-141.us-east-1.aws.redhat.com python3.12[57781]: ansible-stat Invoked with path=/etc/containers/policy.json follow=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Jul 27 12:35:20 ip-10-31-14-141.us-east-1.aws.redhat.com python3.12[57896]: ansible-slurp Invoked with path=/etc/containers/policy.json src=/etc/containers/policy.json
Jul 27 12:35:21 ip-10-31-14-141.us-east-1.aws.redhat.com python3.12[58009]: ansible-stat Invoked with path=/etc/containers/containers.conf.d/50-systemroles.conf follow=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Jul 27 12:35:21 ip-10-31-14-141.us-east-1.aws.redhat.com python3.12[58124]: ansible-stat Invoked with path=/etc/containers/registries.conf.d/50-systemroles.conf follow=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Jul 27 12:35:21 ip-10-31-14-141.us-east-1.aws.redhat.com python3.12[58239]: ansible-stat Invoked with path=/etc/containers/storage.conf follow=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Jul 27 12:35:22 ip-10-31-14-141.us-east-1.aws.redhat.com python3.12[58354]: ansible-stat Invoked with path=/etc/containers/policy.json follow=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Jul 27 12:35:22 ip-10-31-14-141.us-east-1.aws.redhat.com python3.12[58469]: ansible-slurp Invoked with path=/home/user1/.config/containers/containers.conf.d/50-systemroles.conf src=/home/user1/.config/containers/containers.conf.d/50-systemroles.conf
Jul 27 12:35:23 ip-10-31-14-141.us-east-1.aws.redhat.com python3.12[58582]: ansible-slurp Invoked with path=/home/user1/.config/containers/registries.conf.d/50-systemroles.conf src=/home/user1/.config/containers/registries.conf.d/50-systemroles.conf
Jul 27 12:35:23 ip-10-31-14-141.us-east-1.aws.redhat.com python3.12[58695]: ansible-slurp Invoked with path=/home/user1/.config/containers/storage.conf src=/home/user1/.config/containers/storage.conf
Jul 27 12:35:24 ip-10-31-14-141.us-east-1.aws.redhat.com python3.12[58808]: ansible-slurp Invoked with path=/etc/containers/containers.conf.d/50-systemroles.conf src=/etc/containers/containers.conf.d/50-systemroles.conf
Jul 27 12:35:24 ip-10-31-14-141.us-east-1.aws.redhat.com python3.12[58921]: ansible-slurp Invoked with path=/etc/containers/registries.conf.d/50-systemroles.conf src=/etc/containers/registries.conf.d/50-systemroles.conf
Jul 27 12:35:25 ip-10-31-14-141.us-east-1.aws.redhat.com python3.12[59034]: ansible-slurp Invoked with path=/etc/containers/storage.conf src=/etc/containers/storage.conf
Jul 27 12:35:25 ip-10-31-14-141.us-east-1.aws.redhat.com python3.12[59147]: ansible-file Invoked with state=absent path=/etc/containers/containers.conf.d/50-systemroles.conf recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jul 27 12:35:26 ip-10-31-14-141.us-east-1.aws.redhat.com python3.12[59260]: ansible-file Invoked with state=absent path=/etc/containers/registries.conf.d/50-systemroles.conf recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jul 27 12:35:26 ip-10-31-14-141.us-east-1.aws.redhat.com
python3.12[59373]: ansible-file Invoked with state=absent path=/etc/containers/storage.conf recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jul 27 12:35:26 ip-10-31-14-141.us-east-1.aws.redhat.com python3.12[59486]: ansible-file Invoked with state=absent path=/etc/containers/policy.json recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jul 27 12:35:27 ip-10-31-14-141.us-east-1.aws.redhat.com python3.12[59599]: ansible-file Invoked with state=absent path=/home/user1/.config/containers/containers.conf.d/50-systemroles.conf recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jul 27 12:35:27 ip-10-31-14-141.us-east-1.aws.redhat.com python3.12[59712]: ansible-file Invoked with state=absent path=/home/user1/.config/containers/registries.conf.d/50-systemroles.conf recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jul 27 12:35:27 ip-10-31-14-141.us-east-1.aws.redhat.com python3.12[59825]: ansible-file Invoked with state=absent path=/home/user1/.config/containers/storage.conf recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jul 27 12:35:28 ip-10-31-14-141.us-east-1.aws.redhat.com python3.12[59938]: ansible-file Invoked with state=absent path=/home/user1/.config/containers/policy.json recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jul 27 12:35:28 ip-10-31-14-141.us-east-1.aws.redhat.com python3.12[60051]: ansible-ansible.legacy.command Invoked with _raw_params=tar xfvpP /tmp/lsr_podman_config_w171idou/backup.tar _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Jul 27 12:35:29 ip-10-31-14-141.us-east-1.aws.redhat.com python3.12[60165]: ansible-file Invoked with state=absent path=/tmp/lsr_podman_config_w171idou recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jul 27 12:35:31 ip-10-31-14-141.us-east-1.aws.redhat.com python3.12[60315]: ansible-setup Invoked with gather_subset=['!all', '!min', 'distribution', 'distribution_major_version', 'distribution_version', 'os_family'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Jul 27 12:35:32 ip-10-31-14-141.us-east-1.aws.redhat.com python3.12[60430]: ansible-stat Invoked with path=/run/ostree-booted follow=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Jul 27 12:35:34 ip-10-31-14-141.us-east-1.aws.redhat.com python3.12[60656]: ansible-ansible.legacy.command Invoked with _raw_params=podman --version _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Jul 27 12:35:34 ip-10-31-14-141.us-east-1.aws.redhat.com python3.12[60775]: ansible-getent Invoked with database=passwd key=root fail_key=False service=None split=None
Jul 27 12:35:35 ip-10-31-14-141.us-east-1.aws.redhat.com python3.12[60889]: ansible-getent Invoked with database=group key=0 fail_key=False service=None split=None
Jul 27 12:35:35 ip-10-31-14-141.us-east-1.aws.redhat.com python3.12[61003]: ansible-stat Invoked with path=/usr/bin/getsubids follow=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Jul 27 12:35:39 ip-10-31-14-141.us-east-1.aws.redhat.com python3.12[61155]: ansible-ansible.legacy.setup Invoked with gather_subset=['all'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Jul 27 12:35:42 ip-10-31-14-141.us-east-1.aws.redhat.com python3.12[61297]: ansible-stat Invoked with path=/run/ostree-booted follow=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Jul 27 12:35:43 ip-10-31-14-141.us-east-1.aws.redhat.com python3.12[61523]: ansible-ansible.legacy.command Invoked with _raw_params=podman --version _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Jul 27 12:35:44 ip-10-31-14-141.us-east-1.aws.redhat.com python3.12[61642]: ansible-getent Invoked with database=passwd key=root fail_key=False service=None split=None
Jul 27 12:35:45 ip-10-31-14-141.us-east-1.aws.redhat.com python3.12[61756]: ansible-getent Invoked with database=group key=0 fail_key=False service=None split=None
Jul 27 12:35:45 ip-10-31-14-141.us-east-1.aws.redhat.com python3.12[61870]: ansible-stat Invoked with path=/usr/bin/getsubids follow=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Jul 27 12:35:50 ip-10-31-14-141.us-east-1.aws.redhat.com python3.12[62022]: ansible-ansible.legacy.setup Invoked with gather_subset=['all'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Jul 27 12:35:51 ip-10-31-14-141.us-east-1.aws.redhat.com python3.12[62164]: ansible-stat Invoked with path=/run/ostree-booted follow=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Jul 27 12:35:53 ip-10-31-14-141.us-east-1.aws.redhat.com python3.12[62390]: ansible-ansible.legacy.command Invoked with _raw_params=podman --version _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Jul 27 12:35:53 ip-10-31-14-141.us-east-1.aws.redhat.com python3.12[62509]: ansible-getent Invoked with database=passwd key=root fail_key=False service=None split=None
Jul 27 12:35:54 ip-10-31-14-141.us-east-1.aws.redhat.com python3.12[62623]: ansible-getent Invoked with database=group key=0 fail_key=False service=None split=None
Jul 27 12:35:54 ip-10-31-14-141.us-east-1.aws.redhat.com python3.12[62737]: ansible-stat Invoked with path=/usr/bin/getsubids follow=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Jul 27 12:35:56 ip-10-31-14-141.us-east-1.aws.redhat.com python3.12[62852]: ansible-getent Invoked with database=group key=0 fail_key=False service=None split=None
Jul 27 12:35:57 ip-10-31-14-141.us-east-1.aws.redhat.com python3.12[62966]: ansible-stat Invoked with path=/usr/bin/getsubids follow=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Jul 27 12:35:58
ip-10-31-14-141.us-east-1.aws.redhat.com python3.12[63081]: ansible-file Invoked with path=/etc/containers/systemd state=directory owner=root group=0 mode=0755 recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jul 27 12:35:59 ip-10-31-14-141.us-east-1.aws.redhat.com python3.12[63194]: ansible-ansible.legacy.stat Invoked with path=/etc/containers/systemd/nopull.container follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Jul 27 12:35:59 ip-10-31-14-141.us-east-1.aws.redhat.com python3.12[63284]: ansible-ansible.legacy.copy Invoked with src=/root/.ansible/tmp/ansible-tmp-1722098158.794963-11918-214056238671650/.source.container dest=/etc/containers/systemd/nopull.container owner=root group=0 mode=0644 follow=False _original_basename=systemd.j2 checksum=670d64fc68a9768edb20cad26df2acc703542d85 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jul 27 12:36:01 ip-10-31-14-141.us-east-1.aws.redhat.com python3.12[63510]: ansible-ansible.legacy.command Invoked with _raw_params=podman --version _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Jul 27 12:36:02 ip-10-31-14-141.us-east-1.aws.redhat.com python3.12[63629]: ansible-getent Invoked with database=group key=0 fail_key=False service=None split=None
Jul 27 12:36:02 ip-10-31-14-141.us-east-1.aws.redhat.com python3.12[63743]: ansible-stat Invoked with path=/usr/bin/getsubids follow=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Jul 27 12:36:04 ip-10-31-14-141.us-east-1.aws.redhat.com python3.12[63858]: ansible-getent Invoked with database=group key=0 fail_key=False service=None split=None
Jul 27 12:36:04 ip-10-31-14-141.us-east-1.aws.redhat.com python3.12[63972]: ansible-stat Invoked with path=/usr/bin/getsubids follow=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Jul 27 12:36:06 ip-10-31-14-141.us-east-1.aws.redhat.com systemd[23350]: Starting grub-boot-success.service - Mark boot as successful... ░░ Subject: A start job for unit UNIT has begun execution ░░ Defined-By: systemd ░░ Support: https://access.redhat.com/support ░░ ░░ A start job for unit UNIT has begun execution. ░░ ░░ The job identifier is 186.
Jul 27 12:36:06 ip-10-31-14-141.us-east-1.aws.redhat.com systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully. ░░ Subject: Unit succeeded ░░ Defined-By: systemd ░░ Support: https://access.redhat.com/support ░░ ░░ The unit var-lib-containers-storage-overlay.mount has successfully entered the 'dead' state.
Jul 27 12:36:06 ip-10-31-14-141.us-east-1.aws.redhat.com systemd[23350]: Finished grub-boot-success.service - Mark boot as successful. ░░ Subject: A start job for unit UNIT has finished successfully ░░ Defined-By: systemd ░░ Support: https://access.redhat.com/support ░░ ░░ A start job for unit UNIT has finished successfully. ░░ ░░ The job identifier is 186.
Jul 27 12:36:06 ip-10-31-14-141.us-east-1.aws.redhat.com systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully. ░░ Subject: Unit succeeded ░░ Defined-By: systemd ░░ Support: https://access.redhat.com/support ░░ ░░ The unit var-lib-containers-storage-overlay.mount has successfully entered the 'dead' state.
Jul 27 12:36:06 ip-10-31-14-141.us-east-1.aws.redhat.com podman[64106]: 2024-07-27 12:36:06.489138737 -0400 EDT m=+0.031029409 image pull-error this_is_a_bogus_image:latest short-name resolution enforced but cannot prompt without a TTY
Jul 27 12:36:06 ip-10-31-14-141.us-east-1.aws.redhat.com python3.12[64226]: ansible-file Invoked with path=/etc/containers/systemd state=directory owner=root group=0 mode=0755 recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jul 27 12:36:07 ip-10-31-14-141.us-east-1.aws.redhat.com systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully. ░░ Subject: Unit succeeded ░░ Defined-By: systemd ░░ Support: https://access.redhat.com/support ░░ ░░ The unit var-lib-containers-storage-overlay.mount has successfully entered the 'dead' state.
Jul 27 12:36:07 ip-10-31-14-141.us-east-1.aws.redhat.com python3.12[64339]: ansible-ansible.legacy.stat Invoked with path=/etc/containers/systemd/bogus.container follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Jul 27 12:36:07 ip-10-31-14-141.us-east-1.aws.redhat.com python3.12[64429]: ansible-ansible.legacy.copy Invoked with src=/root/.ansible/tmp/ansible-tmp-1722098167.0931985-12082-65274845531540/.source.container dest=/etc/containers/systemd/bogus.container owner=root group=0 mode=0644 follow=False _original_basename=systemd.j2 checksum=1d087e679d135214e8ac9ccaf33b2222916efb7f backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jul 27 12:36:09 ip-10-31-14-141.us-east-1.aws.redhat.com python3.12[64655]: ansible-ansible.legacy.command Invoked with _raw_params=podman --version _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Jul 27 12:36:10 ip-10-31-14-141.us-east-1.aws.redhat.com systemd[23350]: Created slice background.slice - User Background Tasks Slice.
░░ Subject: A start job for unit UNIT has finished successfully
░░ Defined-By: systemd
░░ Support: https://access.redhat.com/support
░░
░░ A start job for unit UNIT has finished successfully.
░░
░░ The job identifier is 199.
Jul 27 12:36:10 ip-10-31-14-141.us-east-1.aws.redhat.com systemd[23350]: Starting systemd-tmpfiles-clean.service - Cleanup of User's Temporary Files and Directories...
░░ Subject: A start job for unit UNIT has begun execution
░░ Defined-By: systemd
░░ Support: https://access.redhat.com/support
░░
░░ A start job for unit UNIT has begun execution.
░░
░░ The job identifier is 198.
Jul 27 12:36:10 ip-10-31-14-141.us-east-1.aws.redhat.com python3.12[64774]: ansible-getent Invoked with database=group key=0 fail_key=False service=None split=None
Jul 27 12:36:10 ip-10-31-14-141.us-east-1.aws.redhat.com systemd[23350]: Finished systemd-tmpfiles-clean.service - Cleanup of User's Temporary Files and Directories.
░░ Subject: A start job for unit UNIT has finished successfully
░░ Defined-By: systemd
░░ Support: https://access.redhat.com/support
░░
░░ A start job for unit UNIT has finished successfully.
░░
░░ The job identifier is 198.
Jul 27 12:36:10 ip-10-31-14-141.us-east-1.aws.redhat.com python3.12[64890]: ansible-stat Invoked with path=/usr/bin/getsubids follow=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Jul 27 12:36:12 ip-10-31-14-141.us-east-1.aws.redhat.com python3.12[65005]: ansible-getent Invoked with database=group key=0 fail_key=False service=None split=None
Jul 27 12:36:13 ip-10-31-14-141.us-east-1.aws.redhat.com python3.12[65119]: ansible-stat Invoked with path=/usr/bin/getsubids follow=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Jul 27 12:36:14 ip-10-31-14-141.us-east-1.aws.redhat.com python3.12[65234]: ansible-systemd Invoked with name=nopull.service scope=system state=stopped enabled=False force=True daemon_reload=False daemon_reexec=False no_block=False masked=None
Jul 27 12:36:15 ip-10-31-14-141.us-east-1.aws.redhat.com python3.12[65348]: ansible-stat Invoked with path=/etc/containers/systemd/nopull.container follow=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Jul 27 12:36:16 ip-10-31-14-141.us-east-1.aws.redhat.com python3.12[65576]: ansible-file Invoked with path=/etc/containers/systemd/nopull.container state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jul 27 12:36:16 ip-10-31-14-141.us-east-1.aws.redhat.com python3.12[65689]: ansible-systemd Invoked with daemon_reload=True scope=system daemon_reexec=False no_block=False name=None state=None enabled=None force=None masked=None
Jul 27 12:36:16 ip-10-31-14-141.us-east-1.aws.redhat.com systemd[1]: Reload requested from client PID 65690 ('systemctl') (unit session-5.scope)...
Jul 27 12:36:16 ip-10-31-14-141.us-east-1.aws.redhat.com systemd[1]: Reloading...
Jul 27 12:36:17 ip-10-31-14-141.us-east-1.aws.redhat.com systemd[1]: Reloading finished in 222 ms.
Jul 27 12:36:17 ip-10-31-14-141.us-east-1.aws.redhat.com systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Jul 27 12:36:19 ip-10-31-14-141.us-east-1.aws.redhat.com python3.12[66091]: ansible-ansible.legacy.command Invoked with _raw_params=podman --version _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Jul 27 12:36:20 ip-10-31-14-141.us-east-1.aws.redhat.com python3.12[66210]: ansible-getent Invoked with database=group key=0 fail_key=False service=None split=None
Jul 27 12:36:21 ip-10-31-14-141.us-east-1.aws.redhat.com python3.12[66324]: ansible-stat Invoked with path=/usr/bin/getsubids follow=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Jul 27 12:36:23 ip-10-31-14-141.us-east-1.aws.redhat.com python3.12[66439]: ansible-getent Invoked with database=group key=0 fail_key=False service=None split=None
Jul 27 12:36:23 ip-10-31-14-141.us-east-1.aws.redhat.com python3.12[66553]: ansible-stat Invoked with path=/usr/bin/getsubids follow=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Jul 27 12:36:24 ip-10-31-14-141.us-east-1.aws.redhat.com python3.12[66668]: ansible-systemd Invoked with name=bogus.service scope=system state=stopped enabled=False force=True daemon_reload=False daemon_reexec=False no_block=False masked=None
Jul 27 12:36:24 ip-10-31-14-141.us-east-1.aws.redhat.com systemd[1]: Reload requested from client PID 66671 ('systemctl') (unit session-5.scope)...
Jul 27 12:36:24 ip-10-31-14-141.us-east-1.aws.redhat.com systemd[1]: Reloading...
Jul 27 12:36:24 ip-10-31-14-141.us-east-1.aws.redhat.com quadlet-generator[65715]: Warning: bogus.container specifies the image "this_is_a_bogus_image" which not a fully qualified image name. This is not ideal for performance and security reasons. See the podman-pull manpage discussion of short-name-aliases.conf for details.
Jul 27 12:36:25 ip-10-31-14-141.us-east-1.aws.redhat.com systemd[1]: Reloading finished in 224 ms.
Jul 27 12:36:25 ip-10-31-14-141.us-east-1.aws.redhat.com python3.12[66838]: ansible-stat Invoked with path=/etc/containers/systemd/bogus.container follow=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Jul 27 12:36:26 ip-10-31-14-141.us-east-1.aws.redhat.com python3.12[67066]: ansible-file Invoked with path=/etc/containers/systemd/bogus.container state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jul 27 12:36:26 ip-10-31-14-141.us-east-1.aws.redhat.com python3.12[67179]: ansible-systemd Invoked with daemon_reload=True scope=system daemon_reexec=False no_block=False name=None state=None enabled=None force=None masked=None
Jul 27 12:36:26 ip-10-31-14-141.us-east-1.aws.redhat.com systemd[1]: Reload requested from client PID 67180 ('systemctl') (unit session-5.scope)...
Jul 27 12:36:26 ip-10-31-14-141.us-east-1.aws.redhat.com systemd[1]: Reloading...
Jul 27 12:36:27 ip-10-31-14-141.us-east-1.aws.redhat.com systemd[1]: Reloading finished in 212 ms.
Jul 27 12:36:27 ip-10-31-14-141.us-east-1.aws.redhat.com systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Jul 27 12:36:28 ip-10-31-14-141.us-east-1.aws.redhat.com python3.12[67468]: ansible-user Invoked with name=user_quadlet_basic uid=1111 state=present non_unique=False force=False remove=False create_home=True system=False move_home=False append=False ssh_key_bits=0 ssh_key_type=rsa ssh_key_comment=ansible-generated on ip-10-31-14-141.us-east-1.aws.redhat.com update_password=always group=None groups=None comment=None home=None shell=None password=NOT_LOGGING_PARAMETER login_class=None password_expire_max=None password_expire_min=None password_expire_warn=None hidden=None seuser=None skeleton=None generate_ssh_key=None ssh_key_file=None ssh_key_passphrase=NOT_LOGGING_PARAMETER expires=None password_lock=None local=None profile=None authorization=None role=None umask=None
Jul 27 12:36:28 ip-10-31-14-141.us-east-1.aws.redhat.com useradd[67470]: new group: name=user_quadlet_basic, GID=1111
Jul 27 12:36:28 ip-10-31-14-141.us-east-1.aws.redhat.com useradd[67470]: new user: name=user_quadlet_basic, UID=1111, GID=1111, home=/home/user_quadlet_basic, shell=/bin/bash, from=/dev/pts/0
Jul 27 12:36:30 ip-10-31-14-141.us-east-1.aws.redhat.com python3.12[67700]: ansible-ansible.legacy.command Invoked with _raw_params=podman --version _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Jul 27 12:36:31 ip-10-31-14-141.us-east-1.aws.redhat.com python3.12[67819]: ansible-getent Invoked with database=passwd key=user_quadlet_basic fail_key=False service=None split=None
Jul 27 12:36:32 ip-10-31-14-141.us-east-1.aws.redhat.com python3.12[67933]: ansible-getent Invoked with database=group key=1111 fail_key=False service=None split=None
Jul 27 12:36:32 ip-10-31-14-141.us-east-1.aws.redhat.com python3.12[68047]: ansible-stat Invoked with path=/usr/bin/getsubids follow=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Jul 27 12:36:32 ip-10-31-14-141.us-east-1.aws.redhat.com python3.12[68162]: ansible-ansible.legacy.command Invoked with _raw_params=getsubids user_quadlet_basic _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Jul 27 12:36:33 ip-10-31-14-141.us-east-1.aws.redhat.com python3.12[68276]: ansible-ansible.legacy.command Invoked with _raw_params=getsubids -g user_quadlet_basic _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Jul 27 12:36:34 ip-10-31-14-141.us-east-1.aws.redhat.com python3.12[68390]: ansible-ansible.legacy.command Invoked with _raw_params=set -x set -o pipefail exec 1>&2 #podman volume rm --all #podman network prune -f podman volume ls podman network ls podman secret ls podman container ls podman pod ls podman images systemctl list-units | grep quadlet _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Jul 27 12:36:34 ip-10-31-14-141.us-east-1.aws.redhat.com systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Jul 27 12:36:34 ip-10-31-14-141.us-east-1.aws.redhat.com systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Jul 27 12:36:35 ip-10-31-14-141.us-east-1.aws.redhat.com systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Jul 27 12:36:36 ip-10-31-14-141.us-east-1.aws.redhat.com python3.12[68665]: ansible-ansible.legacy.command Invoked with _raw_params=podman --version _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Jul 27 12:36:37 ip-10-31-14-141.us-east-1.aws.redhat.com python3.12[68784]: ansible-getent Invoked with database=group key=1111 fail_key=False service=None split=None
Jul 27 12:36:37 ip-10-31-14-141.us-east-1.aws.redhat.com python3.12[68898]: ansible-stat Invoked with path=/usr/bin/getsubids follow=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Jul 27 12:36:38 ip-10-31-14-141.us-east-1.aws.redhat.com python3.12[69013]: ansible-ansible.legacy.command Invoked with _raw_params=getsubids user_quadlet_basic _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Jul 27 12:36:38 ip-10-31-14-141.us-east-1.aws.redhat.com python3.12[69127]: ansible-ansible.legacy.command Invoked with _raw_params=getsubids -g user_quadlet_basic _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Jul 27 12:36:40 ip-10-31-14-141.us-east-1.aws.redhat.com python3.12[69241]: ansible-ansible.legacy.command Invoked with _raw_params=journalctl -ex _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None

PLAY RECAP *********************************************************************
managed_node1              : ok=201  changed=6    unreachable=0    failed=2    skipped=297  rescued=2    ignored=0

Saturday 27 July 2024  12:36:40 -0400 (0:00:00.562)       0:00:50.564 *********
===============================================================================
Gathering Facts --------------------------------------------------------- 1.22s
/tmp/tmp.kBSwYGXc3s/ansible_collections/fedora/linux_system_roles/tests/podman/tests_quadlet_basic.yml:3
fedora.linux_system_roles.podman : Gather the package facts ------------- 1.01s
/tmp/tmp.kBSwYGXc3s/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/main.yml:6
fedora.linux_system_roles.podman : Ensure quadlet file is present ------- 0.85s
/tmp/tmp.kBSwYGXc3s/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/create_update_quadlet_spec.yml:70
fedora.linux_system_roles.podman : Stop and disable service ------------- 0.83s
/tmp/tmp.kBSwYGXc3s/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/cleanup_quadlet_spec.yml:12
fedora.linux_system_roles.podman : Stop and disable service ------------- 0.83s
/tmp/tmp.kBSwYGXc3s/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/cleanup_quadlet_spec.yml:12
fedora.linux_system_roles.podman : Gather the package facts ------------- 0.81s
/tmp/tmp.kBSwYGXc3s/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/main.yml:6
fedora.linux_system_roles.podman : Gather the package facts ------------- 0.81s
/tmp/tmp.kBSwYGXc3s/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/main.yml:6
fedora.linux_system_roles.podman : Gather the package facts ------------- 0.81s
/tmp/tmp.kBSwYGXc3s/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/main.yml:6
fedora.linux_system_roles.podman : Gather the package facts ------------- 0.81s
/tmp/tmp.kBSwYGXc3s/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/main.yml:6
fedora.linux_system_roles.podman : Gather the package facts ------------- 0.80s
/tmp/tmp.kBSwYGXc3s/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/main.yml:6
fedora.linux_system_roles.podman : Refresh systemd ---------------------- 0.80s
/tmp/tmp.kBSwYGXc3s/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/cleanup_quadlet_spec.yml:48
fedora.linux_system_roles.podman : Refresh systemd ---------------------- 0.78s
/tmp/tmp.kBSwYGXc3s/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/cleanup_quadlet_spec.yml:48
fedora.linux_system_roles.podman : Ensure quadlet file is present ------- 0.75s
/tmp/tmp.kBSwYGXc3s/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/create_update_quadlet_spec.yml:70
fedora.linux_system_roles.podman : Ensure container images are present --- 0.70s
/tmp/tmp.kBSwYGXc3s/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/create_update_quadlet_spec.yml:18
Create user for testing ------------------------------------------------- 0.66s
/tmp/tmp.kBSwYGXc3s/ansible_collections/fedora/linux_system_roles/tests/podman/tests_quadlet_basic.yml:109
Debug3 ------------------------------------------------------------------ 0.60s
/tmp/tmp.kBSwYGXc3s/ansible_collections/fedora/linux_system_roles/tests/podman/tests_quadlet_basic.yml:253
Dump journal ------------------------------------------------------------ 0.56s
/tmp/tmp.kBSwYGXc3s/ansible_collections/fedora/linux_system_roles/tests/podman/tests_quadlet_basic.yml:319
fedora.linux_system_roles.podman : Slurp quadlet file ------------------- 0.54s
/tmp/tmp.kBSwYGXc3s/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/parse_quadlet_file.yml:6
fedora.linux_system_roles.podman : Ensure the quadlet directory is present --- 0.52s
/tmp/tmp.kBSwYGXc3s/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/create_update_quadlet_spec.yml:39
fedora.linux_system_roles.podman : Get podman version ------------------- 0.50s
/tmp/tmp.kBSwYGXc3s/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/main.yml:22